zed/crates/language_models/src
Umesh Yadav 65e3e84cbc
language_models: Add thinking support for ollama (#31665)
This PR updates how we handle Ollama responses, leveraging the new
[v0.9.0](https://github.com/ollama/ollama/releases/tag/v0.9.0) release.
Previously, thinking text was embedded within the model's main content,
leading to it appearing directly in the agent's response. Now, thinking
content is provided as a separate parameter, allowing us to display it
correctly within the agent panel, similar to other providers. I have
tested this with qwen3:8b and it works nicely. ~~We can release this once
the Ollama release is stable.~~ It's now released as stable.
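For context, a minimal sketch of the shape of the change: the response message now carries the reasoning in its own field (assuming Ollama names it `thinking`) rather than mixing it into `content`, so it can be routed to the agent panel's thinking display. The type, field, and event names below are illustrative, not the actual ones in this crate:

```rust
use serde::Deserialize;

// Illustrative only: stand-in types showing a `thinking` field that is
// separate from the visible `content`.
#[derive(Deserialize)]
struct OllamaChatMessage {
    content: String,
    // Since Ollama v0.9.0, reasoning text arrives in its own field instead
    // of being interleaved with `content`.
    #[serde(default)]
    thinking: Option<String>,
}

#[derive(Deserialize)]
struct OllamaChatResponse {
    message: OllamaChatMessage,
}

// Hypothetical event type standing in for whatever the agent panel consumes.
enum CompletionEvent {
    Thinking(String),
    Text(String),
}

// Split a response into thinking and text events so the panel can render
// the reasoning separately from the reply.
fn events_from(response: OllamaChatResponse) -> Vec<CompletionEvent> {
    let mut events = Vec::new();
    if let Some(thinking) = response.message.thinking.filter(|t| !t.is_empty()) {
        events.push(CompletionEvent::Thinking(thinking));
    }
    if !response.message.content.is_empty() {
        events.push(CompletionEvent::Text(response.message.content));
    }
    events
}
```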

<img width="433" alt="image" src="https://github.com/user-attachments/assets/2983ef06-6679-4033-82c2-231ea9cd6434" />


Release Notes:

- Added thinking support for Ollama

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-06-02 15:12:41 +00:00
| Name | Last commit | Date |
| --- | --- | --- |
| provider | language_models: Add thinking support for ollama (#31665) | 2025-06-02 15:12:41 +00:00 |
| ui | agent: Fix instruction list item with multiple buttons not working (#30541) | 2025-05-12 06:19:20 -03:00 |
| language_models.rs | language_models: Remove language-models feature flag (#29416) | 2025-04-25 14:18:48 +00:00 |
| provider.rs | assistant: Add Bedrock support (#21092) | 2025-02-24 18:10:12 -05:00 |
| settings.rs | agent: Allow customizing temperature by provider/model (#30033) | 2025-05-06 20:36:25 +00:00 |
| ui.rs | assistant: Refine settings view's instruction visuals (#25812) | 2025-02-28 12:06:47 -03:00 |