language_models: Add tool use support for Mistral models (#29994)
Closes https://github.com/zed-industries/zed/issues/29855

Implements tool use handling in the Mistral provider, including mapping tool call events and updating request construction. Adds support for `tool_choice` and `parallel_tool_calls` in Mistral API requests. This works with all the existing models. Nothing else was touched, but in the future, models should be fetched via Mistral's models API, and tool call support, parallel tool call support, etc. should be deduced from the model data in the API response.

<img width="547" alt="Screenshot 2025-05-06 at 4 52 37 PM" src="https://github.com/user-attachments/assets/4c08b544-1174-40cc-a40d-522989953448" />

Tasks:

- [x] Add tool call support
- [x] Auto-fetch models using the Mistral API
- [x] Add tests for the mistral crates
- [x] Fix Mistral configurations for LLM providers

Release Notes:

- agent: Add tool call support for existing Mistral models

---------

Co-authored-by: Peter Tripp <peter@zed.dev>
Co-authored-by: Bennet Bo Fenner <bennet@zed.dev>
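The request-construction change described above can be sketched roughly as follows. This is an illustrative Python sketch of the JSON body accepted by Mistral's OpenAI-compatible chat completions endpoint, not Zed's actual Rust implementation; the `build_request` helper and its defaulting behavior are hypothetical.

```python
# Hypothetical sketch of assembling a Mistral chat-completions request body
# with tool use. The real change lives in Zed's Rust Mistral provider.

def build_request(model, messages, tools=None, tool_choice=None,
                  parallel_tool_calls=None):
    """Assemble the JSON body for POST {api_url}/chat/completions."""
    body = {"model": model, "messages": messages}
    if tools:
        # Tools follow the OpenAI-style schema: {"type": "function", ...}
        body["tools"] = tools
        # Assumed default: let the model decide when to call a tool.
        body["tool_choice"] = tool_choice or "auto"
        if parallel_tool_calls is not None:
            body["parallel_tool_calls"] = parallel_tool_calls
    return body

req = build_request(
    "mistral-large-latest",
    [{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    parallel_tool_calls=False,
)
print(req["tool_choice"])
```

When no tools are passed, the sketch omits `tool_choice` and `parallel_tool_calls` entirely, mirroring the idea that these fields are only meaningful on tool-enabled requests.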
This commit is contained in:
parent 26a8cac0d8
commit 926f377c6c

6 changed files with 347 additions and 50 deletions
@@ -14,6 +14,7 @@ Here's an overview of the supported providers and tool call support:
 | [Anthropic](#anthropic)                     | ✅            |
 | [GitHub Copilot Chat](#github-copilot-chat) | In Some Cases |
 | [Google AI](#google-ai)                     | ✅            |
+| [Mistral](#mistral)                         | ✅            |
 | [Ollama](#ollama)                           | ✅            |
 | [OpenAI](#openai)                           | ✅            |
 | [DeepSeek](#deepseek)                       | 🚫            |
@@ -128,6 +129,44 @@ By default Zed will use `stable` versions of models, but you can use specific ve
 
 Custom models will be listed in the model dropdown in the Agent Panel.
 
+### Mistral {#mistral}
+
+> 🔨 Supports tool use
+
+1. Visit the Mistral platform and [create an API key](https://console.mistral.ai/api-keys/)
+2. Open the configuration view (`assistant: show configuration`) and navigate to the Mistral section
+3. Enter your Mistral API key
+
+The Mistral API key will be saved in your keychain.
+
+Zed will also use the `MISTRAL_API_KEY` environment variable if it's defined.
+
+#### Mistral Custom Models {#mistral-custom-models}
+
+The Zed Assistant comes pre-configured with several Mistral models (`codestral-latest`, `mistral-large-latest`, `mistral-medium-latest`, `mistral-small-latest`, `open-mistral-nemo`, and `open-codestral-mamba`). All the default models support tool use. If you wish to use alternate models or customize their parameters, you can do so by adding the following to your Zed `settings.json`:
+
+```json
+{
+  "language_models": {
+    "mistral": {
+      "api_url": "https://api.mistral.ai/v1",
+      "available_models": [
+        {
+          "name": "mistral-tiny-latest",
+          "display_name": "Mistral Tiny",
+          "max_tokens": 32000,
+          "max_output_tokens": 4096,
+          "max_completion_tokens": 1024,
+          "supports_tools": true
+        }
+      ]
+    }
+  }
+}
+```
+
+Custom models will be listed in the model dropdown in the assistant panel.
+
 ### Ollama {#ollama}
 
 > ✅ Supports tool use