docs: Add OpenRouter agent support (#32011)
Updates a few other docs as well: tool support was recently added for DeepSeek, and thinking and image support were recently added for Ollama models. Release Notes: - N/A
This commit is contained in:
parent 988d834c33
commit ac15194d11
1 changed file with 27 additions and 3 deletions
@@ -13,13 +13,14 @@ Here's an overview of the supported providers and tool call support:

| Provider                                        | Tool Call Support                                                                                                                                                           |
| ----------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [Amazon Bedrock](#amazon-bedrock)               | Depends on the model                                                                                                                                                        |
| [Anthropic](#anthropic)                         | ✅                                                                                                                                                                          |
| [DeepSeek](#deepseek)                           | ✅                                                                                                                                                                          |
| [GitHub Copilot Chat](#github-copilot-chat)     | For Some Models ([link](https://github.com/zed-industries/zed/blob/9e0330ba7d848755c9734bf456c716bddf0973f3/crates/language_models/src/provider/copilot_chat.rs#L189-L198)) |
| [Google AI](#google-ai)                         | ✅                                                                                                                                                                          |
| [LM Studio](#lmstudio)                          | ✅                                                                                                                                                                          |
| [Mistral](#mistral)                             | ✅                                                                                                                                                                          |
| [Ollama](#ollama)                               | ✅                                                                                                                                                                          |
| [OpenAI](#openai)                               | ✅                                                                                                                                                                          |
| [OpenRouter](#openrouter)                       | ✅                                                                                                                                                                          |
| [OpenAI API Compatible](#openai-api-compatible) | 🚫                                                                                                                                                                          |
## Use Your Own Keys {#use-your-own-keys}
@@ -164,7 +165,7 @@ You can configure a model to use [extended thinking](https://docs.anthropic.com/

### DeepSeek {#deepseek}

> ✅ Supports tool use

1. Visit the DeepSeek platform and [create an API key](https://platform.deepseek.com/api_keys)
2. Open the settings view (`agent: open configuration`) and go to the DeepSeek section
@@ -351,7 +352,9 @@ Depending on your hardware or use-case you may wish to limit or increase the con

          "name": "qwen2.5-coder",
          "display_name": "qwen 2.5 coder 32K",
          "max_tokens": 32768,
          "supports_tools": true,
          "supports_thinking": true,
          "supports_images": true
        }
      ]
    }
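Pieced together, the hunk above fits into a `settings.json` shaped roughly like the following sketch. The surrounding `language_models`/`ollama` nesting is assumed from Zed's settings layout rather than shown in this diff:

```json
{
  "language_models": {
    "ollama": {
      "available_models": [
        {
          "name": "qwen2.5-coder",
          "display_name": "qwen 2.5 coder 32K",
          "max_tokens": 32768,
          "supports_tools": true,
          "supports_thinking": true,
          "supports_images": true
        }
      ]
    }
  }
}
```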
@@ -371,6 +374,12 @@ The `supports_tools` option controls whether or not the model will use additiona

If the model is tagged with `tools` in the Ollama catalog, this option should be supplied, and the built-in profiles `Ask` and `Write` can be used.

If the model is not tagged with `tools` in the Ollama catalog, this option can still be supplied with the value `true`; however, be aware that only the `Minimal` built-in profile will work.

The `supports_thinking` option controls whether or not the model performs an explicit "thinking" (reasoning) pass before producing its final answer.

If the model is tagged with `thinking` in the Ollama catalog, set this option to use it in Zed.

The `supports_images` option enables the model's vision capabilities, allowing it to process images included in the conversation context.

If the model is tagged with `vision` in the Ollama catalog, set this option to use it in Zed.
### OpenAI {#openai}

> ✅ Supports tool use
@@ -416,6 +425,21 @@ You must provide the model's Context Window in the `max_tokens` parameter; this

OpenAI `o1` models should set `max_completion_tokens` as well to avoid incurring high reasoning token costs.

Custom models will be listed in the model dropdown in the Agent Panel.

### OpenRouter {#openrouter}

> ✅ Supports tool use

OpenRouter provides access to multiple AI models through a single API. It supports tool use for compatible models.

1. Visit [OpenRouter](https://openrouter.ai) and create an account
2. Generate an API key from your [OpenRouter keys page](https://openrouter.ai/keys)
3. Open the settings view (`agent: open configuration`) and go to the OpenRouter section
4. Enter your OpenRouter API key

The OpenRouter API key will be saved in your keychain.

Zed will also use the `OPENROUTER_API_KEY` environment variable if it's defined.
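Since Zed also reads `OPENROUTER_API_KEY`, one way to supply the key without touching the keychain is via the shell environment. The key value below is a hypothetical placeholder:

```shell
# Hypothetical placeholder; generate a real key on the OpenRouter keys page.
export OPENROUTER_API_KEY="sk-or-example-key"
# Launch Zed from this shell (e.g. `zed .`) so it inherits the variable.
```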
### OpenAI API Compatible {#openai-api-compatible}

Zed supports using OpenAI-compatible APIs by specifying a custom `endpoint` and `available_models` for the OpenAI provider.
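As a rough sketch of what that can look like in `settings.json` — the `api_url` key, the endpoint URL, and the model entry here are illustrative assumptions, not taken from this diff:

```json
{
  "language_models": {
    "openai": {
      "api_url": "http://localhost:11434/v1",
      "available_models": [
        {
          "name": "my-custom-model",
          "display_name": "My Custom Model",
          "max_tokens": 32768
        }
      ]
    }
  }
}
```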