Add support for OpenRouter as a language model provider (#29496)

This pull request adds full integration with OpenRouter, allowing users
to access a wide variety of language models through a single API key.

**Implementation Details:**

* **Provider Registration:** Registers OpenRouter as a new language
model provider within the application's model registry. This includes UI
for API key authentication, token counting, streaming completions, and
tool-call handling.
* **Dedicated Crate:** Adds a new `open_router` crate to manage
interactions with the OpenRouter HTTP API, including model discovery and
streaming helpers (a hedged sketch of the discovery call follows this
list).
* **UI & Configuration:** Extends workspace manifests, the settings
schema, icons, and default configurations to surface the OpenRouter
provider and its settings within the UI.
* **Readability:** Reformats JSON arrays within the settings files for
improved readability.
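
To make the model-discovery side of the new crate concrete, here is a
minimal sketch of fetching OpenRouter's model list in a single request.
It uses `reqwest` and `serde` purely for illustration rather than Zed's
internal HTTP client, and the endpoint path and response fields shown
(`id`, `name`, `context_length`, `supported_parameters`) reflect my
reading of OpenRouter's public `/api/v1/models` API, not the crate's
actual types or function names.

```rust
use serde::Deserialize;

/// Subset of the fields returned by OpenRouter's model listing.
/// Field names are assumptions based on the public API; the real
/// `open_router` crate may model the response differently.
#[derive(Debug, Deserialize)]
struct ModelEntry {
    id: String,
    name: Option<String>,
    context_length: Option<u64>,
    #[serde(default)]
    supported_parameters: Vec<String>,
}

#[derive(Debug, Deserialize)]
struct ModelsResponse {
    data: Vec<ModelEntry>,
}

/// Fetch the full model list with one request to the models endpoint.
async fn list_models(api_url: &str) -> Result<Vec<ModelEntry>, reqwest::Error> {
    let url = format!("{}/models", api_url.trim_end_matches('/'));
    let response: ModelsResponse = reqwest::Client::new()
        .get(url)
        .send()
        .await?
        .error_for_status()?
        .json()
        .await?;
    Ok(response.data)
}
```

In the provider itself this kind of request would run once at
initialization, which lines up with the simplified single-call strategy
described in the design notes below.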

**Design Decisions & Discussion Points:**

* **Code Reuse:** I leveraged much of the existing logic from the
`openai` provider integration due to the significant similarities
between the OpenAI and OpenRouter API specifications.
* **Default Model:** I set the default model to `openrouter/auto`. This
model automatically routes user prompts to the most suitable underlying
model on OpenRouter, providing a convenient starting point.
* **Model Population Strategy:**
  * <strike>I've implemented dynamic population of available models by
querying the OpenRouter API upon initialization.</strike>
  * <strike>Currently, this involves three separate API calls: one for
all models, one for tool-use models, and one for models good at
programming.</strike>
  * <strike>The data from the tool-use API call sets a `tool_use` flag
for the relevant models.</strike>
  * <strike>The data from the programming-models API call is used to
sort the list, prioritizing coding-focused models in the
dropdown.</strike>
  * <strike>**Feedback Welcome:** I acknowledge this multi-call approach
is API-intensive. I am open to feedback and alternative implementation
suggestions if the team believes this can be optimized.</strike>
  * **Update: this has now been simplified to a single API call** (a
hedged sketch of the single-call approach follows this list).
* **UI/UX Considerations:**
  * <strike>Authentication Method: Currently, I've implemented the
standard API key input in settings, similar to other providers like
OpenAI/Anthropic. However, OpenRouter also supports OAuth 2.0 with PKCE,
which could offer a smoother, more integrated setup experience (e.g.,
clicking a button to authorize instead of copy-pasting a key). Should we
prioritize implementing OAuth PKCE now, or add it as an alternative
option later?</strike> (The PKCE flow is not straightforward, so it is
skipped for now; OAuth support can be added in a follow-up.)
  * <strike>To visually distinguish models better suited for
programming, I've considered adding a marker (e.g., `</>` or `🧠`) next
to their names. Thoughts on this proposal?</strike> (This would require
changes and discussion across model providers, which is out of scope for
this PR.)
  * OpenRouter offers 300+ models. The current implementation loads all
of them. **Feedback Needed:** Should we refine this list or implement
more sophisticated filtering/categorization for better usability?
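
As a follow-up to the single-call update above, the sketch below shows
one way a single models response could both mark tool-capable models and
float coding-focused ones to the top of the picker. The
`supported_parameters` semantics and the keyword heuristic are
assumptions for illustration only; the PR's real flagging and ordering
logic may differ.

```rust
/// Minimal stand-in for a discovered model; field names mirror the
/// discovery sketch earlier and are assumptions, not the crate's types.
#[derive(Debug)]
struct DiscoveredModel {
    id: String,
    supported_parameters: Vec<String>,
    supports_tools: bool,
}

/// Derive tool support and a coding-first ordering from a single model
/// listing, instead of issuing separate API calls per category.
fn prepare_models(mut models: Vec<DiscoveredModel>) -> Vec<DiscoveredModel> {
    for model in &mut models {
        // Assumed semantics: a model advertises tool calling when its
        // supported parameters include "tools".
        model.supports_tools = model
            .supported_parameters
            .iter()
            .any(|p| p.as_str() == "tools");
    }

    // Crude, illustrative heuristic: ids that mention "code" sort first
    // (the stable sort keeps the original order within each group).
    models.sort_by_key(|model| !model.id.contains("code"));

    models
}
```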

**Motivation:**

This integration directly addresses one of the most highly upvoted
feature requests/discussions within the Zed community. Adding OpenRouter
support significantly expands the range of AI models accessible to
users.

I welcome feedback from the Zed team on this implementation and the
design choices made. I am eager to refine this feature and make it
available to users.

ISSUES: https://github.com/zed-industries/zed/discussions/16576

Release Notes:

- Added support for OpenRouter as a language model provider.

---------

Signed-off-by: Umesh Yadav <umesh4257@gmail.com>
Co-authored-by: Marshall Bowers <git@maxdeviant.com>
14 changed files with 1356 additions and 0 deletions

Excerpt from one of the changed files (the language model settings module):

```diff
@@ -20,6 +20,7 @@ use crate::provider::{
     mistral::MistralSettings,
     ollama::OllamaSettings,
     open_ai::OpenAiSettings,
+    open_router::OpenRouterSettings,
 };
 
 /// Initializes the language model settings.
@@ -61,6 +62,7 @@ pub struct AllLanguageModelSettings {
     pub bedrock: AmazonBedrockSettings,
     pub ollama: OllamaSettings,
     pub openai: OpenAiSettings,
+    pub open_router: OpenRouterSettings,
     pub zed_dot_dev: ZedDotDevSettings,
     pub google: GoogleSettings,
     pub copilot_chat: CopilotChatSettings,
@@ -76,6 +78,7 @@ pub struct AllLanguageModelSettingsContent {
     pub ollama: Option<OllamaSettingsContent>,
     pub lmstudio: Option<LmStudioSettingsContent>,
     pub openai: Option<OpenAiSettingsContent>,
+    pub open_router: Option<OpenRouterSettingsContent>,
     #[serde(rename = "zed.dev")]
     pub zed_dot_dev: Option<ZedDotDevSettingsContent>,
     pub google: Option<GoogleSettingsContent>,
@@ -271,6 +274,12 @@ pub struct ZedDotDevSettingsContent {
 #[derive(Default, Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
 pub struct CopilotChatSettingsContent {}
 
+#[derive(Default, Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
+pub struct OpenRouterSettingsContent {
+    pub api_url: Option<String>,
+    pub available_models: Option<Vec<provider::open_router::AvailableModel>>,
+}
+
 impl settings::Settings for AllLanguageModelSettings {
     const KEY: Option<&'static str> = Some("language_models");
@@ -409,6 +418,19 @@ impl settings::Settings for AllLanguageModelSettings {
                 &mut settings.mistral.available_models,
                 mistral.as_ref().and_then(|s| s.available_models.clone()),
             );
 
+            // OpenRouter
+            let open_router = value.open_router.clone();
+            merge(
+                &mut settings.open_router.api_url,
+                open_router.as_ref().and_then(|s| s.api_url.clone()),
+            );
+            merge(
+                &mut settings.open_router.available_models,
+                open_router
+                    .as_ref()
+                    .and_then(|s| s.available_models.clone()),
+            );
         }
 
         Ok(settings)
```
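
For readers skimming the hunk above: the repeated `merge` calls follow
the settings module's existing pattern of overwriting a field only when
the user-supplied settings content provides a value. A minimal sketch of
that helper, assuming it has the conventional shape (the actual
signature in the file may differ):

```rust
/// Overwrite `target` only when the user-supplied content provides a
/// value; otherwise keep the default (or previously merged) value.
fn merge<T>(target: &mut T, value: Option<T>) {
    if let Some(value) = value {
        *target = value;
    }
}
```

With that shape, an absent `api_url` under `language_models.open_router`
leaves the default OpenRouter endpoint untouched, while a user-provided
`available_models` list replaces the default list wholesale.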