
This pull request adds full integration with OpenRouter, allowing users to access a wide variety of language models through a single API key.

**Implementation Details:**

* **Provider Registration:** Registers OpenRouter as a new language model provider within the application's model registry. This includes UI for API key authentication, token counting, streaming completions, and tool-call handling.
* **Dedicated Crate:** Adds a new `open_router` crate to manage interactions with the OpenRouter HTTP API, including model discovery and streaming helpers.
* **UI & Configuration:** Extends workspace manifests, the settings schema, icons, and default configurations to surface the OpenRouter provider and its settings within the UI.
* **Readability:** Reformats JSON arrays within the settings files for improved readability.

**Design Decisions & Discussion Points:**

* **Code Reuse:** I leveraged much of the existing logic from the `openai` provider integration due to the significant similarities between the OpenAI and OpenRouter API specifications.
* **Default Model:** I set the default model to `openrouter/auto`. This model automatically routes user prompts to the most suitable underlying model on OpenRouter, providing a convenient starting point.
* **Model Population Strategy:**
  * <strike>I've implemented dynamic population of available models by querying the OpenRouter API upon initialization.</strike>
  * <strike>Currently, this involves three separate API calls: one for all models, one for tool-use models, and one for models good at programming.</strike>
  * <strike>The data from the tool-use API call sets a `tool_use` flag for relevant models.</strike>
  * <strike>The data from the programming-models API call is used to sort the list, prioritizing coding-focused models in the dropdown.</strike>
  * <strike>**Feedback Welcome:** I acknowledge this multi-call approach is API-intensive. I am open to feedback and alternative implementation suggestions if the team believes this can be optimized.</strike>
  * **Update:** This has now been simplified to a single API call (see the sketch below).
* **UI/UX Considerations:**
  * <strike>Authentication Method: Currently, I've implemented the standard API key input in settings, similar to other providers like OpenAI/Anthropic. However, OpenRouter also supports OAuth 2.0 with PKCE. This could offer a potentially smoother, more integrated setup experience for users (e.g., clicking a button to authorize instead of copy-pasting a key). Should we prioritize implementing OAuth PKCE now, or perhaps add it as an alternative option later?</strike> (OAuth with PKCE is not straightforward, so I'm skipping it for now; we can add that support in a follow-up.)
  * <strike>To visually distinguish models better suited for programming, I've considered adding a marker (e.g., `</>` or `🧠`) next to their names. Thoughts on this proposal?</strike> (This would require changes and discussion across model providers, so it doesn't fall within the scope of this PR.)
  * OpenRouter offers 300+ models, and the current implementation loads all of them. **Feedback Needed:** Should we refine this list or implement more sophisticated filtering/categorization for better usability?

**Motivation:**

This integration directly addresses one of the most highly upvoted feature requests/discussions within the Zed community. Adding OpenRouter support significantly expands the range of AI models accessible to users.

I welcome feedback from the Zed team on this implementation and the design choices made. I am eager to refine this feature and make it available to users.
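For reference, here is a minimal standalone sketch of what the single model-listing call looks like conceptually. This is not the code from the new `open_router` crate; it uses `reqwest`, `serde`, and `tokio` directly rather than Zed's HTTP client abstraction, and the endpoint and field names (`/api/v1/models`, `context_length`, `supported_parameters`) reflect my reading of OpenRouter's public API docs, so treat them as assumptions rather than a definitive implementation.

```rust
use serde::Deserialize;

// Subset of the fields returned by OpenRouter's model-listing endpoint.
// Field names are assumptions based on the public API docs.
#[derive(Debug, Deserialize)]
struct Model {
    id: String,
    name: String,
    context_length: Option<u64>,
    // e.g. ["temperature", "tools", ...]; "tools" implies tool-call support.
    #[serde(default)]
    supported_parameters: Vec<String>,
}

#[derive(Debug, Deserialize)]
struct ModelsResponse {
    data: Vec<Model>,
}

// One request fetches every model; the `tool_use` flag is derived from the
// same payload instead of issuing a second, filtered call.
async fn list_models() -> Result<Vec<(Model, bool)>, reqwest::Error> {
    let mut request = reqwest::Client::new().get("https://openrouter.ai/api/v1/models");
    // The listing endpoint appears to work unauthenticated, but send a key if configured.
    if let Ok(key) = std::env::var("OPENROUTER_API_KEY") {
        request = request.bearer_auth(key);
    }
    let response: ModelsResponse = request
        .send()
        .await?
        .error_for_status()?
        .json() // requires reqwest's `json` feature
        .await?;

    Ok(response
        .data
        .into_iter()
        .map(|model| {
            let tool_use = model.supported_parameters.iter().any(|p| p == "tools");
            (model, tool_use)
        })
        .collect())
}

#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    for (model, tool_use) in list_models().await? {
        println!(
            "{} ({}), ctx={:?}, tool_use={}",
            model.name, model.id, model.context_length, tool_use
        );
    }
    Ok(())
}
```

The relevant point is that tool-call support can be derived from the same response that lists the models, which is what made the extra filtered calls unnecessary.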
ISSUES: https://github.com/zed-industries/zed/discussions/16576

Release Notes:

- Added support for OpenRouter as a language model provider.

---------

Signed-off-by: Umesh Yadav <umesh4257@gmail.com>
Co-authored-by: Marshall Bowers <git@maxdeviant.com>
use std::sync::Arc;

use client::{Client, UserStore};
use fs::Fs;
use gpui::{App, Context, Entity};
use language_model::LanguageModelRegistry;
use provider::deepseek::DeepSeekLanguageModelProvider;

pub mod provider;
mod settings;
pub mod ui;

use crate::provider::anthropic::AnthropicLanguageModelProvider;
use crate::provider::bedrock::BedrockLanguageModelProvider;
use crate::provider::cloud::CloudLanguageModelProvider;
use crate::provider::copilot_chat::CopilotChatLanguageModelProvider;
use crate::provider::google::GoogleLanguageModelProvider;
use crate::provider::lmstudio::LmStudioLanguageModelProvider;
use crate::provider::mistral::MistralLanguageModelProvider;
use crate::provider::ollama::OllamaLanguageModelProvider;
use crate::provider::open_ai::OpenAiLanguageModelProvider;
use crate::provider::open_router::OpenRouterLanguageModelProvider;
pub use crate::settings::*;

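/// Initializes language model settings and registers every built-in
/// language model provider with the global `LanguageModelRegistry`.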
pub fn init(user_store: Entity<UserStore>, client: Arc<Client>, fs: Arc<dyn Fs>, cx: &mut App) {
    crate::settings::init(fs, cx);
    let registry = LanguageModelRegistry::global(cx);
    registry.update(cx, |registry, cx| {
        register_language_model_providers(registry, user_store, client, cx);
    });
}

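/// Registers all built-in providers, including the newly added OpenRouter provider.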
fn register_language_model_providers(
    registry: &mut LanguageModelRegistry,
    user_store: Entity<UserStore>,
    client: Arc<Client>,
    cx: &mut Context<LanguageModelRegistry>,
) {
    registry.register_provider(
        CloudLanguageModelProvider::new(user_store.clone(), client.clone(), cx),
        cx,
    );

    registry.register_provider(
        AnthropicLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
    registry.register_provider(
        OpenAiLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
    registry.register_provider(
        OllamaLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
    registry.register_provider(
        LmStudioLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
    registry.register_provider(
        DeepSeekLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
    registry.register_provider(
        GoogleLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
    registry.register_provider(
        MistralLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
    registry.register_provider(
        BedrockLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
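    // OpenRouter registration added in this PR.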
    registry.register_provider(
        OpenRouterLanguageModelProvider::new(client.http_client(), cx),
        cx,
    );
    registry.register_provider(CopilotChatLanguageModelProvider::new(cx), cx);
}