Compare commits


14 commits

Author SHA1 Message Date
Peter Tripp
fa815dbf70
ci: Disable FreeBSD builds (#34511)
Recently, FreeBSD zed-remote-server builds have been failing more than 90% of
the time for unknown reasons.

Temporarily suspend them.

Example failing builds:
- [2025-07-15 16:15 Nightly
Failure](https://github.com/zed-industries/zed/actions/runs/16302777887/job/46042358675)
- [2025-07-15 12:20 Nightly
Success](https://github.com/zed-industries/zed/actions/runs/16297907892/job/46025281518)
- [2025-07-14 08:21 Nightly
Failure](https://github.com/zed-industries/zed/actions/runs/16266193889/job/45923004940)
- [2025-06-17 Nightly
Failure](https://github.com/zed-industries/zed/actions/runs/15700462603/job/44234573761)

Release Notes:

- Temporarily disable FreeBSD zed-remote-server builds due to CI failures.
2025-07-15 21:25:34 -04:00
Zed Bot
a394df5c0c Bump to 0.195.2 for @ConradIrwin 2025-07-15 20:00:15 +00:00
Conrad Irwin
76d78e8a14 Add zed://extension/{id} links (#34492)
Release Notes:

- Add zed://extension/{id} links to open the extensions UI with a
specific extension
2025-07-15 13:54:46 -06:00
Umesh Yadav
3a7871d248
Add xAI language model provider (#33593)
Closes #30010

Release Notes:

- Add support for xAI language model provider
2025-07-15 15:47:53 -04:00
Cole Miller
7f2283749b
Remove auto-width editor type (#34438)
Closes #34044

`EditorMode::SingleLine { auto_width: true }` was only used for the
title editor in the rules library, and following
https://github.com/zed-industries/zed/pull/31994 we can replace that
with a normal single-line editor without problems. The auto-width editor
was interacting badly with the recently added newline visualization
code, causing a panic during layout; by switching it to
`Editor::single_line`, the newline visualization works there too.

Release Notes:

- Fixed a panic that could occur when opening the rules library.

---------

Co-authored-by: Finn <finn@zed.dev>
2025-07-15 14:57:56 -04:00
Zed Bot
2d724520bc Bump to 0.195.1 for @ConradIrwin 2025-07-14 19:06:38 +00:00
gcp-cherry-pick-bot[bot]
473062aeef
debugger: Fix endless restarts when connecting to TCP adapters over SSH (cherry-pick #34328) (#34343)
Cherry-picked debugger: Fix endless restarts when connecting to TCP
adapters over SSH (#34328)

Closes #34323
Closes #34313

The previous PR #33932 introduced a way to "close" the
`pending_requests` buffer of the `TransportDelegate`, preventing any
more requests from being added. This prevents pending requests from
accumulating without ever being drained during the shutdown sequence;
without it, some of our tests hang at this point (due to using a
single-threaded executor).

The bug occurred because we were closing `pending_requests` whenever we
detected the server side of the transport shut down, and this closed
state stuck around and interfered with the retry logic for SSH+TCP
adapter connections.

This PR fixes the bug by only closing `pending_requests` on session
shutdown, and adds a regression test covering the SSH retry logic.

Release Notes:

- debugger: Fixed a bug causing SSH connections to some adapters
(Python, Go, JavaScript) to fail and restart endlessly.

Co-authored-by: Cole Miller <cole@zed.dev>
2025-07-12 17:13:27 -04:00
gcp-cherry-pick-bot[bot]
612c9addff
Return back the guards when goto targets are queried for (cherry-pick #34340) (#34344) 2025-07-12 19:49:32 +03:00
gcp-cherry-pick-bot[bot]
19a60dbf9c
Fix bad kerning in integrated terminal (cherry-pick #34292) (#34298)
Cherry-picked Fix bad kerning in integrated terminal (#34292)

Closes #16869

Release Notes:

- (preview only): Fix bad kerning in integrated terminal.

Co-authored-by: Alisina Bahadori <alisina.bm@gmail.com>
2025-07-11 12:13:36 -06:00
gcp-cherry-pick-bot[bot]
acba38dabd
language_models: Refresh the list of models when the LLM token is refreshed (cherry-pick #34222) (#34294)
Cherry-picked language_models: Refresh the list of models when the LLM
token is refreshed (#34222)

This PR makes it so we refresh the list of models whenever the LLM token
is refreshed.

This allows us to add or remove models based on the plan in the new
token.

Release Notes:

- Fixed model list not refreshing when subscribing to Zed Pro.

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>

Co-authored-by: Marshall Bowers <git@maxdeviant.com>
Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-07-11 11:48:51 -04:00
gcp-cherry-pick-bot[bot]
c1b3111c15
vim: Fix panic when scrolling beyond last line (cherry-pick #34172) (#34174)
Cherry-picked vim: Fix panic when scrolling beyond last line (#34172)

cc @dinocosta

Release Notes:

- (preview only) vim: Fix panic when scrolling down at end of file

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-07-09 20:25:03 -06:00
localcc
623388ad80
Fix inno dir (#34116)
Fix inno dir for nightly builds

Release Notes:

- N/A
2025-07-09 14:45:49 -04:00
Max Brunsfeld
eb89e9a572
Don't upload windows installer to preview releases for now (#34147)
Release Notes:

- N/A
2025-07-09 14:45:46 -04:00
Peter Tripp
e306a55073
v0.195.x preview 2025-07-09 11:02:11 -04:00
42 changed files with 1450 additions and 293 deletions


@@ -686,8 +686,10 @@ jobs:
     timeout-minutes: 60
     runs-on: github-8vcpu-ubuntu-2404
     if: |
+      false && (
       startsWith(github.ref, 'refs/tags/v')
       || contains(github.event.pull_request.labels.*.name, 'run-bundling')
+      )
     needs: [linux_tests]
     name: Build Zed on FreeBSD
     # env:
@@ -800,7 +802,8 @@ jobs:
       - name: Upload Artifacts to release
         uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
-        if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) && env.RELEASE_CHANNEL == 'preview' }} # upload only preview
+        # Re-enable when we are ready to publish windows preview releases
+        if: false && ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) && env.RELEASE_CHANNEL == 'preview' }} # upload only preview
         with:
           draft: true
           prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
@@ -813,7 +816,7 @@ jobs:
     if: |
       startsWith(github.ref, 'refs/tags/v')
       && endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
-    needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64, bundle-windows-x64, freebsd]
+    needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64, bundle-windows-x64]
     runs-on:
       - self-hosted
       - bundle


@@ -195,7 +195,7 @@ jobs:
   freebsd:
     timeout-minutes: 60
-    if: github.repository_owner == 'zed-industries'
+    if: false && github.repository_owner == 'zed-industries'
     runs-on: github-8vcpu-ubuntu-2404
     needs: tests
     env:

Cargo.lock generated

@@ -3043,6 +3043,7 @@ dependencies = [
  "context_server",
  "ctor",
  "dap",
+ "dap-types",
  "dap_adapters",
  "dashmap 6.1.0",
  "debugger_ui",
@@ -9000,6 +9001,7 @@ dependencies = [
  "util",
  "vercel",
  "workspace-hack",
+ "x_ai",
  "zed_llm_client",
 ]
@@ -19731,6 +19733,17 @@ version = "0.13.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "ec107c4503ea0b4a98ef47356329af139c0a4f7750e621cf2973cd3385ebcb3d"
+[[package]]
+name = "x_ai"
+version = "0.1.0"
+dependencies = [
+ "anyhow",
+ "schemars",
+ "serde",
+ "strum 0.27.1",
+ "workspace-hack",
+]
+
 [[package]]
 name = "xattr"
 version = "0.2.3"
@@ -19972,7 +19985,7 @@ dependencies = [
 [[package]]
 name = "zed"
-version = "0.195.0"
+version = "0.195.2"
 dependencies = [
  "activity_indicator",
  "agent",


@@ -177,6 +177,7 @@ members = [
     "crates/welcome",
     "crates/workspace",
     "crates/worktree",
+    "crates/x_ai",
     "crates/zed",
     "crates/zed_actions",
     "crates/zeta",
@@ -390,6 +391,7 @@ web_search_providers = { path = "crates/web_search_providers" }
 welcome = { path = "crates/welcome" }
 workspace = { path = "crates/workspace" }
 worktree = { path = "crates/worktree" }
+x_ai = { path = "crates/x_ai" }
 zed = { path = "crates/zed" }
 zed_actions = { path = "crates/zed_actions" }
 zeta = { path = "crates/zeta" }

assets/icons/ai_x_ai.svg Normal file

@@ -0,0 +1,3 @@
+<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
+<path d="m12.414 5.47.27 9.641h2.157l.27-13.15zM15.11.889h-3.293L6.651 7.613l1.647 2.142zM.889 15.11H4.18l1.647-2.142-1.647-2.143zm0-9.641 7.409 9.641h3.292L4.181 5.47z" fill="#000"/>
+</svg>



@@ -491,6 +491,7 @@ impl AgentConfiguration {
                 category_filter: Some(
                     ExtensionCategoryFilter::ContextServers,
                 ),
+                id: None,
             }
             .boxed_clone(),
             cx,


@@ -1778,6 +1778,7 @@ impl AgentPanel {
                 category_filter: Some(
                     zed_actions::ExtensionCategoryFilter::ContextServers,
                 ),
+                id: None,
             }),
         )
         .action("Add Custom Server…", Box::new(AddContextServer))


@@ -94,6 +94,7 @@ context_server.workspace = true
 ctor.workspace = true
 dap = { workspace = true, features = ["test-support"] }
 dap_adapters = { workspace = true, features = ["test-support"] }
+dap-types.workspace = true
 debugger_ui = { workspace = true, features = ["test-support"] }
 editor = { workspace = true, features = ["test-support"] }
 extension.workspace = true


@@ -2,6 +2,7 @@ use crate::tests::TestServer;
 use call::ActiveCall;
 use collections::{HashMap, HashSet};
+use dap::{Capabilities, adapters::DebugTaskDefinition, transport::RequestHandling};
 use debugger_ui::debugger_panel::DebugPanel;
 use extension::ExtensionHostProxy;
 use fs::{FakeFs, Fs as _, RemoveOptions};
@@ -22,6 +23,7 @@ use language::{
 use node_runtime::NodeRuntime;
 use project::{
     ProjectPath,
+    debugger::session::ThreadId,
     lsp_store::{FormatTrigger, LspFormatTarget},
 };
 use remote::SshRemoteClient;
@@ -29,7 +31,11 @@ use remote_server::{HeadlessAppState, HeadlessProject};
 use rpc::proto;
 use serde_json::json;
 use settings::SettingsStore;
-use std::{path::Path, sync::Arc};
+use std::{
+    path::Path,
+    sync::{Arc, atomic::AtomicUsize},
+};
+use task::TcpArgumentsTemplate;
 use util::path;

 #[gpui::test(iterations = 10)]
@@ -688,3 +694,162 @@ async fn test_remote_server_debugger(
     shutdown_session.await.unwrap();
 }
+
+#[gpui::test]
+async fn test_slow_adapter_startup_retries(
+    cx_a: &mut TestAppContext,
+    server_cx: &mut TestAppContext,
+    executor: BackgroundExecutor,
+) {
+    cx_a.update(|cx| {
+        release_channel::init(SemanticVersion::default(), cx);
+        command_palette_hooks::init(cx);
+        zlog::init_test();
+        dap_adapters::init(cx);
+    });
+    server_cx.update(|cx| {
+        release_channel::init(SemanticVersion::default(), cx);
+        dap_adapters::init(cx);
+    });
+    let (opts, server_ssh) = SshRemoteClient::fake_server(cx_a, server_cx);
+    let remote_fs = FakeFs::new(server_cx.executor());
+    remote_fs
+        .insert_tree(
+            path!("/code"),
+            json!({
+                "lib.rs": "fn one() -> usize { 1 }"
+            }),
+        )
+        .await;
+
+    // User A connects to the remote project via SSH.
+    server_cx.update(HeadlessProject::init);
+    let remote_http_client = Arc::new(BlockedHttpClient);
+    let node = NodeRuntime::unavailable();
+    let languages = Arc::new(LanguageRegistry::new(server_cx.executor()));
+    let _headless_project = server_cx.new(|cx| {
+        client::init_settings(cx);
+        HeadlessProject::new(
+            HeadlessAppState {
+                session: server_ssh,
+                fs: remote_fs.clone(),
+                http_client: remote_http_client,
+                node_runtime: node,
+                languages,
+                extension_host_proxy: Arc::new(ExtensionHostProxy::new()),
+            },
+            cx,
+        )
+    });
+
+    let client_ssh = SshRemoteClient::fake_client(opts, cx_a).await;
+    let mut server = TestServer::start(server_cx.executor()).await;
+    let client_a = server.create_client(cx_a, "user_a").await;
+    cx_a.update(|cx| {
+        debugger_ui::init(cx);
+        command_palette_hooks::init(cx);
+    });
+    let (project_a, _) = client_a
+        .build_ssh_project(path!("/code"), client_ssh.clone(), cx_a)
+        .await;
+
+    let (workspace, cx_a) = client_a.build_workspace(&project_a, cx_a);
+    let debugger_panel = workspace
+        .update_in(cx_a, |_workspace, window, cx| {
+            cx.spawn_in(window, DebugPanel::load)
+        })
+        .await
+        .unwrap();
+    workspace.update_in(cx_a, |workspace, window, cx| {
+        workspace.add_panel(debugger_panel, window, cx);
+    });
+
+    cx_a.run_until_parked();
+    let debug_panel = workspace
+        .update(cx_a, |workspace, cx| workspace.panel::<DebugPanel>(cx))
+        .unwrap();
+    let workspace_window = cx_a
+        .window_handle()
+        .downcast::<workspace::Workspace>()
+        .unwrap();
+
+    let count = Arc::new(AtomicUsize::new(0));
+    let session = debugger_ui::tests::start_debug_session_with(
+        &workspace_window,
+        cx_a,
+        DebugTaskDefinition {
+            adapter: "fake-adapter".into(),
+            label: "test".into(),
+            config: json!({
+                "request": "launch"
+            }),
+            tcp_connection: Some(TcpArgumentsTemplate {
+                port: None,
+                host: None,
+                timeout: None,
+            }),
+        },
+        move |client| {
+            let count = count.clone();
+            client.on_request_ext::<dap::requests::Initialize, _>(move |_seq, _request| {
+                if count.fetch_add(1, std::sync::atomic::Ordering::SeqCst) < 5 {
+                    return RequestHandling::Exit;
+                }
+                RequestHandling::Respond(Ok(Capabilities::default()))
+            });
+        },
+    )
+    .unwrap();
+    cx_a.run_until_parked();
+
+    let client = session.update(cx_a, |session, _| session.adapter_client().unwrap());
+    client
+        .fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
+            reason: dap::StoppedEventReason::Pause,
+            description: None,
+            thread_id: Some(1),
+            preserve_focus_hint: None,
+            text: None,
+            all_threads_stopped: None,
+            hit_breakpoint_ids: None,
+        }))
+        .await;
+    cx_a.run_until_parked();
+
+    let active_session = debug_panel
+        .update(cx_a, |this, _| this.active_session())
+        .unwrap();
+    let running_state = active_session.update(cx_a, |active_session, _| {
+        active_session.running_state().clone()
+    });
+    assert_eq!(
+        client.id(),
+        running_state.read_with(cx_a, |running_state, _| running_state.session_id())
+    );
+    assert_eq!(
+        ThreadId(1),
+        running_state.read_with(cx_a, |running_state, _| running_state
+            .selected_thread_id()
+            .unwrap())
+    );
+
+    let shutdown_session = workspace.update(cx_a, |workspace, cx| {
+        workspace.project().update(cx, |project, cx| {
+            project.dap_store().update(cx, |dap_store, cx| {
+                dap_store.shutdown_session(session.read(cx).session_id(), cx)
+            })
+        })
+    });
+    client_ssh.update(cx_a, |a, _| {
+        a.shutdown_processes(Some(proto::ShutdownRemoteServer {}), executor)
+    });
+
+    shutdown_session.await.unwrap();
+}


@@ -442,10 +442,18 @@ impl DebugAdapter for FakeAdapter {
         _: Option<Vec<String>>,
         _: &mut AsyncApp,
     ) -> Result<DebugAdapterBinary> {
+        let connection = task_definition
+            .tcp_connection
+            .as_ref()
+            .map(|connection| TcpArguments {
+                host: connection.host(),
+                port: connection.port.unwrap_or(17),
+                timeout: connection.timeout,
+            });
         Ok(DebugAdapterBinary {
             command: Some("command".into()),
             arguments: vec![],
-            connection: None,
+            connection,
             envs: HashMap::default(),
             cwd: None,
             request_args: StartDebuggingRequestArguments {


@@ -2,7 +2,7 @@ use crate::{
     adapters::DebugAdapterBinary,
     transport::{IoKind, LogKind, TransportDelegate},
 };
-use anyhow::{Context as _, Result};
+use anyhow::Result;
 use dap_types::{
     messages::{Message, Response},
     requests::Request,
@@ -110,9 +110,7 @@ impl DebugAdapterClient {
         self.transport_delegate
             .pending_requests
             .lock()
-            .as_mut()
-            .context("client is closed")?
-            .insert(sequence_id, callback_tx);
+            .insert(sequence_id, callback_tx)?;

         log::debug!(
             "Client {} send `{}` request with sequence_id: {}",
@@ -170,6 +168,7 @@ impl DebugAdapterClient {
     pub fn kill(&self) {
         log::debug!("Killing DAP process");
         self.transport_delegate.transport.lock().kill();
+        self.transport_delegate.pending_requests.lock().shutdown();
     }

     pub fn has_adapter_logs(&self) -> bool {
@@ -184,11 +183,34 @@ impl DebugAdapterClient {
     }

     #[cfg(any(test, feature = "test-support"))]
-    pub fn on_request<R: dap_types::requests::Request, F>(&self, handler: F)
+    pub fn on_request<R: dap_types::requests::Request, F>(&self, mut handler: F)
     where
         F: 'static
             + Send
             + FnMut(u64, R::Arguments) -> Result<R::Response, dap_types::ErrorResponse>,
+    {
+        use crate::transport::RequestHandling;
+        self.transport_delegate
+            .transport
+            .lock()
+            .as_fake()
+            .on_request::<R, _>(move |seq, request| {
+                RequestHandling::Respond(handler(seq, request))
+            });
+    }
+
+    #[cfg(any(test, feature = "test-support"))]
+    pub fn on_request_ext<R: dap_types::requests::Request, F>(&self, handler: F)
+    where
+        F: 'static
+            + Send
+            + FnMut(
+                u64,
+                R::Arguments,
+            ) -> crate::transport::RequestHandling<
+                Result<R::Response, dap_types::ErrorResponse>,
+            >,
     {
         self.transport_delegate
             .transport

@@ -49,6 +49,12 @@ pub enum IoKind {
     StdErr,
 }

+#[cfg(any(test, feature = "test-support"))]
+pub enum RequestHandling<T> {
+    Respond(T),
+    Exit,
+}
+
 type LogHandlers = Arc<Mutex<SmallVec<[(LogKind, IoHandler); 2]>>>;

 pub trait Transport: Send + Sync {
@@ -76,7 +82,11 @@ async fn start(
 ) -> Result<Box<dyn Transport>> {
     #[cfg(any(test, feature = "test-support"))]
     if cfg!(any(test, feature = "test-support")) {
-        return Ok(Box::new(FakeTransport::start(cx).await?));
+        if let Some(connection) = binary.connection.clone() {
+            return Ok(Box::new(FakeTransport::start_tcp(connection, cx).await?));
+        } else {
+            return Ok(Box::new(FakeTransport::start_stdio(cx).await?));
+        }
     }

     if binary.connection.is_some() {
@@ -90,11 +100,57 @@ async fn start(
     }
 }

+pub(crate) struct PendingRequests {
+    inner: Option<HashMap<u64, oneshot::Sender<Result<Response>>>>,
+}
+
+impl PendingRequests {
+    fn new() -> Self {
+        Self {
+            inner: Some(HashMap::default()),
+        }
+    }
+
+    fn flush(&mut self, e: anyhow::Error) {
+        let Some(inner) = self.inner.as_mut() else {
+            return;
+        };
+        for (_, sender) in inner.drain() {
+            sender.send(Err(e.cloned())).ok();
+        }
+    }
+
+    pub(crate) fn insert(
+        &mut self,
+        sequence_id: u64,
+        callback_tx: oneshot::Sender<Result<Response>>,
+    ) -> anyhow::Result<()> {
+        let Some(inner) = self.inner.as_mut() else {
+            bail!("client is closed")
+        };
+        inner.insert(sequence_id, callback_tx);
+        Ok(())
+    }
+
+    pub(crate) fn remove(
+        &mut self,
+        sequence_id: u64,
+    ) -> anyhow::Result<Option<oneshot::Sender<Result<Response>>>> {
+        let Some(inner) = self.inner.as_mut() else {
+            bail!("client is closed");
+        };
+        Ok(inner.remove(&sequence_id))
+    }
+
+    pub(crate) fn shutdown(&mut self) {
+        self.flush(anyhow!("transport shutdown"));
+        self.inner = None;
+    }
+}
+
 pub(crate) struct TransportDelegate {
     log_handlers: LogHandlers,
-    // TODO this should really be some kind of associative channel
-    pub(crate) pending_requests:
-        Arc<Mutex<Option<HashMap<u64, oneshot::Sender<Result<Response>>>>>>,
+    pub(crate) pending_requests: Arc<Mutex<PendingRequests>>,
     pub(crate) transport: Mutex<Box<dyn Transport>>,
     pub(crate) server_tx: smol::lock::Mutex<Option<Sender<Message>>>,
     tasks: Mutex<Vec<Task<()>>>,
@@ -108,7 +164,7 @@ impl TransportDelegate {
             transport: Mutex::new(transport),
             log_handlers,
             server_tx: Default::default(),
-            pending_requests: Arc::new(Mutex::new(Some(HashMap::default()))),
+            pending_requests: Arc::new(Mutex::new(PendingRequests::new())),
             tasks: Default::default(),
         })
     }
@@ -151,24 +207,10 @@ impl TransportDelegate {
                 Ok(()) => {
                     pending_requests
                         .lock()
-                        .take()
-                        .into_iter()
-                        .flatten()
-                        .for_each(|(_, request)| {
-                            request
-                                .send(Err(anyhow!("debugger shutdown unexpectedly")))
-                                .ok();
-                        });
+                        .flush(anyhow!("debugger shutdown unexpectedly"));
                 }
                 Err(e) => {
-                    pending_requests
-                        .lock()
-                        .take()
-                        .into_iter()
-                        .flatten()
-                        .for_each(|(_, request)| {
-                            request.send(Err(e.cloned())).ok();
-                        });
+                    pending_requests.lock().flush(e);
                 }
             }
         }));
@@ -286,7 +328,7 @@ impl TransportDelegate {
     async fn recv_from_server<Stdout>(
         server_stdout: Stdout,
         mut message_handler: DapMessageHandler,
-        pending_requests: Arc<Mutex<Option<HashMap<u64, oneshot::Sender<Result<Response>>>>>>,
+        pending_requests: Arc<Mutex<PendingRequests>>,
         log_handlers: Option<LogHandlers>,
     ) -> Result<()>
     where
@@ -303,14 +345,10 @@ impl TransportDelegate {
             ConnectionResult::Timeout => anyhow::bail!("Timed out when connecting to debugger"),
             ConnectionResult::ConnectionReset => {
                 log::info!("Debugger closed the connection");
-                break Ok(());
+                return Ok(());
             }
             ConnectionResult::Result(Ok(Message::Response(res))) => {
-                let tx = pending_requests
-                    .lock()
-                    .as_mut()
-                    .context("client is closed")?
-                    .remove(&res.request_seq);
+                let tx = pending_requests.lock().remove(res.request_seq)?;
                 if let Some(tx) = tx {
                     if let Err(e) = tx.send(Self::process_response(res)) {
                         log::trace!("Did not send response `{:?}` for a cancelled", e);
@@ -704,8 +742,7 @@ impl Drop for StdioTransport {
 }

 #[cfg(any(test, feature = "test-support"))]
-type RequestHandler =
-    Box<dyn Send + FnMut(u64, serde_json::Value) -> dap_types::messages::Response>;
+type RequestHandler = Box<dyn Send + FnMut(u64, serde_json::Value) -> RequestHandling<Response>>;

 #[cfg(any(test, feature = "test-support"))]
 type ResponseHandler = Box<dyn Send + Fn(Response)>;
@@ -716,23 +753,38 @@ pub struct FakeTransport {
     request_handlers: Arc<Mutex<HashMap<&'static str, RequestHandler>>>,
     // for reverse request responses
     response_handlers: Arc<Mutex<HashMap<&'static str, ResponseHandler>>>,
-    stdin_writer: Option<PipeWriter>,
-    stdout_reader: Option<PipeReader>,
     message_handler: Option<Task<Result<()>>>,
+    kind: FakeTransportKind,
+}
+
+#[cfg(any(test, feature = "test-support"))]
+pub enum FakeTransportKind {
+    Stdio {
+        stdin_writer: Option<PipeWriter>,
+        stdout_reader: Option<PipeReader>,
+    },
+    Tcp {
+        connection: TcpArguments,
+        executor: BackgroundExecutor,
+    },
 }

 #[cfg(any(test, feature = "test-support"))]
 impl FakeTransport {
     pub fn on_request<R: dap_types::requests::Request, F>(&self, mut handler: F)
     where
-        F: 'static + Send + FnMut(u64, R::Arguments) -> Result<R::Response, ErrorResponse>,
+        F: 'static
+            + Send
+            + FnMut(u64, R::Arguments) -> RequestHandling<Result<R::Response, ErrorResponse>>,
     {
         self.request_handlers.lock().insert(
             R::COMMAND,
             Box::new(move |seq, args| {
                 let result = handler(seq, serde_json::from_value(args).unwrap());
-                let response = match result {
+                let RequestHandling::Respond(response) = result else {
+                    return RequestHandling::Exit;
+                };
+                let response = match response {
                     Ok(response) => Response {
                         seq: seq + 1,
                         request_seq: seq,
@@ -750,7 +802,7 @@ impl FakeTransport {
                         message: None,
                     },
                 };
-                response
+                RequestHandling::Respond(response)
             }),
         );
     }
@@ -764,86 +816,75 @@ impl FakeTransport {
             .insert(R::COMMAND, Box::new(handler));
     }

-    async fn start(cx: &mut AsyncApp) -> Result<Self> {
-        use dap_types::requests::{Request, RunInTerminal, StartDebugging};
-        use serde_json::json;
-
-        let (stdin_writer, stdin_reader) = async_pipe::pipe();
-        let (stdout_writer, stdout_reader) = async_pipe::pipe();
-
-        let mut this = Self {
-            request_handlers: Arc::new(Mutex::new(HashMap::default())),
-            response_handlers: Arc::new(Mutex::new(HashMap::default())),
-            stdin_writer: Some(stdin_writer),
-            stdout_reader: Some(stdout_reader),
-            message_handler: None,
-        };
-
-        let request_handlers = this.request_handlers.clone();
-        let response_handlers = this.response_handlers.clone();
-        let stdout_writer = Arc::new(smol::lock::Mutex::new(stdout_writer));
-
-        this.message_handler = Some(cx.background_spawn(async move {
-            let mut reader = BufReader::new(stdin_reader);
-            let mut buffer = String::new();
-
-            loop {
-                match TransportDelegate::receive_server_message(&mut reader, &mut buffer, None)
-                    .await
-                {
-                    ConnectionResult::Timeout => {
-                        anyhow::bail!("Timed out when connecting to debugger");
-                    }
-                    ConnectionResult::ConnectionReset => {
-                        log::info!("Debugger closed the connection");
-                        break Ok(());
-                    }
-                    ConnectionResult::Result(Err(e)) => break Err(e),
-                    ConnectionResult::Result(Ok(message)) => {
-                        match message {
-                            Message::Request(request) => {
-                                // redirect reverse requests to stdout writer/reader
-                                if request.command == RunInTerminal::COMMAND
-                                    || request.command == StartDebugging::COMMAND
-                                {
-                                    let message =
-                                        serde_json::to_string(&Message::Request(request)).unwrap();
-                                    let mut writer = stdout_writer.lock().await;
-                                    writer
-                                        .write_all(
-                                            TransportDelegate::build_rpc_message(message)
-                                                .as_bytes(),
-                                        )
-                                        .await
-                                        .unwrap();
-                                    writer.flush().await.unwrap();
-                                } else {
-                                    let response = if let Some(handle) =
-                                        request_handlers.lock().get_mut(request.command.as_str())
-                                    {
-                                        handle(request.seq, request.arguments.unwrap_or(json!({})))
-                                    } else {
-                                        panic!("No request handler for {}", request.command);
-                                    };
-                                    let message =
-                                        serde_json::to_string(&Message::Response(response))
-                                            .unwrap();
-                                    let mut writer = stdout_writer.lock().await;
-                                    writer
-                                        .write_all(
-                                            TransportDelegate::build_rpc_message(message)
-                                                .as_bytes(),
-                                        )
-                                        .await
-                                        .unwrap();
-                                    writer.flush().await.unwrap();
-                                }
-                            }
-                            Message::Event(event) => {
-                                let message =
-                                    serde_json::to_string(&Message::Event(event)).unwrap();
+    async fn start_tcp(connection: TcpArguments, cx: &mut AsyncApp) -> Result<Self> {
+        Ok(Self {
+            request_handlers: Arc::new(Mutex::new(HashMap::default())),
+            response_handlers: Arc::new(Mutex::new(HashMap::default())),
+            message_handler: None,
+            kind: FakeTransportKind::Tcp {
+                connection,
+                executor: cx.background_executor().clone(),
+            },
+        })
+    }
+
+    async fn handle_messages(
+        request_handlers: Arc<Mutex<HashMap<&'static str, RequestHandler>>>,
+        response_handlers: Arc<Mutex<HashMap<&'static str, ResponseHandler>>>,
+        stdin_reader: PipeReader,
+        stdout_writer: PipeWriter,
+    ) -> Result<()> {
+        use dap_types::requests::{Request, RunInTerminal, StartDebugging};
+        use serde_json::json;
+
+        let mut reader = BufReader::new(stdin_reader);
+        let stdout_writer = Arc::new(smol::lock::Mutex::new(stdout_writer));
+        let mut buffer = String::new();
+
+        loop {
+            match TransportDelegate::receive_server_message(&mut reader, &mut buffer, None).await {
+                ConnectionResult::Timeout => {
+                    anyhow::bail!("Timed out when connecting to debugger");
+                }
+                ConnectionResult::ConnectionReset => {
+                    log::info!("Debugger closed the connection");
+                    break Ok(());
+                }
+                ConnectionResult::Result(Err(e)) => break Err(e),
+                ConnectionResult::Result(Ok(message)) => {
+                    match message {
+                        Message::Request(request) => {
+                            // redirect reverse requests to stdout writer/reader
+                            if request.command == RunInTerminal::COMMAND
+                                || request.command == StartDebugging::COMMAND
+                            {
+                                let message =
+                                    serde_json::to_string(&Message::Request(request)).unwrap();
+                                let mut writer = stdout_writer.lock().await;
+                                writer
+                                    .write_all(
+                                        TransportDelegate::build_rpc_message(message).as_bytes(),
+                                    )
+                                    .await
+                                    .unwrap();
+                                writer.flush().await.unwrap();
+                            } else {
+                                let response = if let Some(handle) =
+                                    request_handlers.lock().get_mut(request.command.as_str())
+                                {
+                                    handle(request.seq, request.arguments.unwrap_or(json!({})))
+                                } else {
+                                    panic!("No request handler for {}", request.command);
+                                };
+                                let response = match response {
+                                    RequestHandling::Respond(response) => response,
+                                    RequestHandling::Exit => {
+                                        break Err(anyhow!("exit in response to request"));
+                                    }
+                                };
+                                let message =
+                                    serde_json::to_string(&Message::Response(response)).unwrap();
+                                let mut writer = stdout_writer.lock().await;
+                                writer
@ -854,20 +895,56 @@ impl FakeTransport {
.unwrap(); .unwrap();
writer.flush().await.unwrap(); writer.flush().await.unwrap();
} }
Message::Response(response) => { }
if let Some(handle) = Message::Event(event) => {
response_handlers.lock().get(response.command.as_str()) let message = serde_json::to_string(&Message::Event(event)).unwrap();
{
handle(response); let mut writer = stdout_writer.lock().await;
} else { writer
log::error!("No response handler for {}", response.command); .write_all(TransportDelegate::build_rpc_message(message).as_bytes())
} .await
.unwrap();
writer.flush().await.unwrap();
}
Message::Response(response) => {
if let Some(handle) =
response_handlers.lock().get(response.command.as_str())
{
handle(response);
} else {
log::error!("No response handler for {}", response.command);
} }
} }
} }
} }
} }
})); }
}
async fn start_stdio(cx: &mut AsyncApp) -> Result<Self> {
let (stdin_writer, stdin_reader) = async_pipe::pipe();
let (stdout_writer, stdout_reader) = async_pipe::pipe();
let kind = FakeTransportKind::Stdio {
stdin_writer: Some(stdin_writer),
stdout_reader: Some(stdout_reader),
};
let mut this = Self {
request_handlers: Arc::new(Mutex::new(HashMap::default())),
response_handlers: Arc::new(Mutex::new(HashMap::default())),
message_handler: None,
kind,
};
let request_handlers = this.request_handlers.clone();
let response_handlers = this.response_handlers.clone();
this.message_handler = Some(cx.background_spawn(Self::handle_messages(
request_handlers,
response_handlers,
stdin_reader,
stdout_writer,
)));
Ok(this)
}
@ -876,7 +953,10 @@ impl FakeTransport {
#[cfg(any(test, feature = "test-support"))]
impl Transport for FakeTransport {
fn tcp_arguments(&self) -> Option<TcpArguments> {
match &self.kind {
FakeTransportKind::Stdio { .. } => None,
FakeTransportKind::Tcp { connection, .. } => Some(connection.clone()),
}
}
fn connect(
@ -887,12 +967,33 @@ impl Transport for FakeTransport {
Box<dyn AsyncRead + Unpin + Send + 'static>,
)>,
> {
let result = match &mut self.kind {
FakeTransportKind::Stdio {
stdin_writer,
stdout_reader,
} => util::maybe!({
Ok((
Box::new(stdin_writer.take().context("Cannot reconnect")?) as _,
Box::new(stdout_reader.take().context("Cannot reconnect")?) as _,
))
}),
FakeTransportKind::Tcp { executor, .. } => {
let (stdin_writer, stdin_reader) = async_pipe::pipe();
let (stdout_writer, stdout_reader) = async_pipe::pipe();
let request_handlers = self.request_handlers.clone();
let response_handlers = self.response_handlers.clone();
self.message_handler = Some(executor.spawn(Self::handle_messages(
request_handlers,
response_handlers,
stdin_reader,
stdout_writer,
)));
Ok((Box::new(stdin_writer) as _, Box::new(stdout_reader) as _))
}
};
Task::ready(result)
}
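Both the Stdio and Tcp arms above build on the same idea: two unidirectional pipes wired in opposite directions, so the client half holds `(stdin_writer, stdout_reader)` while the fake adapter half holds `(stdin_reader, stdout_writer)`. A minimal sketch of that pairing, with std channels standing in for `async_pipe::pipe()` (the helper and type aliases below are illustrative, not Zed's API):

```rust
use std::sync::mpsc::{Receiver, Sender, channel};

// Client half: writes requests, reads responses.
type ClientHalf = (Sender<String>, Receiver<String>);
// Fake-adapter half: reads requests, writes responses.
type AdapterHalf = (Receiver<String>, Sender<String>);

fn fake_transport_pipes() -> (ClientHalf, AdapterHalf) {
    // Two one-way pipes, crossed so each side writes into the other's reader.
    let (stdin_writer, stdin_reader) = channel();
    let (stdout_writer, stdout_reader) = channel();
    ((stdin_writer, stdout_reader), (stdin_reader, stdout_writer))
}
```

The `connect` implementation hands the client half back to the caller and spawns `handle_messages` on the adapter half, which is why the Tcp branch creates both pipes fresh on every connect.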


@ -1694,6 +1694,7 @@ impl Render for DebugPanel {
category_filter: Some(
zed_actions::ExtensionCategoryFilter::DebugAdapters,
),
id: None,
}
.boxed_clone(),
cx,


@ -122,7 +122,7 @@ impl DebugSession {
.to_owned()
}

pub fn running_state(&self) -> &Entity<RunningState> {
&self.running_state
}


@ -1459,7 +1459,7 @@ impl RunningState {
}
}

pub fn selected_thread_id(&self) -> Option<ThreadId> {
self.thread_id
}


@ -482,9 +482,7 @@ pub enum SelectMode {
#[derive(Clone, PartialEq, Eq, Debug)]
pub enum EditorMode {
SingleLine,
AutoHeight {
min_lines: usize,
max_lines: Option<usize>,
@ -1662,13 +1660,7 @@ impl Editor {
pub fn single_line(window: &mut Window, cx: &mut Context<Self>) -> Self {
let buffer = cx.new(|cx| Buffer::local("", cx));
let buffer = cx.new(|cx| MultiBuffer::singleton(buffer, cx));
Self::new(EditorMode::SingleLine, buffer, None, window, cx)
}
pub fn multi_line(window: &mut Window, cx: &mut Context<Self>) -> Self {
@ -1677,18 +1669,6 @@ impl Editor {
Self::new(EditorMode::full(), buffer, None, window, cx)
}

pub fn auto_height(
min_lines: usize,
max_lines: usize,


@ -7777,46 +7777,13 @@ impl Element for EditorElement {
editor.set_style(self.style.clone(), window, cx);

let layout_id = match editor.mode {
EditorMode::SingleLine => {
let rem_size = window.rem_size();
let height = self.style.text.line_height_in_pixels(rem_size);
let mut style = Style::default();
style.size.height = height.into();
style.size.width = relative(1.).into();
window.request_layout(style, None, cx)
}
EditorMode::AutoHeight {
min_lines,
@ -10388,7 +10355,7 @@ mod tests {
});

for editor_mode_without_invisibles in [
EditorMode::SingleLine,
EditorMode::AutoHeight {
min_lines: 1,
max_lines: Some(100),


@ -6,6 +6,7 @@ use std::sync::OnceLock;
use std::time::Duration;
use std::{ops::Range, sync::Arc};

use anyhow::Context as _;
use client::{ExtensionMetadata, ExtensionProvides};
use collections::{BTreeMap, BTreeSet};
use editor::{Editor, EditorElement, EditorStyle};
@ -80,16 +81,24 @@ pub fn init(cx: &mut App) {
.find_map(|item| item.downcast::<ExtensionsPage>());

if let Some(existing) = existing {
existing.update(cx, |extensions_page, cx| {
if provides_filter.is_some() {
extensions_page.change_provides_filter(provides_filter, cx);
}
if let Some(id) = action.id.as_ref() {
extensions_page.focus_extension(id, window, cx);
}
});
workspace.activate_item(&existing, true, true, window, cx);
} else {
let extensions_page = ExtensionsPage::new(
workspace,
provides_filter,
action.id.as_deref(),
window,
cx,
);
workspace.add_item_to_active_pane(
Box::new(extensions_page),
None,
@ -287,6 +296,7 @@ impl ExtensionsPage {
pub fn new(
workspace: &Workspace,
provides_filter: Option<ExtensionProvides>,
focus_extension_id: Option<&str>,
window: &mut Window,
cx: &mut Context<Workspace>,
) -> Entity<Self> {
@ -317,6 +327,9 @@ impl ExtensionsPage {
let query_editor = cx.new(|cx| {
let mut input = Editor::single_line(window, cx);
input.set_placeholder_text("Search extensions...", cx);
if let Some(id) = focus_extension_id {
input.set_text(format!("id:{id}"), window, cx);
}
input
});
cx.subscribe(&query_editor, Self::on_query_change).detach();
@ -340,7 +353,7 @@ impl ExtensionsPage {
scrollbar_state: ScrollbarState::new(scroll_handle),
};
this.fetch_extensions(
this.search_query(cx),
Some(BTreeSet::from_iter(this.provides_filter)),
None,
cx,
@ -464,9 +477,23 @@ impl ExtensionsPage {
.cloned()
.collect::<Vec<_>>();

let remote_extensions =
if let Some(id) = search.as_ref().and_then(|s| s.strip_prefix("id:")) {
let versions =
extension_store.update(cx, |store, cx| store.fetch_extension_versions(id, cx));
cx.foreground_executor().spawn(async move {
let versions = versions.await?;
let latest = versions
.into_iter()
.max_by_key(|v| v.published_at)
.context("no extension found")?;
Ok(vec![latest])
})
} else {
extension_store.update(cx, |store, cx| {
store.fetch_extensions(search.as_deref(), provides_filter.as_ref(), cx)
})
};
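The branch above keys off an `id:` prefix in the search string: `id:<extension-id>` fetches that one extension's versions and keeps only the latest, while anything else goes through the regular full-text search. A hedged sketch of just that dispatch (the enum and helper names are hypothetical, not part of the diff):

```rust
// Illustrative model of the `id:` query convention used by the extensions page.
#[derive(Debug, PartialEq)]
enum ExtensionsQuery<'a> {
    ById(&'a str),
    Text(&'a str),
}

fn parse_extensions_query(query: &str) -> ExtensionsQuery<'_> {
    match query.strip_prefix("id:") {
        // `strip_prefix` returns the remainder after the prefix, i.e. the id.
        Some(id) => ExtensionsQuery::ById(id),
        None => ExtensionsQuery::Text(query),
    }
}
```

This is also why `focus_extension` can work by simply setting the query editor's text to `id:{id}` and re-running the search.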
cx.spawn(async move |this, cx| {
let dev_extensions = if let Some(search) = search {
@ -1156,6 +1183,13 @@ impl ExtensionsPage {
self.refresh_feature_upsells(cx);
}

pub fn focus_extension(&mut self, id: &str, window: &mut Window, cx: &mut Context<Self>) {
self.query_editor.update(cx, |editor, cx| {
editor.set_text(format!("id:{id}"), window, cx)
});
self.refresh_search(cx);
}

pub fn change_provides_filter(
&mut self,
provides_filter: Option<ExtensionProvides>,


@ -20,6 +20,7 @@ pub enum IconName {
AiOpenAi,
AiOpenRouter,
AiVZero,
AiXAi,
AiZed,
ArrowCircle,
ArrowDown,


@ -44,6 +44,7 @@ ollama = { workspace = true, features = ["schemars"] }
open_ai = { workspace = true, features = ["schemars"] }
open_router = { workspace = true, features = ["schemars"] }
vercel = { workspace = true, features = ["schemars"] }
x_ai = { workspace = true, features = ["schemars"] }
partial-json-fixer.workspace = true
proto.workspace = true
release_channel.workspace = true


@ -20,6 +20,7 @@ use crate::provider::ollama::OllamaLanguageModelProvider;
use crate::provider::open_ai::OpenAiLanguageModelProvider;
use crate::provider::open_router::OpenRouterLanguageModelProvider;
use crate::provider::vercel::VercelLanguageModelProvider;
use crate::provider::x_ai::XAiLanguageModelProvider;
pub use crate::settings::*;

pub fn init(user_store: Entity<UserStore>, client: Arc<Client>, cx: &mut App) {
@ -81,5 +82,6 @@ fn register_language_model_providers(
VercelLanguageModelProvider::new(client.http_client(), cx),
cx,
);
registry.register_provider(XAiLanguageModelProvider::new(client.http_client(), cx), cx);
registry.register_provider(CopilotChatLanguageModelProvider::new(cx), cx);
}


@ -10,3 +10,4 @@ pub mod ollama;
pub mod open_ai;
pub mod open_router;
pub mod vercel;
pub mod x_ai;


@ -166,46 +166,9 @@ impl State {
}

let response = Self::fetch_models(client, llm_api_token, use_cloud).await?;
this.update(cx, |this, cx| {
this.update_models(response, cx);
})
})
.await
.context("failed to fetch Zed models")
@ -216,12 +179,15 @@ impl State {
}),
_llm_token_subscription: cx.subscribe(
&refresh_llm_token_listener,
move |this, _listener, _event, cx| {
let client = this.client.clone();
let llm_api_token = this.llm_api_token.clone();
cx.spawn(async move |this, cx| {
llm_api_token.refresh(&client).await?;
let response = Self::fetch_models(client, llm_api_token, use_cloud).await?;
this.update(cx, |this, cx| {
this.update_models(response, cx);
})
})
.detach_and_log_err(cx);
},
@ -264,6 +230,41 @@ impl State {
}));
}
fn update_models(&mut self, response: ListModelsResponse, cx: &mut Context<Self>) {
let mut models = Vec::new();
for model in response.models {
models.push(Arc::new(model.clone()));
// Right now we represent thinking variants of models as separate models on the client,
// so we need to insert variants for any model that supports thinking.
if model.supports_thinking {
models.push(Arc::new(zed_llm_client::LanguageModel {
id: zed_llm_client::LanguageModelId(format!("{}-thinking", model.id).into()),
display_name: format!("{} Thinking", model.display_name),
..model
}));
}
}
self.default_model = models
.iter()
.find(|model| model.id == response.default_model)
.cloned();
self.default_fast_model = models
.iter()
.find(|model| model.id == response.default_fast_model)
.cloned();
self.recommended_models = response
.recommended_models
.iter()
.filter_map(|id| models.iter().find(|model| &model.id == id))
.cloned()
.collect();
self.models = models;
cx.notify();
}
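The extracted `update_models` keeps the thinking-variant expansion from the old inline closure. A standalone sketch of just that expansion, with a plain struct standing in for `zed_llm_client::LanguageModel`:

```rust
#[derive(Clone, Debug, PartialEq)]
struct Model {
    id: String,
    display_name: String,
    supports_thinking: bool,
}

// Every model that supports thinking is mirrored as a separate
// "<id>-thinking" entry, since the client represents thinking variants
// as distinct models.
fn expand_thinking_variants(models: Vec<Model>) -> Vec<Model> {
    let mut out = Vec::new();
    for model in models {
        out.push(model.clone());
        if model.supports_thinking {
            out.push(Model {
                id: format!("{}-thinking", model.id),
                display_name: format!("{} Thinking", model.display_name),
                ..model
            });
        }
    }
    out
}
```

Pulling this into a named method is what lets both the initial fetch and the token-refresh subscription share one code path.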
async fn fetch_models(
client: Arc<Client>,
llm_api_token: LlmApiToken,


@ -376,7 +376,7 @@ impl LanguageModel for OpenRouterLanguageModel {
fn tool_input_format(&self) -> LanguageModelToolSchemaFormat {
let model_id = self.model.id().trim().to_lowercase();
if model_id.contains("gemini") || model_id.contains("grok-4") {
LanguageModelToolSchemaFormat::JsonSchemaSubset
} else {
LanguageModelToolSchemaFormat::JsonSchema
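The OpenRouter change above adds Grok 4 to the set of models that only accept a subset of JSON Schema for tool definitions, alongside Gemini. A minimal sketch of that selection rule (the enum and function here are illustrative stand-ins for the `LanguageModelToolSchemaFormat` logic):

```rust
#[derive(Debug, PartialEq)]
enum SchemaFormat {
    JsonSchema,
    JsonSchemaSubset,
}

// Substring match on the normalized model id, mirroring the diff:
// Gemini and Grok 4 models get the restricted schema format.
fn tool_schema_format(model_id: &str) -> SchemaFormat {
    let id = model_id.trim().to_lowercase();
    if id.contains("gemini") || id.contains("grok-4") {
        SchemaFormat::JsonSchemaSubset
    } else {
        SchemaFormat::JsonSchema
    }
}
```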


@ -0,0 +1,571 @@
use anyhow::{Context as _, Result, anyhow};
use collections::BTreeMap;
use credentials_provider::CredentialsProvider;
use futures::{FutureExt, StreamExt, future::BoxFuture};
use gpui::{AnyView, App, AsyncApp, Context, Entity, Subscription, Task, Window};
use http_client::HttpClient;
use language_model::{
AuthenticateError, LanguageModel, LanguageModelCompletionError, LanguageModelCompletionEvent,
LanguageModelId, LanguageModelName, LanguageModelProvider, LanguageModelProviderId,
LanguageModelProviderName, LanguageModelProviderState, LanguageModelRequest,
LanguageModelToolChoice, LanguageModelToolSchemaFormat, RateLimiter, Role,
};
use menu;
use open_ai::ResponseStreamEvent;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsStore};
use std::sync::Arc;
use strum::IntoEnumIterator;
use x_ai::Model;
use ui::{ElevationIndex, List, Tooltip, prelude::*};
use ui_input::SingleLineInput;
use util::ResultExt;
use crate::{AllLanguageModelSettings, ui::InstructionListItem};
const PROVIDER_ID: &str = "x_ai";
const PROVIDER_NAME: &str = "xAI";
#[derive(Default, Clone, Debug, PartialEq)]
pub struct XAiSettings {
pub api_url: String,
pub available_models: Vec<AvailableModel>,
}
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]
pub struct AvailableModel {
pub name: String,
pub display_name: Option<String>,
pub max_tokens: u64,
pub max_output_tokens: Option<u64>,
pub max_completion_tokens: Option<u64>,
}
pub struct XAiLanguageModelProvider {
http_client: Arc<dyn HttpClient>,
state: gpui::Entity<State>,
}
pub struct State {
api_key: Option<String>,
api_key_from_env: bool,
_subscription: Subscription,
}
const XAI_API_KEY_VAR: &str = "XAI_API_KEY";
impl State {
fn is_authenticated(&self) -> bool {
self.api_key.is_some()
}
fn reset_api_key(&self, cx: &mut Context<Self>) -> Task<Result<()>> {
let credentials_provider = <dyn CredentialsProvider>::global(cx);
let settings = &AllLanguageModelSettings::get_global(cx).x_ai;
let api_url = if settings.api_url.is_empty() {
x_ai::XAI_API_URL.to_string()
} else {
settings.api_url.clone()
};
cx.spawn(async move |this, cx| {
credentials_provider
.delete_credentials(&api_url, &cx)
.await
.log_err();
this.update(cx, |this, cx| {
this.api_key = None;
this.api_key_from_env = false;
cx.notify();
})
})
}
fn set_api_key(&mut self, api_key: String, cx: &mut Context<Self>) -> Task<Result<()>> {
let credentials_provider = <dyn CredentialsProvider>::global(cx);
let settings = &AllLanguageModelSettings::get_global(cx).x_ai;
let api_url = if settings.api_url.is_empty() {
x_ai::XAI_API_URL.to_string()
} else {
settings.api_url.clone()
};
cx.spawn(async move |this, cx| {
credentials_provider
.write_credentials(&api_url, "Bearer", api_key.as_bytes(), &cx)
.await
.log_err();
this.update(cx, |this, cx| {
this.api_key = Some(api_key);
cx.notify();
})
})
}
fn authenticate(&self, cx: &mut Context<Self>) -> Task<Result<(), AuthenticateError>> {
if self.is_authenticated() {
return Task::ready(Ok(()));
}
let credentials_provider = <dyn CredentialsProvider>::global(cx);
let settings = &AllLanguageModelSettings::get_global(cx).x_ai;
let api_url = if settings.api_url.is_empty() {
x_ai::XAI_API_URL.to_string()
} else {
settings.api_url.clone()
};
cx.spawn(async move |this, cx| {
let (api_key, from_env) = if let Ok(api_key) = std::env::var(XAI_API_KEY_VAR) {
(api_key, true)
} else {
let (_, api_key) = credentials_provider
.read_credentials(&api_url, &cx)
.await?
.ok_or(AuthenticateError::CredentialsNotFound)?;
(
String::from_utf8(api_key)
.with_context(|| format!("invalid {PROVIDER_NAME} API key"))?,
false,
)
};
this.update(cx, |this, cx| {
this.api_key = Some(api_key);
this.api_key_from_env = from_env;
cx.notify();
})?;
Ok(())
})
}
}
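`authenticate` above resolves the key with a fixed precedence: the `XAI_API_KEY` environment variable wins, and only if it is unset does the stored credential for the API URL get read. A hypothetical helper distilling that ordering (not part of the diff; the bool mirrors `api_key_from_env`):

```rust
// Returns the resolved key plus whether it came from the environment,
// mirroring the (api_key, from_env) tuple in `State::authenticate`.
fn resolve_api_key(
    env_key: Option<String>,
    stored_key: Option<String>,
) -> Option<(String, bool)> {
    match (env_key, stored_key) {
        (Some(key), _) => Some((key, true)),
        (None, Some(key)) => Some((key, false)),
        (None, None) => None,
    }
}
```

Tracking the `from_env` flag matters for the UI: the configuration view disables the "Reset API Key" path when the key is environment-provided, since deleting stored credentials would not remove it.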
impl XAiLanguageModelProvider {
pub fn new(http_client: Arc<dyn HttpClient>, cx: &mut App) -> Self {
let state = cx.new(|cx| State {
api_key: None,
api_key_from_env: false,
_subscription: cx.observe_global::<SettingsStore>(|_this: &mut State, cx| {
cx.notify();
}),
});
Self { http_client, state }
}
fn create_language_model(&self, model: x_ai::Model) -> Arc<dyn LanguageModel> {
Arc::new(XAiLanguageModel {
id: LanguageModelId::from(model.id().to_string()),
model,
state: self.state.clone(),
http_client: self.http_client.clone(),
request_limiter: RateLimiter::new(4),
})
}
}
impl LanguageModelProviderState for XAiLanguageModelProvider {
type ObservableEntity = State;
fn observable_entity(&self) -> Option<gpui::Entity<Self::ObservableEntity>> {
Some(self.state.clone())
}
}
impl LanguageModelProvider for XAiLanguageModelProvider {
fn id(&self) -> LanguageModelProviderId {
LanguageModelProviderId(PROVIDER_ID.into())
}
fn name(&self) -> LanguageModelProviderName {
LanguageModelProviderName(PROVIDER_NAME.into())
}
fn icon(&self) -> IconName {
IconName::AiXAi
}
fn default_model(&self, _cx: &App) -> Option<Arc<dyn LanguageModel>> {
Some(self.create_language_model(x_ai::Model::default()))
}
fn default_fast_model(&self, _cx: &App) -> Option<Arc<dyn LanguageModel>> {
Some(self.create_language_model(x_ai::Model::default_fast()))
}
fn provided_models(&self, cx: &App) -> Vec<Arc<dyn LanguageModel>> {
let mut models = BTreeMap::default();
for model in x_ai::Model::iter() {
if !matches!(model, x_ai::Model::Custom { .. }) {
models.insert(model.id().to_string(), model);
}
}
for model in &AllLanguageModelSettings::get_global(cx)
.x_ai
.available_models
{
models.insert(
model.name.clone(),
x_ai::Model::Custom {
name: model.name.clone(),
display_name: model.display_name.clone(),
max_tokens: model.max_tokens,
max_output_tokens: model.max_output_tokens,
max_completion_tokens: model.max_completion_tokens,
},
);
}
models
.into_values()
.map(|model| self.create_language_model(model))
.collect()
}
fn is_authenticated(&self, cx: &App) -> bool {
self.state.read(cx).is_authenticated()
}
fn authenticate(&self, cx: &mut App) -> Task<Result<(), AuthenticateError>> {
self.state.update(cx, |state, cx| state.authenticate(cx))
}
fn configuration_view(&self, window: &mut Window, cx: &mut App) -> AnyView {
cx.new(|cx| ConfigurationView::new(self.state.clone(), window, cx))
.into()
}
fn reset_credentials(&self, cx: &mut App) -> Task<Result<()>> {
self.state.update(cx, |state, cx| state.reset_api_key(cx))
}
}
pub struct XAiLanguageModel {
id: LanguageModelId,
model: x_ai::Model,
state: gpui::Entity<State>,
http_client: Arc<dyn HttpClient>,
request_limiter: RateLimiter,
}
impl XAiLanguageModel {
fn stream_completion(
&self,
request: open_ai::Request,
cx: &AsyncApp,
) -> BoxFuture<'static, Result<futures::stream::BoxStream<'static, Result<ResponseStreamEvent>>>>
{
let http_client = self.http_client.clone();
let Ok((api_key, api_url)) = cx.read_entity(&self.state, |state, cx| {
let settings = &AllLanguageModelSettings::get_global(cx).x_ai;
let api_url = if settings.api_url.is_empty() {
x_ai::XAI_API_URL.to_string()
} else {
settings.api_url.clone()
};
(state.api_key.clone(), api_url)
}) else {
return futures::future::ready(Err(anyhow!("App state dropped"))).boxed();
};
let future = self.request_limiter.stream(async move {
let api_key = api_key.context("Missing xAI API Key")?;
let request =
open_ai::stream_completion(http_client.as_ref(), &api_url, &api_key, request);
let response = request.await?;
Ok(response)
});
async move { Ok(future.await?.boxed()) }.boxed()
}
}
impl LanguageModel for XAiLanguageModel {
fn id(&self) -> LanguageModelId {
self.id.clone()
}
fn name(&self) -> LanguageModelName {
LanguageModelName::from(self.model.display_name().to_string())
}
fn provider_id(&self) -> LanguageModelProviderId {
LanguageModelProviderId(PROVIDER_ID.into())
}
fn provider_name(&self) -> LanguageModelProviderName {
LanguageModelProviderName(PROVIDER_NAME.into())
}
fn supports_tools(&self) -> bool {
self.model.supports_tool()
}
fn supports_images(&self) -> bool {
self.model.supports_images()
}
fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
match choice {
LanguageModelToolChoice::Auto
| LanguageModelToolChoice::Any
| LanguageModelToolChoice::None => true,
}
}
fn tool_input_format(&self) -> LanguageModelToolSchemaFormat {
let model_id = self.model.id().trim().to_lowercase();
if model_id.eq(x_ai::Model::Grok4.id()) {
LanguageModelToolSchemaFormat::JsonSchemaSubset
} else {
LanguageModelToolSchemaFormat::JsonSchema
}
}
fn telemetry_id(&self) -> String {
format!("x_ai/{}", self.model.id())
}
fn max_token_count(&self) -> u64 {
self.model.max_token_count()
}
fn max_output_tokens(&self) -> Option<u64> {
self.model.max_output_tokens()
}
fn count_tokens(
&self,
request: LanguageModelRequest,
cx: &App,
) -> BoxFuture<'static, Result<u64>> {
count_xai_tokens(request, self.model.clone(), cx)
}
fn stream_completion(
&self,
request: LanguageModelRequest,
cx: &AsyncApp,
) -> BoxFuture<
'static,
Result<
futures::stream::BoxStream<
'static,
Result<LanguageModelCompletionEvent, LanguageModelCompletionError>,
>,
LanguageModelCompletionError,
>,
> {
let request = crate::provider::open_ai::into_open_ai(
request,
self.model.id(),
self.model.supports_parallel_tool_calls(),
self.max_output_tokens(),
);
let completions = self.stream_completion(request, cx);
async move {
let mapper = crate::provider::open_ai::OpenAiEventMapper::new();
Ok(mapper.map_stream(completions.await?).boxed())
}
.boxed()
}
}
pub fn count_xai_tokens(
request: LanguageModelRequest,
model: Model,
cx: &App,
) -> BoxFuture<'static, Result<u64>> {
cx.background_spawn(async move {
let messages = request
.messages
.into_iter()
.map(|message| tiktoken_rs::ChatCompletionRequestMessage {
role: match message.role {
Role::User => "user".into(),
Role::Assistant => "assistant".into(),
Role::System => "system".into(),
},
content: Some(message.string_contents()),
name: None,
function_call: None,
})
.collect::<Vec<_>>();
let model_name = if model.max_token_count() >= 100_000 {
"gpt-4o"
} else {
"gpt-4"
};
tiktoken_rs::num_tokens_from_messages(model_name, &messages).map(|tokens| tokens as u64)
})
.boxed()
}
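`count_xai_tokens` above has no xAI-native tokenizer, so it approximates with OpenAI encodings via `tiktoken_rs`: models with large context windows use the `gpt-4o` encoding, everything else falls back to `gpt-4`. A tiny sketch of that selection rule (the helper name is hypothetical):

```rust
// Mirrors the fallback in `count_xai_tokens`: the context-window size
// picks which tiktoken model name to approximate with.
fn tiktoken_model_for(max_token_count: u64) -> &'static str {
    if max_token_count >= 100_000 { "gpt-4o" } else { "gpt-4" }
}
```

The counts are therefore estimates rather than exact xAI token totals, which is acceptable here since they only feed context-limit bookkeeping.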
struct ConfigurationView {
api_key_editor: Entity<SingleLineInput>,
state: gpui::Entity<State>,
load_credentials_task: Option<Task<()>>,
}
impl ConfigurationView {
fn new(state: gpui::Entity<State>, window: &mut Window, cx: &mut Context<Self>) -> Self {
let api_key_editor = cx.new(|cx| {
SingleLineInput::new(
window,
cx,
"xai-0000000000000000000000000000000000000000000000000",
)
.label("API key")
});
cx.observe(&state, |_, _, cx| {
cx.notify();
})
.detach();
let load_credentials_task = Some(cx.spawn_in(window, {
let state = state.clone();
async move |this, cx| {
if let Some(task) = state
.update(cx, |state, cx| state.authenticate(cx))
.log_err()
{
// We don't log an error, because "not signed in" is also an error.
let _ = task.await;
}
this.update(cx, |this, cx| {
this.load_credentials_task = None;
cx.notify();
})
.log_err();
}
}));
Self {
api_key_editor,
state,
load_credentials_task,
}
}
fn save_api_key(&mut self, _: &menu::Confirm, window: &mut Window, cx: &mut Context<Self>) {
let api_key = self
.api_key_editor
.read(cx)
.editor()
.read(cx)
.text(cx)
.trim()
.to_string();
// Don't proceed if no API key is provided and we're not authenticated
if api_key.is_empty() && !self.state.read(cx).is_authenticated() {
return;
}
let state = self.state.clone();
cx.spawn_in(window, async move |_, cx| {
state
.update(cx, |state, cx| state.set_api_key(api_key, cx))?
.await
})
.detach_and_log_err(cx);
cx.notify();
}
fn reset_api_key(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.api_key_editor.update(cx, |input, cx| {
input.editor.update(cx, |editor, cx| {
editor.set_text("", window, cx);
});
});
let state = self.state.clone();
cx.spawn_in(window, async move |_, cx| {
state.update(cx, |state, cx| state.reset_api_key(cx))?.await
})
.detach_and_log_err(cx);
cx.notify();
}
fn should_render_editor(&self, cx: &mut Context<Self>) -> bool {
!self.state.read(cx).is_authenticated()
}
}
impl Render for ConfigurationView {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let env_var_set = self.state.read(cx).api_key_from_env;
let api_key_section = if self.should_render_editor(cx) {
v_flex()
.on_action(cx.listener(Self::save_api_key))
.child(Label::new("To use Zed's agent with xAI, you need to add an API key. Follow these steps:"))
.child(
List::new()
.child(InstructionListItem::new(
"Create one by visiting",
Some("xAI console"),
Some("https://console.x.ai/team/default/api-keys"),
))
.child(InstructionListItem::text_only(
"Paste your API key below and hit enter to start using the agent",
)),
)
.child(self.api_key_editor.clone())
.child(
Label::new(format!(
"You can also assign the {XAI_API_KEY_VAR} environment variable and restart Zed."
))
.size(LabelSize::Small)
.color(Color::Muted),
)
.child(
Label::new("Note that xAI is a custom OpenAI-compatible provider.")
.size(LabelSize::Small)
.color(Color::Muted),
)
.into_any()
} else {
h_flex()
.mt_1()
.p_1()
.justify_between()
.rounded_md()
.border_1()
.border_color(cx.theme().colors().border)
.bg(cx.theme().colors().background)
.child(
h_flex()
.gap_1()
.child(Icon::new(IconName::Check).color(Color::Success))
.child(Label::new(if env_var_set {
format!("API key set in {XAI_API_KEY_VAR} environment variable.")
} else {
"API key configured.".to_string()
})),
)
.child(
Button::new("reset-api-key", "Reset API Key")
.label_size(LabelSize::Small)
.icon(IconName::Undo)
.icon_size(IconSize::Small)
.icon_position(IconPosition::Start)
.layer(ElevationIndex::ModalSurface)
.when(env_var_set, |this| {
this.tooltip(Tooltip::text(format!("To reset your API key, unset the {XAI_API_KEY_VAR} environment variable.")))
})
.on_click(cx.listener(|this, _, window, cx| this.reset_api_key(window, cx))),
)
.into_any()
};
if self.load_credentials_task.is_some() {
div().child(Label::new("Loading credentials…")).into_any()
} else {
v_flex().size_full().child(api_key_section).into_any()
}
}
}


@ -17,6 +17,7 @@ use crate::provider::{
open_ai::OpenAiSettings,
open_router::OpenRouterSettings,
vercel::VercelSettings,
x_ai::XAiSettings,
};

/// Initializes the language model settings.
@ -28,33 +29,33 @@ pub fn init(cx: &mut App) {
pub struct AllLanguageModelSettings {
pub anthropic: AnthropicSettings,
pub bedrock: AmazonBedrockSettings,
pub deepseek: DeepSeekSettings,
pub google: GoogleSettings,
pub lmstudio: LmStudioSettings,
pub mistral: MistralSettings,
pub ollama: OllamaSettings,
pub open_router: OpenRouterSettings,
pub openai: OpenAiSettings,
pub vercel: VercelSettings,
pub x_ai: XAiSettings,
pub zed_dot_dev: ZedDotDevSettings,
}
#[derive(Default, Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
pub struct AllLanguageModelSettingsContent {
pub anthropic: Option<AnthropicSettingsContent>,
pub bedrock: Option<AmazonBedrockSettingsContent>,
pub deepseek: Option<DeepseekSettingsContent>,
pub google: Option<GoogleSettingsContent>,
pub lmstudio: Option<LmStudioSettingsContent>,
pub mistral: Option<MistralSettingsContent>,
pub ollama: Option<OllamaSettingsContent>,
pub open_router: Option<OpenRouterSettingsContent>,
pub openai: Option<OpenAiSettingsContent>,
pub vercel: Option<VercelSettingsContent>,
pub x_ai: Option<XAiSettingsContent>,
#[serde(rename = "zed.dev")]
pub zed_dot_dev: Option<ZedDotDevSettingsContent>,
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)] #[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
@ -114,6 +115,12 @@ pub struct GoogleSettingsContent {
pub available_models: Option<Vec<provider::google::AvailableModel>>, pub available_models: Option<Vec<provider::google::AvailableModel>>,
} }
#[derive(Default, Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
pub struct XAiSettingsContent {
pub api_url: Option<String>,
pub available_models: Option<Vec<provider::x_ai::AvailableModel>>,
}
#[derive(Default, Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)] #[derive(Default, Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
pub struct ZedDotDevSettingsContent { pub struct ZedDotDevSettingsContent {
available_models: Option<Vec<cloud::AvailableModel>>, available_models: Option<Vec<cloud::AvailableModel>>,
@ -230,6 +237,18 @@ impl settings::Settings for AllLanguageModelSettings {
vercel.as_ref().and_then(|s| s.available_models.clone()), vercel.as_ref().and_then(|s| s.available_models.clone()),
); );
// XAI
let x_ai = value.x_ai.clone();
merge(
&mut settings.x_ai.api_url,
x_ai.as_ref().and_then(|s| s.api_url.clone()),
);
merge(
&mut settings.x_ai.available_models,
x_ai.as_ref().and_then(|s| s.available_models.clone()),
);
// ZedDotDev
merge( merge(
&mut settings.zed_dot_dev.available_models, &mut settings.zed_dot_dev.available_models,
value value
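The repeated `merge` calls in this hunk overlay user-supplied settings onto the defaults. A minimal sketch of what such a helper amounts to (the signature here is inferred from the call sites, not copied from Zed):

```rust
// Overwrite `target` only when the user actually supplied a value,
// keeping the existing default otherwise.
fn merge<T>(target: &mut T, value: Option<T>) {
    if let Some(value) = value {
        *target = value;
    }
}

fn main() {
    let mut api_url = String::from("https://api.x.ai/v1");
    merge(&mut api_url, None); // no user value: default survives
    assert_eq!(api_url, "https://api.x.ai/v1");
    merge(&mut api_url, Some(String::from("https://example.invalid/v1")));
    assert_eq!(api_url, "https://example.invalid/v1");
}
```

This is why each field is merged individually: an absent `x_ai.api_url` in the user's settings leaves the default URL intact while `available_models` can still be overridden.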

View file

@@ -3362,8 +3362,14 @@ impl Project {
         cx: &mut Context<Self>,
     ) -> Task<Result<Vec<LocationLink>>> {
         let position = position.to_point_utf16(buffer.read(cx));
-        self.lsp_store.update(cx, |lsp_store, cx| {
+        let guard = self.retain_remotely_created_models(cx);
+        let task = self.lsp_store.update(cx, |lsp_store, cx| {
             lsp_store.definitions(buffer, position, cx)
+        });
+        cx.spawn(async move |_, _| {
+            let result = task.await;
+            drop(guard);
+            result
         })
     }
@@ -3374,8 +3380,14 @@ impl Project {
         cx: &mut Context<Self>,
     ) -> Task<Result<Vec<LocationLink>>> {
         let position = position.to_point_utf16(buffer.read(cx));
-        self.lsp_store.update(cx, |lsp_store, cx| {
+        let guard = self.retain_remotely_created_models(cx);
+        let task = self.lsp_store.update(cx, |lsp_store, cx| {
             lsp_store.declarations(buffer, position, cx)
+        });
+        cx.spawn(async move |_, _| {
+            let result = task.await;
+            drop(guard);
+            result
         })
     }
@@ -3386,8 +3398,14 @@ impl Project {
         cx: &mut Context<Self>,
     ) -> Task<Result<Vec<LocationLink>>> {
         let position = position.to_point_utf16(buffer.read(cx));
-        self.lsp_store.update(cx, |lsp_store, cx| {
+        let guard = self.retain_remotely_created_models(cx);
+        let task = self.lsp_store.update(cx, |lsp_store, cx| {
             lsp_store.type_definitions(buffer, position, cx)
+        });
+        cx.spawn(async move |_, _| {
+            let result = task.await;
+            drop(guard);
+            result
         })
     }
@@ -3398,8 +3416,14 @@ impl Project {
         cx: &mut Context<Self>,
     ) -> Task<Result<Vec<LocationLink>>> {
         let position = position.to_point_utf16(buffer.read(cx));
-        self.lsp_store.update(cx, |lsp_store, cx| {
+        let guard = self.retain_remotely_created_models(cx);
+        let task = self.lsp_store.update(cx, |lsp_store, cx| {
             lsp_store.implementations(buffer, position, cx)
+        });
+        cx.spawn(async move |_, _| {
+            let result = task.await;
+            drop(guard);
+            result
         })
     }
@@ -3410,8 +3434,14 @@ impl Project {
         cx: &mut Context<Self>,
     ) -> Task<Result<Vec<Location>>> {
         let position = position.to_point_utf16(buffer.read(cx));
-        self.lsp_store.update(cx, |lsp_store, cx| {
+        let guard = self.retain_remotely_created_models(cx);
+        let task = self.lsp_store.update(cx, |lsp_store, cx| {
             lsp_store.references(buffer, position, cx)
+        });
+        cx.spawn(async move |_, _| {
+            let result = task.await;
+            drop(guard);
+            result
         })
     }
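All five LSP request methods apply the same pattern: acquire a guard from `retain_remotely_created_models` before dispatching, and drop it only after the task resolves, so the retained models cannot be released mid-request. A standalone sketch of that RAII pattern (the `RetainGuard` type and counter are illustrative stand-ins, not Zed's actual types):

```rust
use std::sync::Arc;
use std::sync::atomic::{AtomicUsize, Ordering};

// Hypothetical stand-in for the handle returned by
// `retain_remotely_created_models`: while it lives, the retain count stays > 0.
struct RetainGuard(Arc<AtomicUsize>);

impl RetainGuard {
    fn new(count: &Arc<AtomicUsize>) -> Self {
        count.fetch_add(1, Ordering::SeqCst);
        RetainGuard(count.clone())
    }
}

impl Drop for RetainGuard {
    fn drop(&mut self) {
        self.0.fetch_sub(1, Ordering::SeqCst);
    }
}

// Run `work` while holding the guard; release it only after the work
// finishes, mirroring `let result = task.await; drop(guard); result`.
fn with_retained<T>(count: &Arc<AtomicUsize>, work: impl FnOnce() -> T) -> T {
    let guard = RetainGuard::new(count);
    let result = work();
    drop(guard);
    result
}
```

The key point the diff encodes is ordering: the guard is moved into the spawned future, so even though `update` returns immediately, the retain count only drops after the LSP response arrives.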

View file

@@ -611,7 +611,7 @@ impl RulesLibrary {
             this.update_in(cx, |this, window, cx| match rule {
                 Ok(rule) => {
                     let title_editor = cx.new(|cx| {
-                        let mut editor = Editor::auto_width(window, cx);
+                        let mut editor = Editor::single_line(window, cx);
                         editor.set_placeholder_text("Untitled", cx);
                         editor.set_text(rule_metadata.title.unwrap_or_default(), window, cx);
                         if prompt_id.is_built_in() {

View file

@@ -127,7 +127,7 @@ impl BatchedTextRun {
         cx: &mut App,
     ) {
         let pos = Point::new(
-            (origin.x + self.start_point.column as f32 * dimensions.cell_width).floor(),
+            origin.x + self.start_point.column as f32 * dimensions.cell_width,
            origin.y + self.start_point.line as f32 * dimensions.line_height,
        );
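Dropping the `.floor()` lets each text run start at its exact (possibly fractional) x position. Flooring the origin of every run independently can shift a run left of its cell grid whenever the cell width is fractional; a toy illustration (the `10.5` cell width is made up for the example):

```rust
// Exact x origin of a glyph run starting at `column`, as in the patched code.
fn origin_x(column: u32, cell_width: f32) -> f32 {
    column as f32 * cell_width
}

fn main() {
    // With a fractional cell width, the exact origin of column 1 is 10.5 px...
    assert_eq!(origin_x(1, 10.5), 10.5);
    // ...but the old floored origin lands at 10.0, half a pixel left of the
    // grid that later columns are measured against.
    assert_eq!(origin_x(1, 10.5).floor(), 10.0);
    assert_eq!(origin_x(2, 10.5), 21.0);
}
```

Leaving the value unfloored keeps every run aligned to the same sub-pixel grid.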

View file

@@ -327,6 +327,7 @@ impl PickerDelegate for IconThemeSelectorDelegate {
         window.dispatch_action(
             Box::new(Extensions {
                 category_filter: Some(ExtensionCategoryFilter::IconThemes),
+                id: None,
             }),
             cx,
         );

View file

@@ -385,6 +385,7 @@ impl PickerDelegate for ThemeSelectorDelegate {
         window.dispatch_action(
             Box::new(Extensions {
                 category_filter: Some(ExtensionCategoryFilter::Themes),
+                id: None,
             }),
             cx,
         );

View file

@@ -230,7 +230,11 @@ fn scroll_editor(
         // column position, or the right-most column in the current
         // line, seeing as the cursor might be in a short line, in which
         // case we don't want to go past its last column.
-        let max_row_column = map.line_len(new_row);
+        let max_row_column = if new_row <= map.max_point().row() {
+            map.line_len(new_row)
+        } else {
+            0
+        };
         let max_column = match min_column + visible_column_count as u32 {
             max_column if max_column >= max_row_column => max_row_column,
             max_column => max_column,
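The fix above stops `line_len` from being asked about a row past the end of the display map when scrolling overshoots the buffer. The clamping logic can be sketched in isolation (the toy `lines` document and helper names here are illustrative, not the real `DisplayMap` API):

```rust
// Length of each line in a toy document; index = row.
fn line_len(lines: &[&str], row: u32) -> u32 {
    lines[row as usize].len() as u32
}

// Mirrors the patched logic: only query the line length when the row
// actually exists; otherwise treat the target line as empty.
fn max_row_column(lines: &[&str], new_row: u32) -> u32 {
    let max_row = lines.len().saturating_sub(1) as u32;
    if new_row <= max_row {
        line_len(lines, new_row)
    } else {
        0
    }
}

fn main() {
    let lines = ["fn main() {", "}"];
    assert_eq!(max_row_column(&lines, 0), 11); // in-range row: real length
    assert_eq!(max_row_column(&lines, 5), 0); // overshoot: clamp to 0
}
```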

crates/x_ai/Cargo.toml (new file, 23 lines)
View file

@@ -0,0 +1,23 @@
[package]
name = "x_ai"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"

[lints]
workspace = true

[lib]
path = "src/x_ai.rs"

[features]
default = []
schemars = ["dep:schemars"]

[dependencies]
anyhow.workspace = true
schemars = { workspace = true, optional = true }
serde.workspace = true
strum.workspace = true
workspace-hack.workspace = true

crates/x_ai/LICENSE-GPL (new symbolic link)
View file

@ -0,0 +1 @@
../../LICENSE-GPL

crates/x_ai/src/x_ai.rs (new file, 126 lines)
View file

@@ -0,0 +1,126 @@
use anyhow::Result;
use serde::{Deserialize, Serialize};
use strum::EnumIter;

pub const XAI_API_URL: &str = "https://api.x.ai/v1";

#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
#[derive(Clone, Debug, Default, Serialize, Deserialize, PartialEq, EnumIter)]
pub enum Model {
    #[serde(rename = "grok-2-vision-latest")]
    Grok2Vision,
    #[default]
    #[serde(rename = "grok-3-latest")]
    Grok3,
    #[serde(rename = "grok-3-mini-latest")]
    Grok3Mini,
    #[serde(rename = "grok-3-fast-latest")]
    Grok3Fast,
    #[serde(rename = "grok-3-mini-fast-latest")]
    Grok3MiniFast,
    #[serde(rename = "grok-4-latest")]
    Grok4,
    #[serde(rename = "custom")]
    Custom {
        name: String,
        /// The name displayed in the UI, such as in the assistant panel model dropdown menu.
        display_name: Option<String>,
        max_tokens: u64,
        max_output_tokens: Option<u64>,
        max_completion_tokens: Option<u64>,
    },
}

impl Model {
    pub fn default_fast() -> Self {
        Self::Grok3Fast
    }

    pub fn from_id(id: &str) -> Result<Self> {
        match id {
            "grok-2-vision" => Ok(Self::Grok2Vision),
            "grok-3" => Ok(Self::Grok3),
            "grok-3-mini" => Ok(Self::Grok3Mini),
            "grok-3-fast" => Ok(Self::Grok3Fast),
            "grok-3-mini-fast" => Ok(Self::Grok3MiniFast),
            _ => anyhow::bail!("invalid model id '{id}'"),
        }
    }

    pub fn id(&self) -> &str {
        match self {
            Self::Grok2Vision => "grok-2-vision",
            Self::Grok3 => "grok-3",
            Self::Grok3Mini => "grok-3-mini",
            Self::Grok3Fast => "grok-3-fast",
            Self::Grok3MiniFast => "grok-3-mini-fast",
            Self::Grok4 => "grok-4",
            Self::Custom { name, .. } => name,
        }
    }

    pub fn display_name(&self) -> &str {
        match self {
            Self::Grok2Vision => "Grok 2 Vision",
            Self::Grok3 => "Grok 3",
            Self::Grok3Mini => "Grok 3 Mini",
            Self::Grok3Fast => "Grok 3 Fast",
            Self::Grok3MiniFast => "Grok 3 Mini Fast",
            Self::Grok4 => "Grok 4",
            Self::Custom {
                name, display_name, ..
            } => display_name.as_ref().unwrap_or(name),
        }
    }

    pub fn max_token_count(&self) -> u64 {
        match self {
            Self::Grok3 | Self::Grok3Mini | Self::Grok3Fast | Self::Grok3MiniFast => 131_072,
            Self::Grok4 => 256_000,
            Self::Grok2Vision => 8_192,
            Self::Custom { max_tokens, .. } => *max_tokens,
        }
    }

    pub fn max_output_tokens(&self) -> Option<u64> {
        match self {
            Self::Grok3 | Self::Grok3Mini | Self::Grok3Fast | Self::Grok3MiniFast => Some(8_192),
            Self::Grok4 => Some(64_000),
            Self::Grok2Vision => Some(4_096),
            Self::Custom {
                max_output_tokens, ..
            } => *max_output_tokens,
        }
    }

    pub fn supports_parallel_tool_calls(&self) -> bool {
        match self {
            Self::Grok2Vision
            | Self::Grok3
            | Self::Grok3Mini
            | Self::Grok3Fast
            | Self::Grok3MiniFast
            | Self::Grok4 => true,
            Model::Custom { .. } => false,
        }
    }

    pub fn supports_tool(&self) -> bool {
        match self {
            Self::Grok2Vision
            | Self::Grok3
            | Self::Grok3Mini
            | Self::Grok3Fast
            | Self::Grok3MiniFast
            | Self::Grok4 => true,
            Model::Custom { .. } => false,
        }
    }

    pub fn supports_images(&self) -> bool {
        match self {
            Self::Grok2Vision => true,
            _ => false,
        }
    }
}

View file

@@ -2,7 +2,7 @@
 description = "The fast, collaborative code editor."
 edition.workspace = true
 name = "zed"
-version = "0.195.0"
+version = "0.195.2"
 publish.workspace = true
 license = "GPL-3.0-or-later"
 authors = ["Zed Team <hi@zed.dev>"]

View file

@@ -1 +1 @@
-dev
+preview

View file

@@ -725,6 +725,23 @@ fn handle_open_request(request: OpenRequest, app_state: Arc<AppState>, cx: &mut
         return;
     }

+    if let Some(extension) = request.extension_id {
+        cx.spawn(async move |cx| {
+            let workspace = workspace::get_any_active_workspace(app_state, cx.clone()).await?;
+            workspace.update(cx, |_, window, cx| {
+                window.dispatch_action(
+                    Box::new(zed_actions::Extensions {
+                        category_filter: None,
+                        id: Some(extension),
+                    }),
+                    cx,
+                );
+            })
+        })
+        .detach_and_log_err(cx);
+        return;
+    }
+
     if let Some(connection_options) = request.ssh_connection {
         cx.spawn(async move |mut cx| {
             let paths: Vec<PathBuf> = request.open_paths.into_iter().map(PathBuf::from).collect();

View file

@@ -37,6 +37,7 @@ pub struct OpenRequest {
     pub join_channel: Option<u64>,
     pub ssh_connection: Option<SshConnectionOptions>,
     pub dock_menu_action: Option<usize>,
+    pub extension_id: Option<String>,
 }

 impl OpenRequest {
@@ -54,6 +55,8 @@ impl OpenRequest {
         } else if let Some(file) = url.strip_prefix("zed://ssh") {
             let ssh_url = "ssh:/".to_string() + file;
             this.parse_ssh_file_path(&ssh_url, cx)?
+        } else if let Some(file) = url.strip_prefix("zed://extension/") {
+            this.extension_id = Some(file.to_string())
         } else if url.starts_with("ssh://") {
             this.parse_ssh_file_path(&url, cx)?
         } else if let Some(request_path) = parse_zed_link(&url, cx) {
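The new branch above is plain prefix matching on the incoming `zed://` URL. The same dispatch can be sketched standalone (simplified; the real `OpenRequest` parser also handles ssh and zed.dev links):

```rust
// Extract an extension id from a `zed://extension/{id}` link, if present.
fn parse_extension_link(url: &str) -> Option<String> {
    url.strip_prefix("zed://extension/").map(|id| id.to_string())
}

fn main() {
    assert_eq!(
        parse_extension_link("zed://extension/html"),
        Some("html".to_string())
    );
    // Non-extension schemes fall through to the other branches.
    assert_eq!(parse_extension_link("zed://ssh/host"), None);
}
```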

View file

@@ -76,6 +76,9 @@ pub struct Extensions {
     /// Filters the extensions page down to extensions that are in the specified category.
     #[serde(default)]
     pub category_filter: Option<ExtensionCategoryFilter>,
+    /// Focuses just the extension with the specified ID.
+    #[serde(default)]
+    pub id: Option<String>,
 }

 /// Decreases the font size in the editor buffer.

View file

@@ -23,6 +23,8 @@ Here's an overview of the supported providers and tool call support:
 | [OpenAI](#openai)                               | ✅ |
 | [OpenAI API Compatible](#openai-api-compatible) | 🚫 |
 | [OpenRouter](#openrouter)                       | ✅ |
+| [Vercel](#vercel-v0)                            | ✅ |
+| [xAI](#xai)                                     | ✅ |

 ## Use Your Own Keys {#use-your-own-keys}
@@ -442,27 +444,30 @@ Custom models will be listed in the model dropdown in the Agent Panel.

 Zed supports using OpenAI compatible APIs by specifying a custom `endpoint` and `available_models` for the OpenAI provider.

-You can add a custom API URL for OpenAI either via the UI or by editing your `settings.json`.
+Zed supports using OpenAI compatible APIs by specifying a custom `api_url` and `available_models` for the OpenAI provider. This is useful for connecting to other hosted services (like Together AI, Anyscale, etc.) or local models.

-Here are a few model examples you can plug in by using this feature:
-
-#### X.ai Grok
-
-Example configuration for using X.ai Grok with Zed:
+To configure a compatible API, you can add a custom API URL for OpenAI either via the UI or by editing your `settings.json`. For example, to connect to [Together AI](https://www.together.ai/):
+
+1. Get an API key from your [Together AI account](https://api.together.ai/settings/api-keys).
+2. Add the following to your `settings.json`:

 ```json
+{
   "language_models": {
     "openai": {
-      "api_url": "https://api.x.ai/v1",
+      "api_url": "https://api.together.xyz/v1",
+      "api_key": "YOUR_TOGETHER_AI_API_KEY",
       "available_models": [
         {
-          "name": "grok-beta",
-          "display_name": "X.ai Grok (Beta)",
-          "max_tokens": 131072
+          "name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
+          "display_name": "Together Mixtral 8x7B",
+          "max_tokens": 32768,
+          "supports_tools": true
         }
-      ],
-      "version": "1"
-    },
+      ]
+    }
   }
+}
 ```

 ### OpenRouter {#openrouter}
@@ -523,7 +528,9 @@ You can find available models and their specifications on the [OpenRouter models
 Custom models will be listed in the model dropdown in the Agent Panel.

-### Vercel v0
+### Vercel v0 {#vercel-v0}
+
+> ✅ Supports tool use

 [Vercel v0](https://vercel.com/docs/v0/api) is an expert model for generating full-stack apps, with framework-aware completions optimized for modern stacks like Next.js and Vercel.

 It supports text and image inputs and provides fast streaming responses.
@@ -535,6 +542,49 @@ Once you have it, paste it directly into the Vercel provider section in the pane
 You should then find it as `v0-1.5-md` in the model dropdown in the Agent Panel.

+### xAI {#xai}
+
+> ✅ Supports tool use
+
+Zed has first-class support for [xAI](https://x.ai/) models. You can use your own API key to access Grok models.
+
+1. [Create an API key in the xAI Console](https://console.x.ai/team/default/api-keys)
+2. Open the settings view (`agent: open configuration`) and go to the **xAI** section
+3. Enter your xAI API key
+
+The xAI API key will be saved in your keychain. Zed will also use the `XAI_API_KEY` environment variable if it's defined.
+
+> **Note:** While the xAI API is OpenAI-compatible, Zed has first-class support for it as a dedicated provider. For the best experience, we recommend using the dedicated `x_ai` provider configuration instead of the [OpenAI API Compatible](#openai-api-compatible) method.
+
+#### Custom Models {#xai-custom-models}
+
+The Zed agent comes pre-configured with common Grok models. If you wish to use alternate models or customize their parameters, you can do so by adding the following to your Zed `settings.json`:
+
+```json
+{
+  "language_models": {
+    "x_ai": {
+      "api_url": "https://api.x.ai/v1",
+      "available_models": [
+        {
+          "name": "grok-1.5",
+          "display_name": "Grok 1.5",
+          "max_tokens": 131072,
+          "max_output_tokens": 8192
+        },
+        {
+          "name": "grok-1.5v",
+          "display_name": "Grok 1.5V (Vision)",
+          "max_tokens": 131072,
+          "max_output_tokens": 8192,
+          "supports_images": true
+        }
+      ]
+    }
+  }
+}
+```

 ## Advanced Configuration {#advanced-configuration}

 ### Custom Provider Endpoints {#custom-provider-endpoint}

View file

@@ -44,8 +44,6 @@ function CheckEnvironmentVariables {
     }
 }

-$innoDir = "$env:ZED_WORKSPACE\inno"
-
 function PrepareForBundle {
     if (Test-Path "$innoDir") {
         Remove-Item -Path "$innoDir" -Recurse -Force
@@ -236,6 +234,8 @@ function BuildInstaller {
 }

 ParseZedWorkspace
+
+$innoDir = "$env:ZED_WORKSPACE\inno"
 CheckEnvironmentVariables
 PrepareForBundle
 BuildZedAndItsFriends