Compare commits

...

17 commits

Author SHA1 Message Date
Julia Ryan
6c0eaf674e
zed 0.200.3 2025-08-18 09:32:33 -07:00
Julia Ryan
e9e376deb5
Separate minidump crashes from panics (#36267)
The minidump-based crash reporting is now entirely separate from our
legacy panic_hook-based reporting. This should improve the association
of minidumps with their metadata and give us more consistent crash
reports.

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-08-18 09:19:23 -07:00
Finn Evers
78e56ce8fd keymap_ui: Ensure keybind with empty arguments can be saved (#36393)
Follow up to #36278 to ensure this bug is actually fixed. Also fixes
this on two layers and adds a test for the lower layer, as we cannot
properly test it in the UI.

Furthermore, this improves the error message to show more context
and ensures the status toast is only shown when the keybind was
successfully updated: before, we would show the success toast while
also showing an error in the editor.

Lastly, this also fixes some issues with the status toast (and
animations) where no status toast or no animation would show in certain
scenarios.

Release Notes:

- N/A
2025-08-18 13:14:40 +02:00
Finn Evers
0367e93667 onboarding: Fix minimap typo on editing page (#36143)
This PR fixes a small typo on the onboarding editing page, where
"Mini Map" should be "Minimap".

Release Notes:

- N/A
2025-08-18 13:14:35 +02:00
Oleksiy Syvokon
e2dec85365 agent: Create checkpoint before/after every edit operation (#36253)
1. Previously, checkpoints only appeared when an agent's edit happened
immediately after a user message. This is rare (the agent usually collects
some context first), so checkpoints were almost never shown. This is now fixed.

2. After this change, a checkpoint is created after every edit
operation. So when the agent edits files five times in a single dialog
turn, we will now display five checkpoints.

As a bonus, it's now possible to undo only a part of a long agent
response.

Closes #36092, #32917

Release Notes:

- Create agent checkpoints more frequently (before every edit)
2025-08-18 12:37:19 +03:00
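The checkpoint policy described above can be sketched as follows. This is a minimal illustrative sketch, not Zed's actual `Thread` API: after each edit operation, the pending checkpoint is kept if the repository state is unchanged, and otherwise committed to history while a fresh checkpoint is started.

```rust
#[derive(Clone, Debug, PartialEq)]
struct Checkpoint {
    message_id: u32,
    git_state: String,
}

struct Thread {
    pending: Option<Checkpoint>,
    history: Vec<Checkpoint>,
    next_id: u32,
}

impl Thread {
    // Called after each agent edit operation with the repo's new state.
    fn finalize_edit(&mut self, new_git_state: String) {
        if let Some(pending) = self.pending.take() {
            if pending.git_state == new_git_state {
                // Nothing changed since the checkpoint was taken: keep it pending.
                self.pending = Some(pending);
            } else {
                // The edit changed the repo: record the checkpoint and start a new one.
                self.history.push(pending);
                self.pending = Some(Checkpoint {
                    message_id: self.next_id,
                    git_state: new_git_state,
                });
                self.next_id += 1;
            }
        }
    }
}
```

Recording one history entry per repo-changing edit is what makes it possible to undo only part of a long agent turn.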
Piotr Osiewicz
4a0e8f0844 agent_ui: Ensure that all configuration views get rendered with full width (#36362)
Closes #36097

Release Notes:

- Fixed API key input fields getting shrunk in Agent Panel settings view
on low panel widths paired with high UI font sizes.
2025-08-18 12:36:01 +03:00
Cale Sennett
c2f0df9b8e Add capabilities to OpenAI-compatible model settings (#36370)
### TL;DR
* Adds `capabilities` configuration for OpenAI-compatible models
* Relates to
https://github.com/zed-industries/zed/issues/36215#issuecomment-3193920491

### Summary
This PR introduces support for configuring model capabilities for
OpenAI-compatible language models. The implementation addresses the
issue that not all OpenAI-compatible APIs support the same features -
for example, Cerebras' API explicitly does not support
`parallel_tool_calls` as documented in their [OpenAI compatibility
guide](https://inference-docs.cerebras.ai/resources/openai#currently-unsupported-openai-features).

### Changes

1. **Model Capabilities Structure**:
- Added `ModelCapabilityToggles` struct for UI representation with
boolean toggle states
- Implemented proper parsing of capability toggles into
`ModelCapabilities`

2. **UI Updates**:
- Modified the "Add LLM Provider" modal to include checkboxes for each
capability
- Each OpenAI-compatible model can now be configured with its specific
capabilities through the UI

3. **Configuration File Structure**:
- Updated the settings schema to support a `capabilities` object for
each `openai_compatible` model
- Each capability (`tools`, `images`, `parallel_tool_calls`,
`prompt_cache_key`) can be individually specified per model

### Example Configuration

```json
{
  "openai_compatible": {
    "Cerebras": {
      "api_url": "https://api.cerebras.ai/v1",
      "available_models": [
        {
          "name": "gpt-oss-120b",
          "max_tokens": 131000,
          "capabilities": {
            "tools": true,
            "images": false,
            "parallel_tool_calls": false,
            "prompt_cache_key": false
          }
        }
      ]
    }
  }
}
```

### Tests Added

- Added tests to verify default capability values are correctly applied
- Added tests to verify that deselected toggles are properly parsed as
`false`
- Added tests to verify that mixed capability selections work correctly

Thanks to @osyvokon for the desired `capabilities` configuration
structure!


Release Notes:

- OpenAI-compatible models now have configurable capabilities (#36370;
thanks @calesennett)

---------

Co-authored-by: Oleksiy Syvokon <oleksiy@zed.dev>
2025-08-18 12:35:08 +03:00
Ben Kunkle
2bd61668dc keymap_ui: Don't try to parse empty action arguments as JSON (#36278)
Closes #ISSUE

Release Notes:

- Keymap Editor: Fixed an issue where leaving the arguments field empty
would result in an error even if arguments were optional
2025-08-15 17:06:23 -05:00
Joseph T. Lyons
2ab445dfd4 zed 0.200.2 2025-08-15 13:17:26 -04:00
Oleksiy Syvokon
b96f76f377 openai: Don't send prompt_cache_key for OpenAI-compatible models (#36231)
Some APIs fail when they receive this parameter.

Closes #36215

Release Notes:

- Fixed OpenAI-compatible providers that don't support prompt caching
and/or reasoning
2025-08-15 16:26:41 +03:00
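The fix amounts to dropping the parameter entirely when the provider does not support it. A minimal sketch of that gating, using a hypothetical helper rather than the actual `into_open_ai` signature:

```rust
// Only include prompt_cache_key in the outgoing request when the
// provider is known to support it; otherwise omit it entirely.
fn prompt_cache_key(supports_cache_key: bool, thread_id: Option<String>) -> Option<String> {
    if supports_cache_key { thread_id } else { None }
}
```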
Oleksiy Syvokon
e9a4f6767b openai: Don't send reasoning_effort if it's not set (#36228)
Release Notes:

- N/A
2025-08-15 16:26:32 +03:00
smit
177cf12ca1 project: Fix LSP TextDocumentSyncCapability dynamic registration (#36234)
Closes #36213

Use `textDocument/didChange`
([docs](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_synchronization))
instead of `textDocument/synchronization`.

Release Notes:

- Fixed an issue where Dart projects were being formatted incorrectly by
the language server.
2025-08-15 14:08:58 +03:00
Joseph T. Lyons
fda9369bfd Emit a BreadcrumbsChanged event when associated settings changed (#36177)
Closes https://github.com/zed-industries/zed/issues/36149

Release Notes:

- Fixed a bug where changing the `toolbar.breadcrumbs` setting didn't
immediately update the UI when saving the `settings.json` file.
2025-08-14 15:31:29 -04:00
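The fix follows the usual settings-changed pattern: capture the old value, recompute from settings, and emit an event only when the value actually changed. A minimal sketch with illustrative names:

```rust
// Emit a change event only when the observed setting actually flipped.
fn emit_on_change(
    old_show_breadcrumbs: bool,
    new_show_breadcrumbs: bool,
    events: &mut Vec<&'static str>,
) {
    if old_show_breadcrumbs != new_show_breadcrumbs {
        events.push("BreadcrumbsChanged");
    }
}
```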
Zed Bot
08351cb3e7 Bump to 0.200.1 for @smitbarmase 2025-08-13 20:05:16 +00:00
smit
ab41359e24 ci: Disable FreeBSD builds (#36140)
Revert accidental change introduced in
[#35880](https://github.com/zed-industries/zed/pull/35880/files#diff-b803fcb7f17ed9235f1e5cb1fcd2f5d3b2838429d4368ae4c57ce4436577f03fL706)

Release Notes:

- N/A
2025-08-14 01:01:17 +05:30
smit
d29341bf44 copilot: Fix Copilot fails to sign in (#36138)
Closes #36093

Pin copilot version to 1.354 for now until further investigation.

Release Notes:

- Fixes issue where Copilot failed to sign in.

Co-authored-by: MrSubidubi <dev@bahn.sh>
2025-08-14 00:24:00 +05:30
Joseph T. Lyons
189ea49e00 v0.200.x preview 2025-08-13 12:47:57 -04:00
40 changed files with 728 additions and 357 deletions

View file

```diff
@@ -718,7 +718,7 @@ jobs:
     timeout-minutes: 60
     runs-on: github-8vcpu-ubuntu-2404
     if: |
-      ( startsWith(github.ref, 'refs/tags/v')
+      false && ( startsWith(github.ref, 'refs/tags/v')
       || contains(github.event.pull_request.labels.*.name, 'run-bundling') )
     needs: [linux_tests]
     name: Build Zed on FreeBSD
```

Cargo.lock generated
View file

```diff
@@ -4065,6 +4065,8 @@ dependencies = [
  "minidumper",
  "paths",
  "release_channel",
+ "serde",
+ "serde_json",
  "smol",
  "workspace-hack",
 ]

@@ -20500,7 +20502,7 @@ dependencies = [
 [[package]]
 name = "zed"
-version = "0.200.0"
+version = "0.200.3"
 dependencies = [
  "activity_indicator",
  "agent",
```

View file

```diff
@@ -844,11 +844,17 @@ impl Thread {
             .await
             .unwrap_or(false);
-        if !equal {
-            this.update(cx, |this, cx| {
-                this.insert_checkpoint(pending_checkpoint, cx)
-            })?;
-        }
+        this.update(cx, |this, cx| {
+            this.pending_checkpoint = if equal {
+                Some(pending_checkpoint)
+            } else {
+                this.insert_checkpoint(pending_checkpoint, cx);
+                Some(ThreadCheckpoint {
+                    message_id: this.next_message_id,
+                    git_checkpoint: final_checkpoint,
+                })
+            }
+        })?;
         Ok(())
     }
```

View file

```diff
@@ -300,6 +300,7 @@ impl AgentConfiguration {
             )
             .child(
                 div()
+                    .w_full()
                     .px_2()
                     .when(is_expanded, |parent| match configuration_view {
                         Some(configuration_view) => parent.child(configuration_view),
```

View file

```diff
@@ -7,10 +7,12 @@ use gpui::{DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, Render, T
 use language_model::LanguageModelRegistry;
 use language_models::{
     AllLanguageModelSettings, OpenAiCompatibleSettingsContent,
-    provider::open_ai_compatible::AvailableModel,
+    provider::open_ai_compatible::{AvailableModel, ModelCapabilities},
 };
 use settings::update_settings_file;
-use ui::{Banner, KeyBinding, Modal, ModalFooter, ModalHeader, Section, prelude::*};
+use ui::{
+    Banner, Checkbox, KeyBinding, Modal, ModalFooter, ModalHeader, Section, ToggleState, prelude::*,
+};
 use ui_input::SingleLineInput;
 use workspace::{ModalView, Workspace};
```
```diff
@@ -69,11 +71,19 @@ impl AddLlmProviderInput {
     }
 }

+struct ModelCapabilityToggles {
+    pub supports_tools: ToggleState,
+    pub supports_images: ToggleState,
+    pub supports_parallel_tool_calls: ToggleState,
+    pub supports_prompt_cache_key: ToggleState,
+}
+
 struct ModelInput {
     name: Entity<SingleLineInput>,
     max_completion_tokens: Entity<SingleLineInput>,
     max_output_tokens: Entity<SingleLineInput>,
     max_tokens: Entity<SingleLineInput>,
+    capabilities: ModelCapabilityToggles,
 }

 impl ModelInput {
```
```diff
@@ -100,11 +110,23 @@ impl ModelInput {
             cx,
         );
         let max_tokens = single_line_input("Max Tokens", "Max Tokens", Some("200000"), window, cx);
+        let ModelCapabilities {
+            tools,
+            images,
+            parallel_tool_calls,
+            prompt_cache_key,
+        } = ModelCapabilities::default();
         Self {
             name: model_name,
             max_completion_tokens,
             max_output_tokens,
             max_tokens,
+            capabilities: ModelCapabilityToggles {
+                supports_tools: tools.into(),
+                supports_images: images.into(),
+                supports_parallel_tool_calls: parallel_tool_calls.into(),
+                supports_prompt_cache_key: prompt_cache_key.into(),
+            },
         }
     }
```
```diff
@@ -136,6 +158,12 @@ impl ModelInput {
                 .text(cx)
                 .parse::<u64>()
                 .map_err(|_| SharedString::from("Max Tokens must be a number"))?,
+            capabilities: ModelCapabilities {
+                tools: self.capabilities.supports_tools.selected(),
+                images: self.capabilities.supports_images.selected(),
+                parallel_tool_calls: self.capabilities.supports_parallel_tool_calls.selected(),
+                prompt_cache_key: self.capabilities.supports_prompt_cache_key.selected(),
+            },
         })
     }
 }
```
```diff
@@ -322,6 +350,55 @@ impl AddLlmProviderModal {
                     .child(model.max_output_tokens.clone()),
             )
             .child(model.max_tokens.clone())
+            .child(
+                v_flex()
+                    .gap_1()
+                    .child(
+                        Checkbox::new(("supports-tools", ix), model.capabilities.supports_tools)
+                            .label("Supports tools")
+                            .on_click(cx.listener(move |this, checked, _window, cx| {
+                                this.input.models[ix].capabilities.supports_tools = *checked;
+                                cx.notify();
+                            })),
+                    )
+                    .child(
+                        Checkbox::new(("supports-images", ix), model.capabilities.supports_images)
+                            .label("Supports images")
+                            .on_click(cx.listener(move |this, checked, _window, cx| {
+                                this.input.models[ix].capabilities.supports_images = *checked;
+                                cx.notify();
+                            })),
+                    )
+                    .child(
+                        Checkbox::new(
+                            ("supports-parallel-tool-calls", ix),
+                            model.capabilities.supports_parallel_tool_calls,
+                        )
+                        .label("Supports parallel_tool_calls")
+                        .on_click(cx.listener(
+                            move |this, checked, _window, cx| {
+                                this.input.models[ix]
+                                    .capabilities
+                                    .supports_parallel_tool_calls = *checked;
+                                cx.notify();
+                            },
+                        )),
+                    )
+                    .child(
+                        Checkbox::new(
+                            ("supports-prompt-cache-key", ix),
+                            model.capabilities.supports_prompt_cache_key,
+                        )
+                        .label("Supports prompt_cache_key")
+                        .on_click(cx.listener(
+                            move |this, checked, _window, cx| {
+                                this.input.models[ix].capabilities.supports_prompt_cache_key =
+                                    *checked;
+                                cx.notify();
+                            },
+                        )),
+                    ),
+            )
             .when(has_more_than_one_model, |this| {
                 this.child(
                     Button::new(("remove-model", ix), "Remove Model")
```
```diff
@@ -562,6 +639,93 @@ mod tests {
         );
     }

+    #[gpui::test]
+    async fn test_model_input_default_capabilities(cx: &mut TestAppContext) {
+        let cx = setup_test(cx).await;
+        cx.update(|window, cx| {
+            let model_input = ModelInput::new(window, cx);
+            model_input.name.update(cx, |input, cx| {
+                input.editor().update(cx, |editor, cx| {
+                    editor.set_text("somemodel", window, cx);
+                });
+            });
+            assert_eq!(
+                model_input.capabilities.supports_tools,
+                ToggleState::Selected
+            );
+            assert_eq!(
+                model_input.capabilities.supports_images,
+                ToggleState::Unselected
+            );
+            assert_eq!(
+                model_input.capabilities.supports_parallel_tool_calls,
+                ToggleState::Unselected
+            );
+            assert_eq!(
+                model_input.capabilities.supports_prompt_cache_key,
+                ToggleState::Unselected
+            );
+            let parsed_model = model_input.parse(cx).unwrap();
+            assert_eq!(parsed_model.capabilities.tools, true);
+            assert_eq!(parsed_model.capabilities.images, false);
+            assert_eq!(parsed_model.capabilities.parallel_tool_calls, false);
+            assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
+        });
+    }
+
+    #[gpui::test]
+    async fn test_model_input_deselected_capabilities(cx: &mut TestAppContext) {
+        let cx = setup_test(cx).await;
+        cx.update(|window, cx| {
+            let mut model_input = ModelInput::new(window, cx);
+            model_input.name.update(cx, |input, cx| {
+                input.editor().update(cx, |editor, cx| {
+                    editor.set_text("somemodel", window, cx);
+                });
+            });
+            model_input.capabilities.supports_tools = ToggleState::Unselected;
+            model_input.capabilities.supports_images = ToggleState::Unselected;
+            model_input.capabilities.supports_parallel_tool_calls = ToggleState::Unselected;
+            model_input.capabilities.supports_prompt_cache_key = ToggleState::Unselected;
+            let parsed_model = model_input.parse(cx).unwrap();
+            assert_eq!(parsed_model.capabilities.tools, false);
+            assert_eq!(parsed_model.capabilities.images, false);
+            assert_eq!(parsed_model.capabilities.parallel_tool_calls, false);
+            assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
+        });
+    }
+
+    #[gpui::test]
+    async fn test_model_input_with_name_and_capabilities(cx: &mut TestAppContext) {
+        let cx = setup_test(cx).await;
+        cx.update(|window, cx| {
+            let mut model_input = ModelInput::new(window, cx);
+            model_input.name.update(cx, |input, cx| {
+                input.editor().update(cx, |editor, cx| {
+                    editor.set_text("somemodel", window, cx);
+                });
+            });
+            model_input.capabilities.supports_tools = ToggleState::Selected;
+            model_input.capabilities.supports_images = ToggleState::Unselected;
+            model_input.capabilities.supports_parallel_tool_calls = ToggleState::Selected;
+            model_input.capabilities.supports_prompt_cache_key = ToggleState::Unselected;
+            let parsed_model = model_input.parse(cx).unwrap();
+            assert_eq!(parsed_model.name, "somemodel");
+            assert_eq!(parsed_model.capabilities.tools, true);
+            assert_eq!(parsed_model.capabilities.images, false);
+            assert_eq!(parsed_model.capabilities.parallel_tool_calls, true);
+            assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
+        });
+    }
+
     async fn setup_test(cx: &mut TestAppContext) -> &mut VisualTestContext {
         cx.update(|cx| {
             let store = SettingsStore::test(cx);
```

View file

```diff
@@ -21,7 +21,7 @@ use language::{
     point_from_lsp, point_to_lsp,
 };
 use lsp::{LanguageServer, LanguageServerBinary, LanguageServerId, LanguageServerName};
-use node_runtime::NodeRuntime;
+use node_runtime::{NodeRuntime, VersionCheck};
 use parking_lot::Mutex;
 use project::DisableAiSettings;
 use request::StatusNotification;

@@ -1169,9 +1169,8 @@ async fn get_copilot_lsp(fs: Arc<dyn Fs>, node_runtime: NodeRuntime) -> anyhow::
     const SERVER_PATH: &str =
         "node_modules/@github/copilot-language-server/dist/language-server.js";

-    let latest_version = node_runtime
-        .npm_package_latest_version(PACKAGE_NAME)
-        .await?;
+    // pinning it: https://github.com/zed-industries/zed/issues/36093
+    const PINNED_VERSION: &str = "1.354";

     let server_path = paths::copilot_dir().join(SERVER_PATH);
     fs.create_dir(paths::copilot_dir()).await?;

@@ -1181,12 +1180,13 @@ async fn get_copilot_lsp(fs: Arc<dyn Fs>, node_runtime: NodeRuntime) -> anyhow::
             PACKAGE_NAME,
             &server_path,
             paths::copilot_dir(),
-            &latest_version,
+            &PINNED_VERSION,
+            VersionCheck::VersionMismatch,
         )
         .await;

     if should_install {
         node_runtime
-            .npm_install_packages(paths::copilot_dir(), &[(PACKAGE_NAME, &latest_version)])
+            .npm_install_packages(paths::copilot_dir(), &[(PACKAGE_NAME, &PINNED_VERSION)])
             .await?;
     }
```
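The pinned install above hinges on a version-mismatch check. A sketch of that decision, using a hypothetical helper rather than node_runtime's actual API:

```rust
// Reinstall when nothing is installed or the installed version differs
// from the pinned one.
fn should_install(installed: Option<&str>, pinned: &str) -> bool {
    match installed {
        Some(version) => version != pinned,
        None => true,
    }
}
```

Comparing against the pinned version in both directions also downgrades installs that had already picked up a broken newer release.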

View file

```diff
@@ -12,6 +12,8 @@ minidumper.workspace = true
 paths.workspace = true
 release_channel.workspace = true
 smol.workspace = true
+serde.workspace = true
+serde_json.workspace = true
 workspace-hack.workspace = true

 [lints]
```

View file

```diff
@@ -2,15 +2,17 @@ use crash_handler::CrashHandler;
 use log::info;
 use minidumper::{Client, LoopAction, MinidumpBinary};
 use release_channel::{RELEASE_CHANNEL, ReleaseChannel};
+use serde::{Deserialize, Serialize};
 use std::{
     env,
-    fs::File,
+    fs::{self, File},
     io,
+    panic::Location,
     path::{Path, PathBuf},
     process::{self, Command},
     sync::{
-        LazyLock, OnceLock,
+        Arc, OnceLock,
         atomic::{AtomicBool, Ordering},
     },
     thread,
```
```diff
@@ -18,19 +20,17 @@ use std::{
 };

 // set once the crash handler has initialized and the client has connected to it
-pub static CRASH_HANDLER: AtomicBool = AtomicBool::new(false);
+pub static CRASH_HANDLER: OnceLock<Arc<Client>> = OnceLock::new();
 // set when the first minidump request is made to avoid generating duplicate crash reports
 pub static REQUESTED_MINIDUMP: AtomicBool = AtomicBool::new(false);
-const CRASH_HANDLER_TIMEOUT: Duration = Duration::from_secs(60);
+const CRASH_HANDLER_PING_TIMEOUT: Duration = Duration::from_secs(60);
+const CRASH_HANDLER_CONNECT_TIMEOUT: Duration = Duration::from_secs(10);

-pub static GENERATE_MINIDUMPS: LazyLock<bool> = LazyLock::new(|| {
-    *RELEASE_CHANNEL != ReleaseChannel::Dev || env::var("ZED_GENERATE_MINIDUMPS").is_ok()
-});
-
-pub async fn init(id: String) {
-    if !*GENERATE_MINIDUMPS {
+pub async fn init(crash_init: InitCrashHandler) {
+    if *RELEASE_CHANNEL == ReleaseChannel::Dev && env::var("ZED_GENERATE_MINIDUMPS").is_err() {
         return;
     }

     let exe = env::current_exe().expect("unable to find ourselves");
     let zed_pid = process::id();
     // TODO: we should be able to get away with using 1 crash-handler process per machine,
```
```diff
@@ -61,9 +61,11 @@ pub async fn init(id: String) {
         smol::Timer::after(retry_frequency).await;
     }
     let client = maybe_client.unwrap();
-    client.send_message(1, id).unwrap(); // set session id on the server
+    client
+        .send_message(1, serde_json::to_vec(&crash_init).unwrap())
+        .unwrap();

-    let client = std::sync::Arc::new(client);
+    let client = Arc::new(client);
     let handler = crash_handler::CrashHandler::attach(unsafe {
         let client = client.clone();
         crash_handler::make_crash_event(move |crash_context: &crash_handler::CrashContext| {
```
```diff
@@ -72,7 +74,6 @@ pub async fn init(id: String) {
                 .compare_exchange(false, true, Ordering::Acquire, Ordering::Relaxed)
                 .is_ok()
             {
-                client.send_message(2, "mistakes were made").unwrap();
                 client.ping().unwrap();
                 client.request_dump(crash_context).is_ok()
             } else {
```
```diff
@@ -87,7 +88,7 @@ pub async fn init(id: String) {
     {
         handler.set_ptracer(Some(server_pid));
     }
-    CRASH_HANDLER.store(true, Ordering::Release);
+    CRASH_HANDLER.set(client.clone()).ok();
     std::mem::forget(handler);
     info!("crash handler registered");
```
```diff
@@ -98,14 +99,43 @@ pub async fn init(id: String) {
 }

 pub struct CrashServer {
-    session_id: OnceLock<String>,
+    initialization_params: OnceLock<InitCrashHandler>,
+    panic_info: OnceLock<CrashPanic>,
+    has_connection: Arc<AtomicBool>,
+}
+
+#[derive(Debug, Deserialize, Serialize, Clone)]
+pub struct CrashInfo {
+    pub init: InitCrashHandler,
+    pub panic: Option<CrashPanic>,
+}
+
+#[derive(Debug, Deserialize, Serialize, Clone)]
+pub struct InitCrashHandler {
+    pub session_id: String,
+    pub zed_version: String,
+    pub release_channel: String,
+    pub commit_sha: String,
+    // pub gpu: String,
+}
+
+#[derive(Deserialize, Serialize, Debug, Clone)]
+pub struct CrashPanic {
+    pub message: String,
+    pub span: String,
 }

 impl minidumper::ServerHandler for CrashServer {
     fn create_minidump_file(&self) -> Result<(File, PathBuf), io::Error> {
-        let err_message = "Need to send a message with the ID upon starting the crash handler";
+        let err_message = "Missing initialization data";
         let dump_path = paths::logs_dir()
-            .join(self.session_id.get().expect(err_message))
+            .join(
+                &self
+                    .initialization_params
+                    .get()
+                    .expect(err_message)
+                    .session_id,
+            )
             .with_extension("dmp");
         let file = File::create(&dump_path)?;
         Ok((file, dump_path))
```
```diff
@@ -122,38 +152,71 @@ impl minidumper::ServerHandler for CrashServer {
                 info!("failed to write minidump: {:#}", e);
             }
         }
+
+        let crash_info = CrashInfo {
+            init: self
+                .initialization_params
+                .get()
+                .expect("not initialized")
+                .clone(),
+            panic: self.panic_info.get().cloned(),
+        };
+
+        let crash_data_path = paths::logs_dir()
+            .join(&crash_info.init.session_id)
+            .with_extension("json");
+
+        fs::write(crash_data_path, serde_json::to_vec(&crash_info).unwrap()).ok();
+
         LoopAction::Exit
     }

     fn on_message(&self, kind: u32, buffer: Vec<u8>) {
-        let message = String::from_utf8(buffer).expect("invalid utf-8");
-        info!("kind: {kind}, message: {message}",);
-        if kind == 1 {
-            self.session_id
-                .set(message)
-                .expect("session id already initialized");
+        match kind {
+            1 => {
+                let init_data =
+                    serde_json::from_slice::<InitCrashHandler>(&buffer).expect("invalid init data");
+                self.initialization_params
+                    .set(init_data)
+                    .expect("already initialized");
+            }
+            2 => {
+                let panic_data =
+                    serde_json::from_slice::<CrashPanic>(&buffer).expect("invalid panic data");
+                self.panic_info.set(panic_data).expect("already panicked");
+            }
+            _ => {
+                panic!("invalid message kind");
+            }
         }
     }

-    fn on_client_disconnected(&self, clients: usize) -> LoopAction {
-        info!("client disconnected, {clients} remaining");
-        if clients == 0 {
-            LoopAction::Exit
-        } else {
-            LoopAction::Continue
-        }
+    fn on_client_disconnected(&self, _clients: usize) -> LoopAction {
+        LoopAction::Exit
+    }
+
+    fn on_client_connected(&self, _clients: usize) -> LoopAction {
+        self.has_connection.store(true, Ordering::SeqCst);
+        LoopAction::Continue
     }
 }

-pub fn handle_panic() {
-    if !*GENERATE_MINIDUMPS {
-        return;
-    }
+pub fn handle_panic(message: String, span: Option<&Location>) {
+    let span = span
+        .map(|loc| format!("{}:{}", loc.file(), loc.line()))
+        .unwrap_or_default();
+
     // wait 500ms for the crash handler process to start up
     // if it's still not there just write panic info and no minidump
     let retry_frequency = Duration::from_millis(100);
     for _ in 0..5 {
-        if CRASH_HANDLER.load(Ordering::Acquire) {
+        if let Some(client) = CRASH_HANDLER.get() {
+            client
+                .send_message(
+                    2,
+                    serde_json::to_vec(&CrashPanic { message, span }).unwrap(),
+                )
+                .ok();
+
             log::error!("triggering a crash to generate a minidump...");
             #[cfg(target_os = "linux")]
             CrashHandler.simulate_signal(crash_handler::Signal::Trap as u32);
```
```diff
@@ -170,14 +233,30 @@ pub fn crash_server(socket: &Path) {
         log::info!("Couldn't create socket, there may already be a running crash server");
         return;
     };
-    let ab = AtomicBool::new(false);
+
+    let shutdown = Arc::new(AtomicBool::new(false));
+    let has_connection = Arc::new(AtomicBool::new(false));
+
+    std::thread::spawn({
+        let shutdown = shutdown.clone();
+        let has_connection = has_connection.clone();
+        move || {
+            std::thread::sleep(CRASH_HANDLER_CONNECT_TIMEOUT);
+            if !has_connection.load(Ordering::SeqCst) {
+                shutdown.store(true, Ordering::SeqCst);
+            }
+        }
+    });
+
     server
         .run(
             Box::new(CrashServer {
-                session_id: OnceLock::new(),
+                initialization_params: OnceLock::new(),
+                panic_info: OnceLock::new(),
+                has_connection,
             }),
-            &ab,
-            Some(CRASH_HANDLER_TIMEOUT),
+            &shutdown,
+            Some(CRASH_HANDLER_PING_TIMEOUT),
         )
         .expect("failed to run server");
 }
```

View file

```diff
@@ -20200,6 +20200,7 @@ impl Editor {
         );

         let old_cursor_shape = self.cursor_shape;
+        let old_show_breadcrumbs = self.show_breadcrumbs;

         {
             let editor_settings = EditorSettings::get_global(cx);

@@ -20213,6 +20214,10 @@ impl Editor {
             cx.emit(EditorEvent::CursorShapeChanged);
         }

+        if old_show_breadcrumbs != self.show_breadcrumbs {
+            cx.emit(EditorEvent::BreadcrumbsChanged);
+        }
+
         let project_settings = ProjectSettings::get_global(cx);
         self.serialize_dirty_buffers =
             !self.mode.is_minimap() && project_settings.session.restore_unsaved_buffers;

@@ -22834,6 +22839,7 @@ pub enum EditorEvent {
     },
     Reloaded,
     CursorShapeChanged,
+    BreadcrumbsChanged,
     PushedToNavHistory {
         anchor: Anchor,
         is_deactivate: bool,
```

View file

```diff
@@ -1036,6 +1036,10 @@ impl Item for Editor {
                 f(ItemEvent::UpdateBreadcrumbs);
             }

+            EditorEvent::BreadcrumbsChanged => {
+                f(ItemEvent::UpdateBreadcrumbs);
+            }
+
             EditorEvent::DirtyChanged => {
                 f(ItemEvent::UpdateTab);
             }
```

View file

```diff
@@ -941,6 +941,7 @@ impl LanguageModel for CloudLanguageModel {
             request,
             model.id(),
             model.supports_parallel_tool_calls(),
+            model.supports_prompt_cache_key(),
             None,
             None,
         );
```

View file

```diff
@@ -370,6 +370,7 @@ impl LanguageModel for OpenAiLanguageModel {
             request,
             self.model.id(),
             self.model.supports_parallel_tool_calls(),
+            self.model.supports_prompt_cache_key(),
             self.max_output_tokens(),
             self.model.reasoning_effort(),
         );

@@ -386,6 +387,7 @@ pub fn into_open_ai(
     request: LanguageModelRequest,
     model_id: &str,
     supports_parallel_tool_calls: bool,
+    supports_prompt_cache_key: bool,
     max_output_tokens: Option<u64>,
     reasoning_effort: Option<ReasoningEffort>,
 ) -> open_ai::Request {

@@ -477,7 +479,11 @@ pub fn into_open_ai(
         } else {
             None
         },
-        prompt_cache_key: request.thread_id,
+        prompt_cache_key: if supports_prompt_cache_key {
+            request.thread_id
+        } else {
+            None
+        },
         tools: request
             .tools
             .into_iter()
```

View file

@ -38,6 +38,27 @@ pub struct AvailableModel {
pub max_tokens: u64, pub max_tokens: u64,
pub max_output_tokens: Option<u64>, pub max_output_tokens: Option<u64>,
pub max_completion_tokens: Option<u64>, pub max_completion_tokens: Option<u64>,
#[serde(default)]
pub capabilities: ModelCapabilities,
}
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]
pub struct ModelCapabilities {
pub tools: bool,
pub images: bool,
pub parallel_tool_calls: bool,
pub prompt_cache_key: bool,
}
impl Default for ModelCapabilities {
fn default() -> Self {
Self {
tools: true,
images: false,
parallel_tool_calls: false,
prompt_cache_key: false,
}
}
} }
pub struct OpenAiCompatibleLanguageModelProvider { pub struct OpenAiCompatibleLanguageModelProvider {
@@ -293,17 +314,17 @@ impl LanguageModel for OpenAiCompatibleLanguageModel {
     }

     fn supports_tools(&self) -> bool {
-        true
+        self.model.capabilities.tools
     }

     fn supports_images(&self) -> bool {
-        false
+        self.model.capabilities.images
     }

     fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
         match choice {
-            LanguageModelToolChoice::Auto => true,
-            LanguageModelToolChoice::Any => true,
+            LanguageModelToolChoice::Auto => self.model.capabilities.tools,
+            LanguageModelToolChoice::Any => self.model.capabilities.tools,
             LanguageModelToolChoice::None => true,
         }
     }
@@ -358,7 +379,8 @@ impl LanguageModel for OpenAiCompatibleLanguageModel {
         let request = into_open_ai(
             request,
             &self.model.name,
-            true,
+            self.model.capabilities.parallel_tool_calls,
+            self.model.capabilities.prompt_cache_key,
             self.max_output_tokens(),
             None,
         );
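The new `ModelCapabilities` struct defaults to tool support on and everything else off, so a provider entry without a `capabilities` field keeps the previous tool-calling behavior while opting out of images, parallel tool calls, and prompt cache keys via `#[serde(default)]`. A minimal stand-alone sketch of those defaults (serde derives omitted):

```rust
// Stand-alone replica of the ModelCapabilities defaults added above.
// The real struct also derives Serialize/Deserialize/JsonSchema.
#[derive(Clone, Debug, PartialEq)]
pub struct ModelCapabilities {
    pub tools: bool,
    pub images: bool,
    pub parallel_tool_calls: bool,
    pub prompt_cache_key: bool,
}

impl Default for ModelCapabilities {
    fn default() -> Self {
        Self {
            tools: true,             // tool calling stays on by default
            images: false,           // image input must be opted into
            parallel_tool_calls: false,
            prompt_cache_key: false,
        }
    }
}

fn main() {
    // A model configured without a `capabilities` field falls back to these
    // defaults, so `supports_tools()` still returns true for existing configs.
    let caps = ModelCapabilities::default();
    assert!(caps.tools);
    assert!(!caps.images);
    println!("{caps:?}");
}
```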


@@ -355,6 +355,7 @@ impl LanguageModel for VercelLanguageModel {
             request,
             self.model.id(),
             self.model.supports_parallel_tool_calls(),
+            self.model.supports_prompt_cache_key(),
             self.max_output_tokens(),
             None,
         );


@@ -359,6 +359,7 @@ impl LanguageModel for XAiLanguageModel {
             request,
             self.model.id(),
             self.model.supports_parallel_tool_calls(),
+            self.model.supports_prompt_cache_key(),
             self.max_output_tokens(),
             None,
         );


@@ -103,7 +103,13 @@ impl LspAdapter for CssLspAdapter {
         let should_install_language_server = self
             .node
-            .should_install_npm_package(Self::PACKAGE_NAME, &server_path, &container_dir, &version)
+            .should_install_npm_package(
+                Self::PACKAGE_NAME,
+                &server_path,
+                &container_dir,
+                &version,
+                Default::default(),
+            )
             .await;

         if should_install_language_server {


@@ -340,7 +340,13 @@ impl LspAdapter for JsonLspAdapter {
         let should_install_language_server = self
             .node
-            .should_install_npm_package(Self::PACKAGE_NAME, &server_path, &container_dir, &version)
+            .should_install_npm_package(
+                Self::PACKAGE_NAME,
+                &server_path,
+                &container_dir,
+                &version,
+                Default::default(),
+            )
             .await;

         if should_install_language_server {


@@ -206,6 +206,7 @@ impl LspAdapter for PythonLspAdapter {
                 &server_path,
                 &container_dir,
                 &version,
+                Default::default(),
             )
             .await;


@@ -108,7 +108,13 @@ impl LspAdapter for TailwindLspAdapter {
         let should_install_language_server = self
             .node
-            .should_install_npm_package(Self::PACKAGE_NAME, &server_path, &container_dir, &version)
+            .should_install_npm_package(
+                Self::PACKAGE_NAME,
+                &server_path,
+                &container_dir,
+                &version,
+                Default::default(),
+            )
             .await;

         if should_install_language_server {


@@ -589,6 +589,7 @@ impl LspAdapter for TypeScriptLspAdapter {
                 &server_path,
                 &container_dir,
                 version.typescript_version.as_str(),
+                Default::default(),
             )
             .await;


@@ -116,6 +116,7 @@ impl LspAdapter for VtslsLspAdapter {
                 &server_path,
                 &container_dir,
                 &latest_version.server_version,
+                Default::default(),
             )
             .await
         {
@@ -129,6 +130,7 @@
                 &container_dir.join(Self::TYPESCRIPT_TSDK_PATH),
                 &container_dir,
                 &latest_version.typescript_version,
+                Default::default(),
             )
             .await
         {


@@ -104,7 +104,13 @@ impl LspAdapter for YamlLspAdapter {
         let should_install_language_server = self
             .node
-            .should_install_npm_package(Self::PACKAGE_NAME, &server_path, &container_dir, &version)
+            .should_install_npm_package(
+                Self::PACKAGE_NAME,
+                &server_path,
+                &container_dir,
+                &version,
+                Default::default(),
+            )
             .await;

         if should_install_language_server {


@@ -29,6 +29,15 @@ pub struct NodeBinaryOptions {
     pub use_paths: Option<(PathBuf, PathBuf)>,
 }

+#[derive(Default)]
+pub enum VersionCheck {
+    /// Check whether the installed and requested version have a mismatch
+    VersionMismatch,
+    /// Only check whether the currently installed version is older than the newest one
+    #[default]
+    OlderVersion,
+}
+
 #[derive(Clone)]
 pub struct NodeRuntime(Arc<Mutex<NodeRuntimeState>>);
@@ -287,6 +296,7 @@ impl NodeRuntime {
         local_executable_path: &Path,
         local_package_directory: &Path,
         latest_version: &str,
+        version_check: VersionCheck,
     ) -> bool {
         // In the case of the local system not having the package installed,
         // or in the instances where we fail to parse package.json data,
@@ -311,7 +321,10 @@
             return true;
         };

-        installed_version < latest_version
+        match version_check {
+            VersionCheck::VersionMismatch => installed_version != latest_version,
+            VersionCheck::OlderVersion => installed_version < latest_version,
+        }
     }
 }
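The new `VersionCheck` parameter changes when `should_install_npm_package` decides to reinstall: the default `OlderVersion` policy keeps the previous `<` comparison, while `VersionMismatch` pins to the exact requested version. A stand-alone sketch of the two policies, with `(major, minor, patch)` tuples standing in for the real parsed version type:

```rust
// Sketch of the VersionCheck policies added above; assumes dotted
// numeric versions purely for illustration.
#[derive(Default)]
pub enum VersionCheck {
    /// Reinstall on any mismatch between installed and requested version.
    VersionMismatch,
    /// Reinstall only when the installed version is older than the newest one.
    #[default]
    OlderVersion,
}

fn parse(v: &str) -> Option<(u32, u32, u32)> {
    let mut parts = v.split('.').map(|p| p.parse().ok());
    Some((parts.next()??, parts.next()??, parts.next()??))
}

fn should_install(installed: &str, latest: &str, check: VersionCheck) -> bool {
    let (Some(installed), Some(latest)) = (parse(installed), parse(latest)) else {
        // Mirroring the diff: failing to read a version falls back to reinstalling.
        return true;
    };
    match check {
        VersionCheck::VersionMismatch => installed != latest,
        VersionCheck::OlderVersion => installed < latest,
    }
}

fn main() {
    // OlderVersion tolerates a locally newer package...
    assert!(!should_install("2.1.0", "2.0.0", VersionCheck::OlderVersion));
    // ...while VersionMismatch forces the exact requested version.
    assert!(should_install("2.1.0", "2.0.0", VersionCheck::VersionMismatch));
    assert!(should_install("1.0.0", "1.0.1", VersionCheck::OlderVersion));
}
```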


@@ -721,7 +721,7 @@ fn render_popular_settings_section(
         .items_start()
         .justify_between()
         .child(
-            v_flex().child(Label::new("Mini Map")).child(
+            v_flex().child(Label::new("Minimap")).child(
                 Label::new("See a high-level overview of your source code.")
                     .color(Color::Muted),
             ),


@@ -236,6 +236,13 @@ impl Model {
             Self::O1 | Self::O3 | Self::O3Mini | Self::O4Mini | Model::Custom { .. } => false,
         }
     }
+
+    /// Returns whether the given model supports the `prompt_cache_key` parameter.
+    ///
+    /// If the model does not support the parameter, do not pass it up.
+    pub fn supports_prompt_cache_key(&self) -> bool {
+        return true;
+    }
 }

 #[derive(Debug, Serialize, Deserialize)]
@@ -257,6 +264,7 @@ pub struct Request {
     pub tools: Vec<ToolDefinition>,
     #[serde(default, skip_serializing_if = "Option::is_none")]
     pub prompt_cache_key: Option<String>,
+    #[serde(default, skip_serializing_if = "Option::is_none")]
     pub reasoning_effort: Option<ReasoningEffort>,
 }


@@ -11813,14 +11813,16 @@ impl LspStore {
                     notify_server_capabilities_updated(&server, cx);
                 }
             }
-            "textDocument/synchronization" => {
-                if let Some(caps) = reg
+            "textDocument/didChange" => {
+                if let Some(sync_kind) = reg
                     .register_options
-                    .map(serde_json::from_value)
+                    .and_then(|opts| opts.get("syncKind").cloned())
+                    .map(serde_json::from_value::<lsp::TextDocumentSyncKind>)
                     .transpose()?
                 {
                     server.update_capabilities(|capabilities| {
-                        capabilities.text_document_sync = Some(caps);
+                        capabilities.text_document_sync =
+                            Some(lsp::TextDocumentSyncCapability::Kind(sync_kind));
                     });
                     notify_server_capabilities_updated(&server, cx);
                 }
@@ -11970,7 +11972,7 @@ impl LspStore {
                 });
                 notify_server_capabilities_updated(&server, cx);
             }
-            "textDocument/synchronization" => {
+            "textDocument/didChange" => {
                 server.update_capabilities(|capabilities| {
                     capabilities.text_document_sync = None;
                 });


@@ -84,11 +84,13 @@ message GetCrashFiles {

 message GetCrashFilesResponse {
     repeated CrashReport crashes = 1;
+    repeated string legacy_panics = 2;
 }

 message CrashReport {
-    optional string panic_contents = 1;
-    optional bytes minidump_contents = 2;
+    reserved 1, 2;
+    string metadata = 3;
+    bytes minidump_contents = 4;
 }

 message Extension {


@@ -1484,20 +1484,17 @@ impl RemoteConnection for SshRemoteConnection {
             identifier = &unique_identifier,
         );

-        if let Some(rust_log) = std::env::var("RUST_LOG").ok() {
-            start_proxy_command = format!(
-                "RUST_LOG={} {}",
-                shlex::try_quote(&rust_log).unwrap(),
-                start_proxy_command
-            )
-        }
-        if let Some(rust_backtrace) = std::env::var("RUST_BACKTRACE").ok() {
-            start_proxy_command = format!(
-                "RUST_BACKTRACE={} {}",
-                shlex::try_quote(&rust_backtrace).unwrap(),
-                start_proxy_command
-            )
-        }
+        for env_var in ["RUST_LOG", "RUST_BACKTRACE", "ZED_GENERATE_MINIDUMPS"] {
+            if let Some(value) = std::env::var(env_var).ok() {
+                start_proxy_command = format!(
+                    "{}={} {} ",
+                    env_var,
+                    shlex::try_quote(&value).unwrap(),
+                    start_proxy_command,
+                );
+            }
+        }
+
         if reconnect {
             start_proxy_command.push_str(" --reconnect");
         }
@@ -2229,8 +2226,7 @@ impl SshRemoteConnection {
         #[cfg(not(target_os = "windows"))]
         {
-            run_cmd(Command::new("gzip").args(["-9", "-f", &bin_path.to_string_lossy()]))
-                .await?;
+            run_cmd(Command::new("gzip").args(["-f", &bin_path.to_string_lossy()])).await?;
         }
         #[cfg(target_os = "windows")]
         {
@@ -2462,7 +2458,7 @@ impl ChannelClient {
             },
             async {
                 smol::Timer::after(timeout).await;
-                anyhow::bail!("Timeout detected")
+                anyhow::bail!("Timed out resyncing remote client")
             },
         )
         .await
@@ -2476,7 +2472,7 @@ impl ChannelClient {
             },
             async {
                 smol::Timer::after(timeout).await;
-                anyhow::bail!("Timeout detected")
+                anyhow::bail!("Timed out pinging remote client")
             },
         )
         .await
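The refactored loop folds the forwarded variables into one pass, prepending each locally-set variable to the remote proxy command as a `VAR=value` prefix. A stand-alone sketch of that shape, with a hand-rolled single-quote escaper standing in for the `shlex::try_quote` call used in the diff:

```rust
// Sketch of the env-forwarding loop above. `quote` is a simplified
// POSIX single-quote escaper, not the real shlex implementation.
fn quote(value: &str) -> String {
    format!("'{}'", value.replace('\'', r"'\''"))
}

fn prefix_env_vars(mut command: String, vars: &[(&str, Option<&str>)]) -> String {
    for (name, value) in vars {
        if let Some(value) = value {
            // Each set variable is pushed onto the front of the command.
            command = format!("{}={} {}", name, quote(value), command);
        }
    }
    command
}

fn main() {
    let cmd = prefix_env_vars(
        "./proxy --identifier abc".into(),
        &[("RUST_LOG", Some("info")), ("RUST_BACKTRACE", None)],
    );
    // Unset variables are skipped; set ones end up as a prefix.
    assert_eq!(cmd, "RUST_LOG='info' ./proxy --identifier abc");
    println!("{cmd}");
}
```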


@@ -34,10 +34,10 @@ use smol::io::AsyncReadExt;
 use smol::Async;
 use smol::{net::unix::UnixListener, stream::StreamExt as _};
-use std::collections::HashMap;
 use std::ffi::OsStr;
 use std::ops::ControlFlow;
 use std::str::FromStr;
+use std::sync::LazyLock;
 use std::{env, thread};
 use std::{
     io::Write,
@@ -48,6 +48,13 @@ use std::{
 use telemetry_events::LocationData;
 use util::ResultExt;

+pub static VERSION: LazyLock<&str> = LazyLock::new(|| match *RELEASE_CHANNEL {
+    ReleaseChannel::Stable | ReleaseChannel::Preview => env!("ZED_PKG_VERSION"),
+    ReleaseChannel::Nightly | ReleaseChannel::Dev => {
+        option_env!("ZED_COMMIT_SHA").unwrap_or("missing-zed-commit-sha")
+    }
+});
+
 fn init_logging_proxy() {
     env_logger::builder()
         .format(|buf, record| {
@@ -113,7 +120,6 @@ fn init_logging_server(log_file_path: PathBuf) -> Result<Receiver<Vec<u8>>> {
 fn init_panic_hook(session_id: String) {
     std::panic::set_hook(Box::new(move |info| {
-        crashes::handle_panic();
         let payload = info
             .payload()
             .downcast_ref::<&str>()
@@ -121,6 +127,8 @@
             .or_else(|| info.payload().downcast_ref::<String>().cloned())
             .unwrap_or_else(|| "Box<Any>".to_string());

+        crashes::handle_panic(payload.clone(), info.location());
+
         let backtrace = backtrace::Backtrace::new();
         let mut backtrace = backtrace
             .frames()
@@ -150,14 +158,6 @@
             (&backtrace).join("\n")
         );

-        let release_channel = *RELEASE_CHANNEL;
-        let version = match release_channel {
-            ReleaseChannel::Stable | ReleaseChannel::Preview => env!("ZED_PKG_VERSION"),
-            ReleaseChannel::Nightly | ReleaseChannel::Dev => {
-                option_env!("ZED_COMMIT_SHA").unwrap_or("missing-zed-commit-sha")
-            }
-        };
-
         let panic_data = telemetry_events::Panic {
             thread: thread_name.into(),
             payload: payload.clone(),
@@ -165,9 +165,9 @@
                 file: location.file().into(),
                 line: location.line(),
             }),
-            app_version: format!("remote-server-{version}"),
+            app_version: format!("remote-server-{}", *VERSION),
             app_commit_sha: option_env!("ZED_COMMIT_SHA").map(|sha| sha.into()),
-            release_channel: release_channel.dev_name().into(),
+            release_channel: RELEASE_CHANNEL.dev_name().into(),
             target: env!("TARGET").to_owned().into(),
             os_name: telemetry::os_name(),
             os_version: Some(telemetry::os_version()),
@@ -204,8 +204,8 @@ fn handle_crash_files_requests(project: &Entity<HeadlessProject>, client: &Arc<C
     client.add_request_handler(
         project.downgrade(),
         |_, _: TypedEnvelope<proto::GetCrashFiles>, _cx| async move {
+            let mut legacy_panics = Vec::new();
             let mut crashes = Vec::new();
-            let mut minidumps_by_session_id = HashMap::new();
             let mut children = smol::fs::read_dir(paths::logs_dir()).await?;
             while let Some(child) = children.next().await {
                 let child = child?;
@@ -227,41 +227,31 @@
                         .await
                         .context("error reading panic file")?;

-                    crashes.push(proto::CrashReport {
-                        panic_contents: Some(file_contents),
-                        minidump_contents: None,
-                    });
-                } else if extension == Some(OsStr::new("dmp")) {
-                    let session_id = child_path.file_stem().unwrap().to_string_lossy();
-                    minidumps_by_session_id
-                        .insert(session_id.to_string(), smol::fs::read(&child_path).await?);
-                }
+                    legacy_panics.push(file_contents);

-                // We've done what we can, delete the file
                     smol::fs::remove_file(&child_path)
                         .await
                         .context("error removing panic")
                         .log_err();
+                } else if extension == Some(OsStr::new("dmp")) {
+                    let mut json_path = child_path.clone();
+                    json_path.set_extension("json");
+                    if let Ok(json_content) = smol::fs::read_to_string(&json_path).await {
+                        crashes.push(CrashReport {
+                            metadata: json_content,
+                            minidump_contents: smol::fs::read(&child_path).await?,
+                        });
+                        smol::fs::remove_file(&child_path).await.log_err();
+                        smol::fs::remove_file(&json_path).await.log_err();
+                    } else {
+                        log::error!("Couldn't find json metadata for crash: {child_path:?}");
+                    }
                 }
             }

-            for crash in &mut crashes {
-                let panic: telemetry_events::Panic =
-                    serde_json::from_str(crash.panic_contents.as_ref().unwrap())?;
-                if let dump @ Some(_) = minidumps_by_session_id.remove(&panic.session_id) {
-                    crash.minidump_contents = dump;
-                }
-            }
-            crashes.extend(
-                minidumps_by_session_id
-                    .into_values()
-                    .map(|dmp| CrashReport {
-                        panic_contents: None,
-                        minidump_contents: Some(dmp),
-                    }),
-            );
-            anyhow::Ok(proto::GetCrashFilesResponse { crashes })
+            anyhow::Ok(proto::GetCrashFilesResponse {
+                crashes,
+                legacy_panics,
+            })
         },
     );
 }
@@ -442,7 +432,12 @@ pub fn execute_run(
     let app = gpui::Application::headless();
     let id = std::process::id().to_string();
     app.background_executor()
-        .spawn(crashes::init(id.clone()))
+        .spawn(crashes::init(crashes::InitCrashHandler {
+            session_id: id.clone(),
+            zed_version: VERSION.to_owned(),
+            release_channel: release_channel::RELEASE_CHANNEL_NAME.clone(),
+            commit_sha: option_env!("ZED_COMMIT_SHA").unwrap_or("no_sha").to_owned(),
+        }))
         .detach();
     init_panic_hook(id);
     let log_rx = init_logging_server(log_file)?;
@@ -569,7 +564,13 @@ pub fn execute_proxy(identifier: String, is_reconnecting: bool) -> Result<()> {
     let server_paths = ServerPaths::new(&identifier)?;

     let id = std::process::id().to_string();
-    smol::spawn(crashes::init(id.clone())).detach();
+    smol::spawn(crashes::init(crashes::InitCrashHandler {
+        session_id: id.clone(),
+        zed_version: VERSION.to_owned(),
+        release_channel: release_channel::RELEASE_CHANNEL_NAME.clone(),
+        commit_sha: option_env!("ZED_COMMIT_SHA").unwrap_or("no_sha").to_owned(),
+    }))
+    .detach();
     init_panic_hook(id);
     log::info!("starting proxy process. PID: {}", std::process::id());


@@ -928,14 +928,14 @@ impl<'a> KeybindUpdateTarget<'a> {
     }

         let action_name: Value = self.action_name.into();
         let value = match self.action_arguments {
-            Some(args) => {
+            Some(args) if !args.is_empty() => {
                 let args = serde_json::from_str::<Value>(args)
                     .context("Failed to parse action arguments as JSON")?;
                 serde_json::json!([action_name, args])
             }
-            None => action_name,
+            _ => action_name,
         };
-        return Ok(value);
+        Ok(value)
     }

     fn keystrokes_unparsed(&self) -> String {
@@ -1084,6 +1084,24 @@ mod tests {
             .unindent(),
         );

+        check_keymap_update(
+            "[]",
+            KeybindUpdateOperation::add(KeybindUpdateTarget {
+                keystrokes: &parse_keystrokes("ctrl-a"),
+                action_name: "zed::SomeAction",
+                context: None,
+                action_arguments: Some(""),
+            }),
+            r#"[
+                {
+                    "bindings": {
+                        "ctrl-a": "zed::SomeAction"
+                    }
+                }
+            ]"#
+            .unindent(),
+        );
+
         check_keymap_update(
             r#"[
                 {
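Both layers of the fix hinge on treating an empty arguments string as no arguments at all: `Option::filter` drops the empty editor text before it reaches JSON parsing, so the saved binding serializes as a bare action name rather than an `["action", args]` pair. A minimal sketch of that filter, with a hypothetical `effective_arguments` helper standing in for the editor plumbing:

```rust
// Sketch of the empty-argument handling shared by the two fixes above.
fn effective_arguments(editor_text: Option<String>) -> Option<String> {
    editor_text.filter(|args| !args.is_empty())
}

fn main() {
    // An empty arguments editor behaves exactly like no editor at all,
    // so the action serializes as a bare name.
    assert_eq!(effective_arguments(Some(String::new())), None);
    assert_eq!(effective_arguments(None), None);
    assert_eq!(
        effective_arguments(Some("{\"foo\": 1}".into())),
        Some("{\"foo\": 1}".to_string())
    );
}
```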


@@ -2150,7 +2150,8 @@ impl KeybindingEditorModal {
         let action_arguments = self
             .action_arguments_editor
             .as_ref()
-            .map(|editor| editor.read(cx).editor.read(cx).text(cx));
+            .map(|arguments_editor| arguments_editor.read(cx).editor.read(cx).text(cx))
+            .filter(|args| !args.is_empty());

         let value = action_arguments
             .as_ref()
@@ -2261,29 +2262,11 @@
         let create = self.creating;

-        let status_toast = StatusToast::new(
-            format!(
-                "Saved edits to the {} action.",
-                &self.editing_keybind.action().humanized_name
-            ),
-            cx,
-            move |this, _cx| {
-                this.icon(ToastIcon::new(IconName::Check).color(Color::Success))
-                    .dismiss_button(true)
-                // .action("Undo", f) todo: wire the undo functionality
-            },
-        );
-
-        self.workspace
-            .update(cx, |workspace, cx| {
-                workspace.toggle_status_toast(status_toast, cx);
-            })
-            .log_err();
-
         cx.spawn(async move |this, cx| {
             let action_name = existing_keybind.action().name;
+            let humanized_action_name = existing_keybind.action().humanized_name.clone();

-            if let Err(err) = save_keybinding_update(
+            match save_keybinding_update(
                 create,
                 existing_keybind,
                 &action_mapping,
@@ -2293,11 +2276,7 @@
             )
             .await
             {
-                this.update(cx, |this, cx| {
-                    this.set_error(InputError::error(err), cx);
-                })
-                .log_err();
-            } else {
+                Ok(_) => {
                     this.update(cx, |this, cx| {
                         this.keymap_editor.update(cx, |keymap, cx| {
                             keymap.previous_edit = Some(PreviousEdit::Keybinding {
@@ -2307,12 +2286,34 @@
                                     .table_interaction_state
                                     .read(cx)
                                     .get_scrollbar_offset(Axis::Vertical),
+                            });
+                            let status_toast = StatusToast::new(
+                                format!("Saved edits to the {} action.", humanized_action_name),
+                                cx,
+                                move |this, _cx| {
+                                    this.icon(ToastIcon::new(IconName::Check).color(Color::Success))
+                                        .dismiss_button(true)
+                                    // .action("Undo", f) todo: wire the undo functionality
+                                },
+                            );
+                            this.workspace
+                                .update(cx, |workspace, cx| {
+                                    workspace.toggle_status_toast(status_toast, cx);
                                 })
+                                .log_err();
                         });
                         cx.emit(DismissEvent);
                     })
                     .ok();
                 }
+                Err(err) => {
+                    this.update(cx, |this, cx| {
+                        this.set_error(InputError::error(err), cx);
+                    })
+                    .log_err();
+                }
+            }
         })
         .detach();
@@ -2983,7 +2984,7 @@ async fn save_keybinding_update(
     let updated_keymap_contents =
         settings::KeymapFile::update_keybinding(operation, keymap_contents, tab_size)
-            .context("Failed to update keybinding")?;
+            .map_err(|err| anyhow::anyhow!("Could not save updated keybinding: {}", err))?;
     fs.write(
         paths::keymap_file().as_path(),
         updated_keymap_contents.as_bytes(),


@@ -31,7 +31,7 @@ pub enum AnimationDirection {
     FromTop,
 }

-pub trait DefaultAnimations: Styled + Sized {
+pub trait DefaultAnimations: Styled + Sized + Element {
     fn animate_in(
         self,
         animation_type: AnimationDirection,
@@ -44,8 +44,13 @@
             AnimationDirection::FromTop => "animate_from_top",
         };

+        let animation_id = self.id().map_or_else(
+            || ElementId::from(animation_name),
+            |id| (id, animation_name).into(),
+        );
+
         self.with_animation(
-            animation_name,
+            animation_id,
             gpui::Animation::new(AnimationDuration::Fast.into()).with_easing(ease_out_quint()),
             move |mut this, delta| {
                 let start_opacity = 0.4;
@@ -91,7 +96,7 @@
     }
 }

-impl<E: Styled> DefaultAnimations for E {}
+impl<E: Styled + Element> DefaultAnimations for E {}

 // Don't use this directly, it only exists to show animation previews
 #[derive(RegisterComponent)]
@@ -132,7 +137,7 @@ impl Component for Animation {
                                 .left(px(offset))
                                 .rounded_md()
                                 .bg(gpui::red())
-                                .animate_in(AnimationDirection::FromBottom, false),
+                                .animate_in_from_bottom(false),
                         )
                         .into_any_element(),
                 ),
@@ -151,7 +156,7 @@
                                 .left(px(offset))
                                 .rounded_md()
                                 .bg(gpui::blue())
-                                .animate_in(AnimationDirection::FromTop, false),
+                                .animate_in_from_top(false),
                         )
                         .into_any_element(),
                 ),
@@ -170,7 +175,7 @@
                                 .top(px(offset))
                                 .rounded_md()
                                 .bg(gpui::green())
-                                .animate_in(AnimationDirection::FromLeft, false),
+                                .animate_in_from_left(false),
                         )
                         .into_any_element(),
                 ),
@@ -189,7 +194,7 @@
                                 .top(px(offset))
                                 .rounded_md()
                                 .bg(gpui::yellow())
-                                .animate_in(AnimationDirection::FromRight, false),
+                                .animate_in_from_right(false),
                         )
                         .into_any_element(),
                 ),
@@ -214,7 +219,7 @@
                                 .left(px(offset))
                                 .rounded_md()
                                 .bg(gpui::red())
-                                .animate_in(AnimationDirection::FromBottom, true),
+                                .animate_in_from_bottom(true),
                         )
                         .into_any_element(),
                 ),
@@ -233,7 +238,7 @@
                                 .left(px(offset))
                                 .rounded_md()
                                 .bg(gpui::blue())
-                                .animate_in(AnimationDirection::FromTop, true),
+                                .animate_in_from_top(true),
                         )
                         .into_any_element(),
                 ),
@@ -252,7 +257,7 @@
                                 .top(px(offset))
                                 .rounded_md()
                                 .bg(gpui::green())
-                                .animate_in(AnimationDirection::FromLeft, true),
+                                .animate_in_from_left(true),
                         )
                         .into_any_element(),
                 ),
@@ -271,7 +276,7 @@
                                 .top(px(offset))
                                 .rounded_md()
                                 .bg(gpui::yellow())
-                                .animate_in(AnimationDirection::FromRight, true),
+                                .animate_in_from_right(true),
                         )
                         .into_any_element(),
                 ),


@@ -71,4 +71,8 @@ impl Model {
             Model::Custom { .. } => false,
         }
     }
+
+    pub fn supports_prompt_cache_key(&self) -> bool {
+        false
+    }
 }


@@ -3,7 +3,7 @@ use std::{
     time::{Duration, Instant},
 };

-use gpui::{AnyView, DismissEvent, Entity, FocusHandle, ManagedView, Subscription, Task};
+use gpui::{AnyView, DismissEvent, Entity, EntityId, FocusHandle, ManagedView, Subscription, Task};
 use ui::{animation::DefaultAnimations, prelude::*};
 use zed_actions::toast;
@@ -76,6 +76,7 @@ impl<V: ToastView> ToastViewHandle for Entity<V> {
 }

 pub struct ActiveToast {
+    id: EntityId,
     toast: Box<dyn ToastViewHandle>,
     action: Option<ToastAction>,
     _subscriptions: [Subscription; 1],
@@ -113,9 +114,9 @@ impl ToastLayer {
         V: ToastView,
     {
         if let Some(active_toast) = &self.active_toast {
-            let is_close = active_toast.toast.view().downcast::<V>().is_ok();
-            let did_close = self.hide_toast(cx);
-            if is_close || !did_close {
+            let show_new = active_toast.id != new_toast.entity_id();
+            self.hide_toast(cx);
+            if !show_new {
                 return;
             }
         }
@@ -130,11 +131,12 @@
         let focus_handle = cx.focus_handle();

         self.active_toast = Some(ActiveToast {
-            toast: Box::new(new_toast.clone()),
-            action,
             _subscriptions: [cx.subscribe(&new_toast, |this, _, _: &DismissEvent, cx| {
                 this.hide_toast(cx);
             })],
+            id: new_toast.entity_id(),
+            toast: Box::new(new_toast),
+            action,
             focus_handle,
         });
@@ -143,11 +145,9 @@
         cx.notify();
     }

-    pub fn hide_toast(&mut self, cx: &mut Context<Self>) -> bool {
+    pub fn hide_toast(&mut self, cx: &mut Context<Self>) {
         self.active_toast.take();
         cx.notify();
-
-        true
     }

     pub fn active_toast<V>(&self) -> Option<Entity<V>>
@@ -218,11 +218,10 @@ impl Render for ToastLayer {
         let Some(active_toast) = &self.active_toast else {
             return div();
         };
-        let handle = cx.weak_entity();

         div().absolute().size_full().bottom_0().left_0().child(
             v_flex()
-                .id("toast-layer-container")
+                .id(("toast-layer-container", active_toast.id))
                 .absolute()
                 .w_full()
                 .bottom(px(0.))
@@ -234,17 +233,14 @@
                     h_flex()
                         .id("active-toast-container")
                         .occlude()
-                        .on_hover(move |hover_start, _window, cx| {
-                            let Some(this) = handle.upgrade() else {
-                                return;
-                            };
+                        .on_hover(cx.listener(|this, hover_start, _window, cx| {
                             if *hover_start {
-                                this.update(cx, |this, _| this.pause_dismiss_timer());
+                                this.pause_dismiss_timer();
                             } else {
-                                this.update(cx, |this, cx| this.restart_dismiss_timer(cx));
+                                this.restart_dismiss_timer(cx);
                             }
                             cx.stop_propagation();
-                        })
+                        }))
                         .on_click(|_, _, cx| {
                             cx.stop_propagation();
                         })
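The toast change keys replacement on `EntityId` rather than on view type: toggling the same toast entity dismisses it, while any different toast (even one of the same type) now replaces the current one. A stand-alone sketch of that rule, with a plain `u64` standing in for `EntityId` and the gpui plumbing omitted:

```rust
// Mini-model of the ActiveToast dedupe logic above.
struct ActiveToast {
    id: u64,
}

struct ToastLayer {
    active: Option<ActiveToast>,
}

impl ToastLayer {
    fn toggle(&mut self, new_id: u64) {
        if let Some(active) = &self.active {
            let show_new = active.id != new_id;
            self.active = None; // hide_toast
            if !show_new {
                return; // toggling the same toast just dismisses it
            }
        }
        self.active = Some(ActiveToast { id: new_id });
    }
}

fn main() {
    let mut layer = ToastLayer { active: None };
    layer.toggle(1);
    assert!(layer.active.is_some());
    layer.toggle(1); // same toast: toggles off
    assert!(layer.active.is_none());
    layer.toggle(1);
    layer.toggle(2); // different toast: replaces the current one
    assert_eq!(layer.active.as_ref().map(|t| t.id), Some(2));
}
```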


@@ -105,6 +105,10 @@ impl Model {
         }
     }

+    pub fn supports_prompt_cache_key(&self) -> bool {
+        false
+    }
+
     pub fn supports_tool(&self) -> bool {
         match self {
             Self::Grok2Vision


@@ -2,7 +2,7 @@
 description = "The fast, collaborative code editor."
 edition.workspace = true
 name = "zed"
-version = "0.200.0"
+version = "0.200.3"
 publish.workspace = true
 license = "GPL-3.0-or-later"
 authors = ["Zed Team <hi@zed.dev>"]


@@ -1 +1 @@
-dev
+preview


@@ -8,6 +8,7 @@ use cli::FORCE_CLI_MODE_ENV_VAR_NAME;
 use client::{Client, ProxySettings, UserStore, parse_zed_link};
 use collab_ui::channel_view::ChannelView;
 use collections::HashMap;
+use crashes::InitCrashHandler;
 use db::kvp::{GLOBAL_KEY_VALUE_STORE, KEY_VALUE_STORE};
 use editor::Editor;
 use extension::ExtensionHostProxy;
@@ -269,7 +270,15 @@ pub fn main() {
     let session = app.background_executor().block(Session::new());

     app.background_executor()
-        .spawn(crashes::init(session_id.clone()))
+        .spawn(crashes::init(InitCrashHandler {
+            session_id: session_id.clone(),
+            zed_version: app_version.to_string(),
+            release_channel: release_channel::RELEASE_CHANNEL_NAME.clone(),
+            commit_sha: app_commit_sha
+                .as_ref()
+                .map(|sha| sha.full())
+                .unwrap_or_else(|| "no sha".to_owned()),
+        }))
         .detach();
     reliability::init_panic_hook(
         app_version,


@ -12,6 +12,7 @@ use gpui::{App, AppContext as _, SemanticVersion};
use http_client::{self, HttpClient, HttpClientWithUrl, HttpRequestExt, Method}; use http_client::{self, HttpClient, HttpClientWithUrl, HttpRequestExt, Method};
use paths::{crashes_dir, crashes_retired_dir}; use paths::{crashes_dir, crashes_retired_dir};
use project::Project; use project::Project;
use proto::{CrashReport, GetCrashFilesResponse};
use release_channel::{AppCommitSha, RELEASE_CHANNEL, ReleaseChannel}; use release_channel::{AppCommitSha, RELEASE_CHANNEL, ReleaseChannel};
use reqwest::multipart::{Form, Part}; use reqwest::multipart::{Form, Part};
use settings::Settings; use settings::Settings;
@ -51,10 +52,6 @@ pub fn init_panic_hook(
thread::yield_now(); thread::yield_now();
} }
} }
crashes::handle_panic();
let thread = thread::current();
let thread_name = thread.name().unwrap_or("<unnamed>");
let payload = info let payload = info
.payload() .payload()
@ -63,6 +60,11 @@ pub fn init_panic_hook(
.or_else(|| info.payload().downcast_ref::<String>().cloned()) .or_else(|| info.payload().downcast_ref::<String>().cloned())
.unwrap_or_else(|| "Box<Any>".to_string()); .unwrap_or_else(|| "Box<Any>".to_string());
crashes::handle_panic(payload.clone(), info.location());
let thread = thread::current();
let thread_name = thread.name().unwrap_or("<unnamed>");
if *release_channel::RELEASE_CHANNEL == ReleaseChannel::Dev { if *release_channel::RELEASE_CHANNEL == ReleaseChannel::Dev {
let location = info.location().unwrap(); let location = info.location().unwrap();
let backtrace = Backtrace::new(); let backtrace = Backtrace::new();
@@ -214,45 +216,53 @@ pub fn init(
         let installation_id = installation_id.clone();
         let system_id = system_id.clone();

-        if let Some(ssh_client) = project.ssh_client() {
+        let Some(ssh_client) = project.ssh_client() else {
+            return;
+        };
         ssh_client.update(cx, |client, cx| {
-            if TelemetrySettings::get_global(cx).diagnostics {
+            if !TelemetrySettings::get_global(cx).diagnostics {
+                return;
+            }
             let request = client.proto_client().request(proto::GetCrashFiles {});
             cx.background_spawn(async move {
-                let crash_files = request.await?;
-                for crash in crash_files.crashes {
-                    let mut panic: Option<Panic> = crash
-                        .panic_contents
-                        .and_then(|s| serde_json::from_str(&s).log_err());
-                    if let Some(panic) = panic.as_mut() {
+                let GetCrashFilesResponse {
+                    legacy_panics,
+                    crashes,
+                } = request.await?;
+
+                for panic in legacy_panics {
+                    if let Some(mut panic) = serde_json::from_str::<Panic>(&panic).log_err() {
                         panic.session_id = session_id.clone();
                         panic.system_id = system_id.clone();
                         panic.installation_id = installation_id.clone();
+                        upload_panic(&http_client, &panic_report_url, panic, &mut None).await?;
                     }
-                    if let Some(minidump) = crash.minidump_contents {
+                }
+
+                let Some(endpoint) = MINIDUMP_ENDPOINT.as_ref() else {
+                    return Ok(());
+                };
+                for CrashReport {
+                    metadata,
+                    minidump_contents,
+                } in crashes
+                {
+                    if let Some(metadata) = serde_json::from_str(&metadata).log_err() {
                         upload_minidump(
                             http_client.clone(),
-                            minidump.clone(),
-                            panic.as_ref(),
+                            endpoint,
+                            minidump_contents,
+                            &metadata,
                         )
                         .await
                         .log_err();
                     }
-                    if let Some(panic) = panic {
-                        upload_panic(&http_client, &panic_report_url, panic, &mut None)
-                            .await?;
-                    }
                 }
+
                 anyhow::Ok(())
             })
             .detach_and_log_err(cx);
-            }
         })
-        }
     })
     .detach();
 }
@@ -466,16 +476,18 @@ fn upload_panics_and_crashes(
     installation_id: Option<String>,
     cx: &App,
 ) {
-    let telemetry_settings = *client::TelemetrySettings::get_global(cx);
+    if !client::TelemetrySettings::get_global(cx).diagnostics {
+        return;
+    }
     cx.background_spawn(async move {
-        let most_recent_panic =
-            upload_previous_panics(http.clone(), &panic_report_url, telemetry_settings)
+        upload_previous_minidumps(http.clone()).await.warn_on_err();
+        let most_recent_panic = upload_previous_panics(http.clone(), &panic_report_url)
             .await
             .log_err()
             .flatten();
-        upload_previous_crashes(http, most_recent_panic, installation_id, telemetry_settings)
+        upload_previous_crashes(http, most_recent_panic, installation_id)
             .await
-            .log_err()
+            .log_err();
     })
     .detach()
 }
@@ -484,7 +496,6 @@ fn upload_panics_and_crashes(
 async fn upload_previous_panics(
     http: Arc<HttpClientWithUrl>,
     panic_report_url: &Url,
-    telemetry_settings: client::TelemetrySettings,
 ) -> anyhow::Result<Option<(i64, String)>> {
     let mut children = smol::fs::read_dir(paths::logs_dir()).await?;
@@ -507,7 +518,6 @@ async fn upload_previous_panics(
             continue;
         }

-        if telemetry_settings.diagnostics {
         let panic_file_content = smol::fs::read_to_string(&child_path)
             .await
             .context("error reading panic file")?;
@@ -525,40 +535,24 @@ async fn upload_previous_panics(
                 None
             });

-        if let Some(panic) = panic {
-            let minidump_path = paths::logs_dir()
-                .join(&panic.session_id)
-                .with_extension("dmp");
-            if minidump_path.exists() {
-                let minidump = smol::fs::read(&minidump_path)
-                    .await
-                    .context("Failed to read minidump")?;
-                if upload_minidump(http.clone(), minidump, Some(&panic))
-                    .await
-                    .log_err()
-                    .is_some()
-                {
-                    fs::remove_file(minidump_path).ok();
-                }
-            }
-            if !upload_panic(&http, &panic_report_url, panic, &mut most_recent_panic).await? {
-                continue;
-            }
-        }
+        if let Some(panic) = panic
+            && upload_panic(&http, &panic_report_url, panic, &mut most_recent_panic).await?
+        {
             // We've done what we can, delete the file
             fs::remove_file(child_path)
                 .context("error removing panic")
                 .log_err();
+        }
     }

-    if MINIDUMP_ENDPOINT.is_none() {
-        return Ok(most_recent_panic);
-    }
-    // loop back over the directory again to upload any minidumps that are missing panics
+    Ok(most_recent_panic)
+}
+
+pub async fn upload_previous_minidumps(http: Arc<HttpClientWithUrl>) -> anyhow::Result<()> {
+    let Some(minidump_endpoint) = MINIDUMP_ENDPOINT.as_ref() else {
+        return Err(anyhow::anyhow!("Minidump endpoint not set"));
+    };
     let mut children = smol::fs::read_dir(paths::logs_dir()).await?;
     while let Some(child) = children.next().await {
         let child = child?;
@@ -566,33 +560,35 @@ async fn upload_previous_panics(
         if child_path.extension() != Some(OsStr::new("dmp")) {
             continue;
         }
+        let mut json_path = child_path.clone();
+        json_path.set_extension("json");
+        if let Ok(metadata) = serde_json::from_slice(&smol::fs::read(&json_path).await?) {
             if upload_minidump(
                 http.clone(),
+                &minidump_endpoint,
                 smol::fs::read(&child_path)
                     .await
                     .context("Failed to read minidump")?,
-                None,
+                &metadata,
             )
             .await
             .log_err()
             .is_some()
             {
                 fs::remove_file(child_path).ok();
+                fs::remove_file(json_path).ok();
             }
+        }
     }
-
-    Ok(most_recent_panic)
+    Ok(())
 }

 async fn upload_minidump(
     http: Arc<HttpClientWithUrl>,
+    endpoint: &str,
     minidump: Vec<u8>,
-    panic: Option<&Panic>,
+    metadata: &crashes::CrashInfo,
 ) -> Result<()> {
-    let minidump_endpoint = MINIDUMP_ENDPOINT
-        .to_owned()
-        .ok_or_else(|| anyhow::anyhow!("Minidump endpoint not set"))?;
     let mut form = Form::new()
         .part(
             "upload_file_minidump",
@@ -600,38 +596,22 @@ async fn upload_minidump(
                 .file_name("minidump.dmp")
                 .mime_str("application/octet-stream")?,
         )
-        .text("platform", "rust");
-    if let Some(panic) = panic {
-        form = form
-            .text("sentry[tags][channel]", panic.release_channel.clone())
-            .text("sentry[tags][version]", panic.app_version.clone())
-            .text("sentry[context][os][name]", panic.os_name.clone())
-            .text(
-                "sentry[context][device][architecture]",
-                panic.architecture.clone(),
-            )
-            .text("sentry[logentry][formatted]", panic.payload.clone());
-
-        if let Some(sha) = panic.app_commit_sha.clone() {
-            form = form.text("sentry[release]", sha)
-        } else {
-            form = form.text(
-                "sentry[release]",
-                format!("{}-{}", panic.release_channel, panic.app_version),
-            )
-        }
-        if let Some(v) = panic.os_version.clone() {
-            form = form.text("sentry[context][os][release]", v);
-        }
-        if let Some(location) = panic.location_data.as_ref() {
-            form = form.text("span", format!("{}:{}", location.file, location.line))
-        }
+        .text(
+            "sentry[tags][channel]",
+            metadata.init.release_channel.clone(),
+        )
+        .text("sentry[tags][version]", metadata.init.zed_version.clone())
+        .text("sentry[release]", metadata.init.commit_sha.clone())
+        .text("platform", "rust");
+    if let Some(panic_info) = metadata.panic.as_ref() {
+        form = form.text("sentry[logentry][formatted]", panic_info.message.clone());
+        form = form.text("span", panic_info.span.clone());
         // TODO: add gpu-context, feature-flag-context, and more of device-context like gpu
         // name, screen resolution, available ram, device model, etc
     }

     let mut response_text = String::new();
-    let mut response = http.send_multipart_form(&minidump_endpoint, form).await?;
+    let mut response = http.send_multipart_form(endpoint, form).await?;
     response
         .body_mut()
         .read_to_string(&mut response_text)
@@ -681,11 +661,7 @@ async fn upload_previous_crashes(
     http: Arc<HttpClientWithUrl>,
     most_recent_panic: Option<(i64, String)>,
     installation_id: Option<String>,
-    telemetry_settings: client::TelemetrySettings,
 ) -> Result<()> {
-    if !telemetry_settings.diagnostics {
-        return Ok(());
-    }
     let last_uploaded = KEY_VALUE_STORE
         .read_kvp(LAST_CRASH_UPLOADED)?
         .unwrap_or("zed-2024-01-17-221900.ips".to_string()); // don't upload old crash reports from before we had this.


@@ -427,7 +427,7 @@ Custom models will be listed in the model dropdown in the Agent Panel.
 Zed supports using [OpenAI compatible APIs](https://platform.openai.com/docs/api-reference/chat) by specifying a custom `api_url` and `available_models` for the OpenAI provider.
 This is useful for connecting to other hosted services (like Together AI, Anyscale, etc.) or local models.

-You can add a custom, OpenAI-compatible model via either via the UI or by editing your `settings.json`.
+You can add a custom, OpenAI-compatible model either via the UI or by editing your `settings.json`.

 To do it via the UI, go to the Agent Panel settings (`agent: open settings`) and look for the "Add Provider" button to the right of the "LLM Providers" section title.
 Then, fill up the input fields available in the modal.
@@ -443,7 +443,13 @@ To do it via your `settings.json`, add the following snippet under `language_mod
       {
         "name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
         "display_name": "Together Mixtral 8x7B",
-        "max_tokens": 32768
+        "max_tokens": 32768,
+        "capabilities": {
+          "tools": true,
+          "images": false,
+          "parallel_tool_calls": false,
+          "prompt_cache_key": false
+        }
       }
     ]
   }
@@ -451,6 +457,13 @@ To do it via your `settings.json`, add the following snippet under `language_mod
   }
 }
 ```

+By default, OpenAI-compatible models inherit the following capabilities:
+
+- `tools`: true (supports tool/function calling)
+- `images`: false (does not support image inputs)
+- `parallel_tool_calls`: false (does not support the `parallel_tool_calls` parameter)
+- `prompt_cache_key`: false (does not support the `prompt_cache_key` parameter)
+
 Note that LLM API keys aren't stored in your settings file.
 So, ensure you have it set in your environment variables (`OPENAI_API_KEY=<your api key>`) so your settings can pick it up.
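The capability defaults added in this hunk can be overridden per model in the same `available_models` array shown above. As an illustrative sketch (the model name is hypothetical), an entry advertising a vision-capable model that cannot call tools, with all four keys spelled out explicitly:

```json
{
  "name": "my-vision-model",
  "display_name": "My Vision Model",
  "max_tokens": 128000,
  "capabilities": {
    "tools": false,
    "images": true,
    "parallel_tool_calls": false,
    "prompt_cache_key": false
  }
}
```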