Compare commits


30 commits

Author SHA1 Message Date
Joseph T. Lyons
537f6781a9 zed 0.200.5 2025-08-22 13:13:33 -04:00
Oleksiy Syvokon
f74281b2ec themes: Implement Bright Black and Bright White colors (#36761)
Before:
<img width="356" height="50" alt="image"
src="https://github.com/user-attachments/assets/c4f4ae53-8820-4f22-b306-2e5062cfe552"
/>

After:
<img width="340" height="41" alt="image"
src="https://github.com/user-attachments/assets/8e69d9dc-5640-4e41-845d-f299fc5954e3"
/>


Release Notes:

- Fixed ANSI Bright Black and Bright White colors
2025-08-22 13:09:06 -04:00
Julia Ryan
608495ec2f
Use Tokio::spawn instead of getting an executor handle (#36701)
This was causing panics due to the handles being dropped out of order.
It doesn't seem possible to guarantee the correct drop ordering given
that we're holding them over await points, so let's just spawn on the
tokio executor itself which gives us access to the state we needed those
handles for in the first place.

Fixes: ZED-1R

Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: Marshall Bowers <git@maxdeviant.com>
2025-08-21 11:11:26 -07:00
Joseph T. Lyons
55ffbb43c8 v0.200.x stable 2025-08-20 15:13:52 -04:00
Smit Barmase
e22c7592d9 project: Register dynamic capabilities even when registerOptions doesn't exist (#36554)
Closes #36482

Looks like we accidentally referenced
[common/formatting.ts#L67-L70](d90a87f955/client/src/common/formatting.ts (L67-L70))
instead of
[common/client.ts#L2133](d90a87f955/client/src/common/client.ts (L2133)).

Release Notes:

- Fixed code not formatting on save in language servers like Biome.
(Preview Only)
2025-08-20 13:43:07 +05:30
Smit Barmase
27156279bb lsp: Enable dynamic registration for TextDocumentSyncClientCapabilities post revert (#36494)
Follow up: https://github.com/zed-industries/zed/pull/36485

Release Notes:

- N/A
2025-08-20 13:42:54 +05:30
Smit Barmase
6f69698257 project: Take 2 on Handle textDocument/didSave and textDocument/didChange (un)registration and usage correctly (#36485)
Relands https://github.com/zed-industries/zed/pull/36441 with a
deserialization fix.

Previously, deserializing `"includeText"` into
`lsp::TextDocumentSyncSaveOptions` resulted in a `Supported(false)` type
instead of `SaveOptions(SaveOptions { include_text: Option<bool> })`.

```rs
impl From<bool> for TextDocumentSyncSaveOptions {
    fn from(from: bool) -> Self {
        Self::Supported(from)
    }
}
```

It looks like during dynamic registration we only ever get the
`SaveOptions` type and never the `Supported` type.
(https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocumentSaveRegistrationOptions)
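The pitfall above can be sketched with a std-only toy decoder. This is a deliberately simplified stand-in (the real types live in the `lsp` crate and serde's untagged matching drives the behavior): a decoder that coerces values into the bool-like variant too eagerly will never produce `SaveOptions`, so the object shape has to be recognized first.

```rust
// Toy stand-in for lsp::TextDocumentSyncSaveOptions (assumed shapes).
#[derive(Debug, PartialEq)]
enum SaveSync {
    Supported(bool),
    SaveOptions { include_text: Option<bool> },
}

// An "untagged"-style decoder: check the object shape before falling
// back to the bare-bool variant, otherwise SaveOptions is unreachable.
fn decode(raw: &str) -> SaveSync {
    if raw.trim_start().starts_with('{') {
        SaveSync::SaveOptions {
            include_text: Some(raw.contains("\"includeText\": true")),
        }
    } else {
        SaveSync::Supported(raw.trim() == "true")
    }
}

fn main() {
    // A dynamic registration sends an options object, not a bool.
    assert_eq!(
        decode(r#"{ "includeText": true }"#),
        SaveSync::SaveOptions { include_text: Some(true) }
    );
    assert_eq!(decode("true"), SaveSync::Supported(true));
}
```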

Release Notes:

- N/A

---------

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-08-20 13:42:37 +05:30
Lukas Wirth
76afc0d67e Revert "project: Handle textDocument/didSave and textDocument/didChange (un)registration and usage correctly (#36441)" (#36480)
This reverts commit c5991e74bb.

This PR broke rust-analyzer's check-on-save functionality, so reverting
for now.

Release Notes:

- N/A
2025-08-20 13:42:23 +05:30
Smit Barmase
320d706576 project: Handle textDocument/didSave and textDocument/didChange (un)registration and usage correctly (#36441)
Follow-up of https://github.com/zed-industries/zed/pull/35306

This PR contains two changes:

Both changes are inspired from:
d90a87f955/client/src/common/textSynchronization.ts

1. Handling `textDocument/didSave` and `textDocument/didChange`
registration and unregistration correctly:

```rs
#[derive(Debug, Eq, PartialEq, Clone, Deserialize, Serialize)]
#[serde(untagged)]
pub enum TextDocumentSyncCapability {
    Kind(TextDocumentSyncKind),
    Options(TextDocumentSyncOptions),
}
```

- `textDocument/didSave` dynamic registration contains "includeText"
- `textDocument/didChange` dynamic registration contains "syncKind"

While storing this on the language server, we use
`TextDocumentSyncCapability::Options` instead of
`TextDocumentSyncCapability::Kind`, since it also includes the
[change](be7336e92a/src/lib.rs (L1714-L1717))
field (as `TextDocumentSyncKind`) as well as the
[save](be7336e92a/src/lib.rs (L1727-L1729))
field (as `TextDocumentSyncSaveOptions`). This way, while registering or
unregistering either of them, we don't accidentally mess with other data.

So, if at initialization we end up with
`TextDocumentSyncCapability::Kind` and later receive either of the above
kinds of dynamic registration, we upgrade it to
`TextDocumentSyncCapability::Options` so we can store the extra data.

2. Modified the `include_text` method to depend only on
`TextDocumentSyncSaveOptions`, instead of depending on
`TextDocumentSyncKind`. The idea is that `TextDocumentSyncSaveOptions`
should be responsible for the "textDocument/didSave" notification, while
`TextDocumentSyncKind` should be responsible for
"textDocument/didChange", which it already is:
4b79eade1d/crates/project/src/lsp_store.rs (L7324-L7331)
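The Kind-to-Options upgrade described in point 1 can be sketched with simplified stand-ins for the lsp-types enums (assumed shapes, not the real definitions): preserve an existing `Kind` as the `change` field so a later didSave/didChange (un)registration has somewhere to store its data without clobbering anything.

```rust
#[derive(Debug, Clone, PartialEq)]
struct TextDocumentSyncOptions {
    change: Option<u8>,              // sync kind: 0=None, 1=Full, 2=Incremental
    save_include_text: Option<bool>, // from a didSave registration
}

#[derive(Debug, Clone, PartialEq)]
enum TextDocumentSyncCapability {
    Kind(u8),
    Options(TextDocumentSyncOptions),
}

impl TextDocumentSyncCapability {
    // Ensure we hold the Options form, carrying an existing Kind over
    // into the `change` field.
    fn ensure_options(&mut self) -> &mut TextDocumentSyncOptions {
        if let TextDocumentSyncCapability::Kind(kind) = *self {
            *self = TextDocumentSyncCapability::Options(TextDocumentSyncOptions {
                change: Some(kind),
                save_include_text: None,
            });
        }
        match self {
            TextDocumentSyncCapability::Options(opts) => opts,
            TextDocumentSyncCapability::Kind(_) => unreachable!(),
        }
    }
}

fn main() {
    let mut cap = TextDocumentSyncCapability::Kind(2);
    // A dynamic textDocument/didSave registration arrives with includeText.
    cap.ensure_options().save_include_text = Some(true);
    match &cap {
        TextDocumentSyncCapability::Options(opts) => {
            assert_eq!(opts.change, Some(2)); // original kind preserved
            assert_eq!(opts.save_include_text, Some(true));
        }
        _ => panic!("expected Options"),
    }
}
```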

Release Notes:

- N/A
2025-08-20 13:41:12 +05:30
Smit Barmase
1adbbfc6f4 editor: Fix panic in inlay hint while padding (#36405)
Closes #36247

Fix a panic when padding inlay hints if the last character is a
multi-byte character. Regressed in
https://github.com/zed-industries/zed/pull/35786.
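The class of bug here is easy to reproduce in isolation (this is an illustrative sketch, not the actual inlay-hint padding code): slicing a `&str` at a byte offset that falls inside a multi-byte character panics, so padding logic must respect char boundaries rather than do raw byte arithmetic.

```rust
// Safe padding operates on the whole string, never on byte offsets.
fn pad_hint(label: &str, pad: &str) -> String {
    format!("{pad}{label}{pad}")
}

fn main() {
    let hint = "…"; // U+2026 HORIZONTAL ELLIPSIS: 1 char, 3 bytes in UTF-8
    assert_eq!(hint.chars().count(), 1);
    assert_eq!(hint.len(), 3);
    // Byte index 1 is inside the character; slicing there panics.
    assert!(!hint.is_char_boundary(1));
    assert!(std::panic::catch_unwind(|| &hint[..1]).is_err());
    assert_eq!(pad_hint(hint, " "), " … ");
}
```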

Release Notes:

- Fixed a crash that could occur when an inlay hint ended with `...`.
2025-08-19 10:17:12 +05:30
Smit Barmase
7703cdb70b gpui: Fix crash when starting Zed on macOS during texture creation (#36382)
Closes #36229

Fix zero-sized texture creation that triggers a SIGABRT in the Metal
renderer. Not sure why this happens yet, but it likely occurs when
`native_window.contentView()` returns a zero `NSSize` during initial
window creation, before the view size is computed.
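One way to guard against this, sketched here with hypothetical names (the actual fix in gpui may differ), is to clamp the drawable size before handing it to the Metal renderer, since a zero-sized texture descriptor aborts the process with SIGABRT:

```rust
#[derive(Debug, PartialEq, Clone, Copy)]
struct Size {
    width: u64,
    height: u64,
}

// contentView() can report 0x0 during initial window creation, before
// layout has run; never pass that through to texture creation.
fn texture_size(view_size: Size) -> Size {
    Size {
        width: view_size.width.max(1),
        height: view_size.height.max(1),
    }
}

fn main() {
    assert_eq!(
        texture_size(Size { width: 0, height: 0 }),
        Size { width: 1, height: 1 }
    );
    assert_eq!(
        texture_size(Size { width: 800, height: 600 }),
        Size { width: 800, height: 600 }
    );
}
```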

Release Notes:

- Fixed a rare startup crash on macOS.
2025-08-19 10:17:02 +05:30
Zed Bot
3593691a05 Bump to 0.200.4 for @maxdeviant 2025-08-18 20:48:00 +00:00
Marshall Bowers
66e6649aed client: Parse auth callback query parameters before showing sign-in success page (#36440)
This PR fixes an issue where we would redirect the user's browser to the
sign-in success page even if the OAuth callback was malformed.

We now parse the OAuth callback parameters from the query string and
only redirect to the sign-in success page when they are valid.
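The validate-before-redirect flow can be sketched std-only (the actual change deserializes with `serde_urlencoded` into a `CallbackParams` struct; this simplified version also skips URL-decoding): parse the query string first and bail out before any redirect if a parameter is missing.

```rust
use std::collections::HashMap;

fn parse_callback(query: &str) -> Result<(String, String), &'static str> {
    let params: HashMap<_, _> = query
        .split('&')
        .filter_map(|pair| pair.split_once('='))
        .map(|(k, v)| (k.to_string(), v.to_string()))
        .collect();
    let user_id = params.get("user_id").ok_or("missing user_id")?.clone();
    let access_token = params
        .get("access_token")
        .ok_or("missing access_token")?
        .clone();
    Ok((user_id, access_token))
}

fn main() {
    // Well-formed callback: both parameters present.
    assert!(parse_callback("user_id=42&access_token=abc").is_ok());
    // Malformed callback: fail before redirecting to the success page.
    assert!(parse_callback("user_id=42").is_err());
}
```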

Release Notes:

- Updated the sign-in flow to not show the sign-in success page
prematurely.
2025-08-18 16:34:46 -04:00
Julia Ryan
6c0eaf674e
zed 0.200.3 2025-08-18 09:32:33 -07:00
Julia Ryan
e9e376deb5
Separate minidump crashes from panics (#36267)
The minidump-based crash reporting is now entirely separate from our
legacy panic_hook-based reporting. This should improve the association
of minidumps with their metadata and give us more consistent crash
reports.

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-08-18 09:19:23 -07:00
Finn Evers
78e56ce8fd keymap_ui: Ensure keybind with empty arguments can be saved (#36393)
Follow-up to #36278 to ensure this bug is actually fixed. This also
fixes it on two layers and adds a test for the lower layer, as we cannot
properly test it in the UI.

Furthermore, this improves the error message to show some more context
and ensures the status toast is actually only shown when the keybind was
successfully updated: Before, we would show the success toast whilst
also showing an error in the editor.

Lastly, this also fixes some issues with the status toast (and
animations) where no status toast or no animation would show in certain
scenarios.

Release Notes:

- N/A
2025-08-18 13:14:40 +02:00
Finn Evers
0367e93667 onboarding: Fix minimap typo on editing page (#36143)
This PR fixes a small typo on the onboarding editing page where it
should be "Minimap" instead of "Mini Map".

Release Notes:

- N/A
2025-08-18 13:14:35 +02:00
Oleksiy Syvokon
e2dec85365 agent: Create checkpoint before/after every edit operation (#36253)
1. Previously, checkpoints only appeared when an agent's edit happened
immediately after a user message. This is rare (agent usually collects
some context first), so they were almost never shown. This is now fixed.

2. After this change, a checkpoint is created after every edit
operation. So when the agent edits files five times in a single dialog
turn, we will now display five checkpoints.

As a bonus, it's now possible to undo only a part of a long agent
response.

Closes #36092, #32917

Release Notes:

- Create agent checkpoints more frequently (before every edit)
2025-08-18 12:37:19 +03:00
Piotr Osiewicz
4a0e8f0844 agent_ui: Ensure that all configuration views get rendered with full width (#36362)
Closes #36097

Release Notes:

- Fixed API key input fields getting shrunk in Agent Panel settings view
on low panel widths paired with high UI font sizes.
2025-08-18 12:36:01 +03:00
Cale Sennett
c2f0df9b8e Add capabilities to OpenAI-compatible model settings (#36370)
### TL;DR
* Adds `capabilities` configuration for OpenAI-compatible models
* Relates to
https://github.com/zed-industries/zed/issues/36215#issuecomment-3193920491

### Summary
This PR introduces support for configuring model capabilities for
OpenAI-compatible language models. The implementation addresses the
issue that not all OpenAI-compatible APIs support the same features -
for example, Cerebras' API explicitly does not support
`parallel_tool_calls` as documented in their [OpenAI compatibility
guide](https://inference-docs.cerebras.ai/resources/openai#currently-unsupported-openai-features).

### Changes

1. **Model Capabilities Structure**:
- Added `ModelCapabilityToggles` struct for UI representation with
boolean toggle states
- Implemented proper parsing of capability toggles into
`ModelCapabilities`

2. **UI Updates**:
- Modified the "Add LLM Provider" modal to include checkboxes for each
capability
- Each OpenAI-compatible model can now be configured with its specific
capabilities through the UI

3. **Configuration File Structure**:
- Updated the settings schema to support a `capabilities` object for
each `openai_compatible` model
- Each capability (`tools`, `images`, `parallel_tool_calls`,
`prompt_cache_key`) can be individually specified per model

### Example Configuration

```json
{
  "openai_compatible": {
    "Cerebras": {
      "api_url": "https://api.cerebras.ai/v1",
      "available_models": [
        {
          "name": "gpt-oss-120b",
          "max_tokens": 131000,
          "capabilities": {
            "tools": true,
            "images": false,
            "parallel_tool_calls": false,
            "prompt_cache_key": false
          }
        }
      ]
    }
  }
}
```

### Tests Added

- Added tests to verify default capability values are correctly applied
- Added tests to verify that deselected toggles are properly parsed as
`false`
- Added tests to verify that mixed capability selections work correctly

Thanks to @osyvokon for the desired `capabilities` configuration
structure!


Release Notes:

- OpenAI-compatible models now have configurable capabilities (#36370;
thanks @calesennett)

---------

Co-authored-by: Oleksiy Syvokon <oleksiy@zed.dev>
2025-08-18 12:35:08 +03:00
Ben Kunkle
2bd61668dc keymap_ui: Don't try to parse empty action arguments as JSON (#36278)
Closes #ISSUE

Release Notes:

- Keymap Editor: Fixed an issue where leaving the arguments field empty
would result in an error even if arguments were optional
2025-08-15 17:06:23 -05:00
Joseph T. Lyons
2ab445dfd4 zed 0.200.2 2025-08-15 13:17:26 -04:00
Oleksiy Syvokon
b96f76f377 openai: Don't send prompt_cache_key for OpenAI-compatible models (#36231)
Some APIs fail when they receive this parameter.

Closes #36215
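The idea can be sketched std-only (the real request type uses serde; the function and field names here are illustrative): only emit `prompt_cache_key` when the provider is known to accept it, since some OpenAI-compatible APIs reject unknown parameters.

```rust
// Build a minimal JSON request body, omitting prompt_cache_key entirely
// when it is not set rather than sending a null or empty value.
fn request_body(model: &str, prompt_cache_key: Option<&str>) -> String {
    let mut fields = vec![format!(r#""model":"{model}""#)];
    if let Some(key) = prompt_cache_key {
        fields.push(format!(r#""prompt_cache_key":"{key}""#));
    }
    format!("{{{}}}", fields.join(","))
}

fn main() {
    // OpenAI-compatible provider: the parameter is omitted entirely.
    assert_eq!(
        request_body("gpt-oss-120b", None),
        r#"{"model":"gpt-oss-120b"}"#
    );
    // First-party OpenAI: the parameter is included.
    assert!(request_body("gpt-4.1", Some("thread-1")).contains("prompt_cache_key"));
}
```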

Release Notes:

- Fixed OpenAI-compatible providers that don't support prompt caching
and/or reasoning
2025-08-15 16:26:41 +03:00
Oleksiy Syvokon
e9a4f6767b openai: Don't send reasoning_effort if it's not set (#36228)
Release Notes:

- N/A
2025-08-15 16:26:32 +03:00
smit
177cf12ca1 project: Fix LSP TextDocumentSyncCapability dynamic registration (#36234)
Closes #36213

Use `textDocument/didChange`
([docs](https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocument_synchronization))
instead of `textDocument/synchronization`.

Release Notes:

- Fixed an issue where Dart projects were being formatted incorrectly by
the language server.
2025-08-15 14:08:58 +03:00
Joseph T. Lyons
fda9369bfd Emit a BreadcrumbsChanged event when associated settings changed (#36177)
Closes https://github.com/zed-industries/zed/issues/36149

Release Notes:

- Fixed a bug where changing the `toolbar.breadcrumbs` setting didn't
immediately update the UI when saving the `settings.json` file.
2025-08-14 15:31:29 -04:00
Zed Bot
08351cb3e7 Bump to 0.200.1 for @smitbarmase 2025-08-13 20:05:16 +00:00
smit
ab41359e24 ci: Disable FreeBSD builds (#36140)
Revert accidental change introduced in
[#35880](https://github.com/zed-industries/zed/pull/35880/files#diff-b803fcb7f17ed9235f1e5cb1fcd2f5d3b2838429d4368ae4c57ce4436577f03fL706)

Release Notes:

- N/A
2025-08-14 01:01:17 +05:30
smit
d29341bf44 copilot: Fix Copilot fails to sign in (#36138)
Closes #36093

Pin copilot version to 1.354 for now until further investigation.

Release Notes:

- Fixed an issue where Copilot failed to sign in.

Co-authored-by: MrSubidubi <dev@bahn.sh>
2025-08-14 00:24:00 +05:30
Joseph T. Lyons
189ea49e00 v0.200.x preview 2025-08-13 12:47:57 -04:00
51 changed files with 902 additions and 423 deletions

View file

@ -718,7 +718,7 @@ jobs:
timeout-minutes: 60
runs-on: github-8vcpu-ubuntu-2404
if: |
( startsWith(github.ref, 'refs/tags/v')
false && ( startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling') )
needs: [linux_tests]
name: Build Zed on FreeBSD

Cargo.lock generated
View file

@ -3094,6 +3094,7 @@ dependencies = [
"schemars",
"serde",
"serde_json",
"serde_urlencoded",
"settings",
"sha2",
"smol",
@ -4065,6 +4066,8 @@ dependencies = [
"minidumper",
"paths",
"release_channel",
"serde",
"serde_json",
"smol",
"workspace-hack",
]
@ -7556,6 +7559,7 @@ dependencies = [
name = "gpui_tokio"
version = "0.1.0"
dependencies = [
"anyhow",
"gpui",
"tokio",
"util",
@ -20500,7 +20504,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.200.0"
version = "0.200.5"
dependencies = [
"activity_indicator",
"agent",

View file

@ -585,6 +585,7 @@ serde_json_lenient = { version = "0.2", features = [
"raw_value",
] }
serde_repr = "0.1"
serde_urlencoded = "0.7"
sha2 = "0.10"
shellexpand = "2.1.0"
shlex = "1.3.0"

View file

@ -93,7 +93,7 @@
"terminal.ansi.bright_cyan": "#4c806fff",
"terminal.ansi.dim_cyan": "#cbf2e4ff",
"terminal.ansi.white": "#bfbdb6ff",
"terminal.ansi.bright_white": "#bfbdb6ff",
"terminal.ansi.bright_white": "#fafafaff",
"terminal.ansi.dim_white": "#787876ff",
"link_text.hover": "#5ac1feff",
"conflict": "#feb454ff",
@ -479,7 +479,7 @@
"terminal.ansi.bright_cyan": "#ace0cbff",
"terminal.ansi.dim_cyan": "#2a5f4aff",
"terminal.ansi.white": "#fcfcfcff",
"terminal.ansi.bright_white": "#fcfcfcff",
"terminal.ansi.bright_white": "#ffffffff",
"terminal.ansi.dim_white": "#bcbec0ff",
"link_text.hover": "#3b9ee5ff",
"conflict": "#f1ad49ff",
@ -865,7 +865,7 @@
"terminal.ansi.bright_cyan": "#4c806fff",
"terminal.ansi.dim_cyan": "#cbf2e4ff",
"terminal.ansi.white": "#cccac2ff",
"terminal.ansi.bright_white": "#cccac2ff",
"terminal.ansi.bright_white": "#fafafaff",
"terminal.ansi.dim_white": "#898a8aff",
"link_text.hover": "#72cffeff",
"conflict": "#fecf72ff",

View file

@ -94,7 +94,7 @@
"terminal.ansi.bright_cyan": "#45603eff",
"terminal.ansi.dim_cyan": "#c7dfbdff",
"terminal.ansi.white": "#fbf1c7ff",
"terminal.ansi.bright_white": "#fbf1c7ff",
"terminal.ansi.bright_white": "#ffffffff",
"terminal.ansi.dim_white": "#b0a189ff",
"link_text.hover": "#83a598ff",
"version_control.added": "#b7bb26ff",
@ -494,7 +494,7 @@
"terminal.ansi.bright_cyan": "#45603eff",
"terminal.ansi.dim_cyan": "#c7dfbdff",
"terminal.ansi.white": "#fbf1c7ff",
"terminal.ansi.bright_white": "#fbf1c7ff",
"terminal.ansi.bright_white": "#ffffffff",
"terminal.ansi.dim_white": "#b0a189ff",
"link_text.hover": "#83a598ff",
"version_control.added": "#b7bb26ff",
@ -894,7 +894,7 @@
"terminal.ansi.bright_cyan": "#45603eff",
"terminal.ansi.dim_cyan": "#c7dfbdff",
"terminal.ansi.white": "#fbf1c7ff",
"terminal.ansi.bright_white": "#fbf1c7ff",
"terminal.ansi.bright_white": "#ffffffff",
"terminal.ansi.dim_white": "#b0a189ff",
"link_text.hover": "#83a598ff",
"version_control.added": "#b7bb26ff",
@ -1294,7 +1294,7 @@
"terminal.ansi.bright_cyan": "#9fbca8ff",
"terminal.ansi.dim_cyan": "#253e2eff",
"terminal.ansi.white": "#fbf1c7ff",
"terminal.ansi.bright_white": "#fbf1c7ff",
"terminal.ansi.bright_white": "#ffffffff",
"terminal.ansi.dim_white": "#b0a189ff",
"link_text.hover": "#0b6678ff",
"version_control.added": "#797410ff",
@ -1694,7 +1694,7 @@
"terminal.ansi.bright_cyan": "#9fbca8ff",
"terminal.ansi.dim_cyan": "#253e2eff",
"terminal.ansi.white": "#f9f5d7ff",
"terminal.ansi.bright_white": "#f9f5d7ff",
"terminal.ansi.bright_white": "#ffffffff",
"terminal.ansi.dim_white": "#b0a189ff",
"link_text.hover": "#0b6678ff",
"version_control.added": "#797410ff",
@ -2094,7 +2094,7 @@
"terminal.ansi.bright_cyan": "#9fbca8ff",
"terminal.ansi.dim_cyan": "#253e2eff",
"terminal.ansi.white": "#f2e5bcff",
"terminal.ansi.bright_white": "#f2e5bcff",
"terminal.ansi.bright_white": "#ffffffff",
"terminal.ansi.dim_white": "#b0a189ff",
"link_text.hover": "#0b6678ff",
"version_control.added": "#797410ff",

View file

@ -93,7 +93,7 @@
"terminal.ansi.bright_cyan": "#3a565bff",
"terminal.ansi.dim_cyan": "#b9d9dfff",
"terminal.ansi.white": "#dce0e5ff",
"terminal.ansi.bright_white": "#dce0e5ff",
"terminal.ansi.bright_white": "#fafafaff",
"terminal.ansi.dim_white": "#575d65ff",
"link_text.hover": "#74ade8ff",
"version_control.added": "#27a657ff",
@ -468,7 +468,7 @@
"terminal.bright_foreground": "#242529ff",
"terminal.dim_foreground": "#fafafaff",
"terminal.ansi.black": "#242529ff",
"terminal.ansi.bright_black": "#242529ff",
"terminal.ansi.bright_black": "#747579ff",
"terminal.ansi.dim_black": "#97979aff",
"terminal.ansi.red": "#d36151ff",
"terminal.ansi.bright_red": "#f0b0a4ff",
@ -489,7 +489,7 @@
"terminal.ansi.bright_cyan": "#a3bedaff",
"terminal.ansi.dim_cyan": "#254058ff",
"terminal.ansi.white": "#fafafaff",
"terminal.ansi.bright_white": "#fafafaff",
"terminal.ansi.bright_white": "#ffffffff",
"terminal.ansi.dim_white": "#aaaaaaff",
"link_text.hover": "#5c78e2ff",
"version_control.added": "#27a657ff",

View file

@ -844,11 +844,17 @@ impl Thread {
.await
.unwrap_or(false);
if !equal {
this.update(cx, |this, cx| {
this.insert_checkpoint(pending_checkpoint, cx)
})?;
}
this.update(cx, |this, cx| {
this.pending_checkpoint = if equal {
Some(pending_checkpoint)
} else {
this.insert_checkpoint(pending_checkpoint, cx);
Some(ThreadCheckpoint {
message_id: this.next_message_id,
git_checkpoint: final_checkpoint,
})
}
})?;
Ok(())
}

View file

@ -300,6 +300,7 @@ impl AgentConfiguration {
)
.child(
div()
.w_full()
.px_2()
.when(is_expanded, |parent| match configuration_view {
Some(configuration_view) => parent.child(configuration_view),

View file

@ -7,10 +7,12 @@ use gpui::{DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, Render, T
use language_model::LanguageModelRegistry;
use language_models::{
AllLanguageModelSettings, OpenAiCompatibleSettingsContent,
provider::open_ai_compatible::AvailableModel,
provider::open_ai_compatible::{AvailableModel, ModelCapabilities},
};
use settings::update_settings_file;
use ui::{Banner, KeyBinding, Modal, ModalFooter, ModalHeader, Section, prelude::*};
use ui::{
Banner, Checkbox, KeyBinding, Modal, ModalFooter, ModalHeader, Section, ToggleState, prelude::*,
};
use ui_input::SingleLineInput;
use workspace::{ModalView, Workspace};
@ -69,11 +71,19 @@ impl AddLlmProviderInput {
}
}
struct ModelCapabilityToggles {
pub supports_tools: ToggleState,
pub supports_images: ToggleState,
pub supports_parallel_tool_calls: ToggleState,
pub supports_prompt_cache_key: ToggleState,
}
struct ModelInput {
name: Entity<SingleLineInput>,
max_completion_tokens: Entity<SingleLineInput>,
max_output_tokens: Entity<SingleLineInput>,
max_tokens: Entity<SingleLineInput>,
capabilities: ModelCapabilityToggles,
}
impl ModelInput {
@ -100,11 +110,23 @@ impl ModelInput {
cx,
);
let max_tokens = single_line_input("Max Tokens", "Max Tokens", Some("200000"), window, cx);
let ModelCapabilities {
tools,
images,
parallel_tool_calls,
prompt_cache_key,
} = ModelCapabilities::default();
Self {
name: model_name,
max_completion_tokens,
max_output_tokens,
max_tokens,
capabilities: ModelCapabilityToggles {
supports_tools: tools.into(),
supports_images: images.into(),
supports_parallel_tool_calls: parallel_tool_calls.into(),
supports_prompt_cache_key: prompt_cache_key.into(),
},
}
}
@ -136,6 +158,12 @@ impl ModelInput {
.text(cx)
.parse::<u64>()
.map_err(|_| SharedString::from("Max Tokens must be a number"))?,
capabilities: ModelCapabilities {
tools: self.capabilities.supports_tools.selected(),
images: self.capabilities.supports_images.selected(),
parallel_tool_calls: self.capabilities.supports_parallel_tool_calls.selected(),
prompt_cache_key: self.capabilities.supports_prompt_cache_key.selected(),
},
})
}
}
@ -322,6 +350,55 @@ impl AddLlmProviderModal {
.child(model.max_output_tokens.clone()),
)
.child(model.max_tokens.clone())
.child(
v_flex()
.gap_1()
.child(
Checkbox::new(("supports-tools", ix), model.capabilities.supports_tools)
.label("Supports tools")
.on_click(cx.listener(move |this, checked, _window, cx| {
this.input.models[ix].capabilities.supports_tools = *checked;
cx.notify();
})),
)
.child(
Checkbox::new(("supports-images", ix), model.capabilities.supports_images)
.label("Supports images")
.on_click(cx.listener(move |this, checked, _window, cx| {
this.input.models[ix].capabilities.supports_images = *checked;
cx.notify();
})),
)
.child(
Checkbox::new(
("supports-parallel-tool-calls", ix),
model.capabilities.supports_parallel_tool_calls,
)
.label("Supports parallel_tool_calls")
.on_click(cx.listener(
move |this, checked, _window, cx| {
this.input.models[ix]
.capabilities
.supports_parallel_tool_calls = *checked;
cx.notify();
},
)),
)
.child(
Checkbox::new(
("supports-prompt-cache-key", ix),
model.capabilities.supports_prompt_cache_key,
)
.label("Supports prompt_cache_key")
.on_click(cx.listener(
move |this, checked, _window, cx| {
this.input.models[ix].capabilities.supports_prompt_cache_key =
*checked;
cx.notify();
},
)),
),
)
.when(has_more_than_one_model, |this| {
this.child(
Button::new(("remove-model", ix), "Remove Model")
@ -562,6 +639,93 @@ mod tests {
);
}
#[gpui::test]
async fn test_model_input_default_capabilities(cx: &mut TestAppContext) {
let cx = setup_test(cx).await;
cx.update(|window, cx| {
let model_input = ModelInput::new(window, cx);
model_input.name.update(cx, |input, cx| {
input.editor().update(cx, |editor, cx| {
editor.set_text("somemodel", window, cx);
});
});
assert_eq!(
model_input.capabilities.supports_tools,
ToggleState::Selected
);
assert_eq!(
model_input.capabilities.supports_images,
ToggleState::Unselected
);
assert_eq!(
model_input.capabilities.supports_parallel_tool_calls,
ToggleState::Unselected
);
assert_eq!(
model_input.capabilities.supports_prompt_cache_key,
ToggleState::Unselected
);
let parsed_model = model_input.parse(cx).unwrap();
assert_eq!(parsed_model.capabilities.tools, true);
assert_eq!(parsed_model.capabilities.images, false);
assert_eq!(parsed_model.capabilities.parallel_tool_calls, false);
assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
});
}
#[gpui::test]
async fn test_model_input_deselected_capabilities(cx: &mut TestAppContext) {
let cx = setup_test(cx).await;
cx.update(|window, cx| {
let mut model_input = ModelInput::new(window, cx);
model_input.name.update(cx, |input, cx| {
input.editor().update(cx, |editor, cx| {
editor.set_text("somemodel", window, cx);
});
});
model_input.capabilities.supports_tools = ToggleState::Unselected;
model_input.capabilities.supports_images = ToggleState::Unselected;
model_input.capabilities.supports_parallel_tool_calls = ToggleState::Unselected;
model_input.capabilities.supports_prompt_cache_key = ToggleState::Unselected;
let parsed_model = model_input.parse(cx).unwrap();
assert_eq!(parsed_model.capabilities.tools, false);
assert_eq!(parsed_model.capabilities.images, false);
assert_eq!(parsed_model.capabilities.parallel_tool_calls, false);
assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
});
}
#[gpui::test]
async fn test_model_input_with_name_and_capabilities(cx: &mut TestAppContext) {
let cx = setup_test(cx).await;
cx.update(|window, cx| {
let mut model_input = ModelInput::new(window, cx);
model_input.name.update(cx, |input, cx| {
input.editor().update(cx, |editor, cx| {
editor.set_text("somemodel", window, cx);
});
});
model_input.capabilities.supports_tools = ToggleState::Selected;
model_input.capabilities.supports_images = ToggleState::Unselected;
model_input.capabilities.supports_parallel_tool_calls = ToggleState::Selected;
model_input.capabilities.supports_prompt_cache_key = ToggleState::Unselected;
let parsed_model = model_input.parse(cx).unwrap();
assert_eq!(parsed_model.name, "somemodel");
assert_eq!(parsed_model.capabilities.tools, true);
assert_eq!(parsed_model.capabilities.images, false);
assert_eq!(parsed_model.capabilities.parallel_tool_calls, true);
assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
});
}
async fn setup_test(cx: &mut TestAppContext) -> &mut VisualTestContext {
cx.update(|cx| {
let store = SettingsStore::test(cx);

View file

@ -44,6 +44,7 @@ rpc = { workspace = true, features = ["gpui"] }
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true
serde_urlencoded.workspace = true
settings.workspace = true
sha2.workspace = true
smol.workspace = true

View file

@ -1284,19 +1284,21 @@ impl Client {
"http" => Http,
_ => Err(anyhow!("invalid rpc url: {}", rpc_url))?,
};
let rpc_host = rpc_url
.host_str()
.zip(rpc_url.port_or_known_default())
.context("missing host in rpc url")?;
let stream = {
let handle = cx.update(|cx| gpui_tokio::Tokio::handle(cx)).ok().unwrap();
let _guard = handle.enter();
match proxy {
Some(proxy) => connect_proxy_stream(&proxy, rpc_host).await?,
None => Box::new(TcpStream::connect(rpc_host).await?),
let stream = gpui_tokio::Tokio::spawn_result(cx, {
let rpc_url = rpc_url.clone();
async move {
let rpc_host = rpc_url
.host_str()
.zip(rpc_url.port_or_known_default())
.context("missing host in rpc url")?;
Ok(match proxy {
Some(proxy) => connect_proxy_stream(&proxy, rpc_host).await?,
None => Box::new(TcpStream::connect(rpc_host).await?),
})
}
};
})?
.await?;
log::info!("connected to rpc endpoint {}", rpc_url);
@ -1410,6 +1412,12 @@ impl Client {
open_url_tx.send(url).log_err();
#[derive(Deserialize)]
struct CallbackParams {
pub user_id: String,
pub access_token: String,
}
// Receive the HTTP request from the user's browser. Retrieve the user id and encrypted
// access token from the query params.
//
@ -1420,17 +1428,13 @@ impl Client {
for _ in 0..100 {
if let Some(req) = server.recv_timeout(Duration::from_secs(1))? {
let path = req.url();
let mut user_id = None;
let mut access_token = None;
let url = Url::parse(&format!("http://example.com{}", path))
.context("failed to parse login notification url")?;
for (key, value) in url.query_pairs() {
if key == "access_token" {
access_token = Some(value.to_string());
} else if key == "user_id" {
user_id = Some(value.to_string());
}
}
let callback_params: CallbackParams =
serde_urlencoded::from_str(url.query().unwrap_or_default())
.context(
"failed to parse sign-in callback query parameters",
)?;
let post_auth_url =
http.build_url("/native_app_signin_succeeded");
@ -1445,8 +1449,8 @@ impl Client {
)
.context("failed to respond to login http request")?;
return Ok((
user_id.context("missing user_id parameter")?,
access_token.context("missing access_token parameter")?,
callback_params.user_id,
callback_params.access_token,
));
}
}

View file

@ -102,13 +102,7 @@ impl CloudApiClient {
let credentials = credentials.as_ref().context("no credentials provided")?;
let authorization_header = format!("{} {}", credentials.user_id, credentials.access_token);
Ok(cx.spawn(async move |cx| {
let handle = cx
.update(|cx| Tokio::handle(cx))
.ok()
.context("failed to get Tokio handle")?;
let _guard = handle.enter();
Ok(Tokio::spawn_result(cx, async move {
let ws = WebSocket::connect(connect_url)
.with_request(
request::Builder::new()

View file

@ -21,7 +21,7 @@ use language::{
point_from_lsp, point_to_lsp,
};
use lsp::{LanguageServer, LanguageServerBinary, LanguageServerId, LanguageServerName};
use node_runtime::NodeRuntime;
use node_runtime::{NodeRuntime, VersionCheck};
use parking_lot::Mutex;
use project::DisableAiSettings;
use request::StatusNotification;
@ -1169,9 +1169,8 @@ async fn get_copilot_lsp(fs: Arc<dyn Fs>, node_runtime: NodeRuntime) -> anyhow::
const SERVER_PATH: &str =
"node_modules/@github/copilot-language-server/dist/language-server.js";
let latest_version = node_runtime
.npm_package_latest_version(PACKAGE_NAME)
.await?;
// pinning it: https://github.com/zed-industries/zed/issues/36093
const PINNED_VERSION: &str = "1.354";
let server_path = paths::copilot_dir().join(SERVER_PATH);
fs.create_dir(paths::copilot_dir()).await?;
@ -1181,12 +1180,13 @@ async fn get_copilot_lsp(fs: Arc<dyn Fs>, node_runtime: NodeRuntime) -> anyhow::
PACKAGE_NAME,
&server_path,
paths::copilot_dir(),
&latest_version,
&PINNED_VERSION,
VersionCheck::VersionMismatch,
)
.await;
if should_install {
node_runtime
.npm_install_packages(paths::copilot_dir(), &[(PACKAGE_NAME, &latest_version)])
.npm_install_packages(paths::copilot_dir(), &[(PACKAGE_NAME, &PINNED_VERSION)])
.await?;
}

View file

@ -12,6 +12,8 @@ minidumper.workspace = true
paths.workspace = true
release_channel.workspace = true
smol.workspace = true
serde.workspace = true
serde_json.workspace = true
workspace-hack.workspace = true
[lints]

View file

@ -2,15 +2,17 @@ use crash_handler::CrashHandler;
use log::info;
use minidumper::{Client, LoopAction, MinidumpBinary};
use release_channel::{RELEASE_CHANNEL, ReleaseChannel};
use serde::{Deserialize, Serialize};
use std::{
env,
fs::File,
fs::{self, File},
io,
panic::Location,
path::{Path, PathBuf},
process::{self, Command},
sync::{
LazyLock, OnceLock,
Arc, OnceLock,
atomic::{AtomicBool, Ordering},
},
thread,
@ -18,19 +20,17 @@ use std::{
};
// set once the crash handler has initialized and the client has connected to it
pub static CRASH_HANDLER: AtomicBool = AtomicBool::new(false);
pub static CRASH_HANDLER: OnceLock<Arc<Client>> = OnceLock::new();
// set when the first minidump request is made to avoid generating duplicate crash reports
pub static REQUESTED_MINIDUMP: AtomicBool = AtomicBool::new(false);
const CRASH_HANDLER_TIMEOUT: Duration = Duration::from_secs(60);
const CRASH_HANDLER_PING_TIMEOUT: Duration = Duration::from_secs(60);
const CRASH_HANDLER_CONNECT_TIMEOUT: Duration = Duration::from_secs(10);
pub static GENERATE_MINIDUMPS: LazyLock<bool> = LazyLock::new(|| {
*RELEASE_CHANNEL != ReleaseChannel::Dev || env::var("ZED_GENERATE_MINIDUMPS").is_ok()
});
pub async fn init(id: String) {
if !*GENERATE_MINIDUMPS {
pub async fn init(crash_init: InitCrashHandler) {
if *RELEASE_CHANNEL == ReleaseChannel::Dev && env::var("ZED_GENERATE_MINIDUMPS").is_err() {
return;
}
let exe = env::current_exe().expect("unable to find ourselves");
let zed_pid = process::id();
// TODO: we should be able to get away with using 1 crash-handler process per machine,
@ -61,9 +61,11 @@ pub async fn init(id: String) {
smol::Timer::after(retry_frequency).await;
}
let client = maybe_client.unwrap();
client.send_message(1, id).unwrap(); // set session id on the server
client
.send_message(1, serde_json::to_vec(&crash_init).unwrap())
.unwrap();
let client = std::sync::Arc::new(client);
let client = Arc::new(client);
let handler = crash_handler::CrashHandler::attach(unsafe {
let client = client.clone();
crash_handler::make_crash_event(move |crash_context: &crash_handler::CrashContext| {
@@ -72,7 +74,6 @@ pub async fn init(id: String) {
.compare_exchange(false, true, Ordering::Acquire, Ordering::Relaxed)
.is_ok()
{
client.send_message(2, "mistakes were made").unwrap();
client.ping().unwrap();
client.request_dump(crash_context).is_ok()
} else {
@@ -87,7 +88,7 @@ pub async fn init(id: String) {
{
handler.set_ptracer(Some(server_pid));
}
CRASH_HANDLER.store(true, Ordering::Release);
CRASH_HANDLER.set(client.clone()).ok();
std::mem::forget(handler);
info!("crash handler registered");
@@ -98,14 +99,43 @@ pub async fn init(id: String) {
}
pub struct CrashServer {
session_id: OnceLock<String>,
initialization_params: OnceLock<InitCrashHandler>,
panic_info: OnceLock<CrashPanic>,
has_connection: Arc<AtomicBool>,
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct CrashInfo {
pub init: InitCrashHandler,
pub panic: Option<CrashPanic>,
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct InitCrashHandler {
pub session_id: String,
pub zed_version: String,
pub release_channel: String,
pub commit_sha: String,
// pub gpu: String,
}
#[derive(Deserialize, Serialize, Debug, Clone)]
pub struct CrashPanic {
pub message: String,
pub span: String,
}
impl minidumper::ServerHandler for CrashServer {
fn create_minidump_file(&self) -> Result<(File, PathBuf), io::Error> {
let err_message = "Need to send a message with the ID upon starting the crash handler";
let err_message = "Missing initialization data";
let dump_path = paths::logs_dir()
.join(self.session_id.get().expect(err_message))
.join(
&self
.initialization_params
.get()
.expect(err_message)
.session_id,
)
.with_extension("dmp");
let file = File::create(&dump_path)?;
Ok((file, dump_path))
@@ -122,38 +152,71 @@ impl minidumper::ServerHandler for CrashServer {
info!("failed to write minidump: {:#}", e);
}
}
let crash_info = CrashInfo {
init: self
.initialization_params
.get()
.expect("not initialized")
.clone(),
panic: self.panic_info.get().cloned(),
};
let crash_data_path = paths::logs_dir()
.join(&crash_info.init.session_id)
.with_extension("json");
fs::write(crash_data_path, serde_json::to_vec(&crash_info).unwrap()).ok();
LoopAction::Exit
}
fn on_message(&self, kind: u32, buffer: Vec<u8>) {
let message = String::from_utf8(buffer).expect("invalid utf-8");
info!("kind: {kind}, message: {message}",);
if kind == 1 {
self.session_id
.set(message)
.expect("session id already initialized");
match kind {
1 => {
let init_data =
serde_json::from_slice::<InitCrashHandler>(&buffer).expect("invalid init data");
self.initialization_params
.set(init_data)
.expect("already initialized");
}
2 => {
let panic_data =
serde_json::from_slice::<CrashPanic>(&buffer).expect("invalid panic data");
self.panic_info.set(panic_data).expect("already panicked");
}
_ => {
panic!("invalid message kind");
}
}
}
fn on_client_disconnected(&self, clients: usize) -> LoopAction {
info!("client disconnected, {clients} remaining");
if clients == 0 {
LoopAction::Exit
} else {
LoopAction::Continue
}
fn on_client_disconnected(&self, _clients: usize) -> LoopAction {
LoopAction::Exit
}
fn on_client_connected(&self, _clients: usize) -> LoopAction {
self.has_connection.store(true, Ordering::SeqCst);
LoopAction::Continue
}
}
pub fn handle_panic() {
if !*GENERATE_MINIDUMPS {
return;
}
pub fn handle_panic(message: String, span: Option<&Location>) {
let span = span
.map(|loc| format!("{}:{}", loc.file(), loc.line()))
.unwrap_or_default();
// wait 500ms for the crash handler process to start up
// if it's still not there just write panic info and no minidump
let retry_frequency = Duration::from_millis(100);
for _ in 0..5 {
if CRASH_HANDLER.load(Ordering::Acquire) {
if let Some(client) = CRASH_HANDLER.get() {
client
.send_message(
2,
serde_json::to_vec(&CrashPanic { message, span }).unwrap(),
)
.ok();
log::error!("triggering a crash to generate a minidump...");
#[cfg(target_os = "linux")]
CrashHandler.simulate_signal(crash_handler::Signal::Trap as u32);
@@ -170,14 +233,30 @@ pub fn crash_server(socket: &Path) {
log::info!("Couldn't create socket, there may already be a running crash server");
return;
};
let ab = AtomicBool::new(false);
let shutdown = Arc::new(AtomicBool::new(false));
let has_connection = Arc::new(AtomicBool::new(false));
std::thread::spawn({
let shutdown = shutdown.clone();
let has_connection = has_connection.clone();
move || {
std::thread::sleep(CRASH_HANDLER_CONNECT_TIMEOUT);
if !has_connection.load(Ordering::SeqCst) {
shutdown.store(true, Ordering::SeqCst);
}
}
});
server
.run(
Box::new(CrashServer {
session_id: OnceLock::new(),
initialization_params: OnceLock::new(),
panic_info: OnceLock::new(),
has_connection,
}),
&ab,
Some(CRASH_HANDLER_TIMEOUT),
&shutdown,
Some(CRASH_HANDLER_PING_TIMEOUT),
)
.expect("failed to run server");
}
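The crash-handler change above swaps a bare `AtomicBool` flag for a global `OnceLock<Arc<Client>>`, so the panic hook can both test for and use the connected client. A minimal sketch of that pattern, with a hypothetical `Client` standing in for `minidumper::Client`:

```rust
use std::sync::{Arc, OnceLock};

// Hypothetical stand-in for minidumper::Client.
struct Client;

impl Client {
    fn send_message(&self, kind: u32, _payload: Vec<u8>) -> Result<(), String> {
        println!("sending message kind {kind}");
        Ok(())
    }
}

static CRASH_HANDLER: OnceLock<Arc<Client>> = OnceLock::new();

fn init() {
    // `set` succeeds only once; later calls return Err with the rejected value.
    CRASH_HANDLER.set(Arc::new(Client)).ok();
}

fn handle_panic() {
    // `get` returns None until `init` has run, so the panic hook can
    // degrade gracefully when the handler never connected.
    if let Some(client) = CRASH_HANDLER.get() {
        client.send_message(2, Vec::new()).ok();
    }
}
```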

View file

@@ -48,7 +48,7 @@ pub struct Inlay {
impl Inlay {
pub fn hint(id: usize, position: Anchor, hint: &project::InlayHint) -> Self {
let mut text = hint.text();
if hint.padding_right && text.chars_at(text.len().saturating_sub(1)).next() != Some(' ') {
if hint.padding_right && text.reversed_chars_at(text.len()).next() != Some(' ') {
text.push(" ");
}
if hint.padding_left && text.chars_at(0).next() != Some(' ') {
@@ -1305,6 +1305,29 @@ mod tests {
);
}
#[gpui::test]
fn test_inlay_hint_padding_with_multibyte_chars() {
assert_eq!(
Inlay::hint(
0,
Anchor::min(),
&InlayHint {
label: InlayHintLabel::String("🎨".to_string()),
position: text::Anchor::default(),
padding_left: true,
padding_right: true,
tooltip: None,
kind: None,
resolve_state: ResolveState::Resolved,
},
)
.text
.to_string(),
" 🎨 ",
"Should pad single emoji correctly"
);
}
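The padding fix works because it inspects the final character rather than the byte at `len() - 1`, which lands mid-codepoint for multibyte text. A standalone sketch over `String` (the original operates on a rope via `reversed_chars_at`):

```rust
fn pad_label(label: &str, pad_left: bool, pad_right: bool) -> String {
    let mut text = label.to_string();
    // `chars().next_back()` yields the final Unicode scalar value, so a
    // 4-byte emoji like "🎨" is handled the same as an ASCII letter.
    if pad_right && text.chars().next_back() != Some(' ') {
        text.push(' ');
    }
    if pad_left && !text.starts_with(' ') {
        text.insert(0, ' ');
    }
    text
}
```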
#[gpui::test]
fn test_basic_inlays(cx: &mut App) {
let buffer = MultiBuffer::build_simple("abcdefghi", cx);

View file

@@ -20200,6 +20200,7 @@ impl Editor {
);
let old_cursor_shape = self.cursor_shape;
let old_show_breadcrumbs = self.show_breadcrumbs;
{
let editor_settings = EditorSettings::get_global(cx);
@@ -20213,6 +20214,10 @@ impl Editor {
cx.emit(EditorEvent::CursorShapeChanged);
}
if old_show_breadcrumbs != self.show_breadcrumbs {
cx.emit(EditorEvent::BreadcrumbsChanged);
}
let project_settings = ProjectSettings::get_global(cx);
self.serialize_dirty_buffers =
!self.mode.is_minimap() && project_settings.session.restore_unsaved_buffers;
@@ -22834,6 +22839,7 @@ pub enum EditorEvent {
},
Reloaded,
CursorShapeChanged,
BreadcrumbsChanged,
PushedToNavHistory {
anchor: Anchor,
is_deactivate: bool,

View file

@@ -1036,6 +1036,10 @@ impl Item for Editor {
f(ItemEvent::UpdateBreadcrumbs);
}
EditorEvent::BreadcrumbsChanged => {
f(ItemEvent::UpdateBreadcrumbs);
}
EditorEvent::DirtyChanged => {
f(ItemEvent::UpdateTab);
}

View file

@@ -314,6 +314,15 @@ impl MetalRenderer {
}
fn update_path_intermediate_textures(&mut self, size: Size<DevicePixels>) {
// We are uncertain when this happens, but sometimes size can be 0 here. Most likely before
// the layout pass on window creation. Zero-sized texture creation causes SIGABRT.
// https://github.com/zed-industries/zed/issues/36229
if size.width.0 <= 0 || size.height.0 <= 0 {
self.path_intermediate_texture = None;
self.path_intermediate_msaa_texture = None;
return;
}
let texture_descriptor = metal::TextureDescriptor::new();
texture_descriptor.set_width(size.width.0 as u64);
texture_descriptor.set_height(size.height.0 as u64);

View file

@@ -13,6 +13,7 @@ path = "src/gpui_tokio.rs"
doctest = false
[dependencies]
anyhow.workspace = true
util.workspace = true
gpui.workspace = true
tokio = { workspace = true, features = ["rt", "rt-multi-thread"] }

View file

@@ -52,6 +52,28 @@ impl Tokio {
})
}
/// Spawns the given future on Tokio's thread pool, and returns it via a GPUI task
/// Note that the Tokio task will be cancelled if the GPUI task is dropped
pub fn spawn_result<C, Fut, R>(cx: &C, f: Fut) -> C::Result<Task<anyhow::Result<R>>>
where
C: AppContext,
Fut: Future<Output = anyhow::Result<R>> + Send + 'static,
R: Send + 'static,
{
cx.read_global(|tokio: &GlobalTokio, cx| {
let join_handle = tokio.runtime.spawn(f);
let abort_handle = join_handle.abort_handle();
let cancel = defer(move || {
abort_handle.abort();
});
cx.background_spawn(async move {
let result = join_handle.await?;
drop(cancel);
result
})
})
}
pub fn handle(cx: &App) -> tokio::runtime::Handle {
GlobalTokio::global(cx).runtime.handle().clone()
}
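`spawn_result` ties the Tokio task's lifetime to the GPUI task by arming an abort guard and disarming it only after the join completes. The guard shape can be sketched with the standard library alone; the `defer` helper and Tokio `AbortHandle` are replaced here by a hypothetical flag-based guard:

```rust
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};

// Generic "cancel on drop" guard: the same shape as
// `defer(move || abort_handle.abort())` in the diff above.
struct CancelOnDrop {
    cancelled: Arc<AtomicBool>,
    armed: bool,
}

impl CancelOnDrop {
    fn new(flag: Arc<AtomicBool>) -> Self {
        Self { cancelled: flag, armed: true }
    }

    // Call after the task completed normally; dropping afterwards does nothing.
    fn disarm(mut self) {
        self.armed = false;
    }
}

impl Drop for CancelOnDrop {
    fn drop(&mut self) {
        if self.armed {
            self.cancelled.store(true, Ordering::SeqCst);
        }
    }
}
```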

View file

@@ -941,6 +941,7 @@ impl LanguageModel for CloudLanguageModel {
request,
model.id(),
model.supports_parallel_tool_calls(),
model.supports_prompt_cache_key(),
None,
None,
);

View file

@@ -370,6 +370,7 @@ impl LanguageModel for OpenAiLanguageModel {
request,
self.model.id(),
self.model.supports_parallel_tool_calls(),
self.model.supports_prompt_cache_key(),
self.max_output_tokens(),
self.model.reasoning_effort(),
);
@@ -386,6 +387,7 @@ pub fn into_open_ai(
request: LanguageModelRequest,
model_id: &str,
supports_parallel_tool_calls: bool,
supports_prompt_cache_key: bool,
max_output_tokens: Option<u64>,
reasoning_effort: Option<ReasoningEffort>,
) -> open_ai::Request {
@@ -477,7 +479,11 @@ pub fn into_open_ai(
} else {
None
},
prompt_cache_key: request.thread_id,
prompt_cache_key: if supports_prompt_cache_key {
request.thread_id
} else {
None
},
tools: request
.tools
.into_iter()

View file

@@ -38,6 +38,27 @@ pub struct AvailableModel {
pub max_tokens: u64,
pub max_output_tokens: Option<u64>,
pub max_completion_tokens: Option<u64>,
#[serde(default)]
pub capabilities: ModelCapabilities,
}
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]
pub struct ModelCapabilities {
pub tools: bool,
pub images: bool,
pub parallel_tool_calls: bool,
pub prompt_cache_key: bool,
}
impl Default for ModelCapabilities {
fn default() -> Self {
Self {
tools: true,
images: false,
parallel_tool_calls: false,
prompt_cache_key: false,
}
}
}
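Gating request parameters on a capabilities struct with conservative defaults means an unannotated model never sends fields its backend might reject. A sketch of that gating, re-stating the struct locally; the `prompt_cache_key` helper below is illustrative, not the provider's actual API:

```rust
#[derive(Clone, Debug, PartialEq)]
struct ModelCapabilities {
    tools: bool,
    images: bool,
    parallel_tool_calls: bool,
    prompt_cache_key: bool,
}

// Conservative defaults: only `tools` on, everything else opt-in.
impl Default for ModelCapabilities {
    fn default() -> Self {
        Self {
            tools: true,
            images: false,
            parallel_tool_calls: false,
            prompt_cache_key: false,
        }
    }
}

// Only forward the cache key when the model opts in, as the
// `prompt_cache_key` change in the diff above does.
fn prompt_cache_key(caps: &ModelCapabilities, thread_id: Option<String>) -> Option<String> {
    if caps.prompt_cache_key { thread_id } else { None }
}
```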
pub struct OpenAiCompatibleLanguageModelProvider {
@@ -293,17 +314,17 @@ impl LanguageModel for OpenAiCompatibleLanguageModel {
}
fn supports_tools(&self) -> bool {
true
self.model.capabilities.tools
}
fn supports_images(&self) -> bool {
false
self.model.capabilities.images
}
fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
match choice {
LanguageModelToolChoice::Auto => true,
LanguageModelToolChoice::Any => true,
LanguageModelToolChoice::Auto => self.model.capabilities.tools,
LanguageModelToolChoice::Any => self.model.capabilities.tools,
LanguageModelToolChoice::None => true,
}
}
@@ -358,7 +379,8 @@ impl LanguageModel for OpenAiCompatibleLanguageModel {
let request = into_open_ai(
request,
&self.model.name,
true,
self.model.capabilities.parallel_tool_calls,
self.model.capabilities.prompt_cache_key,
self.max_output_tokens(),
None,
);

View file

@@ -355,6 +355,7 @@ impl LanguageModel for VercelLanguageModel {
request,
self.model.id(),
self.model.supports_parallel_tool_calls(),
self.model.supports_prompt_cache_key(),
self.max_output_tokens(),
None,
);

View file

@@ -359,6 +359,7 @@ impl LanguageModel for XAiLanguageModel {
request,
self.model.id(),
self.model.supports_parallel_tool_calls(),
self.model.supports_prompt_cache_key(),
self.max_output_tokens(),
None,
);

View file

@@ -103,7 +103,13 @@ impl LspAdapter for CssLspAdapter {
let should_install_language_server = self
.node
.should_install_npm_package(Self::PACKAGE_NAME, &server_path, &container_dir, &version)
.should_install_npm_package(
Self::PACKAGE_NAME,
&server_path,
&container_dir,
&version,
Default::default(),
)
.await;
if should_install_language_server {

View file

@@ -340,7 +340,13 @@ impl LspAdapter for JsonLspAdapter {
let should_install_language_server = self
.node
.should_install_npm_package(Self::PACKAGE_NAME, &server_path, &container_dir, &version)
.should_install_npm_package(
Self::PACKAGE_NAME,
&server_path,
&container_dir,
&version,
Default::default(),
)
.await;
if should_install_language_server {

View file

@@ -206,6 +206,7 @@ impl LspAdapter for PythonLspAdapter {
&server_path,
&container_dir,
&version,
Default::default(),
)
.await;

View file

@@ -108,7 +108,13 @@ impl LspAdapter for TailwindLspAdapter {
let should_install_language_server = self
.node
.should_install_npm_package(Self::PACKAGE_NAME, &server_path, &container_dir, &version)
.should_install_npm_package(
Self::PACKAGE_NAME,
&server_path,
&container_dir,
&version,
Default::default(),
)
.await;
if should_install_language_server {

View file

@@ -589,6 +589,7 @@ impl LspAdapter for TypeScriptLspAdapter {
&server_path,
&container_dir,
version.typescript_version.as_str(),
Default::default(),
)
.await;

View file

@@ -116,6 +116,7 @@ impl LspAdapter for VtslsLspAdapter {
&server_path,
&container_dir,
&latest_version.server_version,
Default::default(),
)
.await
{
@@ -129,6 +130,7 @@ impl LspAdapter for VtslsLspAdapter {
&container_dir.join(Self::TYPESCRIPT_TSDK_PATH),
&container_dir,
&latest_version.typescript_version,
Default::default(),
)
.await
{

View file

@@ -104,7 +104,13 @@ impl LspAdapter for YamlLspAdapter {
let should_install_language_server = self
.node
.should_install_npm_package(Self::PACKAGE_NAME, &server_path, &container_dir, &version)
.should_install_npm_package(
Self::PACKAGE_NAME,
&server_path,
&container_dir,
&version,
Default::default(),
)
.await;
if should_install_language_server {

View file

@@ -29,6 +29,15 @@ pub struct NodeBinaryOptions {
pub use_paths: Option<(PathBuf, PathBuf)>,
}
#[derive(Default)]
pub enum VersionCheck {
/// Check whether the installed and requested versions differ
VersionMismatch,
/// Only check whether the currently installed version is older than the newest one
#[default]
OlderVersion,
}
#[derive(Clone)]
pub struct NodeRuntime(Arc<Mutex<NodeRuntimeState>>);
@@ -287,6 +296,7 @@ impl NodeRuntime {
local_executable_path: &Path,
local_package_directory: &Path,
latest_version: &str,
version_check: VersionCheck,
) -> bool {
// In the case of the local system not having the package installed,
// or in the instances where we fail to parse package.json data,
@@ -311,7 +321,10 @@ impl NodeRuntime {
return true;
};
installed_version < latest_version
match version_check {
VersionCheck::VersionMismatch => installed_version != latest_version,
VersionCheck::OlderVersion => installed_version < latest_version,
}
}
}
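The two policies differ only in the comparison operator: `VersionMismatch` reinstalls on any difference, while `OlderVersion` leaves a newer-than-requested install alone. A sketch over pre-parsed version triples (the real code compares versions parsed from npm package metadata):

```rust
#[derive(Default)]
enum VersionCheck {
    /// Reinstall on any difference between installed and requested versions.
    VersionMismatch,
    /// Reinstall only when the installed version is older than the requested one.
    #[default]
    OlderVersion,
}

// Hypothetical helper: tuple comparison is lexicographic, matching
// (major, minor, patch) precedence.
fn needs_install(
    installed: (u32, u32, u32),
    latest: (u32, u32, u32),
    check: VersionCheck,
) -> bool {
    match check {
        VersionCheck::VersionMismatch => installed != latest,
        VersionCheck::OlderVersion => installed < latest,
    }
}
```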

View file

@@ -721,7 +721,7 @@ fn render_popular_settings_section(
.items_start()
.justify_between()
.child(
v_flex().child(Label::new("Mini Map")).child(
v_flex().child(Label::new("Minimap")).child(
Label::new("See a high-level overview of your source code.")
.color(Color::Muted),
),

View file

@@ -236,6 +236,13 @@ impl Model {
Self::O1 | Self::O3 | Self::O3Mini | Self::O4Mini | Model::Custom { .. } => false,
}
}
/// Returns whether the given model supports the `prompt_cache_key` parameter.
///
/// If the model does not support the parameter, it must be omitted from the request.
pub fn supports_prompt_cache_key(&self) -> bool {
true
}
}
#[derive(Debug, Serialize, Deserialize)]
@@ -257,6 +264,7 @@ pub struct Request {
pub tools: Vec<ToolDefinition>,
#[serde(default, skip_serializing_if = "Option::is_none")]
pub prompt_cache_key: Option<String>,
#[serde(default, skip_serializing_if = "Option::is_none")]
pub reasoning_effort: Option<ReasoningEffort>,
}

View file

@@ -63,8 +63,8 @@ use lsp::{
FileOperationPatternKind, FileOperationRegistrationOptions, FileRename, FileSystemWatcher,
LanguageServer, LanguageServerBinary, LanguageServerBinaryOptions, LanguageServerId,
LanguageServerName, LanguageServerSelector, LspRequestFuture, MessageActionItem, MessageType,
OneOf, RenameFilesParams, SymbolKind, TextEdit, WillRenameFiles, WorkDoneProgressCancelParams,
WorkspaceFolder, notification::DidRenameFiles,
OneOf, RenameFilesParams, SymbolKind, TextDocumentSyncSaveOptions, TextEdit, WillRenameFiles,
WorkDoneProgressCancelParams, WorkspaceFolder, notification::DidRenameFiles,
};
use node_runtime::read_package_installed_version;
use parking_lot::Mutex;
@@ -11813,14 +11813,48 @@ impl LspStore {
notify_server_capabilities_updated(&server, cx);
}
}
"textDocument/synchronization" => {
if let Some(caps) = reg
"textDocument/didChange" => {
if let Some(sync_kind) = reg
.register_options
.map(serde_json::from_value)
.and_then(|opts| opts.get("syncKind").cloned())
.map(serde_json::from_value::<lsp::TextDocumentSyncKind>)
.transpose()?
{
server.update_capabilities(|capabilities| {
capabilities.text_document_sync = Some(caps);
let mut sync_options =
Self::take_text_document_sync_options(capabilities);
sync_options.change = Some(sync_kind);
capabilities.text_document_sync =
Some(lsp::TextDocumentSyncCapability::Options(sync_options));
});
notify_server_capabilities_updated(&server, cx);
}
}
"textDocument/didSave" => {
if let Some(include_text) = reg
.register_options
.map(|opts| {
let transpose = opts
.get("includeText")
.cloned()
.map(serde_json::from_value::<Option<bool>>)
.transpose();
match transpose {
Ok(value) => Ok(value.flatten()),
Err(e) => Err(e),
}
})
.transpose()?
{
server.update_capabilities(|capabilities| {
let mut sync_options =
Self::take_text_document_sync_options(capabilities);
sync_options.save =
Some(TextDocumentSyncSaveOptions::SaveOptions(lsp::SaveOptions {
include_text,
}));
capabilities.text_document_sync =
Some(lsp::TextDocumentSyncCapability::Options(sync_options));
});
notify_server_capabilities_updated(&server, cx);
}
@@ -11970,9 +12004,21 @@ impl LspStore {
});
notify_server_capabilities_updated(&server, cx);
}
"textDocument/synchronization" => {
"textDocument/didChange" => {
server.update_capabilities(|capabilities| {
capabilities.text_document_sync = None;
let mut sync_options = Self::take_text_document_sync_options(capabilities);
sync_options.change = None;
capabilities.text_document_sync =
Some(lsp::TextDocumentSyncCapability::Options(sync_options));
});
notify_server_capabilities_updated(&server, cx);
}
"textDocument/didSave" => {
server.update_capabilities(|capabilities| {
let mut sync_options = Self::take_text_document_sync_options(capabilities);
sync_options.save = None;
capabilities.text_document_sync =
Some(lsp::TextDocumentSyncCapability::Options(sync_options));
});
notify_server_capabilities_updated(&server, cx);
}
@@ -12000,18 +12046,31 @@ impl LspStore {
Ok(())
}
fn take_text_document_sync_options(
capabilities: &mut lsp::ServerCapabilities,
) -> lsp::TextDocumentSyncOptions {
match capabilities.text_document_sync.take() {
Some(lsp::TextDocumentSyncCapability::Options(sync_options)) => sync_options,
Some(lsp::TextDocumentSyncCapability::Kind(sync_kind)) => {
let mut sync_options = lsp::TextDocumentSyncOptions::default();
sync_options.change = Some(sync_kind);
sync_options
}
None => lsp::TextDocumentSyncOptions::default(),
}
}
}
// Registration with empty capabilities should be ignored.
// https://github.com/microsoft/vscode-languageserver-node/blob/d90a87f9557a0df9142cfb33e251cfa6fe27d970/client/src/common/formatting.ts#L67-L70
// A registration whose registerOptions is null should fall back to true.
// https://github.com/microsoft/vscode-languageserver-node/blob/d90a87f9557a0df9142cfb33e251cfa6fe27d970/client/src/common/client.ts#L2133
fn parse_register_capabilities<T: serde::de::DeserializeOwned>(
reg: lsp::Registration,
) -> anyhow::Result<Option<OneOf<bool, T>>> {
Ok(reg
.register_options
.map(|options| serde_json::from_value::<T>(options))
.transpose()?
.map(OneOf::Right))
Ok(match reg.register_options {
Some(options) => Some(OneOf::Right(serde_json::from_value::<T>(options)?)),
None => Some(OneOf::Left(true)),
})
}
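The behavioral change in `parse_register_capabilities` is the `None` arm: a registration that arrives without `registerOptions` now enables the capability instead of being dropped. A reduced sketch with a local `OneOf` stand-in and a plain `String` in place of the deserialized options type:

```rust
// Stand-in for lsp_types::OneOf.
#[derive(Debug, PartialEq)]
enum OneOf<A, B> {
    Left(A),
    Right(B),
}

// Mirrors the fixed fallback: absent options mean "capability on"
// (OneOf::Left(true)); present options carry through as the typed form.
fn parse_register_capabilities(register_options: Option<String>) -> OneOf<bool, String> {
    match register_options {
        Some(options) => OneOf::Right(options),
        None => OneOf::Left(true),
    }
}
```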
fn subscribe_to_binary_statuses(
@@ -13102,24 +13161,18 @@ async fn populate_labels_for_symbols(
fn include_text(server: &lsp::LanguageServer) -> Option<bool> {
match server.capabilities().text_document_sync.as_ref()? {
lsp::TextDocumentSyncCapability::Kind(kind) => match *kind {
lsp::TextDocumentSyncKind::NONE => None,
lsp::TextDocumentSyncKind::FULL => Some(true),
lsp::TextDocumentSyncKind::INCREMENTAL => Some(false),
_ => None,
},
lsp::TextDocumentSyncCapability::Options(options) => match options.save.as_ref()? {
lsp::TextDocumentSyncSaveOptions::Supported(supported) => {
if *supported {
Some(true)
} else {
None
}
}
lsp::TextDocumentSyncCapability::Options(opts) => match opts.save.as_ref()? {
// Server wants didSave but didn't specify includeText.
lsp::TextDocumentSyncSaveOptions::Supported(true) => Some(false),
// Server doesn't want didSave at all.
lsp::TextDocumentSyncSaveOptions::Supported(false) => None,
// Server provided SaveOptions.
lsp::TextDocumentSyncSaveOptions::SaveOptions(save_options) => {
Some(save_options.include_text.unwrap_or(false))
}
},
// We do not have any save info. Kind affects didChange only.
lsp::TextDocumentSyncCapability::Kind(_) => None,
}
}
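The rewritten `include_text` maps the server's save capability to a three-way answer: `None` (don't send `didSave`), `Some(false)` (send it without the text), `Some(true)` (send it with the full text). A sketch over local stand-ins for the `lsp-types` shapes:

```rust
// Local stand-ins for lsp::TextDocumentSyncSaveOptions / lsp::SaveOptions.
enum SaveOptions {
    Supported(bool),
    Options { include_text: Option<bool> },
}

// None    => server doesn't want textDocument/didSave at all.
// Some(b) => send didSave, including the full text iff `b` is true.
fn include_text(save: Option<&SaveOptions>) -> Option<bool> {
    match save? {
        // Server wants didSave but didn't specify includeText.
        SaveOptions::Supported(true) => Some(false),
        // Server doesn't want didSave.
        SaveOptions::Supported(false) => None,
        // Server provided explicit SaveOptions.
        SaveOptions::Options { include_text } => Some(include_text.unwrap_or(false)),
    }
}
```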

View file

@@ -84,11 +84,13 @@ message GetCrashFiles {
message GetCrashFilesResponse {
repeated CrashReport crashes = 1;
repeated string legacy_panics = 2;
}
message CrashReport {
optional string panic_contents = 1;
optional bytes minidump_contents = 2;
reserved 1, 2;
string metadata = 3;
bytes minidump_contents = 4;
}
message Extension {

View file

@@ -1484,20 +1484,17 @@ impl RemoteConnection for SshRemoteConnection {
identifier = &unique_identifier,
);
if let Some(rust_log) = std::env::var("RUST_LOG").ok() {
start_proxy_command = format!(
"RUST_LOG={} {}",
shlex::try_quote(&rust_log).unwrap(),
start_proxy_command
)
}
if let Some(rust_backtrace) = std::env::var("RUST_BACKTRACE").ok() {
start_proxy_command = format!(
"RUST_BACKTRACE={} {}",
shlex::try_quote(&rust_backtrace).unwrap(),
start_proxy_command
)
for env_var in ["RUST_LOG", "RUST_BACKTRACE", "ZED_GENERATE_MINIDUMPS"] {
if let Some(value) = std::env::var(env_var).ok() {
start_proxy_command = format!(
"{}={} {} ",
env_var,
shlex::try_quote(&value).unwrap(),
start_proxy_command,
);
}
}
if reconnect {
start_proxy_command.push_str(" --reconnect");
}
@@ -2229,8 +2226,7 @@ impl SshRemoteConnection {
#[cfg(not(target_os = "windows"))]
{
run_cmd(Command::new("gzip").args(["-9", "-f", &bin_path.to_string_lossy()]))
.await?;
run_cmd(Command::new("gzip").args(["-f", &bin_path.to_string_lossy()])).await?;
}
#[cfg(target_os = "windows")]
{
@@ -2462,7 +2458,7 @@ impl ChannelClient {
},
async {
smol::Timer::after(timeout).await;
anyhow::bail!("Timeout detected")
anyhow::bail!("Timed out resyncing remote client")
},
)
.await
@@ -2476,7 +2472,7 @@ impl ChannelClient {
},
async {
smol::Timer::after(timeout).await;
anyhow::bail!("Timeout detected")
anyhow::bail!("Timed out pinging remote client")
},
)
.await

View file

@@ -34,10 +34,10 @@ use smol::io::AsyncReadExt;
use smol::Async;
use smol::{net::unix::UnixListener, stream::StreamExt as _};
use std::collections::HashMap;
use std::ffi::OsStr;
use std::ops::ControlFlow;
use std::str::FromStr;
use std::sync::LazyLock;
use std::{env, thread};
use std::{
io::Write,
@@ -48,6 +48,13 @@ use std::{
use telemetry_events::LocationData;
use util::ResultExt;
pub static VERSION: LazyLock<&str> = LazyLock::new(|| match *RELEASE_CHANNEL {
ReleaseChannel::Stable | ReleaseChannel::Preview => env!("ZED_PKG_VERSION"),
ReleaseChannel::Nightly | ReleaseChannel::Dev => {
option_env!("ZED_COMMIT_SHA").unwrap_or("missing-zed-commit-sha")
}
});
fn init_logging_proxy() {
env_logger::builder()
.format(|buf, record| {
@@ -113,7 +120,6 @@ fn init_logging_server(log_file_path: PathBuf) -> Result<Receiver<Vec<u8>>> {
fn init_panic_hook(session_id: String) {
std::panic::set_hook(Box::new(move |info| {
crashes::handle_panic();
let payload = info
.payload()
.downcast_ref::<&str>()
@@ -121,6 +127,8 @@ fn init_panic_hook(session_id: String) {
.or_else(|| info.payload().downcast_ref::<String>().cloned())
.unwrap_or_else(|| "Box<Any>".to_string());
crashes::handle_panic(payload.clone(), info.location());
let backtrace = backtrace::Backtrace::new();
let mut backtrace = backtrace
.frames()
@@ -150,14 +158,6 @@ fn init_panic_hook(session_id: String) {
(&backtrace).join("\n")
);
let release_channel = *RELEASE_CHANNEL;
let version = match release_channel {
ReleaseChannel::Stable | ReleaseChannel::Preview => env!("ZED_PKG_VERSION"),
ReleaseChannel::Nightly | ReleaseChannel::Dev => {
option_env!("ZED_COMMIT_SHA").unwrap_or("missing-zed-commit-sha")
}
};
let panic_data = telemetry_events::Panic {
thread: thread_name.into(),
payload: payload.clone(),
@@ -165,9 +165,9 @@ fn init_panic_hook(session_id: String) {
file: location.file().into(),
line: location.line(),
}),
app_version: format!("remote-server-{version}"),
app_version: format!("remote-server-{}", *VERSION),
app_commit_sha: option_env!("ZED_COMMIT_SHA").map(|sha| sha.into()),
release_channel: release_channel.dev_name().into(),
release_channel: RELEASE_CHANNEL.dev_name().into(),
target: env!("TARGET").to_owned().into(),
os_name: telemetry::os_name(),
os_version: Some(telemetry::os_version()),
@@ -204,8 +204,8 @@ fn handle_crash_files_requests(project: &Entity<HeadlessProject>, client: &Arc<C
client.add_request_handler(
project.downgrade(),
|_, _: TypedEnvelope<proto::GetCrashFiles>, _cx| async move {
let mut legacy_panics = Vec::new();
let mut crashes = Vec::new();
let mut minidumps_by_session_id = HashMap::new();
let mut children = smol::fs::read_dir(paths::logs_dir()).await?;
while let Some(child) = children.next().await {
let child = child?;
@@ -227,41 +227,31 @@ fn handle_crash_files_requests(project: &Entity<HeadlessProject>, client: &Arc<C
.await
.context("error reading panic file")?;
crashes.push(proto::CrashReport {
panic_contents: Some(file_contents),
minidump_contents: None,
});
legacy_panics.push(file_contents);
smol::fs::remove_file(&child_path)
.await
.context("error removing panic")
.log_err();
} else if extension == Some(OsStr::new("dmp")) {
let session_id = child_path.file_stem().unwrap().to_string_lossy();
minidumps_by_session_id
.insert(session_id.to_string(), smol::fs::read(&child_path).await?);
}
// We've done what we can, delete the file
smol::fs::remove_file(&child_path)
.await
.context("error removing panic")
.log_err();
}
for crash in &mut crashes {
let panic: telemetry_events::Panic =
serde_json::from_str(crash.panic_contents.as_ref().unwrap())?;
if let dump @ Some(_) = minidumps_by_session_id.remove(&panic.session_id) {
crash.minidump_contents = dump;
let mut json_path = child_path.clone();
json_path.set_extension("json");
if let Ok(json_content) = smol::fs::read_to_string(&json_path).await {
crashes.push(CrashReport {
metadata: json_content,
minidump_contents: smol::fs::read(&child_path).await?,
});
smol::fs::remove_file(&child_path).await.log_err();
smol::fs::remove_file(&json_path).await.log_err();
} else {
log::error!("Couldn't find json metadata for crash: {child_path:?}");
}
}
}
crashes.extend(
minidumps_by_session_id
.into_values()
.map(|dmp| CrashReport {
panic_contents: None,
minidump_contents: Some(dmp),
}),
);
anyhow::Ok(proto::GetCrashFilesResponse { crashes })
anyhow::Ok(proto::GetCrashFilesResponse {
crashes,
legacy_panics,
})
},
);
}
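The collection loop pairs each `.dmp` minidump with a sibling `.json` metadata file by swapping the extension on the same path. A sketch of that derivation (paths hypothetical):

```rust
use std::path::{Path, PathBuf};

// Given a minidump path, derive the metadata path the crash server wrote
// (session_id.dmp -> session_id.json), as the loop above does.
fn metadata_path(dump: &Path) -> PathBuf {
    let mut json = dump.to_path_buf();
    json.set_extension("json");
    json
}
```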
@@ -442,7 +432,12 @@ pub fn execute_run(
let app = gpui::Application::headless();
let id = std::process::id().to_string();
app.background_executor()
.spawn(crashes::init(id.clone()))
.spawn(crashes::init(crashes::InitCrashHandler {
session_id: id.clone(),
zed_version: VERSION.to_owned(),
release_channel: release_channel::RELEASE_CHANNEL_NAME.clone(),
commit_sha: option_env!("ZED_COMMIT_SHA").unwrap_or("no_sha").to_owned(),
}))
.detach();
init_panic_hook(id);
let log_rx = init_logging_server(log_file)?;
@@ -569,7 +564,13 @@ pub fn execute_proxy(identifier: String, is_reconnecting: bool) -> Result<()> {
let server_paths = ServerPaths::new(&identifier)?;
let id = std::process::id().to_string();
smol::spawn(crashes::init(id.clone())).detach();
smol::spawn(crashes::init(crashes::InitCrashHandler {
session_id: id.clone(),
zed_version: VERSION.to_owned(),
release_channel: release_channel::RELEASE_CHANNEL_NAME.clone(),
commit_sha: option_env!("ZED_COMMIT_SHA").unwrap_or("no_sha").to_owned(),
}))
.detach();
init_panic_hook(id);
log::info!("starting proxy process. PID: {}", std::process::id());

View file

@@ -928,14 +928,14 @@ impl<'a> KeybindUpdateTarget<'a> {
}
let action_name: Value = self.action_name.into();
let value = match self.action_arguments {
Some(args) => {
Some(args) if !args.is_empty() => {
let args = serde_json::from_str::<Value>(args)
.context("Failed to parse action arguments as JSON")?;
serde_json::json!([action_name, args])
}
None => action_name,
_ => action_name,
};
return Ok(value);
Ok(value)
}
fn keystrokes_unparsed(&self) -> String {
@@ -1084,6 +1084,24 @@ mod tests {
.unindent(),
);
check_keymap_update(
"[]",
KeybindUpdateOperation::add(KeybindUpdateTarget {
keystrokes: &parse_keystrokes("ctrl-a"),
action_name: "zed::SomeAction",
context: None,
action_arguments: Some(""),
}),
r#"[
{
"bindings": {
"ctrl-a": "zed::SomeAction"
}
}
]"#
.unindent(),
);
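The keymap fix treats an empty arguments string the same as no arguments, so the binding serializes as a bare action name instead of failing to parse `""` as JSON. A reduced sketch that returns the serialized form as a plain string (the real code builds `serde_json::Value`s):

```rust
// Hypothetical helper mirroring the `Some(args) if !args.is_empty()` guard
// added in the diff above.
fn keybinding_value(action_name: &str, action_arguments: Option<&str>) -> String {
    match action_arguments.filter(|args| !args.is_empty()) {
        // Non-empty arguments: serialize as a [name, args] pair.
        Some(args) => format!("[\"{action_name}\", {args}]"),
        // Missing or empty arguments: just the action name.
        None => format!("\"{action_name}\""),
    }
}
```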
check_keymap_update(
r#"[
{

View file

@@ -2150,7 +2150,8 @@ impl KeybindingEditorModal {
let action_arguments = self
.action_arguments_editor
.as_ref()
.map(|editor| editor.read(cx).editor.read(cx).text(cx));
.map(|arguments_editor| arguments_editor.read(cx).editor.read(cx).text(cx))
.filter(|args| !args.is_empty());
let value = action_arguments
.as_ref()
@@ -2261,29 +2262,11 @@ impl KeybindingEditorModal {
let create = self.creating;
let status_toast = StatusToast::new(
format!(
"Saved edits to the {} action.",
&self.editing_keybind.action().humanized_name
),
cx,
move |this, _cx| {
this.icon(ToastIcon::new(IconName::Check).color(Color::Success))
.dismiss_button(true)
// .action("Undo", f) todo: wire the undo functionality
},
);
self.workspace
.update(cx, |workspace, cx| {
workspace.toggle_status_toast(status_toast, cx);
})
.log_err();
cx.spawn(async move |this, cx| {
let action_name = existing_keybind.action().name;
let humanized_action_name = existing_keybind.action().humanized_name.clone();
if let Err(err) = save_keybinding_update(
match save_keybinding_update(
create,
existing_keybind,
&action_mapping,
@@ -2293,25 +2276,43 @@ impl KeybindingEditorModal {
)
.await
{
this.update(cx, |this, cx| {
this.set_error(InputError::error(err), cx);
})
.log_err();
} else {
this.update(cx, |this, cx| {
this.keymap_editor.update(cx, |keymap, cx| {
keymap.previous_edit = Some(PreviousEdit::Keybinding {
action_mapping,
action_name,
fallback: keymap
.table_interaction_state
.read(cx)
.get_scrollbar_offset(Axis::Vertical),
})
});
cx.emit(DismissEvent);
})
.ok();
Ok(_) => {
this.update(cx, |this, cx| {
this.keymap_editor.update(cx, |keymap, cx| {
keymap.previous_edit = Some(PreviousEdit::Keybinding {
action_mapping,
action_name,
fallback: keymap
.table_interaction_state
.read(cx)
.get_scrollbar_offset(Axis::Vertical),
});
let status_toast = StatusToast::new(
format!("Saved edits to the {} action.", humanized_action_name),
cx,
move |this, _cx| {
this.icon(ToastIcon::new(IconName::Check).color(Color::Success))
.dismiss_button(true)
// .action("Undo", f) todo: wire the undo functionality
},
);
this.workspace
.update(cx, |workspace, cx| {
workspace.toggle_status_toast(status_toast, cx);
})
.log_err();
});
cx.emit(DismissEvent);
})
.ok();
}
Err(err) => {
this.update(cx, |this, cx| {
this.set_error(InputError::error(err), cx);
})
.log_err();
}
}
})
.detach();
@@ -2983,7 +2984,7 @@ async fn save_keybinding_update(
let updated_keymap_contents =
settings::KeymapFile::update_keybinding(operation, keymap_contents, tab_size)
.context("Failed to update keybinding")?;
.map_err(|err| anyhow::anyhow!("Could not save updated keybinding: {}", err))?;
fs.write(
paths::keymap_file().as_path(),
updated_keymap_contents.as_bytes(),

View file

@@ -31,7 +31,7 @@ pub enum AnimationDirection {
FromTop,
}
pub trait DefaultAnimations: Styled + Sized {
pub trait DefaultAnimations: Styled + Sized + Element {
fn animate_in(
self,
animation_type: AnimationDirection,
@@ -44,8 +44,13 @@ pub trait DefaultAnimations: Styled + Sized {
AnimationDirection::FromTop => "animate_from_top",
};
let animation_id = self.id().map_or_else(
|| ElementId::from(animation_name),
|id| (id, animation_name).into(),
);
self.with_animation(
animation_name,
animation_id,
gpui::Animation::new(AnimationDuration::Fast.into()).with_easing(ease_out_quint()),
move |mut this, delta| {
let start_opacity = 0.4;
@@ -91,7 +96,7 @@ pub trait DefaultAnimations: Styled + Sized {
}
}
impl<E: Styled> DefaultAnimations for E {}
impl<E: Styled + Element> DefaultAnimations for E {}
// Don't use this directly, it only exists to show animation previews
#[derive(RegisterComponent)]
@@ -132,7 +137,7 @@ impl Component for Animation {
.left(px(offset))
.rounded_md()
.bg(gpui::red())
.animate_in(AnimationDirection::FromBottom, false),
.animate_in_from_bottom(false),
)
.into_any_element(),
),
@@ -151,7 +156,7 @@ impl Component for Animation {
.left(px(offset))
.rounded_md()
.bg(gpui::blue())
.animate_in(AnimationDirection::FromTop, false),
.animate_in_from_top(false),
)
.into_any_element(),
),
@@ -170,7 +175,7 @@ impl Component for Animation {
.top(px(offset))
.rounded_md()
.bg(gpui::green())
.animate_in(AnimationDirection::FromLeft, false),
.animate_in_from_left(false),
)
.into_any_element(),
),
@@ -189,7 +194,7 @@ impl Component for Animation {
.top(px(offset))
.rounded_md()
.bg(gpui::yellow())
.animate_in(AnimationDirection::FromRight, false),
.animate_in_from_right(false),
)
.into_any_element(),
),
@@ -214,7 +219,7 @@ impl Component for Animation {
.left(px(offset))
.rounded_md()
.bg(gpui::red())
.animate_in(AnimationDirection::FromBottom, true),
.animate_in_from_bottom(true),
)
.into_any_element(),
),
@@ -233,7 +238,7 @@ impl Component for Animation {
.left(px(offset))
.rounded_md()
.bg(gpui::blue())
.animate_in(AnimationDirection::FromTop, true),
.animate_in_from_top(true),
)
.into_any_element(),
),
@@ -252,7 +257,7 @@ impl Component for Animation {
.top(px(offset))
.rounded_md()
.bg(gpui::green())
.animate_in(AnimationDirection::FromLeft, true),
.animate_in_from_left(true),
)
.into_any_element(),
),
@@ -271,7 +276,7 @@ impl Component for Animation {
.top(px(offset))
.rounded_md()
.bg(gpui::yellow())
.animate_in(AnimationDirection::FromRight, true),
.animate_in_from_right(true),
)
.into_any_element(),
),


@@ -71,4 +71,8 @@ impl Model {
Model::Custom { .. } => false,
}
}
pub fn supports_prompt_cache_key(&self) -> bool {
false
}
}


@@ -3,7 +3,7 @@ use std::{
time::{Duration, Instant},
};
use gpui::{AnyView, DismissEvent, Entity, FocusHandle, ManagedView, Subscription, Task};
use gpui::{AnyView, DismissEvent, Entity, EntityId, FocusHandle, ManagedView, Subscription, Task};
use ui::{animation::DefaultAnimations, prelude::*};
use zed_actions::toast;
@@ -76,6 +76,7 @@ impl<V: ToastView> ToastViewHandle for Entity<V> {
}
pub struct ActiveToast {
id: EntityId,
toast: Box<dyn ToastViewHandle>,
action: Option<ToastAction>,
_subscriptions: [Subscription; 1],
@@ -113,9 +114,9 @@ impl ToastLayer {
V: ToastView,
{
if let Some(active_toast) = &self.active_toast {
let is_close = active_toast.toast.view().downcast::<V>().is_ok();
let did_close = self.hide_toast(cx);
if is_close || !did_close {
let show_new = active_toast.id != new_toast.entity_id();
self.hide_toast(cx);
if !show_new {
return;
}
}
@@ -130,11 +131,12 @@ impl ToastLayer {
let focus_handle = cx.focus_handle();
self.active_toast = Some(ActiveToast {
toast: Box::new(new_toast.clone()),
action,
_subscriptions: [cx.subscribe(&new_toast, |this, _, _: &DismissEvent, cx| {
this.hide_toast(cx);
})],
id: new_toast.entity_id(),
toast: Box::new(new_toast),
action,
focus_handle,
});
@@ -143,11 +145,9 @@ impl ToastLayer {
cx.notify();
}
pub fn hide_toast(&mut self, cx: &mut Context<Self>) -> bool {
pub fn hide_toast(&mut self, cx: &mut Context<Self>) {
self.active_toast.take();
cx.notify();
true
}
pub fn active_toast<V>(&self) -> Option<Entity<V>>
@@ -218,11 +218,10 @@ impl Render for ToastLayer {
let Some(active_toast) = &self.active_toast else {
return div();
};
let handle = cx.weak_entity();
div().absolute().size_full().bottom_0().left_0().child(
v_flex()
.id("toast-layer-container")
.id(("toast-layer-container", active_toast.id))
.absolute()
.w_full()
.bottom(px(0.))
@@ -234,17 +233,14 @@ impl Render for ToastLayer {
h_flex()
.id("active-toast-container")
.occlude()
.on_hover(move |hover_start, _window, cx| {
let Some(this) = handle.upgrade() else {
return;
};
.on_hover(cx.listener(|this, hover_start, _window, cx| {
if *hover_start {
this.update(cx, |this, _| this.pause_dismiss_timer());
this.pause_dismiss_timer();
} else {
this.update(cx, |this, cx| this.restart_dismiss_timer(cx));
this.restart_dismiss_timer(cx);
}
cx.stop_propagation();
})
}))
.on_click(|_, _, cx| {
cx.stop_propagation();
})


@@ -105,6 +105,10 @@ impl Model {
}
}
pub fn supports_prompt_cache_key(&self) -> bool {
false
}
pub fn supports_tool(&self) -> bool {
match self {
Self::Grok2Vision


@@ -2,7 +2,7 @@
description = "The fast, collaborative code editor."
edition.workspace = true
name = "zed"
version = "0.200.0"
version = "0.200.5"
publish.workspace = true
license = "GPL-3.0-or-later"
authors = ["Zed Team <hi@zed.dev>"]


@@ -1 +1 @@
dev
stable


@@ -8,6 +8,7 @@ use cli::FORCE_CLI_MODE_ENV_VAR_NAME;
use client::{Client, ProxySettings, UserStore, parse_zed_link};
use collab_ui::channel_view::ChannelView;
use collections::HashMap;
use crashes::InitCrashHandler;
use db::kvp::{GLOBAL_KEY_VALUE_STORE, KEY_VALUE_STORE};
use editor::Editor;
use extension::ExtensionHostProxy;
@@ -269,7 +270,15 @@ pub fn main() {
let session = app.background_executor().block(Session::new());
app.background_executor()
.spawn(crashes::init(session_id.clone()))
.spawn(crashes::init(InitCrashHandler {
session_id: session_id.clone(),
zed_version: app_version.to_string(),
release_channel: release_channel::RELEASE_CHANNEL_NAME.clone(),
commit_sha: app_commit_sha
.as_ref()
.map(|sha| sha.full())
.unwrap_or_else(|| "no sha".to_owned()),
}))
.detach();
reliability::init_panic_hook(
app_version,


@@ -12,6 +12,7 @@ use gpui::{App, AppContext as _, SemanticVersion};
use http_client::{self, HttpClient, HttpClientWithUrl, HttpRequestExt, Method};
use paths::{crashes_dir, crashes_retired_dir};
use project::Project;
use proto::{CrashReport, GetCrashFilesResponse};
use release_channel::{AppCommitSha, RELEASE_CHANNEL, ReleaseChannel};
use reqwest::multipart::{Form, Part};
use settings::Settings;
@@ -51,10 +52,6 @@ pub fn init_panic_hook(
thread::yield_now();
}
}
crashes::handle_panic();
let thread = thread::current();
let thread_name = thread.name().unwrap_or("<unnamed>");
let payload = info
.payload()
@@ -63,6 +60,11 @@ pub fn init_panic_hook(
.or_else(|| info.payload().downcast_ref::<String>().cloned())
.unwrap_or_else(|| "Box<Any>".to_string());
crashes::handle_panic(payload.clone(), info.location());
let thread = thread::current();
let thread_name = thread.name().unwrap_or("<unnamed>");
if *release_channel::RELEASE_CHANNEL == ReleaseChannel::Dev {
let location = info.location().unwrap();
let backtrace = Backtrace::new();
@@ -214,45 +216,53 @@ pub fn init(
let installation_id = installation_id.clone();
let system_id = system_id.clone();
if let Some(ssh_client) = project.ssh_client() {
ssh_client.update(cx, |client, cx| {
if TelemetrySettings::get_global(cx).diagnostics {
let request = client.proto_client().request(proto::GetCrashFiles {});
cx.background_spawn(async move {
let crash_files = request.await?;
for crash in crash_files.crashes {
let mut panic: Option<Panic> = crash
.panic_contents
.and_then(|s| serde_json::from_str(&s).log_err());
let Some(ssh_client) = project.ssh_client() else {
return;
};
ssh_client.update(cx, |client, cx| {
if !TelemetrySettings::get_global(cx).diagnostics {
return;
}
let request = client.proto_client().request(proto::GetCrashFiles {});
cx.background_spawn(async move {
let GetCrashFilesResponse {
legacy_panics,
crashes,
} = request.await?;
if let Some(panic) = panic.as_mut() {
panic.session_id = session_id.clone();
panic.system_id = system_id.clone();
panic.installation_id = installation_id.clone();
}
if let Some(minidump) = crash.minidump_contents {
upload_minidump(
http_client.clone(),
minidump.clone(),
panic.as_ref(),
)
.await
.log_err();
}
if let Some(panic) = panic {
upload_panic(&http_client, &panic_report_url, panic, &mut None)
.await?;
}
}
anyhow::Ok(())
})
.detach_and_log_err(cx);
for panic in legacy_panics {
if let Some(mut panic) = serde_json::from_str::<Panic>(&panic).log_err() {
panic.session_id = session_id.clone();
panic.system_id = system_id.clone();
panic.installation_id = installation_id.clone();
upload_panic(&http_client, &panic_report_url, panic, &mut None).await?;
}
}
let Some(endpoint) = MINIDUMP_ENDPOINT.as_ref() else {
return Ok(());
};
for CrashReport {
metadata,
minidump_contents,
} in crashes
{
if let Some(metadata) = serde_json::from_str(&metadata).log_err() {
upload_minidump(
http_client.clone(),
endpoint,
minidump_contents,
&metadata,
)
.await
.log_err();
}
}
anyhow::Ok(())
})
}
.detach_and_log_err(cx);
})
})
.detach();
}
@@ -466,16 +476,18 @@ fn upload_panics_and_crashes(
installation_id: Option<String>,
cx: &App,
) {
let telemetry_settings = *client::TelemetrySettings::get_global(cx);
if !client::TelemetrySettings::get_global(cx).diagnostics {
return;
}
cx.background_spawn(async move {
let most_recent_panic =
upload_previous_panics(http.clone(), &panic_report_url, telemetry_settings)
.await
.log_err()
.flatten();
upload_previous_crashes(http, most_recent_panic, installation_id, telemetry_settings)
upload_previous_minidumps(http.clone()).await.warn_on_err();
let most_recent_panic = upload_previous_panics(http.clone(), &panic_report_url)
.await
.log_err()
.flatten();
upload_previous_crashes(http, most_recent_panic, installation_id)
.await
.log_err();
})
.detach()
}
@@ -484,7 +496,6 @@ fn upload_panics_and_crashes(
async fn upload_previous_panics(
http: Arc<HttpClientWithUrl>,
panic_report_url: &Url,
telemetry_settings: client::TelemetrySettings,
) -> anyhow::Result<Option<(i64, String)>> {
let mut children = smol::fs::read_dir(paths::logs_dir()).await?;
@@ -507,58 +518,41 @@ async fn upload_previous_panics(
continue;
}
if telemetry_settings.diagnostics {
let panic_file_content = smol::fs::read_to_string(&child_path)
.await
.context("error reading panic file")?;
let panic_file_content = smol::fs::read_to_string(&child_path)
.await
.context("error reading panic file")?;
let panic: Option<Panic> = serde_json::from_str(&panic_file_content)
.log_err()
.or_else(|| {
panic_file_content
.lines()
.next()
.and_then(|line| serde_json::from_str(line).ok())
})
.unwrap_or_else(|| {
log::error!("failed to deserialize panic file {:?}", panic_file_content);
None
});
let panic: Option<Panic> = serde_json::from_str(&panic_file_content)
.log_err()
.or_else(|| {
panic_file_content
.lines()
.next()
.and_then(|line| serde_json::from_str(line).ok())
})
.unwrap_or_else(|| {
log::error!("failed to deserialize panic file {:?}", panic_file_content);
None
});
if let Some(panic) = panic {
let minidump_path = paths::logs_dir()
.join(&panic.session_id)
.with_extension("dmp");
if minidump_path.exists() {
let minidump = smol::fs::read(&minidump_path)
.await
.context("Failed to read minidump")?;
if upload_minidump(http.clone(), minidump, Some(&panic))
.await
.log_err()
.is_some()
{
fs::remove_file(minidump_path).ok();
}
}
if !upload_panic(&http, &panic_report_url, panic, &mut most_recent_panic).await? {
continue;
}
}
if let Some(panic) = panic
&& upload_panic(&http, &panic_report_url, panic, &mut most_recent_panic).await?
{
// We've done what we can, delete the file
fs::remove_file(child_path)
.context("error removing panic")
.log_err();
}
// We've done what we can, delete the file
fs::remove_file(child_path)
.context("error removing panic")
.log_err();
}
if MINIDUMP_ENDPOINT.is_none() {
return Ok(most_recent_panic);
}
Ok(most_recent_panic)
}
pub async fn upload_previous_minidumps(http: Arc<HttpClientWithUrl>) -> anyhow::Result<()> {
let Some(minidump_endpoint) = MINIDUMP_ENDPOINT.as_ref() else {
return Err(anyhow::anyhow!("Minidump endpoint not set"));
};
// loop back over the directory again to upload any minidumps that are missing panics
let mut children = smol::fs::read_dir(paths::logs_dir()).await?;
while let Some(child) = children.next().await {
let child = child?;
@@ -566,33 +560,35 @@ async fn upload_previous_panics(
if child_path.extension() != Some(OsStr::new("dmp")) {
continue;
}
if upload_minidump(
http.clone(),
smol::fs::read(&child_path)
.await
.context("Failed to read minidump")?,
None,
)
.await
.log_err()
.is_some()
{
fs::remove_file(child_path).ok();
let mut json_path = child_path.clone();
json_path.set_extension("json");
if let Ok(metadata) = serde_json::from_slice(&smol::fs::read(&json_path).await?) {
if upload_minidump(
http.clone(),
&minidump_endpoint,
smol::fs::read(&child_path)
.await
.context("Failed to read minidump")?,
&metadata,
)
.await
.log_err()
.is_some()
{
fs::remove_file(child_path).ok();
fs::remove_file(json_path).ok();
}
}
}
Ok(most_recent_panic)
Ok(())
}
async fn upload_minidump(
http: Arc<HttpClientWithUrl>,
endpoint: &str,
minidump: Vec<u8>,
panic: Option<&Panic>,
metadata: &crashes::CrashInfo,
) -> Result<()> {
let minidump_endpoint = MINIDUMP_ENDPOINT
.to_owned()
.ok_or_else(|| anyhow::anyhow!("Minidump endpoint not set"))?;
let mut form = Form::new()
.part(
"upload_file_minidump",
@@ -600,38 +596,22 @@ async fn upload_minidump(
.file_name("minidump.dmp")
.mime_str("application/octet-stream")?,
)
.text(
"sentry[tags][channel]",
metadata.init.release_channel.clone(),
)
.text("sentry[tags][version]", metadata.init.zed_version.clone())
.text("sentry[release]", metadata.init.commit_sha.clone())
.text("platform", "rust");
if let Some(panic) = panic {
form = form
.text("sentry[tags][channel]", panic.release_channel.clone())
.text("sentry[tags][version]", panic.app_version.clone())
.text("sentry[context][os][name]", panic.os_name.clone())
.text(
"sentry[context][device][architecture]",
panic.architecture.clone(),
)
.text("sentry[logentry][formatted]", panic.payload.clone());
if let Some(sha) = panic.app_commit_sha.clone() {
form = form.text("sentry[release]", sha)
} else {
form = form.text(
"sentry[release]",
format!("{}-{}", panic.release_channel, panic.app_version),
)
}
if let Some(v) = panic.os_version.clone() {
form = form.text("sentry[context][os][release]", v);
}
if let Some(location) = panic.location_data.as_ref() {
form = form.text("span", format!("{}:{}", location.file, location.line))
}
if let Some(panic_info) = metadata.panic.as_ref() {
form = form.text("sentry[logentry][formatted]", panic_info.message.clone());
form = form.text("span", panic_info.span.clone());
// TODO: add gpu-context, feature-flag-context, and more of device-context like gpu
// name, screen resolution, available ram, device model, etc
}
let mut response_text = String::new();
let mut response = http.send_multipart_form(&minidump_endpoint, form).await?;
let mut response = http.send_multipart_form(endpoint, form).await?;
response
.body_mut()
.read_to_string(&mut response_text)
@@ -681,11 +661,7 @@ async fn upload_previous_crashes(
http: Arc<HttpClientWithUrl>,
most_recent_panic: Option<(i64, String)>,
installation_id: Option<String>,
telemetry_settings: client::TelemetrySettings,
) -> Result<()> {
if !telemetry_settings.diagnostics {
return Ok(());
}
let last_uploaded = KEY_VALUE_STORE
.read_kvp(LAST_CRASH_UPLOADED)?
.unwrap_or("zed-2024-01-17-221900.ips".to_string()); // don't upload old crash reports from before we had this.


@@ -427,7 +427,7 @@ Custom models will be listed in the model dropdown in the Agent Panel.
Zed supports using [OpenAI compatible APIs](https://platform.openai.com/docs/api-reference/chat) by specifying a custom `api_url` and `available_models` for the OpenAI provider.
This is useful for connecting to other hosted services (like Together AI, Anyscale, etc.) or local models.
You can add a custom, OpenAI-compatible model via either via the UI or by editing your `settings.json`.
You can add a custom, OpenAI-compatible model either via the UI or by editing your `settings.json`.
To do it via the UI, go to the Agent Panel settings (`agent: open settings`) and look for the "Add Provider" button to the right of the "LLM Providers" section title.
Then, fill in the input fields available in the modal.
@@ -443,7 +443,13 @@ To do it via your `settings.json`, add the following snippet under `language_mod
{
"name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
"display_name": "Together Mixtral 8x7B",
"max_tokens": 32768
"max_tokens": 32768,
"capabilities": {
"tools": true,
"images": false,
"parallel_tool_calls": false,
"prompt_cache_key": false
}
}
]
}
@@ -451,6 +457,13 @@ To do it via your `settings.json`, add the following snippet under `language_mod
}
```
By default, OpenAI-compatible models inherit the following capabilities:
- `tools`: true (supports tool/function calling)
- `images`: false (does not support image inputs)
- `parallel_tool_calls`: false (does not support `parallel_tool_calls` parameter)
- `prompt_cache_key`: false (does not support `prompt_cache_key` parameter)
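As a sketch of how these defaults compose (the model name below is illustrative, not taken from this changeset), an `available_models` entry that omits `capabilities` behaves the same as one with the defaults spelled out:

```json
{
  "name": "my-org/my-model",
  "display_name": "My Model",
  "max_tokens": 32768,
  "capabilities": {
    "tools": true,
    "images": false,
    "parallel_tool_calls": false,
    "prompt_cache_key": false
  }
}
```

You only need to write the `capabilities` object when your model deviates from these defaults, e.g. to set `"tools": false` for a model without function calling.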
Note that LLM API keys aren't stored in your settings file.
So, ensure the key is set in your environment variables (`OPENAI_API_KEY=<your api key>`) so your settings can pick it up.
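For example, on macOS or Linux you might export the key in the shell you launch Zed from (the key value below is a placeholder, not a real credential):

```shell
# Placeholder value; substitute your actual API key.
export OPENAI_API_KEY="sk-example-placeholder"

# Confirm the variable is set in this shell before launching Zed from it.
printenv OPENAI_API_KEY
```

If you start Zed from a desktop launcher rather than a terminal, make sure the variable is set somewhere the GUI session inherits it, such as your shell profile or login environment.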