Compare commits


17 commits

Author SHA1 Message Date
gcp-cherry-pick-bot[bot]
0dbd74bfef
Sync extension debuggers to remote host (cherry-pick #33876) (#33933)
Cherry-picked Sync extension debuggers to remote host (#33876)

Closes #33835

Release Notes:

- Fixed debugger extensions not working in remote projects.

Co-authored-by: Ryan Hawkins <ryanlarryhawkins@gmail.com>
2025-07-05 02:45:58 +03:00
Peter Tripp
98d232261d
v0.193.x stable 2025-07-02 10:06:12 -04:00
gcp-cherry-pick-bot[bot]
4c7c0bccaf
gpui: Fix slow scrolling in lists (cherry-pick #33608) (#33722)
Cherry-picked gpui: Fix slow scrolling in lists (#33608)

Matches the editor element's behavior.



https://github.com/user-attachments/assets/f70912e1-5adb-403b-a98c-63e2e89929ac


- In the first version, the editor scrolls about 1.5 pages, but the agent panel
only scrolls half a page.
- In the second version, the agent panel also scrolls about 1.5 pages.

Release Notes:

- Fixed skipping of some scroll events in the non-uniform list UI
element, which fixes slow scrolling of the agent panel.
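A common cause of "slow" scrolling like this is dropping intermediate wheel events instead of accumulating them. As a hedged illustration of the pattern (not gpui's actual implementation), deltas arriving between frames should be coalesced, not overwritten:

```rust
// Accumulate wheel deltas between frames instead of keeping only the most
// recent event - dropping intermediate events is what makes scrolling feel
// slow. Illustrative pattern only, not gpui's actual code.
#[derive(Default)]
struct ScrollState {
    pending_delta: f32,
}

impl ScrollState {
    fn on_wheel(&mut self, delta: f32) {
        // Coalesce: add to the pending delta rather than replacing it.
        self.pending_delta += delta;
    }

    fn take_frame_delta(&mut self) -> f32 {
        // Consumed once per rendered frame.
        std::mem::take(&mut self.pending_delta)
    }
}

fn main() {
    let mut state = ScrollState::default();
    // Three events arrive before the next frame renders:
    state.on_wheel(10.0);
    state.on_wheel(15.0);
    state.on_wheel(5.0);
    assert_eq!(state.take_frame_delta(), 30.0);
    assert_eq!(state.take_frame_delta(), 0.0);
}
```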

Co-authored-by: maan2003 <49202620+maan2003@users.noreply.github.com>
2025-07-01 11:34:19 -06:00
Zed Bot
837f28d138 Bump to 0.193.3 for @SomeoneToIgnore 2025-06-30 16:08:41 +00:00
gcp-cherry-pick-bot[bot]
eb48f9b88a
Revert "languages: Bump ESLint LSP server to version 3.0.10 (#32717)" (cherry-pick #33659) (#33663)
Cherry-picked Revert "languages: Bump ESLint LSP server to version
3.0.10 (#32717)" (#33659)

This reverts commit 1edaeebae5.

Based on an elevated number of ESLint-related issues, this reverts the
upgrade.
Many people upvoted the issues without sharing any repro details, so it is
hard to be certain what exactly is broken: the breakage seems relatively
generic and related to *.ts ESLint configs.

Checked the revert on 2 projects from the issues below:

Closes https://github.com/zed-industries/zed/issues/33425

With https://github.com/adamhl8/zed-33425 as an example repo: there, both
eslint configurations worked for me when I stopped Zed and reopened the
project.
However, after switching between Zed builds with different vscode-eslint
package versions, I eventually got an
`Error: Cannot find module
'~/.local/share/zed/languages/eslint/vscode-eslint-3.0.10/vscode-eslint/server/out/eslintServer.js'`-ish
error.

This is not directly related to the issues with the newer vscode-eslint
integration, but worth mentioning since it is related to the package updates.


Closes https://github.com/zed-industries/zed/issues/33648

https://github.com/florian-lackner365/zed-eslint-bug is a good example
monorepo project.
The monorepo part seems unrelated, but `eslint.config.js` is somehow
involved: the newer vscode-eslint fails to find a config, while the older
vscode-eslint works well.

Release Notes:

- Downgraded to vscode-eslint-2.4.4 as the ESLint language server

Co-authored-by: Kirill Bulatov <kirill@zed.dev>
2025-06-30 18:46:51 +03:00
Kirill Bulatov
04dd9fdc54 Further improve color inlay hints in multi buffers (#33642)
Follow-up of https://github.com/zed-industries/zed/pull/33605

Release Notes:

- N/A
2025-06-30 12:24:10 +03:00
Kirill Bulatov
170d6d58d4 Rework color indicators visual representation (#33605)
Use div-based rendering instead of rendering the indicator as text

Closes https://github.com/zed-industries/zed/discussions/33507

Before:
<img width="410" alt="before_dark"
src="https://github.com/user-attachments/assets/66ad63ae-7836-4dc7-8176-a2ff5a38bcd4"
/>
After:
<img width="407" alt="after_dark"
src="https://github.com/user-attachments/assets/0b627da8-461b-4f19-b236-4a69bf5952a0"
/>


Before:
<img width="409" alt="before_light"
src="https://github.com/user-attachments/assets/ebcfabec-fcda-4b63-aee6-c702888f0db4"
/>
After:
<img width="410" alt="after_light"
src="https://github.com/user-attachments/assets/c0da42a1-d6b3-4e08-a56c-9966c07e442d"
/>

The border is not as high-contrast as in the VSCode examples in the issue,
but I believe I'm using the right color in

1e11de48ee/crates/editor/src/display_map/inlay_map.rs (L357)

based on 


41583fb066/crates/theme/src/styles/colors.rs (L16-L17)

Another oddity is that the border starts to shrink on `cmd-=`
(`zed::IncreaseBufferFontSize`):

<img width="1244" alt="image"
src="https://github.com/user-attachments/assets/f424edc0-ca0c-4b02-96d4-6da7bf70449a"
/>

but that requires adjusting a different part of the code, so it is skipped here.

Tailwind CSS example:

<img width="1108" alt="image"
src="https://github.com/user-attachments/assets/10ada4dc-ea8c-46d3-b285-d895bbd6a619"
/>


Release Notes:

- Reworked the visual representation of color indicators
2025-06-30 12:17:11 +03:00
Kirill Bulatov
1d11ec3ca2 Fix document colors issues with other inlays and multi buffers (#33598)
Closes https://github.com/zed-industries/zed/issues/33575

* Fixes inlay colors being spoiled after document colors are displayed
* Optimizes the query pattern for large multi buffers

Release Notes:

- Fixed document colors issues with other inlays and multi buffers
2025-06-29 00:18:21 +03:00
gcp-cherry-pick-bot[bot]
d08351bbb7
Don't panic on vintage files (cherry-pick #33543) (#33551)
Cherry-picked Don't panic on vintage files (#33543)

Release Notes:

- remoting: Fix a crash on the remote side when encountering files from
before 1970.
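The crash class behind "files from before 1970" is typically `SystemTime::duration_since(UNIX_EPOCH)` returning an `Err` for pre-epoch mtimes, which a bare `.unwrap()` turns into a panic. A hedged sketch of the defensive handling (not Zed's actual code):

```rust
use std::time::{Duration, SystemTime, UNIX_EPOCH};

// Seconds since the epoch, clamping pre-1970 timestamps to zero instead of
// panicking. `duration_since` returns Err when `mtime` is before the epoch,
// so calling `.unwrap()` on it is exactly the kind of code that crashes on
// vintage files.
fn mtime_secs(mtime: SystemTime) -> u64 {
    mtime
        .duration_since(UNIX_EPOCH)
        .unwrap_or(Duration::ZERO) // pre-1970 file: treat as the epoch
        .as_secs()
}

fn main() {
    let vintage = UNIX_EPOCH - Duration::from_secs(86_400); // a 1969 mtime
    assert_eq!(mtime_secs(vintage), 0);
    assert_eq!(mtime_secs(UNIX_EPOCH + Duration::from_secs(5)), 5);
}
```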

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-06-27 13:40:35 -06:00
Zed Bot
4961830845 Bump to 0.193.2 for @SomeoneToIgnore 2025-06-27 16:45:08 +00:00
Kirill Bulatov
5a73af1af1 Respect server capabilities on queries (#33538)
Closes https://github.com/zed-industries/zed/issues/33522

Turns out a bunch of Zed requests were not checking server capabilities
correctly, due to odd copy-paste and a default that assumed the
capabilities were met.

Adjusts the code (including document colors) and adds a test for the
colors case.
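The shape of the fix — treating an unannounced capability as "unsupported" rather than "supported" — can be sketched with a simplified, hypothetical capability type (not Zed's or lsp-types' actual API):

```rust
// Simplified stand-in for an LSP server's color capability:
// None = capability not announced, Some(false) = explicitly unsupported.
fn supports_document_colors(color_provider: Option<bool>) -> bool {
    // The bug class: `.unwrap_or(true)` assumed support when the server said
    // nothing, producing excessive requests. The fix is to default to *not*
    // supported and only query servers that opted in.
    color_provider.unwrap_or(false)
}

fn main() {
    assert!(!supports_document_colors(None)); // absent => don't query
    assert!(!supports_document_colors(Some(false)));
    assert!(supports_document_colors(Some(true)));
}
```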

Release Notes:

- Fixed excessive document colors requests for unrelated files
2025-06-27 19:41:14 +03:00
gcp-cherry-pick-bot[bot]
ebe2056745
Fix blend alpha colors with editor background in inline preview (cherry-pick #33513) (#33517)
Cherry-picked Fix blend alpha colors with editor background in inline
preview (#33513)

Closes #33505

## Before

<img width="434" alt="Screenshot 2025-06-27 at 12 22 57"

src="https://github.com/user-attachments/assets/ac215a39-b3fe-4c9e-bd7d-0d7568d5fd1f"
/>

## After

<img width="441" alt="Screenshot 2025-06-27 at 12 22 47"

src="https://github.com/user-attachments/assets/28218ed6-c1aa-4d3f-a268-def2fa9f0340"
/>

Release Notes:

- Fixed inline color previews not correctly blending alpha/transparency
values with the editor background
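The blend described here is standard "source-over" alpha compositing of the preview color onto the opaque editor background. A minimal sketch of the math, assuming a hypothetical `Rgba` type (gpui's actual `blend` lives on its own color types):

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
struct Rgba {
    r: f32,
    g: f32,
    b: f32,
    a: f32,
}

// Source-over compositing of `fg` (which may be transparent) onto an
// opaque `bg`: out = bg * (1 - alpha) + fg * alpha.
fn blend_over(bg: Rgba, fg: Rgba) -> Rgba {
    let a = fg.a;
    Rgba {
        r: bg.r * (1.0 - a) + fg.r * a,
        g: bg.g * (1.0 - a) + fg.g * a,
        b: bg.b * (1.0 - a) + fg.b * a,
        a: 1.0, // result is opaque once composited onto an opaque background
    }
}

fn main() {
    let bg = Rgba { r: 1.0, g: 1.0, b: 1.0, a: 1.0 }; // white editor background
    let fg = Rgba { r: 0.0, g: 0.0, b: 0.0, a: 0.5 }; // 50% transparent black
    let out = blend_over(bg, fg);
    assert_eq!(out.r, 0.5); // mid-gray, as a user would expect on white
    assert_eq!(out.a, 1.0);
}
```

Without this step, a half-transparent color swatch is drawn as if it were fully opaque, which is the mismatch the screenshots show.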

Co-authored-by: ddoemonn <109994179+ddoemonn@users.noreply.github.com>
2025-06-27 13:04:51 +03:00
gcp-cherry-pick-bot[bot]
a2b18a0b65
debugger: Fix treatment of node-terminal scenarios (cherry-pick #33432) (#33469)
Cherry-picked debugger: Fix treatment of node-terminal scenarios
(#33432)

- Normalize `node-terminal` to `pwa-node` before sending to DAP
- Split `command` into `program` and `args`
- Run in external console
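The "split `command` into `program` and `args`" step is shell-style word splitting; the diff below does it with the `shlex` crate. A simplified, self-contained sketch using whitespace splitting (note: unlike `shlex::split`, this ignores quoting, so it is only an approximation):

```rust
// Split a `node-terminal`-style `command` string into (program, args).
// Simplified: whitespace splitting only. The real implementation uses the
// `shlex` crate so quoted arguments like `node "my script.js"` survive.
fn split_command(command: &str) -> Option<(String, Vec<String>)> {
    let mut words = command.split_whitespace().map(str::to_owned);
    let program = words.next()?; // empty command => None
    Some((program, words.collect()))
}

fn main() {
    let (program, args) = split_command("npm run dev --port 3000").unwrap();
    assert_eq!(program, "npm");
    assert_eq!(args, ["run", "dev", "--port", "3000"]);
    assert!(split_command("   ").is_none());
}
```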

Release Notes:

- debugger: Fixed debugging JavaScript tasks that used `"type":
"node-terminal"`.

Co-authored-by: Cole Miller <cole@zed.dev>
2025-06-26 15:02:51 -04:00
vipex
e0090a1680 pane: Update pinned tab count when it exceeds actual tab count (#33405)
## Summary

This PR improves the workaround introduced in #33335 that handles cases
where the pinned tab count exceeds the actual tab count during workspace
deserialization.

## Problem

The original workaround in #33335 successfully prevented the panic but
had two issues:
1. **Console spam**: the warning message was logged repeatedly because
`self.pinned_tab_count` wasn't updated to match the actual tab count.
2. **Auto-pinning behavior**: after the workaround was triggered, new tabs
were automatically pinned until the old pinned tab count was exceeded.

## Solution

Updates the defensive code to set `self.pinned_tab_count = tab_count`
when the mismatch is detected, ensuring:
- The warning is only logged once when encountered.
- New tabs behave normally (aren't auto-pinned)
- The workspace remains in a consistent state
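The defensive clamp can be sketched as follows, using a hypothetical simplified `Pane` (not the actual workspace type):

```rust
// Defensive fix applied during workspace deserialization: if the persisted
// pinned-tab count exceeds the actual tab count, clamp it once. Updating
// the stored count is what stops the warning from repeating and keeps new
// tabs from being auto-pinned. Hypothetical simplified shape.
struct Pane {
    pinned_tab_count: usize,
}

impl Pane {
    /// Returns true if a mismatch was detected (i.e. a warning was logged).
    fn reconcile_pinned_tabs(&mut self, tab_count: usize) -> bool {
        if self.pinned_tab_count > tab_count {
            eprintln!(
                "pinned tab count {} exceeds tab count {}; clamping",
                self.pinned_tab_count, tab_count
            );
            self.pinned_tab_count = tab_count; // keep state consistent
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut pane = Pane { pinned_tab_count: 5 };
    assert!(pane.reconcile_pinned_tabs(3)); // mismatch: warn once and clamp
    assert_eq!(pane.pinned_tab_count, 3);
    assert!(!pane.reconcile_pinned_tabs(3)); // consistent: no repeat warning
}
```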

This is an immediate fix for the workaround. I'll open a follow-up PR when
I get the chance to address the root cause by implementing serialization
for empty untitled tabs, as discussed in #33342.

Release Notes:

- N/A
2025-06-26 02:05:00 -04:00
Joseph T. Lyons
ca43498e10 zed 0.193.1 2025-06-25 19:05:50 -04:00
Max Brunsfeld
f315b6f1ed Restore missing initialization of text thread actions (#33422)
Fixes a regression introduced in
https://github.com/zed-industries/zed/pull/33289

Release Notes:

- Fixed a bug where some text thread actions were accidentally removed.
2025-06-25 15:52:58 -07:00
Joseph T. Lyons
6488524196 v0.193.x preview 2025-06-25 11:35:00 -04:00
28 changed files with 855 additions and 399 deletions

Cargo.lock generated
View file

@@ -4161,6 +4161,7 @@ dependencies = [
  "paths",
  "serde",
  "serde_json",
+ "shlex",
  "task",
  "util",
  "workspace-hack",
@@ -19919,7 +19920,7 @@ dependencies = [

 [[package]]
 name = "zed"
-version = "0.193.0"
+version = "0.193.3"
 dependencies = [
  "activity_indicator",
  "agent",

View file

@@ -48,7 +48,7 @@ pub use crate::agent_panel::{AgentPanel, ConcreteAssistantPanelDelegate};
 pub use crate::inline_assistant::InlineAssistant;
 use crate::slash_command_settings::SlashCommandSettings;
 pub use agent_diff::{AgentDiffPane, AgentDiffToolbar};
-pub use text_thread_editor::AgentPanelDelegate;
+pub use text_thread_editor::{AgentPanelDelegate, TextThreadEditor};
 pub use ui::preview::{all_agent_previews, get_agent_preview};

 actions!(
@@ -157,6 +157,7 @@ pub fn init(
     agent::init(cx);
     agent_panel::init(cx);
     context_server_configuration::init(language_registry.clone(), fs.clone(), cx);
+    TextThreadEditor::init(cx);
     register_slash_commands(cx);
     inline_assistant::init(

View file

@@ -33,6 +33,7 @@ log.workspace = true
 paths.workspace = true
 serde.workspace = true
 serde_json.workspace = true
+shlex.workspace = true
 task.workspace = true
 util.workspace = true
 workspace-hack.workspace = true

View file

@@ -5,7 +5,7 @@ use gpui::AsyncApp;
 use serde_json::Value;
 use std::{collections::HashMap, path::PathBuf, sync::OnceLock};
 use task::DebugRequest;
-use util::ResultExt;
+use util::{ResultExt, maybe};

 use crate::*;
@@ -72,6 +72,24 @@ impl JsDebugAdapter {
         let mut configuration = task_definition.config.clone();
         if let Some(configuration) = configuration.as_object_mut() {
+            maybe!({
+                configuration
+                    .get("type")
+                    .filter(|value| value == &"node-terminal")?;
+                let command = configuration.get("command")?.as_str()?.to_owned();
+                let mut args = shlex::split(&command)?.into_iter();
+                let program = args.next()?;
+                configuration.insert("program".to_owned(), program.into());
+                configuration.insert(
+                    "args".to_owned(),
+                    args.map(Value::from).collect::<Vec<_>>().into(),
+                );
+                configuration.insert("console".to_owned(), "externalTerminal".into());
+                Some(())
+            });
+
+            configuration.entry("type").and_modify(normalize_task_type);
             if let Some(program) = configuration
                 .get("program")
                 .cloned()
@@ -96,7 +114,6 @@ impl JsDebugAdapter {
                 .entry("cwd")
                 .or_insert(delegate.worktree_root_path().to_string_lossy().into());

-            configuration.entry("type").and_modify(normalize_task_type);
             configuration
                 .entry("console")
                 .or_insert("externalTerminal".into());
@@ -512,7 +529,7 @@ fn normalize_task_type(task_type: &mut Value) {
     };

     let new_name = match task_type_str {
-        "node" | "pwa-node" => "pwa-node",
+        "node" | "pwa-node" | "node-terminal" => "pwa-node",
         "chrome" | "pwa-chrome" => "pwa-chrome",
         "edge" | "msedge" | "pwa-edge" | "pwa-msedge" => "pwa-msedge",
         _ => task_type_str,

View file

@@ -37,7 +37,9 @@ pub use block_map::{
 use block_map::{BlockRow, BlockSnapshot};
 use collections::{HashMap, HashSet};
 pub use crease_map::*;
-pub use fold_map::{ChunkRenderer, ChunkRendererContext, Fold, FoldId, FoldPlaceholder, FoldPoint};
+pub use fold_map::{
+    ChunkRenderer, ChunkRendererContext, ChunkRendererId, Fold, FoldId, FoldPlaceholder, FoldPoint,
+};
 use fold_map::{FoldMap, FoldSnapshot};
 use gpui::{App, Context, Entity, Font, HighlightStyle, LineLayout, Pixels, UnderlineStyle};
 pub use inlay_map::Inlay;
@@ -538,7 +540,7 @@ impl DisplayMap {
     pub fn update_fold_widths(
         &mut self,
-        widths: impl IntoIterator<Item = (FoldId, Pixels)>,
+        widths: impl IntoIterator<Item = (ChunkRendererId, Pixels)>,
         cx: &mut Context<Self>,
     ) -> bool {
         let snapshot = self.buffer.read(cx).snapshot(cx);
@@ -966,10 +968,22 @@ impl DisplaySnapshot {
                 .and_then(|id| id.style(&editor_style.syntax));

             if let Some(chunk_highlight) = chunk.highlight_style {
+                // For color inlays, blend the color with the editor background
+                let mut processed_highlight = chunk_highlight;
+                if chunk.is_inlay {
+                    if let Some(inlay_color) = chunk_highlight.color {
+                        // Only blend if the color has transparency (alpha < 1.0)
+                        if inlay_color.a < 1.0 {
+                            let blended_color = editor_style.background.blend(inlay_color);
+                            processed_highlight.color = Some(blended_color);
+                        }
+                    }
+                }
                 if let Some(highlight_style) = highlight_style.as_mut() {
-                    highlight_style.highlight(chunk_highlight);
+                    highlight_style.highlight(processed_highlight);
                 } else {
-                    highlight_style = Some(chunk_highlight);
+                    highlight_style = Some(processed_highlight);
                 }
             }

View file

@@ -1,3 +1,5 @@
+use crate::{InlayId, display_map::inlay_map::InlayChunk};
+
 use super::{
     Highlights,
     inlay_map::{InlayBufferRows, InlayChunks, InlayEdit, InlayOffset, InlayPoint, InlaySnapshot},
@@ -275,13 +277,16 @@ impl FoldMapWriter<'_> {
     pub(crate) fn update_fold_widths(
         &mut self,
-        new_widths: impl IntoIterator<Item = (FoldId, Pixels)>,
+        new_widths: impl IntoIterator<Item = (ChunkRendererId, Pixels)>,
     ) -> (FoldSnapshot, Vec<FoldEdit>) {
         let mut edits = Vec::new();
         let inlay_snapshot = self.0.snapshot.inlay_snapshot.clone();
         let buffer = &inlay_snapshot.buffer;
         for (id, new_width) in new_widths {
+            let ChunkRendererId::Fold(id) = id else {
+                continue;
+            };
             if let Some(metadata) = self.0.snapshot.fold_metadata_by_id.get(&id).cloned() {
                 if Some(new_width) != metadata.width {
                     let buffer_start = metadata.range.start.to_offset(buffer);
@@ -527,7 +532,7 @@ impl FoldMap {
                     placeholder: Some(TransformPlaceholder {
                         text: ELLIPSIS,
                         renderer: ChunkRenderer {
-                            id: fold.id,
+                            id: ChunkRendererId::Fold(fold.id),
                             render: Arc::new(move |cx| {
                                 (fold.placeholder.render)(
                                     fold_id,
@@ -1060,7 +1065,7 @@ impl sum_tree::Summary for TransformSummary {
 }

 #[derive(Copy, Clone, Eq, PartialEq, Debug, Default, Ord, PartialOrd, Hash)]
-pub struct FoldId(usize);
+pub struct FoldId(pub(super) usize);

 impl From<FoldId> for ElementId {
     fn from(val: FoldId) -> Self {
@@ -1265,11 +1270,17 @@ pub struct Chunk<'a> {
     pub renderer: Option<ChunkRenderer>,
 }

+#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
+pub enum ChunkRendererId {
+    Fold(FoldId),
+    Inlay(InlayId),
+}
+
 /// A recipe for how the chunk should be presented.
 #[derive(Clone)]
 pub struct ChunkRenderer {
-    /// The id of the fold associated with this chunk.
-    pub id: FoldId,
+    /// The id of the renderer associated with this chunk.
+    pub id: ChunkRendererId,
     /// Creates a custom element to represent this chunk.
     pub render: Arc<dyn Send + Sync + Fn(&mut ChunkRendererContext) -> AnyElement>,
     /// If true, the element is constrained to the shaped width of the text.
@@ -1311,7 +1322,7 @@ impl DerefMut for ChunkRendererContext<'_, '_> {
 pub struct FoldChunks<'a> {
     transform_cursor: Cursor<'a, Transform, (FoldOffset, InlayOffset)>,
     inlay_chunks: InlayChunks<'a>,
-    inlay_chunk: Option<(InlayOffset, language::Chunk<'a>)>,
+    inlay_chunk: Option<(InlayOffset, InlayChunk<'a>)>,
     inlay_offset: InlayOffset,
     output_offset: FoldOffset,
     max_output_offset: FoldOffset,
@@ -1403,7 +1414,8 @@ impl<'a> Iterator for FoldChunks<'a> {
         }

         // Otherwise, take a chunk from the buffer's text.
-        if let Some((buffer_chunk_start, mut chunk)) = self.inlay_chunk.clone() {
+        if let Some((buffer_chunk_start, mut inlay_chunk)) = self.inlay_chunk.clone() {
+            let chunk = &mut inlay_chunk.chunk;
             let buffer_chunk_end = buffer_chunk_start + InlayOffset(chunk.text.len());
             let transform_end = self.transform_cursor.end(&()).1;
             let chunk_end = buffer_chunk_end.min(transform_end);
@@ -1428,7 +1440,7 @@ impl<'a> Iterator for FoldChunks<'a> {
                 is_tab: chunk.is_tab,
                 is_inlay: chunk.is_inlay,
                 underline: chunk.underline,
-                renderer: None,
+                renderer: inlay_chunk.renderer,
             });
         }
} }

View file

@@ -1,4 +1,4 @@
-use crate::{HighlightStyles, InlayId};
+use crate::{ChunkRenderer, HighlightStyles, InlayId};
 use collections::BTreeSet;
 use gpui::{Hsla, Rgba};
 use language::{Chunk, Edit, Point, TextSummary};
@@ -8,11 +8,13 @@ use multi_buffer::{
 use std::{
     cmp,
     ops::{Add, AddAssign, Range, Sub, SubAssign},
+    sync::Arc,
 };
 use sum_tree::{Bias, Cursor, SumTree};
 use text::{Patch, Rope};
+use ui::{ActiveTheme, IntoElement as _, ParentElement as _, Styled as _, div};

-use super::{Highlights, custom_highlights::CustomHighlightsChunks};
+use super::{Highlights, custom_highlights::CustomHighlightsChunks, fold_map::ChunkRendererId};

 /// Decides where the [`Inlay`]s should be displayed.
 ///
@@ -252,6 +254,13 @@ pub struct InlayChunks<'a> {
     snapshot: &'a InlaySnapshot,
 }

+#[derive(Clone)]
+pub struct InlayChunk<'a> {
+    pub chunk: Chunk<'a>,
+    /// Whether the inlay should be customly rendered.
+    pub renderer: Option<ChunkRenderer>,
+}
+
 impl InlayChunks<'_> {
     pub fn seek(&mut self, new_range: Range<InlayOffset>) {
         self.transforms.seek(&new_range.start, Bias::Right, &());
@@ -271,7 +280,7 @@ impl InlayChunks<'_> {
 }

 impl<'a> Iterator for InlayChunks<'a> {
-    type Item = Chunk<'a>;
+    type Item = InlayChunk<'a>;

     fn next(&mut self) -> Option<Self::Item> {
         if self.output_offset == self.max_output_offset {
@@ -296,9 +305,12 @@ impl<'a> Iterator for InlayChunks<'a> {
                 chunk.text = suffix;
                 self.output_offset.0 += prefix.len();

-                Chunk {
-                    text: prefix,
-                    ..chunk.clone()
+                InlayChunk {
+                    chunk: Chunk {
+                        text: prefix,
+                        ..chunk.clone()
+                    },
+                    renderer: None,
                 }
             }
             Transform::Inlay(inlay) => {
@@ -313,6 +325,7 @@ impl<'a> Iterator for InlayChunks<'a> {
                     }
                 }

+                let mut renderer = None;
                 let mut highlight_style = match inlay.id {
                     InlayId::InlineCompletion(_) => {
                         self.highlight_styles.inline_completion.map(|s| {
@@ -325,14 +338,33 @@ impl<'a> Iterator for InlayChunks<'a> {
                     }
                     InlayId::Hint(_) => self.highlight_styles.inlay_hint,
                     InlayId::DebuggerValue(_) => self.highlight_styles.inlay_hint,
-                    InlayId::Color(_) => match inlay.color {
-                        Some(color) => {
-                            let style = self.highlight_styles.inlay_hint.get_or_insert_default();
-                            style.color = Some(color);
-                            Some(*style)
+                    InlayId::Color(_) => {
+                        if let Some(color) = inlay.color {
+                            renderer = Some(ChunkRenderer {
+                                id: ChunkRendererId::Inlay(inlay.id),
+                                render: Arc::new(move |cx| {
+                                    div()
+                                        .w_4()
+                                        .h_4()
+                                        .relative()
+                                        .child(
+                                            div()
+                                                .absolute()
+                                                .right_1()
+                                                .w_3p5()
+                                                .h_3p5()
+                                                .border_2()
+                                                .border_color(cx.theme().colors().border)
+                                                .bg(color),
+                                        )
+                                        .into_any_element()
+                                }),
+                                constrain_width: false,
+                                measured_width: None,
+                            });
                         }
-                        None => self.highlight_styles.inlay_hint,
-                    },
+                        self.highlight_styles.inlay_hint
+                    }
                 };

                 let next_inlay_highlight_endpoint;
                 let offset_in_inlay = self.output_offset - self.transforms.start().0;
@@ -370,11 +402,14 @@ impl<'a> Iterator for InlayChunks<'a> {
                 self.output_offset.0 += chunk.len();

-                Chunk {
-                    text: chunk,
-                    highlight_style,
-                    is_inlay: true,
-                    ..Default::default()
+                InlayChunk {
+                    chunk: Chunk {
+                        text: chunk,
+                        highlight_style,
+                        is_inlay: true,
+                        ..Chunk::default()
+                    },
+                    renderer,
                 }
             }
         };
@@ -1066,7 +1101,7 @@ impl InlaySnapshot {
     #[cfg(test)]
     pub fn text(&self) -> String {
         self.chunks(Default::default()..self.len(), false, Highlights::default())
-            .map(|chunk| chunk.text)
+            .map(|chunk| chunk.chunk.text)
             .collect()
     }
@@ -1704,7 +1739,7 @@ mod tests {
                 ..Highlights::default()
             },
         )
-        .map(|chunk| chunk.text)
+        .map(|chunk| chunk.chunk.text)
         .collect::<String>();
         assert_eq!(
             actual_text,

View file

@@ -547,6 +547,7 @@ pub enum SoftWrap {
 #[derive(Clone)]
 pub struct EditorStyle {
     pub background: Hsla,
+    pub border: Hsla,
     pub local_player: PlayerColor,
     pub text: TextStyle,
     pub scrollbar_width: Pixels,
@@ -562,6 +563,7 @@ impl Default for EditorStyle {
     fn default() -> Self {
         Self {
             background: Hsla::default(),
+            border: Hsla::default(),
             local_player: PlayerColor::default(),
             text: TextStyle::default(),
             scrollbar_width: Pixels::default(),
@@ -1825,13 +1827,13 @@ impl Editor {
                 editor
                     .refresh_inlay_hints(InlayHintRefreshReason::RefreshRequested, cx);
             }
-            project::Event::LanguageServerAdded(server_id, ..)
-            | project::Event::LanguageServerRemoved(server_id) => {
+            project::Event::LanguageServerAdded(..)
+            | project::Event::LanguageServerRemoved(..) => {
                 if editor.tasks_update_task.is_none() {
                     editor.tasks_update_task =
                         Some(editor.refresh_runnables(window, cx));
                 }
-                editor.update_lsp_data(Some(*server_id), None, window, cx);
+                editor.update_lsp_data(true, None, window, cx);
             }
             project::Event::SnippetEdit(id, snippet_edits) => {
                 if let Some(buffer) = editor.buffer.read(cx).buffer(*id) {
@@ -2270,7 +2272,7 @@ impl Editor {
             editor.minimap =
                 editor.create_minimap(EditorSettings::get_global(cx).minimap, window, cx);
             editor.colors = Some(LspColorData::new(cx));
-            editor.update_lsp_data(None, None, window, cx);
+            editor.update_lsp_data(false, None, window, cx);
         }

         editor.report_editor_event("Editor Opened", None, cx);
@@ -5072,7 +5074,7 @@ impl Editor {
             to_insert,
         }) = self.inlay_hint_cache.spawn_hint_refresh(
             reason_description,
-            self.excerpts_for_inlay_hints_query(required_languages.as_ref(), cx),
+            self.visible_excerpts(required_languages.as_ref(), cx),
             invalidate_cache,
             ignore_debounce,
             cx,
@@ -5090,7 +5092,7 @@ impl Editor {
             .collect()
     }

-    pub fn excerpts_for_inlay_hints_query(
+    pub fn visible_excerpts(
         &self,
         restrict_to_languages: Option<&HashSet<Arc<Language>>>,
         cx: &mut Context<Editor>,
@@ -17191,9 +17193,9 @@ impl Editor {
         self.active_indent_guides_state.dirty = true;
     }

-    pub fn update_fold_widths(
+    pub fn update_renderer_widths(
         &mut self,
-        widths: impl IntoIterator<Item = (FoldId, Pixels)>,
+        widths: impl IntoIterator<Item = (ChunkRendererId, Pixels)>,
         cx: &mut Context<Self>,
     ) -> bool {
         self.display_map
@@ -19421,7 +19423,7 @@ impl Editor {
         cx.emit(SearchEvent::MatchesInvalidated);

         if let Some(buffer) = edited_buffer {
-            self.update_lsp_data(None, Some(buffer.read(cx).remote_id()), window, cx);
+            self.update_lsp_data(false, Some(buffer.read(cx).remote_id()), window, cx);
         }

         if *singleton_buffer_edited {
@@ -19486,7 +19488,7 @@ impl Editor {
                         .detach();
                 }
             }
-            self.update_lsp_data(None, Some(buffer_id), window, cx);
+            self.update_lsp_data(false, Some(buffer_id), window, cx);
             cx.emit(EditorEvent::ExcerptsAdded {
                 buffer: buffer.clone(),
                 predecessor: *predecessor,
@@ -19672,7 +19674,7 @@ impl Editor {
             if !inlay_splice.to_insert.is_empty() || !inlay_splice.to_remove.is_empty() {
                 self.splice_inlays(&inlay_splice.to_remove, inlay_splice.to_insert, cx);
             }
-            self.refresh_colors(None, None, window, cx);
+            self.refresh_colors(false, None, window, cx);
         }

         cx.notify();
@@ -20561,13 +20563,13 @@ impl Editor {
     fn update_lsp_data(
         &mut self,
-        for_server_id: Option<LanguageServerId>,
+        ignore_cache: bool,
         for_buffer: Option<BufferId>,
         window: &mut Window,
         cx: &mut Context<'_, Self>,
     ) {
         self.pull_diagnostics(for_buffer, window, cx);
-        self.refresh_colors(for_server_id, for_buffer, window, cx);
+        self.refresh_colors(ignore_cache, for_buffer, window, cx);
     }
 }

@@ -22252,6 +22254,7 @@ impl Render for Editor {
             &cx.entity(),
             EditorStyle {
                 background,
+                border: cx.theme().colors().border,
                 local_player: cx.theme().players().local(),
                 text: text_style,
                 scrollbar_width: EditorElement::SCROLLBAR_WIDTH,

View file

@ -55,7 +55,8 @@ use util::{
uri, uri,
}; };
use workspace::{ use workspace::{
CloseActiveItem, CloseAllItems, CloseInactiveItems, NavigationEntry, OpenOptions, ViewId, CloseActiveItem, CloseAllItems, CloseInactiveItems, MoveItemToPaneInDirection, NavigationEntry,
OpenOptions, ViewId,
item::{FollowEvent, FollowableItem, Item, ItemHandle, SaveOptions}, item::{FollowEvent, FollowableItem, Item, ItemHandle, SaveOptions},
}; };
@ -22547,8 +22548,8 @@ async fn test_add_selection_after_moving_with_multiple_cursors(cx: &mut TestAppC
); );
} }
#[gpui::test] #[gpui::test(iterations = 10)]
async fn test_mtime_and_document_colors(cx: &mut TestAppContext) { async fn test_document_colors(cx: &mut TestAppContext) {
let expected_color = Rgba { let expected_color = Rgba {
r: 0.33, r: 0.33,
g: 0.33, g: 0.33,
@ -22580,6 +22581,18 @@ async fn test_mtime_and_document_colors(cx: &mut TestAppContext) {
color_provider: Some(lsp::ColorProviderCapability::Simple(true)), color_provider: Some(lsp::ColorProviderCapability::Simple(true)),
..lsp::ServerCapabilities::default() ..lsp::ServerCapabilities::default()
}, },
name: "rust-analyzer",
..FakeLspAdapter::default()
},
);
let mut fake_servers_without_capabilities = language_registry.register_fake_lsp(
"Rust",
FakeLspAdapter {
capabilities: lsp::ServerCapabilities {
color_provider: Some(lsp::ColorProviderCapability::Simple(false)),
..lsp::ServerCapabilities::default()
},
name: "not-rust-analyzer",
..FakeLspAdapter::default() ..FakeLspAdapter::default()
}, },
); );
@ -22599,6 +22612,8 @@ async fn test_mtime_and_document_colors(cx: &mut TestAppContext) {
.downcast::<Editor>() .downcast::<Editor>()
.unwrap(); .unwrap();
let fake_language_server = fake_servers.next().await.unwrap(); let fake_language_server = fake_servers.next().await.unwrap();
let fake_language_server_without_capabilities =
fake_servers_without_capabilities.next().await.unwrap();
     let requests_made = Arc::new(AtomicUsize::new(0));
     let closure_requests_made = Arc::clone(&requests_made);
     let mut color_request_handle = fake_language_server
@@ -22610,44 +22625,118 @@ async fn test_mtime_and_document_colors(cx: &mut TestAppContext) {
                 lsp::Url::from_file_path(path!("/a/first.rs")).unwrap()
             );
             requests_made.fetch_add(1, atomic::Ordering::Release);
-            Ok(vec![lsp::ColorInformation {
-                range: lsp::Range {
-                    start: lsp::Position {
-                        line: 0,
-                        character: 0,
-                    },
-                    end: lsp::Position {
-                        line: 0,
-                        character: 1,
-                    },
-                },
-                color: lsp::Color {
-                    red: 0.33,
-                    green: 0.33,
-                    blue: 0.33,
-                    alpha: 0.33,
-                },
-            }])
+            Ok(vec![
+                lsp::ColorInformation {
+                    range: lsp::Range {
+                        start: lsp::Position {
+                            line: 0,
+                            character: 0,
+                        },
+                        end: lsp::Position {
+                            line: 0,
+                            character: 1,
+                        },
+                    },
+                    color: lsp::Color {
+                        red: 0.33,
+                        green: 0.33,
+                        blue: 0.33,
+                        alpha: 0.33,
+                    },
+                },
+                lsp::ColorInformation {
+                    range: lsp::Range {
+                        start: lsp::Position {
+                            line: 0,
+                            character: 0,
+                        },
+                        end: lsp::Position {
+                            line: 0,
+                            character: 1,
+                        },
+                    },
+                    color: lsp::Color {
+                        red: 0.33,
+                        green: 0.33,
+                        blue: 0.33,
+                        alpha: 0.33,
+                    },
+                },
+            ])
         }
     });
-    color_request_handle.next().await.unwrap();
-    cx.run_until_parked();
+    let _handle = fake_language_server_without_capabilities
+        .set_request_handler::<lsp::request::DocumentColor, _, _>(move |_, _| async move {
+            panic!("Should not be called");
+        });
+    cx.executor().advance_clock(Duration::from_millis(100));
     color_request_handle.next().await.unwrap();
     cx.run_until_parked();
     assert_eq!(
-        2,
+        1,
         requests_made.load(atomic::Ordering::Acquire),
-        "Should query for colors once per editor open and once after the language server startup"
+        "Should query for colors once per editor open"
     );
editor.update_in(cx, |editor, _, cx| {
cx.executor().advance_clock(Duration::from_millis(500));
let save = editor.update_in(cx, |editor, window, cx| {
         assert_eq!(
             vec![expected_color],
             extract_color_inlays(editor, cx),
             "Should have an initial inlay"
         );
});
+    // opening another file in a split should not influence the LSP query counter
+    workspace
+        .update(cx, |workspace, window, cx| {
+            assert_eq!(
+                workspace.panes().len(),
+                1,
+                "Should have one pane with one editor"
+            );
+            workspace.move_item_to_pane_in_direction(
+                &MoveItemToPaneInDirection {
+                    direction: SplitDirection::Right,
+                    focus: false,
+                    clone: true,
+                },
+                window,
+                cx,
+            );
+        })
+        .unwrap();
+    cx.run_until_parked();
+    workspace
+        .update(cx, |workspace, _, cx| {
+            let panes = workspace.panes();
+            assert_eq!(panes.len(), 2, "Should have two panes after splitting");
+            for pane in panes {
+                let editor = pane
+                    .read(cx)
+                    .active_item()
+                    .and_then(|item| item.downcast::<Editor>())
+                    .expect("Should have opened an editor in each split");
+                let editor_file = editor
+                    .read(cx)
+                    .buffer()
+                    .read(cx)
+                    .as_singleton()
+                    .expect("test deals with singleton buffers")
+                    .read(cx)
+                    .file()
+                    .expect("test buffese should have a file")
+                    .path();
+                assert_eq!(
+                    editor_file.as_ref(),
+                    Path::new("first.rs"),
+                    "Both editors should be opened for the same file"
+                )
+            }
+        })
+        .unwrap();
+    cx.executor().advance_clock(Duration::from_millis(500));
+    let save = editor.update_in(cx, |editor, window, cx| {
         editor.move_to_end(&MoveToEnd, window, cx);
         editor.handle_input("dirty", window, cx);
         editor.save(
@@ -22662,12 +22751,10 @@ async fn test_mtime_and_document_colors(cx: &mut TestAppContext) {
     });
     save.await.unwrap();
-    color_request_handle.next().await.unwrap();
-    cx.run_until_parked();
     color_request_handle.next().await.unwrap();
     cx.run_until_parked();
     assert_eq!(
-        4,
+        3,
         requests_made.load(atomic::Ordering::Acquire),
         "Should query for colors once per save and once per formatting after save"
     );
@@ -22681,11 +22768,27 @@ async fn test_mtime_and_document_colors(cx: &mut TestAppContext) {
         })
         .unwrap();
     close.await.unwrap();
+    let close = workspace
+        .update(cx, |workspace, window, cx| {
+            workspace.active_pane().update(cx, |pane, cx| {
+                pane.close_active_item(&CloseActiveItem::default(), window, cx)
+            })
+        })
+        .unwrap();
+    close.await.unwrap();
     assert_eq!(
-        4,
+        3,
         requests_made.load(atomic::Ordering::Acquire),
-        "After saving and closing the editor, no extra requests should be made"
+        "After saving and closing all editors, no extra requests should be made"
     );
+    workspace
+        .update(cx, |workspace, _, cx| {
+            assert!(
+                workspace.active_item(cx).is_none(),
+                "Should close all editors"
+            )
+        })
+        .unwrap();
     workspace
         .update(cx, |workspace, window, cx| {
@@ -22694,13 +22797,8 @@ async fn test_mtime_and_document_colors(cx: &mut TestAppContext) {
             })
         })
         .unwrap();
-    color_request_handle.next().await.unwrap();
+    cx.executor().advance_clock(Duration::from_millis(100));
     cx.run_until_parked();
-    assert_eq!(
-        5,
-        requests_made.load(atomic::Ordering::Acquire),
-        "After navigating back to an editor and reopening it, another color request should be made"
-    );
     let editor = workspace
         .update(cx, |workspace, _, cx| {
             workspace
@@ -22710,6 +22808,12 @@ async fn test_mtime_and_document_colors(cx: &mut TestAppContext) {
                 .expect("Should be an editor")
         })
         .unwrap();
+    color_request_handle.next().await.unwrap();
+    assert_eq!(
+        3,
+        requests_made.load(atomic::Ordering::Acquire),
+        "Cache should be reused on buffer close and reopen"
+    );
     editor.update(cx, |editor, cx| {
         assert_eq!(
             vec![expected_color],

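The final assertion above hinges on a version-gated cache: a colors request only reaches the language server when the editor's known cache version is stale, so closing and reopening a buffer reuses cached results. A minimal std-only sketch of that idea (illustrative names, not Zed's actual `LspStore` API):

```rust
// Sketch of a version-gated cache: fetch() only simulates a server request
// when the caller's known version does not match the cache's version.
struct ColorCache {
    version: usize,
    colors: Vec<u32>,
    requests_made: usize,
}

impl ColorCache {
    fn fetch(&mut self, known_version: Option<usize>) -> (usize, Vec<u32>) {
        if known_version != Some(self.version) {
            self.requests_made += 1; // simulate querying the language server
        }
        (self.version, self.colors.clone())
    }
}

fn main() {
    let mut cache = ColorCache { version: 1, colors: vec![0x335533], requests_made: 0 };
    let (version, _) = cache.fetch(None); // first open must query
    cache.fetch(Some(version)); // reopen with a known version is served from the cache
    assert_eq!(cache.requests_made, 1);
}
```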
View file

@@ -12,8 +12,8 @@ use crate::{
     ToggleFold,
     code_context_menus::{CodeActionsMenu, MENU_ASIDE_MAX_WIDTH, MENU_ASIDE_MIN_WIDTH, MENU_GAP},
     display_map::{
-        Block, BlockContext, BlockStyle, DisplaySnapshot, EditorMargins, FoldId, HighlightKey,
-        HighlightedChunk, ToDisplayPoint,
+        Block, BlockContext, BlockStyle, ChunkRendererId, DisplaySnapshot, EditorMargins,
+        HighlightKey, HighlightedChunk, ToDisplayPoint,
     },
     editor_settings::{
         CurrentLineHighlight, DocumentColorsRenderMode, DoubleClickInMultibuffer, Minimap,
@@ -7119,7 +7119,7 @@ pub(crate) struct LineWithInvisibles {
 enum LineFragment {
     Text(ShapedLine),
     Element {
-        id: FoldId,
+        id: ChunkRendererId,
         element: Option<AnyElement>,
         size: Size<Pixels>,
         len: usize,
@@ -8297,7 +8297,7 @@ impl Element for EditorElement {
             window,
             cx,
         );
-        let new_fold_widths = line_layouts
+        let new_renrerer_widths = line_layouts
             .iter()
             .flat_map(|layout| &layout.fragments)
             .filter_map(|fragment| {
@@ -8308,7 +8308,7 @@ impl Element for EditorElement {
             }
         });
         if self.editor.update(cx, |editor, cx| {
-            editor.update_fold_widths(new_fold_widths, cx)
+            editor.update_renderer_widths(new_renrerer_widths, cx)
         }) {
             // If the fold widths have changed, we need to prepaint
             // the element again to account for any changes in

View file

@@ -956,7 +956,7 @@ fn fetch_and_update_hints(
         .update(cx, |editor, cx| {
             if got_throttled {
                 let query_not_around_visible_range = match editor
-                    .excerpts_for_inlay_hints_query(None, cx)
+                    .visible_excerpts(None, cx)
                     .remove(&query.excerpt_id)
                 {
                     Some((_, _, current_visible_range)) => {
@@ -2511,9 +2511,7 @@ pub mod tests {
         cx: &mut gpui::TestAppContext,
     ) -> Range<Point> {
         let ranges = editor
-            .update(cx, |editor, _window, cx| {
-                editor.excerpts_for_inlay_hints_query(None, cx)
-            })
+            .update(cx, |editor, _window, cx| editor.visible_excerpts(None, cx))
            .unwrap();
         assert_eq!(
             ranges.len(),

View file

@@ -3,10 +3,10 @@ use std::{cmp, ops::Range};
 use collections::HashMap;
 use futures::future::join_all;
 use gpui::{Hsla, Rgba};
+use itertools::Itertools;
 use language::point_from_lsp;
-use lsp::LanguageServerId;
 use multi_buffer::Anchor;
-use project::DocumentColor;
+use project::{DocumentColor, lsp_store::ColorFetchStrategy};
 use settings::Settings as _;
 use text::{Bias, BufferId, OffsetRangeExt as _};
 use ui::{App, Context, Window};
@@ -19,16 +19,21 @@ use crate::{
 #[derive(Debug)]
 pub(super) struct LspColorData {
+    buffer_colors: HashMap<BufferId, BufferColors>,
+    render_mode: DocumentColorsRenderMode,
+}
+
+#[derive(Debug, Default)]
+struct BufferColors {
     colors: Vec<(Range<Anchor>, DocumentColor, InlayId)>,
     inlay_colors: HashMap<InlayId, usize>,
-    render_mode: DocumentColorsRenderMode,
+    cache_version_used: usize,
 }

 impl LspColorData {
     pub fn new(cx: &App) -> Self {
         Self {
-            colors: Vec::new(),
-            inlay_colors: HashMap::default(),
+            buffer_colors: HashMap::default(),
             render_mode: EditorSettings::get_global(cx).lsp_document_colors,
         }
     }
@@ -45,8 +50,9 @@ impl LspColorData {
             DocumentColorsRenderMode::Inlay => Some(InlaySplice {
                 to_remove: Vec::new(),
                 to_insert: self
-                    .colors
+                    .buffer_colors
                     .iter()
+                    .flat_map(|(_, buffer_colors)| buffer_colors.colors.iter())
                     .map(|(range, color, id)| {
                         Inlay::color(
                             id.id(),
@@ -61,33 +67,49 @@ impl LspColorData {
                     })
                     .collect(),
             }),
-            DocumentColorsRenderMode::None => {
-                self.colors.clear();
-                Some(InlaySplice {
-                    to_remove: self.inlay_colors.drain().map(|(id, _)| id).collect(),
-                    to_insert: Vec::new(),
-                })
-            }
+            DocumentColorsRenderMode::None => Some(InlaySplice {
+                to_remove: self
+                    .buffer_colors
+                    .drain()
+                    .flat_map(|(_, buffer_colors)| buffer_colors.inlay_colors)
+                    .map(|(id, _)| id)
+                    .collect(),
+                to_insert: Vec::new(),
+            }),
             DocumentColorsRenderMode::Border | DocumentColorsRenderMode::Background => {
                 Some(InlaySplice {
-                    to_remove: self.inlay_colors.drain().map(|(id, _)| id).collect(),
+                    to_remove: self
+                        .buffer_colors
+                        .iter_mut()
+                        .flat_map(|(_, buffer_colors)| buffer_colors.inlay_colors.drain())
+                        .map(|(id, _)| id)
+                        .collect(),
                     to_insert: Vec::new(),
                 })
             }
         }
     }

-    fn set_colors(&mut self, colors: Vec<(Range<Anchor>, DocumentColor, InlayId)>) -> bool {
-        if self.colors == colors {
+    fn set_colors(
+        &mut self,
+        buffer_id: BufferId,
+        colors: Vec<(Range<Anchor>, DocumentColor, InlayId)>,
+        cache_version: Option<usize>,
+    ) -> bool {
+        let buffer_colors = self.buffer_colors.entry(buffer_id).or_default();
+        if let Some(cache_version) = cache_version {
+            buffer_colors.cache_version_used = cache_version;
+        }
+        if buffer_colors.colors == colors {
             return false;
         }
-        self.inlay_colors = colors
+        buffer_colors.inlay_colors = colors
             .iter()
             .enumerate()
             .map(|(i, (_, _, id))| (*id, i))
             .collect();
-        self.colors = colors;
+        buffer_colors.colors = colors;
         true
     }
@@ -101,8 +123,9 @@ impl LspColorData {
         {
             Vec::new()
         } else {
-            self.colors
+            self.buffer_colors
                 .iter()
+                .flat_map(|(_, buffer_colors)| &buffer_colors.colors)
                 .map(|(range, color, _)| {
                     let display_range = range.clone().to_display_points(snapshot);
                     let color = Hsla::from(Rgba {
@@ -122,7 +145,7 @@ impl LspColorData {
 impl Editor {
     pub(super) fn refresh_colors(
         &mut self,
-        for_server_id: Option<LanguageServerId>,
+        ignore_cache: bool,
         buffer_id: Option<BufferId>,
         _: &Window,
         cx: &mut Context<Self>,
@@ -141,29 +164,40 @@ impl Editor {
             return;
         }

+        let visible_buffers = self
+            .visible_excerpts(None, cx)
+            .into_values()
+            .map(|(buffer, ..)| buffer)
+            .filter(|editor_buffer| {
+                buffer_id.is_none_or(|buffer_id| buffer_id == editor_buffer.read(cx).remote_id())
+            })
+            .unique_by(|buffer| buffer.read(cx).remote_id())
+            .collect::<Vec<_>>();
         let all_colors_task = project.read(cx).lsp_store().update(cx, |lsp_store, cx| {
-            self.buffer()
-                .update(cx, |multi_buffer, cx| {
-                    multi_buffer
-                        .all_buffers()
-                        .into_iter()
-                        .filter(|editor_buffer| {
-                            buffer_id.is_none_or(|buffer_id| {
-                                buffer_id == editor_buffer.read(cx).remote_id()
-                            })
-                        })
-                        .collect::<Vec<_>>()
-                })
+            visible_buffers
                 .into_iter()
                 .filter_map(|buffer| {
                     let buffer_id = buffer.read(cx).remote_id();
-                    let colors_task = lsp_store.document_colors(for_server_id, buffer, cx)?;
+                    let fetch_strategy = if ignore_cache {
+                        ColorFetchStrategy::IgnoreCache
+                    } else {
+                        ColorFetchStrategy::UseCache {
+                            known_cache_version: self.colors.as_ref().and_then(|colors| {
+                                Some(colors.buffer_colors.get(&buffer_id)?.cache_version_used)
+                            }),
+                        }
+                    };
+                    let colors_task = lsp_store.document_colors(fetch_strategy, buffer, cx)?;
                     Some(async move { (buffer_id, colors_task.await) })
                 })
                 .collect::<Vec<_>>()
         });

         cx.spawn(async move |editor, cx| {
             let all_colors = join_all(all_colors_task).await;
+            if all_colors.is_empty() {
+                return;
+            }
             let Ok((multi_buffer_snapshot, editor_excerpts)) = editor.update(cx, |editor, cx| {
                 let multi_buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
                 let editor_excerpts = multi_buffer_snapshot.excerpts().fold(
@@ -187,14 +221,14 @@ impl Editor {
                 return;
             };

-            let mut new_editor_colors = Vec::<(Range<Anchor>, DocumentColor)>::new();
+            let mut new_editor_colors = HashMap::default();
             for (buffer_id, colors) in all_colors {
                 let Some(excerpts) = editor_excerpts.get(&buffer_id) else {
                     continue;
                 };
                 match colors {
                     Ok(colors) => {
-                        for color in colors {
+                        for color in colors.colors {
                             let color_start = point_from_lsp(color.lsp_range.start);
                             let color_end = point_from_lsp(color.lsp_range.end);
@@ -227,8 +261,15 @@ impl Editor {
                                 continue;
                             };

+                            let new_entry =
+                                new_editor_colors.entry(buffer_id).or_insert_with(|| {
+                                    (Vec::<(Range<Anchor>, DocumentColor)>::new(), None)
+                                });
+                            new_entry.1 = colors.cache_version;
+                            let new_buffer_colors = &mut new_entry.0;
                             let (Ok(i) | Err(i)) =
-                                new_editor_colors.binary_search_by(|(probe, _)| {
+                                new_buffer_colors.binary_search_by(|(probe, _)| {
                                     probe
                                         .start
                                         .cmp(&color_start_anchor, &multi_buffer_snapshot)
@@ -238,7 +279,7 @@ impl Editor {
                                         .cmp(&color_end_anchor, &multi_buffer_snapshot)
                                 })
                             });
-                            new_editor_colors
+                            new_buffer_colors
                                 .insert(i, (color_start_anchor..color_end_anchor, color));
                             break;
                         }
@@ -251,45 +292,70 @@ impl Editor {
             editor
                 .update(cx, |editor, cx| {
                     let mut colors_splice = InlaySplice::default();
-                    let mut new_color_inlays = Vec::with_capacity(new_editor_colors.len());
                     let Some(colors) = &mut editor.colors else {
                         return;
                     };
-                    let mut existing_colors = colors.colors.iter().peekable();
-                    for (new_range, new_color) in new_editor_colors {
-                        let rgba_color = Rgba {
-                            r: new_color.color.red,
-                            g: new_color.color.green,
-                            b: new_color.color.blue,
-                            a: new_color.color.alpha,
-                        };
+                    let mut updated = false;
+                    for (buffer_id, (new_buffer_colors, new_cache_version)) in new_editor_colors {
+                        let mut new_buffer_color_inlays =
+                            Vec::with_capacity(new_buffer_colors.len());
+                        let mut existing_buffer_colors = colors
+                            .buffer_colors
+                            .entry(buffer_id)
+                            .or_default()
+                            .colors
+                            .iter()
+                            .peekable();
+                        for (new_range, new_color) in new_buffer_colors {
+                            let rgba_color = Rgba {
+                                r: new_color.color.red,
+                                g: new_color.color.green,
+                                b: new_color.color.blue,
+                                a: new_color.color.alpha,
+                            };
                             loop {
-                                match existing_colors.peek() {
+                                match existing_buffer_colors.peek() {
                                     Some((existing_range, existing_color, existing_inlay_id)) => {
                                         match existing_range
                                             .start
                                             .cmp(&new_range.start, &multi_buffer_snapshot)
                                             .then_with(|| {
                                                 existing_range
                                                     .end
                                                     .cmp(&new_range.end, &multi_buffer_snapshot)
                                             }) {
cmp::Ordering::Less => { cmp::Ordering::Less => {
colors_splice.to_remove.push(*existing_inlay_id);
existing_colors.next();
continue;
}
cmp::Ordering::Equal => {
if existing_color == &new_color {
new_color_inlays.push((
new_range,
new_color,
*existing_inlay_id,
));
} else {
colors_splice.to_remove.push(*existing_inlay_id); colors_splice.to_remove.push(*existing_inlay_id);
existing_buffer_colors.next();
continue;
}
cmp::Ordering::Equal => {
if existing_color == &new_color {
new_buffer_color_inlays.push((
new_range,
new_color,
*existing_inlay_id,
));
} else {
colors_splice
.to_remove
.push(*existing_inlay_id);
let inlay = Inlay::color(
post_inc(&mut editor.next_color_inlay_id),
new_range.start,
rgba_color,
);
let inlay_id = inlay.id;
colors_splice.to_insert.push(inlay);
new_buffer_color_inlays
.push((new_range, new_color, inlay_id));
}
existing_buffer_colors.next();
break;
}
cmp::Ordering::Greater => {
let inlay = Inlay::color( let inlay = Inlay::color(
post_inc(&mut editor.next_color_inlay_id), post_inc(&mut editor.next_color_inlay_id),
new_range.start, new_range.start,
@ -297,46 +363,40 @@ impl Editor {
); );
let inlay_id = inlay.id; let inlay_id = inlay.id;
colors_splice.to_insert.push(inlay); colors_splice.to_insert.push(inlay);
new_color_inlays new_buffer_color_inlays
.push((new_range, new_color, inlay_id)); .push((new_range, new_color, inlay_id));
break;
} }
existing_colors.next();
break;
}
cmp::Ordering::Greater => {
let inlay = Inlay::color(
post_inc(&mut editor.next_color_inlay_id),
new_range.start,
rgba_color,
);
let inlay_id = inlay.id;
colors_splice.to_insert.push(inlay);
new_color_inlays.push((new_range, new_color, inlay_id));
break;
} }
} }
} None => {
None => { let inlay = Inlay::color(
let inlay = Inlay::color( post_inc(&mut editor.next_color_inlay_id),
post_inc(&mut editor.next_color_inlay_id), new_range.start,
new_range.start, rgba_color,
rgba_color, );
); let inlay_id = inlay.id;
let inlay_id = inlay.id; colors_splice.to_insert.push(inlay);
colors_splice.to_insert.push(inlay); new_buffer_color_inlays
new_color_inlays.push((new_range, new_color, inlay_id)); .push((new_range, new_color, inlay_id));
break; break;
}
} }
} }
} }
}
-                    if existing_colors.peek().is_some() {
+                        if existing_buffer_colors.peek().is_some() {
                             colors_splice
                                 .to_remove
-                                .extend(existing_colors.map(|(_, _, id)| *id));
+                                .extend(existing_buffer_colors.map(|(_, _, id)| *id));
+                        }
+                        updated |= colors.set_colors(
+                            buffer_id,
+                            new_buffer_color_inlays,
+                            new_cache_version,
+                        );
                     }
-                    let mut updated = colors.set_colors(new_color_inlays);
                     if colors.render_mode == DocumentColorsRenderMode::Inlay
                         && (!colors_splice.to_insert.is_empty()
                             || !colors_splice.to_remove.is_empty())

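The merge loop above walks the existing, sorted color inlays and the freshly fetched set in lockstep, keeping inlay ids for unchanged entries and collecting removals and insertions for the rest. A self-contained sketch of the same two-pointer technique, with simplified types standing in for the editor's `Anchor` ranges and `InlaySplice`:

```rust
// Two-pointer merge of a sorted existing set (position, color, inlay id)
// against a sorted new set (position, color). Unchanged entries keep their
// id; everything else lands in to_remove / to_insert.
fn splice(
    existing: &[(u32, &'static str, usize)],
    new: &[(u32, &'static str)],
) -> (Vec<usize>, Vec<(u32, &'static str)>) {
    let mut to_remove = Vec::new();
    let mut to_insert = Vec::new();
    let mut old = existing.iter().peekable();
    for &(pos, color) in new {
        loop {
            match old.peek() {
                Some(&&(old_pos, _, id)) if old_pos < pos => {
                    to_remove.push(id); // stale entry before the new one
                    old.next();
                }
                Some(&&(old_pos, old_color, id)) if old_pos == pos => {
                    if old_color != color {
                        to_remove.push(id);
                        to_insert.push((pos, color)); // same range, new color
                    } // else: unchanged, keep the existing inlay id
                    old.next();
                    break;
                }
                _ => {
                    to_insert.push((pos, color)); // brand new entry
                    break;
                }
            }
        }
    }
    to_remove.extend(old.map(|&(_, _, id)| id)); // leftovers are stale
    (to_remove, to_insert)
}

fn main() {
    let existing = [(0_u32, "red", 1_usize), (5, "blue", 2)];
    let new = [(5_u32, "blue"), (9, "green")];
    let (to_remove, to_insert) = splice(&existing, &new);
    assert_eq!(to_remove, vec![1]);
    assert_eq!(to_insert, vec![(9, "green")]);
}
```

Because both sides stay sorted, the merge is linear in the total number of colors, which is why the editor code keeps `new_buffer_colors` sorted with a binary-search insert earlier.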
View file

@@ -487,8 +487,9 @@ impl Editor {
         if opened_first_time {
             cx.spawn_in(window, async move |editor, cx| {
                 editor
-                    .update(cx, |editor, cx| {
-                        editor.refresh_inlay_hints(InlayHintRefreshReason::NewLinesShown, cx)
+                    .update_in(cx, |editor, window, cx| {
+                        editor.refresh_inlay_hints(InlayHintRefreshReason::NewLinesShown, cx);
+                        editor.refresh_colors(false, None, window, cx);
                     })
                     .ok()
             })
@@ -599,6 +600,7 @@ impl Editor {
         );
         self.refresh_inlay_hints(InlayHintRefreshReason::NewLinesShown, cx);
+        self.refresh_colors(false, None, window, cx);
     }

     pub fn scroll_position(&self, cx: &mut Context<Self>) -> gpui::Point<f32> {
pub fn scroll_position(&self, cx: &mut Context<Self>) -> gpui::Point<f32> { pub fn scroll_position(&self, cx: &mut Context<Self>) -> gpui::Point<f32> {

View file

@@ -1,5 +1,6 @@
 use crate::{
-    ExtensionLibraryKind, ExtensionManifest, GrammarManifestEntry, parse_wasm_extension_version,
+    ExtensionLibraryKind, ExtensionManifest, GrammarManifestEntry, build_debug_adapter_schema_path,
+    parse_wasm_extension_version,
 };
 use anyhow::{Context as _, Result, bail};
 use async_compression::futures::bufread::GzipDecoder;
@@ -99,12 +100,8 @@ impl ExtensionBuilder {
         }

         for (debug_adapter_name, meta) in &mut extension_manifest.debug_adapters {
-            let debug_adapter_relative_schema_path =
-                meta.schema_path.clone().unwrap_or_else(|| {
-                    Path::new("debug_adapter_schemas")
-                        .join(Path::new(debug_adapter_name.as_ref()).with_extension("json"))
-                });
-            let debug_adapter_schema_path = extension_dir.join(debug_adapter_relative_schema_path);
+            let debug_adapter_schema_path =
+                extension_dir.join(build_debug_adapter_schema_path(debug_adapter_name, meta));
             let debug_adapter_schema = fs::read_to_string(&debug_adapter_schema_path)
                 .with_context(|| {

View file

@@ -132,6 +132,16 @@ impl ExtensionManifest {
     }
 }

+pub fn build_debug_adapter_schema_path(
+    adapter_name: &Arc<str>,
+    meta: &DebugAdapterManifestEntry,
+) -> PathBuf {
+    meta.schema_path.clone().unwrap_or_else(|| {
+        Path::new("debug_adapter_schemas")
+            .join(Path::new(adapter_name.as_ref()).with_extension("json"))
+    })
+}
+
 /// A capability for an extension.
 #[derive(Debug, PartialEq, Eq, Clone, Serialize, Deserialize)]
 #[serde(tag = "kind")]
@@ -320,6 +330,29 @@ mod tests {
         }
     }

+    #[test]
+    fn test_build_adapter_schema_path_with_schema_path() {
+        let adapter_name = Arc::from("my_adapter");
+        let entry = DebugAdapterManifestEntry {
+            schema_path: Some(PathBuf::from("foo/bar")),
+        };
+        let path = build_debug_adapter_schema_path(&adapter_name, &entry);
+        assert_eq!(path, PathBuf::from("foo/bar"));
+    }
+
+    #[test]
+    fn test_build_adapter_schema_path_without_schema_path() {
+        let adapter_name = Arc::from("my_adapter");
+        let entry = DebugAdapterManifestEntry { schema_path: None };
+        let path = build_debug_adapter_schema_path(&adapter_name, &entry);
+        assert_eq!(
+            path,
+            PathBuf::from("debug_adapter_schemas").join("my_adapter.json")
+        );
+    }
+
     #[test]
     fn test_allow_exact_match() {
         let manifest = ExtensionManifest {

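The helper added above resolves where an extension's debug adapter schema lives: an explicit `schema_path` from the manifest wins, otherwise it falls back to `debug_adapter_schemas/<adapter_name>.json`. A std-only sketch of the same fallback (the `schema_path` function name and `Option<PathBuf>` parameter are illustrative, not the crate's exact signature):

```rust
use std::path::{Path, PathBuf};

// Explicit manifest path wins; otherwise derive the conventional location
// debug_adapter_schemas/<adapter_name>.json from the adapter's name.
fn schema_path(adapter_name: &str, explicit: Option<PathBuf>) -> PathBuf {
    explicit.unwrap_or_else(|| {
        Path::new("debug_adapter_schemas")
            .join(Path::new(adapter_name).with_extension("json"))
    })
}

fn main() {
    assert_eq!(
        schema_path("my_adapter", None),
        Path::new("debug_adapter_schemas").join("my_adapter.json")
    );
    assert_eq!(
        schema_path("my_adapter", Some(PathBuf::from("foo/bar"))),
        PathBuf::from("foo/bar")
    );
}
```

Centralizing this in one function is what lets the builder, the extension store, and the headless (remote) store in the hunks below agree on the same on-disk location.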
View file

@@ -1633,6 +1633,23 @@ impl ExtensionStore {
         }
     }

+    for (adapter_name, meta) in loaded_extension.manifest.debug_adapters.iter() {
+        let schema_path = &extension::build_debug_adapter_schema_path(adapter_name, meta);
+        if fs.is_file(&src_dir.join(schema_path)).await {
+            match schema_path.parent() {
+                Some(parent) => fs.create_dir(&tmp_dir.join(parent)).await?,
+                None => {}
+            }
+            fs.copy_file(
+                &src_dir.join(schema_path),
+                &tmp_dir.join(schema_path),
+                fs::CopyOptions::default(),
+            )
+            .await?
+        }
+    }
+
     Ok(())
 })

View file

@@ -4,8 +4,8 @@ use anyhow::{Context as _, Result};
 use client::{TypedEnvelope, proto};
 use collections::{HashMap, HashSet};
 use extension::{
-    Extension, ExtensionHostProxy, ExtensionLanguageProxy, ExtensionLanguageServerProxy,
-    ExtensionManifest,
+    Extension, ExtensionDebugAdapterProviderProxy, ExtensionHostProxy, ExtensionLanguageProxy,
+    ExtensionLanguageServerProxy, ExtensionManifest,
 };
 use fs::{Fs, RemoveOptions, RenameOptions};
 use gpui::{App, AppContext as _, AsyncApp, Context, Entity, Task, WeakEntity};
@@ -169,8 +169,9 @@ impl HeadlessExtensionStore {
             return Ok(());
         }

-        let wasm_extension: Arc<dyn Extension> =
-            Arc::new(WasmExtension::load(extension_dir, &manifest, wasm_host.clone(), &cx).await?);
+        let wasm_extension: Arc<dyn Extension> = Arc::new(
+            WasmExtension::load(extension_dir.clone(), &manifest, wasm_host.clone(), &cx).await?,
+        );

         for (language_server_id, language_server_config) in &manifest.language_servers {
             for language in language_server_config.languages() {
@@ -186,6 +187,24 @@ impl HeadlessExtensionStore {
                 );
             })?;
         }

+        for (debug_adapter, meta) in &manifest.debug_adapters {
+            let schema_path = extension::build_debug_adapter_schema_path(debug_adapter, meta);
+            this.update(cx, |this, _cx| {
+                this.proxy.register_debug_adapter(
+                    wasm_extension.clone(),
+                    debug_adapter.clone(),
+                    &extension_dir.join(schema_path),
+                );
+            })?;
+        }
+
+        for debug_adapter in manifest.debug_locators.keys() {
+            this.update(cx, |this, _cx| {
+                this.proxy
+                    .register_debug_locator(wasm_extension.clone(), debug_adapter.clone());
+            })?;
+        }
     }

     Ok(())

View file

@@ -10,8 +10,8 @@
 use crate::{
     AnyElement, App, AvailableSpace, Bounds, ContentMask, DispatchPhase, Edges, Element, EntityId,
     FocusHandle, GlobalElementId, Hitbox, HitboxBehavior, InspectorElementId, IntoElement,
-    Overflow, Pixels, Point, ScrollWheelEvent, Size, Style, StyleRefinement, Styled, Window, point,
-    px, size,
+    Overflow, Pixels, Point, ScrollDelta, ScrollWheelEvent, Size, Style, StyleRefinement, Styled,
+    Window, point, px, size,
 };
 use collections::VecDeque;
 use refineable::Refineable as _;
@@ -962,12 +962,15 @@ impl Element for List {
         let height = bounds.size.height;
         let scroll_top = prepaint.layout.scroll_top;
         let hitbox_id = prepaint.hitbox.id;
+        let mut accumulated_scroll_delta = ScrollDelta::default();
         window.on_mouse_event(move |event: &ScrollWheelEvent, phase, window, cx| {
             if phase == DispatchPhase::Bubble && hitbox_id.should_handle_scroll(window) {
+                accumulated_scroll_delta = accumulated_scroll_delta.coalesce(event.delta);
+                let pixel_delta = accumulated_scroll_delta.pixel_delta(px(20.));
                 list_state.0.borrow_mut().scroll(
                     &scroll_top,
                     height,
-                    event.delta.pixel_delta(px(20.)),
+                    pixel_delta,
                     current_view,
                     window,
                     cx,

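The hunk above fixes slow list scrolling by coalescing wheel deltas instead of applying only the latest event's delta relative to the paint-time offset, which silently dropped events that arrived between frames. A std-only sketch of the idea (illustrative names, not gpui's `ScrollDelta` API):

```rust
// Wheel deltas arriving between frames are folded into one running total
// measured from the offset captured at paint time, so every event
// contributes to the final scroll position.
fn apply_coalesced(scroll_top_at_paint: f32, event_deltas: &[f32]) -> f32 {
    let mut accumulated_delta = 0.0;
    let mut scroll_top = scroll_top_at_paint;
    for delta in event_deltas {
        accumulated_delta += delta;
        // Mirrors accumulated_scroll_delta.coalesce(event.delta) above:
        // each handler call scrolls relative to the paint-time position.
        scroll_top = scroll_top_at_paint + accumulated_delta;
    }
    scroll_top
}

fn main() {
    // Three wheel events arrive before the next frame; none are dropped.
    assert_eq!(apply_coalesced(100.0, &[-8.0, -8.0, -8.0]), 76.0);
}
```

This matches the editor element's behavior referenced in the commit message, which is why the agent panel now scrolls the same distance as the editor for the same input.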
View file

@@ -767,8 +767,8 @@ pub struct EsLintLspAdapter {
 }

 impl EsLintLspAdapter {
-    const CURRENT_VERSION: &'static str = "3.0.10";
-    const CURRENT_VERSION_TAG_NAME: &'static str = "release/3.0.10";
+    const CURRENT_VERSION: &'static str = "2.4.4";
+    const CURRENT_VERSION_TAG_NAME: &'static str = "release/2.4.4";

     #[cfg(not(windows))]
     const GITHUB_ASSET_KIND: AssetKind = AssetKind::TarGz;
@@ -846,7 +846,9 @@ impl LspAdapter for EsLintLspAdapter {
                     "enable": true
                 }
             },
-            "useFlatConfig": use_flat_config,
+            "experimental": {
+                "useFlatConfig": use_flat_config,
+            },
         });
let override_options = cx.update(|cx| { let override_options = cx.update(|cx| {

View file

@@ -107,9 +107,7 @@ pub trait LspCommand: 'static + Sized + Send + std::fmt::Debug {
     }

     /// When false, `to_lsp_params_or_response` default implementation will return the default response.
-    fn check_capabilities(&self, _: AdapterServerCapabilities) -> bool {
-        true
-    }
+    fn check_capabilities(&self, _: AdapterServerCapabilities) -> bool;

     fn to_lsp(
         &self,
@@ -277,6 +275,16 @@ impl LspCommand for PrepareRename {
         "Prepare rename"
     }

+    fn check_capabilities(&self, capabilities: AdapterServerCapabilities) -> bool {
+        capabilities
+            .server_capabilities
+            .rename_provider
+            .is_some_and(|capability| match capability {
+                OneOf::Left(enabled) => enabled,
+                OneOf::Right(options) => options.prepare_provider.unwrap_or(false),
+            })
+    }
+
     fn to_lsp_params_or_response(
         &self,
         path: &Path,
@@ -459,6 +467,16 @@ impl LspCommand for PerformRename {
         "Rename"
     }

+    fn check_capabilities(&self, capabilities: AdapterServerCapabilities) -> bool {
+        capabilities
+            .server_capabilities
+            .rename_provider
+            .is_some_and(|capability| match capability {
+                OneOf::Left(enabled) => enabled,
+                OneOf::Right(_options) => true,
+            })
+    }
+
     fn to_lsp(
         &self,
         path: &Path,
@@ -583,7 +601,10 @@ impl LspCommand for GetDefinition {
         capabilities
             .server_capabilities
             .definition_provider
-            .is_some()
+            .is_some_and(|capability| match capability {
+                OneOf::Left(supported) => supported,
+                OneOf::Right(_options) => true,
+            })
     }

     fn to_lsp(
@@ -682,7 +703,11 @@ impl LspCommand for GetDeclaration {
         capabilities
             .server_capabilities
             .declaration_provider
-            .is_some()
+            .is_some_and(|capability| match capability {
+                lsp::DeclarationCapability::Simple(supported) => supported,
+                lsp::DeclarationCapability::RegistrationOptions(..) => true,
+                lsp::DeclarationCapability::Options(..) => true,
+            })
     }

     fn to_lsp(
@@ -777,6 +802,16 @@ impl LspCommand for GetImplementation {
        "Get implementation"
    }

    fn check_capabilities(&self, capabilities: AdapterServerCapabilities) -> bool {
        capabilities
            .server_capabilities
            .implementation_provider
            .is_some_and(|capability| match capability {
                lsp::ImplementationProviderCapability::Simple(enabled) => enabled,
                lsp::ImplementationProviderCapability::Options(_options) => true,
            })
    }

    fn to_lsp(
        &self,
        path: &Path,
@@ -1437,7 +1472,10 @@ impl LspCommand for GetDocumentHighlights {
        capabilities
            .server_capabilities
            .document_highlight_provider
            .is_some_and(|capability| match capability {
                OneOf::Left(supported) => supported,
                OneOf::Right(_options) => true,
            })
    }

    fn to_lsp(
@@ -1590,7 +1628,10 @@ impl LspCommand for GetDocumentSymbols {
        capabilities
            .server_capabilities
            .document_symbol_provider
            .is_some_and(|capability| match capability {
                OneOf::Left(supported) => supported,
                OneOf::Right(_options) => true,
            })
    }

    fn to_lsp(
@@ -2116,6 +2157,13 @@ impl LspCommand for GetCompletions {
        "Get completion"
    }

    fn check_capabilities(&self, capabilities: AdapterServerCapabilities) -> bool {
        capabilities
            .server_capabilities
            .completion_provider
            .is_some()
    }

    fn to_lsp(
        &self,
        path: &Path,
@@ -4161,7 +4209,11 @@ impl LspCommand for GetDocumentColor {
        server_capabilities
            .server_capabilities
            .color_provider
            .is_some_and(|capability| match capability {
                lsp::ColorProviderCapability::Simple(supported) => supported,
                lsp::ColorProviderCapability::ColorProvider(..) => true,
                lsp::ColorProviderCapability::Options(..) => true,
            })
    }

    fn to_lsp(
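The capability checks above all follow one pattern: a provider field is `None` (unsupported), a plain boolean, or an options struct. A minimal sketch of that pattern, with the lsp-types `OneOf` shape and a `RenameOptions` stand-in modeled locally rather than imported:

```rust
// Hypothetical local model of lsp-types' `OneOf<bool, Options>` shape.
enum OneOf<A, B> {
    Left(A),
    Right(B),
}

struct RenameOptions {
    prepare_provider: Option<bool>,
}

// Mirrors PrepareRename::check_capabilities: renaming must be enabled, and
// prepare-rename additionally requires `prepare_provider` to be set.
fn supports_prepare_rename(provider: Option<OneOf<bool, RenameOptions>>) -> bool {
    provider.is_some_and(|capability| match capability {
        OneOf::Left(enabled) => enabled,
        OneOf::Right(options) => options.prepare_provider.unwrap_or(false),
    })
}

fn main() {
    assert!(!supports_prepare_rename(None));
    assert!(supports_prepare_rename(Some(OneOf::Left(true))));
    assert!(!supports_prepare_rename(Some(OneOf::Right(RenameOptions {
        prepare_provider: None,
    }))));
    assert!(supports_prepare_rename(Some(OneOf::Right(RenameOptions {
        prepare_provider: Some(true),
    }))));
}
```

Requests that no server advertises support for are then skipped instead of failing at the server.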


@@ -171,6 +171,7 @@ pub struct LocalLspStore {
    _subscription: gpui::Subscription,
    lsp_tree: Entity<LanguageServerTree>,
    registered_buffers: HashMap<BufferId, usize>,
    buffers_opened_in_servers: HashMap<BufferId, HashSet<LanguageServerId>>,
    buffer_pull_diagnostics_result_ids: HashMap<LanguageServerId, HashMap<PathBuf, Option<String>>>,
}
@@ -2498,6 +2499,11 @@ impl LocalLspStore {
            vec![snapshot]
        });

        self.buffers_opened_in_servers
            .entry(buffer_id)
            .or_default()
            .insert(server.server_id());
    }
}
@@ -3151,6 +3157,9 @@ impl LocalLspStore {
        self.language_servers.remove(server_id_to_remove);
        self.buffer_pull_diagnostics_result_ids
            .remove(server_id_to_remove);
        for buffer_servers in self.buffers_opened_in_servers.values_mut() {
            buffer_servers.remove(server_id_to_remove);
        }
        cx.emit(LspStoreEvent::LanguageServerRemoved(*server_id_to_remove));
    }
    servers_to_remove.into_keys().collect()
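The cleanup above purges a removed server id from every buffer's opened-servers set, so later queries cannot route to a dead server. A minimal sketch using plain integer ids in place of the `BufferId`/`LanguageServerId` newtypes:

```rust
use std::collections::{HashMap, HashSet};

// Purge one server id from every buffer's set of servers it is open in,
// mirroring the loop added to LocalLspStore's server-removal path.
fn remove_server(opened: &mut HashMap<u32, HashSet<u32>>, server_id: u32) {
    for buffer_servers in opened.values_mut() {
        buffer_servers.remove(&server_id);
    }
}

fn main() {
    let mut opened: HashMap<u32, HashSet<u32>> = HashMap::new();
    opened.entry(1).or_default().extend([10, 11]);
    opened.entry(2).or_default().insert(10);

    remove_server(&mut opened, 10);

    assert_eq!(opened[&1], HashSet::from([11]));
    assert!(opened[&2].is_empty());
}
```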
@@ -3485,22 +3494,29 @@ pub struct LspStore {
    _maintain_buffer_languages: Task<()>,
    diagnostic_summaries:
        HashMap<WorktreeId, HashMap<Arc<Path>, HashMap<LanguageServerId, DiagnosticSummary>>>,
    lsp_data: HashMap<BufferId, DocumentColorData>,
}

#[derive(Debug, Default, Clone)]
pub struct DocumentColors {
    pub colors: HashSet<DocumentColor>,
    pub cache_version: Option<usize>,
}

type DocumentColorTask = Shared<Task<std::result::Result<DocumentColors, Arc<anyhow::Error>>>>;

#[derive(Debug, Default)]
struct DocumentColorData {
    colors_for_version: Global,
    colors: HashMap<LanguageServerId, HashSet<DocumentColor>>,
    cache_version: usize,
    colors_update: Option<(Global, DocumentColorTask)>,
}

#[derive(Debug, PartialEq, Eq, Clone, Copy)]
pub enum ColorFetchStrategy {
    IgnoreCache,
    UseCache { known_cache_version: Option<usize> },
}
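The `cache_version` counter lets callers skip work when nothing changed: a caller passes back the last version it saw, and the store answers `None` if the cache has not advanced. A simplified sketch of that handshake, with stand-in types for `DocumentColorData`/`DocumentColors`:

```rust
use std::collections::HashSet;

// Simplified per-buffer color cache with a monotonically increasing version.
#[derive(Default)]
struct ColorCache {
    colors: HashSet<&'static str>,
    cache_version: usize,
}

// Returns None when the caller already holds this exact cache version,
// otherwise the cached colors plus the version to remember.
fn query(
    cache: &ColorCache,
    known_cache_version: Option<usize>,
) -> Option<(HashSet<&'static str>, usize)> {
    if Some(cache.cache_version) == known_cache_version {
        None
    } else {
        Some((cache.colors.clone(), cache.cache_version))
    }
}

fn main() {
    let mut cache = ColorCache::default();
    cache.colors.insert("#ff0000");
    cache.cache_version = 1;

    // First query: caller has no version yet, so it gets data and version 1.
    let (colors, version) = query(&cache, None).unwrap();
    assert_eq!(version, 1);
    assert!(colors.contains("#ff0000"));

    // Re-query with that version: cache hit, nothing to send.
    assert!(query(&cache, Some(1)).is_none());
}
```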
pub enum LspStoreEvent {

@@ -3718,6 +3734,7 @@ impl LspStore {
            }),
            lsp_tree: LanguageServerTree::new(manifest_tree, languages.clone(), cx),
            registered_buffers: HashMap::default(),
            buffers_opened_in_servers: HashMap::default(),
            buffer_pull_diagnostics_result_ids: HashMap::default(),
        }),
        last_formatting_failure: None,
@@ -3729,7 +3746,7 @@ impl LspStore {
        language_server_statuses: Default::default(),
        nonce: StdRng::from_entropy().r#gen(),
        diagnostic_summaries: HashMap::default(),
        lsp_data: HashMap::default(),
        active_entry: None,
        _maintain_workspace_config,
        _maintain_buffer_languages: Self::maintain_buffer_languages(languages, cx),
@@ -3785,7 +3802,7 @@ impl LspStore {
        language_server_statuses: Default::default(),
        nonce: StdRng::from_entropy().r#gen(),
        diagnostic_summaries: HashMap::default(),
        lsp_data: HashMap::default(),
        active_entry: None,
        toolchain_store,
        _maintain_workspace_config,
@@ -4073,16 +4090,22 @@ impl LspStore {
            local.register_buffer_with_language_servers(buffer, cx);
        }
        if !ignore_refcounts {
            cx.observe_release(&handle, move |lsp_store, buffer, cx| {
                let refcount = {
                    let local = lsp_store.as_local_mut().unwrap();
                    let Some(refcount) = local.registered_buffers.get_mut(&buffer_id) else {
                        debug_panic!("bad refcounting");
                        return;
                    };
                    *refcount -= 1;
                    *refcount
                };
                if refcount == 0 {
                    lsp_store.lsp_data.remove(&buffer_id);
                    let local = lsp_store.as_local_mut().unwrap();
                    local.registered_buffers.remove(&buffer_id);
                    local.buffers_opened_in_servers.remove(&buffer_id);
                    if let Some(file) = File::from_dyn(buffer.read(cx).file()).cloned() {
                        local.unregister_old_buffer_from_language_servers(&buffer, &file, cx);
                    }
@@ -4900,7 +4923,7 @@ impl LspStore {
        .presentations
        .into_iter()
        .map(|presentation| ColorPresentation {
            label: SharedString::from(presentation.label),
            text_edit: presentation.text_edit.and_then(deserialize_lsp_edit),
            additional_text_edits: presentation
                .additional_text_edits
@@ -4943,7 +4966,7 @@ impl LspStore {
        .context("color presentation resolve LSP request")?
        .into_iter()
        .map(|presentation| ColorPresentation {
            label: SharedString::from(presentation.label),
            text_edit: presentation.text_edit,
            additional_text_edits: presentation
                .additional_text_edits
@@ -6095,152 +6118,137 @@ impl LspStore {
    pub fn document_colors(
        &mut self,
        fetch_strategy: ColorFetchStrategy,
        buffer: Entity<Buffer>,
        cx: &mut Context<Self>,
    ) -> Option<DocumentColorTask> {
        let version_queried_for = buffer.read(cx).version();
        let buffer_id = buffer.read(cx).remote_id();

        match fetch_strategy {
            ColorFetchStrategy::IgnoreCache => {}
            ColorFetchStrategy::UseCache {
                known_cache_version,
            } => {
                if let Some(cached_data) = self.lsp_data.get(&buffer_id) {
                    if !version_queried_for.changed_since(&cached_data.colors_for_version) {
                        let has_different_servers = self.as_local().is_some_and(|local| {
                            local
                                .buffers_opened_in_servers
                                .get(&buffer_id)
                                .cloned()
                                .unwrap_or_default()
                                != cached_data.colors.keys().copied().collect()
                        });
                        if !has_different_servers {
                            if Some(cached_data.cache_version) == known_cache_version {
                                return None;
                            } else {
                                return Some(
                                    Task::ready(Ok(DocumentColors {
                                        colors: cached_data
                                            .colors
                                            .values()
                                            .flatten()
                                            .cloned()
                                            .collect(),
                                        cache_version: Some(cached_data.cache_version),
                                    }))
                                    .shared(),
                                );
                            }
                        }
                    }
                }
            }
        }

        let lsp_data = self.lsp_data.entry(buffer_id).or_default();
        if let Some((updating_for, running_update)) = &lsp_data.colors_update {
            if !version_queried_for.changed_since(&updating_for) {
                return Some(running_update.clone());
            }
        }
        let query_version_queried_for = version_queried_for.clone();
        let new_task = cx
            .spawn(async move |lsp_store, cx| {
                cx.background_executor()
                    .timer(Duration::from_millis(30))
                    .await;
                let fetched_colors = lsp_store
                    .update(cx, |lsp_store, cx| {
                        lsp_store.fetch_document_colors_for_buffer(buffer.clone(), cx)
                    })?
                    .await
                    .context("fetching document colors")
                    .map_err(Arc::new);
                let fetched_colors = match fetched_colors {
                    Ok(fetched_colors) => {
                        if fetch_strategy != ColorFetchStrategy::IgnoreCache
                            && Some(true)
                                == buffer
                                    .update(cx, |buffer, _| {
                                        buffer.version() != query_version_queried_for
                                    })
                                    .ok()
                        {
                            return Ok(DocumentColors::default());
                        }
                        fetched_colors
                    }
                    Err(e) => {
                        lsp_store
                            .update(cx, |lsp_store, _| {
                                lsp_store
                                    .lsp_data
                                    .entry(buffer_id)
                                    .or_default()
                                    .colors_update = None;
                            })
                            .ok();
                        return Err(e);
                    }
                };

                lsp_store
                    .update(cx, |lsp_store, _| {
                        let lsp_data = lsp_store.lsp_data.entry(buffer_id).or_default();

                        if lsp_data.colors_for_version == query_version_queried_for {
                            lsp_data.colors.extend(fetched_colors.clone());
                            lsp_data.cache_version += 1;
                        } else if !lsp_data
                            .colors_for_version
                            .changed_since(&query_version_queried_for)
                        {
                            lsp_data.colors_for_version = query_version_queried_for;
                            lsp_data.colors = fetched_colors.clone();
                            lsp_data.cache_version += 1;
                        }
                        lsp_data.colors_update = None;
                        let colors = lsp_data
                            .colors
                            .values()
                            .flatten()
                            .cloned()
                            .collect::<HashSet<_>>();
                        DocumentColors {
                            colors,
                            cache_version: Some(lsp_data.cache_version),
                        }
                    })
                    .map_err(Arc::new)
            })
            .shared();
        lsp_data.colors_update = Some((version_queried_for, new_task.clone()));
        Some(new_task)
    }
    fn fetch_document_colors_for_buffer(
        &mut self,
        buffer: Entity<Buffer>,
        cx: &mut Context<Self>,
    ) -> Task<anyhow::Result<HashMap<LanguageServerId, HashSet<DocumentColor>>>> {
        if let Some((client, project_id)) = self.upstream_client() {
            let request_task = client.request(proto::MultiLspQuery {
                project_id,

@@ -6255,7 +6263,7 @@ impl LspStore {
            });
            cx.spawn(async move |project, cx| {
                let Some(project) = project.upgrade() else {
                    return Ok(HashMap::default());
                };
                let colors = join_all(
                    request_task

@@ -6289,11 +6297,11 @@ impl LspStore {
                .await
                .into_iter()
                .fold(HashMap::default(), |mut acc, (server_id, colors)| {
                    acc.entry(server_id)
                        .or_insert_with(HashSet::default)
                        .extend(colors);
                    acc
                });
                Ok(colors)
            })
        } else {
@@ -6304,7 +6312,9 @@ impl LspStore {
            .await
            .into_iter()
            .fold(HashMap::default(), |mut acc, (server_id, colors)| {
                acc.entry(server_id)
                    .or_insert_with(HashSet::default)
                    .extend(colors);
                acc
            })
            .into_iter()
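Switching the per-server accumulator from `Vec` to `HashSet` means the fold above now deduplicates identical colors reported more than once. A self-contained sketch of that grouping, with integer server ids and string colors standing in for the real types:

```rust
use std::collections::{HashMap, HashSet};

// Group (server_id, colors) results per server, deduplicating repeats —
// the same entry-or-insert-then-extend fold used in the diff.
fn group_colors(results: Vec<(u32, Vec<&'static str>)>) -> HashMap<u32, HashSet<&'static str>> {
    results
        .into_iter()
        .fold(HashMap::default(), |mut acc, (server_id, colors)| {
            acc.entry(server_id)
                .or_insert_with(HashSet::default)
                .extend(colors);
            acc
        })
}

fn main() {
    let grouped = group_colors(vec![
        (1, vec!["red", "blue"]),
        (1, vec!["red"]), // duplicate report from server 1
        (2, vec!["green"]),
    ]);
    assert_eq!(grouped[&1].len(), 2); // "red" counted once
    assert_eq!(grouped[&2].len(), 1);
}
```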
@@ -7419,6 +7429,14 @@ impl LspStore {
                .unwrap_or(true)
            })
            .map(|(_, server)| server.server_id())
            .filter(|server_id| {
                self.as_local().is_none_or(|local| {
                    local
                        .buffers_opened_in_servers
                        .get(&snapshot.remote_id())
                        .is_some_and(|servers| servers.contains(server_id))
                })
            })
            .collect::<Vec<_>>()
        });
@@ -8791,7 +8809,7 @@ impl LspStore {
        .color_presentations
        .into_iter()
        .map(|presentation| proto::ColorPresentation {
            label: presentation.label.to_string(),
            text_edit: presentation.text_edit.map(serialize_lsp_edit),
            additional_text_edits: presentation
                .additional_text_edits
@@ -9735,6 +9753,7 @@ impl LspStore {
    }

    // Tell the language server about every open buffer in the worktree that matches the language.
    let mut buffer_paths_registered = Vec::new();
    self.buffer_store.clone().update(cx, |buffer_store, cx| {
        for buffer_handle in buffer_store.buffers() {
            let buffer = buffer_handle.read(cx);

@@ -9793,6 +9812,12 @@ impl LspStore {
                version,
                initial_snapshot.text(),
            );
            buffer_paths_registered.push(file.abs_path(cx));
            local
                .buffers_opened_in_servers
                .entry(buffer.remote_id())
                .or_default()
                .insert(server_id);
        }
        buffer_handle.update(cx, |buffer, cx| {
            buffer.set_completion_triggers(
@@ -10257,11 +10282,15 @@ impl LspStore {
    }

    fn cleanup_lsp_data(&mut self, for_server: LanguageServerId) {
        for buffer_lsp_data in self.lsp_data.values_mut() {
            buffer_lsp_data.colors.remove(&for_server);
            buffer_lsp_data.cache_version += 1;
        }
        if let Some(local) = self.as_local_mut() {
            local.buffer_pull_diagnostics_result_ids.remove(&for_server);
            for buffer_servers in local.buffers_opened_in_servers.values_mut() {
                buffer_servers.remove(&for_server);
            }
        }
    }


@@ -16,7 +16,7 @@ use language::{
    Buffer, point_to_lsp,
    proto::{deserialize_anchor, serialize_anchor},
};
use lsp::{AdapterServerCapabilities, LanguageServer, LanguageServerId};
use rpc::proto::{self, PeerId};
use serde::{Deserialize, Serialize};
use std::{
@@ -68,6 +68,10 @@ impl LspCommand for ExpandMacro {
        "Expand macro"
    }

    fn check_capabilities(&self, _: AdapterServerCapabilities) -> bool {
        true
    }

    fn to_lsp(
        &self,
        path: &Path,
@@ -196,6 +200,10 @@ impl LspCommand for OpenDocs {
        "Open docs"
    }

    fn check_capabilities(&self, _: AdapterServerCapabilities) -> bool {
        true
    }

    fn to_lsp(
        &self,
        path: &Path,
@@ -326,6 +334,10 @@ impl LspCommand for SwitchSourceHeader {
        "Switch source header"
    }

    fn check_capabilities(&self, _: AdapterServerCapabilities) -> bool {
        true
    }

    fn to_lsp(
        &self,
        path: &Path,
@@ -404,6 +416,10 @@ impl LspCommand for GoToParentModule {
        "Go to parent module"
    }

    fn check_capabilities(&self, _: AdapterServerCapabilities) -> bool {
        true
    }

    fn to_lsp(
        &self,
        path: &Path,
@@ -578,6 +594,10 @@ impl LspCommand for GetLspRunnables {
        "LSP Runnables"
    }

    fn check_capabilities(&self, _: AdapterServerCapabilities) -> bool {
        true
    }

    fn to_lsp(
        &self,
        path: &Path,


@@ -778,13 +778,42 @@ pub struct DocumentColor {
    pub color_presentations: Vec<ColorPresentation>,
}

impl Eq for DocumentColor {}

impl std::hash::Hash for DocumentColor {
    fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
        self.lsp_range.hash(state);
        self.color.red.to_bits().hash(state);
        self.color.green.to_bits().hash(state);
        self.color.blue.to_bits().hash(state);
        self.color.alpha.to_bits().hash(state);
        self.resolved.hash(state);
        self.color_presentations.hash(state);
    }
}

#[derive(Clone, Debug, PartialEq, Eq)]
pub struct ColorPresentation {
    pub label: SharedString,
    pub text_edit: Option<lsp::TextEdit>,
    pub additional_text_edits: Vec<lsp::TextEdit>,
}

impl std::hash::Hash for ColorPresentation {
    fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
        self.label.hash(state);
        if let Some(ref edit) = self.text_edit {
            edit.range.hash(state);
            edit.new_text.hash(state);
        }
        self.additional_text_edits.len().hash(state);
        for edit in &self.additional_text_edits {
            edit.range.hash(state);
            edit.new_text.hash(state);
        }
    }
}

#[derive(Clone)]
pub enum DirectoryLister {
    Project(Entity<Project>),


@@ -632,7 +632,7 @@ impl From<Timestamp> for SystemTime {

impl From<SystemTime> for Timestamp {
    fn from(time: SystemTime) -> Self {
        let duration = time.duration_since(UNIX_EPOCH).unwrap_or_default();
        Self {
            seconds: duration.as_secs(),
            nanos: duration.subsec_nanos(),
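`SystemTime::duration_since(UNIX_EPOCH)` returns `Err` when the clock reads before the epoch (e.g. a badly set system clock), and the old `unwrap()` would panic there; `unwrap_or_default()` clamps that case to a zero duration instead. A small sketch:

```rust
use std::time::{Duration, SystemTime, UNIX_EPOCH};

// Pre-epoch timestamps no longer panic: the error case collapses to 0 seconds.
fn seconds_since_epoch(time: SystemTime) -> u64 {
    time.duration_since(UNIX_EPOCH).unwrap_or_default().as_secs()
}

fn main() {
    // One hour before the epoch: duration_since would return Err here.
    let pre_epoch = UNIX_EPOCH - Duration::from_secs(3600);
    assert_eq!(seconds_since_epoch(pre_epoch), 0);

    let post_epoch = UNIX_EPOCH + Duration::from_secs(42);
    assert_eq!(seconds_since_epoch(post_epoch), 42);
}
```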


@@ -422,7 +422,12 @@ async fn test_remote_lsp(cx: &mut TestAppContext, server_cx: &mut TestAppContext
        "Rust",
        FakeLspAdapter {
            name: "rust-analyzer",
            capabilities: lsp::ServerCapabilities {
                completion_provider: Some(lsp::CompletionOptions::default()),
                rename_provider: Some(lsp::OneOf::Left(true)),
                ..lsp::ServerCapabilities::default()
            },
            ..FakeLspAdapter::default()
        },
    )
});

@@ -430,7 +435,11 @@ async fn test_remote_lsp(cx: &mut TestAppContext, server_cx: &mut TestAppContext
let mut fake_lsp = server_cx.update(|cx| {
    headless.read(cx).languages.register_fake_language_server(
        LanguageServerName("rust-analyzer".into()),
        lsp::ServerCapabilities {
            completion_provider: Some(lsp::CompletionOptions::default()),
            rename_provider: Some(lsp::OneOf::Left(true)),
            ..lsp::ServerCapabilities::default()
        },
        None,
    )
});


@@ -2784,7 +2784,7 @@ impl Pane {
        })
        .collect::<Vec<_>>();
    let tab_count = tab_items.len();
    if self.pinned_tab_count > tab_count {
        log::warn!(
            "Pinned tab count ({}) exceeds actual tab count ({}). \
             This should not happen. If possible, add reproduction steps, \

@@ -2792,11 +2792,9 @@ impl Pane {
            self.pinned_tab_count,
            tab_count
        );
        self.pinned_tab_count = tab_count;
    }
    let unpinned_tabs = tab_items.split_off(self.pinned_tab_count);
    let pinned_tabs = tab_items;
    TabBar::new("tab_bar")
        .when(
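The change matters because `Vec::split_off` panics when the index exceeds the vector's length; instead of computing a temporary `safe_pinned_count`, the pane now repairs the stale `pinned_tab_count` in place before splitting. A standalone sketch of the clamp-then-split pattern:

```rust
// Clamp a persisted pinned count to the actual tab count before split_off,
// mirroring the Pane fix; a stale count no longer panics.
fn split_pinned(
    mut tabs: Vec<&'static str>,
    mut pinned_count: usize,
) -> (Vec<&'static str>, Vec<&'static str>) {
    if pinned_count > tabs.len() {
        pinned_count = tabs.len(); // repair the stale state
    }
    let unpinned = tabs.split_off(pinned_count);
    (tabs, unpinned) // (pinned, unpinned)
}

fn main() {
    let (pinned, unpinned) = split_pinned(vec!["a", "b", "c"], 2);
    assert_eq!(pinned, vec!["a", "b"]);
    assert_eq!(unpinned, vec!["c"]);

    // Stale state: 5 pinned recorded but only 3 tabs exist — clamped, no panic.
    let (pinned, unpinned) = split_pinned(vec!["a", "b", "c"], 5);
    assert_eq!(pinned.len(), 3);
    assert!(unpinned.is_empty());
}
```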


@@ -2,7 +2,7 @@
description = "The fast, collaborative code editor."
edition.workspace = true
name = "zed"
version = "0.193.3"
publish.workspace = true
license = "GPL-3.0-or-later"
authors = ["Zed Team <hi@zed.dev>"]


@@ -1 +1 @@
stable