Compare commits


27 commits

Author SHA1 Message Date
Joseph Lyons
49784eb177 v0.87.x stable 2023-05-24 13:21:38 -04:00
Antonio Scandurra
5667d6aa16 zed 0.87.6 2023-05-23 16:25:08 +02:00
Antonio Scandurra
3d6a6ec957 Focus command palette when hitting cmd-shift-p if file finder is open
...and vice versa.
2023-05-23 16:24:37 +02:00
Antonio Scandurra
d7a70d898c Update LiveKit client SDK to 1.0.12 (#2516)
Fixes
https://linear.app/zed-industries/issue/Z-1756/screen-sharing-is-slow-and-sometimes-doesnt-work-at-all

Release Notes:

* Fixed some cases where screen-sharing would have low bitrate or
completely fail to start.
2023-05-23 16:19:47 +02:00
Mikayla Maki
589801f469
zed 0.87.5 2023-05-22 18:50:02 -07:00
Mikayla Maki
445a6b6e60
Fix race condition in diff base initialization (#2513)
fixes
https://linear.app/zed-industries/issue/Z-1657/diff-markers-in-gutter-do-not-show-up-until-after-first-save

Release Notes:

- Fixes a race condition on buffer initialization that would cause git
diffs to not load.
2023-05-22 18:45:15 -07:00
Mikayla Maki
540228d93e
Fixed contrast in project panel and scrollbar (#2512)
Redo of https://github.com/zed-industries/zed/pull/2504

This makes the different git locations individually styleable.

Release Notes:

- Improve git contrast (preview only)
2023-05-22 18:35:31 -07:00
Mikayla Maki
9d60b37fce
Only fire update diff base when the dot repo is scanned (#2510)
This PR fixes a bug in the firing of the UpdatedRepositories event which
caused it to flood collaboration with new messages on every file save.

Release Notes:

* Fixed a bug in repository detection that caused it to fire
over-eagerly (preview only)
2023-05-22 14:19:13 -07:00
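Purely as an illustration of the idea in this commit message (the types and the maybe_emit helper below are hypothetical, not Zed's actual code): emit the repository-updated event only when a changed path lies inside a .git directory, rather than on every save.

use std::path::PathBuf;

// Hypothetical event payload, not Zed's actual `UpdatedRepositories` type.
#[derive(Debug)]
struct UpdatedRepositories(Vec<PathBuf>);

// Emit the event only when a scanned path is inside a `.git` directory,
// instead of on every file save.
fn maybe_emit(changed_paths: &[PathBuf]) -> Option<UpdatedRepositories> {
    let git_paths: Vec<PathBuf> = changed_paths
        .iter()
        .filter(|path| path.components().any(|c| c.as_os_str() == ".git"))
        .cloned()
        .collect();
    if git_paths.is_empty() {
        None // an ordinary save no longer floods collaborators with messages
    } else {
        Some(UpdatedRepositories(git_paths))
    }
}

fn main() {
    assert!(maybe_emit(&[PathBuf::from("src/main.rs")]).is_none());
    assert!(maybe_emit(&[PathBuf::from(".git/index")]).is_some());
}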
Mikayla Maki
7ce4d23f70
Backport #2507 into preview (#2508)
This PR is a re-implementation of
https://github.com/zed-industries/zed/pull/2507, but in the old
global-settings style. Note that this PR is based on the 0.87.x branch,
and should be merged in along with the cherry-pick that includes
https://github.com/zed-industries/zed/pull/2504

Release Notes:

* N/A
2023-05-22 14:11:50 -07:00
Mikayla Maki
05062fcc74
Backport #2507 into preview 2023-05-22 12:31:02 -07:00
Joseph Lyons
ac1913a4dc zed 0.87.4 2023-05-22 14:40:16 -04:00
Joseph Lyons
a40aa0432a Fix compile errors 2023-05-22 14:39:05 -04:00
Kirill Bulatov
3cb0b62adf Do not refocus project search query on ESC press (#2494)
Closes
https://linear.app/zed-industries/issue/Z-1471/escape-should-not-move-focus-to-project-search

Makes ESC more predictable as a shortcut that cancels/rolls back the
state in the project search panel.

Release Notes:

* Fixes ESC causing focus to jump in the project search panel
2023-05-22 14:24:14 -04:00
Mikayla Maki
c1b3d389ba Fix perf problem with scrollbars in large multibuffers (#2505)
Remove scrollbars from multibuffers

Release Notes:

* Removes git scrollbar highlights from multibuffers (preview only)
2023-05-22 14:23:32 -04:00
Mikayla Maki
5a2b819c18 Fix bugs in git implementation (#2495)
fixes
https://linear.app/zed-industries/issue/Z-1593/clean-up-git-integration

- Fixes calculation of git offsets in the scrollbar:

fixes
https://linear.app/zed-industries/issue/Z-1608/fix-scrollbar-diffs-sliding-out-of-sync-with-gutter-diffs-in

fixes
https://linear.app/zed-industries/issue/Z-1629/project-search-panel-has-git-marks-on-the-scrollbar-misaligned

fixes
https://linear.app/zed-industries/issue/Z-1625/soft-wrap-affects-diff-locations-in-scrollbar

- Improves the performance of scrollbar hunks:

fixes
https://linear.app/zed-industries/issue/Z-1640/double-check-performance-of-scrollbar-hunks

- Fixes a long-standing bug with how git gutters interact with soft
wraps:

fixes
https://linear.app/zed-industries/issue/Z-1442/make-hunks-grow-to-the-end-of-softwraps-when-ending-on-a-softwrapped

- Allows work directories to be renamed

fixes
https://linear.app/zed-industries/issue/Z-1577/fix-stale-git-repositories-when-directory-is-renamed

Release Notes:

* Fix the offsets of the git diffs in the scrollbar when there are
folds, wraps, or excerpts (preview only)
* Allow the work directory of a repository to be renamed (preview only)
* Extend git gutter to cover the entirety of a wrapped line
(https://github.com/zed-industries/community/issues/937)
2023-05-22 13:37:02 -04:00
Max Brunsfeld
c6c8ea8660 Remove expensive-to-clone fields from worktree's LocalSnapshot (#2497)
This fixes performance problems that @nathansobo and I have seen in some
cases when a large number of files changed on disk. A lot of time was
being spent in `worktree::LocalSnapshot::clone`. I think this may have
been because of needing to clone the `removed_entry_ids` map. This
structure is only really used when *mutating* the `LocalSnapshot` in the
background scanner, so I moved it off of the snapshots.
2023-05-22 10:27:40 -07:00
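A minimal sketch of the pattern described in this message, under assumed, simplified types (Snapshot, BackgroundScannerState, and removed_entry_ids below are stand-ins, not Zed's actual worktree structs): keep mutation-only bookkeeping beside the snapshot instead of inside it, so handing a snapshot to readers stays cheap.

use std::{collections::HashMap, path::PathBuf, sync::Arc};

// Hypothetical, simplified shape of a frequently-cloned snapshot.
#[derive(Clone)]
struct Snapshot {
    entries: Arc<HashMap<PathBuf, u64>>, // cheap to clone: shared, immutable data
}

// Mutation-only bookkeeping lives beside the snapshot instead of inside it,
// so cloning the snapshot on every FS event no longer copies this map.
struct BackgroundScannerState {
    snapshot: Snapshot,
    removed_entry_ids: HashMap<u64, PathBuf>, // stand-in for the map mentioned above
}

impl BackgroundScannerState {
    fn share(&self) -> Snapshot {
        // Only the cheap, shared part is handed out to readers.
        self.snapshot.clone()
    }
}

fn main() {
    let state = BackgroundScannerState {
        snapshot: Snapshot { entries: Arc::new(HashMap::new()) },
        removed_entry_ids: HashMap::new(),
    };
    let _reader_copy = state.share(); // does not clone `removed_entry_ids`
    let _ = state.removed_entry_ids.len();
}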
Max Brunsfeld
d1e4c8dec5 zed 0.87.3 2023-05-19 13:18:43 -07:00
Max Brunsfeld
815b80e295 Remove unnecessary double lookup in repo for (#2492)
Release Notes:

* Optimize repository queries (preview only)
2023-05-19 13:18:12 -07:00
Max Brunsfeld
605213708a Optimize retrieving repos for entries when rendering the project panel (#2493)
This fixes slowness in rendering the project panel due to retrieving the
repository for a given entry.

Release Notes:

* Fixed a lag that would occur when lots of files changed on disk while
the project panel was open (preview only).
2023-05-19 13:17:55 -07:00
Max Brunsfeld
a46e2e82f3 zed 0.87.2 2023-05-19 10:01:24 -07:00
Max Brunsfeld
5e3c5359c5 Fix performance problems in reporting changed FS paths to language servers (#2491)
Fixes
https://linear.app/zed-industries/issue/Z-1611/main-thread-hangs-while-sending-filesystem-change-events-to-lsp

Release Notes:

* Fixed a lag that would sometimes occur when large numbers of files
changed on disk, due to reporting the changed files to language servers.
2023-05-19 09:58:37 -07:00
Max Brunsfeld
7da90451d9 Avoid unnecessary code action requests when applying leader updates t… (#2489)
We noticed a huge amount of code actions requests being issued by
followers when applying leader updates. It was caused by a call to
`MultiBuffer::remove_excerpts` with an empty list of excerpts to remove.
This PR fixes that by avoiding emitting spurious events when multibuffer
excerpt manipulation methods are called with empty lists.
2023-05-19 09:45:01 -07:00
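In spirit, the fix is the guard sketched below; it mirrors the early return added to MultiBuffer::remove_excerpts in the diff further down, with the surrounding types simplified to stand-ins.

// Simplified stand-in for the multibuffer; only the guard is the point here.
struct MultiBuffer {
    edit_event_count: usize,
}

impl MultiBuffer {
    fn remove_excerpts(&mut self, excerpt_ids: impl IntoIterator<Item = usize>) {
        let ids: Vec<usize> = excerpt_ids.into_iter().collect();
        if ids.is_empty() {
            return; // nothing changed, so don't emit an Edited event
        }
        // ...remove the excerpts...
        self.edit_event_count += 1; // stand-in for cx.emit(Event::Edited)
    }
}

fn main() {
    let mut buffer = MultiBuffer { edit_event_count: 0 };
    buffer.remove_excerpts([]); // no spurious event for followers to react to
    buffer.remove_excerpts([1, 2]);
    assert_eq!(buffer.edit_event_count, 1);
}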
Joseph Lyons
e044da3447 zed 0.87.1 2023-05-17 17:52:23 -04:00
Mikayla Maki
5b733542ad Merge pull request #2483 from zed-industries/add-scrollbar-settings
Add scrollbars setting
2023-05-17 17:50:47 -04:00
Mikayla Maki
10a2a592b3 Merge pull request #2482 from zed-industries/add-hunks-to-scrollbar
Add diff hunks to the scroll bar
2023-05-17 15:42:10 -04:00
Joseph Lyons
c93269fd7a collab 0.12.2 2023-05-17 13:36:43 -04:00
Joseph Lyons
f69b94b0a2 v0.87.x preview 2023-05-17 12:38:43 -04:00
35 changed files with 1570 additions and 1062 deletions

Cargo.lock generated
View file

@@ -1190,7 +1190,7 @@ dependencies = [
 [[package]]
 name = "collab"
-version = "0.12.1"
+version = "0.12.2"
 dependencies = [
 "anyhow",
 "async-tungstenite",
@@ -4721,7 +4721,7 @@ dependencies = [
 "fuzzy",
 "git",
 "git2",
-"glob",
+"globset",
 "gpui",
 "ignore",
 "itertools",
@@ -5777,7 +5777,7 @@ dependencies = [
 "collections",
 "editor",
 "futures 0.3.25",
-"glob",
+"globset",
 "gpui",
 "language",
 "log",
@@ -8544,7 +8544,7 @@ checksum = "09041cd90cf85f7f8b2df60c646f853b7f535ce68f85244eb6731cf89fa498ec"
 [[package]]
 name = "zed"
-version = "0.87.0"
+version = "0.87.6"
 dependencies = [
 "activity_indicator",
 "anyhow",

View file

@@ -77,7 +77,8 @@ async-trait = { version = "0.1" }
 ctor = { version = "0.1" }
 env_logger = { version = "0.9" }
 futures = { version = "0.3" }
-glob = { version = "0.3.1" }
+glob = { version = "0.3" }
+globset = { version = "0.4" }
 lazy_static = { version = "1.4.0" }
 log = { version = "0.4.16", features = ["kv_unstable_serde"] }
 ordered-float = { version = "2.1.1" }

View file

@@ -43,6 +43,24 @@
 // 3. Draw all invisible symbols:
 //  "all"
 "show_whitespaces": "selection",
+// Scrollbar related settings
+"scrollbar": {
+// When to show the scrollbar in the editor.
+// This setting can take four values:
+//
+// 1. Show the scrollbar if there's important information or
+//    follow the system's configured behavior (default):
+//   "auto"
+// 2. Match the system's configured behavior:
+//   "system"
+// 3. Always show the scrollbar:
+//   "always"
+// 4. Never show the scrollbar:
+//   "never"
+"show": "auto",
+// Whether to show git diff indicators in the scrollbar.
+"git_diff": true
+},
 // Whether the screen sharing icon is shown in the os status bar.
 "show_call_status_icon": true,
 // Whether to use language servers to provide code intelligence.

View file

@@ -3,7 +3,7 @@ authors = ["Nathan Sobo <nathan@zed.dev>"]
 default-run = "collab"
 edition = "2021"
 name = "collab"
-version = "0.12.1"
+version = "0.12.2"
 publish = false

 [[bin]]

View file

@@ -2434,7 +2434,7 @@ async fn test_git_diff_base_change(
 buffer_local_a.read_with(cx_a, |buffer, _| {
 assert_eq!(buffer.diff_base(), Some(diff_base.as_ref()));
 git::diff::assert_hunks(
-buffer.snapshot().git_diff_hunks_in_row_range(0..4, false),
+buffer.snapshot().git_diff_hunks_in_row_range(0..4),
 &buffer,
 &diff_base,
 &[(1..2, "", "two\n")],
@@ -2454,7 +2454,7 @@ async fn test_git_diff_base_change(
 buffer_remote_a.read_with(cx_b, |buffer, _| {
 assert_eq!(buffer.diff_base(), Some(diff_base.as_ref()));
 git::diff::assert_hunks(
-buffer.snapshot().git_diff_hunks_in_row_range(0..4, false),
+buffer.snapshot().git_diff_hunks_in_row_range(0..4),
 &buffer,
 &diff_base,
 &[(1..2, "", "two\n")],
@@ -2478,7 +2478,7 @@ async fn test_git_diff_base_change(
 assert_eq!(buffer.diff_base(), Some(new_diff_base.as_ref()));
 git::diff::assert_hunks(
-buffer.snapshot().git_diff_hunks_in_row_range(0..4, false),
+buffer.snapshot().git_diff_hunks_in_row_range(0..4),
 &buffer,
 &diff_base,
 &[(2..3, "", "three\n")],
@@ -2489,7 +2489,7 @@ async fn test_git_diff_base_change(
 buffer_remote_a.read_with(cx_b, |buffer, _| {
 assert_eq!(buffer.diff_base(), Some(new_diff_base.as_ref()));
 git::diff::assert_hunks(
-buffer.snapshot().git_diff_hunks_in_row_range(0..4, false),
+buffer.snapshot().git_diff_hunks_in_row_range(0..4),
 &buffer,
 &diff_base,
 &[(2..3, "", "three\n")],
@@ -2532,7 +2532,7 @@ async fn test_git_diff_base_change(
 buffer_local_b.read_with(cx_a, |buffer, _| {
 assert_eq!(buffer.diff_base(), Some(diff_base.as_ref()));
 git::diff::assert_hunks(
-buffer.snapshot().git_diff_hunks_in_row_range(0..4, false),
+buffer.snapshot().git_diff_hunks_in_row_range(0..4),
 &buffer,
 &diff_base,
 &[(1..2, "", "two\n")],
@@ -2552,7 +2552,7 @@ async fn test_git_diff_base_change(
 buffer_remote_b.read_with(cx_b, |buffer, _| {
 assert_eq!(buffer.diff_base(), Some(diff_base.as_ref()));
 git::diff::assert_hunks(
-buffer.snapshot().git_diff_hunks_in_row_range(0..4, false),
+buffer.snapshot().git_diff_hunks_in_row_range(0..4),
 &buffer,
 &diff_base,
 &[(1..2, "", "two\n")],
@@ -2580,12 +2580,12 @@ async fn test_git_diff_base_change(
 "{:?}",
 buffer
 .snapshot()
-.git_diff_hunks_in_row_range(0..4, false)
+.git_diff_hunks_in_row_range(0..4)
 .collect::<Vec<_>>()
 );
 git::diff::assert_hunks(
-buffer.snapshot().git_diff_hunks_in_row_range(0..4, false),
+buffer.snapshot().git_diff_hunks_in_row_range(0..4),
 &buffer,
 &diff_base,
 &[(2..3, "", "three\n")],
@@ -2596,7 +2596,7 @@ async fn test_git_diff_base_change(
 buffer_remote_b.read_with(cx_b, |buffer, _| {
 assert_eq!(buffer.diff_base(), Some(new_diff_base.as_ref()));
 git::diff::assert_hunks(
-buffer.snapshot().git_diff_hunks_in_row_range(0..4, false),
+buffer.snapshot().git_diff_hunks_in_row_range(0..4),
 &buffer,
 &diff_base,
 &[(2..3, "", "three\n")],
@@ -2685,6 +2685,7 @@ async fn test_git_branch_name(
 });
 let project_remote_c = client_c.build_remote_project(project_id, cx_c).await;
+deterministic.run_until_parked();
 project_remote_c.read_with(cx_c, |project, cx| {
 assert_branch(Some("branch-2"), project, cx)
 });

View file

@@ -328,10 +328,9 @@ async fn configure_disabled_globs(
 cx.global::<Settings>()
 .copilot
 .disabled_globs
-.clone()
 .iter()
 .map(|glob| glob.as_str().to_string())
-.collect::<Vec<_>>()
+.collect()
 });

 if let Some(path_to_disable) = &path_to_disable {

View file

@@ -19,6 +19,7 @@ mod editor_tests;
 #[cfg(any(test, feature = "test-support"))]
 pub mod test;

+use ::git::diff::DiffHunk;
 use aho_corasick::AhoCorasick;
 use anyhow::{anyhow, Result};
 use blink_manager::BlinkManager;
@ -5565,68 +5566,91 @@ impl Editor {
} }
fn go_to_hunk(&mut self, _: &GoToHunk, cx: &mut ViewContext<Self>) { fn go_to_hunk(&mut self, _: &GoToHunk, cx: &mut ViewContext<Self>) {
self.go_to_hunk_impl(Direction::Next, cx)
}
fn go_to_prev_hunk(&mut self, _: &GoToPrevHunk, cx: &mut ViewContext<Self>) {
self.go_to_hunk_impl(Direction::Prev, cx)
}
pub fn go_to_hunk_impl(&mut self, direction: Direction, cx: &mut ViewContext<Self>) {
let snapshot = self let snapshot = self
.display_map .display_map
.update(cx, |display_map, cx| display_map.snapshot(cx)); .update(cx, |display_map, cx| display_map.snapshot(cx));
let selection = self.selections.newest::<Point>(cx); let selection = self.selections.newest::<Point>(cx);
fn seek_in_direction( if !self.seek_in_direction(
this: &mut Editor, &snapshot,
snapshot: &DisplaySnapshot, selection.head(),
initial_point: Point, false,
is_wrapped: bool, snapshot
direction: Direction, .buffer_snapshot
cx: &mut ViewContext<Editor>, .git_diff_hunks_in_range((selection.head().row + 1)..u32::MAX),
) -> bool { cx,
let hunks = if direction == Direction::Next { ) {
let wrapped_point = Point::zero();
self.seek_in_direction(
&snapshot,
wrapped_point,
true,
snapshot snapshot
.buffer_snapshot .buffer_snapshot
.git_diff_hunks_in_range(initial_point.row..u32::MAX, false) .git_diff_hunks_in_range((wrapped_point.row + 1)..u32::MAX),
} else { cx,
snapshot );
.buffer_snapshot
.git_diff_hunks_in_range(0..initial_point.row, true)
};
let display_point = initial_point.to_display_point(snapshot);
let mut hunks = hunks
.map(|hunk| diff_hunk_to_display(hunk, &snapshot))
.skip_while(|hunk| {
if is_wrapped {
false
} else {
hunk.contains_display_row(display_point.row())
}
})
.dedup();
if let Some(hunk) = hunks.next() {
this.change_selections(Some(Autoscroll::fit()), cx, |s| {
let row = hunk.start_display_row();
let point = DisplayPoint::new(row, 0);
s.select_display_ranges([point..point]);
});
true
} else {
false
}
} }
}
if !seek_in_direction(self, &snapshot, selection.head(), false, direction, cx) { fn go_to_prev_hunk(&mut self, _: &GoToPrevHunk, cx: &mut ViewContext<Self>) {
let wrapped_point = match direction { let snapshot = self
Direction::Next => Point::zero(), .display_map
Direction::Prev => snapshot.buffer_snapshot.max_point(), .update(cx, |display_map, cx| display_map.snapshot(cx));
}; let selection = self.selections.newest::<Point>(cx);
seek_in_direction(self, &snapshot, wrapped_point, true, direction, cx);
if !self.seek_in_direction(
&snapshot,
selection.head(),
false,
snapshot
.buffer_snapshot
.git_diff_hunks_in_range_rev(0..selection.head().row),
cx,
) {
let wrapped_point = snapshot.buffer_snapshot.max_point();
self.seek_in_direction(
&snapshot,
wrapped_point,
true,
snapshot
.buffer_snapshot
.git_diff_hunks_in_range_rev(0..wrapped_point.row),
cx,
);
}
}
fn seek_in_direction(
&mut self,
snapshot: &DisplaySnapshot,
initial_point: Point,
is_wrapped: bool,
hunks: impl Iterator<Item = DiffHunk<u32>>,
cx: &mut ViewContext<Editor>,
) -> bool {
let display_point = initial_point.to_display_point(snapshot);
let mut hunks = hunks
.map(|hunk| diff_hunk_to_display(hunk, &snapshot))
.skip_while(|hunk| {
if is_wrapped {
false
} else {
hunk.contains_display_row(display_point.row())
}
})
.dedup();
if let Some(hunk) = hunks.next() {
self.change_selections(Some(Autoscroll::fit()), cx, |s| {
let row = hunk.start_display_row();
let point = DisplayPoint::new(row, 0);
s.select_display_ranges([point..point]);
});
true
} else {
false
} }
} }

View file

@@ -5459,10 +5459,12 @@ async fn test_following(cx: &mut gpui::TestAppContext) {
 });

 let is_still_following = Rc::new(RefCell::new(true));
+let follower_edit_event_count = Rc::new(RefCell::new(0));
 let pending_update = Rc::new(RefCell::new(None));
 follower.update(cx, {
 let update = pending_update.clone();
 let is_still_following = is_still_following.clone();
+let follower_edit_event_count = follower_edit_event_count.clone();
 |_, cx| {
 cx.subscribe(&leader, move |_, leader, event, cx| {
 leader
@@ -5475,6 +5477,9 @@ async fn test_following(cx: &mut gpui::TestAppContext) {
 if Editor::should_unfollow_on_event(event, cx) {
 *is_still_following.borrow_mut() = false;
 }
+if let Event::BufferEdited = event {
+*follower_edit_event_count.borrow_mut() += 1;
+}
 })
 .detach();
 }
@@ -5494,6 +5499,7 @@ async fn test_following(cx: &mut gpui::TestAppContext) {
 assert_eq!(follower.selections.ranges(cx), vec![1..1]);
 });
 assert_eq!(*is_still_following.borrow(), true);
+assert_eq!(*follower_edit_event_count.borrow(), 0);

 // Update the scroll position only
 leader.update(cx, |leader, cx| {
@@ -5510,6 +5516,7 @@ async fn test_following(cx: &mut gpui::TestAppContext) {
 vec2f(1.5, 3.5)
 );
 assert_eq!(*is_still_following.borrow(), true);
+assert_eq!(*follower_edit_event_count.borrow(), 0);

 // Update the selections and scroll position. The follower's scroll position is updated
 // via autoscroll, not via the leader's exact scroll position.

View file

@@ -47,6 +47,7 @@ use std::{
 ops::Range,
 sync::Arc,
 };
+use text::Point;
 use workspace::item::Item;

 enum FoldMarkers {}
@@ -648,7 +649,7 @@ impl EditorElement {
 //TODO: This rendering is entirely a horrible hack
 DiffHunkStatus::Removed => {
-let row = *display_row_range.start();
+let row = display_row_range.start;

 let offset = line_height / 2.;
 let start_y = row as f32 * line_height - offset - scroll_top;
@@ -670,11 +671,11 @@
 }
 };

-let start_row = *display_row_range.start();
-let end_row = *display_row_range.end();
+let start_row = display_row_range.start;
+let end_row = display_row_range.end;

 let start_y = start_row as f32 * line_height - scroll_top;
-let end_y = end_row as f32 * line_height - scroll_top + line_height;
+let end_y = end_row as f32 * line_height - scroll_top;

 let width = diff_style.width_em * line_height;
 let highlight_origin = bounds.origin() + vec2f(-width, start_y);
@@ -1022,15 +1023,16 @@ impl EditorElement {
 let mut first_row_y_offset = 0.0;

 // Impose a minimum height on the scrollbar thumb
+let row_height = height / max_row;
 let min_thumb_height =
 style.min_height_factor * cx.font_cache.line_height(self.style.text.font_size);
-let thumb_height = (row_range.end - row_range.start) * height / max_row;
+let thumb_height = (row_range.end - row_range.start) * row_height;
 if thumb_height < min_thumb_height {
 first_row_y_offset = (min_thumb_height - thumb_height) / 2.0;
 height -= min_thumb_height - thumb_height;
 }

-let y_for_row = |row: f32| -> f32 { top + first_row_y_offset + row * height / max_row };
+let y_for_row = |row: f32| -> f32 { top + first_row_y_offset + row * row_height };
 let thumb_top = y_for_row(row_range.start) - first_row_y_offset;
 let thumb_bottom = y_for_row(row_range.end) + first_row_y_offset;
@@ -1044,6 +1046,56 @@ impl EditorElement {
 background: style.track.background_color,
 ..Default::default()
 });

+if layout.is_singleton && cx.global::<Settings>().scrollbar.git_diff.unwrap_or(true) {
+let diff_style = cx.global::<Settings>().theme.editor.scrollbar.git.clone();
+for hunk in layout
+.position_map
+.snapshot
+.buffer_snapshot
+.git_diff_hunks_in_range(0..(max_row.floor() as u32))
+{
+let start_display = Point::new(hunk.buffer_range.start, 0)
+.to_display_point(&layout.position_map.snapshot.display_snapshot);
+let end_display = Point::new(hunk.buffer_range.end, 0)
+.to_display_point(&layout.position_map.snapshot.display_snapshot);
+let start_y = y_for_row(start_display.row() as f32);
+let mut end_y = if hunk.buffer_range.start == hunk.buffer_range.end {
+y_for_row((end_display.row() + 1) as f32)
+} else {
+y_for_row((end_display.row()) as f32)
+};
+
+if end_y - start_y < 1. {
+end_y = start_y + 1.;
+}
+
+let bounds = RectF::from_points(vec2f(left, start_y), vec2f(right, end_y));
+
+let color = match hunk.status() {
+DiffHunkStatus::Added => diff_style.inserted,
+DiffHunkStatus::Modified => diff_style.modified,
+DiffHunkStatus::Removed => diff_style.deleted,
+};
+
+let border = Border {
+width: 1.,
+color: style.thumb.border.color,
+overlay: false,
+top: false,
+right: true,
+bottom: false,
+left: true,
+};
+
+scene.push_quad(Quad {
+bounds,
+background: Some(color),
+border,
+corner_radius: style.thumb.corner_radius,
+})
+}
+}
+
 scene.push_quad(Quad {
 bounds: thumb_bounds,
 border: style.thumb.border,
@@ -1219,7 +1271,7 @@ impl EditorElement {
 .row;

 buffer_snapshot
-.git_diff_hunks_in_range(buffer_start_row..buffer_end_row, false)
+.git_diff_hunks_in_range(buffer_start_row..buffer_end_row)
 .map(|hunk| diff_hunk_to_display(hunk, snapshot))
 .dedup()
 .collect()
@@ -2013,7 +2065,19 @@ impl Element<Editor> for EditorElement {
 ));
 }

-let show_scrollbars = editor.scroll_manager.scrollbars_visible();
+let scrollbar_settings = cx.global::<Settings>().scrollbar;
+let show_scrollbars = match scrollbar_settings.show.unwrap_or_default() {
+settings::ShowScrollbar::Auto => {
+// Git
+(is_singleton && scrollbar_settings.git_diff.unwrap_or(true) && snapshot.buffer_snapshot.has_git_diffs())
+// Scrollmanager
+|| editor.scroll_manager.scrollbars_visible()
+}
+settings::ShowScrollbar::System => editor.scroll_manager.scrollbars_visible(),
+settings::ShowScrollbar::Always => true,
+settings::ShowScrollbar::Never => false,
+};
+
 let include_root = editor
 .project
 .as_ref()
@@ -2230,6 +2294,7 @@ impl Element<Editor> for EditorElement {
 text_size,
 scrollbar_row_range,
 show_scrollbars,
+is_singleton,
 max_row,
 gutter_margin,
 active_rows,
@@ -2385,6 +2450,7 @@ pub struct LayoutState {
 selections: Vec<(ReplicaId, Vec<SelectionLayout>)>,
 scrollbar_row_range: Range<f32>,
 show_scrollbars: bool,
+is_singleton: bool,
 max_row: u32,
 context_menu: Option<(DisplayPoint, AnyElement<Editor>)>,
 code_actions_indicator: Option<(u32, AnyElement<Editor>)>,

View file

@@ -1,4 +1,4 @@
-use std::ops::RangeInclusive;
+use std::ops::Range;

 use git::diff::{DiffHunk, DiffHunkStatus};
 use language::Point;
@@ -15,7 +15,7 @@ pub enum DisplayDiffHunk {
 },

 Unfolded {
-display_row_range: RangeInclusive<u32>,
+display_row_range: Range<u32>,
 status: DiffHunkStatus,
 },
 }
@@ -26,7 +26,7 @@ impl DisplayDiffHunk {
 &DisplayDiffHunk::Folded { display_row } => display_row,

 DisplayDiffHunk::Unfolded {
 display_row_range, ..
-} => *display_row_range.start(),
+} => display_row_range.start,
 }
 }
@@ -36,7 +36,7 @@ impl DisplayDiffHunk {
 DisplayDiffHunk::Unfolded {
 display_row_range, ..
-} => display_row_range.clone(),
+} => display_row_range.start..=display_row_range.end - 1,
 };

 range.contains(&display_row)
@@ -77,16 +77,12 @@ pub fn diff_hunk_to_display(hunk: DiffHunk<u32>, snapshot: &DisplaySnapshot) ->
 } else {
 let start = hunk_start_point.to_display_point(snapshot).row();

-let hunk_end_row_inclusive = hunk
-.buffer_range
-.end
-.saturating_sub(1)
-.max(hunk.buffer_range.start);
+let hunk_end_row_inclusive = hunk.buffer_range.end.max(hunk.buffer_range.start);
 let hunk_end_point = Point::new(hunk_end_row_inclusive, 0);
 let end = hunk_end_point.to_display_point(snapshot).row();

 DisplayDiffHunk::Unfolded {
-display_row_range: start..=end,
+display_row_range: start..end,
 status: hunk.status(),
 }
 }

View file

@@ -1165,6 +1165,9 @@ impl MultiBuffer {
 ) {
 self.sync(cx);
 let ids = excerpt_ids.into_iter().collect::<Vec<_>>();
+if ids.is_empty() {
+return;
+}

 let mut buffers = self.buffers.borrow_mut();
 let mut snapshot = self.snapshot.borrow_mut();
@ -2817,20 +2820,24 @@ impl MultiBufferSnapshot {
}) })
} }
pub fn git_diff_hunks_in_range<'a>( pub fn has_git_diffs(&self) -> bool {
for excerpt in self.excerpts.iter() {
if !excerpt.buffer.git_diff.is_empty() {
return true;
}
}
false
}
pub fn git_diff_hunks_in_range_rev<'a>(
&'a self, &'a self,
row_range: Range<u32>, row_range: Range<u32>,
reversed: bool,
) -> impl 'a + Iterator<Item = DiffHunk<u32>> { ) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
let mut cursor = self.excerpts.cursor::<Point>(); let mut cursor = self.excerpts.cursor::<Point>();
if reversed { cursor.seek(&Point::new(row_range.end, 0), Bias::Left, &());
cursor.seek(&Point::new(row_range.end, 0), Bias::Left, &()); if cursor.item().is_none() {
if cursor.item().is_none() { cursor.prev(&());
cursor.prev(&());
}
} else {
cursor.seek(&Point::new(row_range.start, 0), Bias::Right, &());
} }
std::iter::from_fn(move || { std::iter::from_fn(move || {
@ -2860,7 +2867,7 @@ impl MultiBufferSnapshot {
let buffer_hunks = excerpt let buffer_hunks = excerpt
.buffer .buffer
.git_diff_hunks_intersecting_range(buffer_start..buffer_end, reversed) .git_diff_hunks_intersecting_range_rev(buffer_start..buffer_end)
.filter_map(move |hunk| { .filter_map(move |hunk| {
let start = multibuffer_start.row let start = multibuffer_start.row
+ hunk + hunk
@ -2880,12 +2887,70 @@ impl MultiBufferSnapshot {
}) })
}); });
if reversed { cursor.prev(&());
cursor.prev(&());
} else { Some(buffer_hunks)
cursor.next(&()); })
.flatten()
}
pub fn git_diff_hunks_in_range<'a>(
&'a self,
row_range: Range<u32>,
) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
let mut cursor = self.excerpts.cursor::<Point>();
cursor.seek(&Point::new(row_range.start, 0), Bias::Right, &());
std::iter::from_fn(move || {
let excerpt = cursor.item()?;
let multibuffer_start = *cursor.start();
let multibuffer_end = multibuffer_start + excerpt.text_summary.lines;
if multibuffer_start.row >= row_range.end {
return None;
} }
let mut buffer_start = excerpt.range.context.start;
let mut buffer_end = excerpt.range.context.end;
let excerpt_start_point = buffer_start.to_point(&excerpt.buffer);
let excerpt_end_point = excerpt_start_point + excerpt.text_summary.lines;
if row_range.start > multibuffer_start.row {
let buffer_start_point =
excerpt_start_point + Point::new(row_range.start - multibuffer_start.row, 0);
buffer_start = excerpt.buffer.anchor_before(buffer_start_point);
}
if row_range.end < multibuffer_end.row {
let buffer_end_point =
excerpt_start_point + Point::new(row_range.end - multibuffer_start.row, 0);
buffer_end = excerpt.buffer.anchor_before(buffer_end_point);
}
let buffer_hunks = excerpt
.buffer
.git_diff_hunks_intersecting_range(buffer_start..buffer_end)
.filter_map(move |hunk| {
let start = multibuffer_start.row
+ hunk
.buffer_range
.start
.saturating_sub(excerpt_start_point.row);
let end = multibuffer_start.row
+ hunk
.buffer_range
.end
.min(excerpt_end_point.row + 1)
.saturating_sub(excerpt_start_point.row);
Some(DiffHunk {
buffer_range: start..end,
diff_base_byte_range: hunk.diff_base_byte_range.clone(),
})
});
cursor.next(&());
Some(buffer_hunks) Some(buffer_hunks)
}) })
.flatten() .flatten()
@@ -4080,19 +4145,25 @@ mod tests {
 let leader_multibuffer = cx.add_model(|_| MultiBuffer::new(0));
 let follower_multibuffer = cx.add_model(|_| MultiBuffer::new(0));
+let follower_edit_event_count = Rc::new(RefCell::new(0));

 follower_multibuffer.update(cx, |_, cx| {
-cx.subscribe(&leader_multibuffer, |follower, _, event, cx| {
-match event.clone() {
+let follower_edit_event_count = follower_edit_event_count.clone();
+cx.subscribe(
+&leader_multibuffer,
+move |follower, _, event, cx| match event.clone() {
 Event::ExcerptsAdded {
 buffer,
 predecessor,
 excerpts,
 } => follower.insert_excerpts_with_ids_after(predecessor, buffer, excerpts, cx),
 Event::ExcerptsRemoved { ids } => follower.remove_excerpts(ids, cx),
+Event::Edited => {
+*follower_edit_event_count.borrow_mut() += 1;
+}
 _ => {}
-}
-})
+},
+)
 .detach();
 });
@@ -4131,6 +4202,7 @@ mod tests {
 leader_multibuffer.read(cx).snapshot(cx).text(),
 follower_multibuffer.read(cx).snapshot(cx).text(),
 );
+assert_eq!(*follower_edit_event_count.borrow(), 2);

 leader_multibuffer.update(cx, |leader, cx| {
 let excerpt_ids = leader.excerpt_ids();
@@ -4140,6 +4212,27 @@ mod tests {
 leader_multibuffer.read(cx).snapshot(cx).text(),
 follower_multibuffer.read(cx).snapshot(cx).text(),
 );
+assert_eq!(*follower_edit_event_count.borrow(), 3);
+
+// Removing an empty set of excerpts is a noop.
+leader_multibuffer.update(cx, |leader, cx| {
+leader.remove_excerpts([], cx);
+});
+assert_eq!(
+leader_multibuffer.read(cx).snapshot(cx).text(),
+follower_multibuffer.read(cx).snapshot(cx).text(),
+);
+assert_eq!(*follower_edit_event_count.borrow(), 3);
+
+// Adding an empty set of excerpts is a noop.
+leader_multibuffer.update(cx, |leader, cx| {
+leader.push_excerpts::<usize>(buffer_2.clone(), [], cx);
+});
+assert_eq!(
+leader_multibuffer.read(cx).snapshot(cx).text(),
+follower_multibuffer.read(cx).snapshot(cx).text(),
+);
+assert_eq!(*follower_edit_event_count.borrow(), 3);

 leader_multibuffer.update(cx, |leader, cx| {
 leader.clear(cx);
@@ -4148,6 +4241,7 @@ mod tests {
 leader_multibuffer.read(cx).snapshot(cx).text(),
 follower_multibuffer.read(cx).snapshot(cx).text(),
 );
+assert_eq!(*follower_edit_event_count.borrow(), 4);
 }

 #[gpui::test]
@@ -4595,7 +4689,7 @@ mod tests {
 assert_eq!(
 snapshot
-.git_diff_hunks_in_range(0..12, false)
+.git_diff_hunks_in_range(0..12)
 .map(|hunk| (hunk.status(), hunk.buffer_range))
 .collect::<Vec<_>>(),
 &expected,
@@ -4603,7 +4697,7 @@ mod tests {
 assert_eq!(
 snapshot
-.git_diff_hunks_in_range(0..12, true)
+.git_diff_hunks_in_range_rev(0..12)
 .map(|hunk| (hunk.status(), hunk.buffer_range))
 .collect::<Vec<_>>(),
 expected

View file

@@ -212,6 +212,7 @@ impl<'a> EditorTestContext<'a> {
 self.assert_selections(expected_selections, marked_text.to_string())
 }

+#[track_caller]
 pub fn assert_editor_background_highlights<Tag: 'static>(&mut self, marked_text: &str) {
 let expected_ranges = self.ranges(marked_text);
 let actual_ranges: Vec<Range<usize>> = self.update_editor(|editor, cx| {
@@ -228,6 +229,7 @@ impl<'a> EditorTestContext<'a> {
 assert_set_eq!(actual_ranges, expected_ranges);
 }

+#[track_caller]
 pub fn assert_editor_text_highlights<Tag: ?Sized + 'static>(&mut self, marked_text: &str) {
 let expected_ranges = self.ranges(marked_text);
 let snapshot = self.update_editor(|editor, cx| editor.snapshot(cx));
@@ -241,12 +243,14 @@ impl<'a> EditorTestContext<'a> {
 assert_set_eq!(actual_ranges, expected_ranges);
 }

+#[track_caller]
 pub fn assert_editor_selections(&mut self, expected_selections: Vec<Range<usize>>) {
 let expected_marked_text =
 generate_marked_text(&self.buffer_text(), &expected_selections, true);
 self.assert_selections(expected_selections, expected_marked_text)
 }

+#[track_caller]
 fn assert_selections(
 &mut self,
 expected_selections: Vec<Range<usize>>,

View file

@ -1,4 +1,4 @@
use std::ops::Range; use std::{iter, ops::Range};
use sum_tree::SumTree; use sum_tree::SumTree;
use text::{Anchor, BufferSnapshot, OffsetRangeExt, Point}; use text::{Anchor, BufferSnapshot, OffsetRangeExt, Point};
@ -71,22 +71,66 @@ impl BufferDiff {
} }
} }
pub fn is_empty(&self) -> bool {
self.tree.is_empty()
}
pub fn hunks_in_row_range<'a>( pub fn hunks_in_row_range<'a>(
&'a self, &'a self,
range: Range<u32>, range: Range<u32>,
buffer: &'a BufferSnapshot, buffer: &'a BufferSnapshot,
reversed: bool,
) -> impl 'a + Iterator<Item = DiffHunk<u32>> { ) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
let start = buffer.anchor_before(Point::new(range.start, 0)); let start = buffer.anchor_before(Point::new(range.start, 0));
let end = buffer.anchor_after(Point::new(range.end, 0)); let end = buffer.anchor_after(Point::new(range.end, 0));
self.hunks_intersecting_range(start..end, buffer, reversed)
self.hunks_intersecting_range(start..end, buffer)
} }
pub fn hunks_intersecting_range<'a>( pub fn hunks_intersecting_range<'a>(
&'a self, &'a self,
range: Range<Anchor>, range: Range<Anchor>,
buffer: &'a BufferSnapshot, buffer: &'a BufferSnapshot,
reversed: bool, ) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
let mut cursor = self.tree.filter::<_, DiffHunkSummary>(move |summary| {
let before_start = summary.buffer_range.end.cmp(&range.start, buffer).is_lt();
let after_end = summary.buffer_range.start.cmp(&range.end, buffer).is_gt();
!before_start && !after_end
});
let anchor_iter = std::iter::from_fn(move || {
cursor.next(buffer);
cursor.item()
})
.flat_map(move |hunk| {
[
(&hunk.buffer_range.start, hunk.diff_base_byte_range.start),
(&hunk.buffer_range.end, hunk.diff_base_byte_range.end),
]
.into_iter()
});
let mut summaries = buffer.summaries_for_anchors_with_payload::<Point, _, _>(anchor_iter);
iter::from_fn(move || {
let (start_point, start_base) = summaries.next()?;
let (end_point, end_base) = summaries.next()?;
let end_row = if end_point.column > 0 {
end_point.row + 1
} else {
end_point.row
};
Some(DiffHunk {
buffer_range: start_point.row..end_row,
diff_base_byte_range: start_base..end_base,
})
})
}
pub fn hunks_intersecting_range_rev<'a>(
&'a self,
range: Range<Anchor>,
buffer: &'a BufferSnapshot,
) -> impl 'a + Iterator<Item = DiffHunk<u32>> { ) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
let mut cursor = self.tree.filter::<_, DiffHunkSummary>(move |summary| { let mut cursor = self.tree.filter::<_, DiffHunkSummary>(move |summary| {
let before_start = summary.buffer_range.end.cmp(&range.start, buffer).is_lt(); let before_start = summary.buffer_range.end.cmp(&range.start, buffer).is_lt();
@@ -95,14 +139,9 @@ impl BufferDiff {
 });

 std::iter::from_fn(move || {
-if reversed {
-cursor.prev(buffer);
-} else {
-cursor.next(buffer);
-}
+cursor.prev(buffer);

 let hunk = cursor.item()?;
 let range = hunk.buffer_range.to_point(buffer);
 let end_row = if range.end.column > 0 {
 range.end.row + 1
@@ -151,7 +190,7 @@ impl BufferDiff {
 fn hunks<'a>(&'a self, text: &'a BufferSnapshot) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
 let start = text.anchor_before(Point::new(0, 0));
 let end = text.anchor_after(Point::new(u32::MAX, u32::MAX));
-self.hunks_intersecting_range(start..end, text, false)
+self.hunks_intersecting_range(start..end, text)
 }

 fn diff<'a>(head: &'a str, current: &'a str) -> Option<GitPatch<'a>> {
@@ -279,6 +318,8 @@ pub fn assert_hunks<Iter>(
 #[cfg(test)]
 mod tests {
+use std::assert_eq;
+
 use super::*;
 use text::Buffer;
 use unindent::Unindent as _;
@@ -365,7 +406,7 @@ mod tests {
 assert_eq!(diff.hunks(&buffer).count(), 8);

 assert_hunks(
-diff.hunks_in_row_range(7..12, &buffer, false),
+diff.hunks_in_row_range(7..12, &buffer),
 &buffer,
 &diff_base,
 &[

View file

@ -1451,27 +1451,13 @@ impl AppContext {
self.views_metadata.remove(&(window_id, view_id)); self.views_metadata.remove(&(window_id, view_id));
let mut view = self.views.remove(&(window_id, view_id)).unwrap(); let mut view = self.views.remove(&(window_id, view_id)).unwrap();
view.release(self); view.release(self);
let change_focus_to = self.windows.get_mut(&window_id).and_then(|window| { if let Some(window) = self.windows.get_mut(&window_id) {
window.parents.remove(&view_id); window.parents.remove(&view_id);
window window
.invalidation .invalidation
.get_or_insert_with(Default::default) .get_or_insert_with(Default::default)
.removed .removed
.push(view_id); .push(view_id);
if window.focused_view_id == Some(view_id) {
Some(window.root_view().id())
} else {
None
}
});
if let Some(view_id) = change_focus_to {
self.pending_effects
.push_back(Effect::Focus(FocusEffect::View {
window_id,
view_id: Some(view_id),
is_forced: false,
}));
} }
self.pending_effects self.pending_effects
@ -1708,8 +1694,69 @@ impl AppContext {
if let Some(invalidation) = invalidation { if let Some(invalidation) = invalidation {
let appearance = cx.window.platform_window.appearance(); let appearance = cx.window.platform_window.appearance();
cx.invalidate(invalidation, appearance); cx.invalidate(invalidation, appearance);
if cx.layout(refreshing).log_err().is_some() { if let Some(old_parents) = cx.layout(refreshing).log_err() {
updated_windows.insert(window_id); updated_windows.insert(window_id);
if let Some(focused_view_id) = cx.focused_view_id() {
let old_ancestors = std::iter::successors(
Some(focused_view_id),
|&view_id| old_parents.get(&view_id).copied(),
)
.collect::<HashSet<_>>();
let new_ancestors =
cx.ancestors(focused_view_id).collect::<HashSet<_>>();
// Notify the old ancestors of the focused view when they don't contain it anymore.
for old_ancestor in old_ancestors.iter().copied() {
if !new_ancestors.contains(&old_ancestor) {
if let Some(mut view) =
cx.views.remove(&(window_id, old_ancestor))
{
view.focus_out(
focused_view_id,
cx,
old_ancestor,
);
cx.views
.insert((window_id, old_ancestor), view);
}
}
}
// Notify the new ancestors of the focused view if they contain it now.
for new_ancestor in new_ancestors.iter().copied() {
if !old_ancestors.contains(&new_ancestor) {
if let Some(mut view) =
cx.views.remove(&(window_id, new_ancestor))
{
view.focus_in(
focused_view_id,
cx,
new_ancestor,
);
cx.views
.insert((window_id, new_ancestor), view);
}
}
}
// When the previously-focused view has been dropped and
// there isn't any pending focus, focus the root view.
let root_view_id = cx.window.root_view().id();
if focused_view_id != root_view_id
&& !cx.views.contains_key(&(window_id, focused_view_id))
&& !focus_effects.contains_key(&window_id)
{
focus_effects.insert(
window_id,
FocusEffect::View {
window_id,
view_id: Some(root_view_id),
is_forced: false,
},
);
}
}
} }
} }
}); });
@ -1886,9 +1933,27 @@ impl AppContext {
fn handle_focus_effect(&mut self, effect: FocusEffect) { fn handle_focus_effect(&mut self, effect: FocusEffect) {
let window_id = effect.window_id(); let window_id = effect.window_id();
self.update_window(window_id, |cx| { self.update_window(window_id, |cx| {
// Ensure the newly-focused view still exists, otherwise focus
// the root view instead.
let focused_id = match effect { let focused_id = match effect {
FocusEffect::View { view_id, .. } => view_id, FocusEffect::View { view_id, .. } => {
FocusEffect::ViewParent { view_id, .. } => cx.ancestors(view_id).skip(1).next(), if let Some(view_id) = view_id {
if cx.views.contains_key(&(window_id, view_id)) {
Some(view_id)
} else {
Some(cx.root_view().id())
}
} else {
None
}
}
FocusEffect::ViewParent { view_id, .. } => Some(
cx.window
.parents
.get(&view_id)
.copied()
.unwrap_or(cx.root_view().id()),
),
}; };
let focus_changed = cx.window.focused_view_id != focused_id; let focus_changed = cx.window.focused_view_id != focused_id;

View file

@@ -29,6 +29,7 @@ use sqlez::{
 };
 use std::{
 any::TypeId,
+mem,
 ops::{Deref, DerefMut, Range},
 };
 use util::ResultExt;
@@ -890,7 +891,7 @@ impl<'a> WindowContext<'a> {
 Ok(element)
 }

-pub(crate) fn layout(&mut self, refreshing: bool) -> Result<()> {
+pub(crate) fn layout(&mut self, refreshing: bool) -> Result<HashMap<usize, usize>> {
 let window_size = self.window.platform_window.content_size();
 let root_view_id = self.window.root_view().id();
 let mut rendered_root = self.window.rendered_views.remove(&root_view_id).unwrap();
@@ -923,11 +924,11 @@
 }
 }

-self.window.parents = new_parents;
+let old_parents = mem::replace(&mut self.window.parents, new_parents);
 self.window
 .rendered_views
 .insert(root_view_id, rendered_root);
-Ok(())
+Ok(old_parents)
 }

 pub(crate) fn paint(&mut self) -> Result<Scene> {

View file

@@ -2500,18 +2500,22 @@ impl BufferSnapshot {
 pub fn git_diff_hunks_in_row_range<'a>(
 &'a self,
 range: Range<u32>,
-reversed: bool,
 ) -> impl 'a + Iterator<Item = git::diff::DiffHunk<u32>> {
-self.git_diff.hunks_in_row_range(range, self, reversed)
+self.git_diff.hunks_in_row_range(range, self)
 }

 pub fn git_diff_hunks_intersecting_range<'a>(
 &'a self,
 range: Range<Anchor>,
-reversed: bool,
 ) -> impl 'a + Iterator<Item = git::diff::DiffHunk<u32>> {
-self.git_diff
-.hunks_intersecting_range(range, self, reversed)
+self.git_diff.hunks_intersecting_range(range, self)
+}
+
+pub fn git_diff_hunks_intersecting_range_rev<'a>(
+&'a self,
+range: Range<Anchor>,
+) -> impl 'a + Iterator<Item = git::diff::DiffHunk<u32>> {
+self.git_diff.hunks_intersecting_range_rev(range, self)
 }

 pub fn diagnostics_in_range<'a, T, O>(

View file

@@ -6,8 +6,8 @@
 "repositoryURL": "https://github.com/livekit/client-sdk-swift.git",
 "state": {
 "branch": null,
-"revision": "f6ca534eb334e99acb8e82cc99b491717df28d8a",
-"version": null
+"revision": "7331b813a5ab8a95cfb81fb2b4ed10519428b9ff",
+"version": "1.0.12"
 }
 },
 {
@@ -15,8 +15,8 @@
 "repositoryURL": "https://github.com/google/promises.git",
 "state": {
 "branch": null,
-"revision": "3e4e743631e86c8c70dbc6efdc7beaa6e90fd3bb",
-"version": "2.1.1"
+"revision": "ec957ccddbcc710ccc64c9dcbd4c7006fcf8b73a",
+"version": "2.2.0"
 }
 },
 {
@@ -24,8 +24,8 @@
 "repositoryURL": "https://github.com/webrtc-sdk/Specs.git",
 "state": {
 "branch": null,
-"revision": "38ac06261e62f980652278c69b70284324c769e0",
-"version": "104.5112.5"
+"revision": "2f6bab30c8df0fe59ab3e58bc99097f757f85f65",
+"version": "104.5112.17"
 }
 },
 {
@@ -33,8 +33,8 @@
 "repositoryURL": "https://github.com/apple/swift-log.git",
 "state": {
 "branch": null,
-"revision": "6fe203dc33195667ce1759bf0182975e4653ba1c",
-"version": "1.4.4"
+"revision": "32e8d724467f8fe623624570367e3d50c5638e46",
+"version": "1.5.2"
 }
 },
 {
@@ -42,8 +42,8 @@
 "repositoryURL": "https://github.com/apple/swift-protobuf.git",
 "state": {
 "branch": null,
-"revision": "88c7d15e1242fdb6ecbafbc7926426a19be1e98a",
-"version": "1.20.2"
+"revision": "0af9125c4eae12a4973fb66574c53a54962a9e1e",
+"version": "1.21.0"
 }
 }
 ]

View file

@@ -15,7 +15,7 @@ let package = Package(
 targets: ["LiveKitBridge"]),
 ],
 dependencies: [
-.package(url: "https://github.com/livekit/client-sdk-swift.git", revision: "f6ca534eb334e99acb8e82cc99b491717df28d8a"),
+.package(url: "https://github.com/livekit/client-sdk-swift.git", .exact("1.0.12")),
 ],
 targets: [
 // Targets are the basic building blocks of a package. A target can define a module or a test suite.

View file

@@ -42,7 +42,7 @@ anyhow.workspace = true
 async-trait.workspace = true
 backtrace = "0.3"
 futures.workspace = true
-glob.workspace = true
+globset.workspace = true
 ignore = "0.4"
 lazy_static.workspace = true
 log.workspace = true

View file

@ -1,121 +0,0 @@
use anyhow::{anyhow, Result};
use std::path::Path;
#[derive(Default)]
pub struct LspGlobSet {
patterns: Vec<glob::Pattern>,
}
impl LspGlobSet {
pub fn clear(&mut self) {
self.patterns.clear();
}
/// Add a pattern to the glob set.
///
/// LSP's glob syntax supports bash-style brace expansion. For example,
/// the pattern '*.{js,ts}' would match all JavaScript or TypeScript files.
/// This is not a part of the standard libc glob syntax, and isn't supported
/// by the `glob` crate. So we pre-process the glob patterns, producing a
/// separate glob `Pattern` object for each part of a brace expansion.
pub fn add_pattern(&mut self, pattern: &str) -> Result<()> {
// Find all of the ranges of `pattern` that contain matched curly braces.
let mut expansion_ranges = Vec::new();
let mut expansion_start_ix = None;
for (ix, c) in pattern.match_indices(|c| ['{', '}'].contains(&c)) {
match c {
"{" => {
if expansion_start_ix.is_some() {
return Err(anyhow!("nested braces in glob patterns aren't supported"));
}
expansion_start_ix = Some(ix);
}
"}" => {
if let Some(start_ix) = expansion_start_ix {
expansion_ranges.push(start_ix..ix + 1);
}
expansion_start_ix = None;
}
_ => {}
}
}
// Starting with a single pattern, process each brace expansion by cloning
// the pattern once per element of the expansion.
let mut unexpanded_patterns = vec![];
let mut expanded_patterns = vec![pattern.to_string()];
for outer_range in expansion_ranges.into_iter().rev() {
let inner_range = (outer_range.start + 1)..(outer_range.end - 1);
std::mem::swap(&mut unexpanded_patterns, &mut expanded_patterns);
for unexpanded_pattern in unexpanded_patterns.drain(..) {
for part in unexpanded_pattern[inner_range.clone()].split(',') {
let mut expanded_pattern = unexpanded_pattern.clone();
expanded_pattern.replace_range(outer_range.clone(), part);
expanded_patterns.push(expanded_pattern);
}
}
}
// Parse the final glob patterns and add them to the set.
for pattern in expanded_patterns {
let pattern = glob::Pattern::new(&pattern)?;
self.patterns.push(pattern);
}
Ok(())
}
pub fn matches(&self, path: &Path) -> bool {
self.patterns
.iter()
.any(|pattern| pattern.matches_path(path))
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_glob_set() {
let mut watch = LspGlobSet::default();
watch.add_pattern("/a/**/*.rs").unwrap();
watch.add_pattern("/a/**/Cargo.toml").unwrap();
assert!(watch.matches("/a/b.rs".as_ref()));
assert!(watch.matches("/a/b/c.rs".as_ref()));
assert!(!watch.matches("/b/c.rs".as_ref()));
assert!(!watch.matches("/a/b.ts".as_ref()));
}
#[test]
fn test_brace_expansion() {
let mut watch = LspGlobSet::default();
watch.add_pattern("/a/*.{ts,js,tsx}").unwrap();
assert!(watch.matches("/a/one.js".as_ref()));
assert!(watch.matches("/a/two.ts".as_ref()));
assert!(watch.matches("/a/three.tsx".as_ref()));
assert!(!watch.matches("/a/one.j".as_ref()));
assert!(!watch.matches("/a/two.s".as_ref()));
assert!(!watch.matches("/a/three.t".as_ref()));
assert!(!watch.matches("/a/four.t".as_ref()));
assert!(!watch.matches("/a/five.xt".as_ref()));
}
#[test]
fn test_multiple_brace_expansion() {
let mut watch = LspGlobSet::default();
watch.add_pattern("/a/{one,two,three}.{b*c,d*e}").unwrap();
assert!(watch.matches("/a/one.bic".as_ref()));
assert!(watch.matches("/a/two.dole".as_ref()));
assert!(watch.matches("/a/three.deeee".as_ref()));
assert!(!watch.matches("/a/four.bic".as_ref()));
assert!(!watch.matches("/a/one.be".as_ref()));
}
}

View file

@@ -1,6 +1,5 @@
 mod ignore;
 mod lsp_command;
-mod lsp_glob_set;
 pub mod search;
 pub mod terminals;
 pub mod worktree;
@@ -16,8 +15,10 @@ use copilot::Copilot;
 use futures::{
 channel::mpsc::{self, UnboundedReceiver},
 future::{try_join_all, Shared},
+stream::FuturesUnordered,
 AsyncWriteExt, Future, FutureExt, StreamExt, TryFutureExt,
 };
+use globset::{Glob, GlobSet, GlobSetBuilder};
 use gpui::{
 AnyModelHandle, AppContext, AsyncAppContext, BorrowAppContext, Entity, ModelContext,
 ModelHandle, Task, WeakModelHandle,
@@ -39,7 +40,6 @@ use lsp::{
 DocumentHighlightKind, LanguageServer, LanguageServerId,
 };
 use lsp_command::*;
-use lsp_glob_set::LspGlobSet;
 use postage::watch;
 use rand::prelude::*;
 use search::SearchQuery;
@@ -225,7 +225,7 @@ pub enum LanguageServerState {
 language: Arc<Language>,
 adapter: Arc<CachedLspAdapter>,
 server: Arc<LanguageServer>,
-watched_paths: LspGlobSet,
+watched_paths: HashMap<WorktreeId, GlobSet>,
 simulate_disk_based_diagnostics_completion: Option<Task<()>>,
 },
 }
@@ -1362,7 +1362,7 @@ impl Project {
 return Task::ready(Ok(existing_buffer));
 }

-let mut loading_watch = match self.loading_buffers_by_path.entry(project_path.clone()) {
+let loading_watch = match self.loading_buffers_by_path.entry(project_path.clone()) {
 // If the given path is already being loaded, then wait for that existing
 // task to complete and return the same buffer.
 hash_map::Entry::Occupied(e) => e.get().clone(),
@@ -1393,15 +1393,9 @@ impl Project {
 };

 cx.foreground().spawn(async move {
-loop {
-if let Some(result) = loading_watch.borrow().as_ref() {
-match result {
-Ok(buffer) => return Ok(buffer.clone()),
-Err(error) => return Err(anyhow!("{}", error)),
-}
-}
-loading_watch.next().await;
-}
+pump_loading_buffer_reciever(loading_watch)
+.await
+.map_err(|error| anyhow!("{}", error))
 })
 }
@ -2859,10 +2853,37 @@ impl Project {
if let Some(LanguageServerState::Running { watched_paths, .. }) = if let Some(LanguageServerState::Running { watched_paths, .. }) =
self.language_servers.get_mut(&language_server_id) self.language_servers.get_mut(&language_server_id)
{ {
watched_paths.clear(); let mut builders = HashMap::default();
for watcher in params.watchers { for watcher in params.watchers {
watched_paths.add_pattern(&watcher.glob_pattern).log_err(); for worktree in &self.worktrees {
if let Some(worktree) = worktree.upgrade(cx) {
let worktree = worktree.read(cx);
if let Some(abs_path) = worktree.abs_path().to_str() {
if let Some(suffix) = watcher
.glob_pattern
.strip_prefix(abs_path)
.and_then(|s| s.strip_prefix(std::path::MAIN_SEPARATOR))
{
if let Some(glob) = Glob::new(suffix).log_err() {
builders
.entry(worktree.id())
.or_insert_with(|| GlobSetBuilder::new())
.add(glob);
}
break;
}
}
}
}
} }
watched_paths.clear();
for (worktree_id, builder) in builders {
if let Ok(globset) = builder.build() {
watched_paths.insert(worktree_id, globset);
}
}
cx.notify(); cx.notify();
} }
} }
@ -4706,25 +4727,39 @@ impl Project {
changes: &HashMap<(Arc<Path>, ProjectEntryId), PathChange>, changes: &HashMap<(Arc<Path>, ProjectEntryId), PathChange>,
cx: &mut ModelContext<Self>, cx: &mut ModelContext<Self>,
) { ) {
if changes.is_empty() {
return;
}
let worktree_id = worktree_handle.read(cx).id(); let worktree_id = worktree_handle.read(cx).id();
let mut language_server_ids = self
.language_server_ids
.iter()
.filter_map(|((server_worktree_id, _), server_id)| {
(*server_worktree_id == worktree_id).then_some(*server_id)
})
.collect::<Vec<_>>();
language_server_ids.sort();
language_server_ids.dedup();
let abs_path = worktree_handle.read(cx).abs_path(); let abs_path = worktree_handle.read(cx).abs_path();
for ((server_worktree_id, _), server_id) in &self.language_server_ids { for server_id in &language_server_ids {
if *server_worktree_id == worktree_id { if let Some(server) = self.language_servers.get(server_id) {
if let Some(server) = self.language_servers.get(server_id) { if let LanguageServerState::Running {
if let LanguageServerState::Running { server,
server, watched_paths,
watched_paths, ..
.. } = server
} = server {
{ if let Some(watched_paths) = watched_paths.get(&worktree_id) {
let params = lsp::DidChangeWatchedFilesParams { let params = lsp::DidChangeWatchedFilesParams {
changes: changes changes: changes
.iter() .iter()
.filter_map(|((path, _), change)| { .filter_map(|((path, _), change)| {
let path = abs_path.join(path); if watched_paths.is_match(&path) {
if watched_paths.matches(&path) {
Some(lsp::FileEvent { Some(lsp::FileEvent {
uri: lsp::Url::from_file_path(path).unwrap(), uri: lsp::Url::from_file_path(abs_path.join(path))
.unwrap(),
typ: match change { typ: match change {
PathChange::Added => lsp::FileChangeType::CREATED, PathChange::Added => lsp::FileChangeType::CREATED,
PathChange::Removed => lsp::FileChangeType::DELETED, PathChange::Removed => lsp::FileChangeType::DELETED,
@ -4760,6 +4795,51 @@ impl Project {
) { ) {
debug_assert!(worktree_handle.read(cx).is_local()); debug_assert!(worktree_handle.read(cx).is_local());
// Setup the pending buffers
let future_buffers = self
.loading_buffers_by_path
.iter()
.filter_map(|(path, receiver)| {
let path = &path.path;
let (work_directory, repo) = repos
.iter()
.find(|(work_directory, _)| path.starts_with(work_directory))?;
let repo_relative_path = path.strip_prefix(work_directory).log_err()?;
let receiver = receiver.clone();
let repo_ptr = repo.repo_ptr.clone();
let repo_relative_path = repo_relative_path.to_owned();
Some(async move {
pump_loading_buffer_reciever(receiver)
.await
.ok()
.map(|buffer| (buffer, repo_relative_path, repo_ptr))
})
})
.collect::<FuturesUnordered<_>>()
.filter_map(|result| async move {
let (buffer_handle, repo_relative_path, repo_ptr) = result?;
let lock = repo_ptr.lock();
lock.load_index_text(&repo_relative_path)
.map(|diff_base| (diff_base, buffer_handle))
});
let update_diff_base_fn = update_diff_base(self);
cx.spawn(|_, mut cx| async move {
let diff_base_tasks = cx
.background()
.spawn(future_buffers.collect::<Vec<_>>())
.await;
for (diff_base, buffer) in diff_base_tasks.into_iter() {
update_diff_base_fn(Some(diff_base), buffer, &mut cx);
}
})
.detach();
// And the current buffers
for (_, buffer) in &self.opened_buffers { for (_, buffer) in &self.opened_buffers {
if let Some(buffer) = buffer.upgrade(cx) { if let Some(buffer) = buffer.upgrade(cx) {
let file = match File::from_dyn(buffer.read(cx).file()) { let file = match File::from_dyn(buffer.read(cx).file()) {
@ -4779,18 +4859,17 @@ impl Project {
.find(|(work_directory, _)| path.starts_with(work_directory)) .find(|(work_directory, _)| path.starts_with(work_directory))
{ {
Some(repo) => repo.clone(), Some(repo) => repo.clone(),
None => return, None => continue,
}; };
let relative_repo = match path.strip_prefix(work_directory).log_err() { let relative_repo = match path.strip_prefix(work_directory).log_err() {
Some(relative_repo) => relative_repo.to_owned(), Some(relative_repo) => relative_repo.to_owned(),
None => return, None => continue,
}; };
drop(worktree); drop(worktree);
let remote_id = self.remote_id(); let update_diff_base_fn = update_diff_base(self);
let client = self.client.clone();
let git_ptr = repo.repo_ptr.clone(); let git_ptr = repo.repo_ptr.clone();
let diff_base_task = cx let diff_base_task = cx
.background() .background()
@ -4798,21 +4877,7 @@ impl Project {
cx.spawn(|_, mut cx| async move { cx.spawn(|_, mut cx| async move {
let diff_base = diff_base_task.await; let diff_base = diff_base_task.await;
update_diff_base_fn(diff_base, buffer, &mut cx);
let buffer_id = buffer.update(&mut cx, |buffer, cx| {
buffer.set_diff_base(diff_base.clone(), cx);
buffer.remote_id()
});
if let Some(project_id) = remote_id {
client
.send(proto::UpdateDiffBase {
project_id,
buffer_id: buffer_id as u64,
diff_base,
})
.log_err();
}
}) })
.detach(); .detach();
} }
@ -6699,3 +6764,40 @@ impl Item for Buffer {
}) })
} }
} }
async fn pump_loading_buffer_reciever(
mut receiver: postage::watch::Receiver<Option<Result<ModelHandle<Buffer>, Arc<anyhow::Error>>>>,
) -> Result<ModelHandle<Buffer>, Arc<anyhow::Error>> {
loop {
if let Some(result) = receiver.borrow().as_ref() {
match result {
Ok(buffer) => return Ok(buffer.to_owned()),
Err(e) => return Err(e.to_owned()),
}
}
receiver.next().await;
}
}
fn update_diff_base(
project: &Project,
) -> impl Fn(Option<String>, ModelHandle<Buffer>, &mut AsyncAppContext) {
let remote_id = project.remote_id();
let client = project.client().clone();
move |diff_base, buffer, cx| {
let buffer_id = buffer.update(cx, |buffer, cx| {
buffer.set_diff_base(diff_base.clone(), cx);
buffer.remote_id()
});
if let Some(project_id) = remote_id {
client
.send(proto::UpdateDiffBase {
project_id,
buffer_id: buffer_id as u64,
diff_base,
})
.log_err();
}
}
}
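The DidChangeWatchedFiles registration above keys the compiled globs by worktree: each server-supplied absolute glob is reduced to a worktree-relative pattern before it is compiled. A standalone sketch of that prefix-stripping step, assuming the globset API; compile_watcher and the "/the-root" values are illustrative (the test change below registers the same "/the-root/*.{rs,c}" pattern):

// Sketch of the per-worktree glob registration: strip the worktree's absolute
// path from the server's glob, then compile the remaining relative pattern.
use globset::{Glob, GlobSet, GlobSetBuilder};
use std::path::{Path, MAIN_SEPARATOR};

fn compile_watcher(worktree_abs_path: &str, glob_pattern: &str) -> Option<GlobSet> {
    // Only patterns rooted inside this worktree are registered for it.
    let suffix = glob_pattern
        .strip_prefix(worktree_abs_path)?
        .strip_prefix(MAIN_SEPARATOR)?;
    let mut builder = GlobSetBuilder::new();
    builder.add(Glob::new(suffix).ok()?);
    builder.build().ok()
}

fn main() {
    let watched = compile_watcher("/the-root", "/the-root/*.{rs,c}").unwrap();
    // File events arrive as worktree-relative paths and are matched directly.
    assert!(watched.is_match(Path::new("main.rs")));
    assert!(!watched.is_match(Path::new("main.ts")));
}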


@ -2,8 +2,8 @@ use crate::{worktree::WorktreeHandle, Event, *};
use fs::LineEnding; use fs::LineEnding;
use fs::{FakeFs, RealFs}; use fs::{FakeFs, RealFs};
use futures::{future, StreamExt}; use futures::{future, StreamExt};
use gpui::AppContext; use globset::Glob;
use gpui::{executor::Deterministic, test::subscribe}; use gpui::{executor::Deterministic, test::subscribe, AppContext};
use language::{ use language::{
tree_sitter_rust, tree_sitter_typescript, Diagnostic, FakeLspAdapter, LanguageConfig, tree_sitter_rust, tree_sitter_typescript, Diagnostic, FakeLspAdapter, LanguageConfig,
OffsetRangeExt, Point, ToPoint, OffsetRangeExt, Point, ToPoint,
@ -503,7 +503,7 @@ async fn test_reporting_fs_changes_to_language_servers(cx: &mut gpui::TestAppCon
register_options: serde_json::to_value( register_options: serde_json::to_value(
lsp::DidChangeWatchedFilesRegistrationOptions { lsp::DidChangeWatchedFilesRegistrationOptions {
watchers: vec![lsp::FileSystemWatcher { watchers: vec![lsp::FileSystemWatcher {
glob_pattern: "*.{rs,c}".to_string(), glob_pattern: "/the-root/*.{rs,c}".to_string(),
kind: None, kind: None,
}], }],
}, },
@ -3361,7 +3361,7 @@ async fn test_search_with_inclusions(cx: &mut gpui::TestAppContext) {
search_query, search_query,
false, false,
true, true,
vec![glob::Pattern::new("*.odd").unwrap()], vec![Glob::new("*.odd").unwrap().compile_matcher()],
Vec::new() Vec::new()
), ),
cx cx
@ -3379,7 +3379,7 @@ async fn test_search_with_inclusions(cx: &mut gpui::TestAppContext) {
search_query, search_query,
false, false,
true, true,
vec![glob::Pattern::new("*.rs").unwrap()], vec![Glob::new("*.rs").unwrap().compile_matcher()],
Vec::new() Vec::new()
), ),
cx cx
@ -3401,8 +3401,8 @@ async fn test_search_with_inclusions(cx: &mut gpui::TestAppContext) {
false, false,
true, true,
vec![ vec![
glob::Pattern::new("*.ts").unwrap(), Glob::new("*.ts").unwrap().compile_matcher(),
glob::Pattern::new("*.odd").unwrap(), Glob::new("*.odd").unwrap().compile_matcher(),
], ],
Vec::new() Vec::new()
), ),
@ -3425,9 +3425,9 @@ async fn test_search_with_inclusions(cx: &mut gpui::TestAppContext) {
false, false,
true, true,
vec![ vec![
glob::Pattern::new("*.rs").unwrap(), Glob::new("*.rs").unwrap().compile_matcher(),
glob::Pattern::new("*.ts").unwrap(), Glob::new("*.ts").unwrap().compile_matcher(),
glob::Pattern::new("*.odd").unwrap(), Glob::new("*.odd").unwrap().compile_matcher(),
], ],
Vec::new() Vec::new()
), ),
@ -3470,7 +3470,7 @@ async fn test_search_with_exclusions(cx: &mut gpui::TestAppContext) {
false, false,
true, true,
Vec::new(), Vec::new(),
vec![glob::Pattern::new("*.odd").unwrap()], vec![Glob::new("*.odd").unwrap().compile_matcher()],
), ),
cx cx
) )
@ -3493,7 +3493,7 @@ async fn test_search_with_exclusions(cx: &mut gpui::TestAppContext) {
false, false,
true, true,
Vec::new(), Vec::new(),
vec![glob::Pattern::new("*.rs").unwrap()], vec![Glob::new("*.rs").unwrap().compile_matcher()],
), ),
cx cx
) )
@ -3515,8 +3515,8 @@ async fn test_search_with_exclusions(cx: &mut gpui::TestAppContext) {
true, true,
Vec::new(), Vec::new(),
vec![ vec![
glob::Pattern::new("*.ts").unwrap(), Glob::new("*.ts").unwrap().compile_matcher(),
glob::Pattern::new("*.odd").unwrap(), Glob::new("*.odd").unwrap().compile_matcher(),
], ],
), ),
cx cx
@ -3539,9 +3539,9 @@ async fn test_search_with_exclusions(cx: &mut gpui::TestAppContext) {
true, true,
Vec::new(), Vec::new(),
vec![ vec![
glob::Pattern::new("*.rs").unwrap(), Glob::new("*.rs").unwrap().compile_matcher(),
glob::Pattern::new("*.ts").unwrap(), Glob::new("*.ts").unwrap().compile_matcher(),
glob::Pattern::new("*.odd").unwrap(), Glob::new("*.odd").unwrap().compile_matcher(),
], ],
), ),
cx cx
@ -3576,8 +3576,8 @@ async fn test_search_with_exclusions_and_inclusions(cx: &mut gpui::TestAppContex
search_query, search_query,
false, false,
true, true,
vec![glob::Pattern::new("*.odd").unwrap()], vec![Glob::new("*.odd").unwrap().compile_matcher()],
vec![glob::Pattern::new("*.odd").unwrap()], vec![Glob::new("*.odd").unwrap().compile_matcher()],
), ),
cx cx
) )
@ -3594,8 +3594,8 @@ async fn test_search_with_exclusions_and_inclusions(cx: &mut gpui::TestAppContex
search_query, search_query,
false, false,
true, true,
vec![glob::Pattern::new("*.ts").unwrap()], vec![Glob::new("*.ts").unwrap().compile_matcher()],
vec![glob::Pattern::new("*.ts").unwrap()], vec![Glob::new("*.ts").unwrap().compile_matcher()],
), ),
cx cx
) )
@ -3613,12 +3613,12 @@ async fn test_search_with_exclusions_and_inclusions(cx: &mut gpui::TestAppContex
false, false,
true, true,
vec![ vec![
glob::Pattern::new("*.ts").unwrap(), Glob::new("*.ts").unwrap().compile_matcher(),
glob::Pattern::new("*.odd").unwrap() Glob::new("*.odd").unwrap().compile_matcher()
], ],
vec![ vec![
glob::Pattern::new("*.ts").unwrap(), Glob::new("*.ts").unwrap().compile_matcher(),
glob::Pattern::new("*.odd").unwrap() Glob::new("*.odd").unwrap().compile_matcher()
], ],
), ),
cx cx
@ -3637,12 +3637,12 @@ async fn test_search_with_exclusions_and_inclusions(cx: &mut gpui::TestAppContex
false, false,
true, true,
vec![ vec![
glob::Pattern::new("*.ts").unwrap(), Glob::new("*.ts").unwrap().compile_matcher(),
glob::Pattern::new("*.odd").unwrap() Glob::new("*.odd").unwrap().compile_matcher()
], ],
vec![ vec![
glob::Pattern::new("*.rs").unwrap(), Glob::new("*.rs").unwrap().compile_matcher(),
glob::Pattern::new("*.odd").unwrap() Glob::new("*.odd").unwrap().compile_matcher()
], ],
), ),
cx cx


@ -1,6 +1,7 @@
use aho_corasick::{AhoCorasick, AhoCorasickBuilder}; use aho_corasick::{AhoCorasick, AhoCorasickBuilder};
use anyhow::Result; use anyhow::Result;
use client::proto; use client::proto;
use globset::{Glob, GlobMatcher};
use itertools::Itertools; use itertools::Itertools;
use language::{char_kind, Rope}; use language::{char_kind, Rope};
use regex::{Regex, RegexBuilder}; use regex::{Regex, RegexBuilder};
@ -19,8 +20,8 @@ pub enum SearchQuery {
query: Arc<str>, query: Arc<str>,
whole_word: bool, whole_word: bool,
case_sensitive: bool, case_sensitive: bool,
files_to_include: Vec<glob::Pattern>, files_to_include: Vec<GlobMatcher>,
files_to_exclude: Vec<glob::Pattern>, files_to_exclude: Vec<GlobMatcher>,
}, },
Regex { Regex {
regex: Regex, regex: Regex,
@ -28,8 +29,8 @@ pub enum SearchQuery {
multiline: bool, multiline: bool,
whole_word: bool, whole_word: bool,
case_sensitive: bool, case_sensitive: bool,
files_to_include: Vec<glob::Pattern>, files_to_include: Vec<GlobMatcher>,
files_to_exclude: Vec<glob::Pattern>, files_to_exclude: Vec<GlobMatcher>,
}, },
} }
@ -38,8 +39,8 @@ impl SearchQuery {
query: impl ToString, query: impl ToString,
whole_word: bool, whole_word: bool,
case_sensitive: bool, case_sensitive: bool,
files_to_include: Vec<glob::Pattern>, files_to_include: Vec<GlobMatcher>,
files_to_exclude: Vec<glob::Pattern>, files_to_exclude: Vec<GlobMatcher>,
) -> Self { ) -> Self {
let query = query.to_string(); let query = query.to_string();
let search = AhoCorasickBuilder::new() let search = AhoCorasickBuilder::new()
@ -60,8 +61,8 @@ impl SearchQuery {
query: impl ToString, query: impl ToString,
whole_word: bool, whole_word: bool,
case_sensitive: bool, case_sensitive: bool,
files_to_include: Vec<glob::Pattern>, files_to_include: Vec<GlobMatcher>,
files_to_exclude: Vec<glob::Pattern>, files_to_exclude: Vec<GlobMatcher>,
) -> Result<Self> { ) -> Result<Self> {
let mut query = query.to_string(); let mut query = query.to_string();
let initial_query = Arc::from(query.as_str()); let initial_query = Arc::from(query.as_str());
@ -95,40 +96,16 @@ impl SearchQuery {
message.query, message.query,
message.whole_word, message.whole_word,
message.case_sensitive, message.case_sensitive,
message deserialize_globs(&message.files_to_include)?,
.files_to_include deserialize_globs(&message.files_to_exclude)?,
.split(',')
.map(str::trim)
.filter(|glob_str| !glob_str.is_empty())
.map(|glob_str| glob::Pattern::new(glob_str))
.collect::<Result<_, _>>()?,
message
.files_to_exclude
.split(',')
.map(str::trim)
.filter(|glob_str| !glob_str.is_empty())
.map(|glob_str| glob::Pattern::new(glob_str))
.collect::<Result<_, _>>()?,
) )
} else { } else {
Ok(Self::text( Ok(Self::text(
message.query, message.query,
message.whole_word, message.whole_word,
message.case_sensitive, message.case_sensitive,
message deserialize_globs(&message.files_to_include)?,
.files_to_include deserialize_globs(&message.files_to_exclude)?,
.split(',')
.map(str::trim)
.filter(|glob_str| !glob_str.is_empty())
.map(|glob_str| glob::Pattern::new(glob_str))
.collect::<Result<_, _>>()?,
message
.files_to_exclude
.split(',')
.map(str::trim)
.filter(|glob_str| !glob_str.is_empty())
.map(|glob_str| glob::Pattern::new(glob_str))
.collect::<Result<_, _>>()?,
)) ))
} }
} }
@ -143,12 +120,12 @@ impl SearchQuery {
files_to_include: self files_to_include: self
.files_to_include() .files_to_include()
.iter() .iter()
.map(ToString::to_string) .map(|g| g.glob().to_string())
.join(","), .join(","),
files_to_exclude: self files_to_exclude: self
.files_to_exclude() .files_to_exclude()
.iter() .iter()
.map(ToString::to_string) .map(|g| g.glob().to_string())
.join(","), .join(","),
} }
} }
@ -289,7 +266,7 @@ impl SearchQuery {
matches!(self, Self::Regex { .. }) matches!(self, Self::Regex { .. })
} }
pub fn files_to_include(&self) -> &[glob::Pattern] { pub fn files_to_include(&self) -> &[GlobMatcher] {
match self { match self {
Self::Text { Self::Text {
files_to_include, .. files_to_include, ..
@ -300,7 +277,7 @@ impl SearchQuery {
} }
} }
pub fn files_to_exclude(&self) -> &[glob::Pattern] { pub fn files_to_exclude(&self) -> &[GlobMatcher] {
match self { match self {
Self::Text { Self::Text {
files_to_exclude, .. files_to_exclude, ..
@ -317,14 +294,23 @@ impl SearchQuery {
!self !self
.files_to_exclude() .files_to_exclude()
.iter() .iter()
.any(|exclude_glob| exclude_glob.matches_path(file_path)) .any(|exclude_glob| exclude_glob.is_match(file_path))
&& (self.files_to_include().is_empty() && (self.files_to_include().is_empty()
|| self || self
.files_to_include() .files_to_include()
.iter() .iter()
.any(|include_glob| include_glob.matches_path(file_path))) .any(|include_glob| include_glob.is_match(file_path)))
} }
None => self.files_to_include().is_empty(), None => self.files_to_include().is_empty(),
} }
} }
} }
fn deserialize_globs(glob_set: &str) -> Result<Vec<GlobMatcher>> {
glob_set
.split(',')
.map(str::trim)
.filter(|glob_str| !glob_str.is_empty())
.map(|glob_str| Ok(Glob::new(glob_str)?.compile_matcher()))
.collect()
}
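The include/exclude lists now travel over the wire as a comma-separated string of the original glob sources, which deserialize_globs parses back into matchers. A small round-trip sketch reusing the helper above; the patterns are illustrative:

// Round-trip sketch for the comma-separated glob wire format used by
// SearchQuery::to_proto and deserialize_globs.
use anyhow::Result;
use globset::{Glob, GlobMatcher};

fn deserialize_globs(glob_set: &str) -> Result<Vec<GlobMatcher>> {
    glob_set
        .split(',')
        .map(str::trim)
        .filter(|glob_str| !glob_str.is_empty())
        .map(|glob_str| Ok(Glob::new(glob_str)?.compile_matcher()))
        .collect()
}

fn main() -> Result<()> {
    let matchers = deserialize_globs("*.rs, *.odd")?;
    assert!(matchers.iter().any(|m| m.is_match("search.rs")));
    assert!(!matchers.iter().any(|m| m.is_match("search.ts")));

    // Serializing back recovers the original pattern text.
    let joined = matchers
        .iter()
        .map(|m| m.glob().to_string())
        .collect::<Vec<_>>()
        .join(",");
    assert_eq!(joined, "*.rs,*.odd");
    Ok(())
}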

File diff suppressed because it is too large


@ -6,7 +6,7 @@ use gpui::{
actions, actions,
anyhow::{anyhow, Result}, anyhow::{anyhow, Result},
elements::{ elements::{
AnchorCorner, ChildView, ComponentHost, ContainerStyle, Empty, Flex, MouseEventHandler, AnchorCorner, ChildView, ContainerStyle, Empty, Flex, Label, MouseEventHandler,
ParentElement, ScrollTarget, Stack, Svg, UniformList, UniformListState, ParentElement, ScrollTarget, Stack, Svg, UniformList, UniformListState,
}, },
geometry::vector::Vector2F, geometry::vector::Vector2F,
@ -29,7 +29,7 @@ use std::{
path::Path, path::Path,
sync::Arc, sync::Arc,
}; };
use theme::{ui::FileName, ProjectPanelEntry}; use theme::ProjectPanelEntry;
use unicase::UniCase; use unicase::UniCase;
use workspace::Workspace; use workspace::Workspace;
@ -1011,14 +1011,11 @@ impl ProjectPanel {
.unwrap_or(&[]); .unwrap_or(&[]);
let entry_range = range.start.saturating_sub(ix)..end_ix - ix; let entry_range = range.start.saturating_sub(ix)..end_ix - ix;
for entry in &visible_worktree_entries[entry_range] { for (entry, repo) in
let path = &entry.path; snapshot.entries_with_repositories(visible_worktree_entries[entry_range].iter())
{
let status = (entry.path.parent().is_some() && !entry.is_ignored) let status = (entry.path.parent().is_some() && !entry.is_ignored)
.then(|| { .then(|| repo.and_then(|repo| repo.status_for_path(&snapshot, &entry.path)))
snapshot
.repo_for(path)
.and_then(|entry| entry.status_for_path(&snapshot, path))
})
.flatten(); .flatten();
let mut details = EntryDetails { let mut details = EntryDetails {
@ -1083,6 +1080,17 @@ impl ProjectPanel {
let kind = details.kind; let kind = details.kind;
let show_editor = details.is_editing && !details.is_processing; let show_editor = details.is_editing && !details.is_processing;
let mut filename_text_style = style.text.clone();
filename_text_style.color = details
.git_status
.as_ref()
.map(|status| match status {
GitFileStatus::Added => style.status.git.inserted,
GitFileStatus::Modified => style.status.git.modified,
GitFileStatus::Conflict => style.status.git.conflict,
})
.unwrap_or(style.text.color);
Flex::row() Flex::row()
.with_child( .with_child(
if kind == EntryKind::Dir { if kind == EntryKind::Dir {
@ -1110,16 +1118,12 @@ impl ProjectPanel {
.flex(1.0, true) .flex(1.0, true)
.into_any() .into_any()
} else { } else {
ComponentHost::new(FileName::new( Label::new(details.filename.clone(), filename_text_style)
details.filename.clone(), .contained()
details.git_status, .with_margin_left(style.icon_spacing)
FileName::style(style.text.clone(), &cx.global::<Settings>().theme), .aligned()
)) .left()
.contained() .into_any()
.with_margin_left(style.icon_spacing)
.aligned()
.left()
.into_any()
}) })
.constrained() .constrained()
.with_height(style.height) .with_height(style.height)
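The project panel now colors file names inline from the entry's git status instead of routing through the removed FileName component. A standalone sketch of that status-to-color fallback; the Color type and palette here are stand-ins, not the real theme types:

// Stand-in sketch of the filename coloring above: use the git status color
// when a status exists, otherwise fall back to the entry's normal text color.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Color(&'static str);

enum GitFileStatus {
    Added,
    Modified,
    Conflict,
}

struct GitStatusColors {
    inserted: Color,
    modified: Color,
    conflict: Color,
}

fn filename_color(status: Option<&GitFileStatus>, git: &GitStatusColors, default: Color) -> Color {
    status
        .map(|status| match status {
            GitFileStatus::Added => git.inserted,
            GitFileStatus::Modified => git.modified,
            GitFileStatus::Conflict => git.conflict,
        })
        .unwrap_or(default)
}

fn main() {
    let palette = GitStatusColors {
        inserted: Color("green"),
        modified: Color("yellow"),
        conflict: Color("red"),
    };
    assert_eq!(
        filename_color(Some(&GitFileStatus::Modified), &palette, Color("plain")),
        Color("yellow")
    );
    assert_eq!(filename_color(None, &palette, Color("plain")), Color("plain"));
}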


@ -27,7 +27,7 @@ serde.workspace = true
serde_derive.workspace = true serde_derive.workspace = true
smallvec.workspace = true smallvec.workspace = true
smol.workspace = true smol.workspace = true
glob.workspace = true globset.workspace = true
[dev-dependencies] [dev-dependencies]
editor = { path = "../editor", features = ["test-support"] } editor = { path = "../editor", features = ["test-support"] }


@ -2,12 +2,14 @@ use crate::{
SearchOption, SelectNextMatch, SelectPrevMatch, ToggleCaseSensitive, ToggleRegex, SearchOption, SelectNextMatch, SelectPrevMatch, ToggleCaseSensitive, ToggleRegex,
ToggleWholeWord, ToggleWholeWord,
}; };
use anyhow::Result;
use collections::HashMap; use collections::HashMap;
use editor::{ use editor::{
items::active_match_index, scroll::autoscroll::Autoscroll, Anchor, Editor, MultiBuffer, items::active_match_index, scroll::autoscroll::Autoscroll, Anchor, Editor, MultiBuffer,
SelectAll, MAX_TAB_TITLE_LEN, SelectAll, MAX_TAB_TITLE_LEN,
}; };
use futures::StreamExt; use futures::StreamExt;
use globset::{Glob, GlobMatcher};
use gpui::{ use gpui::{
actions, actions,
elements::*, elements::*,
@ -47,7 +49,7 @@ pub fn init(cx: &mut AppContext) {
cx.add_action(ProjectSearchBar::search_in_new); cx.add_action(ProjectSearchBar::search_in_new);
cx.add_action(ProjectSearchBar::select_next_match); cx.add_action(ProjectSearchBar::select_next_match);
cx.add_action(ProjectSearchBar::select_prev_match); cx.add_action(ProjectSearchBar::select_prev_match);
cx.add_action(ProjectSearchBar::toggle_focus); cx.add_action(ProjectSearchBar::move_focus_to_results);
cx.capture_action(ProjectSearchBar::tab); cx.capture_action(ProjectSearchBar::tab);
cx.capture_action(ProjectSearchBar::tab_previous); cx.capture_action(ProjectSearchBar::tab_previous);
add_toggle_option_action::<ToggleCaseSensitive>(SearchOption::CaseSensitive, cx); add_toggle_option_action::<ToggleCaseSensitive>(SearchOption::CaseSensitive, cx);
@ -572,46 +574,30 @@ impl ProjectSearchView {
fn build_search_query(&mut self, cx: &mut ViewContext<Self>) -> Option<SearchQuery> { fn build_search_query(&mut self, cx: &mut ViewContext<Self>) -> Option<SearchQuery> {
let text = self.query_editor.read(cx).text(cx); let text = self.query_editor.read(cx).text(cx);
let included_files = match self let included_files =
.included_files_editor match Self::load_glob_set(&self.included_files_editor.read(cx).text(cx)) {
.read(cx) Ok(included_files) => {
.text(cx) self.panels_with_errors.remove(&InputPanel::Include);
.split(',') included_files
.map(str::trim) }
.filter(|glob_str| !glob_str.is_empty()) Err(_e) => {
.map(|glob_str| glob::Pattern::new(glob_str)) self.panels_with_errors.insert(InputPanel::Include);
.collect::<Result<_, _>>() cx.notify();
{ return None;
Ok(included_files) => { }
self.panels_with_errors.remove(&InputPanel::Include); };
included_files let excluded_files =
} match Self::load_glob_set(&self.excluded_files_editor.read(cx).text(cx)) {
Err(_e) => { Ok(excluded_files) => {
self.panels_with_errors.insert(InputPanel::Include); self.panels_with_errors.remove(&InputPanel::Exclude);
cx.notify(); excluded_files
return None; }
} Err(_e) => {
}; self.panels_with_errors.insert(InputPanel::Exclude);
let excluded_files = match self cx.notify();
.excluded_files_editor return None;
.read(cx) }
.text(cx) };
.split(',')
.map(str::trim)
.filter(|glob_str| !glob_str.is_empty())
.map(|glob_str| glob::Pattern::new(glob_str))
.collect::<Result<_, _>>()
{
Ok(excluded_files) => {
self.panels_with_errors.remove(&InputPanel::Exclude);
excluded_files
}
Err(_e) => {
self.panels_with_errors.insert(InputPanel::Exclude);
cx.notify();
return None;
}
};
if self.regex { if self.regex {
match SearchQuery::regex( match SearchQuery::regex(
text, text,
@ -641,6 +627,14 @@ impl ProjectSearchView {
} }
} }
fn load_glob_set(text: &str) -> Result<Vec<GlobMatcher>> {
text.split(',')
.map(str::trim)
.filter(|glob_str| !glob_str.is_empty())
.map(|glob_str| anyhow::Ok(Glob::new(glob_str)?.compile_matcher()))
.collect()
}
fn select_match(&mut self, direction: Direction, cx: &mut ViewContext<Self>) { fn select_match(&mut self, direction: Direction, cx: &mut ViewContext<Self>) {
if let Some(index) = self.active_match_index { if let Some(index) = self.active_match_index {
let match_ranges = self.model.read(cx).match_ranges.clone(); let match_ranges = self.model.read(cx).match_ranges.clone();
@ -801,18 +795,16 @@ impl ProjectSearchBar {
} }
} }
fn toggle_focus(pane: &mut Pane, _: &ToggleFocus, cx: &mut ViewContext<Pane>) { fn move_focus_to_results(pane: &mut Pane, _: &ToggleFocus, cx: &mut ViewContext<Pane>) {
if let Some(search_view) = pane if let Some(search_view) = pane
.active_item() .active_item()
.and_then(|item| item.downcast::<ProjectSearchView>()) .and_then(|item| item.downcast::<ProjectSearchView>())
{ {
search_view.update(cx, |search_view, cx| { search_view.update(cx, |search_view, cx| {
if search_view.query_editor.is_focused(cx) { if search_view.query_editor.is_focused(cx)
if !search_view.model.read(cx).match_ranges.is_empty() { && !search_view.model.read(cx).match_ranges.is_empty()
search_view.focus_results_editor(cx); {
} search_view.focus_results_editor(cx);
} else {
search_view.focus_query_editor(cx);
} }
}); });
} else { } else {


@ -46,6 +46,7 @@ pub struct Settings {
pub hover_popover_enabled: bool, pub hover_popover_enabled: bool,
pub show_completions_on_input: bool, pub show_completions_on_input: bool,
pub show_call_status_icon: bool, pub show_call_status_icon: bool,
pub scrollbar: Scrollbar,
pub vim_mode: bool, pub vim_mode: bool,
pub autosave: Autosave, pub autosave: Autosave,
pub default_dock_anchor: DockAnchor, pub default_dock_anchor: DockAnchor,
@ -68,6 +69,22 @@ pub struct Settings {
pub base_keymap: BaseKeymap, pub base_keymap: BaseKeymap,
} }
#[derive(Copy, Clone, Debug, Serialize, Deserialize, JsonSchema, PartialEq, Eq, Default)]
pub struct Scrollbar {
pub show: Option<ShowScrollbar>,
pub git_diff: Option<bool>,
}
#[derive(Copy, Clone, Debug, Serialize, Deserialize, JsonSchema, PartialEq, Eq, Default)]
#[serde(rename_all = "snake_case")]
pub enum ShowScrollbar {
#[default]
Auto,
System,
Always,
Never,
}
#[derive(Copy, Clone, Debug, Serialize, Deserialize, JsonSchema, PartialEq, Eq, Default)] #[derive(Copy, Clone, Debug, Serialize, Deserialize, JsonSchema, PartialEq, Eq, Default)]
pub enum BaseKeymap { pub enum BaseKeymap {
#[default] #[default]
@ -390,6 +407,8 @@ pub struct SettingsFileContent {
#[serde(default)] #[serde(default)]
pub active_pane_magnification: Option<f32>, pub active_pane_magnification: Option<f32>,
#[serde(default)] #[serde(default)]
pub scrollbar: Option<Scrollbar>,
#[serde(default)]
pub cursor_blink: Option<bool>, pub cursor_blink: Option<bool>,
#[serde(default)] #[serde(default)]
pub confirm_quit: Option<bool>, pub confirm_quit: Option<bool>,
@ -547,6 +566,7 @@ impl Settings {
features: Features { features: Features {
copilot: defaults.features.copilot.unwrap(), copilot: defaults.features.copilot.unwrap(),
}, },
scrollbar: defaults.scrollbar.unwrap(),
} }
} }
@ -598,6 +618,7 @@ impl Settings {
merge(&mut self.autosave, data.autosave); merge(&mut self.autosave, data.autosave);
merge(&mut self.default_dock_anchor, data.default_dock_anchor); merge(&mut self.default_dock_anchor, data.default_dock_anchor);
merge(&mut self.base_keymap, data.base_keymap); merge(&mut self.base_keymap, data.base_keymap);
merge(&mut self.scrollbar, data.scrollbar);
merge(&mut self.features.copilot, data.features.copilot); merge(&mut self.features.copilot, data.features.copilot);
if let Some(copilot) = data.copilot { if let Some(copilot) = data.copilot {
@ -830,6 +851,7 @@ impl Settings {
auto_update: true, auto_update: true,
base_keymap: Default::default(), base_keymap: Default::default(),
features: Features { copilot: true }, features: Features { copilot: true },
scrollbar: Default::default(),
} }
} }
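The new scrollbar block deserializes with serde like the rest of the settings file. A minimal sketch of the JSON shape it accepts, using standalone copies of the two new types (the Serialize/JsonSchema derives and the merge logic are omitted here):

// Standalone copies of the new settings types, showing the JSON accepted
// under the "scrollbar" key. Schema derivation and merging are omitted.
use serde::Deserialize;

#[derive(Copy, Clone, Debug, Deserialize, PartialEq, Eq, Default)]
struct Scrollbar {
    show: Option<ShowScrollbar>,
    git_diff: Option<bool>,
}

#[derive(Copy, Clone, Debug, Deserialize, PartialEq, Eq, Default)]
#[serde(rename_all = "snake_case")]
enum ShowScrollbar {
    #[default]
    Auto,
    System,
    Always,
    Never,
}

fn main() {
    let parsed: Scrollbar =
        serde_json::from_str(r#"{ "show": "always", "git_diff": true }"#).unwrap();
    assert_eq!(parsed.show, Some(ShowScrollbar::Always));
    assert_eq!(parsed.git_diff, Some(true));
}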


@ -1783,6 +1783,19 @@ impl BufferSnapshot {
where where
D: 'a + TextDimension, D: 'a + TextDimension,
A: 'a + IntoIterator<Item = &'a Anchor>, A: 'a + IntoIterator<Item = &'a Anchor>,
{
let anchors = anchors.into_iter();
self.summaries_for_anchors_with_payload::<D, _, ()>(anchors.map(|a| (a, ())))
.map(|d| d.0)
}
pub fn summaries_for_anchors_with_payload<'a, D, A, T>(
&'a self,
anchors: A,
) -> impl 'a + Iterator<Item = (D, T)>
where
D: 'a + TextDimension,
A: 'a + IntoIterator<Item = (&'a Anchor, T)>,
{ {
let anchors = anchors.into_iter(); let anchors = anchors.into_iter();
let mut insertion_cursor = self.insertions.cursor::<InsertionFragmentKey>(); let mut insertion_cursor = self.insertions.cursor::<InsertionFragmentKey>();
@ -1790,11 +1803,11 @@ impl BufferSnapshot {
let mut text_cursor = self.visible_text.cursor(0); let mut text_cursor = self.visible_text.cursor(0);
let mut position = D::default(); let mut position = D::default();
anchors.map(move |anchor| { anchors.map(move |(anchor, payload)| {
if *anchor == Anchor::MIN { if *anchor == Anchor::MIN {
return D::default(); return (D::default(), payload);
} else if *anchor == Anchor::MAX { } else if *anchor == Anchor::MAX {
return D::from_text_summary(&self.visible_text.summary()); return (D::from_text_summary(&self.visible_text.summary()), payload);
} }
let anchor_key = InsertionFragmentKey { let anchor_key = InsertionFragmentKey {
@ -1825,7 +1838,7 @@ impl BufferSnapshot {
} }
position.add_assign(&text_cursor.summary(fragment_offset)); position.add_assign(&text_cursor.summary(fragment_offset));
position.clone() (position.clone(), payload)
}) })
} }


@ -425,6 +425,19 @@ pub struct ProjectPanelEntry {
pub icon_color: Color, pub icon_color: Color,
pub icon_size: f32, pub icon_size: f32,
pub icon_spacing: f32, pub icon_spacing: f32,
pub status: EntryStatus,
}
#[derive(Clone, Debug, Deserialize, Default)]
pub struct EntryStatus {
pub git: GitProjectStatus,
}
#[derive(Clone, Debug, Deserialize, Default)]
pub struct GitProjectStatus {
pub modified: Color,
pub inserted: Color,
pub conflict: Color,
} }
#[derive(Clone, Debug, Deserialize, Default)] #[derive(Clone, Debug, Deserialize, Default)]
@ -649,6 +662,14 @@ pub struct Scrollbar {
pub thumb: ContainerStyle, pub thumb: ContainerStyle,
pub width: f32, pub width: f32,
pub min_height_factor: f32, pub min_height_factor: f32,
pub git: GitDiffColors,
}
#[derive(Clone, Deserialize, Default)]
pub struct GitDiffColors {
pub inserted: Color,
pub modified: Color,
pub deleted: Color,
} }
#[derive(Clone, Deserialize, Default)] #[derive(Clone, Deserialize, Default)]


@ -1,10 +1,9 @@
use std::borrow::Cow; use std::borrow::Cow;
use fs::repository::GitFileStatus;
use gpui::{ use gpui::{
color::Color, color::Color,
elements::{ elements::{
ConstrainedBox, Container, ContainerStyle, Empty, Flex, KeystrokeLabel, Label, LabelStyle, ConstrainedBox, Container, ContainerStyle, Empty, Flex, KeystrokeLabel, Label,
MouseEventHandler, ParentElement, Stack, Svg, MouseEventHandler, ParentElement, Stack, Svg,
}, },
fonts::TextStyle, fonts::TextStyle,
@ -12,11 +11,11 @@ use gpui::{
platform, platform,
platform::MouseButton, platform::MouseButton,
scene::MouseClick, scene::MouseClick,
Action, AnyElement, Element, EventContext, MouseState, View, ViewContext, Action, Element, EventContext, MouseState, View, ViewContext,
}; };
use serde::Deserialize; use serde::Deserialize;
use crate::{ContainedText, Interactive, Theme}; use crate::{ContainedText, Interactive};
#[derive(Clone, Deserialize, Default)] #[derive(Clone, Deserialize, Default)]
pub struct CheckboxStyle { pub struct CheckboxStyle {
@ -253,53 +252,3 @@ where
.constrained() .constrained()
.with_height(style.dimensions().y()) .with_height(style.dimensions().y())
} }
pub struct FileName {
filename: String,
git_status: Option<GitFileStatus>,
style: FileNameStyle,
}
pub struct FileNameStyle {
template_style: LabelStyle,
git_inserted: Color,
git_modified: Color,
git_deleted: Color,
}
impl FileName {
pub fn new(filename: String, git_status: Option<GitFileStatus>, style: FileNameStyle) -> Self {
FileName {
filename,
git_status,
style,
}
}
pub fn style<I: Into<LabelStyle>>(style: I, theme: &Theme) -> FileNameStyle {
FileNameStyle {
template_style: style.into(),
git_inserted: theme.editor.diff.inserted,
git_modified: theme.editor.diff.modified,
git_deleted: theme.editor.diff.deleted,
}
}
}
impl<V: View> gpui::elements::Component<V> for FileName {
fn render(&self, _: &mut V, _: &mut ViewContext<V>) -> AnyElement<V> {
// Prepare colors for git statuses
let mut filename_text_style = self.style.template_style.text.clone();
filename_text_style.color = self
.git_status
.as_ref()
.map(|status| match status {
GitFileStatus::Added => self.style.git_inserted,
GitFileStatus::Modified => self.style.git_modified,
GitFileStatus::Conflict => self.style.git_deleted,
})
.unwrap_or(self.style.template_style.text.color);
Label::new(self.filename.clone(), filename_text_style).into_any()
}
}


@ -3,7 +3,7 @@ authors = ["Nathan Sobo <nathansobo@gmail.com>"]
description = "The fast, collaborative code editor." description = "The fast, collaborative code editor."
edition = "2021" edition = "2021"
name = "zed" name = "zed"
version = "0.87.0" version = "0.87.6"
publish = false publish = false
[lib] [lib]


@ -1 +1 @@
dev stable


@ -6,6 +6,8 @@ import hoverPopover from "./hoverPopover"
import { SyntaxHighlightStyle, buildSyntax } from "../themes/common/syntax" import { SyntaxHighlightStyle, buildSyntax } from "../themes/common/syntax"
export default function editor(colorScheme: ColorScheme) { export default function editor(colorScheme: ColorScheme) {
const { isLight } = colorScheme
let layer = colorScheme.highest let layer = colorScheme.highest
const autocompleteItem = { const autocompleteItem = {
@ -97,12 +99,18 @@ export default function editor(colorScheme: ColorScheme) {
foldBackground: foreground(layer, "variant"), foldBackground: foreground(layer, "variant"),
}, },
diff: { diff: {
deleted: foreground(layer, "negative"), deleted: isLight
modified: foreground(layer, "warning"), ? colorScheme.ramps.red(0.5).hex()
inserted: foreground(layer, "positive"), : colorScheme.ramps.red(0.4).hex(),
modified: isLight
? colorScheme.ramps.yellow(0.3).hex()
: colorScheme.ramps.yellow(0.5).hex(),
inserted: isLight
? colorScheme.ramps.green(0.4).hex()
: colorScheme.ramps.green(0.5).hex(),
removedWidthEm: 0.275, removedWidthEm: 0.275,
widthEm: 0.22, widthEm: 0.15,
cornerRadius: 0.2, cornerRadius: 0.05,
}, },
/** Highlights matching occurences of what is under the cursor /** Highlights matching occurences of what is under the cursor
* as well as matched brackets * as well as matched brackets
@ -234,12 +242,27 @@ export default function editor(colorScheme: ColorScheme) {
border: border(layer, "variant", { left: true }), border: border(layer, "variant", { left: true }),
}, },
thumb: { thumb: {
background: withOpacity(background(layer, "inverted"), 0.4), background: withOpacity(background(layer, "inverted"), 0.3),
border: { border: {
width: 1, width: 1,
color: borderColor(layer, "variant"), color: borderColor(layer, "variant"),
}, top: false,
right: true,
left: true,
bottom: false,
}
}, },
git: {
deleted: isLight
? withOpacity(colorScheme.ramps.red(0.5).hex(), 0.8)
: withOpacity(colorScheme.ramps.red(0.4).hex(), 0.8),
modified: isLight
? withOpacity(colorScheme.ramps.yellow(0.5).hex(), 0.8)
: withOpacity(colorScheme.ramps.yellow(0.4).hex(), 0.8),
inserted: isLight
? withOpacity(colorScheme.ramps.green(0.5).hex(), 0.8)
: withOpacity(colorScheme.ramps.green(0.4).hex(), 0.8),
}
}, },
compositionMark: { compositionMark: {
underline: { underline: {


@ -3,6 +3,8 @@ import { withOpacity } from "../utils/color"
import { background, border, foreground, text } from "./components" import { background, border, foreground, text } from "./components"
export default function projectPanel(colorScheme: ColorScheme) { export default function projectPanel(colorScheme: ColorScheme) {
const { isLight } = colorScheme
let layer = colorScheme.middle let layer = colorScheme.middle
let baseEntry = { let baseEntry = {
@ -12,6 +14,20 @@ export default function projectPanel(colorScheme: ColorScheme) {
iconSpacing: 8, iconSpacing: 8,
} }
let status = {
git: {
modified: isLight
? colorScheme.ramps.yellow(0.6).hex()
: colorScheme.ramps.yellow(0.5).hex(),
inserted: isLight
? colorScheme.ramps.green(0.45).hex()
: colorScheme.ramps.green(0.5).hex(),
conflict: isLight
? colorScheme.ramps.red(0.6).hex()
: colorScheme.ramps.red(0.5).hex()
}
}
let entry = { let entry = {
...baseEntry, ...baseEntry,
text: text(layer, "mono", "variant", { size: "sm" }), text: text(layer, "mono", "variant", { size: "sm" }),
@ -28,6 +44,7 @@ export default function projectPanel(colorScheme: ColorScheme) {
background: background(layer, "active"), background: background(layer, "active"),
text: text(layer, "mono", "active", { size: "sm" }), text: text(layer, "mono", "active", { size: "sm" }),
}, },
status
} }
return { return {
@ -62,6 +79,7 @@ export default function projectPanel(colorScheme: ColorScheme) {
text: text(layer, "mono", "on", { size: "sm" }), text: text(layer, "mono", "on", { size: "sm" }),
background: withOpacity(background(layer, "on"), 0.9), background: withOpacity(background(layer, "on"), 0.9),
border: border(layer), border: border(layer),
status
}, },
ignoredEntry: { ignoredEntry: {
...entry, ...entry,