Compare commits

17 commits

Author SHA1 Message Date
Joseph T. Lyons
e6468f9da9 zed 0.111.6 2023-11-09 14:19:37 -05:00
Max Brunsfeld
48b8178f46 Use normal JS comments within JSX tags and JSX expression blocks (#3290)
This fix only required changing the `overrides` queries for JavaScript
and TSX. I've made the fix in both the `zed2` and `zed` crates.

Release Notes:

- Fixed an issue in JavaScript and TSX files where the 'toggle
comments' command used the wrong comment syntax inside JSX tags and
inside expressions within JSX.
2023-11-09 14:17:45 -05:00
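
A standalone sketch of the behavior being fixed: which comment syntax applies depends on the language scope at the cursor. The `JsScope` enum and `comment_syntax_for` function below are invented for illustration only; in Zed the decision comes from the `overrides` query results (`line_comment_prefix`, `block_comment_delimiters`) exercised in the buffer tests further down this diff.

```rust
// Illustrative sketch, not Zed's actual API.
#[derive(Debug, Clone, Copy)]
enum JsScope {
    /// Plain JS/TS code, JSX tag attributes, and JSX expression blocks.
    Default,
    /// Directly between JSX tags, e.g. between `<C ...>` and `</C>`.
    JsxElementBody,
}

fn comment_syntax_for(
    scope: JsScope,
) -> (Option<&'static str>, Option<(&'static str, &'static str)>) {
    match scope {
        // Normal JS comments apply; the block form here is the usual JS
        // `/* */` (an assumption, the tests below only check the line prefix).
        JsScope::Default => (Some("// "), Some(("/*", "*/"))),
        // Only the JSX block-comment form is valid between tags.
        JsScope::JsxElementBody => (None, Some(("{/*", "*/}"))),
    }
}

fn main() {
    for scope in [JsScope::Default, JsScope::JsxElementBody] {
        let (line, block) = comment_syntax_for(scope);
        println!("{scope:?}: line = {line:?}, block = {block:?}");
    }
}
```
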
Kirill Bulatov
a67a283bdf Do not use prettier for formatting node_modules/** files (#3286)
Fixes the following feedback from #influencers:

> the most annoying thing i'm running into right now is that when i'm
patching something inside node_modules, Zed tries to pretty-format it
according to my prettier config. this messes up the patch because it has
formatting changes now. i need the pretty formatting on save to be off
inside node_modules, that never makes sense

Do note, though, that language servers will still format files inside
node_modules; at least it's not prettier anymore.
VSCode seems to format node_modules/** files via language servers too,
so that seems OK for now, and the remaining feedback could be addressed
later:

> "project diagnostics" (eslint) seem to be running inside node_modules,
e.g. i'm seeing 3182 "errors" in my project. that doesn't make sense and
probably wastes resources in addition to being annoying

Release Notes:

- Fixed prettier formatting files inside node_modules
2023-11-09 11:19:38 -05:00
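
A minimal standalone sketch of the core check, mirroring the component walk in the prettier changes further down: anything whose path contains a `node_modules` component is skipped for prettier-on-save. The helper name is made up for this example.

```rust
use std::ffi::OsStr;
use std::path::Path;

// Hypothetical helper; Zed performs an equivalent check inside
// `locate_prettier_installation` (see the prettier diff below).
fn is_inside_node_modules(path: &Path) -> bool {
    path.components()
        .any(|component| component.as_os_str() == OsStr::new("node_modules"))
}

fn main() {
    assert!(is_inside_node_modules(Path::new(
        "/project/node_modules/expect/build/print.js"
    )));
    assert!(!is_inside_node_modules(Path::new("/project/src/index.js")));
    println!("node_modules paths are excluded from prettier formatting");
}
```
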
Joseph T. Lyons
5ef021d200 zed 0.111.5 2023-11-09 10:19:57 -05:00
Julia
a86daacf5f Get tsserver running again 2023-11-09 10:16:27 -05:00
Joseph T. Lyons
9cbe4f2a71 v0.111.x stable 2023-11-08 10:53:12 -05:00
Kirill Bulatov
88bbe748e9 Properly toggle diagnostics (#3243)
Follow-up of https://github.com/zed-industries/zed/pull/3236 that fixes
a bug introduced in that PR: diagnostics warning toggle stopped working.

Release Notes:

- N/A
2023-11-06 19:37:35 +02:00
Joseph T. Lyons
5d7604f100 zed 0.111.4 2023-11-06 10:29:47 -05:00
Piotr Osiewicz
e296c53d32 chore: Update vue.js parser (fixes wonky HTML parsing) (#3238)
The Vue.js grammar defined a bunch of symbols in its scanner that collided
with those defined in the HTML Tree-sitter grammar. I simply removed them,
as they were meant for consumption by external parties interested in an
HTML parser with Vue support; since we handle that ourselves, there is no
need to preserve them anymore. cc was also emitting a bunch of warnings
about unused symbols once I marked those functions as `static`.
Release Notes:

- Fixed HTML highlighting breaking in the presence of <!-- --> comments
(fixes zed-industries/community#2166).
2023-11-06 10:28:49 -05:00
Kirill Bulatov
477a2a4f1b zed 0.111.3 2023-11-05 16:19:59 +02:00
Kirill Bulatov
e986a93535 More heuristics for diagnostics updates (#3236)
Follow-up of https://github.com/zed-industries/zed/pull/3225
That PR made every `project::Event::DiskBasedDiagnosticsFinished` update
the diagnostics, which turned out to be bad: Zed queries for more
diagnostics after every excerpt update, and that seems to be due to the
`Event::Edited` events emitted by the multibuffers created in the
diagnostics panel.

* Now, instead of eagerly updating the diagnostics every time, only do
so if the panel has at most one caret placed and no changes have been
made in the panel yet.
Otherwise, keep the previous approach and register the updated paths so
their update can be deferred.

* On every `update_excerpts` in the diagnostics panel, query the entire
diagnostics summary (and store it for future comparisons), compare the
old and new summaries, and re-query diagnostics for every path that is
not in both summaries.
Also query every path that was registered during the
`DiskBasedDiagnosticsFinished` updates and was not eagerly updated
before.

This way we should pick up all new diagnostics (for newly added paths)
and re-check all old paths that might now have stale diagnostics.

* Do the diagnostics rechecks concurrently for every path, speeding up
the overall process.

Release Notes:

- Fixed diagnostics triggering too eagerly during multicaret edits and
certain stale diagnostics not being removed in time
2023-11-05 16:19:07 +02:00
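
A minimal sketch of the summary-comparison step from the second bullet above, using plain collections as stand-ins for Zed's `LanguageServerId` and `ProjectPath` types (the helper name is illustrative): every path present in only one of the two snapshots gets its diagnostics re-queried.

```rust
use std::collections::{HashMap, HashSet};

type ServerId = usize; // stand-in for LanguageServerId
type ProjectPath = String; // stand-in for the real ProjectPath type

// Any path added or removed between the old and new summaries needs a recheck.
fn paths_to_recheck(
    old: &HashMap<ServerId, HashSet<ProjectPath>>,
    new: &HashMap<ServerId, HashSet<ProjectPath>>,
) -> HashSet<ProjectPath> {
    let mut recheck = HashSet::new();
    for (server_id, new_paths) in new {
        match old.get(server_id) {
            // Symmetric difference: paths added or removed for this server.
            Some(old_paths) => recheck.extend(new_paths.symmetric_difference(old_paths).cloned()),
            None => recheck.extend(new_paths.iter().cloned()),
        }
    }
    // Servers that disappeared entirely: all of their old paths are stale.
    for (server_id, old_paths) in old {
        if !new.contains_key(server_id) {
            recheck.extend(old_paths.iter().cloned());
        }
    }
    recheck
}

fn main() {
    let old = HashMap::from([(0, HashSet::from(["a.rs".to_string(), "b.rs".to_string()]))]);
    let new = HashMap::from([(0, HashSet::from(["b.rs".to_string(), "c.rs".to_string()]))]);
    let mut paths: Vec<_> = paths_to_recheck(&old, &new).into_iter().collect();
    paths.sort();
    println!("{paths:?}"); // ["a.rs", "c.rs"]
}
```
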
Kirill Bulatov
acb90ba336 zed 0.111.2 2023-11-03 22:20:46 +02:00
Kirill Bulatov
6780e80bf7 Refresh diagnostics inside the tab (#3225)
rust-analyzer (r-a) now has two different types of diagnostics:
* "disk-based" ones that come from `cargo check` and related commands,
which emit `project::Event::DiskBasedDiagnosticsStarted` and
`DiskBasedDiagnosticsFinished`
* "flycheck" diagnostics from r-a itself, which it tries to dynamically
apply to every open buffer and which arrive with the `DiagnosticsUpdated` event.

The latter diagnostics update frequently, on every file close and open,
but the `diagnostics.rs` logic never polled for new diagnostics after
registering the `DiagnosticsUpdated` event, so the only way to get newer
diagnostics was to re-open the whole panel.
This PR fixes that and also adds more debug logging to the module.
The logic of the fix is very similar to a previous related fix:
https://github.com/zed-industries/zed/pull/3128

One notable thing after the fix: "flycheck" diagnostics stay around for as
long as the diagnostics panel is open, because the excerpts in that panel
do not allow the buffer to be dropped (and hence closed, in r-a's terms)
and to receive the updated, zero diagnostics.
If the diagnostics panel is opened and closed multiple times, those
errors gradually disappear.

Release Notes:

- Fixed diagnostics panel not refreshing its contents properly
2023-11-03 22:15:53 +02:00
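
A simplified sketch of how the two event kinds described above can be routed, with stub types standing in for `project::Event` and the panel state (not Zed's actual API): disk-based runs trigger an immediate excerpt refresh, while per-path `DiagnosticsUpdated` events are recorded so the next refresh re-queries them instead of requiring the panel to be reopened.

```rust
use std::collections::HashSet;

// Simplified stand-ins; the real types live in Zed's `project` and
// `diagnostics` crates.
enum Event {
    DiskBasedDiagnosticsFinished { language_server_id: usize },
    DiagnosticsUpdated { language_server_id: usize, path: String },
}

#[derive(Default)]
struct DiagnosticsPanel {
    paths_to_update: HashSet<(usize, String)>,
}

impl DiagnosticsPanel {
    fn handle_event(&mut self, event: Event) {
        match event {
            // "Disk-based" diagnostics (e.g. from `cargo check`) are complete:
            // refresh the excerpts right away.
            Event::DiskBasedDiagnosticsFinished { language_server_id } => {
                self.update_excerpts(Some(language_server_id));
            }
            // "Flycheck"-style updates arrive per path; remember the path for
            // the next refresh.
            Event::DiagnosticsUpdated { language_server_id, path } => {
                self.paths_to_update.insert((language_server_id, path));
            }
        }
    }

    fn update_excerpts(&mut self, server: Option<usize>) {
        println!("re-querying diagnostics for server {server:?}");
        // Pending paths for this server are now covered by the refresh.
        self.paths_to_update.retain(|(id, _)| Some(*id) != server);
    }
}

fn main() {
    let mut panel = DiagnosticsPanel::default();
    panel.handle_event(Event::DiagnosticsUpdated {
        language_server_id: 0,
        path: "src/main.rs".into(),
    });
    panel.handle_event(Event::DiskBasedDiagnosticsFinished { language_server_id: 0 });
}
```
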
Kirill Bulatov
7e17603059 Detect prettier in npm workspaces correctly (#3219)
Deals with https://github.com/zed-industries/community/issues/2016

Also refactored the project code to spawn fewer default prettier instances.

Release Notes:

- Fixed prettier not working in npm workspaces
2023-11-03 22:15:49 +02:00
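
A standalone sketch of the workspace-membership check: given the root package.json `workspaces` globs, decide whether a subproject path is covered and may therefore use the prettier installed in the workspace root's node_modules. This sketch uses the `globset` crate directly; Zed's diff goes through its own `util::paths::PathMatcher` wrapper, and the helper name here is illustrative only.

```rust
use std::path::Path;

use globset::Glob; // assumes `globset` is available as a dependency

fn subproject_in_workspaces(workspaces: &[&str], subproject: &Path) -> bool {
    workspaces.iter().copied().any(|pattern| match Glob::new(pattern) {
        Ok(glob) => glob.compile_matcher().is_match(subproject),
        // Fall back to exact comparison for non-glob entries.
        Err(_) => Path::new(pattern) == subproject,
    })
}

fn main() {
    let workspaces = ["exercises/*/*", "examples/*"];
    assert!(subproject_in_workspaces(
        &workspaces,
        Path::new("exercises/03.loading/01.problem.loader"),
    ));
    assert!(!subproject_in_workspaces(&workspaces, Path::new("tools/scripts")));
    println!("workspace globs matched as expected");
}
```
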
Joseph T. Lyons
d2c0c4eac4 zed 0.111.1 2023-11-02 10:50:20 -04:00
Kyle Caverly
80469283ee authenticate with completion provider on new inline assists (#3209)

Release Notes:

- Fixed a bug that led the inline assist functionality to never
authenticate
2023-11-02 10:40:10 -04:00
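
A tiny sketch of the shape of the fix, with stub types in place of Zed's `CompletionProvider` API: the inline assist clones the panel's shared provider handle and authenticates it up front instead of constructing a fresh, unauthenticated provider.

```rust
use std::sync::Arc;

// Stub trait standing in for Zed's CompletionProvider; only the part relevant
// to the fix (credential retrieval) is modeled here.
trait CompletionProvider {
    fn retrieve_credentials(&self);
}

struct OpenAiProvider;
impl CompletionProvider for OpenAiProvider {
    fn retrieve_credentials(&self) {
        println!("loading API credentials");
    }
}

struct AssistantPanel {
    // Shared handle, so the panel and every inline assist use the same provider.
    completion_provider: Arc<dyn CompletionProvider>,
}

impl AssistantPanel {
    fn new_inline_assist(&self) {
        let provider = self.completion_provider.clone();
        // The fix: authenticate the provider when the inline assist is created.
        provider.retrieve_credentials();
        // ...hand `provider` to the codegen task...
    }
}

fn main() {
    let panel = AssistantPanel {
        completion_provider: Arc::new(OpenAiProvider),
    };
    panel.new_inline_assist();
}
```
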
Joseph T. Lyons
463901820c v0.111.x preview 2023-11-01 12:34:08 -04:00
34 changed files with 2473 additions and 1047 deletions

Cargo.lock (generated)

@@ -2424,8 +2424,10 @@ dependencies = [
  "client",
  "collections",
  "editor",
+ "futures 0.3.28",
  "gpui",
  "language",
+ "log",
  "lsp",
  "postage",
  "project",
@@ -7510,7 +7512,6 @@ dependencies = [
  "collections",
  "editor",
  "futures 0.3.28",
- "globset",
  "gpui",
  "language",
  "log",
@@ -9546,7 +9547,7 @@ dependencies = [
 [[package]]
 name = "tree-sitter-vue"
 version = "0.0.1"
-source = "git+https://github.com/zed-industries/tree-sitter-vue?rev=95b2890#95b28908d90e928c308866f7631e73ef6e1d4b5f"
+source = "git+https://github.com/zed-industries/tree-sitter-vue?rev=9b6cb221ccb8d0b956fcb17e9a1efac2feefeb58#9b6cb221ccb8d0b956fcb17e9a1efac2feefeb58"
 dependencies = [
  "cc",
  "tree-sitter",
@@ -9794,6 +9795,7 @@ dependencies = [
  "dirs 3.0.2",
  "futures 0.3.28",
  "git2",
+ "globset",
  "isahc",
  "lazy_static",
  "log",
@@ -10821,7 +10823,7 @@ dependencies = [
 [[package]]
 name = "zed"
-version = "0.111.0"
+version = "0.111.6"
 dependencies = [
  "activity_indicator",
  "ai",


@@ -175,7 +175,7 @@ tree-sitter-yaml = { git = "https://github.com/zed-industries/tree-sitter-yaml",
 tree-sitter-lua = "0.0.14"
 tree-sitter-nix = { git = "https://github.com/nix-community/tree-sitter-nix", rev = "66e3e9ce9180ae08fc57372061006ef83f0abde7" }
 tree-sitter-nu = { git = "https://github.com/nushell/tree-sitter-nu", rev = "786689b0562b9799ce53e824cb45a1a2a04dc673"}
-tree-sitter-vue = {git = "https://github.com/zed-industries/tree-sitter-vue", rev = "95b2890"}
+tree-sitter-vue = {git = "https://github.com/zed-industries/tree-sitter-vue", rev = "9b6cb221ccb8d0b956fcb17e9a1efac2feefeb58"}
 [patch.crates-io]
 tree-sitter = { git = "https://github.com/tree-sitter/tree-sitter", rev = "35a6052fbcafc5e5fc0f9415b8652be7dcaf7222" }
 async-task = { git = "https://github.com/zed-industries/async-task", rev = "341b57d6de98cdfd7b418567b8de2022ca993a6e" }


@@ -153,10 +153,17 @@ impl FakeCompletionProvider {
     pub fn send_completion(&self, completion: impl Into<String>) {
         let mut tx = self.last_completion_tx.lock();
-        tx.as_mut().unwrap().try_send(completion.into()).unwrap();
+        println!("COMPLETION TX: {:?}", &tx);
+        let a = tx.as_mut().unwrap();
+        a.try_send(completion.into()).unwrap();
+        // tx.as_mut().unwrap().try_send(completion.into()).unwrap();
     }

     pub fn finish_completion(&self) {
+        println!("FINISHING COMPLETION");
         self.last_completion_tx.lock().take().unwrap();
     }
 }
@@ -181,8 +188,10 @@ impl CompletionProvider for FakeCompletionProvider {
         &self,
         _prompt: Box<dyn CompletionRequest>,
     ) -> BoxFuture<'static, anyhow::Result<BoxStream<'static, anyhow::Result<String>>>> {
+        println!("COMPLETING");
         let (tx, rx) = mpsc::channel(1);
         *self.last_completion_tx.lock() = Some(tx);
+        println!("TX: {:?}", *self.last_completion_tx.lock());
         async move { Ok(rx.map(|rx| Ok(rx)).boxed()) }.boxed()
     }

     fn box_clone(&self) -> Box<dyn CompletionProvider> {


@@ -142,7 +142,7 @@ pub struct AssistantPanel {
     zoomed: bool,
     has_focus: bool,
     toolbar: ViewHandle<Toolbar>,
-    completion_provider: Box<dyn CompletionProvider>,
+    completion_provider: Arc<dyn CompletionProvider>,
     api_key_editor: Option<ViewHandle<Editor>>,
     languages: Arc<LanguageRegistry>,
     fs: Arc<dyn Fs>,
@@ -204,7 +204,7 @@ impl AssistantPanel {
         let semantic_index = SemanticIndex::global(cx);
         // Defaulting currently to GPT4, allow for this to be set via config.
-        let completion_provider = Box::new(OpenAICompletionProvider::new(
+        let completion_provider = Arc::new(OpenAICompletionProvider::new(
             "gpt-4",
             cx.background().clone(),
         ));
@@ -259,7 +259,13 @@ impl AssistantPanel {
         cx: &mut ViewContext<Workspace>,
     ) {
         let this = if let Some(this) = workspace.panel::<AssistantPanel>(cx) {
-            if this.update(cx, |assistant, _| assistant.has_credentials()) {
+            if this.update(cx, |assistant, cx| {
+                if !assistant.has_credentials() {
+                    assistant.load_credentials(cx);
+                };
+                assistant.has_credentials()
+            }) {
                 this
             } else {
                 workspace.focus_panel::<AssistantPanel>(cx);
@@ -320,13 +326,10 @@ impl AssistantPanel {
         };
         let inline_assist_id = post_inc(&mut self.next_inline_assist_id);
-        let provider = Arc::new(OpenAICompletionProvider::new(
-            "gpt-4",
-            cx.background().clone(),
-        ));
+        let provider = self.completion_provider.clone();
         // Retrieve Credentials Authenticates the Provider
-        // provider.retrieve_credentials(cx);
+        provider.retrieve_credentials(cx);

         let codegen = cx.add_model(|cx| {
             Codegen::new(editor.read(cx).buffer().clone(), codegen_kind, provider, cx)
@@ -1439,7 +1442,7 @@ struct Conversation {
     pending_save: Task<Result<()>>,
     path: Option<PathBuf>,
     _subscriptions: Vec<Subscription>,
-    completion_provider: Box<dyn CompletionProvider>,
+    completion_provider: Arc<dyn CompletionProvider>,
 }

 impl Entity for Conversation {
@@ -1450,7 +1453,7 @@ impl Conversation {
     fn new(
         language_registry: Arc<LanguageRegistry>,
         cx: &mut ModelContext<Self>,
-        completion_provider: Box<dyn CompletionProvider>,
+        completion_provider: Arc<dyn CompletionProvider>,
     ) -> Self {
         let markdown = language_registry.language_for_name("Markdown");
         let buffer = cx.add_model(|cx| {
@@ -1544,7 +1547,7 @@ impl Conversation {
             None => Some(Uuid::new_v4().to_string()),
         };
         let model = saved_conversation.model;
-        let completion_provider: Box<dyn CompletionProvider> = Box::new(
+        let completion_provider: Arc<dyn CompletionProvider> = Arc::new(
             OpenAICompletionProvider::new(model.full_name(), cx.background().clone()),
         );
         completion_provider.retrieve_credentials(cx);
@@ -2201,7 +2204,7 @@ struct ConversationEditor {
 impl ConversationEditor {
     fn new(
-        completion_provider: Box<dyn CompletionProvider>,
+        completion_provider: Arc<dyn CompletionProvider>,
         language_registry: Arc<LanguageRegistry>,
         fs: Arc<dyn Fs>,
         workspace: WeakViewHandle<Workspace>,
@@ -3406,7 +3409,7 @@ mod tests {
         init(cx);
         let registry = Arc::new(LanguageRegistry::test());
-        let completion_provider = Box::new(FakeCompletionProvider::new());
+        let completion_provider = Arc::new(FakeCompletionProvider::new());
         let conversation = cx.add_model(|cx| Conversation::new(registry, cx, completion_provider));
         let buffer = conversation.read(cx).buffer.clone();
@@ -3535,7 +3538,7 @@ mod tests {
         cx.set_global(SettingsStore::test(cx));
         init(cx);
         let registry = Arc::new(LanguageRegistry::test());
-        let completion_provider = Box::new(FakeCompletionProvider::new());
+        let completion_provider = Arc::new(FakeCompletionProvider::new());
         let conversation = cx.add_model(|cx| Conversation::new(registry, cx, completion_provider));
         let buffer = conversation.read(cx).buffer.clone();
@@ -3633,7 +3636,7 @@ mod tests {
         cx.set_global(SettingsStore::test(cx));
         init(cx);
         let registry = Arc::new(LanguageRegistry::test());
-        let completion_provider = Box::new(FakeCompletionProvider::new());
+        let completion_provider = Arc::new(FakeCompletionProvider::new());
         let conversation = cx.add_model(|cx| Conversation::new(registry, cx, completion_provider));
         let buffer = conversation.read(cx).buffer.clone();
@@ -3716,7 +3719,7 @@ mod tests {
         cx.set_global(SettingsStore::test(cx));
         init(cx);
         let registry = Arc::new(LanguageRegistry::test());
-        let completion_provider = Box::new(FakeCompletionProvider::new());
+        let completion_provider = Arc::new(FakeCompletionProvider::new());
         let conversation =
             cx.add_model(|cx| Conversation::new(registry.clone(), cx, completion_provider));
         let buffer = conversation.read(cx).buffer.clone();


@@ -367,6 +367,8 @@ fn strip_invalid_spans_from_codeblock(
 #[cfg(test)]
 mod tests {
+    use std::sync::Arc;
+
     use super::*;
     use ai::test::FakeCompletionProvider;
     use futures::stream::{self};
@@ -437,6 +439,7 @@ mod tests {
             let max_len = cmp::min(new_text.len(), 10);
             let len = rng.gen_range(1..=max_len);
             let (chunk, suffix) = new_text.split_at(len);
+            println!("CHUNK: {:?}", &chunk);
             provider.send_completion(chunk);
             new_text = suffix;
             deterministic.run_until_parked();
@@ -569,6 +572,7 @@ mod tests {
             let max_len = cmp::min(new_text.len(), 10);
             let len = rng.gen_range(1..=max_len);
             let (chunk, suffix) = new_text.split_at(len);
+            println!("{:?}", &chunk);
             provider.send_completion(chunk);
             new_text = suffix;
             deterministic.run_until_parked();


@@ -20,7 +20,9 @@ theme = { path = "../theme" }
 util = { path = "../util" }
 workspace = { path = "../workspace" }

+log.workspace = true
 anyhow.workspace = true
+futures.workspace = true
 schemars.workspace = true
 serde.workspace = true
 serde_derive.workspace = true


@ -2,8 +2,8 @@ pub mod items;
mod project_diagnostics_settings; mod project_diagnostics_settings;
mod toolbar_controls; mod toolbar_controls;
use anyhow::Result; use anyhow::{Context, Result};
use collections::{BTreeSet, HashSet}; use collections::{HashMap, HashSet};
use editor::{ use editor::{
diagnostic_block_renderer, diagnostic_block_renderer,
display_map::{BlockDisposition, BlockId, BlockProperties, BlockStyle, RenderBlock}, display_map::{BlockDisposition, BlockId, BlockProperties, BlockStyle, RenderBlock},
@ -11,9 +11,10 @@ use editor::{
scroll::autoscroll::Autoscroll, scroll::autoscroll::Autoscroll,
Editor, ExcerptId, ExcerptRange, MultiBuffer, ToOffset, Editor, ExcerptId, ExcerptRange, MultiBuffer, ToOffset,
}; };
use futures::future::try_join_all;
use gpui::{ use gpui::{
actions, elements::*, fonts::TextStyle, serde_json, AnyViewHandle, AppContext, Entity, actions, elements::*, fonts::TextStyle, serde_json, AnyViewHandle, AppContext, Entity,
ModelHandle, Task, View, ViewContext, ViewHandle, WeakViewHandle, ModelHandle, Subscription, Task, View, ViewContext, ViewHandle, WeakViewHandle,
}; };
use language::{ use language::{
Anchor, Bias, Buffer, Diagnostic, DiagnosticEntry, DiagnosticSeverity, Point, Selection, Anchor, Bias, Buffer, Diagnostic, DiagnosticEntry, DiagnosticSeverity, Point, Selection,
@ -28,6 +29,7 @@ use std::{
any::{Any, TypeId}, any::{Any, TypeId},
borrow::Cow, borrow::Cow,
cmp::Ordering, cmp::Ordering,
mem,
ops::Range, ops::Range,
path::PathBuf, path::PathBuf,
sync::Arc, sync::Arc,
@ -60,8 +62,10 @@ struct ProjectDiagnosticsEditor {
summary: DiagnosticSummary, summary: DiagnosticSummary,
excerpts: ModelHandle<MultiBuffer>, excerpts: ModelHandle<MultiBuffer>,
path_states: Vec<PathState>, path_states: Vec<PathState>,
paths_to_update: BTreeSet<(ProjectPath, LanguageServerId)>, paths_to_update: HashMap<LanguageServerId, HashSet<ProjectPath>>,
current_diagnostics: HashMap<LanguageServerId, HashSet<ProjectPath>>,
include_warnings: bool, include_warnings: bool,
_subscriptions: Vec<Subscription>,
} }
struct PathState { struct PathState {
@ -125,9 +129,12 @@ impl View for ProjectDiagnosticsEditor {
"summary": project.diagnostic_summary(cx), "summary": project.diagnostic_summary(cx),
}), }),
"summary": self.summary, "summary": self.summary,
"paths_to_update": self.paths_to_update.iter().map(|(path, server_id)| "paths_to_update": self.paths_to_update.iter().map(|(server_id, paths)|
(path.path.to_string_lossy(), server_id.0) (server_id.0, paths.into_iter().map(|path| path.path.to_string_lossy()).collect::<Vec<_>>())
).collect::<Vec<_>>(), ).collect::<HashMap<_, _>>(),
"current_diagnostics": self.current_diagnostics.iter().map(|(server_id, paths)|
(server_id.0, paths.into_iter().map(|path| path.path.to_string_lossy()).collect::<Vec<_>>())
).collect::<HashMap<_, _>>(),
"paths_states": self.path_states.iter().map(|state| "paths_states": self.path_states.iter().map(|state|
json!({ json!({
"path": state.path.path.to_string_lossy(), "path": state.path.path.to_string_lossy(),
@ -149,21 +156,30 @@ impl ProjectDiagnosticsEditor {
workspace: WeakViewHandle<Workspace>, workspace: WeakViewHandle<Workspace>,
cx: &mut ViewContext<Self>, cx: &mut ViewContext<Self>,
) -> Self { ) -> Self {
cx.subscribe(&project_handle, |this, _, event, cx| match event { let project_event_subscription =
project::Event::DiskBasedDiagnosticsFinished { language_server_id } => { cx.subscribe(&project_handle, |this, _, event, cx| match event {
this.update_excerpts(Some(*language_server_id), cx); project::Event::DiskBasedDiagnosticsFinished { language_server_id } => {
this.update_title(cx); log::debug!("Disk based diagnostics finished for server {language_server_id}");
} this.update_excerpts(Some(*language_server_id), cx);
project::Event::DiagnosticsUpdated { }
language_server_id, project::Event::DiagnosticsUpdated {
path, language_server_id,
} => { path,
this.paths_to_update } => {
.insert((path.clone(), *language_server_id)); log::debug!("Adding path {path:?} to update for server {language_server_id}");
} this.paths_to_update
_ => {} .entry(*language_server_id)
}) .or_default()
.detach(); .insert(path.clone());
let no_multiselections = this.editor.update(cx, |editor, cx| {
editor.selections.all::<usize>(cx).len() <= 1
});
if no_multiselections && !this.is_dirty(cx) {
this.update_excerpts(Some(*language_server_id), cx);
}
}
_ => {}
});
let excerpts = cx.add_model(|cx| MultiBuffer::new(project_handle.read(cx).replica_id())); let excerpts = cx.add_model(|cx| MultiBuffer::new(project_handle.read(cx).replica_id()));
let editor = cx.add_view(|cx| { let editor = cx.add_view(|cx| {
@ -172,19 +188,14 @@ impl ProjectDiagnosticsEditor {
editor.set_vertical_scroll_margin(5, cx); editor.set_vertical_scroll_margin(5, cx);
editor editor
}); });
cx.subscribe(&editor, |this, _, event, cx| { let editor_event_subscription = cx.subscribe(&editor, |this, _, event, cx| {
cx.emit(event.clone()); cx.emit(event.clone());
if event == &editor::Event::Focused && this.path_states.is_empty() { if event == &editor::Event::Focused && this.path_states.is_empty() {
cx.focus_self() cx.focus_self()
} }
}) });
.detach();
let project = project_handle.read(cx); let project = project_handle.read(cx);
let paths_to_update = project
.diagnostic_summaries(cx)
.map(|(path, server_id, _)| (path, server_id))
.collect();
let summary = project.diagnostic_summary(cx); let summary = project.diagnostic_summary(cx);
let mut this = Self { let mut this = Self {
project: project_handle, project: project_handle,
@ -193,8 +204,10 @@ impl ProjectDiagnosticsEditor {
excerpts, excerpts,
editor, editor,
path_states: Default::default(), path_states: Default::default(),
paths_to_update, paths_to_update: HashMap::default(),
include_warnings: settings::get::<ProjectDiagnosticsSettings>(cx).include_warnings, include_warnings: settings::get::<ProjectDiagnosticsSettings>(cx).include_warnings,
current_diagnostics: HashMap::default(),
_subscriptions: vec![project_event_subscription, editor_event_subscription],
}; };
this.update_excerpts(None, cx); this.update_excerpts(None, cx);
this this
@ -214,12 +227,7 @@ impl ProjectDiagnosticsEditor {
fn toggle_warnings(&mut self, _: &ToggleWarnings, cx: &mut ViewContext<Self>) { fn toggle_warnings(&mut self, _: &ToggleWarnings, cx: &mut ViewContext<Self>) {
self.include_warnings = !self.include_warnings; self.include_warnings = !self.include_warnings;
self.paths_to_update = self self.paths_to_update = self.current_diagnostics.clone();
.project
.read(cx)
.diagnostic_summaries(cx)
.map(|(path, server_id, _)| (path, server_id))
.collect();
self.update_excerpts(None, cx); self.update_excerpts(None, cx);
cx.notify(); cx.notify();
} }
@ -229,29 +237,94 @@ impl ProjectDiagnosticsEditor {
language_server_id: Option<LanguageServerId>, language_server_id: Option<LanguageServerId>,
cx: &mut ViewContext<Self>, cx: &mut ViewContext<Self>,
) { ) {
let mut paths = Vec::new(); log::debug!("Updating excerpts for server {language_server_id:?}");
self.paths_to_update.retain(|(path, server_id)| { let mut paths_to_recheck = HashSet::default();
if language_server_id let mut new_summaries: HashMap<LanguageServerId, HashSet<ProjectPath>> = self
.map_or(true, |language_server_id| language_server_id == *server_id) .project
{ .read(cx)
paths.push(path.clone()); .diagnostic_summaries(cx)
false .fold(HashMap::default(), |mut summaries, (path, server_id, _)| {
summaries.entry(server_id).or_default().insert(path);
summaries
});
let mut old_diagnostics = if let Some(language_server_id) = language_server_id {
new_summaries.retain(|server_id, _| server_id == &language_server_id);
self.paths_to_update.retain(|server_id, paths| {
if server_id == &language_server_id {
paths_to_recheck.extend(paths.drain());
false
} else {
true
}
});
let mut old_diagnostics = HashMap::default();
if let Some(new_paths) = new_summaries.get(&language_server_id) {
if let Some(old_paths) = self
.current_diagnostics
.insert(language_server_id, new_paths.clone())
{
old_diagnostics.insert(language_server_id, old_paths);
}
} else { } else {
true if let Some(old_paths) = self.current_diagnostics.remove(&language_server_id) {
old_diagnostics.insert(language_server_id, old_paths);
}
} }
}); old_diagnostics
} else {
paths_to_recheck.extend(self.paths_to_update.drain().flat_map(|(_, paths)| paths));
mem::replace(&mut self.current_diagnostics, new_summaries.clone())
};
for (server_id, new_paths) in new_summaries {
match old_diagnostics.remove(&server_id) {
Some(mut old_paths) => {
paths_to_recheck.extend(
new_paths
.into_iter()
.filter(|new_path| !old_paths.remove(new_path)),
);
paths_to_recheck.extend(old_paths);
}
None => paths_to_recheck.extend(new_paths),
}
}
paths_to_recheck.extend(old_diagnostics.into_iter().flat_map(|(_, paths)| paths));
if paths_to_recheck.is_empty() {
log::debug!("No paths to recheck for language server {language_server_id:?}");
return;
}
log::debug!(
"Rechecking {} paths for language server {:?}",
paths_to_recheck.len(),
language_server_id
);
let project = self.project.clone(); let project = self.project.clone();
cx.spawn(|this, mut cx| { cx.spawn(|this, mut cx| {
async move { async move {
for path in paths { let _: Vec<()> = try_join_all(paths_to_recheck.into_iter().map(|path| {
let buffer = project let mut cx = cx.clone();
.update(&mut cx, |project, cx| project.open_buffer(path.clone(), cx)) let project = project.clone();
.await?; async move {
this.update(&mut cx, |this, cx| { let buffer = project
this.populate_excerpts(path, language_server_id, buffer, cx) .update(&mut cx, |project, cx| project.open_buffer(path.clone(), cx))
})?; .await
} .with_context(|| format!("opening buffer for path {path:?}"))?;
Result::<_, anyhow::Error>::Ok(()) this.update(&mut cx, |this, cx| {
this.populate_excerpts(path, language_server_id, buffer, cx);
})
.context("missing project")?;
anyhow::Ok(())
}
}))
.await
.context("rechecking diagnostics for paths")?;
this.update(&mut cx, |this, cx| {
this.summary = this.project.read(cx).diagnostic_summary(cx);
cx.emit(Event::TitleChanged);
})?;
anyhow::Ok(())
} }
.log_err() .log_err()
}) })
@ -554,11 +627,6 @@ impl ProjectDiagnosticsEditor {
} }
cx.notify(); cx.notify();
} }
fn update_title(&mut self, cx: &mut ViewContext<Self>) {
self.summary = self.project.read(cx).diagnostic_summary(cx);
cx.emit(Event::TitleChanged);
}
} }
impl Item for ProjectDiagnosticsEditor { impl Item for ProjectDiagnosticsEditor {
@ -1301,25 +1369,6 @@ mod tests {
cx, cx,
) )
.unwrap(); .unwrap();
project
.update_diagnostic_entries(
server_id_2,
PathBuf::from("/test/main.js"),
None,
vec![DiagnosticEntry {
range: Unclipped(PointUtf16::new(1, 0))..Unclipped(PointUtf16::new(1, 1)),
diagnostic: Diagnostic {
message: "warning 1".to_string(),
severity: DiagnosticSeverity::ERROR,
is_primary: true,
is_disk_based: true,
group_id: 2,
..Default::default()
},
}],
cx,
)
.unwrap();
}); });
// The first language server finishes // The first language server finishes
@ -1353,6 +1402,25 @@ mod tests {
// The second language server finishes // The second language server finishes
project.update(cx, |project, cx| { project.update(cx, |project, cx| {
project
.update_diagnostic_entries(
server_id_2,
PathBuf::from("/test/main.js"),
None,
vec![DiagnosticEntry {
range: Unclipped(PointUtf16::new(1, 0))..Unclipped(PointUtf16::new(1, 1)),
diagnostic: Diagnostic {
message: "warning 1".to_string(),
severity: DiagnosticSeverity::ERROR,
is_primary: true,
is_disk_based: true,
group_id: 2,
..Default::default()
},
}],
cx,
)
.unwrap();
project.disk_based_diagnostics_finished(server_id_2, cx); project.disk_based_diagnostics_finished(server_id_2, cx);
}); });


@@ -33,9 +33,9 @@ use util::{
     paths::{PathExt, FILE_ROW_COLUMN_DELIMITER},
     ResultExt, TryFutureExt,
 };
-use workspace::item::{BreadcrumbText, FollowableItemHandle};
+use workspace::item::{BreadcrumbText, FollowableItemHandle, ItemHandle};
 use workspace::{
-    item::{FollowableItem, Item, ItemEvent, ItemHandle, ProjectItem},
+    item::{FollowableItem, Item, ItemEvent, ProjectItem},
     searchable::{Direction, SearchEvent, SearchableItem, SearchableItemHandle},
     ItemId, ItemNavHistory, Pane, StatusItemView, ToolbarItemLocation, ViewId, Workspace,
     WorkspaceId,


@ -1692,14 +1692,25 @@ fn test_language_scope_at_with_javascript(cx: &mut AppContext) {
r#" r#"
(jsx_element) @element (jsx_element) @element
(string) @string (string) @string
[
(jsx_opening_element)
(jsx_closing_element)
(jsx_expression)
] @default
"#, "#,
) )
.unwrap(); .unwrap();
let text = r#"a["b"] = <C d="e"></C>;"#; let text = r#"
a["b"] = <C d="e">
<F></F>
{ g() }
</C>;
"#
.unindent();
let buffer = let buffer =
Buffer::new(0, cx.model_id() as u64, text).with_language(Arc::new(language), cx); Buffer::new(0, cx.model_id() as u64, &text).with_language(Arc::new(language), cx);
let snapshot = buffer.snapshot(); let snapshot = buffer.snapshot();
let config = snapshot.language_scope_at(0).unwrap(); let config = snapshot.language_scope_at(0).unwrap();
@ -1710,7 +1721,9 @@ fn test_language_scope_at_with_javascript(cx: &mut AppContext) {
&[true, true] &[true, true]
); );
let string_config = snapshot.language_scope_at(3).unwrap(); let string_config = snapshot
.language_scope_at(text.find("b\"").unwrap())
.unwrap();
assert_eq!(string_config.line_comment_prefix().unwrap().as_ref(), "// "); assert_eq!(string_config.line_comment_prefix().unwrap().as_ref(), "// ");
// Second bracket pair is disabled // Second bracket pair is disabled
assert_eq!( assert_eq!(
@ -1718,18 +1731,49 @@ fn test_language_scope_at_with_javascript(cx: &mut AppContext) {
&[true, false] &[true, false]
); );
let element_config = snapshot.language_scope_at(10).unwrap(); // In between JSX tags: use the `element` override.
let element_config = snapshot
.language_scope_at(text.find("<F>").unwrap())
.unwrap();
assert_eq!(element_config.line_comment_prefix(), None); assert_eq!(element_config.line_comment_prefix(), None);
assert_eq!( assert_eq!(
element_config.block_comment_delimiters(), element_config.block_comment_delimiters(),
Some((&"{/*".into(), &"*/}".into())) Some((&"{/*".into(), &"*/}".into()))
); );
// Both bracket pairs are enabled
assert_eq!( assert_eq!(
element_config.brackets().map(|e| e.1).collect::<Vec<_>>(), element_config.brackets().map(|e| e.1).collect::<Vec<_>>(),
&[true, true] &[true, true]
); );
// Within a JSX tag: use the default config.
let tag_config = snapshot
.language_scope_at(text.find(" d=").unwrap() + 1)
.unwrap();
assert_eq!(tag_config.line_comment_prefix().unwrap().as_ref(), "// ");
assert_eq!(
tag_config.brackets().map(|e| e.1).collect::<Vec<_>>(),
&[true, true]
);
// In a JSX expression: use the default config.
let expression_in_element_config = snapshot
.language_scope_at(text.find("{").unwrap() + 1)
.unwrap();
assert_eq!(
expression_in_element_config
.line_comment_prefix()
.unwrap()
.as_ref(),
"// "
);
assert_eq!(
expression_in_element_config
.brackets()
.map(|e| e.1)
.collect::<Vec<_>>(),
&[true, true]
);
buffer buffer
}); });
} }


@ -1696,14 +1696,25 @@ fn test_language_scope_at_with_javascript(cx: &mut AppContext) {
r#" r#"
(jsx_element) @element (jsx_element) @element
(string) @string (string) @string
[
(jsx_opening_element)
(jsx_closing_element)
(jsx_expression)
] @default
"#, "#,
) )
.unwrap(); .unwrap();
let text = r#"a["b"] = <C d="e"></C>;"#; let text = r#"
a["b"] = <C d="e">
<F></F>
{ g() }
</C>;
"#
.unindent();
let buffer = let buffer =
Buffer::new(0, cx.entity_id().as_u64(), text).with_language(Arc::new(language), cx); Buffer::new(0, cx.entity_id().as_u64(), &text).with_language(Arc::new(language), cx);
let snapshot = buffer.snapshot(); let snapshot = buffer.snapshot();
let config = snapshot.language_scope_at(0).unwrap(); let config = snapshot.language_scope_at(0).unwrap();
@ -1714,7 +1725,9 @@ fn test_language_scope_at_with_javascript(cx: &mut AppContext) {
&[true, true] &[true, true]
); );
let string_config = snapshot.language_scope_at(3).unwrap(); let string_config = snapshot
.language_scope_at(text.find("b\"").unwrap())
.unwrap();
assert_eq!(string_config.line_comment_prefix().unwrap().as_ref(), "// "); assert_eq!(string_config.line_comment_prefix().unwrap().as_ref(), "// ");
// Second bracket pair is disabled // Second bracket pair is disabled
assert_eq!( assert_eq!(
@ -1722,18 +1735,49 @@ fn test_language_scope_at_with_javascript(cx: &mut AppContext) {
&[true, false] &[true, false]
); );
let element_config = snapshot.language_scope_at(10).unwrap(); // In between JSX tags: use the `element` override.
let element_config = snapshot
.language_scope_at(text.find("<F>").unwrap())
.unwrap();
assert_eq!(element_config.line_comment_prefix(), None); assert_eq!(element_config.line_comment_prefix(), None);
assert_eq!( assert_eq!(
element_config.block_comment_delimiters(), element_config.block_comment_delimiters(),
Some((&"{/*".into(), &"*/}".into())) Some((&"{/*".into(), &"*/}".into()))
); );
// Both bracket pairs are enabled
assert_eq!( assert_eq!(
element_config.brackets().map(|e| e.1).collect::<Vec<_>>(), element_config.brackets().map(|e| e.1).collect::<Vec<_>>(),
&[true, true] &[true, true]
); );
// Within a JSX tag: use the default config.
let tag_config = snapshot
.language_scope_at(text.find(" d=").unwrap() + 1)
.unwrap();
assert_eq!(tag_config.line_comment_prefix().unwrap().as_ref(), "// ");
assert_eq!(
tag_config.brackets().map(|e| e.1).collect::<Vec<_>>(),
&[true, true]
);
// In a JSX expression: use the default config.
let expression_in_element_config = snapshot
.language_scope_at(text.find("{").unwrap() + 1)
.unwrap();
assert_eq!(
expression_in_element_config
.line_comment_prefix()
.unwrap()
.as_ref(),
"// "
);
assert_eq!(
expression_in_element_config
.brackets()
.map(|e| e.1)
.collect::<Vec<_>>(),
&[true, true]
);
buffer buffer
}); });
} }


@ -1,9 +1,9 @@
use std::collections::VecDeque; use std::ops::ControlFlow;
use std::path::{Path, PathBuf}; use std::path::{Path, PathBuf};
use std::sync::Arc; use std::sync::Arc;
use anyhow::Context; use anyhow::Context;
use collections::HashMap; use collections::{HashMap, HashSet};
use fs::Fs; use fs::Fs;
use gpui::{AsyncAppContext, ModelHandle}; use gpui::{AsyncAppContext, ModelHandle};
use language::language_settings::language_settings; use language::language_settings::language_settings;
@ -11,7 +11,7 @@ use language::{Buffer, Diff};
use lsp::{LanguageServer, LanguageServerId}; use lsp::{LanguageServer, LanguageServerId};
use node_runtime::NodeRuntime; use node_runtime::NodeRuntime;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use util::paths::DEFAULT_PRETTIER_DIR; use util::paths::{PathMatcher, DEFAULT_PRETTIER_DIR};
pub enum Prettier { pub enum Prettier {
Real(RealPrettier), Real(RealPrettier),
@ -20,7 +20,6 @@ pub enum Prettier {
} }
pub struct RealPrettier { pub struct RealPrettier {
worktree_id: Option<usize>,
default: bool, default: bool,
prettier_dir: PathBuf, prettier_dir: PathBuf,
server: Arc<LanguageServer>, server: Arc<LanguageServer>,
@ -28,17 +27,10 @@ pub struct RealPrettier {
#[cfg(any(test, feature = "test-support"))] #[cfg(any(test, feature = "test-support"))]
pub struct TestPrettier { pub struct TestPrettier {
worktree_id: Option<usize>,
prettier_dir: PathBuf, prettier_dir: PathBuf,
default: bool, default: bool,
} }
#[derive(Debug)]
pub struct LocateStart {
pub worktree_root_path: Arc<Path>,
pub starting_path: Arc<Path>,
}
pub const PRETTIER_SERVER_FILE: &str = "prettier_server.js"; pub const PRETTIER_SERVER_FILE: &str = "prettier_server.js";
pub const PRETTIER_SERVER_JS: &str = include_str!("./prettier_server.js"); pub const PRETTIER_SERVER_JS: &str = include_str!("./prettier_server.js");
const PRETTIER_PACKAGE_NAME: &str = "prettier"; const PRETTIER_PACKAGE_NAME: &str = "prettier";
@ -63,79 +55,112 @@ impl Prettier {
".editorconfig", ".editorconfig",
]; ];
pub async fn locate( pub async fn locate_prettier_installation(
starting_path: Option<LocateStart>, fs: &dyn Fs,
fs: Arc<dyn Fs>, installed_prettiers: &HashSet<PathBuf>,
) -> anyhow::Result<PathBuf> { locate_from: &Path,
fn is_node_modules(path_component: &std::path::Component<'_>) -> bool { ) -> anyhow::Result<ControlFlow<(), Option<PathBuf>>> {
path_component.as_os_str().to_string_lossy() == "node_modules" let mut path_to_check = locate_from
.components()
.take_while(|component| component.as_os_str().to_string_lossy() != "node_modules")
.collect::<PathBuf>();
if path_to_check != locate_from {
log::debug!(
"Skipping prettier location for path {path_to_check:?} that is inside node_modules"
);
return Ok(ControlFlow::Break(()));
}
let path_to_check_metadata = fs
.metadata(&path_to_check)
.await
.with_context(|| format!("failed to get metadata for initial path {path_to_check:?}"))?
.with_context(|| format!("empty metadata for initial path {path_to_check:?}"))?;
if !path_to_check_metadata.is_dir {
path_to_check.pop();
} }
let paths_to_check = match starting_path.as_ref() { let mut project_path_with_prettier_dependency = None;
Some(starting_path) => { loop {
let worktree_root = starting_path if installed_prettiers.contains(&path_to_check) {
.worktree_root_path log::debug!("Found prettier path {path_to_check:?} in installed prettiers");
.components() return Ok(ControlFlow::Continue(Some(path_to_check)));
.into_iter() } else if let Some(package_json_contents) =
.take_while(|path_component| !is_node_modules(path_component)) read_package_json(fs, &path_to_check).await?
.collect::<PathBuf>(); {
if worktree_root != starting_path.worktree_root_path.as_ref() { if has_prettier_in_package_json(&package_json_contents) {
vec![worktree_root] if has_prettier_in_node_modules(fs, &path_to_check).await? {
log::debug!("Found prettier path {path_to_check:?} in both package.json and node_modules");
return Ok(ControlFlow::Continue(Some(path_to_check)));
} else if project_path_with_prettier_dependency.is_none() {
project_path_with_prettier_dependency = Some(path_to_check.clone());
}
} else { } else {
if starting_path.starting_path.as_ref() == Path::new("") { match package_json_contents.get("workspaces") {
worktree_root Some(serde_json::Value::Array(workspaces)) => {
.parent() match &project_path_with_prettier_dependency {
.map(|path| vec![path.to_path_buf()]) Some(project_path_with_prettier_dependency) => {
.unwrap_or_default() let subproject_path = project_path_with_prettier_dependency.strip_prefix(&path_to_check).expect("traversing path parents, should be able to strip prefix");
} else { if workspaces.iter().filter_map(|value| {
let file_to_format = starting_path.starting_path.as_ref(); if let serde_json::Value::String(s) = value {
let mut paths_to_check = VecDeque::new(); Some(s.clone())
let mut current_path = worktree_root; } else {
for path_component in file_to_format.components().into_iter() { log::warn!("Skipping non-string 'workspaces' value: {value:?}");
let new_path = current_path.join(path_component); None
let old_path = std::mem::replace(&mut current_path, new_path); }
paths_to_check.push_front(old_path); }).any(|workspace_definition| {
if is_node_modules(&path_component) { if let Some(path_matcher) = PathMatcher::new(&workspace_definition).ok() {
break; path_matcher.is_match(subproject_path)
} else {
workspace_definition == subproject_path.to_string_lossy()
}
}) {
anyhow::ensure!(has_prettier_in_node_modules(fs, &path_to_check).await?, "Found prettier path {path_to_check:?} in the workspace root for project in {project_path_with_prettier_dependency:?}, but it's not installed into workspace root's node_modules");
log::info!("Found prettier path {path_to_check:?} in the workspace root for project in {project_path_with_prettier_dependency:?}");
return Ok(ControlFlow::Continue(Some(path_to_check)));
} else {
log::warn!("Skipping path {path_to_check:?} that has prettier in its 'node_modules' subdirectory, but is not included in its package.json workspaces {workspaces:?}");
}
}
None => {
log::warn!("Skipping path {path_to_check:?} that has prettier in its 'node_modules' subdirectory, but has no prettier in its package.json");
}
} }
} },
Vec::from(paths_to_check) Some(unknown) => log::error!("Failed to parse workspaces for {path_to_check:?} from package.json, got {unknown:?}. Skipping."),
None => log::warn!("Skipping path {path_to_check:?} that has no prettier dependency and no workspaces section in its package.json"),
} }
} }
} }
None => Vec::new(),
};
match find_closest_prettier_dir(paths_to_check, fs.as_ref()) if !path_to_check.pop() {
.await match project_path_with_prettier_dependency {
.with_context(|| format!("finding prettier starting with {starting_path:?}"))? Some(closest_prettier_discovered) => {
{ anyhow::bail!("No prettier found in node_modules for ancestors of {locate_from:?}, but discovered prettier package.json dependency in {closest_prettier_discovered:?}")
Some(prettier_dir) => Ok(prettier_dir), }
None => Ok(DEFAULT_PRETTIER_DIR.to_path_buf()), None => {
log::debug!("Found no prettier in ancestors of {locate_from:?}");
return Ok(ControlFlow::Continue(None));
}
}
}
} }
} }
#[cfg(any(test, feature = "test-support"))] #[cfg(any(test, feature = "test-support"))]
pub async fn start( pub async fn start(
worktree_id: Option<usize>,
_: LanguageServerId, _: LanguageServerId,
prettier_dir: PathBuf, prettier_dir: PathBuf,
_: Arc<dyn NodeRuntime>, _: Arc<dyn NodeRuntime>,
_: AsyncAppContext, _: AsyncAppContext,
) -> anyhow::Result<Self> { ) -> anyhow::Result<Self> {
Ok( Ok(Self::Test(TestPrettier {
#[cfg(any(test, feature = "test-support"))] default: prettier_dir == DEFAULT_PRETTIER_DIR.as_path(),
Self::Test(TestPrettier { prettier_dir,
worktree_id, }))
default: prettier_dir == DEFAULT_PRETTIER_DIR.as_path(),
prettier_dir,
}),
)
} }
#[cfg(not(any(test, feature = "test-support")))] #[cfg(not(any(test, feature = "test-support")))]
pub async fn start( pub async fn start(
worktree_id: Option<usize>,
server_id: LanguageServerId, server_id: LanguageServerId,
prettier_dir: PathBuf, prettier_dir: PathBuf,
node: Arc<dyn NodeRuntime>, node: Arc<dyn NodeRuntime>,
@ -143,7 +168,7 @@ impl Prettier {
) -> anyhow::Result<Self> { ) -> anyhow::Result<Self> {
use lsp::LanguageServerBinary; use lsp::LanguageServerBinary;
let backgroud = cx.background(); let background = cx.background();
anyhow::ensure!( anyhow::ensure!(
prettier_dir.is_dir(), prettier_dir.is_dir(),
"Prettier dir {prettier_dir:?} is not a directory" "Prettier dir {prettier_dir:?} is not a directory"
@ -154,7 +179,7 @@ impl Prettier {
"no prettier server package found at {prettier_server:?}" "no prettier server package found at {prettier_server:?}"
); );
let node_path = backgroud let node_path = background
.spawn(async move { node.binary_path().await }) .spawn(async move { node.binary_path().await })
.await?; .await?;
let server = LanguageServer::new( let server = LanguageServer::new(
@ -169,12 +194,11 @@ impl Prettier {
cx, cx,
) )
.context("prettier server creation")?; .context("prettier server creation")?;
let server = backgroud let server = background
.spawn(server.initialize(None)) .spawn(server.initialize(None))
.await .await
.context("prettier server initialization")?; .context("prettier server initialization")?;
Ok(Self::Real(RealPrettier { Ok(Self::Real(RealPrettier {
worktree_id,
server, server,
default: prettier_dir == DEFAULT_PRETTIER_DIR.as_path(), default: prettier_dir == DEFAULT_PRETTIER_DIR.as_path(),
prettier_dir, prettier_dir,
@ -340,64 +364,61 @@ impl Prettier {
Self::Test(test_prettier) => &test_prettier.prettier_dir, Self::Test(test_prettier) => &test_prettier.prettier_dir,
} }
} }
pub fn worktree_id(&self) -> Option<usize> {
match self {
Self::Real(local) => local.worktree_id,
#[cfg(any(test, feature = "test-support"))]
Self::Test(test_prettier) => test_prettier.worktree_id,
}
}
} }
async fn find_closest_prettier_dir( async fn has_prettier_in_node_modules(fs: &dyn Fs, path: &Path) -> anyhow::Result<bool> {
paths_to_check: Vec<PathBuf>, let possible_node_modules_location = path.join("node_modules").join(PRETTIER_PACKAGE_NAME);
fs: &dyn Fs, if let Some(node_modules_location_metadata) = fs
) -> anyhow::Result<Option<PathBuf>> { .metadata(&possible_node_modules_location)
for path in paths_to_check { .await
let possible_package_json = path.join("package.json"); .with_context(|| format!("fetching metadata for {possible_node_modules_location:?}"))?
if let Some(package_json_metadata) = fs {
.metadata(&possible_package_json) return Ok(node_modules_location_metadata.is_dir);
.await }
.with_context(|| format!("Fetching metadata for {possible_package_json:?}"))? Ok(false)
{ }
if !package_json_metadata.is_dir && !package_json_metadata.is_symlink {
let package_json_contents = fs
.load(&possible_package_json)
.await
.with_context(|| format!("reading {possible_package_json:?} file contents"))?;
if let Ok(json_contents) = serde_json::from_str::<HashMap<String, serde_json::Value>>(
&package_json_contents,
) {
if let Some(serde_json::Value::Object(o)) = json_contents.get("dependencies") {
if o.contains_key(PRETTIER_PACKAGE_NAME) {
return Ok(Some(path));
}
}
if let Some(serde_json::Value::Object(o)) = json_contents.get("devDependencies")
{
if o.contains_key(PRETTIER_PACKAGE_NAME) {
return Ok(Some(path));
}
}
}
}
}
let possible_node_modules_location = path.join("node_modules").join(PRETTIER_PACKAGE_NAME); async fn read_package_json(
if let Some(node_modules_location_metadata) = fs fs: &dyn Fs,
.metadata(&possible_node_modules_location) path: &Path,
.await ) -> anyhow::Result<Option<HashMap<String, serde_json::Value>>> {
.with_context(|| format!("fetching metadata for {possible_node_modules_location:?}"))? let possible_package_json = path.join("package.json");
{ if let Some(package_json_metadata) = fs
if node_modules_location_metadata.is_dir { .metadata(&possible_package_json)
return Ok(Some(path)); .await
} .with_context(|| format!("fetching metadata for package json {possible_package_json:?}"))?
{
if !package_json_metadata.is_dir && !package_json_metadata.is_symlink {
let package_json_contents = fs
.load(&possible_package_json)
.await
.with_context(|| format!("reading {possible_package_json:?} file contents"))?;
return serde_json::from_str::<HashMap<String, serde_json::Value>>(
&package_json_contents,
)
.map(Some)
.with_context(|| format!("parsing {possible_package_json:?} file contents"));
} }
} }
Ok(None) Ok(None)
} }
fn has_prettier_in_package_json(
package_json_contents: &HashMap<String, serde_json::Value>,
) -> bool {
if let Some(serde_json::Value::Object(o)) = package_json_contents.get("dependencies") {
if o.contains_key(PRETTIER_PACKAGE_NAME) {
return true;
}
}
if let Some(serde_json::Value::Object(o)) = package_json_contents.get("devDependencies") {
if o.contains_key(PRETTIER_PACKAGE_NAME) {
return true;
}
}
false
}
enum Format {} enum Format {}
#[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)] #[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)]
@ -436,3 +457,378 @@ impl lsp::request::Request for ClearCache {
type Result = (); type Result = ();
const METHOD: &'static str = "prettier/clear_cache"; const METHOD: &'static str = "prettier/clear_cache";
} }
#[cfg(test)]
mod tests {
use fs::FakeFs;
use serde_json::json;
use super::*;
#[gpui::test]
async fn test_prettier_lookup_finds_nothing(cx: &mut gpui::TestAppContext) {
let fs = FakeFs::new(cx.background());
fs.insert_tree(
"/root",
json!({
".config": {
"zed": {
"settings.json": r#"{ "formatter": "auto" }"#,
},
},
"work": {
"project": {
"src": {
"index.js": "// index.js file contents",
},
"node_modules": {
"expect": {
"build": {
"print.js": "// print.js file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.5.1"
}
}"#,
},
"prettier": {
"index.js": "// Dummy prettier package file",
},
},
"package.json": r#"{}"#
},
}
}),
)
.await;
assert!(
matches!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/.config/zed/settings.json"),
)
.await,
Ok(ControlFlow::Continue(None))
),
"Should successfully find no prettier for path hierarchy without it"
);
assert!(
matches!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/project/src/index.js")
)
.await,
Ok(ControlFlow::Continue(None))
),
"Should successfully find no prettier for path hierarchy that has node_modules with prettier, but no package.json mentions of it"
);
assert!(
matches!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/project/node_modules/expect/build/print.js")
)
.await,
Ok(ControlFlow::Break(()))
),
"Should not format files inside node_modules/"
);
}
#[gpui::test]
async fn test_prettier_lookup_in_simple_npm_projects(cx: &mut gpui::TestAppContext) {
let fs = FakeFs::new(cx.background());
fs.insert_tree(
"/root",
json!({
"web_blog": {
"node_modules": {
"prettier": {
"index.js": "// Dummy prettier package file",
},
"expect": {
"build": {
"print.js": "// print.js file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.5.1"
}
}"#,
},
},
"pages": {
"[slug].tsx": "// [slug].tsx file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.3.0"
},
"prettier": {
"semi": false,
"printWidth": 80,
"htmlWhitespaceSensitivity": "strict",
"tabWidth": 4
}
}"#
}
}),
)
.await;
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/web_blog/pages/[slug].tsx")
)
.await
.unwrap(),
ControlFlow::Continue(Some(PathBuf::from("/root/web_blog"))),
"Should find a preinstalled prettier in the project root"
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/web_blog/node_modules/expect/build/print.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should not allow formatting node_modules/ contents"
);
}
#[gpui::test]
async fn test_prettier_lookup_for_not_installed(cx: &mut gpui::TestAppContext) {
let fs = FakeFs::new(cx.background());
fs.insert_tree(
"/root",
json!({
"work": {
"web_blog": {
"node_modules": {
"expect": {
"build": {
"print.js": "// print.js file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.5.1"
}
}"#,
},
},
"pages": {
"[slug].tsx": "// [slug].tsx file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.3.0"
},
"prettier": {
"semi": false,
"printWidth": 80,
"htmlWhitespaceSensitivity": "strict",
"tabWidth": 4
}
}"#
}
}
}),
)
.await;
match Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/web_blog/pages/[slug].tsx")
)
.await {
Ok(path) => panic!("Expected to fail for prettier in package.json but not in node_modules found, but got path {path:?}"),
Err(e) => {
let message = e.to_string();
assert!(message.contains("/root/work/web_blog"), "Error message should mention which project had prettier defined");
},
};
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::from_iter(
[PathBuf::from("/root"), PathBuf::from("/root/work")].into_iter()
),
Path::new("/root/work/web_blog/pages/[slug].tsx")
)
.await
.unwrap(),
ControlFlow::Continue(Some(PathBuf::from("/root/work"))),
"Should return closest cached value found without path checks"
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/web_blog/node_modules/expect/build/print.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should not allow formatting files inside node_modules/"
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::from_iter(
[PathBuf::from("/root"), PathBuf::from("/root/work")].into_iter()
),
Path::new("/root/work/web_blog/node_modules/expect/build/print.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should ignore cache lookup for files inside node_modules/"
);
}
#[gpui::test]
async fn test_prettier_lookup_in_npm_workspaces(cx: &mut gpui::TestAppContext) {
let fs = FakeFs::new(cx.background());
fs.insert_tree(
"/root",
json!({
"work": {
"full-stack-foundations": {
"exercises": {
"03.loading": {
"01.problem.loader": {
"app": {
"routes": {
"users+": {
"$username_+": {
"notes.tsx": "// notes.tsx file contents",
},
},
},
},
"node_modules": {
"test.js": "// test.js contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "^3.0.3"
}
}"#
},
},
},
"package.json": r#"{
"workspaces": ["exercises/*/*", "examples/*"]
}"#,
"node_modules": {
"prettier": {
"index.js": "// Dummy prettier package file",
},
},
},
}
}),
)
.await;
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/full-stack-foundations/exercises/03.loading/01.problem.loader/app/routes/users+/$username_+/notes.tsx"),
).await.unwrap(),
ControlFlow::Continue(Some(PathBuf::from("/root/work/full-stack-foundations"))),
"Should ascend to the multi-workspace root and find the prettier there",
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/full-stack-foundations/node_modules/prettier/index.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should not allow formatting files inside root node_modules/"
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/full-stack-foundations/exercises/03.loading/01.problem.loader/node_modules/test.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should not allow formatting files inside submodule's node_modules/"
);
}
#[gpui::test]
async fn test_prettier_lookup_in_npm_workspaces_for_not_installed(
cx: &mut gpui::TestAppContext,
) {
let fs = FakeFs::new(cx.background());
fs.insert_tree(
"/root",
json!({
"work": {
"full-stack-foundations": {
"exercises": {
"03.loading": {
"01.problem.loader": {
"app": {
"routes": {
"users+": {
"$username_+": {
"notes.tsx": "// notes.tsx file contents",
},
},
},
},
"node_modules": {},
"package.json": r#"{
"devDependencies": {
"prettier": "^3.0.3"
}
}"#
},
},
},
"package.json": r#"{
"workspaces": ["exercises/*/*", "examples/*"]
}"#,
},
}
}),
)
.await;
match Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/full-stack-foundations/exercises/03.loading/01.problem.loader/app/routes/users+/$username_+/notes.tsx")
)
.await {
Ok(path) => panic!("Expected to fail for prettier in package.json but not in node_modules found, but got path {path:?}"),
Err(e) => {
let message = e.to_string();
assert!(message.contains("/root/work/full-stack-foundations/exercises/03.loading/01.problem.loader"), "Error message should mention which project had prettier defined");
assert!(message.contains("/root/work/full-stack-foundations"), "Error message should mention potential candidates without prettier node_modules contents");
},
};
}
}
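The tests above exercise all three outcomes of `locate_prettier_installation`. As a minimal sketch (not the crate's actual API), this is roughly how a caller interprets the returned `ControlFlow`; `start_default_prettier` and `start_prettier_at` are hypothetical stand-ins for the start-up code that lives in the project crate further down:

use std::{ops::ControlFlow, path::PathBuf};

// Hypothetical helpers standing in for the real start-up code.
async fn start_default_prettier() { /* would spawn prettier from DEFAULT_PRETTIER_DIR */ }
async fn start_prettier_at(_prettier_dir: PathBuf) { /* would spawn prettier_server.js from that directory */ }

// Interpreting the lookup result, mirroring what the tests assert:
async fn handle_lookup_result(result: ControlFlow<(), Option<PathBuf>>) {
    match result {
        // The file lives under node_modules/: never format it with prettier.
        ControlFlow::Break(()) => {}
        // No ancestor package.json declares prettier: fall back to the bundled default.
        ControlFlow::Continue(None) => start_default_prettier().await,
        // A project (or npm workspace root) with prettier installed was found.
        ControlFlow::Continue(Some(prettier_dir)) => start_prettier_at(prettier_dir).await,
    }
}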

View file

@ -1,11 +1,13 @@
const { Buffer } = require('buffer'); const { Buffer } = require("buffer");
const fs = require("fs"); const fs = require("fs");
const path = require("path"); const path = require("path");
const { once } = require('events'); const { once } = require("events");
const prettierContainerPath = process.argv[2]; const prettierContainerPath = process.argv[2];
if (prettierContainerPath == null || prettierContainerPath.length == 0) { if (prettierContainerPath == null || prettierContainerPath.length == 0) {
process.stderr.write(`Prettier path argument was not specified or empty.\nUsage: ${process.argv[0]} ${process.argv[1]} prettier/path\n`); process.stderr.write(
`Prettier path argument was not specified or empty.\nUsage: ${process.argv[0]} ${process.argv[1]} prettier/path\n`,
);
process.exit(1); process.exit(1);
} }
fs.stat(prettierContainerPath, (err, stats) => { fs.stat(prettierContainerPath, (err, stats) => {
@ -19,7 +21,7 @@ fs.stat(prettierContainerPath, (err, stats) => {
process.exit(1); process.exit(1);
} }
}); });
const prettierPath = path.join(prettierContainerPath, 'node_modules/prettier'); const prettierPath = path.join(prettierContainerPath, "node_modules/prettier");
class Prettier { class Prettier {
constructor(path, prettier, config) { constructor(path, prettier, config) {
@ -34,7 +36,7 @@ class Prettier {
let config; let config;
try { try {
prettier = await loadPrettier(prettierPath); prettier = await loadPrettier(prettierPath);
config = await prettier.resolveConfig(prettierPath) || {}; config = (await prettier.resolveConfig(prettierPath)) || {};
} catch (e) { } catch (e) {
process.stderr.write(`Failed to load prettier: ${e}\n`); process.stderr.write(`Failed to load prettier: ${e}\n`);
process.exit(1); process.exit(1);
@ -42,7 +44,7 @@ class Prettier {
process.stderr.write(`Prettier at path '${prettierPath}' loaded successfully, config: ${JSON.stringify(config)}\n`); process.stderr.write(`Prettier at path '${prettierPath}' loaded successfully, config: ${JSON.stringify(config)}\n`);
process.stdin.resume(); process.stdin.resume();
handleBuffer(new Prettier(prettierPath, prettier, config)); handleBuffer(new Prettier(prettierPath, prettier, config));
})() })();
async function handleBuffer(prettier) { async function handleBuffer(prettier) {
for await (const messageText of readStdin()) { for await (const messageText of readStdin()) {
@ -54,25 +56,29 @@ async function handleBuffer(prettier) {
continue; continue;
} }
// allow concurrent request handling by not `await`ing the message handling promise (async function) // allow concurrent request handling by not `await`ing the message handling promise (async function)
handleMessage(message, prettier).catch(e => { handleMessage(message, prettier).catch((e) => {
const errorMessage = message; const errorMessage = message;
if ((errorMessage.params || {}).text !== undefined) { if ((errorMessage.params || {}).text !== undefined) {
errorMessage.params.text = "..snip.."; errorMessage.params.text = "..snip..";
} }
sendResponse({ id: message.id, ...makeError(`error during message '${JSON.stringify(errorMessage)}' handling: ${e}`) }); }); sendResponse({
id: message.id,
...makeError(`error during message '${JSON.stringify(errorMessage)}' handling: ${e}`),
});
});
} }
} }
const headerSeparator = "\r\n"; const headerSeparator = "\r\n";
const contentLengthHeaderName = 'Content-Length'; const contentLengthHeaderName = "Content-Length";
async function* readStdin() { async function* readStdin() {
let buffer = Buffer.alloc(0); let buffer = Buffer.alloc(0);
let streamEnded = false; let streamEnded = false;
process.stdin.on('end', () => { process.stdin.on("end", () => {
streamEnded = true; streamEnded = true;
}); });
process.stdin.on('data', (data) => { process.stdin.on("data", (data) => {
buffer = Buffer.concat([buffer, data]); buffer = Buffer.concat([buffer, data]);
}); });
@ -80,7 +86,7 @@ async function* readStdin() {
sendResponse(makeError(errorMessage)); sendResponse(makeError(errorMessage));
buffer = Buffer.alloc(0); buffer = Buffer.alloc(0);
messageLength = null; messageLength = null;
await once(process.stdin, 'readable'); await once(process.stdin, "readable");
streamEnded = false; streamEnded = false;
} }
@ -91,20 +97,25 @@ async function* readStdin() {
if (messageLength === null) { if (messageLength === null) {
while (buffer.indexOf(`${headerSeparator}${headerSeparator}`) === -1) { while (buffer.indexOf(`${headerSeparator}${headerSeparator}`) === -1) {
if (streamEnded) { if (streamEnded) {
await handleStreamEnded('Unexpected end of stream: headers not found'); await handleStreamEnded("Unexpected end of stream: headers not found");
continue main_loop; continue main_loop;
} else if (buffer.length > contentLengthHeaderName.length * 10) { } else if (buffer.length > contentLengthHeaderName.length * 10) {
await handleStreamEnded(`Unexpected stream of bytes: no headers end found after ${buffer.length} bytes of input`); await handleStreamEnded(
`Unexpected stream of bytes: no headers end found after ${buffer.length} bytes of input`,
);
continue main_loop; continue main_loop;
} }
await once(process.stdin, 'readable'); await once(process.stdin, "readable");
} }
const headers = buffer.subarray(0, buffer.indexOf(`${headerSeparator}${headerSeparator}`)).toString('ascii'); const headers = buffer
const contentLengthHeader = headers.split(headerSeparator) .subarray(0, buffer.indexOf(`${headerSeparator}${headerSeparator}`))
.map(header => header.split(':')) .toString("ascii");
.filter(header => header[2] === undefined) const contentLengthHeader = headers
.filter(header => (header[1] || '').length > 0) .split(headerSeparator)
.find(header => (header[0] || '').trim() === contentLengthHeaderName); .map((header) => header.split(":"))
.filter((header) => header[2] === undefined)
.filter((header) => (header[1] || "").length > 0)
.find((header) => (header[0] || "").trim() === contentLengthHeaderName);
const contentLength = (contentLengthHeader || [])[1]; const contentLength = (contentLengthHeader || [])[1];
if (contentLength === undefined) { if (contentLength === undefined) {
await handleStreamEnded(`Missing or incorrect ${contentLengthHeaderName} header: ${headers}`); await handleStreamEnded(`Missing or incorrect ${contentLengthHeaderName} header: ${headers}`);
@ -114,13 +125,14 @@ async function* readStdin() {
messageLength = parseInt(contentLength, 10); messageLength = parseInt(contentLength, 10);
} }
while (buffer.length < (headersLength + messageLength)) { while (buffer.length < headersLength + messageLength) {
if (streamEnded) { if (streamEnded) {
await handleStreamEnded( await handleStreamEnded(
`Unexpected end of stream: buffer length ${buffer.length} does not match expected header length ${headersLength} + body length ${messageLength}`); `Unexpected end of stream: buffer length ${buffer.length} does not match expected header length ${headersLength} + body length ${messageLength}`,
);
continue main_loop; continue main_loop;
} }
await once(process.stdin, 'readable'); await once(process.stdin, "readable");
} }
const messageEnd = headersLength + messageLength; const messageEnd = headersLength + messageLength;
@ -128,12 +140,12 @@ async function* readStdin() {
buffer = buffer.subarray(messageEnd); buffer = buffer.subarray(messageEnd);
headersLength = null; headersLength = null;
messageLength = null; messageLength = null;
yield message.toString('utf8'); yield message.toString("utf8");
} }
} catch (e) { } catch (e) {
sendResponse(makeError(`Error reading stdin: ${e}`)); sendResponse(makeError(`Error reading stdin: ${e}`));
} finally { } finally {
process.stdin.off('data', () => { }); process.stdin.off("data", () => {});
} }
} }
@ -146,7 +158,7 @@ async function handleMessage(message, prettier) {
throw new Error(`Message id is undefined: ${JSON.stringify(message)}`); throw new Error(`Message id is undefined: ${JSON.stringify(message)}`);
} }
if (method === 'prettier/format') { if (method === "prettier/format") {
if (params === undefined || params.text === undefined) { if (params === undefined || params.text === undefined) {
throw new Error(`Message params.text is undefined: ${JSON.stringify(message)}`); throw new Error(`Message params.text is undefined: ${JSON.stringify(message)}`);
} }
@ -156,7 +168,7 @@ async function handleMessage(message, prettier) {
let resolvedConfig = {}; let resolvedConfig = {};
if (params.options.filepath !== undefined) { if (params.options.filepath !== undefined) {
resolvedConfig = await prettier.prettier.resolveConfig(params.options.filepath) || {}; resolvedConfig = (await prettier.prettier.resolveConfig(params.options.filepath)) || {};
} }
const options = { const options = {
@ -164,21 +176,25 @@ async function handleMessage(message, prettier) {
...resolvedConfig, ...resolvedConfig,
parser: params.options.parser, parser: params.options.parser,
plugins: params.options.plugins, plugins: params.options.plugins,
path: params.options.filepath path: params.options.filepath,
}; };
process.stderr.write(`Resolved config: ${JSON.stringify(resolvedConfig)}, will format file '${params.options.filepath || ''}' with options: ${JSON.stringify(options)}\n`); process.stderr.write(
`Resolved config: ${JSON.stringify(resolvedConfig)}, will format file '${
params.options.filepath || ""
}' with options: ${JSON.stringify(options)}\n`,
);
const formattedText = await prettier.prettier.format(params.text, options); const formattedText = await prettier.prettier.format(params.text, options);
sendResponse({ id, result: { text: formattedText } }); sendResponse({ id, result: { text: formattedText } });
} else if (method === 'prettier/clear_cache') { } else if (method === "prettier/clear_cache") {
prettier.prettier.clearConfigCache(); prettier.prettier.clearConfigCache();
prettier.config = await prettier.prettier.resolveConfig(prettier.path) || {}; prettier.config = (await prettier.prettier.resolveConfig(prettier.path)) || {};
sendResponse({ id, result: null }); sendResponse({ id, result: null });
} else if (method === 'initialize') { } else if (method === "initialize") {
sendResponse({ sendResponse({
id: id || 0, id,
result: { result: {
"capabilities": {} capabilities: {},
} },
}); });
} else { } else {
throw new Error(`Unknown method: ${method}`); throw new Error(`Unknown method: ${method}`);
@ -188,18 +204,20 @@ async function handleMessage(message, prettier) {
function makeError(message) { function makeError(message) {
return { return {
error: { error: {
"code": -32600, // invalid request code code: -32600, // invalid request code
message, message,
} },
}; };
} }
function sendResponse(response) { function sendResponse(response) {
const responsePayloadString = JSON.stringify({ const responsePayloadString = JSON.stringify({
jsonrpc: "2.0", jsonrpc: "2.0",
...response ...response,
}); });
const headers = `${contentLengthHeaderName}: ${Buffer.byteLength(responsePayloadString)}${headerSeparator}${headerSeparator}`; const headers = `${contentLengthHeaderName}: ${Buffer.byteLength(
responsePayloadString,
)}${headerSeparator}${headerSeparator}`;
process.stdout.write(headers + responsePayloadString); process.stdout.write(headers + responsePayloadString);
} }
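For context on the hunks above: prettier_server.js frames its stdio traffic the way a language server does, with a `Content-Length: <byte count>` header, a `\r\n\r\n` separator, and a JSON-RPC 2.0 body whose method is `prettier/format`, `prettier/clear_cache`, or `initialize`. Below is a minimal sketch of producing such a frame for a format request, assuming `serde_json`; the `typescript` parser value and the helper name are illustrative, not taken from the crate:

use serde_json::json;

// Frame a `prettier/format` request the way prettier_server.js reads it from stdin:
// `Content-Length: N\r\n\r\n` followed by the JSON body. The params mirror what the
// script reads: `params.text`, `params.options.filepath`, `params.options.parser`,
// and `params.options.plugins`.
fn frame_format_request(id: u64, text: &str, filepath: &str) -> String {
    let body = json!({
        "jsonrpc": "2.0",
        "id": id,
        "method": "prettier/format",
        "params": {
            "text": text,
            "options": {
                "filepath": filepath,
                "parser": "typescript", // illustrative; the real caller picks this per language
                "plugins": [],
            }
        }
    })
    .to_string();
    format!("Content-Length: {}\r\n\r\n{}", body.len(), body)
}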

View file

@ -1,5 +1,5 @@
use anyhow::Context; use anyhow::Context;
use collections::HashMap; use collections::{HashMap, HashSet};
use fs2::Fs; use fs2::Fs;
use gpui2::{AsyncAppContext, Model}; use gpui2::{AsyncAppContext, Model};
use language2::{language_settings::language_settings, Buffer, Diff}; use language2::{language_settings::language_settings, Buffer, Diff};
@ -7,11 +7,11 @@ use lsp2::{LanguageServer, LanguageServerId};
use node_runtime::NodeRuntime; use node_runtime::NodeRuntime;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use std::{ use std::{
collections::VecDeque, ops::ControlFlow,
path::{Path, PathBuf}, path::{Path, PathBuf},
sync::Arc, sync::Arc,
}; };
use util::paths::DEFAULT_PRETTIER_DIR; use util::paths::{PathMatcher, DEFAULT_PRETTIER_DIR};
pub enum Prettier { pub enum Prettier {
Real(RealPrettier), Real(RealPrettier),
@ -20,7 +20,6 @@ pub enum Prettier {
} }
pub struct RealPrettier { pub struct RealPrettier {
worktree_id: Option<usize>,
default: bool, default: bool,
prettier_dir: PathBuf, prettier_dir: PathBuf,
server: Arc<LanguageServer>, server: Arc<LanguageServer>,
@ -28,17 +27,10 @@ pub struct RealPrettier {
#[cfg(any(test, feature = "test-support"))] #[cfg(any(test, feature = "test-support"))]
pub struct TestPrettier { pub struct TestPrettier {
worktree_id: Option<usize>,
prettier_dir: PathBuf, prettier_dir: PathBuf,
default: bool, default: bool,
} }
#[derive(Debug)]
pub struct LocateStart {
pub worktree_root_path: Arc<Path>,
pub starting_path: Arc<Path>,
}
pub const PRETTIER_SERVER_FILE: &str = "prettier_server.js"; pub const PRETTIER_SERVER_FILE: &str = "prettier_server.js";
pub const PRETTIER_SERVER_JS: &str = include_str!("./prettier_server.js"); pub const PRETTIER_SERVER_JS: &str = include_str!("./prettier_server.js");
const PRETTIER_PACKAGE_NAME: &str = "prettier"; const PRETTIER_PACKAGE_NAME: &str = "prettier";
@ -63,79 +55,112 @@ impl Prettier {
".editorconfig", ".editorconfig",
]; ];
pub async fn locate( pub async fn locate_prettier_installation(
starting_path: Option<LocateStart>, fs: &dyn Fs,
fs: Arc<dyn Fs>, installed_prettiers: &HashSet<PathBuf>,
) -> anyhow::Result<PathBuf> { locate_from: &Path,
fn is_node_modules(path_component: &std::path::Component<'_>) -> bool { ) -> anyhow::Result<ControlFlow<(), Option<PathBuf>>> {
path_component.as_os_str().to_string_lossy() == "node_modules" let mut path_to_check = locate_from
.components()
.take_while(|component| component.as_os_str().to_string_lossy() != "node_modules")
.collect::<PathBuf>();
if path_to_check != locate_from {
log::debug!(
"Skipping prettier location for path {path_to_check:?} that is inside node_modules"
);
return Ok(ControlFlow::Break(()));
}
let path_to_check_metadata = fs
.metadata(&path_to_check)
.await
.with_context(|| format!("failed to get metadata for initial path {path_to_check:?}"))?
.with_context(|| format!("empty metadata for initial path {path_to_check:?}"))?;
if !path_to_check_metadata.is_dir {
path_to_check.pop();
} }
let paths_to_check = match starting_path.as_ref() { let mut project_path_with_prettier_dependency = None;
Some(starting_path) => { loop {
let worktree_root = starting_path if installed_prettiers.contains(&path_to_check) {
.worktree_root_path log::debug!("Found prettier path {path_to_check:?} in installed prettiers");
.components() return Ok(ControlFlow::Continue(Some(path_to_check)));
.into_iter() } else if let Some(package_json_contents) =
.take_while(|path_component| !is_node_modules(path_component)) read_package_json(fs, &path_to_check).await?
.collect::<PathBuf>(); {
if worktree_root != starting_path.worktree_root_path.as_ref() { if has_prettier_in_package_json(&package_json_contents) {
vec![worktree_root] if has_prettier_in_node_modules(fs, &path_to_check).await? {
log::debug!("Found prettier path {path_to_check:?} in both package.json and node_modules");
return Ok(ControlFlow::Continue(Some(path_to_check)));
} else if project_path_with_prettier_dependency.is_none() {
project_path_with_prettier_dependency = Some(path_to_check.clone());
}
} else { } else {
if starting_path.starting_path.as_ref() == Path::new("") { match package_json_contents.get("workspaces") {
worktree_root Some(serde_json::Value::Array(workspaces)) => {
.parent() match &project_path_with_prettier_dependency {
.map(|path| vec![path.to_path_buf()]) Some(project_path_with_prettier_dependency) => {
.unwrap_or_default() let subproject_path = project_path_with_prettier_dependency.strip_prefix(&path_to_check).expect("traversing path parents, should be able to strip prefix");
} else { if workspaces.iter().filter_map(|value| {
let file_to_format = starting_path.starting_path.as_ref(); if let serde_json::Value::String(s) = value {
let mut paths_to_check = VecDeque::new(); Some(s.clone())
let mut current_path = worktree_root; } else {
for path_component in file_to_format.components().into_iter() { log::warn!("Skipping non-string 'workspaces' value: {value:?}");
let new_path = current_path.join(path_component); None
let old_path = std::mem::replace(&mut current_path, new_path); }
paths_to_check.push_front(old_path); }).any(|workspace_definition| {
if is_node_modules(&path_component) { if let Some(path_matcher) = PathMatcher::new(&workspace_definition).ok() {
break; path_matcher.is_match(subproject_path)
} } else {
workspace_definition == subproject_path.to_string_lossy()
}
}) {
anyhow::ensure!(has_prettier_in_node_modules(fs, &path_to_check).await?, "Found prettier path {path_to_check:?} in the workspace root for project in {project_path_with_prettier_dependency:?}, but it's not installed into workspace root's node_modules");
log::info!("Found prettier path {path_to_check:?} in the workspace root for project in {project_path_with_prettier_dependency:?}");
return Ok(ControlFlow::Continue(Some(path_to_check)));
} else {
log::warn!("Skipping path {path_to_check:?} that has prettier in its 'node_modules' subdirectory, but is not included in its package.json workspaces {workspaces:?}");
}
}
None => {
log::warn!("Skipping path {path_to_check:?} that has prettier in its 'node_modules' subdirectory, but has no prettier in its package.json");
}
}
},
Some(unknown) => log::error!("Failed to parse workspaces for {path_to_check:?} from package.json, got {unknown:?}. Skipping."),
None => log::warn!("Skipping path {path_to_check:?} that has no prettier dependency and no workspaces section in its package.json"),
} }
Vec::from(paths_to_check) }
}
if !path_to_check.pop() {
match project_path_with_prettier_dependency {
Some(closest_prettier_discovered) => {
anyhow::bail!("No prettier found in node_modules for ancestors of {locate_from:?}, but discovered prettier package.json dependency in {closest_prettier_discovered:?}")
}
None => {
log::debug!("Found no prettier in ancestors of {locate_from:?}");
return Ok(ControlFlow::Continue(None));
} }
} }
} }
None => Vec::new(),
};
match find_closest_prettier_dir(paths_to_check, fs.as_ref())
.await
.with_context(|| format!("finding prettier starting with {starting_path:?}"))?
{
Some(prettier_dir) => Ok(prettier_dir),
None => Ok(DEFAULT_PRETTIER_DIR.to_path_buf()),
} }
} }
#[cfg(any(test, feature = "test-support"))] #[cfg(any(test, feature = "test-support"))]
pub async fn start( pub async fn start(
worktree_id: Option<usize>,
_: LanguageServerId, _: LanguageServerId,
prettier_dir: PathBuf, prettier_dir: PathBuf,
_: Arc<dyn NodeRuntime>, _: Arc<dyn NodeRuntime>,
_: AsyncAppContext, _: AsyncAppContext,
) -> anyhow::Result<Self> { ) -> anyhow::Result<Self> {
Ok( Ok(Self::Test(TestPrettier {
#[cfg(any(test, feature = "test-support"))] default: prettier_dir == DEFAULT_PRETTIER_DIR.as_path(),
Self::Test(TestPrettier { prettier_dir,
worktree_id, }))
default: prettier_dir == DEFAULT_PRETTIER_DIR.as_path(),
prettier_dir,
}),
)
} }
#[cfg(not(any(test, feature = "test-support")))] #[cfg(not(any(test, feature = "test-support")))]
pub async fn start( pub async fn start(
worktree_id: Option<usize>,
server_id: LanguageServerId, server_id: LanguageServerId,
prettier_dir: PathBuf, prettier_dir: PathBuf,
node: Arc<dyn NodeRuntime>, node: Arc<dyn NodeRuntime>,
@ -174,7 +199,6 @@ impl Prettier {
.await .await
.context("prettier server initialization")?; .context("prettier server initialization")?;
Ok(Self::Real(RealPrettier { Ok(Self::Real(RealPrettier {
worktree_id,
server, server,
default: prettier_dir == DEFAULT_PRETTIER_DIR.as_path(), default: prettier_dir == DEFAULT_PRETTIER_DIR.as_path(),
prettier_dir, prettier_dir,
@ -370,64 +394,61 @@ impl Prettier {
Self::Test(test_prettier) => &test_prettier.prettier_dir, Self::Test(test_prettier) => &test_prettier.prettier_dir,
} }
} }
pub fn worktree_id(&self) -> Option<usize> {
match self {
Self::Real(local) => local.worktree_id,
#[cfg(any(test, feature = "test-support"))]
Self::Test(test_prettier) => test_prettier.worktree_id,
}
}
} }
async fn find_closest_prettier_dir( async fn has_prettier_in_node_modules(fs: &dyn Fs, path: &Path) -> anyhow::Result<bool> {
paths_to_check: Vec<PathBuf>, let possible_node_modules_location = path.join("node_modules").join(PRETTIER_PACKAGE_NAME);
fs: &dyn Fs, if let Some(node_modules_location_metadata) = fs
) -> anyhow::Result<Option<PathBuf>> { .metadata(&possible_node_modules_location)
for path in paths_to_check { .await
let possible_package_json = path.join("package.json"); .with_context(|| format!("fetching metadata for {possible_node_modules_location:?}"))?
if let Some(package_json_metadata) = fs {
.metadata(&possible_package_json) return Ok(node_modules_location_metadata.is_dir);
.await }
.with_context(|| format!("Fetching metadata for {possible_package_json:?}"))? Ok(false)
{ }
if !package_json_metadata.is_dir && !package_json_metadata.is_symlink {
let package_json_contents = fs
.load(&possible_package_json)
.await
.with_context(|| format!("reading {possible_package_json:?} file contents"))?;
if let Ok(json_contents) = serde_json::from_str::<HashMap<String, serde_json::Value>>(
&package_json_contents,
) {
if let Some(serde_json::Value::Object(o)) = json_contents.get("dependencies") {
if o.contains_key(PRETTIER_PACKAGE_NAME) {
return Ok(Some(path));
}
}
if let Some(serde_json::Value::Object(o)) = json_contents.get("devDependencies")
{
if o.contains_key(PRETTIER_PACKAGE_NAME) {
return Ok(Some(path));
}
}
}
}
}
let possible_node_modules_location = path.join("node_modules").join(PRETTIER_PACKAGE_NAME); async fn read_package_json(
if let Some(node_modules_location_metadata) = fs fs: &dyn Fs,
.metadata(&possible_node_modules_location) path: &Path,
.await ) -> anyhow::Result<Option<HashMap<String, serde_json::Value>>> {
.with_context(|| format!("fetching metadata for {possible_node_modules_location:?}"))? let possible_package_json = path.join("package.json");
{ if let Some(package_json_metadata) = fs
if node_modules_location_metadata.is_dir { .metadata(&possible_package_json)
return Ok(Some(path)); .await
} .with_context(|| format!("fetching metadata for package json {possible_package_json:?}"))?
{
if !package_json_metadata.is_dir && !package_json_metadata.is_symlink {
let package_json_contents = fs
.load(&possible_package_json)
.await
.with_context(|| format!("reading {possible_package_json:?} file contents"))?;
return serde_json::from_str::<HashMap<String, serde_json::Value>>(
&package_json_contents,
)
.map(Some)
.with_context(|| format!("parsing {possible_package_json:?} file contents"));
} }
} }
Ok(None) Ok(None)
} }
fn has_prettier_in_package_json(
package_json_contents: &HashMap<String, serde_json::Value>,
) -> bool {
if let Some(serde_json::Value::Object(o)) = package_json_contents.get("dependencies") {
if o.contains_key(PRETTIER_PACKAGE_NAME) {
return true;
}
}
if let Some(serde_json::Value::Object(o)) = package_json_contents.get("devDependencies") {
if o.contains_key(PRETTIER_PACKAGE_NAME) {
return true;
}
}
false
}
enum Format {} enum Format {}
#[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)] #[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)]
@ -466,3 +487,379 @@ impl lsp2::request::Request for ClearCache {
type Result = (); type Result = ();
const METHOD: &'static str = "prettier/clear_cache"; const METHOD: &'static str = "prettier/clear_cache";
} }
#[cfg(test)]
mod tests {
use fs2::FakeFs;
use serde_json::json;
use super::*;
#[gpui2::test]
async fn test_prettier_lookup_finds_nothing(cx: &mut gpui2::TestAppContext) {
let fs = FakeFs::new(cx.executor().clone());
fs.insert_tree(
"/root",
json!({
".config": {
"zed": {
"settings.json": r#"{ "formatter": "auto" }"#,
},
},
"work": {
"project": {
"src": {
"index.js": "// index.js file contents",
},
"node_modules": {
"expect": {
"build": {
"print.js": "// print.js file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.5.1"
}
}"#,
},
"prettier": {
"index.js": "// Dummy prettier package file",
},
},
"package.json": r#"{}"#
},
}
}),
)
.await;
assert!(
matches!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/.config/zed/settings.json"),
)
.await,
Ok(ControlFlow::Continue(None))
),
"Should successfully find no prettier for path hierarchy without it"
);
assert!(
matches!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/project/src/index.js")
)
.await,
Ok(ControlFlow::Continue(None))
),
"Should successfully find no prettier for path hierarchy that has node_modules with prettier, but no package.json mentions of it"
);
assert!(
matches!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/project/node_modules/expect/build/print.js")
)
.await,
Ok(ControlFlow::Break(()))
),
"Should not format files inside node_modules/"
);
}
#[gpui2::test]
async fn test_prettier_lookup_in_simple_npm_projects(cx: &mut gpui2::TestAppContext) {
let fs = FakeFs::new(cx.executor().clone());
fs.insert_tree(
"/root",
json!({
"web_blog": {
"node_modules": {
"prettier": {
"index.js": "// Dummy prettier package file",
},
"expect": {
"build": {
"print.js": "// print.js file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.5.1"
}
}"#,
},
},
"pages": {
"[slug].tsx": "// [slug].tsx file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.3.0"
},
"prettier": {
"semi": false,
"printWidth": 80,
"htmlWhitespaceSensitivity": "strict",
"tabWidth": 4
}
}"#
}
}),
)
.await;
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/web_blog/pages/[slug].tsx")
)
.await
.unwrap(),
ControlFlow::Continue(Some(PathBuf::from("/root/web_blog"))),
"Should find a preinstalled prettier in the project root"
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/web_blog/node_modules/expect/build/print.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should not allow formatting node_modules/ contents"
);
}
#[gpui2::test]
async fn test_prettier_lookup_for_not_installed(cx: &mut gpui2::TestAppContext) {
let fs = FakeFs::new(cx.executor().clone());
fs.insert_tree(
"/root",
json!({
"work": {
"web_blog": {
"node_modules": {
"expect": {
"build": {
"print.js": "// print.js file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.5.1"
}
}"#,
},
},
"pages": {
"[slug].tsx": "// [slug].tsx file contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "2.3.0"
},
"prettier": {
"semi": false,
"printWidth": 80,
"htmlWhitespaceSensitivity": "strict",
"tabWidth": 4
}
}"#
}
}
}),
)
.await;
match Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/web_blog/pages/[slug].tsx")
)
.await {
Ok(path) => panic!("Expected to fail for prettier in package.json but not in node_modules found, but got path {path:?}"),
Err(e) => {
let message = e.to_string();
assert!(message.contains("/root/work/web_blog"), "Error message should mention which project had prettier defined");
},
};
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::from_iter(
[PathBuf::from("/root"), PathBuf::from("/root/work")].into_iter()
),
Path::new("/root/work/web_blog/pages/[slug].tsx")
)
.await
.unwrap(),
ControlFlow::Continue(Some(PathBuf::from("/root/work"))),
"Should return closest cached value found without path checks"
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/web_blog/node_modules/expect/build/print.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should not allow formatting files inside node_modules/"
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::from_iter(
[PathBuf::from("/root"), PathBuf::from("/root/work")].into_iter()
),
Path::new("/root/work/web_blog/node_modules/expect/build/print.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should ignore cache lookup for files inside node_modules/"
);
}
#[gpui2::test]
async fn test_prettier_lookup_in_npm_workspaces(cx: &mut gpui2::TestAppContext) {
let fs = FakeFs::new(cx.executor().clone());
fs.insert_tree(
"/root",
json!({
"work": {
"full-stack-foundations": {
"exercises": {
"03.loading": {
"01.problem.loader": {
"app": {
"routes": {
"users+": {
"$username_+": {
"notes.tsx": "// notes.tsx file contents",
},
},
},
},
"node_modules": {
"test.js": "// test.js contents",
},
"package.json": r#"{
"devDependencies": {
"prettier": "^3.0.3"
}
}"#
},
},
},
"package.json": r#"{
"workspaces": ["exercises/*/*", "examples/*"]
}"#,
"node_modules": {
"prettier": {
"index.js": "// Dummy prettier package file",
},
},
},
}
}),
)
.await;
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/full-stack-foundations/exercises/03.loading/01.problem.loader/app/routes/users+/$username_+/notes.tsx"),
).await.unwrap(),
ControlFlow::Continue(Some(PathBuf::from("/root/work/full-stack-foundations"))),
"Should ascend to the multi-workspace root and find the prettier there",
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/full-stack-foundations/node_modules/prettier/index.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should not allow formatting files inside root node_modules/"
);
assert_eq!(
Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/full-stack-foundations/exercises/03.loading/01.problem.loader/node_modules/test.js")
)
.await
.unwrap(),
ControlFlow::Break(()),
"Should not allow formatting files inside submodule's node_modules/"
);
}
#[gpui2::test]
async fn test_prettier_lookup_in_npm_workspaces_for_not_installed(
cx: &mut gpui2::TestAppContext,
) {
let fs = FakeFs::new(cx.executor().clone());
fs.insert_tree(
"/root",
json!({
"work": {
"full-stack-foundations": {
"exercises": {
"03.loading": {
"01.problem.loader": {
"app": {
"routes": {
"users+": {
"$username_+": {
"notes.tsx": "// notes.tsx file contents",
},
},
},
},
"node_modules": {},
"package.json": r#"{
"devDependencies": {
"prettier": "^3.0.3"
}
}"#
},
},
},
"package.json": r#"{
"workspaces": ["exercises/*/*", "examples/*"]
}"#,
},
}
}),
)
.await;
match Prettier::locate_prettier_installation(
fs.as_ref(),
&HashSet::default(),
Path::new("/root/work/full-stack-foundations/exercises/03.loading/01.problem.loader/app/routes/users+/$username_+/notes.tsx")
)
.await {
Ok(path) => panic!("Expected to fail for prettier in package.json but not in node_modules found, but got path {path:?}"),
Err(e) => {
let message = e.to_string();
assert!(message.contains("/root/work/full-stack-foundations/exercises/03.loading/01.problem.loader"), "Error message should mention which project had prettier defined");
assert!(message.contains("/root/work/full-stack-foundations"), "Error message should mention potential candidates without prettier node_modules contents");
},
};
}
}
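The npm-workspace tests above hinge on one step of the new lookup: a project that declares prettier in its package.json but has no local installation is resolved to an ancestor directory only if that ancestor's `workspaces` globs cover the project (and, per the `ensure!` above, prettier is installed in the ancestor's node_modules). A minimal sketch of just the glob check, reusing `util::paths::PathMatcher` as imported above; the function name and signature are illustrative and the node_modules check is omitted:

use std::path::Path;
use util::paths::PathMatcher;

// Does any `workspaces` glob from the candidate root's package.json cover the
// sub-project that declared the prettier dependency?
fn workspace_covers(
    workspace_root: &Path,
    project_with_prettier_dep: &Path,
    workspace_globs: &[String],
) -> bool {
    let Ok(subproject) = project_with_prettier_dep.strip_prefix(workspace_root) else {
        return false;
    };
    workspace_globs.iter().any(|glob| match PathMatcher::new(glob) {
        // Glob match, e.g. "exercises/*/*" covers "exercises/03.loading/01.problem.loader".
        Ok(matcher) => matcher.is_match(subproject),
        // Fall back to a literal comparison when the glob fails to parse, as the code above does.
        Err(_) => glob.as_str() == subproject.to_string_lossy(),
    })
}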

View file

@ -1,11 +1,13 @@
const { Buffer } = require('buffer'); const { Buffer } = require("buffer");
const fs = require("fs"); const fs = require("fs");
const path = require("path"); const path = require("path");
const { once } = require('events'); const { once } = require("events");
const prettierContainerPath = process.argv[2]; const prettierContainerPath = process.argv[2];
if (prettierContainerPath == null || prettierContainerPath.length == 0) { if (prettierContainerPath == null || prettierContainerPath.length == 0) {
process.stderr.write(`Prettier path argument was not specified or empty.\nUsage: ${process.argv[0]} ${process.argv[1]} prettier/path\n`); process.stderr.write(
`Prettier path argument was not specified or empty.\nUsage: ${process.argv[0]} ${process.argv[1]} prettier/path\n`,
);
process.exit(1); process.exit(1);
} }
fs.stat(prettierContainerPath, (err, stats) => { fs.stat(prettierContainerPath, (err, stats) => {
@ -19,7 +21,7 @@ fs.stat(prettierContainerPath, (err, stats) => {
process.exit(1); process.exit(1);
} }
}); });
const prettierPath = path.join(prettierContainerPath, 'node_modules/prettier'); const prettierPath = path.join(prettierContainerPath, "node_modules/prettier");
class Prettier { class Prettier {
constructor(path, prettier, config) { constructor(path, prettier, config) {
@ -34,7 +36,7 @@ class Prettier {
let config; let config;
try { try {
prettier = await loadPrettier(prettierPath); prettier = await loadPrettier(prettierPath);
config = await prettier.resolveConfig(prettierPath) || {}; config = (await prettier.resolveConfig(prettierPath)) || {};
} catch (e) { } catch (e) {
process.stderr.write(`Failed to load prettier: ${e}\n`); process.stderr.write(`Failed to load prettier: ${e}\n`);
process.exit(1); process.exit(1);
@ -42,7 +44,7 @@ class Prettier {
process.stderr.write(`Prettier at path '${prettierPath}' loaded successfully, config: ${JSON.stringify(config)}\n`); process.stderr.write(`Prettier at path '${prettierPath}' loaded successfully, config: ${JSON.stringify(config)}\n`);
process.stdin.resume(); process.stdin.resume();
handleBuffer(new Prettier(prettierPath, prettier, config)); handleBuffer(new Prettier(prettierPath, prettier, config));
})() })();
async function handleBuffer(prettier) { async function handleBuffer(prettier) {
for await (const messageText of readStdin()) { for await (const messageText of readStdin()) {
@ -54,22 +56,29 @@ async function handleBuffer(prettier) {
continue; continue;
} }
// allow concurrent request handling by not `await`ing the message handling promise (async function) // allow concurrent request handling by not `await`ing the message handling promise (async function)
handleMessage(message, prettier).catch(e => { handleMessage(message, prettier).catch((e) => {
sendResponse({ id: message.id, ...makeError(`error during message handling: ${e}`) }); const errorMessage = message;
if ((errorMessage.params || {}).text !== undefined) {
errorMessage.params.text = "..snip..";
}
sendResponse({
id: message.id,
...makeError(`error during message '${JSON.stringify(errorMessage)}' handling: ${e}`),
});
}); });
} }
} }
const headerSeparator = "\r\n"; const headerSeparator = "\r\n";
const contentLengthHeaderName = 'Content-Length'; const contentLengthHeaderName = "Content-Length";
async function* readStdin() { async function* readStdin() {
let buffer = Buffer.alloc(0); let buffer = Buffer.alloc(0);
let streamEnded = false; let streamEnded = false;
process.stdin.on('end', () => { process.stdin.on("end", () => {
streamEnded = true; streamEnded = true;
}); });
process.stdin.on('data', (data) => { process.stdin.on("data", (data) => {
buffer = Buffer.concat([buffer, data]); buffer = Buffer.concat([buffer, data]);
}); });
@ -77,7 +86,7 @@ async function* readStdin() {
sendResponse(makeError(errorMessage)); sendResponse(makeError(errorMessage));
buffer = Buffer.alloc(0); buffer = Buffer.alloc(0);
messageLength = null; messageLength = null;
await once(process.stdin, 'readable'); await once(process.stdin, "readable");
streamEnded = false; streamEnded = false;
} }
@ -88,20 +97,25 @@ async function* readStdin() {
if (messageLength === null) { if (messageLength === null) {
while (buffer.indexOf(`${headerSeparator}${headerSeparator}`) === -1) { while (buffer.indexOf(`${headerSeparator}${headerSeparator}`) === -1) {
if (streamEnded) { if (streamEnded) {
await handleStreamEnded('Unexpected end of stream: headers not found'); await handleStreamEnded("Unexpected end of stream: headers not found");
continue main_loop; continue main_loop;
} else if (buffer.length > contentLengthHeaderName.length * 10) { } else if (buffer.length > contentLengthHeaderName.length * 10) {
await handleStreamEnded(`Unexpected stream of bytes: no headers end found after ${buffer.length} bytes of input`); await handleStreamEnded(
`Unexpected stream of bytes: no headers end found after ${buffer.length} bytes of input`,
);
continue main_loop; continue main_loop;
} }
await once(process.stdin, 'readable'); await once(process.stdin, "readable");
} }
const headers = buffer.subarray(0, buffer.indexOf(`${headerSeparator}${headerSeparator}`)).toString('ascii'); const headers = buffer
const contentLengthHeader = headers.split(headerSeparator) .subarray(0, buffer.indexOf(`${headerSeparator}${headerSeparator}`))
.map(header => header.split(':')) .toString("ascii");
.filter(header => header[2] === undefined) const contentLengthHeader = headers
.filter(header => (header[1] || '').length > 0) .split(headerSeparator)
.find(header => (header[0] || '').trim() === contentLengthHeaderName); .map((header) => header.split(":"))
.filter((header) => header[2] === undefined)
.filter((header) => (header[1] || "").length > 0)
.find((header) => (header[0] || "").trim() === contentLengthHeaderName);
const contentLength = (contentLengthHeader || [])[1]; const contentLength = (contentLengthHeader || [])[1];
if (contentLength === undefined) { if (contentLength === undefined) {
await handleStreamEnded(`Missing or incorrect ${contentLengthHeaderName} header: ${headers}`); await handleStreamEnded(`Missing or incorrect ${contentLengthHeaderName} header: ${headers}`);
@ -111,13 +125,14 @@ async function* readStdin() {
messageLength = parseInt(contentLength, 10); messageLength = parseInt(contentLength, 10);
} }
while (buffer.length < (headersLength + messageLength)) { while (buffer.length < headersLength + messageLength) {
if (streamEnded) { if (streamEnded) {
await handleStreamEnded( await handleStreamEnded(
`Unexpected end of stream: buffer length ${buffer.length} does not match expected header length ${headersLength} + body length ${messageLength}`); `Unexpected end of stream: buffer length ${buffer.length} does not match expected header length ${headersLength} + body length ${messageLength}`,
);
continue main_loop; continue main_loop;
} }
await once(process.stdin, 'readable'); await once(process.stdin, "readable");
} }
const messageEnd = headersLength + messageLength; const messageEnd = headersLength + messageLength;
@ -125,12 +140,12 @@ async function* readStdin() {
buffer = buffer.subarray(messageEnd); buffer = buffer.subarray(messageEnd);
headersLength = null; headersLength = null;
messageLength = null; messageLength = null;
yield message.toString('utf8'); yield message.toString("utf8");
} }
} catch (e) { } catch (e) {
sendResponse(makeError(`Error reading stdin: ${e}`)); sendResponse(makeError(`Error reading stdin: ${e}`));
} finally { } finally {
process.stdin.off('data', () => { }); process.stdin.off("data", () => {});
} }
} }
@ -143,7 +158,7 @@ async function handleMessage(message, prettier) {
throw new Error(`Message id is undefined: ${JSON.stringify(message)}`); throw new Error(`Message id is undefined: ${JSON.stringify(message)}`);
} }
if (method === 'prettier/format') { if (method === "prettier/format") {
if (params === undefined || params.text === undefined) { if (params === undefined || params.text === undefined) {
throw new Error(`Message params.text is undefined: ${JSON.stringify(message)}`); throw new Error(`Message params.text is undefined: ${JSON.stringify(message)}`);
} }
@ -153,7 +168,7 @@ async function handleMessage(message, prettier) {
let resolvedConfig = {}; let resolvedConfig = {};
if (params.options.filepath !== undefined) { if (params.options.filepath !== undefined) {
resolvedConfig = await prettier.prettier.resolveConfig(params.options.filepath) || {}; resolvedConfig = (await prettier.prettier.resolveConfig(params.options.filepath)) || {};
} }
const options = { const options = {
@ -161,21 +176,25 @@ async function handleMessage(message, prettier) {
...resolvedConfig, ...resolvedConfig,
parser: params.options.parser, parser: params.options.parser,
plugins: params.options.plugins, plugins: params.options.plugins,
path: params.options.filepath path: params.options.filepath,
}; };
process.stderr.write(`Resolved config: ${JSON.stringify(resolvedConfig)}, will format file '${params.options.filepath || ''}' with options: ${JSON.stringify(options)}\n`); process.stderr.write(
`Resolved config: ${JSON.stringify(resolvedConfig)}, will format file '${
params.options.filepath || ""
}' with options: ${JSON.stringify(options)}\n`,
);
const formattedText = await prettier.prettier.format(params.text, options); const formattedText = await prettier.prettier.format(params.text, options);
sendResponse({ id, result: { text: formattedText } }); sendResponse({ id, result: { text: formattedText } });
} else if (method === 'prettier/clear_cache') { } else if (method === "prettier/clear_cache") {
prettier.prettier.clearConfigCache(); prettier.prettier.clearConfigCache();
prettier.config = await prettier.prettier.resolveConfig(prettier.path) || {}; prettier.config = (await prettier.prettier.resolveConfig(prettier.path)) || {};
sendResponse({ id, result: null }); sendResponse({ id, result: null });
} else if (method === 'initialize') { } else if (method === "initialize") {
sendResponse({ sendResponse({
id, id,
result: { result: {
"capabilities": {} capabilities: {},
} },
}); });
} else { } else {
throw new Error(`Unknown method: ${method}`); throw new Error(`Unknown method: ${method}`);
@ -185,18 +204,20 @@ async function handleMessage(message, prettier) {
function makeError(message) { function makeError(message) {
return { return {
error: { error: {
"code": -32600, // invalid request code code: -32600, // invalid request code
message, message,
} },
}; };
} }
function sendResponse(response) { function sendResponse(response) {
const responsePayloadString = JSON.stringify({ const responsePayloadString = JSON.stringify({
jsonrpc: "2.0", jsonrpc: "2.0",
...response ...response,
}); });
const headers = `${contentLengthHeaderName}: ${Buffer.byteLength(responsePayloadString)}${headerSeparator}${headerSeparator}`; const headers = `${contentLengthHeaderName}: ${Buffer.byteLength(
responsePayloadString,
)}${headerSeparator}${headerSeparator}`;
process.stdout.write(headers + responsePayloadString); process.stdout.write(headers + responsePayloadString);
} }
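On the Rust side, everything this script writes back is the same small envelope: `jsonrpc: "2.0"`, the request `id` when one was given, and either a `result` (for example `{ "text": ... }` for `prettier/format`, `null` for `prettier/clear_cache`) or an `error` produced by `makeError` with code -32600. A hedged serde sketch of that envelope; the crate defines its own request/response types, so these names are illustrative only:

use serde::Deserialize;

// Illustrative shape of the responses prettier_server.js writes to stdout.
#[derive(Debug, Deserialize)]
struct PrettierResponse {
    jsonrpc: String,              // always "2.0"
    id: Option<u64>,              // absent for stdin-read errors reported via makeError
    result: Option<FormatResult>, // Some for prettier/format, None for null results and errors
    error: Option<ResponseError>,
}

#[derive(Debug, Deserialize)]
struct FormatResult {
    text: String,
}

#[derive(Debug, Deserialize)]
struct ResponseError {
    code: i64, // -32600, the "invalid request" code used by makeError
    message: String,
}

// e.g. {"jsonrpc":"2.0","id":1,"result":{"text":"formatted source\n"}} deserializes
// into a PrettierResponse with `result` populated and `error` empty.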

View file

@ -54,7 +54,7 @@ use lsp_command::*;
use node_runtime::NodeRuntime; use node_runtime::NodeRuntime;
use parking_lot::Mutex; use parking_lot::Mutex;
use postage::watch; use postage::watch;
use prettier::{LocateStart, Prettier}; use prettier::Prettier;
use project_settings::{LspSettings, ProjectSettings}; use project_settings::{LspSettings, ProjectSettings};
use rand::prelude::*; use rand::prelude::*;
use search::SearchQuery; use search::SearchQuery;
@ -69,7 +69,7 @@ use std::{
hash::Hash, hash::Hash,
mem, mem,
num::NonZeroU32, num::NonZeroU32,
ops::Range, ops::{ControlFlow, Range},
path::{self, Component, Path, PathBuf}, path::{self, Component, Path, PathBuf},
process::Stdio, process::Stdio,
str, str,
@ -82,8 +82,11 @@ use std::{
use terminals::Terminals; use terminals::Terminals;
use text::Anchor; use text::Anchor;
use util::{ use util::{
debug_panic, defer, http::HttpClient, merge_json_value_into, debug_panic, defer,
paths::LOCAL_SETTINGS_RELATIVE_PATH, post_inc, ResultExt, TryFutureExt as _, http::HttpClient,
merge_json_value_into,
paths::{DEFAULT_PRETTIER_DIR, LOCAL_SETTINGS_RELATIVE_PATH},
post_inc, ResultExt, TryFutureExt as _,
}; };
pub use fs::*; pub use fs::*;
@ -162,17 +165,15 @@ pub struct Project {
copilot_log_subscription: Option<lsp::Subscription>, copilot_log_subscription: Option<lsp::Subscription>,
current_lsp_settings: HashMap<Arc<str>, LspSettings>, current_lsp_settings: HashMap<Arc<str>, LspSettings>,
node: Option<Arc<dyn NodeRuntime>>, node: Option<Arc<dyn NodeRuntime>>,
#[cfg(not(any(test, feature = "test-support")))]
default_prettier: Option<DefaultPrettier>, default_prettier: Option<DefaultPrettier>,
prettier_instances: HashMap< prettiers_per_worktree: HashMap<WorktreeId, HashSet<Option<PathBuf>>>,
(Option<WorktreeId>, PathBuf), prettier_instances: HashMap<PathBuf, Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>>,
Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>,
>,
} }
#[cfg(not(any(test, feature = "test-support")))]
struct DefaultPrettier { struct DefaultPrettier {
installation_process: Option<Shared<Task<()>>>, instance: Option<Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>>,
installation_process: Option<Shared<Task<Result<(), Arc<anyhow::Error>>>>>,
#[cfg(not(any(test, feature = "test-support")))]
installed_plugins: HashSet<&'static str>, installed_plugins: HashSet<&'static str>,
} }
@ -685,8 +686,8 @@ impl Project {
copilot_log_subscription: None, copilot_log_subscription: None,
current_lsp_settings: settings::get::<ProjectSettings>(cx).lsp.clone(), current_lsp_settings: settings::get::<ProjectSettings>(cx).lsp.clone(),
node: Some(node), node: Some(node),
#[cfg(not(any(test, feature = "test-support")))]
default_prettier: None, default_prettier: None,
prettiers_per_worktree: HashMap::default(),
prettier_instances: HashMap::default(), prettier_instances: HashMap::default(),
} }
}) })
@ -786,8 +787,8 @@ impl Project {
copilot_log_subscription: None, copilot_log_subscription: None,
current_lsp_settings: settings::get::<ProjectSettings>(cx).lsp.clone(), current_lsp_settings: settings::get::<ProjectSettings>(cx).lsp.clone(),
node: None, node: None,
#[cfg(not(any(test, feature = "test-support")))]
default_prettier: None, default_prettier: None,
prettiers_per_worktree: HashMap::default(),
prettier_instances: HashMap::default(), prettier_instances: HashMap::default(),
}; };
for worktree in worktrees { for worktree in worktrees {
@ -924,8 +925,7 @@ impl Project {
} }
for (worktree, language, settings) in language_formatters_to_check { for (worktree, language, settings) in language_formatters_to_check {
self.install_default_formatters(worktree, &language, &settings, cx) self.install_default_formatters(worktree, &language, &settings, cx);
.detach_and_log_err(cx);
} }
// Start all the newly-enabled language servers. // Start all the newly-enabled language servers.
@ -2681,20 +2681,7 @@ impl Project {
let buffer_file = File::from_dyn(buffer_file.as_ref()); let buffer_file = File::from_dyn(buffer_file.as_ref());
let worktree = buffer_file.as_ref().map(|f| f.worktree_id(cx)); let worktree = buffer_file.as_ref().map(|f| f.worktree_id(cx));
let task_buffer = buffer.clone(); self.install_default_formatters(worktree, &new_language, &settings, cx);
let prettier_installation_task =
self.install_default_formatters(worktree, &new_language, &settings, cx);
cx.spawn(|project, mut cx| async move {
prettier_installation_task.await?;
let _ = project
.update(&mut cx, |project, cx| {
project.prettier_instance_for_buffer(&task_buffer, cx)
})
.await;
anyhow::Ok(())
})
.detach_and_log_err(cx);
if let Some(file) = buffer_file { if let Some(file) = buffer_file {
let worktree = file.worktree.clone(); let worktree = file.worktree.clone();
if let Some(tree) = worktree.read(cx).as_local() { if let Some(tree) = worktree.read(cx).as_local() {
@ -4029,7 +4016,7 @@ impl Project {
} }
pub fn format( pub fn format(
&self, &mut self,
buffers: HashSet<ModelHandle<Buffer>>, buffers: HashSet<ModelHandle<Buffer>>,
push_to_history: bool, push_to_history: bool,
trigger: FormatTrigger, trigger: FormatTrigger,
@ -4049,10 +4036,10 @@ impl Project {
}) })
.collect::<Vec<_>>(); .collect::<Vec<_>>();
cx.spawn(|this, mut cx| async move { cx.spawn(|project, mut cx| async move {
// Do not allow multiple concurrent formatting requests for the // Do not allow multiple concurrent formatting requests for the
// same buffer. // same buffer.
this.update(&mut cx, |this, cx| { project.update(&mut cx, |this, cx| {
buffers_with_paths_and_servers.retain(|(buffer, _, _)| { buffers_with_paths_and_servers.retain(|(buffer, _, _)| {
this.buffers_being_formatted this.buffers_being_formatted
.insert(buffer.read(cx).remote_id()) .insert(buffer.read(cx).remote_id())
@ -4060,7 +4047,7 @@ impl Project {
}); });
let _cleanup = defer({ let _cleanup = defer({
let this = this.clone(); let this = project.clone();
let mut cx = cx.clone(); let mut cx = cx.clone();
let buffers = &buffers_with_paths_and_servers; let buffers = &buffers_with_paths_and_servers;
move || { move || {
@ -4128,7 +4115,7 @@ impl Project {
{ {
format_operation = Some(FormatOperation::Lsp( format_operation = Some(FormatOperation::Lsp(
Self::format_via_lsp( Self::format_via_lsp(
&this, &project,
&buffer, &buffer,
buffer_abs_path, buffer_abs_path,
&language_server, &language_server,
@ -4163,14 +4150,14 @@ impl Project {
} }
} }
(Formatter::Auto, FormatOnSave::On | FormatOnSave::Off) => { (Formatter::Auto, FormatOnSave::On | FormatOnSave::Off) => {
if let Some(prettier_task) = this if let Some((prettier_path, prettier_task)) = project
.update(&mut cx, |project, cx| { .update(&mut cx, |project, cx| {
project.prettier_instance_for_buffer(buffer, cx) project.prettier_instance_for_buffer(buffer, cx)
}).await { }).await {
match prettier_task.await match prettier_task.await
{ {
Ok(prettier) => { Ok(prettier) => {
let buffer_path = buffer.read_with(&cx, |buffer, cx| { let buffer_path = buffer.update(&mut cx, |buffer, cx| {
File::from_dyn(buffer.file()).map(|file| file.abs_path(cx)) File::from_dyn(buffer.file()).map(|file| file.abs_path(cx))
}); });
format_operation = Some(FormatOperation::Prettier( format_operation = Some(FormatOperation::Prettier(
@ -4180,16 +4167,35 @@ impl Project {
.context("formatting via prettier")?, .context("formatting via prettier")?,
)); ));
} }
Err(e) => anyhow::bail!( Err(e) => {
"Failed to create prettier instance for buffer during autoformatting: {e:#}" project.update(&mut cx, |project, _| {
), match &prettier_path {
Some(prettier_path) => {
project.prettier_instances.remove(prettier_path);
},
None => {
if let Some(default_prettier) = project.default_prettier.as_mut() {
default_prettier.instance = None;
}
},
}
});
match &prettier_path {
Some(prettier_path) => {
log::error!("Failed to create prettier instance from {prettier_path:?} for buffer during autoformatting: {e:#}");
},
None => {
log::error!("Failed to create default prettier instance for buffer during autoformatting: {e:#}");
},
}
}
} }
} else if let Some((language_server, buffer_abs_path)) = } else if let Some((language_server, buffer_abs_path)) =
language_server.as_ref().zip(buffer_abs_path.as_ref()) language_server.as_ref().zip(buffer_abs_path.as_ref())
{ {
format_operation = Some(FormatOperation::Lsp( format_operation = Some(FormatOperation::Lsp(
Self::format_via_lsp( Self::format_via_lsp(
&this, &project,
&buffer, &buffer,
buffer_abs_path, buffer_abs_path,
&language_server, &language_server,
@ -4202,14 +4208,14 @@ impl Project {
} }
} }
(Formatter::Prettier { .. }, FormatOnSave::On | FormatOnSave::Off) => { (Formatter::Prettier { .. }, FormatOnSave::On | FormatOnSave::Off) => {
if let Some(prettier_task) = this if let Some((prettier_path, prettier_task)) = project
.update(&mut cx, |project, cx| { .update(&mut cx, |project, cx| {
project.prettier_instance_for_buffer(buffer, cx) project.prettier_instance_for_buffer(buffer, cx)
}).await { }).await {
match prettier_task.await match prettier_task.await
{ {
Ok(prettier) => { Ok(prettier) => {
let buffer_path = buffer.read_with(&cx, |buffer, cx| { let buffer_path = buffer.update(&mut cx, |buffer, cx| {
File::from_dyn(buffer.file()).map(|file| file.abs_path(cx)) File::from_dyn(buffer.file()).map(|file| file.abs_path(cx))
}); });
format_operation = Some(FormatOperation::Prettier( format_operation = Some(FormatOperation::Prettier(
@ -4219,9 +4225,28 @@ impl Project {
.context("formatting via prettier")?, .context("formatting via prettier")?,
)); ));
} }
Err(e) => anyhow::bail!( Err(e) => {
"Failed to create prettier instance for buffer during formatting: {e:#}" project.update(&mut cx, |project, _| {
), match &prettier_path {
Some(prettier_path) => {
project.prettier_instances.remove(prettier_path);
},
None => {
if let Some(default_prettier) = project.default_prettier.as_mut() {
default_prettier.instance = None;
}
},
}
});
match &prettier_path {
Some(prettier_path) => {
log::error!("Failed to create prettier instance from {prettier_path:?} for buffer during autoformatting: {e:#}");
},
None => {
log::error!("Failed to create default prettier instance for buffer during autoformatting: {e:#}");
},
}
}
} }
} }
} }
@ -6431,15 +6456,25 @@ impl Project {
"Prettier config file {config_path:?} changed, reloading prettier instances for worktree {current_worktree_id}" "Prettier config file {config_path:?} changed, reloading prettier instances for worktree {current_worktree_id}"
); );
let prettiers_to_reload = self let prettiers_to_reload = self
.prettier_instances .prettiers_per_worktree
.get(&current_worktree_id)
.iter() .iter()
.filter_map(|((worktree_id, prettier_path), prettier_task)| { .flat_map(|prettier_paths| prettier_paths.iter())
if worktree_id.is_none() || worktree_id == &Some(current_worktree_id) { .flatten()
Some((*worktree_id, prettier_path.clone(), prettier_task.clone())) .filter_map(|prettier_path| {
} else { Some((
None current_worktree_id,
} Some(prettier_path.clone()),
self.prettier_instances.get(prettier_path)?.clone(),
))
}) })
.chain(self.default_prettier.iter().filter_map(|default_prettier| {
Some((
current_worktree_id,
None,
default_prettier.instance.clone()?,
))
}))
.collect::<Vec<_>>(); .collect::<Vec<_>>();
cx.background() cx.background()
@ -6450,9 +6485,15 @@ impl Project {
.clear_cache() .clear_cache()
.await .await
.with_context(|| { .with_context(|| {
format!( match prettier_path {
"clearing prettier {prettier_path:?} cache for worktree {worktree_id:?} on prettier settings update" Some(prettier_path) => format!(
) "clearing prettier {prettier_path:?} cache for worktree {worktree_id:?} on prettier settings update"
),
None => format!(
"clearing default prettier cache for worktree {worktree_id:?} on prettier settings update"
),
}
}) })
.map_err(Arc::new) .map_err(Arc::new)
} }
@ -8364,7 +8405,12 @@ impl Project {
&mut self, &mut self,
buffer: &ModelHandle<Buffer>, buffer: &ModelHandle<Buffer>,
cx: &mut ModelContext<Self>, cx: &mut ModelContext<Self>,
) -> Task<Option<Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>>> { ) -> Task<
Option<(
Option<PathBuf>,
Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>,
)>,
> {
let buffer = buffer.read(cx); let buffer = buffer.read(cx);
let buffer_file = buffer.file(); let buffer_file = buffer.file();
let Some(buffer_language) = buffer.language() else { let Some(buffer_language) = buffer.language() else {
@ -8374,136 +8420,122 @@ impl Project {
return Task::ready(None); return Task::ready(None);
} }
let buffer_file = File::from_dyn(buffer_file); if self.is_local() {
let buffer_path = buffer_file.map(|file| Arc::clone(file.path()));
let worktree_path = buffer_file
.as_ref()
.and_then(|file| Some(file.worktree.read(cx).abs_path()));
let worktree_id = buffer_file.map(|file| file.worktree_id(cx));
if self.is_local() || worktree_id.is_none() || worktree_path.is_none() {
let Some(node) = self.node.as_ref().map(Arc::clone) else { let Some(node) = self.node.as_ref().map(Arc::clone) else {
return Task::ready(None); return Task::ready(None);
}; };
cx.spawn(|this, mut cx| async move { match File::from_dyn(buffer_file).map(|file| (file.worktree_id(cx), file.abs_path(cx)))
let fs = this.update(&mut cx, |project, _| Arc::clone(&project.fs)); {
let prettier_dir = match cx Some((worktree_id, buffer_path)) => {
.background() let fs = Arc::clone(&self.fs);
.spawn(Prettier::locate( let installed_prettiers = self.prettier_instances.keys().cloned().collect();
worktree_path.zip(buffer_path).map( return cx.spawn(|project, mut cx| async move {
|(worktree_root_path, starting_path)| LocateStart { match cx
worktree_root_path, .background()
starting_path, .spawn(async move {
}, Prettier::locate_prettier_installation(
), fs.as_ref(),
fs, &installed_prettiers,
)) &buffer_path,
.await )
{ .await
Ok(path) => path, })
Err(e) => { .await
return Some(
Task::ready(Err(Arc::new(e.context(
"determining prettier path for worktree {worktree_path:?}",
))))
.shared(),
);
}
};
if let Some(existing_prettier) = this.update(&mut cx, |project, _| {
project
.prettier_instances
.get(&(worktree_id, prettier_dir.clone()))
.cloned()
}) {
return Some(existing_prettier);
}
log::info!("Found prettier in {prettier_dir:?}, starting.");
let task_prettier_dir = prettier_dir.clone();
let weak_project = this.downgrade();
let new_server_id =
this.update(&mut cx, |this, _| this.languages.next_language_server_id());
let new_prettier_task = cx
.spawn(|mut cx| async move {
let prettier = Prettier::start(
worktree_id.map(|id| id.to_usize()),
new_server_id,
task_prettier_dir,
node,
cx.clone(),
)
.await
.context("prettier start")
.map_err(Arc::new)?;
log::info!("Started prettier in {:?}", prettier.prettier_dir());
if let Some((project, prettier_server)) =
weak_project.upgrade(&mut cx).zip(prettier.server())
{ {
project.update(&mut cx, |project, cx| { Ok(ControlFlow::Break(())) => {
let name = if prettier.is_default() { return None;
LanguageServerName(Arc::from("prettier (default)")) }
} else { Ok(ControlFlow::Continue(None)) => {
let prettier_dir = prettier.prettier_dir(); let started_default_prettier =
let worktree_path = prettier project.update(&mut cx, |project, _| {
.worktree_id() project
.map(WorktreeId::from_usize) .prettiers_per_worktree
.and_then(|id| project.worktree_for_id(id, cx)) .entry(worktree_id)
.map(|worktree| worktree.read(cx).abs_path()); .or_default()
match worktree_path { .insert(None);
Some(worktree_path) => { project.default_prettier.as_ref().and_then(
if worktree_path.as_ref() == prettier_dir { |default_prettier| default_prettier.instance.clone(),
LanguageServerName(Arc::from(format!( )
"prettier ({})", });
prettier_dir match started_default_prettier {
.file_name() Some(old_task) => return Some((None, old_task)),
.and_then(|name| name.to_str()) None => {
.unwrap_or_default() let new_default_prettier = project
))) .update(&mut cx, |_, cx| {
} else { start_default_prettier(node, Some(worktree_id), cx)
let dir_to_display = match prettier_dir })
.strip_prefix(&worktree_path) .await;
.ok() return Some((None, new_default_prettier));
{
Some(relative_path) => relative_path,
None => prettier_dir,
};
LanguageServerName(Arc::from(format!(
"prettier ({})",
dir_to_display.display(),
)))
}
}
None => LanguageServerName(Arc::from(format!(
"prettier ({})",
prettier_dir.display(),
))),
} }
}; }
}
Ok(ControlFlow::Continue(Some(prettier_dir))) => {
project.update(&mut cx, |project, _| {
project
.prettiers_per_worktree
.entry(worktree_id)
.or_default()
.insert(Some(prettier_dir.clone()))
});
if let Some(existing_prettier) =
project.update(&mut cx, |project, _| {
project.prettier_instances.get(&prettier_dir).cloned()
})
{
log::debug!(
"Found already started prettier in {prettier_dir:?}"
);
return Some((Some(prettier_dir), existing_prettier));
}
project log::info!("Found prettier in {prettier_dir:?}, starting.");
.supplementary_language_servers let new_prettier_task = project.update(&mut cx, |project, cx| {
.insert(new_server_id, (name, Arc::clone(prettier_server))); let new_prettier_task = start_prettier(
cx.emit(Event::LanguageServerAdded(new_server_id)); node,
}); prettier_dir.clone(),
Some(worktree_id),
cx,
);
project
.prettier_instances
.insert(prettier_dir.clone(), new_prettier_task.clone());
new_prettier_task
});
Some((Some(prettier_dir), new_prettier_task))
}
Err(e) => {
return Some((
None,
Task::ready(Err(Arc::new(
e.context("determining prettier path"),
)))
.shared(),
));
}
} }
Ok(Arc::new(prettier)).map_err(Arc::new) });
}) }
.shared(); None => {
this.update(&mut cx, |project, _| { let started_default_prettier = self
project .default_prettier
.prettier_instances .as_ref()
.insert((worktree_id, prettier_dir), new_prettier_task.clone()); .and_then(|default_prettier| default_prettier.instance.clone());
}); match started_default_prettier {
Some(new_prettier_task) Some(old_task) => return Task::ready(Some((None, old_task))),
}) None => {
let new_task = start_default_prettier(node, None, cx);
return cx.spawn(|_, _| async move { Some((None, new_task.await)) });
}
}
}
}
} else if self.remote_id().is_some() { } else if self.remote_id().is_some() {
return Task::ready(None); return Task::ready(None);
} else { } else {
Task::ready(Some( Task::ready(Some((
None,
Task::ready(Err(Arc::new(anyhow!("project does not have a remote id")))).shared(), Task::ready(Err(Arc::new(anyhow!("project does not have a remote id")))).shared(),
)) )))
} }
} }
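To make the branching in the hunk above easier to follow, here is a minimal, self-contained sketch (not part of this diff) of how the new code treats the three `ControlFlow` outcomes returned by `Prettier::locate_prettier_installation`. The `describe` helper and its messages are illustrative assumptions, using only the standard library.

use std::{ops::ControlFlow, path::PathBuf};

// Illustrative helper (not part of the codebase): how the callers above interpret
// the locate result before deciding which prettier instance, if any, to start.
fn describe(outcome: ControlFlow<(), Option<PathBuf>>) -> String {
    match outcome {
        // Break: this buffer should not be formatted with prettier at all.
        ControlFlow::Break(()) => "skip prettier for this buffer".to_string(),
        // Continue(None): no project-local prettier found; fall back to the default one.
        ControlFlow::Continue(None) => "start or reuse the default prettier".to_string(),
        // Continue(Some(dir)): a project-local prettier installation was found.
        ControlFlow::Continue(Some(dir)) => format!("use prettier from {}", dir.display()),
    }
}

fn main() {
    println!("{}", describe(ControlFlow::Continue(Some(PathBuf::from("/work/app")))));
    println!("{}", describe(ControlFlow::Continue(None)));
    println!("{}", describe(ControlFlow::Break(())));
}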
@ -8514,8 +8546,7 @@ impl Project {
         _new_language: &Language,
         _language_settings: &LanguageSettings,
         _cx: &mut ModelContext<Self>,
-    ) -> Task<anyhow::Result<()>> {
-        return Task::ready(Ok(()));
+    ) {
     }

     #[cfg(not(any(test, feature = "test-support")))]
@ -8525,19 +8556,19 @@ impl Project {
         new_language: &Language,
         language_settings: &LanguageSettings,
         cx: &mut ModelContext<Self>,
-    ) -> Task<anyhow::Result<()>> {
+    ) {
         match &language_settings.formatter {
             Formatter::Prettier { .. } | Formatter::Auto => {}
-            Formatter::LanguageServer | Formatter::External { .. } => return Task::ready(Ok(())),
+            Formatter::LanguageServer | Formatter::External { .. } => return,
         };
         let Some(node) = self.node.as_ref().cloned() else {
-            return Task::ready(Ok(()));
+            return;
         };
         let mut prettier_plugins = None;
         if new_language.prettier_parser_name().is_some() {
             prettier_plugins
-                .get_or_insert_with(|| HashSet::default())
+                .get_or_insert_with(|| HashSet::<&'static str>::default())
                 .extend(
                     new_language
                         .lsp_adapters()
@ -8546,114 +8577,271 @@ impl Project {
) )
} }
let Some(prettier_plugins) = prettier_plugins else { let Some(prettier_plugins) = prettier_plugins else {
return Task::ready(Ok(())); return;
}; };
let fs = Arc::clone(&self.fs);
let locate_prettier_installation = match worktree.and_then(|worktree_id| {
self.worktree_for_id(worktree_id, cx)
.map(|worktree| worktree.read(cx).abs_path())
}) {
Some(locate_from) => {
let installed_prettiers = self.prettier_instances.keys().cloned().collect();
cx.background().spawn(async move {
Prettier::locate_prettier_installation(
fs.as_ref(),
&installed_prettiers,
locate_from.as_ref(),
)
.await
})
}
None => Task::ready(Ok(ControlFlow::Break(()))),
};
let mut plugins_to_install = prettier_plugins; let mut plugins_to_install = prettier_plugins;
let (mut install_success_tx, mut install_success_rx) =
futures::channel::mpsc::channel::<HashSet<&'static str>>(1);
let new_installation_process = cx
.spawn(|this, mut cx| async move {
if let Some(installed_plugins) = install_success_rx.next().await {
this.update(&mut cx, |this, _| {
let default_prettier =
this.default_prettier
.get_or_insert_with(|| DefaultPrettier {
installation_process: None,
installed_plugins: HashSet::default(),
});
if !installed_plugins.is_empty() {
log::info!("Installed new prettier plugins: {installed_plugins:?}");
default_prettier.installed_plugins.extend(installed_plugins);
}
})
}
})
.shared();
let previous_installation_process = let previous_installation_process =
if let Some(default_prettier) = &mut self.default_prettier { if let Some(default_prettier) = &mut self.default_prettier {
plugins_to_install plugins_to_install
.retain(|plugin| !default_prettier.installed_plugins.contains(plugin)); .retain(|plugin| !default_prettier.installed_plugins.contains(plugin));
if plugins_to_install.is_empty() { if plugins_to_install.is_empty() {
return Task::ready(Ok(())); return;
} }
std::mem::replace( default_prettier.installation_process.clone()
&mut default_prettier.installation_process,
Some(new_installation_process.clone()),
)
} else { } else {
None None
}; };
let default_prettier_dir = util::paths::DEFAULT_PRETTIER_DIR.as_path();
let already_running_prettier = self
.prettier_instances
.get(&(worktree, default_prettier_dir.to_path_buf()))
.cloned();
let fs = Arc::clone(&self.fs); let fs = Arc::clone(&self.fs);
cx.spawn(|this, mut cx| async move { let default_prettier = self
if let Some(previous_installation_process) = previous_installation_process { .default_prettier
previous_installation_process.await; .get_or_insert_with(|| DefaultPrettier {
} instance: None,
let mut everything_was_installed = false; installation_process: None,
this.update(&mut cx, |this, _| { installed_plugins: HashSet::default(),
match &mut this.default_prettier {
Some(default_prettier) => {
plugins_to_install
.retain(|plugin| !default_prettier.installed_plugins.contains(plugin));
everything_was_installed = plugins_to_install.is_empty();
},
None => this.default_prettier = Some(DefaultPrettier { installation_process: Some(new_installation_process), installed_plugins: HashSet::default() }),
}
}); });
if everything_was_installed { default_prettier.installation_process = Some(
return Ok(()); cx.spawn(|this, mut cx| async move {
} match locate_prettier_installation
cx.background()
.spawn(async move {
let prettier_wrapper_path = default_prettier_dir.join(prettier::PRETTIER_SERVER_FILE);
// method creates parent directory if it doesn't exist
fs.save(&prettier_wrapper_path, &text::Rope::from(prettier::PRETTIER_SERVER_JS), text::LineEnding::Unix).await
.with_context(|| format!("writing {} file at {prettier_wrapper_path:?}", prettier::PRETTIER_SERVER_FILE))?;
let packages_to_versions = future::try_join_all(
plugins_to_install
.iter()
.chain(Some(&"prettier"))
.map(|package_name| async {
let returned_package_name = package_name.to_string();
let latest_version = node.npm_package_latest_version(package_name)
.await
.with_context(|| {
format!("fetching latest npm version for package {returned_package_name}")
})?;
anyhow::Ok((returned_package_name, latest_version))
}),
)
.await .await
.context("fetching latest npm versions")?; .context("locate prettier installation")
.map_err(Arc::new)?
log::info!("Fetching default prettier and plugins: {packages_to_versions:?}"); {
let borrowed_packages = packages_to_versions.iter().map(|(package, version)| { ControlFlow::Break(()) => return Ok(()),
(package.as_str(), version.as_str()) ControlFlow::Continue(Some(_non_default_prettier)) => return Ok(()),
}).collect::<Vec<_>>(); ControlFlow::Continue(None) => {
node.npm_install_packages(default_prettier_dir, &borrowed_packages).await.context("fetching formatter packages")?; let mut needs_install = match previous_installation_process {
let installed_packages = !plugins_to_install.is_empty(); Some(previous_installation_process) => {
install_success_tx.try_send(plugins_to_install).ok(); previous_installation_process.await.is_err()
}
if !installed_packages { None => true,
if let Some(prettier) = already_running_prettier { };
prettier.await.map_err(|e| anyhow::anyhow!("Default prettier startup await failure: {e:#}"))?.clear_cache().await.context("clearing default prettier cache after plugins install")?; this.update(&mut cx, |this, _| {
if let Some(default_prettier) = &mut this.default_prettier {
plugins_to_install.retain(|plugin| {
!default_prettier.installed_plugins.contains(plugin)
});
needs_install |= !plugins_to_install.is_empty();
}
});
if needs_install {
let installed_plugins = plugins_to_install.clone();
cx.background()
.spawn(async move {
install_default_prettier(plugins_to_install, node, fs).await
})
.await
.context("prettier & plugins install")
.map_err(Arc::new)?;
this.update(&mut cx, |this, _| {
let default_prettier =
this.default_prettier
.get_or_insert_with(|| DefaultPrettier {
instance: None,
installation_process: Some(
Task::ready(Ok(())).shared(),
),
installed_plugins: HashSet::default(),
});
default_prettier.instance = None;
default_prettier.installed_plugins.extend(installed_plugins);
});
} }
} }
}
anyhow::Ok(()) Ok(())
}).await })
}) .shared(),
);
} }
} }
fn start_default_prettier(
node: Arc<dyn NodeRuntime>,
worktree_id: Option<WorktreeId>,
cx: &mut ModelContext<'_, Project>,
) -> Task<Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>> {
cx.spawn(|project, mut cx| async move {
loop {
let default_prettier_installing = project.update(&mut cx, |project, _| {
project
.default_prettier
.as_ref()
.and_then(|default_prettier| default_prettier.installation_process.clone())
});
match default_prettier_installing {
Some(installation_task) => {
if installation_task.await.is_ok() {
break;
}
}
None => break,
}
}
project.update(&mut cx, |project, cx| {
match project
.default_prettier
.as_mut()
.and_then(|default_prettier| default_prettier.instance.as_mut())
{
Some(default_prettier) => default_prettier.clone(),
None => {
let new_default_prettier =
start_prettier(node, DEFAULT_PRETTIER_DIR.clone(), worktree_id, cx);
project
.default_prettier
.get_or_insert_with(|| DefaultPrettier {
instance: None,
installation_process: None,
#[cfg(not(any(test, feature = "test-support")))]
installed_plugins: HashSet::default(),
})
.instance = Some(new_default_prettier.clone());
new_default_prettier
}
}
})
})
}
fn start_prettier(
node: Arc<dyn NodeRuntime>,
prettier_dir: PathBuf,
worktree_id: Option<WorktreeId>,
cx: &mut ModelContext<'_, Project>,
) -> Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>> {
cx.spawn(|project, mut cx| async move {
let new_server_id = project.update(&mut cx, |project, _| {
project.languages.next_language_server_id()
});
let new_prettier = Prettier::start(new_server_id, prettier_dir, node, cx.clone())
.await
.context("default prettier spawn")
.map(Arc::new)
.map_err(Arc::new)?;
register_new_prettier(&project, &new_prettier, worktree_id, new_server_id, &mut cx);
Ok(new_prettier)
})
.shared()
}
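The `Shared<Task<...>>` values returned by `start_prettier` and cached in `prettier_instances` let concurrent callers await a single startup per prettier directory instead of spawning duplicates. Below is a minimal sketch of that memoization pattern, not Zed's actual API: it substitutes the `futures` and `anyhow` crates for gpui's `Task`, and the `Instances` type and its string payload are hypothetical.

use std::{collections::HashMap, path::PathBuf, sync::Arc};

use futures::{
    future::{BoxFuture, Shared},
    FutureExt,
};

// One shared, cloneable startup future per prettier directory.
type SharedStart = Shared<BoxFuture<'static, Result<Arc<String>, Arc<anyhow::Error>>>>;

struct Instances {
    by_dir: HashMap<PathBuf, SharedStart>,
}

impl Instances {
    fn get_or_start(&mut self, dir: PathBuf) -> SharedStart {
        self.by_dir
            .entry(dir.clone())
            .or_insert_with(|| {
                async move {
                    // Hypothetical startup work standing in for `Prettier::start`.
                    Ok(Arc::new(format!("prettier in {dir:?}")))
                }
                .boxed()
                .shared()
            })
            .clone()
    }
}

fn main() {
    let mut instances = Instances { by_dir: HashMap::new() };
    let a = instances.get_or_start(PathBuf::from("/work/app"));
    let b = instances.get_or_start(PathBuf::from("/work/app"));
    // Both handles resolve to the same shared startup result.
    let (a, b) = futures::executor::block_on(async { (a.await, b.await) });
    assert_eq!(a.unwrap(), b.unwrap());
}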
fn register_new_prettier(
project: &ModelHandle<Project>,
prettier: &Prettier,
worktree_id: Option<WorktreeId>,
new_server_id: LanguageServerId,
cx: &mut AsyncAppContext,
) {
let prettier_dir = prettier.prettier_dir();
let is_default = prettier.is_default();
if is_default {
log::info!("Started default prettier in {prettier_dir:?}");
} else {
log::info!("Started prettier in {prettier_dir:?}");
}
if let Some(prettier_server) = prettier.server() {
project.update(cx, |project, cx| {
let name = if is_default {
LanguageServerName(Arc::from("prettier (default)"))
} else {
let worktree_path = worktree_id
.and_then(|id| project.worktree_for_id(id, cx))
.map(|worktree| worktree.update(cx, |worktree, _| worktree.abs_path()));
let name = match worktree_path {
Some(worktree_path) => {
if prettier_dir == worktree_path.as_ref() {
let name = prettier_dir
.file_name()
.and_then(|name| name.to_str())
.unwrap_or_default();
format!("prettier ({name})")
} else {
let dir_to_display = prettier_dir
.strip_prefix(worktree_path.as_ref())
.ok()
.unwrap_or(prettier_dir);
format!("prettier ({})", dir_to_display.display())
}
}
None => format!("prettier ({})", prettier_dir.display()),
};
LanguageServerName(Arc::from(name))
};
project
.supplementary_language_servers
.insert(new_server_id, (name, Arc::clone(prettier_server)));
cx.emit(Event::LanguageServerAdded(new_server_id));
});
}
}
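For reference, a standalone sketch of the display-name rule that `register_new_prettier` applies above: the default prettier gets a fixed label, a prettier at the worktree root is labeled by the root directory's name, and a nested one by its path relative to the worktree root. The `prettier_server_name` helper and the paths in `main` are hypothetical, for illustration only.

use std::path::Path;

fn prettier_server_name(
    prettier_dir: &Path,
    worktree_path: Option<&Path>,
    is_default: bool,
) -> String {
    if is_default {
        return "prettier (default)".to_string();
    }
    match worktree_path {
        // Prettier found directly at the worktree root: label by the root's name.
        Some(worktree_path) if prettier_dir == worktree_path => format!(
            "prettier ({})",
            prettier_dir
                .file_name()
                .and_then(|name| name.to_str())
                .unwrap_or_default()
        ),
        // Prettier nested inside the worktree: label by the relative directory.
        Some(worktree_path) => {
            let dir_to_display = prettier_dir.strip_prefix(worktree_path).unwrap_or(prettier_dir);
            format!("prettier ({})", dir_to_display.display())
        }
        // No worktree to relate to: fall back to the absolute directory.
        None => format!("prettier ({})", prettier_dir.display()),
    }
}

fn main() {
    // Hypothetical paths, for illustration only.
    assert_eq!(
        prettier_server_name(Path::new("/work/app/packages/web"), Some(Path::new("/work/app")), false),
        "prettier (packages/web)"
    );
    assert_eq!(
        prettier_server_name(Path::new("/work/app"), Some(Path::new("/work/app")), false),
        "prettier (app)"
    );
}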
#[cfg(not(any(test, feature = "test-support")))]
async fn install_default_prettier(
plugins_to_install: HashSet<&'static str>,
node: Arc<dyn NodeRuntime>,
fs: Arc<dyn Fs>,
) -> anyhow::Result<()> {
let prettier_wrapper_path = DEFAULT_PRETTIER_DIR.join(prettier::PRETTIER_SERVER_FILE);
// method creates parent directory if it doesn't exist
fs.save(
&prettier_wrapper_path,
&text::Rope::from(prettier::PRETTIER_SERVER_JS),
text::LineEnding::Unix,
)
.await
.with_context(|| {
format!(
"writing {} file at {prettier_wrapper_path:?}",
prettier::PRETTIER_SERVER_FILE
)
})?;
let packages_to_versions =
future::try_join_all(plugins_to_install.iter().chain(Some(&"prettier")).map(
|package_name| async {
let returned_package_name = package_name.to_string();
let latest_version = node
.npm_package_latest_version(package_name)
.await
.with_context(|| {
format!("fetching latest npm version for package {returned_package_name}")
})?;
anyhow::Ok((returned_package_name, latest_version))
},
))
.await
.context("fetching latest npm versions")?;
log::info!("Fetching default prettier and plugins: {packages_to_versions:?}");
let borrowed_packages = packages_to_versions
.iter()
.map(|(package, version)| (package.as_str(), version.as_str()))
.collect::<Vec<_>>();
node.npm_install_packages(DEFAULT_PRETTIER_DIR.as_path(), &borrowed_packages)
.await
.context("fetching formatter packages")?;
anyhow::Ok(())
}
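As a quick illustration of the data `npm_install_packages` receives from `install_default_prettier` above, here is a runnable sketch; the plugin name and version numbers are hypothetical, since real values come from `npm_package_latest_version` at runtime.

fn main() {
    // Hypothetical resolved versions; the shape mirrors `packages_to_versions` above.
    let packages_to_versions = vec![
        ("prettier".to_string(), "3.0.0".to_string()),
        ("prettier-plugin-tailwindcss".to_string(), "0.5.0".to_string()),
    ];
    // Borrowed (&str, &str) pairs are what the npm install call consumes.
    let borrowed_packages = packages_to_versions
        .iter()
        .map(|(package, version)| (package.as_str(), version.as_str()))
        .collect::<Vec<_>>();
    assert_eq!(borrowed_packages[0], ("prettier", "3.0.0"));
    println!("{borrowed_packages:?}");
}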
fn subscribe_for_copilot_events(
    copilot: &ModelHandle<Copilot>,
    cx: &mut ModelContext<'_, Project>,

View file

@ -1,4 +1,4 @@
-use crate::{search::PathMatcher, worktree::WorktreeModelHandle, Event, *};
+use crate::{worktree::WorktreeModelHandle, Event, *};
 use fs::{FakeFs, RealFs};
 use futures::{future, StreamExt};
 use gpui::{executor::Deterministic, test::subscribe, AppContext};
@ -13,7 +13,7 @@ use pretty_assertions::assert_eq;
 use serde_json::json;
 use std::{cell::RefCell, os::unix, rc::Rc, task::Poll};
 use unindent::Unindent as _;
-use util::{assert_set_eq, test::temp_tree};
+use util::{assert_set_eq, paths::PathMatcher, test::temp_tree};

 #[cfg(test)]
 #[ctor::ctor]

View file

@ -1,7 +1,6 @@
 use aho_corasick::{AhoCorasick, AhoCorasickBuilder};
 use anyhow::{Context, Result};
 use client::proto;
-use globset::{Glob, GlobMatcher};
 use itertools::Itertools;
 use language::{char_kind, BufferSnapshot};
 use regex::{Regex, RegexBuilder};
@ -10,9 +9,10 @@ use std::{
     borrow::Cow,
     io::{BufRead, BufReader, Read},
     ops::Range,
-    path::{Path, PathBuf},
+    path::Path,
     sync::Arc,
 };
+use util::paths::PathMatcher;

 #[derive(Clone, Debug)]
 pub struct SearchInputs {
@ -52,31 +52,6 @@ pub enum SearchQuery {
}, },
} }
#[derive(Clone, Debug)]
pub struct PathMatcher {
maybe_path: PathBuf,
glob: GlobMatcher,
}
impl std::fmt::Display for PathMatcher {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
self.maybe_path.to_string_lossy().fmt(f)
}
}
impl PathMatcher {
pub fn new(maybe_glob: &str) -> Result<Self, globset::Error> {
Ok(PathMatcher {
glob: Glob::new(&maybe_glob)?.compile_matcher(),
maybe_path: PathBuf::from(maybe_glob),
})
}
pub fn is_match<P: AsRef<Path>>(&self, other: P) -> bool {
other.as_ref().starts_with(&self.maybe_path) || self.glob.is_match(other)
}
}
impl SearchQuery { impl SearchQuery {
pub fn text( pub fn text(
query: impl ToString, query: impl ToString,

View file

@ -54,7 +54,7 @@ use lsp_command::*;
use node_runtime::NodeRuntime; use node_runtime::NodeRuntime;
use parking_lot::Mutex; use parking_lot::Mutex;
use postage::watch; use postage::watch;
use prettier2::{LocateStart, Prettier}; use prettier2::Prettier;
use project_settings::{LspSettings, ProjectSettings}; use project_settings::{LspSettings, ProjectSettings};
use rand::prelude::*; use rand::prelude::*;
use search::SearchQuery; use search::SearchQuery;
@ -69,7 +69,7 @@ use std::{
hash::Hash, hash::Hash,
mem, mem,
num::NonZeroU32, num::NonZeroU32,
ops::Range, ops::{ControlFlow, Range},
path::{self, Component, Path, PathBuf}, path::{self, Component, Path, PathBuf},
process::Stdio, process::Stdio,
str, str,
@ -82,8 +82,11 @@ use std::{
use terminals::Terminals; use terminals::Terminals;
use text::Anchor; use text::Anchor;
use util::{ use util::{
debug_panic, defer, http::HttpClient, merge_json_value_into, debug_panic, defer,
paths::LOCAL_SETTINGS_RELATIVE_PATH, post_inc, ResultExt, TryFutureExt as _, http::HttpClient,
merge_json_value_into,
paths::{DEFAULT_PRETTIER_DIR, LOCAL_SETTINGS_RELATIVE_PATH},
post_inc, ResultExt, TryFutureExt as _,
}; };
pub use fs2::*; pub use fs2::*;
@ -162,17 +165,15 @@ pub struct Project {
     copilot_log_subscription: Option<lsp2::Subscription>,
     current_lsp_settings: HashMap<Arc<str>, LspSettings>,
     node: Option<Arc<dyn NodeRuntime>>,
-    #[cfg(not(any(test, feature = "test-support")))]
     default_prettier: Option<DefaultPrettier>,
-    prettier_instances: HashMap<
-        (Option<WorktreeId>, PathBuf),
-        Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>,
-    >,
+    prettiers_per_worktree: HashMap<WorktreeId, HashSet<Option<PathBuf>>>,
+    prettier_instances: HashMap<PathBuf, Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>>,
 }

-#[cfg(not(any(test, feature = "test-support")))]
 struct DefaultPrettier {
-    installation_process: Option<Shared<Task<()>>>,
+    instance: Option<Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>>,
+    installation_process: Option<Shared<Task<Result<(), Arc<anyhow::Error>>>>>,
+    #[cfg(not(any(test, feature = "test-support")))]
     installed_plugins: HashSet<&'static str>,
 }
@ -686,8 +687,8 @@ impl Project {
copilot_log_subscription: None, copilot_log_subscription: None,
current_lsp_settings: ProjectSettings::get_global(cx).lsp.clone(), current_lsp_settings: ProjectSettings::get_global(cx).lsp.clone(),
node: Some(node), node: Some(node),
#[cfg(not(any(test, feature = "test-support")))]
default_prettier: None, default_prettier: None,
prettiers_per_worktree: HashMap::default(),
prettier_instances: HashMap::default(), prettier_instances: HashMap::default(),
} }
}) })
@ -789,8 +790,8 @@ impl Project {
copilot_log_subscription: None, copilot_log_subscription: None,
current_lsp_settings: ProjectSettings::get_global(cx).lsp.clone(), current_lsp_settings: ProjectSettings::get_global(cx).lsp.clone(),
node: None, node: None,
#[cfg(not(any(test, feature = "test-support")))]
default_prettier: None, default_prettier: None,
prettiers_per_worktree: HashMap::default(),
prettier_instances: HashMap::default(), prettier_instances: HashMap::default(),
}; };
for worktree in worktrees { for worktree in worktrees {
@ -963,8 +964,7 @@ impl Project {
} }
for (worktree, language, settings) in language_formatters_to_check { for (worktree, language, settings) in language_formatters_to_check {
self.install_default_formatters(worktree, &language, &settings, cx) self.install_default_formatters(worktree, &language, &settings, cx);
.detach_and_log_err(cx);
} }
// Start all the newly-enabled language servers. // Start all the newly-enabled language servers.
@ -2722,20 +2722,7 @@ impl Project {
let buffer_file = File::from_dyn(buffer_file.as_ref()); let buffer_file = File::from_dyn(buffer_file.as_ref());
let worktree = buffer_file.as_ref().map(|f| f.worktree_id(cx)); let worktree = buffer_file.as_ref().map(|f| f.worktree_id(cx));
let task_buffer = buffer.clone(); self.install_default_formatters(worktree, &new_language, &settings, cx);
let prettier_installation_task =
self.install_default_formatters(worktree, &new_language, &settings, cx);
cx.spawn(move |project, mut cx| async move {
prettier_installation_task.await?;
let _ = project
.update(&mut cx, |project, cx| {
project.prettier_instance_for_buffer(&task_buffer, cx)
})?
.await;
anyhow::Ok(())
})
.detach_and_log_err(cx);
if let Some(file) = buffer_file { if let Some(file) = buffer_file {
let worktree = file.worktree.clone(); let worktree = file.worktree.clone();
if let Some(tree) = worktree.read(cx).as_local() { if let Some(tree) = worktree.read(cx).as_local() {
@ -4098,7 +4085,7 @@ impl Project {
} }
pub fn format( pub fn format(
&self, &mut self,
buffers: HashSet<Model<Buffer>>, buffers: HashSet<Model<Buffer>>,
push_to_history: bool, push_to_history: bool,
trigger: FormatTrigger, trigger: FormatTrigger,
@ -4118,10 +4105,10 @@ impl Project {
}) })
.collect::<Vec<_>>(); .collect::<Vec<_>>();
cx.spawn(move |this, mut cx| async move { cx.spawn(move |project, mut cx| async move {
// Do not allow multiple concurrent formatting requests for the // Do not allow multiple concurrent formatting requests for the
// same buffer. // same buffer.
this.update(&mut cx, |this, cx| { project.update(&mut cx, |this, cx| {
buffers_with_paths_and_servers.retain(|(buffer, _, _)| { buffers_with_paths_and_servers.retain(|(buffer, _, _)| {
this.buffers_being_formatted this.buffers_being_formatted
.insert(buffer.read(cx).remote_id()) .insert(buffer.read(cx).remote_id())
@ -4129,7 +4116,7 @@ impl Project {
})?; })?;
let _cleanup = defer({ let _cleanup = defer({
let this = this.clone(); let this = project.clone();
let mut cx = cx.clone(); let mut cx = cx.clone();
let buffers = &buffers_with_paths_and_servers; let buffers = &buffers_with_paths_and_servers;
move || { move || {
@ -4197,7 +4184,7 @@ impl Project {
{ {
format_operation = Some(FormatOperation::Lsp( format_operation = Some(FormatOperation::Lsp(
Self::format_via_lsp( Self::format_via_lsp(
&this, &project,
&buffer, &buffer,
buffer_abs_path, buffer_abs_path,
&language_server, &language_server,
@ -4232,7 +4219,7 @@ impl Project {
} }
} }
(Formatter::Auto, FormatOnSave::On | FormatOnSave::Off) => { (Formatter::Auto, FormatOnSave::On | FormatOnSave::Off) => {
if let Some(prettier_task) = this if let Some((prettier_path, prettier_task)) = project
.update(&mut cx, |project, cx| { .update(&mut cx, |project, cx| {
project.prettier_instance_for_buffer(buffer, cx) project.prettier_instance_for_buffer(buffer, cx)
})?.await { })?.await {
@ -4249,16 +4236,35 @@ impl Project {
.context("formatting via prettier")?, .context("formatting via prettier")?,
)); ));
} }
Err(e) => anyhow::bail!( Err(e) => {
"Failed to create prettier instance for buffer during autoformatting: {e:#}" project.update(&mut cx, |project, _| {
), match &prettier_path {
Some(prettier_path) => {
project.prettier_instances.remove(prettier_path);
},
None => {
if let Some(default_prettier) = project.default_prettier.as_mut() {
default_prettier.instance = None;
}
},
}
})?;
match &prettier_path {
Some(prettier_path) => {
log::error!("Failed to create prettier instance from {prettier_path:?} for buffer during autoformatting: {e:#}");
},
None => {
log::error!("Failed to create default prettier instance for buffer during autoformatting: {e:#}");
},
}
}
} }
} else if let Some((language_server, buffer_abs_path)) = } else if let Some((language_server, buffer_abs_path)) =
language_server.as_ref().zip(buffer_abs_path.as_ref()) language_server.as_ref().zip(buffer_abs_path.as_ref())
{ {
format_operation = Some(FormatOperation::Lsp( format_operation = Some(FormatOperation::Lsp(
Self::format_via_lsp( Self::format_via_lsp(
&this, &project,
&buffer, &buffer,
buffer_abs_path, buffer_abs_path,
&language_server, &language_server,
@ -4271,7 +4277,7 @@ impl Project {
} }
} }
(Formatter::Prettier { .. }, FormatOnSave::On | FormatOnSave::Off) => { (Formatter::Prettier { .. }, FormatOnSave::On | FormatOnSave::Off) => {
if let Some(prettier_task) = this if let Some((prettier_path, prettier_task)) = project
.update(&mut cx, |project, cx| { .update(&mut cx, |project, cx| {
project.prettier_instance_for_buffer(buffer, cx) project.prettier_instance_for_buffer(buffer, cx)
})?.await { })?.await {
@ -4288,9 +4294,28 @@ impl Project {
.context("formatting via prettier")?, .context("formatting via prettier")?,
)); ));
} }
Err(e) => anyhow::bail!( Err(e) => {
"Failed to create prettier instance for buffer during formatting: {e:#}" project.update(&mut cx, |project, _| {
), match &prettier_path {
Some(prettier_path) => {
project.prettier_instances.remove(prettier_path);
},
None => {
if let Some(default_prettier) = project.default_prettier.as_mut() {
default_prettier.instance = None;
}
},
}
})?;
match &prettier_path {
Some(prettier_path) => {
log::error!("Failed to create prettier instance from {prettier_path:?} for buffer during autoformatting: {e:#}");
},
None => {
log::error!("Failed to create default prettier instance for buffer during autoformatting: {e:#}");
},
}
}
} }
} }
} }
@ -6508,15 +6533,25 @@ impl Project {
"Prettier config file {config_path:?} changed, reloading prettier instances for worktree {current_worktree_id}" "Prettier config file {config_path:?} changed, reloading prettier instances for worktree {current_worktree_id}"
); );
let prettiers_to_reload = self let prettiers_to_reload = self
.prettier_instances .prettiers_per_worktree
.get(&current_worktree_id)
.iter() .iter()
.filter_map(|((worktree_id, prettier_path), prettier_task)| { .flat_map(|prettier_paths| prettier_paths.iter())
if worktree_id.is_none() || worktree_id == &Some(current_worktree_id) { .flatten()
Some((*worktree_id, prettier_path.clone(), prettier_task.clone())) .filter_map(|prettier_path| {
} else { Some((
None current_worktree_id,
} Some(prettier_path.clone()),
self.prettier_instances.get(prettier_path)?.clone(),
))
}) })
.chain(self.default_prettier.iter().filter_map(|default_prettier| {
Some((
current_worktree_id,
None,
default_prettier.instance.clone()?,
))
}))
.collect::<Vec<_>>(); .collect::<Vec<_>>();
cx.executor() cx.executor()
@ -6527,9 +6562,14 @@ impl Project {
.clear_cache() .clear_cache()
.await .await
.with_context(|| { .with_context(|| {
format!( match prettier_path {
"clearing prettier {prettier_path:?} cache for worktree {worktree_id:?} on prettier settings update" Some(prettier_path) => format!(
) "clearing prettier {prettier_path:?} cache for worktree {worktree_id:?} on prettier settings update"
),
None => format!(
"clearing default prettier cache for worktree {worktree_id:?} on prettier settings update"
),
}
}) })
.map_err(Arc::new) .map_err(Arc::new)
} }
@ -8415,7 +8455,12 @@ impl Project {
&mut self, &mut self,
buffer: &Model<Buffer>, buffer: &Model<Buffer>,
cx: &mut ModelContext<Self>, cx: &mut ModelContext<Self>,
) -> Task<Option<Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>>> { ) -> Task<
Option<(
Option<PathBuf>,
Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>,
)>,
> {
let buffer = buffer.read(cx); let buffer = buffer.read(cx);
let buffer_file = buffer.file(); let buffer_file = buffer.file();
let Some(buffer_language) = buffer.language() else { let Some(buffer_language) = buffer.language() else {
@ -8425,142 +8470,145 @@ impl Project {
return Task::ready(None); return Task::ready(None);
} }
let buffer_file = File::from_dyn(buffer_file); if self.is_local() {
let buffer_path = buffer_file.map(|file| Arc::clone(file.path()));
let worktree_path = buffer_file
.as_ref()
.and_then(|file| Some(file.worktree.read(cx).abs_path()));
let worktree_id = buffer_file.map(|file| file.worktree_id(cx));
if self.is_local() || worktree_id.is_none() || worktree_path.is_none() {
let Some(node) = self.node.as_ref().map(Arc::clone) else { let Some(node) = self.node.as_ref().map(Arc::clone) else {
return Task::ready(None); return Task::ready(None);
}; };
let fs = self.fs.clone(); match File::from_dyn(buffer_file).map(|file| (file.worktree_id(cx), file.abs_path(cx)))
cx.spawn(move |this, mut cx| async move { {
let prettier_dir = match cx Some((worktree_id, buffer_path)) => {
.executor() let fs = Arc::clone(&self.fs);
.spawn(Prettier::locate( let installed_prettiers = self.prettier_instances.keys().cloned().collect();
worktree_path.zip(buffer_path).map( return cx.spawn(move |project, mut cx| async move {
|(worktree_root_path, starting_path)| LocateStart { match cx
worktree_root_path, .executor()
starting_path, .spawn(async move {
}, Prettier::locate_prettier_installation(
), fs.as_ref(),
fs, &installed_prettiers,
)) &buffer_path,
.await )
{ .await
Ok(path) => path, })
Err(e) => {
return Some(
Task::ready(Err(Arc::new(e.context(
"determining prettier path for worktree {worktree_path:?}",
))))
.shared(),
);
}
};
if let Some(existing_prettier) = this
.update(&mut cx, |project, _| {
project
.prettier_instances
.get(&(worktree_id, prettier_dir.clone()))
.cloned()
})
.ok()
.flatten()
{
return Some(existing_prettier);
}
log::info!("Found prettier in {prettier_dir:?}, starting.");
let task_prettier_dir = prettier_dir.clone();
let new_prettier_task = cx
.spawn({
let this = this.clone();
move |mut cx| async move {
let new_server_id = this.update(&mut cx, |this, _| {
this.languages.next_language_server_id()
})?;
let prettier = Prettier::start(
worktree_id.map(|id| id.to_usize()),
new_server_id,
task_prettier_dir,
node,
cx.clone(),
)
.await .await
.context("prettier start") {
.map_err(Arc::new)?; Ok(ControlFlow::Break(())) => {
log::info!("Started prettier in {:?}", prettier.prettier_dir()); return None;
}
if let Some(prettier_server) = prettier.server() { Ok(ControlFlow::Continue(None)) => {
this.update(&mut cx, |project, cx| { match project.update(&mut cx, |project, _| {
let name = if prettier.is_default() { project
LanguageServerName(Arc::from("prettier (default)")) .prettiers_per_worktree
} else { .entry(worktree_id)
let prettier_dir = prettier.prettier_dir(); .or_default()
let worktree_path = prettier .insert(None);
.worktree_id() project.default_prettier.as_ref().and_then(
.map(WorktreeId::from_usize) |default_prettier| default_prettier.instance.clone(),
.and_then(|id| project.worktree_for_id(id, cx)) )
.map(|worktree| worktree.read(cx).abs_path()); }) {
match worktree_path { Ok(Some(old_task)) => Some((None, old_task)),
Some(worktree_path) => { Ok(None) => {
if worktree_path.as_ref() == prettier_dir { match project.update(&mut cx, |_, cx| {
LanguageServerName(Arc::from(format!( start_default_prettier(node, Some(worktree_id), cx)
"prettier ({})", }) {
prettier_dir Ok(new_default_prettier) => {
.file_name() return Some((None, new_default_prettier.await))
.and_then(|name| name.to_str()) }
.unwrap_or_default() Err(e) => {
))) Some((
} else { None,
let dir_to_display = match prettier_dir Task::ready(Err(Arc::new(e.context("project is gone during default prettier startup"))))
.strip_prefix(&worktree_path) .shared(),
.ok() ))
{ }
Some(relative_path) => relative_path, }
None => prettier_dir, }
}; Err(e) => Some((None, Task::ready(Err(Arc::new(e.context("project is gone during default prettier checks"))))
LanguageServerName(Arc::from(format!( .shared())),
"prettier ({})", }
dir_to_display.display(), }
))) Ok(ControlFlow::Continue(Some(prettier_dir))) => {
} match project.update(&mut cx, |project, _| {
} project
None => LanguageServerName(Arc::from(format!( .prettiers_per_worktree
"prettier ({})", .entry(worktree_id)
prettier_dir.display(), .or_default()
))), .insert(Some(prettier_dir.clone()));
} project.prettier_instances.get(&prettier_dir).cloned()
}; }) {
Ok(Some(existing_prettier)) => {
project log::debug!(
.supplementary_language_servers "Found already started prettier in {prettier_dir:?}"
.insert(new_server_id, (name, Arc::clone(prettier_server))); );
cx.emit(Event::LanguageServerAdded(new_server_id)); return Some((Some(prettier_dir), existing_prettier));
})?; }
Err(e) => {
return Some((
Some(prettier_dir),
Task::ready(Err(Arc::new(e.context("project is gone during custom prettier checks"))))
.shared(),
))
}
_ => {},
}
log::info!("Found prettier in {prettier_dir:?}, starting.");
let new_prettier_task =
match project.update(&mut cx, |project, cx| {
let new_prettier_task = start_prettier(
node,
prettier_dir.clone(),
Some(worktree_id),
cx,
);
project.prettier_instances.insert(
prettier_dir.clone(),
new_prettier_task.clone(),
);
new_prettier_task
}) {
Ok(task) => task,
Err(e) => return Some((
Some(prettier_dir),
Task::ready(Err(Arc::new(e.context("project is gone during custom prettier startup"))))
.shared()
)),
};
Some((Some(prettier_dir), new_prettier_task))
}
Err(e) => {
return Some((
None,
Task::ready(Err(Arc::new(
e.context("determining prettier path"),
)))
.shared(),
));
} }
Ok(Arc::new(prettier)).map_err(Arc::new)
} }
}) });
.shared(); }
this.update(&mut cx, |project, _| { None => {
project let started_default_prettier = self
.prettier_instances .default_prettier
.insert((worktree_id, prettier_dir), new_prettier_task.clone()); .as_ref()
}) .and_then(|default_prettier| default_prettier.instance.clone());
.ok(); match started_default_prettier {
Some(new_prettier_task) Some(old_task) => return Task::ready(Some((None, old_task))),
}) None => {
let new_task = start_default_prettier(node, None, cx);
return cx.spawn(|_, _| async move { Some((None, new_task.await)) });
}
}
}
}
} else if self.remote_id().is_some() { } else if self.remote_id().is_some() {
return Task::ready(None); return Task::ready(None);
} else { } else {
Task::ready(Some( Task::ready(Some((
None,
Task::ready(Err(Arc::new(anyhow!("project does not have a remote id")))).shared(), Task::ready(Err(Arc::new(anyhow!("project does not have a remote id")))).shared(),
)) )))
} }
} }
@ -8571,8 +8619,7 @@ impl Project {
_: &Language, _: &Language,
_: &LanguageSettings, _: &LanguageSettings,
_: &mut ModelContext<Self>, _: &mut ModelContext<Self>,
) -> Task<anyhow::Result<()>> { ) {
Task::ready(Ok(()))
} }
#[cfg(not(any(test, feature = "test-support")))] #[cfg(not(any(test, feature = "test-support")))]
@ -8582,19 +8629,19 @@ impl Project {
new_language: &Language, new_language: &Language,
language_settings: &LanguageSettings, language_settings: &LanguageSettings,
cx: &mut ModelContext<Self>, cx: &mut ModelContext<Self>,
) -> Task<anyhow::Result<()>> { ) {
match &language_settings.formatter { match &language_settings.formatter {
Formatter::Prettier { .. } | Formatter::Auto => {} Formatter::Prettier { .. } | Formatter::Auto => {}
Formatter::LanguageServer | Formatter::External { .. } => return Task::ready(Ok(())), Formatter::LanguageServer | Formatter::External { .. } => return,
}; };
let Some(node) = self.node.as_ref().cloned() else { let Some(node) = self.node.as_ref().cloned() else {
return Task::ready(Ok(())); return;
}; };
let mut prettier_plugins = None; let mut prettier_plugins = None;
if new_language.prettier_parser_name().is_some() { if new_language.prettier_parser_name().is_some() {
prettier_plugins prettier_plugins
.get_or_insert_with(|| HashSet::default()) .get_or_insert_with(|| HashSet::<&'static str>::default())
.extend( .extend(
new_language new_language
.lsp_adapters() .lsp_adapters()
@ -8603,114 +8650,288 @@ impl Project {
) )
} }
let Some(prettier_plugins) = prettier_plugins else { let Some(prettier_plugins) = prettier_plugins else {
return Task::ready(Ok(())); return;
}; };
let fs = Arc::clone(&self.fs);
let locate_prettier_installation = match worktree.and_then(|worktree_id| {
self.worktree_for_id(worktree_id, cx)
.map(|worktree| worktree.read(cx).abs_path())
}) {
Some(locate_from) => {
let installed_prettiers = self.prettier_instances.keys().cloned().collect();
cx.executor().spawn(async move {
Prettier::locate_prettier_installation(
fs.as_ref(),
&installed_prettiers,
locate_from.as_ref(),
)
.await
})
}
None => Task::ready(Ok(ControlFlow::Break(()))),
};
let mut plugins_to_install = prettier_plugins; let mut plugins_to_install = prettier_plugins;
let (mut install_success_tx, mut install_success_rx) =
futures::channel::mpsc::channel::<HashSet<&'static str>>(1);
let new_installation_process = cx
.spawn(|this, mut cx| async move {
if let Some(installed_plugins) = install_success_rx.next().await {
this.update(&mut cx, |this, _| {
let default_prettier =
this.default_prettier
.get_or_insert_with(|| DefaultPrettier {
installation_process: None,
installed_plugins: HashSet::default(),
});
if !installed_plugins.is_empty() {
log::info!("Installed new prettier plugins: {installed_plugins:?}");
default_prettier.installed_plugins.extend(installed_plugins);
}
})
.ok();
}
})
.shared();
let previous_installation_process = let previous_installation_process =
if let Some(default_prettier) = &mut self.default_prettier { if let Some(default_prettier) = &mut self.default_prettier {
plugins_to_install plugins_to_install
.retain(|plugin| !default_prettier.installed_plugins.contains(plugin)); .retain(|plugin| !default_prettier.installed_plugins.contains(plugin));
if plugins_to_install.is_empty() { if plugins_to_install.is_empty() {
return Task::ready(Ok(())); return;
} }
std::mem::replace( default_prettier.installation_process.clone()
&mut default_prettier.installation_process,
Some(new_installation_process.clone()),
)
} else { } else {
None None
}; };
let default_prettier_dir = util::paths::DEFAULT_PRETTIER_DIR.as_path();
let already_running_prettier = self
.prettier_instances
.get(&(worktree, default_prettier_dir.to_path_buf()))
.cloned();
let fs = Arc::clone(&self.fs); let fs = Arc::clone(&self.fs);
cx.spawn_on_main(move |this, mut cx| async move { let default_prettier = self
if let Some(previous_installation_process) = previous_installation_process { .default_prettier
previous_installation_process.await; .get_or_insert_with(|| DefaultPrettier {
} instance: None,
let mut everything_was_installed = false; installation_process: None,
this.update(&mut cx, |this, _| { installed_plugins: HashSet::default(),
match &mut this.default_prettier { });
Some(default_prettier) => { default_prettier.installation_process = Some(
plugins_to_install cx.spawn_on_main(|this, mut cx| async move {
.retain(|plugin| !default_prettier.installed_plugins.contains(plugin)); match locate_prettier_installation
everything_was_installed = plugins_to_install.is_empty(); .await
}, .context("locate prettier installation")
None => this.default_prettier = Some(DefaultPrettier { installation_process: Some(new_installation_process), installed_plugins: HashSet::default() }), .map_err(Arc::new)?
} {
})?; ControlFlow::Break(()) => return Ok(()),
if everything_was_installed { ControlFlow::Continue(Some(_non_default_prettier)) => return Ok(()),
return Ok(()); ControlFlow::Continue(None) => {
} let mut needs_install = match previous_installation_process {
Some(previous_installation_process) => {
cx.spawn(move |_| async move { previous_installation_process.await.is_err()
let prettier_wrapper_path = default_prettier_dir.join(prettier2::PRETTIER_SERVER_FILE); }
// method creates parent directory if it doesn't exist None => true,
fs.save(&prettier_wrapper_path, &text::Rope::from(prettier2::PRETTIER_SERVER_JS), text::LineEnding::Unix).await };
.with_context(|| format!("writing {} file at {prettier_wrapper_path:?}", prettier2::PRETTIER_SERVER_FILE))?; this.update(&mut cx, |this, _| {
if let Some(default_prettier) = &mut this.default_prettier {
let packages_to_versions = future::try_join_all( plugins_to_install.retain(|plugin| {
plugins_to_install !default_prettier.installed_plugins.contains(plugin)
.iter() });
.chain(Some(&"prettier")) needs_install |= !plugins_to_install.is_empty();
.map(|package_name| async { }
let returned_package_name = package_name.to_string(); })?;
let latest_version = node.npm_package_latest_version(package_name) if needs_install {
let installed_plugins = plugins_to_install.clone();
cx.executor()
.spawn(async move {
install_default_prettier(plugins_to_install, node, fs).await
})
.await .await
.with_context(|| { .context("prettier & plugins install")
format!("fetching latest npm version for package {returned_package_name}") .map_err(Arc::new)?;
})?; this.update(&mut cx, |this, _| {
anyhow::Ok((returned_package_name, latest_version)) let default_prettier =
}), this.default_prettier
) .get_or_insert_with(|| DefaultPrettier {
.await instance: None,
.context("fetching latest npm versions")?; installation_process: Some(
Task::ready(Ok(())).shared(),
log::info!("Fetching default prettier and plugins: {packages_to_versions:?}"); ),
let borrowed_packages = packages_to_versions.iter().map(|(package, version)| { installed_plugins: HashSet::default(),
(package.as_str(), version.as_str()) });
}).collect::<Vec<_>>(); default_prettier.instance = None;
node.npm_install_packages(default_prettier_dir, &borrowed_packages).await.context("fetching formatter packages")?; default_prettier.installed_plugins.extend(installed_plugins);
let installed_packages = !plugins_to_install.is_empty(); })?;
install_success_tx.try_send(plugins_to_install).ok(); }
if !installed_packages {
if let Some(prettier) = already_running_prettier {
prettier.await.map_err(|e| anyhow::anyhow!("Default prettier startup await failure: {e:#}"))?.clear_cache().await.context("clearing default prettier cache after plugins install")?;
} }
} }
Ok(())
anyhow::Ok(()) })
}).await .shared(),
}) );
} }
} }
fn start_default_prettier(
node: Arc<dyn NodeRuntime>,
worktree_id: Option<WorktreeId>,
cx: &mut ModelContext<'_, Project>,
) -> Task<Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>>> {
cx.spawn(move |project, mut cx| async move {
loop {
let default_prettier_installing = match project.update(&mut cx, |project, _| {
project
.default_prettier
.as_ref()
.and_then(|default_prettier| default_prettier.installation_process.clone())
}) {
Ok(installation) => installation,
Err(e) => {
return Task::ready(Err(Arc::new(
e.context("project is gone during default prettier installation"),
)))
.shared()
}
};
match default_prettier_installing {
Some(installation_task) => {
if installation_task.await.is_ok() {
break;
}
}
None => break,
}
}
match project.update(&mut cx, |project, cx| {
match project
.default_prettier
.as_mut()
.and_then(|default_prettier| default_prettier.instance.as_mut())
{
Some(default_prettier) => default_prettier.clone(),
None => {
let new_default_prettier =
start_prettier(node, DEFAULT_PRETTIER_DIR.clone(), worktree_id, cx);
project
.default_prettier
.get_or_insert_with(|| DefaultPrettier {
instance: None,
installation_process: None,
#[cfg(not(any(test, feature = "test-support")))]
installed_plugins: HashSet::default(),
})
.instance = Some(new_default_prettier.clone());
new_default_prettier
}
}
}) {
Ok(task) => task,
Err(e) => Task::ready(Err(Arc::new(
e.context("project is gone during default prettier startup"),
)))
.shared(),
}
})
}
fn start_prettier(
node: Arc<dyn NodeRuntime>,
prettier_dir: PathBuf,
worktree_id: Option<WorktreeId>,
cx: &mut ModelContext<'_, Project>,
) -> Shared<Task<Result<Arc<Prettier>, Arc<anyhow::Error>>>> {
cx.spawn(move |project, mut cx| async move {
let new_server_id = project.update(&mut cx, |project, _| {
project.languages.next_language_server_id()
})?;
let new_prettier = Prettier::start(new_server_id, prettier_dir, node, cx.clone())
.await
.context("default prettier spawn")
.map(Arc::new)
.map_err(Arc::new)?;
register_new_prettier(&project, &new_prettier, worktree_id, new_server_id, &mut cx);
Ok(new_prettier)
})
.shared()
}
fn register_new_prettier(
project: &WeakModel<Project>,
prettier: &Prettier,
worktree_id: Option<WorktreeId>,
new_server_id: LanguageServerId,
cx: &mut AsyncAppContext,
) {
let prettier_dir = prettier.prettier_dir();
let is_default = prettier.is_default();
if is_default {
log::info!("Started default prettier in {prettier_dir:?}");
} else {
log::info!("Started prettier in {prettier_dir:?}");
}
if let Some(prettier_server) = prettier.server() {
project
.update(cx, |project, cx| {
let name = if is_default {
LanguageServerName(Arc::from("prettier (default)"))
} else {
let worktree_path = worktree_id
.and_then(|id| project.worktree_for_id(id, cx))
.map(|worktree| worktree.update(cx, |worktree, _| worktree.abs_path()));
let name = match worktree_path {
Some(worktree_path) => {
if prettier_dir == worktree_path.as_ref() {
let name = prettier_dir
.file_name()
.and_then(|name| name.to_str())
.unwrap_or_default();
format!("prettier ({name})")
} else {
let dir_to_display = prettier_dir
.strip_prefix(worktree_path.as_ref())
.ok()
.unwrap_or(prettier_dir);
format!("prettier ({})", dir_to_display.display())
}
}
None => format!("prettier ({})", prettier_dir.display()),
};
LanguageServerName(Arc::from(name))
};
project
.supplementary_language_servers
.insert(new_server_id, (name, Arc::clone(prettier_server)));
cx.emit(Event::LanguageServerAdded(new_server_id));
})
.ok();
}
}
#[cfg(not(any(test, feature = "test-support")))]
async fn install_default_prettier(
plugins_to_install: HashSet<&'static str>,
node: Arc<dyn NodeRuntime>,
fs: Arc<dyn Fs>,
) -> anyhow::Result<()> {
let prettier_wrapper_path = DEFAULT_PRETTIER_DIR.join(prettier2::PRETTIER_SERVER_FILE);
// method creates parent directory if it doesn't exist
fs.save(
&prettier_wrapper_path,
&text::Rope::from(prettier2::PRETTIER_SERVER_JS),
text::LineEnding::Unix,
)
.await
.with_context(|| {
format!(
"writing {} file at {prettier_wrapper_path:?}",
prettier2::PRETTIER_SERVER_FILE
)
})?;
let packages_to_versions =
future::try_join_all(plugins_to_install.iter().chain(Some(&"prettier")).map(
|package_name| async {
let returned_package_name = package_name.to_string();
let latest_version = node
.npm_package_latest_version(package_name)
.await
.with_context(|| {
format!("fetching latest npm version for package {returned_package_name}")
})?;
anyhow::Ok((returned_package_name, latest_version))
},
))
.await
.context("fetching latest npm versions")?;
log::info!("Fetching default prettier and plugins: {packages_to_versions:?}");
let borrowed_packages = packages_to_versions
.iter()
.map(|(package, version)| (package.as_str(), version.as_str()))
.collect::<Vec<_>>();
node.npm_install_packages(DEFAULT_PRETTIER_DIR.as_path(), &borrowed_packages)
.await
.context("fetching formatter packages")?;
anyhow::Ok(())
}
fn subscribe_for_copilot_events(
    copilot: &Model<Copilot>,
    cx: &mut ModelContext<'_, Project>,

View file

@ -1,19 +1,24 @@
// use crate::{search::PathMatcher, worktree::WorktreeModelHandle, Event, *}; // use crate::{Event, *};
// use fs::{FakeFs, RealFs}; // use fs::FakeFs;
// use futures::{future, StreamExt}; // use futures::{future, StreamExt};
// use gpui::{executor::Deterministic, test::subscribe, AppContext}; // use gpui::AppContext;
// use language2::{ // use language::{
// language_settings::{AllLanguageSettings, LanguageSettingsContent}, // language_settings::{AllLanguageSettings, LanguageSettingsContent},
// tree_sitter_rust, tree_sitter_typescript, Diagnostic, FakeLspAdapter, LanguageConfig, // tree_sitter_rust, tree_sitter_typescript, Diagnostic, FakeLspAdapter, LanguageConfig,
// LineEnding, OffsetRangeExt, Point, ToPoint, // LineEnding, OffsetRangeExt, Point, ToPoint,
// }; // };
// use lsp2::Url; // use lsp::Url;
// use parking_lot::Mutex; // use parking_lot::Mutex;
// use pretty_assertions::assert_eq; // use pretty_assertions::assert_eq;
// use serde_json::json; // use serde_json::json;
// use std::{cell::RefCell, os::unix, rc::Rc, task::Poll}; // use std::{os, task::Poll};
// use unindent::Unindent as _; // use unindent::Unindent as _;
// use util::{assert_set_eq, test::temp_tree}; // use util::{assert_set_eq, paths::PathMatcher, test::temp_tree};
// #[gpui::test]
// async fn test_block_via_channel(cx: &mut gpui2::TestAppContext) {
// cx.executor().allow_parking();
// }
// #[cfg(test)] // #[cfg(test)]
// #[ctor::ctor] // #[ctor::ctor]

View file

@ -1,7 +1,6 @@
use aho_corasick::{AhoCorasick, AhoCorasickBuilder}; use aho_corasick::{AhoCorasick, AhoCorasickBuilder};
use anyhow::{Context, Result}; use anyhow::{Context, Result};
use client2::proto; use client2::proto;
use globset::{Glob, GlobMatcher};
use itertools::Itertools; use itertools::Itertools;
use language2::{char_kind, BufferSnapshot}; use language2::{char_kind, BufferSnapshot};
use regex::{Regex, RegexBuilder}; use regex::{Regex, RegexBuilder};
@ -10,9 +9,10 @@ use std::{
borrow::Cow, borrow::Cow,
io::{BufRead, BufReader, Read}, io::{BufRead, BufReader, Read},
ops::Range, ops::Range,
path::{Path, PathBuf}, path::Path,
sync::Arc, sync::Arc,
}; };
use util::paths::PathMatcher;
#[derive(Clone, Debug)] #[derive(Clone, Debug)]
pub struct SearchInputs { pub struct SearchInputs {
@ -52,31 +52,6 @@ pub enum SearchQuery {
}, },
} }
#[derive(Clone, Debug)]
pub struct PathMatcher {
maybe_path: PathBuf,
glob: GlobMatcher,
}
impl std::fmt::Display for PathMatcher {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
self.maybe_path.to_string_lossy().fmt(f)
}
}
impl PathMatcher {
pub fn new(maybe_glob: &str) -> Result<Self, globset::Error> {
Ok(PathMatcher {
glob: Glob::new(&maybe_glob)?.compile_matcher(),
maybe_path: PathBuf::from(maybe_glob),
})
}
pub fn is_match<P: AsRef<Path>>(&self, other: P) -> bool {
other.as_ref().starts_with(&self.maybe_path) || self.glob.is_match(other)
}
}
impl SearchQuery { impl SearchQuery {
pub fn text( pub fn text(
query: impl ToString, query: impl ToString,

View file

@ -29,7 +29,6 @@ serde.workspace = true
 serde_derive.workspace = true
 smallvec.workspace = true
 smol.workspace = true
-globset.workspace = true
 serde_json.workspace = true

 [dev-dependencies]
 client = { path = "../client", features = ["test-support"] }

View file

@ -22,7 +22,7 @@ use gpui::{
}; };
use menu::Confirm; use menu::Confirm;
use project::{ use project::{
search::{PathMatcher, SearchInputs, SearchQuery}, search::{SearchInputs, SearchQuery},
Entry, Project, Entry, Project,
}; };
use semantic_index::{SemanticIndex, SemanticIndexStatus}; use semantic_index::{SemanticIndex, SemanticIndexStatus};
@ -37,7 +37,7 @@ use std::{
sync::Arc, sync::Arc,
time::{Duration, Instant}, time::{Duration, Instant},
}; };
use util::ResultExt as _; use util::{paths::PathMatcher, ResultExt as _};
use workspace::{ use workspace::{
item::{BreadcrumbText, Item, ItemEvent, ItemHandle}, item::{BreadcrumbText, Item, ItemEvent, ItemHandle},
searchable::{Direction, SearchableItem, SearchableItemHandle}, searchable::{Direction, SearchableItem, SearchableItemHandle},

View file

@ -9,7 +9,7 @@ use futures::channel::oneshot;
use gpui::executor; use gpui::executor;
use ndarray::{Array1, Array2}; use ndarray::{Array1, Array2};
use ordered_float::OrderedFloat; use ordered_float::OrderedFloat;
use project::{search::PathMatcher, Fs}; use project::Fs;
use rpc::proto::Timestamp; use rpc::proto::Timestamp;
use rusqlite::params; use rusqlite::params;
use rusqlite::types::Value; use rusqlite::types::Value;
@ -21,7 +21,7 @@ use std::{
sync::Arc, sync::Arc,
time::SystemTime, time::SystemTime,
}; };
use util::TryFutureExt; use util::{paths::PathMatcher, TryFutureExt};
pub fn argsort<T: Ord>(data: &[T]) -> Vec<usize> { pub fn argsort<T: Ord>(data: &[T]) -> Vec<usize> {
let mut indices = (0..data.len()).collect::<Vec<_>>(); let mut indices = (0..data.len()).collect::<Vec<_>>();

View file

@ -21,7 +21,7 @@ use ordered_float::OrderedFloat;
use parking_lot::Mutex; use parking_lot::Mutex;
use parsing::{CodeContextRetriever, Span, SpanDigest, PARSEABLE_ENTIRE_FILE_TYPES}; use parsing::{CodeContextRetriever, Span, SpanDigest, PARSEABLE_ENTIRE_FILE_TYPES};
use postage::watch; use postage::watch;
use project::{search::PathMatcher, Fs, PathChange, Project, ProjectEntryId, Worktree, WorktreeId}; use project::{Fs, PathChange, Project, ProjectEntryId, Worktree, WorktreeId};
use smol::channel; use smol::channel;
use std::{ use std::{
cmp::Reverse, cmp::Reverse,
@ -33,6 +33,7 @@ use std::{
sync::{Arc, Weak}, sync::{Arc, Weak},
time::{Duration, Instant, SystemTime}, time::{Duration, Instant, SystemTime},
}; };
use util::paths::PathMatcher;
use util::{channel::RELEASE_CHANNEL_NAME, http::HttpClient, paths::EMBEDDINGS_DIR, ResultExt}; use util::{channel::RELEASE_CHANNEL_NAME, http::HttpClient, paths::EMBEDDINGS_DIR, ResultExt};
use workspace::WorkspaceCreated; use workspace::WorkspaceCreated;

View file

@@ -10,13 +10,13 @@ use gpui::{executor::Deterministic, Task, TestAppContext};
 use language::{Language, LanguageConfig, LanguageRegistry, ToOffset};
 use parking_lot::Mutex;
 use pretty_assertions::assert_eq;
-use project::{project_settings::ProjectSettings, search::PathMatcher, FakeFs, Fs, Project};
+use project::{project_settings::ProjectSettings, FakeFs, Fs, Project};
 use rand::{rngs::StdRng, Rng};
 use serde_json::json;
 use settings::SettingsStore;
 use std::{path::Path, sync::Arc, time::SystemTime};
 use unindent::Unindent;
-use util::RandomCharIter;
+use util::{paths::PathMatcher, RandomCharIter};
 
 #[ctor::ctor]
 fn init_logger() {

View file

@@ -14,6 +14,7 @@ test-support = ["tempdir", "git2"]
 [dependencies]
 anyhow.workspace = true
 backtrace = "0.3"
+globset.workspace = true
 log.workspace = true
 lazy_static.workspace = true
 futures.workspace = true

View file

@@ -1,5 +1,6 @@
 use std::path::{Path, PathBuf};
 
+use globset::{Glob, GlobMatcher};
 use serde::{Deserialize, Serialize};
 
 lazy_static::lazy_static! {
@@ -189,6 +190,31 @@ impl<P> PathLikeWithPosition<P> {
     }
 }
 
+#[derive(Clone, Debug)]
+pub struct PathMatcher {
+    maybe_path: PathBuf,
+    glob: GlobMatcher,
+}
+
+impl std::fmt::Display for PathMatcher {
+    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+        self.maybe_path.to_string_lossy().fmt(f)
+    }
+}
+
+impl PathMatcher {
+    pub fn new(maybe_glob: &str) -> Result<Self, globset::Error> {
+        Ok(PathMatcher {
+            glob: Glob::new(&maybe_glob)?.compile_matcher(),
+            maybe_path: PathBuf::from(maybe_glob),
+        })
+    }
+
+    pub fn is_match<P: AsRef<Path>>(&self, other: P) -> bool {
+        other.as_ref().starts_with(&self.maybe_path) || self.glob.is_match(other)
+    }
+}
+
 #[cfg(test)]
 mod tests {
     use super::*;
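
For reference, a minimal usage sketch of the relocated `util::paths::PathMatcher` (not part of this diff): `is_match` first checks whether the candidate path starts with the pattern taken literally, and only then falls back to the compiled `globset` matcher. It assumes a crate inside the Zed workspace that depends on `util`; the sample patterns and paths are illustrative only.

```rust
use util::paths::PathMatcher;

fn main() {
    // A plain directory name matches anything beneath it via the path-prefix check.
    let node_modules = PathMatcher::new("node_modules").expect("valid pattern");
    assert!(node_modules.is_match("node_modules/lodash/package.json"));

    // A glob pattern is handled by the compiled `globset` matcher instead.
    let rust_sources = PathMatcher::new("**/*.rs").expect("valid pattern");
    assert!(rust_sources.is_match("crates/util/src/paths.rs"));
    assert!(!rust_sources.is_match("crates/util/Cargo.toml"));
}
```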

View file

@@ -3,7 +3,7 @@ authors = ["Nathan Sobo <nathansobo@gmail.com>"]
 description = "The fast, collaborative code editor."
 edition = "2021"
 name = "zed"
-version = "0.111.0"
+version = "0.111.6"
 publish = false
 
 [lib]

View file

@@ -1 +1 @@
-dev
+stable

View file

@@ -8,6 +8,11 @@
 [
   (jsx_element)
   (jsx_fragment)
+] @element
+
+[
+  (jsx_opening_element)
+  (jsx_closing_element)
   (jsx_self_closing_element)
   (jsx_expression)
-] @element
+] @default

View file

@@ -8,6 +8,11 @@
 [
   (jsx_element)
   (jsx_fragment)
+] @element
+
+[
+  (jsx_opening_element)
+  (jsx_closing_element)
   (jsx_self_closing_element)
   (jsx_expression)
-] @element
+] @default

View file

@@ -2,6 +2,7 @@ use anyhow::{anyhow, Result};
 use async_compression::futures::bufread::GzipDecoder;
 use async_tar::Archive;
 use async_trait::async_trait;
+use collections::HashMap;
 use futures::{future::BoxFuture, FutureExt};
 use gpui::AppContext;
 use language::{LanguageServerName, LspAdapter, LspAdapterDelegate};
@@ -20,12 +21,7 @@ use util::{fs::remove_matching, github::latest_github_release};
 use util::{github::GitHubLspBinaryVersion, ResultExt};
 
 fn typescript_server_binary_arguments(server_path: &Path) -> Vec<OsString> {
-    vec![
-        server_path.into(),
-        "--stdio".into(),
-        "--tsserver-path".into(),
-        "node_modules/typescript/lib".into(),
-    ]
+    vec![server_path.into(), "--stdio".into()]
 }
 
 fn eslint_server_binary_arguments(server_path: &Path) -> Vec<OsString> {
@@ -158,9 +154,20 @@ impl LspAdapter for TypeScriptLspAdapter {
 
     async fn initialization_options(&self) -> Option<serde_json::Value> {
         Some(json!({
-            "provideFormatter": true
+            "provideFormatter": true,
+            "tsserver": {
+                "path": "node_modules/typescript/lib",
+            },
         }))
     }
+
+    async fn language_ids(&self) -> HashMap<String, String> {
+        HashMap::from_iter([
+            ("TypeScript".into(), "typescript".into()),
+            ("JavaScript".into(), "javascript".into()),
+            ("TSX".into(), "typescriptreact".into()),
+        ])
+    }
 }
 
 async fn get_cached_ts_server_binary(
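
A brief note on the change above: the tsserver location that used to be passed via the `--tsserver-path` command-line flag is now delivered through LSP initialization options, and the adapter also advertises explicit language IDs. Below is a standalone sketch of the resulting options payload (assumes only `serde_json`; not part of the diff itself):

```rust
use serde_json::json;

fn main() {
    // Mirrors the options the adapter now returns from `initialization_options`.
    let initialization_options = json!({
        "provideFormatter": true,
        "tsserver": {
            // Same relative path that was previously passed on the command line.
            "path": "node_modules/typescript/lib",
        },
    });
    println!("{initialization_options:#}");
}
```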

View file

@@ -8,6 +8,11 @@
 [
   (jsx_element)
   (jsx_fragment)
+] @element
+
+[
+  (jsx_opening_element)
+  (jsx_closing_element)
   (jsx_self_closing_element)
   (jsx_expression)
-] @element
+] @default

View file

@@ -8,6 +8,11 @@
 [
   (jsx_element)
   (jsx_fragment)
+] @element
+
+[
+  (jsx_opening_element)
+  (jsx_closing_element)
   (jsx_self_closing_element)
   (jsx_expression)
-] @element
+] @default