ZIm/crates/assistant/src/embedded_scope.rs
Kyle Kelley d77e553466
File context for assistant panel (#9712)
Introducing the Active File Context portion of #9705. When someone is in
the assistant panel, it now includes the active file as a system message
on send, while showing them a nice little display in the lower right:


![image](https://github.com/zed-industries/zed/assets/836375/9abc56e0-e8f2-45ee-9e7e-b83b28b483ea)
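
Concretely, the context goes out as a single extra system message. As a rough
illustration (the format string matches `EmbeddedScope::message` below; the
file name, language, and contents are made up for the example):

````rust
// Illustration only: mirrors the format string in `EmbeddedScope::message`.
// The file name, language name, and file contents below are invented.
fn main() {
    let filename = "src/main.rs";
    let language = "rust"; // a Language's code fence block name
    let text = "fn main() {\n    println!(\"hello\");\n}\n";

    let system_message =
        format!("User's active file `{filename}`:\n\n```{language}\n{text}```\n\n");
    println!("{system_message}");
}
````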

For this iteration, I'd love to see the following before we land this:

* [x] Toggle-able context - the user should be able to disable sending this
context
* [x] Show nothing if there is no context coming in
* [x] Update token count as we change items
* [x] Listen for a more finely scoped event for when the active item
changes
* [x] Create a global for pulling a file icon based on a path. Zed's
main way to do this is nested within the project panel's `FileAssociation`s.
* [x] Get the code fence name for a Language for the system prompt
* [x] Update the token count when the buffer content changes

I'm seeing this PR as the foundation for providing other kinds of
context -- diagnostic summaries, failing tests, additional files, etc.

Release Notes:

- Added file context to assistant chat panel
([#9705](https://github.com/zed-industries/zed/issues/9705)).

<img width="1558" alt="image"
src="https://github.com/zed-industries/zed/assets/836375/86eb7e50-3e28-4754-9c3f-895be588616d">

---------

Co-authored-by: Conrad Irwin <conrad@zed.dev>
Co-authored-by: Nathan <nathan@zed.dev>
Co-authored-by: Antonio Scandurra <me@as-cii.com>
Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2024-03-29 13:55:01 -07:00

91 lines
2.6 KiB
Rust

use editor::MultiBuffer;
use gpui::{AppContext, Model, ModelContext, Subscription};

use crate::{assistant_panel::Conversation, LanguageModelRequestMessage, Role};

#[derive(Default)]
pub struct EmbeddedScope {
    active_buffer: Option<Model<MultiBuffer>>,
    active_buffer_enabled: bool,
    active_buffer_subscription: Option<Subscription>,
}

impl EmbeddedScope {
    pub fn new() -> Self {
        Self {
            active_buffer: None,
            active_buffer_enabled: true,
            active_buffer_subscription: None,
        }
    }

    pub fn set_active_buffer(
        &mut self,
        buffer: Option<Model<MultiBuffer>>,
        cx: &mut ModelContext<Conversation>,
    ) {
        self.active_buffer_subscription.take();

        if let Some(active_buffer) = buffer.clone() {
            self.active_buffer_subscription =
                Some(cx.subscribe(&active_buffer, |conversation, _, e, cx| {
                    if let multi_buffer::Event::Edited { .. } = e {
                        conversation.count_remaining_tokens(cx)
                    }
                }));
        }

        self.active_buffer = buffer;
    }

    pub fn active_buffer(&self) -> Option<&Model<MultiBuffer>> {
        self.active_buffer.as_ref()
    }

    pub fn active_buffer_enabled(&self) -> bool {
        self.active_buffer_enabled
    }

    pub fn set_active_buffer_enabled(&mut self, enabled: bool) {
        self.active_buffer_enabled = enabled;
    }

    /// Provide a message for the language model based on the active buffer.
    pub fn message(&self, cx: &AppContext) -> Option<LanguageModelRequestMessage> {
        if !self.active_buffer_enabled {
            return None;
        }

        let active_buffer = self.active_buffer.as_ref()?;
        let buffer = active_buffer.read(cx);

        if let Some(singleton) = buffer.as_singleton() {
            let singleton = singleton.read(cx);

            let filename = singleton
                .file()
                .map(|file| file.path().to_string_lossy())
                .unwrap_or("Untitled".into());

            let text = singleton.text();

            let language = singleton
                .language()
                .map(|l| {
                    let name = l.code_fence_block_name();
                    name.to_string()
                })
                .unwrap_or_default();

            let markdown =
                format!("User's active file `{filename}`:\n\n```{language}\n{text}```\n\n");

            return Some(LanguageModelRequestMessage {
                role: Role::System,
                content: markdown,
            });
        }

        None
    }
}
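
A minimal sketch of how a send path could consume this, written as if it lived
alongside the code above: `build_request` and the pared-down
`LanguageModelRequest` struct are invented for illustration; only
`EmbeddedScope::message` and `LanguageModelRequestMessage` come from the real
code.

```rust
// Sketch only: the request struct and helper below are illustrative, not the
// assistant panel's actual types or call sites.
struct LanguageModelRequest {
    messages: Vec<LanguageModelRequestMessage>,
}

fn build_request(
    scope: &EmbeddedScope,
    history: Vec<LanguageModelRequestMessage>,
    cx: &gpui::AppContext,
) -> LanguageModelRequest {
    let mut messages = Vec::new();

    // `message` returns `None` when the user has toggled the context off or
    // when no singleton buffer is active, so nothing extra is sent then.
    if let Some(context) = scope.message(cx) {
        messages.push(context);
    }

    messages.extend(history);
    LanguageModelRequest { messages }
}
```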