google_ai: Add Gemini 2.0 Flash support (#22665)
Release Notes:

- Added support for Google's experimental Gemini 2.0 Flash model.

Note: In my tests the model is oddly slow on short prompts like "hi" but very fast on prompts that need more tokens, such as "write me a snake game in Python". This is most likely an API-side issue. Tested only on Windows, since I don't have Linux installed and don't have a Mac, but the change is platform-independent and should work everywhere.

Why: Gemini 2.0 Flash is very strong at coding and following instructions, so it is a worthwhile addition to the editor. The changes are kept as minimal as possible while adding the model and streaming validation.

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
This commit is contained in:
parent
0d30bda740
commit
799e81ffe5
2 changed files with 22 additions and 2 deletions
@@ -88,6 +88,7 @@ impl CloudModel {
             Self::Google(model) => match model {
                 google_ai::Model::Gemini15Pro
                 | google_ai::Model::Gemini15Flash
+                | google_ai::Model::Gemini20Flash
                 | google_ai::Model::Custom { .. } => {
                     LanguageModelAvailability::RequiresPlan(Plan::ZedPro)
                 }
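For context, below is a minimal sketch of what the matching addition to the `Model` enum in the `google_ai` crate could look like. Only the `CloudModel` match arm is visible in this hunk, so the field name on `Custom`, the helper methods, and the "gemini-2.0-flash-exp" model identifier are assumptions for illustration, not taken from the commit.

```rust
// Sketch of the google_ai::Model enum with the new Gemini 2.0 Flash variant.
// Helper method names and the API identifier below are assumptions.
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum Model {
    Gemini15Pro,
    Gemini15Flash,
    /// The experimental Gemini 2.0 Flash model added in this commit.
    Gemini20Flash,
    Custom { name: String },
}

impl Model {
    /// Identifier sent to the Google AI API (assumed value for 2.0 Flash).
    pub fn id(&self) -> &str {
        match self {
            Model::Gemini15Pro => "gemini-1.5-pro",
            Model::Gemini15Flash => "gemini-1.5-flash",
            Model::Gemini20Flash => "gemini-2.0-flash-exp",
            Model::Custom { name } => name,
        }
    }

    /// Human-readable name for the model picker.
    pub fn display_name(&self) -> &str {
        match self {
            Model::Gemini15Pro => "Gemini 1.5 Pro",
            Model::Gemini15Flash => "Gemini 1.5 Flash",
            Model::Gemini20Flash => "Gemini 2.0 Flash",
            Model::Custom { name } => name,
        }
    }
}

fn main() {
    let model = Model::Gemini20Flash;
    println!("{} -> {}", model.display_name(), model.id());
}
```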