ollama: Change default context size to 4096 (#31682)

Ollama increased its default context size from 2048 to 4096 tokens in
version v0.6.7, released over a month ago.

https://github.com/ollama/ollama/releases/tag/v0.6.7

Release Notes:

- ollama: Update default model context to 4096 (matching upstream)
This commit is contained in:
tidely 2025-05-30 23:12:39 +03:00 committed by GitHub
parent 32214abb64
commit 6d687a2c2c

@@ -42,7 +42,7 @@ pub struct Model {
 fn get_max_tokens(name: &str) -> usize {
     /// Default context length for unknown models.
-    const DEFAULT_TOKENS: usize = 2048;
+    const DEFAULT_TOKENS: usize = 4096;
     /// Magic number. Lets many Ollama models work with ~16GB of ram.
     const MAXIMUM_TOKENS: usize = 16384;
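To illustrate how these constants interact, here is a minimal sketch of a `get_max_tokens`-style lookup. The model names and their context sizes are hypothetical placeholders, not the actual table in the Zed source; the point is that unknown models now fall back to `DEFAULT_TOKENS` (4096) while known models are capped at `MAXIMUM_TOKENS`.

```rust
// Sketch of a context-size lookup, assuming a simple name -> context table.
// Model names below are illustrative, not the real list from the Zed source.
fn get_max_tokens(name: &str) -> usize {
    // Default context length for unknown models (raised from 2048 to 4096).
    const DEFAULT_TOKENS: usize = 4096;
    // Cap so many Ollama models work with ~16GB of RAM.
    const MAXIMUM_TOKENS: usize = 16384;

    // Ollama model names often carry a tag, e.g. "llama3.1:8b";
    // match on the base name before the colon.
    let base = name.split(':').next().unwrap_or(name);
    let tokens = match base {
        "llama3.1" => 128_000, // hypothetical entry with a large context
        "phi3" => 4_096,       // hypothetical entry at the default size
        _ => DEFAULT_TOKENS,   // unknown model: fall back to the new default
    };
    tokens.min(MAXIMUM_TOKENS) // clamp to keep memory use reasonable
}

fn main() {
    // Unknown models get the new 4096 default.
    assert_eq!(get_max_tokens("some-unknown-model"), 4096);
    // Large known contexts are clamped to the 16384 ceiling.
    assert_eq!(get_max_tokens("llama3.1:8b"), 16384);
    println!("ok");
}
```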