ollama: Change default context size to 4096 (#31682)
Ollama increased its default context size from 2048 to 4096 tokens in version v0.6.7, which was released over a month ago: https://github.com/ollama/ollama/releases/tag/v0.6.7

Release Notes:
- ollama: Update default model context to 4096 (matching upstream)
parent 32214abb64
commit 6d687a2c2c
1 changed file with 1 addition and 1 deletion
@@ -42,7 +42,7 @@ pub struct Model {
 fn get_max_tokens(name: &str) -> usize {
     /// Default context length for unknown models.
-    const DEFAULT_TOKENS: usize = 2048;
+    const DEFAULT_TOKENS: usize = 4096;
     /// Magic number. Lets many Ollama models work with ~16GB of ram.
     const MAXIMUM_TOKENS: usize = 16384;
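A minimal sketch of how these constants could plug together. The body of `get_max_tokens` is not shown in this diff, so the per-model lookup below (the model names and their context lengths) is purely hypothetical; only `DEFAULT_TOKENS`, `MAXIMUM_TOKENS`, and the function signature come from the change itself.

```rust
/// Default context length for unknown models (4096 as of this commit,
/// matching Ollama v0.6.7 upstream).
const DEFAULT_TOKENS: usize = 4096;
/// Magic number. Lets many Ollama models work with ~16GB of ram.
const MAXIMUM_TOKENS: usize = 16384;

/// Hypothetical sketch: pick a context length for a model by name,
/// clamping any known per-model value to MAXIMUM_TOKENS and falling
/// back to DEFAULT_TOKENS for models the lookup does not recognize.
fn get_max_tokens(name: &str) -> usize {
    // Invented example entries; the real mapping is not in this diff.
    let known: Option<usize> = match name {
        n if n.contains("llama3") => Some(128_000),
        n if n.contains("phi") => Some(8_192),
        _ => None,
    };
    known.map_or(DEFAULT_TOKENS, |t| t.min(MAXIMUM_TOKENS))
}

fn main() {
    // Unknown models now get 4096 instead of the old 2048 default.
    assert_eq!(get_max_tokens("some-unknown-model"), 4096);
    // Known large-context models are clamped to the 16384 ceiling.
    assert_eq!(get_max_tokens("llama3.1"), 16_384);
    println!("ok");
}
```

With this shape, bumping `DEFAULT_TOKENS` changes behavior only for models absent from the lookup, which is exactly the one-line scope of the commit.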