zed/crates/language_models/src
Richard Feldman 5405c2c2d3
Standardize on u64 for token counts (#32869)
Previously we were using a mix of `u32` and `usize`, e.g. `max_tokens:
usize, max_output_tokens: Option<u32>` in the same `struct`.
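For illustration only, a minimal sketch of that mix using the two field signatures quoted above; the `ModelLimits` struct name is hypothetical, not an actual Zed type:

```rust
// Hypothetical struct showing the pre-change inconsistency: two token counts
// in the same struct, declared with two different integer widths.
struct ModelLimits {
    max_tokens: usize,              // width depends on the compilation target
    max_output_tokens: Option<u32>, // fixed 32-bit width
}
```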

Although [tiktoken](https://github.com/openai/tiktoken) uses `usize`, token
counts should be consistent across targets: `usize` is only 32 bits on
wasm32, so the same model would otherwise get a smaller context window
there. These token counts could also end up getting serialized over a
binary protocol, where a platform-dependent width is a liability. So
`usize` is not the right choice for token counts.
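A standalone sketch (not Zed code) that shows the target dependence directly: `usize` matches the pointer width, so it is 8 bytes on x86_64/aarch64 but only 4 bytes on wasm32, while `u64` never changes.

```rust
// Print the width of usize on the current target versus the fixed width of u64.
fn main() {
    println!("usize is {} bytes on this target", std::mem::size_of::<usize>());
    println!("u64 is always {} bytes", std::mem::size_of::<u64>());
}
```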

I chose to standardize on `u64` over `u32` because we don't store many
of them (so the extra size should be insignificant) and future models
may exceed `u32::MAX` tokens.
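A minimal sketch of the standardized shape, again using the hypothetical `ModelLimits` struct rather than the real Zed types, with the two limits shown for scale:

```rust
// The same hypothetical struct after standardizing on u64 everywhere.
struct ModelLimits {
    max_tokens: u64,
    max_output_tokens: Option<u64>,
}

fn main() {
    // u32 tops out around 4.29 billion tokens; u64 leaves ample headroom
    // while costing only a few extra bytes per stored count.
    assert_eq!(u32::MAX, 4_294_967_295);
    assert_eq!(u64::MAX, 18_446_744_073_709_551_615);
}
```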

Release Notes:

- N/A
2025-06-17 10:43:07 -04:00
| Name | Last commit | Date |
| --- | --- | --- |
| provider | Standardize on u64 for token counts (#32869) | 2025-06-17 10:43:07 -04:00 |
| ui | agent: Fix instruction list item with multiple buttons not working (#30541) | 2025-05-12 06:19:20 -03:00 |
| language_models.rs | Add support for OpenRouter as a language model provider (#29496) | 2025-06-03 15:59:46 +00:00 |
| provider.rs | Add support for OpenRouter as a language model provider (#29496) | 2025-06-03 15:59:46 +00:00 |
| settings.rs | copilot: Allow enterprise to sign in and use copilot (#32296) | 2025-06-17 11:36:53 +02:00 |
| ui.rs | assistant: Refine settings view's instruction visuals (#25812) | 2025-02-28 12:06:47 -03:00 |