By default, Ollama uses a context window size of 2048 tokens.
To change this when using `ollama run`, use `/set parameter`:

```
/set parameter num_ctx 4096
```
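Note that `/set parameter` applies only to the current interactive session. A minimal sketch of one way to make the change persistent is to bake the parameter into a new model tag via a Modelfile (assumes `llama3.2` is already pulled; the tag `llama3.2-4k` is a hypothetical name):

```shell
# Sketch: persist the larger context window in a new model tag.
# "llama3.2-4k" is a hypothetical name; any tag works.
cat > Modelfile <<'EOF'
FROM llama3.2
PARAMETER num_ctx 4096
EOF
ollama create llama3.2-4k -f Modelfile
ollama run llama3.2-4k
```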
When using the API, specify the `num_ctx` parameter:
```shell
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "options": { "num_ctx": 4096 }
}'
```
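The same `options` block should also work on the chat endpoint; a minimal sketch, assuming the default local server on port 11434 (`"stream": false` makes the response arrive as a single JSON object instead of a stream):

```shell
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{ "role": "user", "content": "Why is the sky blue?" }],
  "options": { "num_ctx": 4096 },
  "stream": false
}'
```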