
Support context_window config for OpenAILike / Ollama when create LLM #548

Open
Mini256 opened this issue Dec 26, 2024 · 0 comments

Mini256 commented Dec 26, 2024

Although the current LLM creation page supports setting `context_window` in the JSON config, this option is easy to overlook, and omitting it leads to errors related to the token limit.

The OpenAI LLM class maintains the `context_window` for different models internally, but third-party providers such as OpenAILike / Ollama do not; they fall back to LlamaIndex's default value (3900). (For now we increase the default value to 200000 as a workaround.)
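A minimal sketch of the kind of fix this implies: when building the provider kwargs from the JSON config, set `context_window` explicitly instead of silently inheriting LlamaIndex's 3900-token default. The function name `build_llm_kwargs` and the 200000 fallback wiring are hypothetical illustrations, not code from this repository; only the default values come from the issue text.

```python
import json

# LlamaIndex's default context window for generic providers (per this issue).
LLAMAINDEX_DEFAULT_CONTEXT_WINDOW = 3900


def build_llm_kwargs(config_json: str, default_context_window: int = LLAMAINDEX_DEFAULT_CONTEXT_WINDOW) -> dict:
    """Parse the JSON config from the LLM creation page and ensure
    context_window is always set, so OpenAILike / Ollama providers
    don't silently inherit the 3900-token default."""
    config = json.loads(config_json) if config_json else {}
    # Only fill in the fallback when the user did not set it themselves.
    config.setdefault("context_window", default_context_window)
    return config


# A config that omits context_window gets the explicit fallback...
kwargs = build_llm_kwargs('{"api_base": "http://localhost:11434"}', default_context_window=200000)
# ...while a user-supplied value is preserved untouched.
kwargs2 = build_llm_kwargs('{"context_window": 32768}')
```

The resulting dict could then be passed to the provider constructor (e.g. `OpenAILike(**kwargs)`), making the effective context window visible in one place rather than buried in a library default.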
