Although the current LLM creation page supports configuring `context_window` in the JSON config, this setting is easy to overlook, which leads to errors related to the token limit.
The OpenAI LLM class maintains the `context_window` of the different models internally, but third-party providers such as OpenAILike / Ollama do not; they fall back to LlamaIndex's default value (3900). As a workaround, the default has currently been increased to 200000.
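For reference, a minimal sketch of setting `context_window` explicitly when constructing these providers in LlamaIndex; the model names and endpoints below are placeholders, not the project's actual configuration:

```python
# Sketch only: pass context_window explicitly so third-party providers
# don't fall back to LlamaIndex's default (3900 tokens).
from llama_index.llms.openai_like import OpenAILike
from llama_index.llms.ollama import Ollama

# OpenAI-compatible provider (placeholder model name and endpoint).
llm = OpenAILike(
    model="my-deployed-model",
    api_base="http://localhost:8000/v1",
    api_key="fake",
    is_chat_model=True,
    context_window=32768,  # set to the model's real limit
)

# Ollama: same idea, override the default context window.
ollama_llm = Ollama(
    model="llama3",
    context_window=8192,
)
```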