What happened?
When calling LiteLLM using a Gemini model and the Rust `async-openai` client, I have just started getting an error: the LiteLLM response contains `logprobs` values in a shape that does not match the one documented in the OpenAI API docs.
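For reference, the OpenAI API docs describe choice-level `logprobs` as an object containing a `content` array of per-token entries, roughly like this (the token and numeric values below are illustrative only):

```json
{
  "logprobs": {
    "content": [
      {
        "token": "Hello",
        "logprob": -0.31725305,
        "bytes": [72, 101, 108, 108, 111],
        "top_logprobs": []
      }
    ],
    "refusal": null
  }
}
```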
I believe this behavior changed in #9713. Logprobs have apparently been removed from the API before, when they didn't follow the OpenAI format (see #579).
I would recommend that we either follow the OpenAI API here, or not return logprobs at all.
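For completeness, here is a minimal sketch of the failing call path, assuming a LiteLLM proxy at `http://localhost:4000` and a placeholder model alias (both hypothetical). Because `async-openai` deserializes the response into typed structs, a `logprobs` value that is not the expected object shape makes the whole request return an error:

```rust
use async_openai::{
    config::OpenAIConfig,
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Point the client at the LiteLLM proxy; base URL and key are placeholders.
    let config = OpenAIConfig::new()
        .with_api_base("http://localhost:4000/v1")
        .with_api_key("sk-placeholder");
    let client = Client::with_config(config);

    let request = CreateChatCompletionRequestArgs::default()
        .model("gemini/gemini-1.5-pro") // hypothetical model alias on the proxy
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Say hello")
            .build()?
            .into()])
        .build()?;

    // The response is deserialized here; if `logprobs` on a choice is not the
    // object shape the client expects, this call fails instead of succeeding.
    let response = client.chat().create(request).await?;
    println!("{:?}", response.choices[0].message.content);
    Ok(())
}
```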
Thank you for such a fantastic proxy!
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on?
litellm-proxy Docker image 44f4b5172e6c
Twitter / LinkedIn details
No response