[Bug]: Gemini returns logprobs, but not in OpenAI format #9888


Closed
emk opened this issue Apr 10, 2025 · 3 comments · Fixed by #9893
Labels: bug (Something isn't working), priority

Comments

emk commented Apr 10, 2025

What happened?

When calling LiteLLM using a Gemini model and the Rust async-openai client, I have just started getting the following error:

invalid type: floating point `-0.005895897917802447`, expected struct ChatChoiceLogprobs

The LiteLLM response contains logprobs values which look like:

"logprobs":-0.003003644641441635

But according to the OpenAI API docs, this should look like:

"logprobs": { "content": [...], "refusal": [...] }
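Strictly typed clients such as Rust's async-openai reject the mismatch at deserialization time. As an illustration only (this helper is not LiteLLM or async-openai code), a minimal Python check reproduces the same class of error:

```python
import json


def parse_choice_logprobs(raw: str):
    """Reject a choice whose "logprobs" field is not an OpenAI-shaped object.

    Mirrors what a strictly typed client does when deserializing the
    response; illustrative helper, not part of any real library.
    """
    value = json.loads(raw)["logprobs"]
    if not isinstance(value, dict):
        raise TypeError(
            f"invalid type: {value!r}, expected struct ChatChoiceLogprobs"
        )
    return value


# An OpenAI-shaped object parses fine:
parse_choice_logprobs('{"logprobs": {"content": [], "refusal": []}}')

# A bare float, as returned here, is rejected:
try:
    parse_choice_logprobs('{"logprobs": -0.005895897917802447}')
except TypeError as err:
    print(err)
```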

I believe this behavior changed in #9713. Logprobs have apparently been removed from the API before, when they didn't follow the OpenAI format (see #579).

I would recommend that we either follow the OpenAI API here, or not return logprobs at all.
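The second option (not returning logprobs at all when they don't match the spec) can be sketched as follows; the helper name is hypothetical and this is not LiteLLM's actual code:

```python
def sanitize_logprobs(value):
    """Pass through OpenAI-shaped logprobs objects; drop anything else.

    A bare float (e.g. a Gemini avg_logprobs score) carries no per-token
    detail, so it is replaced with None rather than emitted in a shape
    that OpenAI clients cannot parse. Hypothetical helper for illustration.
    """
    if isinstance(value, dict):
        return value
    return None


assert sanitize_logprobs(-0.003003644641441635) is None
assert sanitize_logprobs({"content": [], "refusal": []}) == {"content": [], "refusal": []}
```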

Thank you for such a fantastic proxy!

Relevant log output

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

litellm-proxy Docker image 44f4b5172e6c

Twitter / LinkedIn details

No response

@emk emk added the bug Something isn't working label Apr 10, 2025

emk commented Apr 10, 2025

Downgrading to v1.65.0-stable definitely fixes this.

krrishdholakia (Contributor) commented:
Thanks for the ticket @emk

Picking this up today.

krrishdholakia (Contributor) commented:
This change needs to be reverted: returning logprobs when `logprobs=True` is set is already supported.
