@allen-cook
When requesting logprobs using ChatLiteLLM, the results aren't returned as part of the ChatResponse.response_metadata. This PR adds them.

You can reproduce the problem with this Python script using fireworks.ai (or any provider that supplies logprobs):

    from langchain_community.chat_models import ChatLiteLLM
    from langchain_core.messages import HumanMessage, SystemMessage

    chat = ChatLiteLLM(
        model="fireworks_ai/accounts/fireworks/models/deepseek-v3p1",
        temperature=0.3,
        # Pass logprobs through model_kwargs
        model_kwargs={
            "logprobs": 1,
        }
    )
    messages = [
        SystemMessage(content="You are a helpful assistant."),
        HumanMessage(content="What is the capital of France?")
    ]
    response = chat.invoke(messages)
    # Without the fix this raises KeyError: "logprobs" is absent from the metadata
    print(response.response_metadata["logprobs"])
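For illustration, the shape of the fix can be sketched as a helper that forwards provider logprobs into the metadata dict. The function name and dict keys here are hypothetical, not the actual langchain-community internals; they only show the intent: copy `logprobs` from the provider's choice payload into `response_metadata` when present.

```python
# Hypothetical sketch of the fix (not the real ChatLiteLLM internals):
# when building response_metadata from a provider "choice" dict, forward
# the logprobs field if the provider supplied one.
def build_response_metadata(choice: dict, model: str) -> dict:
    metadata = {
        "model": model,
        "finish_reason": choice.get("finish_reason"),
    }
    # Forward logprobs only when the provider actually returned them,
    # so providers without logprobs support are unaffected.
    if choice.get("logprobs") is not None:
        metadata["logprobs"] = choice["logprobs"]
    return metadata
```

With this in place, the repro script above would print the logprobs payload instead of raising a KeyError.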
