OpenAIModel fails with OpenRouter Gemini: missing "choices" and "created" keys #1746

gkeb opened this issue May 16, 2025 · 5 comments · May be fixed by #1764


gkeb commented May 16, 2025

Description

Summary

When using OpenAIModel with OpenRouter and the model google/gemini-2.0-flash-exp:free, the agent fails with:

TypeError: 'NoneType' object is not subscriptable

This happens because the response returned by OpenRouter for Gemini does not include the choices or created keys expected by _process_response().


Steps to Reproduce

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

model = OpenAIModel(
    'google/gemini-2.0-flash-exp:free',
    provider=OpenAIProvider(
        base_url='https://openrouter.ai/api/v1',
        api_key='sk-or-v1-...',
    ),
)


agent = Agent(model=model, system_prompt="Be helpful.")
result = agent.run_sync("Tell me a joke.")

Observed Error

  • response.created is None, causing datetime.fromtimestamp() to fail
  • response.choices is None, causing choices[0] to raise (a minimal illustration follows below)
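
For clarity, a tiny standalone illustration of the two failure points (this is not pydantic-ai's parsing code, just the same operations applied to the None values OpenRouter can return):

from datetime import datetime, timezone

created = None   # value observed for response.created on the failing responses
choices = None   # value observed for response.choices

try:
    datetime.fromtimestamp(created, tz=timezone.utc)
except TypeError as exc:
    print('created:', exc)

try:
    choices[0]
except TypeError as exc:
    print('choices:', exc)  # 'NoneType' object is not subscriptable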

Suggestion

  • Add a fallback timestamp = datetime.now() when created is missing
  • Handle cases where choices is missing, or validate the response format before attempting to parse (a rough sketch of both follows below)
  • Possibly support custom response parsers for models that aren't 100% OpenAI-compatible
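
A rough sketch of the first two suggestions (this is not the actual pydantic-ai implementation; the method name is taken from the description above and the exception type is only illustrative):

from datetime import datetime, timezone

def _process_response(response):
    # Fall back to "now" when the provider omits the 'created' timestamp
    if response.created is not None:
        timestamp = datetime.fromtimestamp(response.created, tz=timezone.utc)
    else:
        timestamp = datetime.now(tz=timezone.utc)

    # Fail with a descriptive error instead of "'NoneType' object is not subscriptable"
    if not response.choices:
        raise ValueError(f'provider returned a response without choices: {response!r}')

    choice = response.choices[0]
    # ...continue normal parsing with `choice` and `timestamp`
    return choice, timestamp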

Why it matters

OpenRouter is a growing ecosystem for multi-model access, and many of the available models (Gemini, Cohere, etc.) do not fully follow the OpenAI ChatCompletion schema. Supporting them would improve compatibility.

Let me know if you'd like me to submit a PR or share sample responses.

Thanks for the awesome tool!

Example Code

Python, Pydantic AI & LLM client version

Python 3.12
pydantic>=2.11.4
pydantic-ai-slim[openai]>=0.2.4
pydantic-settings>=2.9.1

@wdhorton

I was able to reproduce this. It only happens sometimes; since OpenRouter has two providers for this model, I'm guessing it happens with one of them but not the other.

wdhorton added a commit to wdhorton/pydantic-ai that referenced this issue May 19, 2025
The OpenAI spec defines this as required, but other OpenAI-compatible providers (like OpenRouter) may not populate it
on all responses. This adds code to handle the case where the field is None.

Fixes pydantic#1746

gkeb commented May 19, 2025

Probably, yes. If I set up OpenAI or DeepSeek, it works; the problem practically only occurs with Google models. At least, I've only noticed it with them.

Kludex self-assigned this May 20, 2025

Kludex commented May 20, 2025

I can't reproduce it. Also, trying many times is hard because there are a lot of 429 (rate limit) responses.


Kludex commented May 20, 2025

I've created the OpenRouterProvider to make it easier to work with OpenRouter. You can also now use Agent('openrouter:<model_name>'), assuming you have OPENROUTER_API_KEY set.
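
Usage would look roughly like this (the OpenRouterProvider import path and constructor arguments here are assumptions based on the other providers; check the docs for the exact API):

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openrouter import OpenRouterProvider  # assumed import path

# Shortest form: the 'openrouter:' prefix resolves the provider and
# reads OPENROUTER_API_KEY from the environment.
agent = Agent('openrouter:google/gemini-2.0-flash-exp:free')

# Explicit form, passing the key directly.
model = OpenAIModel(
    'google/gemini-2.0-flash-exp:free',
    provider=OpenRouterProvider(api_key='sk-or-v1-...'),
)
agent = Agent(model=model, system_prompt='Be helpful.')
result = agent.run_sync('Tell me a joke.')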


Kludex commented May 20, 2025

See this: https://github.com/pydantic/pydantic-ai/pull/1778/files#r2097604759.

It didn't fail... How can I reproduce it?
