fix: translate max_tokens to max_completion_tokens for GPT-5 and o1+ models (closes #560)#603

Open
anuragg-saxenaa wants to merge 2 commits into decolua:master from anuragg-saxenaa:fix/issue-560-openai-max-tokens
Conversation

@anuragg-saxenaa
Contributor

OpenAI's GPT-5 and o1/o3/o4 models return a 400 error when a client sends `max_tokens` instead of `max_completion_tokens`. This fix adds a `transformRequest()` method to `DefaultExecutor` that detects these model names and automatically translates `max_tokens` to `max_completion_tokens`. Closes #560
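The translation described above can be sketched roughly as follows. This is a hedged illustration only, not the PR's actual code: the repository's language and the real `transformRequest()` signature are unknown here, and the exact set of model-name prefixes checked is an assumption.

```python
def transform_request(payload: dict) -> dict:
    """Hypothetical sketch: rename max_tokens to max_completion_tokens
    for model families that reject the legacy parameter.
    The prefix list below is an assumption, not the PR's actual check."""
    model = payload.get("model", "")
    if model.startswith(("gpt-5", "o1", "o3", "o4")) and "max_tokens" in payload:
        payload = dict(payload)  # copy so the caller's request is untouched
        payload["max_completion_tokens"] = payload.pop("max_tokens")
    return payload
```

For example, `{"model": "o1-preview", "max_tokens": 256}` would be rewritten to use `max_completion_tokens`, while a request for an older model such as `gpt-4o` would pass through unchanged.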



Development

Successfully merging this pull request may close these issues.

OpenAI [400]: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.