LLM request failed with GPT 5 models #440

@alexpouliquen

Description
Hello,

I am getting this error when requesting gpt-5-mini, gpt-5-nano, and gpt-5.

"error": {
    "message": "Unsupported value: 'temperature' does not support 0.5 with this model. Only the default (1) value is supported.",
    "type": "invalid_request_error",
    "param": "temperature",
    "code": "unsupported_value"
}

Expected Behavior

I expect the LLM request not to fail. I guess this could be fixed by setting temperature: 1 for these models, or simply by not passing the parameter at all.
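A minimal sketch of the suggested workaround, assuming a hypothetical helper (`build_chat_params` is not OpenFang code) that drops `temperature` for the gpt-5 family, which per the error above only accepts the default value:

```python
def build_chat_params(model: str, temperature: float = 0.5) -> dict:
    """Build chat-completion request params.

    For gpt-5 family models, omit `temperature` entirely so the API
    falls back to its only supported value (the default, 1).
    This is a hypothetical sketch, not the actual OpenFang implementation.
    """
    params = {"model": model, "messages": []}
    if not model.startswith("gpt-5"):
        # Older models accept a custom temperature, so keep it.
        params["temperature"] = temperature
    return params
```

With this guard, a request to gpt-5-mini would carry no `temperature` key, while gpt-4-era models keep the configured value.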

Steps to Reproduce

  1. Run openfang start
  2. Go to web dashboard > chat
  3. Choose one of the models mentioned above
  4. Send a message

OpenFang Version

0.3.29

Operating System

macOS (Apple Silicon)

Logs / Screenshots

No response

Labels

    bug: Something isn't working