2 changes: 1 addition & 1 deletion docs/guide/agent-model-matching.md
@@ -176,7 +176,7 @@ See the [Orchestration System Guide](./orchestration.md) for how agents dispatch
"explore": { "model": "github-copilot/grok-code-fast-1" },

// Architecture consultation: GPT or Claude Opus
"oracle": { "model": "openai/gpt-5.2", "variant": "high" },
"oracle": { "model": "openai/gpt-5.4", "variant": "high" },

// Prometheus inherits sisyphus model; just add prompt guidance
"prometheus": { "prompt_append": "Leverage deep & quick agents heavily, always in parallel." }
4 changes: 2 additions & 2 deletions docs/guide/installation.md
@@ -197,7 +197,7 @@ When GitHub Copilot is the best available provider, oh-my-opencode uses these mo
| Agent | Model |

@cubic-dev-ai cubic-dev-ai bot Mar 6, 2026


P2: Incomplete documentation update: other sections of this file still list GPT-5.2 for the Oracle agent.


| ------------- | --------------------------------------------------------- |
| **Sisyphus** | `github-copilot/claude-opus-4-6` |
-| **Oracle** | `github-copilot/gpt-5.2` |
+| **Oracle** | `github-copilot/gpt-5.4` |
| **Explore** | `opencode/gpt-5-nano` |
| **Librarian** | `zai-coding-plan/glm-4.7` (if Z.ai available) or fallback |
@@ -225,7 +225,7 @@ When OpenCode Zen is the best available provider (no native or Copilot), these m
| Agent | Model |
| ------------- | -------------------------- |
| **Sisyphus** | `opencode/claude-opus-4-6` |
-| **Oracle** | `opencode/gpt-5.2` |
+| **Oracle** | `opencode/gpt-5.4` |
| **Explore** | `opencode/gpt-5-nano` |
| **Librarian** | `opencode/glm-4.7-free` |
2 changes: 1 addition & 1 deletion docs/guide/overview.md
@@ -181,7 +181,7 @@ You can override specific agents or categories in your config:
"explore": { "model": "github-copilot/grok-code-fast-1" },

// Architecture consultation: GPT or Claude Opus
"oracle": { "model": "openai/gpt-5.2", "variant": "high" }
"oracle": { "model": "openai/gpt-5.4", "variant": "high" }

@cubic-dev-ai cubic-dev-ai bot Mar 6, 2026


P1: Custom agent: Opencode Compatibility

Upgrading to gpt-5.4 breaks compatibility with OpenCode's OpenAI OAuth authentication. The opencode SDK explicitly filters out gpt-5.4 as it is not in the allowedModels whitelist for OAuth users.

Suggested change:
-"oracle": { "model": "openai/gpt-5.4", "variant": "high" }
+"oracle": { "model": "openai/gpt-5.2", "variant": "high" }
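The P1 comment above describes an allowlist that silently drops unsupported models for OAuth users. A minimal sketch of that failure mode, with all names (`allowedModels`, `filterForOAuth`) and the whitelist contents assumed for illustration, not taken from the actual opencode SDK:

```typescript
// Assumed whitelist contents for illustration; the real SDK's list may differ.
const allowedModels = new Set(["gpt-5.2", "gpt-5.3-codex"]);

function filterForOAuth(requested: string[]): string[] {
  // Models outside the whitelist are dropped silently; no error is surfaced,
  // which is why a docs/config bump to gpt-5.4 is easy to miss in testing.
  return requested.filter((model) => allowedModels.has(model));
}

console.log(filterForOAuth(["gpt-5.4", "gpt-5.2"])); // → ["gpt-5.2"]
```

Because the filter returns an empty result rather than throwing, an Oracle agent pinned to `gpt-5.4` would fall through to the next provider in its chain rather than fail loudly.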

},

"categories": {
4 changes: 2 additions & 2 deletions docs/reference/configuration.md
@@ -81,7 +81,7 @@ Here's a practical starting configuration:
"explore": { "model": "github-copilot/grok-code-fast-1" },

// Architecture consultation: GPT or Claude Opus
"oracle": { "model": "openai/gpt-5.2", "variant": "high" },
"oracle": { "model": "openai/gpt-5.4", "variant": "high" },

// Prometheus inherits sisyphus model; just add prompt guidance
"prometheus": { "prompt_append": "Leverage deep & quick agents heavily, always in parallel." }
@@ -253,7 +253,7 @@ Disable categories: `{ "disabled_categories": ["ultrabrain"] }`
|-------|---------------|-------------------|
| **Sisyphus** | `claude-opus-4-6` | anthropic → github-copilot → opencode → kimi-for-coding → zai-coding-plan |
| **Hephaestus** | `gpt-5.3-codex` | openai → github-copilot → opencode |
-| **oracle** | `gpt-5.2` | openai → google → anthropic (via github-copilot/opencode) |
+| **oracle** | `gpt-5.4` | openai → google → anthropic (via github-copilot/opencode) |
| **librarian** | `glm-4.7` | zai-coding-plan → opencode → anthropic |
| **explore** | `grok-code-fast-1` | github-copilot → anthropic/opencode → opencode |
| **multimodal-looker** | `gemini-3-flash` | google → openai → zai-coding-plan → kimi-for-coding → opencode → anthropic |
2 changes: 1 addition & 1 deletion src/agents/AGENTS.md
@@ -12,7 +12,7 @@ Agent factories following `createXXXAgent(model) → AgentConfig` pattern. Each
|-------|-------|------|------|----------------|---------|
| **Sisyphus** | claude-opus-4-6 | 0.1 | all | kimi-k2.5 → glm-5 → big-pickle | Main orchestrator, plans + delegates |
| **Hephaestus** | gpt-5.3-codex | 0.1 | all | gpt-5.2 (copilot) | Autonomous deep worker |
-| **Oracle** | gpt-5.2 | 0.1 | subagent | gemini-3.1-pro → claude-opus-4-6 | Read-only consultation |
+| **Oracle** | gpt-5.4 | 0.1 | subagent | gemini-3.1-pro → claude-opus-4-6 | Read-only consultation |
| **Librarian** | kimi-k2.5 | 0.1 | subagent | gemini-3-flash → gpt-5.2 → glm-4.6v | External docs/code search |
| **Explore** | grok-code-fast-1 | 0.1 | subagent | minimax-m2.5 → claude-haiku-4-5 → gpt-5-nano | Contextual grep |
| **Multimodal-Looker** | gemini-3-flash | 0.1 | subagent | minimax-m2.5 → big-pickle | PDF/image analysis |
14 changes: 6 additions & 8 deletions src/agents/utils.test.ts
@@ -169,15 +169,14 @@ describe("createBuiltinAgents with model overrides", () => {
test("Oracle uses connected provider fallback when availableModels is empty and cache exists", async () => {
// #given - connected providers cache has "openai", which matches oracle's first fallback entry
const cacheSpy = spyOn(connectedProvidersCache, "readConnectedProvidersCache").mockReturnValue(["openai"])

const fetchSpy = spyOn(shared, "fetchAvailableModels").mockResolvedValue(new Set())
// #when
const agents = await createBuiltinAgents([], {}, undefined, TEST_DEFAULT_MODEL, undefined, undefined, [], undefined, undefined)

// #then - oracle resolves via connected cache fallback to openai/gpt-5.2 (not system default)
-expect(agents.oracle.model).toBe("openai/gpt-5.2")
+expect(agents.oracle.model).toBe("openai/gpt-5.4")
expect(agents.oracle.reasoningEffort).toBe("medium")
expect(agents.oracle.thinking).toBeUndefined()
cacheSpy.mockRestore?.()
fetchSpy.mockRestore?.()
})

test("Oracle created without model field when no cache exists (first run scenario)", async () => {
@@ -473,14 +472,13 @@ describe("createBuiltinAgents without systemDefaultModel", () => {
test("agents created via connected cache fallback even without systemDefaultModel", async () => {
// #given - connected cache has "openai", which matches oracle's fallback chain
const cacheSpy = spyOn(connectedProvidersCache, "readConnectedProvidersCache").mockReturnValue(["openai"])

const fetchSpy = spyOn(shared, "fetchAvailableModels").mockResolvedValue(new Set())
// #when
const agents = await createBuiltinAgents([], {}, undefined, undefined)

// #then - connected cache enables model resolution despite no systemDefaultModel
expect(agents.oracle).toBeDefined()
-expect(agents.oracle.model).toBe("openai/gpt-5.2")
+expect(agents.oracle.model).toBe("openai/gpt-5.4")
cacheSpy.mockRestore?.()
fetchSpy.mockRestore?.()
})

test("agents NOT created when no cache and no systemDefaultModel (first run without defaults)", async () => {
26 changes: 13 additions & 13 deletions src/cli/__snapshots__/model-fallback.test.ts.snap
@@ -216,7 +216,7 @@ exports[`generateModelConfig single native provider uses OpenAI models when only
"variant": "medium",
},
"oracle": {
"model": "openai/gpt-5.2",
"model": "openai/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -284,7 +284,7 @@ exports[`generateModelConfig single native provider uses OpenAI models with isMa
"variant": "medium",
},
"oracle": {
"model": "openai/gpt-5.2",
"model": "openai/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -474,7 +474,7 @@ exports[`generateModelConfig all native providers uses preferred models from fal
"variant": "medium",
},
"oracle": {
"model": "openai/gpt-5.2",
"model": "openai/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -549,7 +549,7 @@ exports[`generateModelConfig all native providers uses preferred models with isM
"variant": "medium",
},
"oracle": {
"model": "openai/gpt-5.2",
"model": "openai/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -625,7 +625,7 @@ exports[`generateModelConfig fallback providers uses OpenCode Zen models when on
"variant": "medium",
},
"oracle": {
"model": "opencode/gpt-5.2",
"model": "opencode/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -700,7 +700,7 @@ exports[`generateModelConfig fallback providers uses OpenCode Zen models with is
"variant": "medium",
},
"oracle": {
"model": "opencode/gpt-5.2",
"model": "opencode/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -771,7 +771,7 @@ exports[`generateModelConfig fallback providers uses GitHub Copilot models when
"model": "github-copilot/gemini-3-flash-preview",
},
"oracle": {
"model": "github-copilot/gpt-5.2",
"model": "github-copilot/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -837,7 +837,7 @@ exports[`generateModelConfig fallback providers uses GitHub Copilot models with
"model": "github-copilot/gemini-3-flash-preview",
},
"oracle": {
"model": "github-copilot/gpt-5.2",
"model": "github-copilot/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -1019,7 +1019,7 @@ exports[`generateModelConfig mixed provider scenarios uses Claude + OpenCode Zen
"variant": "medium",
},
"oracle": {
"model": "opencode/gpt-5.2",
"model": "opencode/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -1094,7 +1094,7 @@ exports[`generateModelConfig mixed provider scenarios uses OpenAI + Copilot comb
"variant": "medium",
},
"oracle": {
"model": "openai/gpt-5.2",
"model": "openai/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -1296,7 +1296,7 @@ exports[`generateModelConfig mixed provider scenarios uses all fallback provider
"variant": "medium",
},
"oracle": {
"model": "github-copilot/gpt-5.2",
"model": "github-copilot/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -1371,7 +1371,7 @@ exports[`generateModelConfig mixed provider scenarios uses all providers togethe
"variant": "medium",
},
"oracle": {
"model": "openai/gpt-5.2",
"model": "openai/gpt-5.4",
"variant": "high",
},
"prometheus": {
@@ -1446,7 +1446,7 @@ exports[`generateModelConfig mixed provider scenarios uses all providers with is
"variant": "medium",
},
"oracle": {
"model": "openai/gpt-5.2",
"model": "openai/gpt-5.4",
"variant": "high",
},
"prometheus": {
2 changes: 1 addition & 1 deletion src/cli/config-manager.test.ts
@@ -322,7 +322,7 @@ describe("generateOmoConfig - model fallback system", () => {
// #then Sisyphus is omitted (requires all fallback providers)
expect((result.agents as Record<string, { model: string }>).sisyphus).toBeUndefined()
// #then Oracle should use native OpenAI (first fallback entry)
-expect((result.agents as Record<string, { model: string }>).oracle.model).toBe("openai/gpt-5.2")
+expect((result.agents as Record<string, { model: string }>).oracle.model).toBe("openai/gpt-5.4")

@cubic-dev-ai cubic-dev-ai bot Mar 6, 2026


P1: Custom agent: Opencode Compatibility

Upgrading to gpt-5.4 is premature because explicit upstream support in opencode is still pending (PR #16233).

Suggested change:
-expect((result.agents as Record<string, { model: string }>).oracle.model).toBe("openai/gpt-5.4")
+expect((result.agents as Record<string, { model: string }>).oracle.model).toBe("openai/gpt-5.2")

// #then multimodal-looker should use native OpenAI (first fallback entry is gpt-5.3-codex)
expect((result.agents as Record<string, { model: string }>)["multimodal-looker"].model).toBe("openai/gpt-5.3-codex")
})
2 changes: 1 addition & 1 deletion src/cli/model-fallback-requirements.ts
@@ -22,7 +22,7 @@ export const CLI_AGENT_MODEL_REQUIREMENTS: Record<string, ModelRequirement> = {
},
oracle: {
fallbackChain: [
{ providers: ["openai", "github-copilot", "opencode"], model: "gpt-5.2", variant: "high" },
{ providers: ["openai", "github-copilot", "opencode"], model: "gpt-5.4", variant: "high" },

@cubic-dev-ai cubic-dev-ai bot Mar 6, 2026


P1: Custom agent: Opencode Compatibility

The gpt-5.4 model is not yet supported by OpenCode's Codex OAuth plugin and will be silently filtered out for OpenAI users.


{ providers: ["google", "github-copilot", "opencode"], model: "gemini-3.1-pro", variant: "high" },
{ providers: ["anthropic", "github-copilot", "opencode"], model: "claude-opus-4-6", variant: "max" },
],
2 changes: 1 addition & 1 deletion src/shared/agent-variant.test.ts
@@ -191,7 +191,7 @@ describe("resolveVariantForModel", () => {
test("returns correct variant for oracle agent with openai", () => {
// given
const config = {} as OhMyOpenCodeConfig
const model = { providerID: "openai", modelID: "gpt-5.2" }
const model = { providerID: "openai", modelID: "gpt-5.4" }

// when
const variant = resolveVariantForModel(config, "oracle", model)
6 changes: 3 additions & 3 deletions src/shared/model-requirements.test.ts
@@ -7,19 +7,19 @@ import {
} from "./model-requirements"

describe("AGENT_MODEL_REQUIREMENTS", () => {
test("oracle has valid fallbackChain with gpt-5.2 as primary", () => {
test("oracle has valid fallbackChain with gpt-5.4 as primary", () => {
// given - oracle agent requirement
const oracle = AGENT_MODEL_REQUIREMENTS["oracle"]

// when - accessing oracle requirement
-// then - fallbackChain exists with gpt-5.2 as first entry
+// then - fallbackChain exists with gpt-5.4 as first entry
expect(oracle).toBeDefined()
expect(oracle.fallbackChain).toBeArray()
expect(oracle.fallbackChain.length).toBeGreaterThan(0)

const primary = oracle.fallbackChain[0]
expect(primary.providers).toContain("openai")
expect(primary.model).toBe("gpt-5.2")
expect(primary.model).toBe("gpt-5.4")
expect(primary.variant).toBe("high")
})

2 changes: 1 addition & 1 deletion src/shared/model-requirements.ts
@@ -30,7 +30,7 @@ export const AGENT_MODEL_REQUIREMENTS: Record<string, ModelRequirement> = {
},
oracle: {
fallbackChain: [
{ providers: ["openai", "github-copilot", "opencode"], model: "gpt-5.2", variant: "high" },
{ providers: ["openai", "github-copilot", "opencode"], model: "gpt-5.4", variant: "high" },

@cubic-dev-ai cubic-dev-ai bot Mar 6, 2026


P1: Custom agent: Opencode Compatibility

The opencode provider does not yet support the gpt-5.4 model (upstream PR #16233 is still open). Keep opencode on gpt-5.2 until the support is merged.

Suggested change:
-{ providers: ["openai", "github-copilot", "opencode"], model: "gpt-5.4", variant: "high" },
+{ providers: ["openai", "github-copilot"], model: "gpt-5.4", variant: "high" },
+{ providers: ["opencode"], model: "gpt-5.2", variant: "high" },

{ providers: ["google", "github-copilot", "opencode"], model: "gemini-3.1-pro", variant: "high" },
{ providers: ["anthropic", "github-copilot", "opencode"], model: "claude-opus-4-6", variant: "max" },
],
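The `fallbackChain` entries above resolve to the first entry that has any connected provider. A minimal sketch of that resolution under stated assumptions (the `ChainEntry` shape mirrors the diff; `resolveModel` is an illustrative helper, not the project's actual implementation):

```typescript
type ChainEntry = { providers: string[]; model: string; variant: string };

// Illustrative resolver: walk the chain in order and return the first
// provider/model pair whose provider is connected.
function resolveModel(chain: ChainEntry[], connected: Set<string>): string | undefined {
  for (const entry of chain) {
    const provider = entry.providers.find((p) => connected.has(p));
    if (provider) return `${provider}/${entry.model}`;
  }
  return undefined; // no connected provider: agent is created without a model
}

const oracleChain: ChainEntry[] = [
  { providers: ["openai", "github-copilot", "opencode"], model: "gpt-5.4", variant: "high" },
  { providers: ["google", "github-copilot", "opencode"], model: "gemini-3.1-pro", variant: "high" },
  { providers: ["anthropic", "github-copilot", "opencode"], model: "claude-opus-4-6", variant: "max" },
];

console.log(resolveModel(oracleChain, new Set(["openai"]))); // → "openai/gpt-5.4"
console.log(resolveModel(oracleChain, new Set(["google"]))); // → "google/gemini-3.1-pro"
```

This ordering is why the tests in this PR expect `openai/gpt-5.4` whenever `openai` appears in the connected-providers cache.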
13 changes: 7 additions & 6 deletions src/tools/delegate-task/tools.test.ts
@@ -20,6 +20,7 @@ const TEST_AVAILABLE_MODELS = new Set([
"google/gemini-3.1-pro",
"google/gemini-3-flash",
"openai/gpt-5.2",
"openai/gpt-5.4",
"openai/gpt-5.3-codex",
])

@@ -53,7 +54,7 @@ describe("sisyphus-task", () => {
models: {
anthropic: ["claude-opus-4-6", "claude-sonnet-4-6", "claude-haiku-4-5"],
google: ["gemini-3.1-pro", "gemini-3-flash"],
openai: ["gpt-5.2", "gpt-5.3-codex"],
openai: ["gpt-5.2", "gpt-5.4", "gpt-5.3-codex"],
},
connected: ["anthropic", "google", "openai"],
updatedAt: "2026-01-01T00:00:00.000Z",
@@ -3375,7 +3376,7 @@ describe("sisyphus-task", () => {
app: {
agents: async () => ({
data: [
{ name: "oracle", mode: "subagent", model: { providerID: "openai", modelID: "gpt-5.2" } },
{ name: "oracle", mode: "subagent", model: { providerID: "openai", modelID: "gpt-5.4" } },
],
}),
},
@@ -3442,7 +3443,7 @@ describe("sisyphus-task", () => {
app: {
agents: async () => ({
data: [
{ name: "oracle", mode: "subagent", model: { providerID: "openai", modelID: "gpt-5.2" } },
{ name: "oracle", mode: "subagent", model: { providerID: "openai", modelID: "gpt-5.4" } },
],
}),
},
@@ -3551,11 +3552,11 @@ describe("sisyphus-task", () => {
)

// then - should resolve via AGENT_MODEL_REQUIREMENTS fallback chain for oracle
-// oracle fallback chain: gpt-5.2 (openai) > gemini-3.1-pro (google) > claude-opus-4-6 (anthropic)
-// Since openai is in connectedProviders, should resolve to openai/gpt-5.2
+// oracle fallback chain: gpt-5.4 (openai) > gemini-3.1-pro (google) > claude-opus-4-6 (anthropic)
+// Since openai is in connectedProviders, should resolve to openai/gpt-5.4
expect(promptBody.model).toBeDefined()
expect(promptBody.model.providerID).toBe("openai")
expect(promptBody.model.modelID).toContain("gpt-5.2")
expect(promptBody.model.modelID).toContain("gpt-5.4")
}, { timeout: 20000 })
})
