Description
I was very excited when I saw this project, because I am building agents on top of ACP with custom models (due to company restrictions).
Since Codex CLI supports custom models, I got my setup to work with it, but I cannot integrate it into the agent: this ACP adapter uses the built-in model list and fails if the selected model is not in it.
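For context, this is roughly the kind of custom-provider setup that works with the standalone Codex CLI. A hedged sketch of a `~/.codex/config.toml`; the provider id, URLs, and model name are placeholders from my environment, not part of this adapter:

```toml
# Assumed example config; field names follow the upstream Codex CLI
# config.toml conventions as I understand them.
model = "my-internal-model"
model_provider = "company-proxy"

[model_providers.company-proxy]
name = "Company proxy"
base_url = "http://localhost:8080/v1"
wire_api = "chat"
```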
I have tried just specifying `gpt-5` and having a local proxy rewrite the model to the correct one, but then the adapter never sends any message or message-update events, even though I can clearly observe both the request and the answer going through my proxy.
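To be concrete about what the proxy does: it only rewrites the `model` field of the request body before forwarding it upstream. A minimal sketch of that rewrite step, assuming an OpenAI-style JSON payload (`MODEL_MAP` and the helper name are mine, not from this repo):

```python
import json

# Hypothetical mapping from the model id Codex sends to the real
# custom model name behind the proxy.
MODEL_MAP = {"gpt-5": "my-internal-model"}


def rewrite_model(body: bytes) -> bytes:
    """Replace the model id in a chat-completions request body,
    leaving every other field (including "stream") untouched."""
    payload = json.loads(body)
    requested = payload.get("model")
    payload["model"] = MODEL_MAP.get(requested, requested)
    return json.dumps(payload).encode()
```

Everything else (headers, streaming flags, the SSE response) passes through unchanged, which is why I would expect the adapter to behave the same as with an official model.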
And just to confirm: with the upstream standalone Codex CLI my setup works (both with and without the proxy, even if I set the model to `gpt-5`), and with an official OpenAI model this ACP adapter also works for me. The agent's STDERR output is not helpful either.
My suspicion is that, apart from the model-id lookup issue at the start, the problem is that the messages are not streamed: the response from the API is streamed, but Codex only displays it once it is fully generated. For official models it streams; for custom providers it does not.
Unrelated, but it would be awesome if you could also distribute this ACP binary via npm (with optional sub-packages for each architecture), since OpenAI additionally distributes their CLI via npm.