Setting a custom `OPENAI_BASE_URL` together with `OPENAI_API_KEY` (e.g., for OpenRouter or Ollama) still fails with: `No OpenAI API key found.`
### Repro

```shell
export OPENAI_API_KEY=dummy
export OPENAI_BASE_URL=https://openrouter.ai/api/v1
```

Run the CLI → the error persists.
### Expected

- Accept the custom base URL
- Skip strict OpenAI key validation
- Route requests to the provided endpoint
### Actual

- OpenAI key validation is hardcoded
- The CLI fails before `OPENAI_BASE_URL` is ever consulted
### Suggestion

- Respect `OPENAI_BASE_URL` first
- Allow non-OpenAI keys when a custom base URL is set
- Make the LLM provider pluggable
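To make the intent concrete, here is a minimal sketch of the precedence logic being suggested. This is illustrative only: `resolve_llm_config` is a hypothetical helper, not the project's actual API, and it assumes the only relevant inputs are the two environment variables above.

```python
import os


def resolve_llm_config():
    """Hypothetical sketch: check OPENAI_BASE_URL before enforcing
    OpenAI-specific key validation. Names here are illustrative and
    do not correspond to the project's real functions."""
    base_url = os.environ.get("OPENAI_BASE_URL")
    api_key = os.environ.get("OPENAI_API_KEY")

    if base_url:
        # Custom endpoint (OpenRouter, Ollama, ...): accept any
        # non-empty key, since its format is provider-specific.
        if not api_key:
            raise SystemExit(
                "OPENAI_API_KEY must be set (any value) when using a custom base URL"
            )
        return {"base_url": base_url, "api_key": api_key}

    # Default path targets api.openai.com, so a key is mandatory.
    if not api_key:
        raise SystemExit("No OpenAI API key found.")
    return {"base_url": "https://api.openai.com/v1", "api_key": api_key}
```

With this ordering, the repro above would route to OpenRouter instead of failing on key validation.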
Thanks!