Gemini authentication with browser-based OAuth login and model selection dropdowns#449

Open
reikernodd wants to merge 8 commits into
claude-code-best:mainfrom
reikernodd:main

Conversation

@reikernodd commented May 10, 2026



Summary by CodeRabbit

  • New Features

    • Local LLM support with interactive local setup, model listing/pull, and new "local" provider option
    • Gemini browser OAuth, model fetching, and model-selection UI
  • Documentation

    • Expanded onboarding, Quick Start, local LLM setup, environment variables, diagnostics, and learning/community guides
  • Bug Fixes

    • More consistent language handling for assistant responses
    • Reduced hook-related UI clutter
  • Tests

    • Added tests for console OAuth and local login flow

Contributor

coderabbitai Bot commented May 10, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: ef2760ac-d023-40fd-8433-ba27e53ebc76

📥 Commits

Reviewing files that changed from the base of the PR and between 75c230d and a8d1e83.

📒 Files selected for processing (1)
  • src/services/api/claude.ts
🚧 Files skipped from review as they are similar to previous changes (1)
  • src/services/api/claude.ts

📝 Walkthrough

Walkthrough

Adds a 'local' API provider (Ollama) with local-LLM utilities, Google OAuth for Gemini, provider/OpenAI routing changes, an expanded ConsoleOAuthFlow UI (local + Gemini), diagnostic collection/display, README updates, and tests/bootstrap tweaks.

Changes

Local LLM Provider and Google OAuth Support

  • Type System and Settings Schema (src/utils/model/providers.ts, src/utils/settings/types.ts, src/utils/model/configs.ts, src/utils/doctorDiagnostic.ts): adds 'local' to APIProvider; getAPIProvider() returns 'local' when settings.modelType === 'local'; SettingsSchema adds googleOAuth; ModelConfig types adjust to optional local; DiagnosticInfo gains localLlmStatus and hardwareInfo.
  • Local LLM Utilities (src/utils/localLlm.ts): new utilities checkOllamaStatus(), listOllamaModels(), pullOllamaModel() (an async generator with streaming progress), and pingUrl() for URL validation.
  • Google OAuth + Gemini Client (src/services/api/gemini/google-oauth.ts, src/services/api/gemini/client.ts): adds Google OAuth login/refresh (loginToGoogle, getGoogleAccessToken) and listGeminiModels(), plus streaming auth headers using an API key or Google Bearer token.
  • Provider Routing & OpenAI Client (src/commands/provider.ts, src/services/api/openai/client.ts, src/services/api/claude.ts, src/services/api/openai/index.ts): the provider CLI accepts local; local switching validates LOCAL_BASE_URL and updates modelType; getOpenAIClient uses LOCAL_/OPENAI_ envs per provider; queryModel routes 'local' via the OpenAI-compatible path; queryModelOpenAI uses LOCAL_MODEL when the provider is local.
  • OAuth Component (src/components/ConsoleOAuthFlow.tsx): adds a "Local LLM" flow with runner selection, base URL/API key, installed-model listing, model pulling with progress, and saving LOCAL_* settings; expands the Gemini flow with model fetching, dropdowns, custom model fields, and robustness fixes.
  • Diagnostics & Doctor UI (src/utils/doctorDiagnostic.ts, src/screens/Doctor.tsx): getDoctorDiagnostic() collects the Ollama running state, available models, and host CPU/memory/arch; the Doctor screen conditionally renders System Information and Local LLM sections.
  • Supporting Updates (src/utils/model/modelStrings.ts, src/utils/status.tsx, src/utils/swarm/teammateModel.ts, src/components/messages/AttachmentMessage.tsx, src/constants/prompts.ts): model lookup uses a typed config with fallback to firstParty; status mapping adds "Local LLM"; teammate model fallback honors LOCAL_MODEL; hook-related messages suppress rendering on session-start events; the language system section defaults to the user's input language when unset.
  • Docs, Tests & Config (README.md, README_EN.md, .gitignore, src/components/__tests__/ConsoleOAuthFlow.test.tsx, src/bootstrap/state.ts, tests/integration/autonomy-lifecycle-user-flow.test.ts): READMEs document local LLM, Gemini OAuth, and doctor; .gitignore now ignores .agents/*, .claude/*, .omx/*; adds a unit test for ConsoleOAuthFlow; bootstrap prefers CLAUDE_CODE_CWD; the integration test sets CLAUDE_CODE_CWD for the spawned CLI.
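As a rough illustration of the Local LLM Utilities entry above, a streaming pull utility in this style could look as follows. This is a hedged sketch, not the PR's actual code: the line format assumed is Ollama's documented NDJSON stream from `POST /api/pull`, and the names `PullProgress` and `parsePullLine` are illustrative.

```typescript
// Illustrative sketch of a streaming model-pull utility (names hypothetical).
interface PullProgress {
  status: string;
  percentage?: number;
}

// Parse one NDJSON line from Ollama's /api/pull stream into a progress record.
function parsePullLine(line: string): PullProgress | null {
  try {
    const obj = JSON.parse(line) as { status?: string; total?: number; completed?: number };
    if (typeof obj.status !== "string") return null;
    const percentage =
      obj.total && obj.completed !== undefined
        ? Math.round((obj.completed / obj.total) * 100)
        : undefined;
    return { status: obj.status, percentage };
  } catch {
    return null; // tolerate partial or garbled lines
  }
}

// Async generator yielding progress events (requires a running Ollama instance).
async function* pullOllamaModel(baseUrl: string, model: string): AsyncGenerator<PullProgress> {
  const res = await fetch(`${baseUrl}/api/pull`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: model }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any incomplete trailing line
    for (const line of lines) {
      const progress = parsePullLine(line);
      if (progress) yield progress;
    }
  }
}
```

The async-generator shape lets a UI `for await` over progress events, which matches the "model pulling with progress" behavior the ConsoleOAuthFlow entry describes.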

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs

Suggested reviewers

  • KonghaYao

Poem

🐰 I dug a tunnel near the code,
Ollama seeds in the local node,
Tokens flutter, models stream,
Doctor checks the CPU dream,
Local LLMs hop into the mode.

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning): docstring coverage is 9.52%, below the required 80.00% threshold. Resolution: write docstrings for the functions missing them.

✅ Passed checks (4 passed)

  • Description Check (✅ Passed): check skipped because CodeRabbit's high-level summary is enabled.
  • Title check (✅ Passed): the title accurately describes the main change, adding Gemini authentication with browser-based OAuth login and model selection dropdowns, a central focus across ConsoleOAuthFlow.tsx, google-oauth.ts, and gemini/client.ts.
  • Linked Issues check (✅ Passed): check skipped because no linked issues were found for this pull request.
  • Out of Scope Changes check (✅ Passed): check skipped because no linked issues were found for this pull request.


Author

@reikernodd left a comment

Added Google auth login method.

@reikernodd changed the title from "Update from source" to "Gemini authentication with browser-based OAuth login and model selection dropdowns" on May 10, 2026
Contributor

@coderabbitai Bot left a comment

Actionable comments posted: 12

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/commands/provider.ts (1)

179-188: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

description and argumentHint are missing the new local provider.

/provider local is a valid arg now (line 70) and the settings-backed branch handles it (line 148), but the help text and argument-hint completion still advertise only the old set. Users won't discover the new option from /provider's own help or autocompletion.

📝 Proposed fix
 const provider = {
   type: 'local',
   name: 'provider',
   description:
-    'Switch API provider (anthropic/openai/gemini/grok/bedrock/vertex/foundry)',
+    'Switch API provider (anthropic/openai/gemini/grok/local/bedrock/vertex/foundry)',
   aliases: ['api'],
-  argumentHint: '[anthropic|openai|gemini|grok|bedrock|vertex|foundry|unset]',
+  argumentHint: '[anthropic|openai|gemini|grok|local|bedrock|vertex|foundry|unset]',
   supportsNonInteractive: true,
   load: () => Promise.resolve({ call }),
 } satisfies Command

While here, also update the stale comment at lines 141–142 which still lists only anthropic, openai, gemini even though grok and local are now in the same branch.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/commands/provider.ts` around lines 179 - 188, Update the provider
metadata to include the new "local" option: modify the provider object’s
description and argumentHint fields (the const provider = { ... } satisfying
Command) to list "local" alongside
anthropic/openai/gemini/grok/bedrock/vertex/foundry, and update the stale inline
comment that enumerates providers (the comment near the settings-backed branch
handling in the same file) so it also includes "grok" and "local" so help text
and autocompletion reflect the actual supported providers.
🧹 Nitpick comments (2)
src/services/api/claude.ts (1)

1334-1334: 💤 Low value

Optimize repeated getAPIProvider() call.

The condition calls getAPIProvider() twice. Consider caching the result in a variable.

♻️ Proposed optimization
+  const apiProvider = getAPIProvider()
-  if (getAPIProvider() === 'openai' || getAPIProvider() === 'local') {
+  if (apiProvider === 'openai' || apiProvider === 'local') {
     const { queryModelOpenAI } = await import('./openai/index.js')
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/services/api/claude.ts` at line 1334, Replace the repeated calls to
getAPIProvider() in the conditional with a cached local variable: call
getAPIProvider() once (e.g., const provider = getAPIProvider()) and then use
provider === 'openai' || provider === 'local' in the if statement; update any
nearby logic that referenced getAPIProvider() in the same scope to use the new
provider variable for consistency.
src/utils/doctorDiagnostic.ts (1)

646-652: ⚡ Quick win

Hardware info: redundant cpus() calls and lossy memory formatting.

Two small refinements:

  1. cpus() is called twice (line 647 for .length, line 648 for [0]?.model). On high-core-count machines this allocates a sizable array twice — cache it once.
  2. Math.round(... / 1024^3) + ' GB' rounds to whole GiB, so a system with 0.4 GiB free reports "0 GB" and 7.6 GiB reports "8 GB". For local-LLM capacity checks (the whole point of this section), one decimal is much more useful.
♻️ Proposed refactor
+  const cpuList = cpus()
+  const toGiB = (bytes: number): string => `${(bytes / 1024 ** 3).toFixed(1)} GB`
   const diagnostic: DiagnosticInfo = {
     ...
     hardwareInfo: {
-      cpus: cpus().length,
-      cpuModel: cpus()[0]?.model || 'Unknown',
-      totalMem: Math.round(totalmem() / 1024 / 1024 / 1024) + ' GB',
-      freeMem: Math.round(freemem() / 1024 / 1024 / 1024) + ' GB',
+      cpus: cpuList.length,
+      cpuModel: cpuList[0]?.model || 'Unknown',
+      totalMem: toGiB(totalmem()),
+      freeMem: toGiB(freemem()),
       arch: arch(),
     },
   }
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/utils/doctorDiagnostic.ts` around lines 646 - 652, Cache the result of
os.cpus() into a local variable before constructing the hardwareInfo object to
avoid allocating the CPU array twice (used for cpus().length and
cpus()[0]?.model) and use that cached array when setting cpus and cpuModel; also
change totalMem and freeMem formatting to compute GiB with one decimal place
(e.g., totalmem() / (1024**3) and freemem() / (1024**3)) and format using
toFixed(1) + ' GB' so values like 0.4 GiB display as "0.4 GB" and 7.6 GiB as
"7.6 GB".
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

Inline comments:
In `@README.md`:
- Around line 154-156: Update the README entries that incorrectly show the OAuth
credentials path with a leading slash; change any occurrences of
"/.files/OAuth.json" to ".files/OAuth.json" in README.md (around the current
lines referencing the OAuth.json path) and the corresponding entry in
README_EN.md so they match how the code in
src/services/api/gemini/google-oauth.ts (which uses path.join(process.cwd(),
'.files', 'OAuth.json')) expects the file to be located in the project root.

In `@src/components/ConsoleOAuthFlow.tsx`:
- Around line 89-94: The local_llm_pulling state currently lacks baseUrl and
code paths hardcode 'http://localhost:11434', so update the state shape for
'local_llm_pulling' to include baseUrl:string (alongside
modelName/status/percentage), replace hardcoded URLs in the pulling effect
(references: the pulling logic that builds the fetch/POST to Ollama) to use that
baseUrl, and modify all transitions into local_llm_pulling (including
handleLocalEnter and the Select transition that creates the state) to pass
oauthStatus.baseUrl into the new state so the post-pull state rebuild does not
overwrite the user-provided Base URL. Ensure every spot that previously wrote
'http://localhost:11434' now reads from the state's baseUrl.
- Around line 713-815: The new local_llm_setup branch uses multiple `as any`
casts; replace them by properly typing the local-state shape and narrowing
oauthStatus to that discriminated union (e.g., define LocalLlmSetupState =
Extract<OAuthStatus, { state: 'local_llm_setup' }>), then (1) remove the
`(oauthStatus as any).apiKey` usage and read oauthStatus.apiKey directly, (2)
make buildLocalState return LocalLlmSetupState and construct its next state as
that type instead of `as any`, (3) change doLocalSave signature to accept
LocalLlmSetupState rather than `any`, and (4) supply correctly typed payloads to
updateSettingsForSource so the `} as any` casts are unnecessary; this eliminates
the need to cast at call sites like buildLocalState(...) as any and preserves
type safety for LocalField, LOCAL_FIELDS, displayValues, buildLocalState,
doLocalSave, and setOAuthStatus.

In `@src/services/api/gemini/client.ts`:
- Around line 15-17: getGeminiBaseUrl currently always returns
DEFAULT_GEMINI_BASE_URL and no longer respects process.env.GEMINI_BASE_URL;
restore the env-override by reading process.env.GEMINI_BASE_URL (falling back to
DEFAULT_GEMINI_BASE_URL) and normalize it with the same replace(/\/+$/, '')
trimming; if the app truly requires the canonical endpoint in OAuth-only mode,
gate the env override behind the auth-mode check you use (e.g., an isOAuthOnly
or authMode flag) so getGeminiBaseUrl uses the env value except when OAuth-only
is enforced.
- Around line 52-58: Replace the unsafe map using (m: any) and the unguarded
m.name access with a type-narrowed iteration: remove the any assertion and
instead iterate over data.models (e.g., for...of or filter+map), guard each item
with checks like typeof m === 'object' && m !== null && typeof (m as any).name
=== 'string' (or use a custom isModel(obj) predicate), then push/return
m.name.replace(/^models\//, '') only for valid entries; ensure the function
returns an empty array for invalid items and does not assume m.name exists. Use
the existing data.models reference and replace the anonymous map callback with
the guarded logic.

In `@src/services/api/gemini/google-oauth.ts`:
- Around line 130-136: The code uses `as any` when calling
updateSettingsForSource for 'userSettings' with the googleOAuth payload; remove
the `as any` and pass a correctly typed object like the other
updateSettingsForSource calls: build a partial/typed settings object (matching
the expected UserSettings/Settings type) containing googleOAuth { access_token,
refresh_token: credentials.refresh_token || googleOAuth.refresh_token,
expiry_date } and pass that without assertion; apply the same change to the
similar call affecting lines around 141–143 so both calls use the proper typed
payload instead of `as any`.
- Around line 62-68: The object passed to updateSettingsForSource currently uses
a prohibited `as any` on the googleOAuth payload; replace that with a proper
typed value by either augmenting the settings type to include googleOAuth
(extend the userSettings/settings interface) or perform a safe double assertion:
cast the literal to unknown and then to the correct settings type before calling
updateSettingsForSource; reference the updateSettingsForSource call and the
googleOAuth object (access_token/refresh_token/expiry_date from tokens) and
ensure the resulting value matches the expected settings type rather than using
`as any`.
- Around line 108-114: The code uses a forbidden `as any` in
getGoogleAccessToken when reading googleOAuth from getSettings; replace it by
properly typing the settings result (extend the SettingsJson type to include a
googleOAuth property or create an interface GoogleOAuth { refresh_token?:
string; ... } and cast via `as unknown as GoogleOAuth`) and then use a
type-narrowing check (e.g., `const googleOAuth = settings.googleOAuth` with an
if guard checking `googleOAuth && googleOAuth.refresh_token`) so no `as any` is
used; update references in getGoogleAccessToken and any call sites that assume
googleOAuth to match the new typed shape.

In `@src/services/api/openai/index.ts`:
- Around line 221-225: The README is missing documentation for the LOCAL_MODEL
env var used by getAPIProvider logic (code uses process.env.LOCAL_MODEL when
provider === 'local' to select the model passed as options.model or fallback);
update the environment variables reference or the local LLM setup section to
document LOCAL_MODEL, explain its purpose (overrides options.model when using
the local provider), show expected values (model name) and any defaults, and
mention that the Zod settings schema requires configuring LOCAL_MODEL when using
the local provider so users know to set it.

In `@src/utils/localLlm.ts`:
- Around line 38-46: The POST to `${baseUrl}/api/pull` in localLlm.ts must
include the header "Content-Type: application/json" so the Ollama endpoint
correctly interprets the JSON body; update the fetch call that builds the pull
request (the one that sends JSON.stringify({ name: model })) to add headers: {
'Content-Type': 'application/json' }. Also add timeout handling to the fetch
calls used in checkOllamaStatus and listOllamaModels by creating or passing an
AbortSignal with AbortSignal.timeout(milliseconds) (or constructing an
AbortController and using setTimeout to abort) so those functions don’t hang
indefinitely when the local Ollama instance is unresponsive.

In `@src/utils/model/modelStrings.ts`:
- Around line 28-30: The expression (ALL_MODEL_CONFIGS[key] as any)[provider]
uses a forbidden any assertion; replace it with a proper typed narrowing or a
safe double-cast to a specific interface for that entry. Define or use a
ModelConfigEntry type (e.g., an object with provider keys and a firstParty
field) and cast ALL_MODEL_CONFIGS[key] via 'as unknown as ModelConfigEntry' or
narrow it with a type guard before indexing, then use (ALL_MODEL_CONFIGS[key] as
unknown as ModelConfigEntry)[provider] || ALL_MODEL_CONFIGS[key].firstParty so
you avoid any and keep type-safety for ALL_MODEL_CONFIGS, key, provider, out,
and firstParty.

In `@src/utils/swarm/teammateModel.ts`:
- Around line 9-11: The current hardcoded local fallback 'claude-opus-4-6' in
teammateModel.ts is invalid for non-Anthropic local providers; replace that
hardcoded string so local uses the same configurable fallback as elsewhere:
return process.env.LOCAL_MODEL (or process.env.LOCAL_MODEL || another documented
default) instead of 'claude-opus-4-6', and update the comment to note that local
model comes from process.env.LOCAL_MODEL; keep the surrounding logic using
getAPIProvider() and CLAUDE_OPUS_4_6_CONFIG unchanged.
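The gemini/client.ts item above asks for type-narrowed iteration in place of the `(m: any)` map. A minimal sketch of that pattern follows; the response shape (`{ models: [{ name: "models/..." }] }`) is assumed from the comment, and the helper names `isNamedModel` and `extractModelNames` are hypothetical:

```typescript
// Type predicate: accepts only objects that carry a string `name` field.
function isNamedModel(m: unknown): m is { name: string } {
  return (
    typeof m === "object" &&
    m !== null &&
    typeof (m as { name?: unknown }).name === "string"
  );
}

// Extract model names from an untrusted API response, stripping the
// "models/" prefix; returns [] for any invalid or missing entries.
function extractModelNames(data: unknown): string[] {
  if (typeof data !== "object" || data === null) return [];
  const models = (data as { models?: unknown }).models;
  if (!Array.isArray(models)) return [];
  return models.filter(isNamedModel).map(m => m.name.replace(/^models\//, ""));
}
```

The predicate centralizes the guard, so every access to `m.name` after `filter(isNamedModel)` is statically safe without any `any` assertion.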

---

Outside diff comments:
In `@src/commands/provider.ts`:
- Around line 179-188: Update the provider metadata to include the new "local"
option: modify the provider object’s description and argumentHint fields (the
const provider = { ... } satisfying Command) to list "local" alongside
anthropic/openai/gemini/grok/bedrock/vertex/foundry, and update the stale inline
comment that enumerates providers (the comment near the settings-backed branch
handling in the same file) so it also includes "grok" and "local" so help text
and autocompletion reflect the actual supported providers.

---

Nitpick comments:
In `@src/services/api/claude.ts`:
- Line 1334: Replace the repeated calls to getAPIProvider() in the conditional
with a cached local variable: call getAPIProvider() once (e.g., const provider =
getAPIProvider()) and then use provider === 'openai' || provider === 'local' in
the if statement; update any nearby logic that referenced getAPIProvider() in
the same scope to use the new provider variable for consistency.

In `@src/utils/doctorDiagnostic.ts`:
- Around line 646-652: Cache the result of os.cpus() into a local variable
before constructing the hardwareInfo object to avoid allocating the CPU array
twice (used for cpus().length and cpus()[0]?.model) and use that cached array
when setting cpus and cpuModel; also change totalMem and freeMem formatting to
compute GiB with one decimal place (e.g., totalmem() / (1024**3) and freemem() /
(1024**3)) and format using toFixed(1) + ' GB' so values like 0.4 GiB display as
"0.4 GB" and 7.6 GiB as "7.6 GB".

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: b193b877-0b9b-418b-a20b-cd67603548ab

📥 Commits

Reviewing files that changed from the base of the PR and between 17c0669 and 0195760.

📒 Files selected for processing (24)
  • .gitignore
  • README.md
  • README_EN.md
  • src/bootstrap/state.ts
  • src/commands/provider.ts
  • src/components/ConsoleOAuthFlow.tsx
  • src/components/__tests__/ConsoleOAuthFlow.test.tsx
  • src/components/messages/AttachmentMessage.tsx
  • src/constants/prompts.ts
  • src/screens/Doctor.tsx
  • src/services/api/claude.ts
  • src/services/api/gemini/client.ts
  • src/services/api/gemini/google-oauth.ts
  • src/services/api/openai/client.ts
  • src/services/api/openai/index.ts
  • src/utils/doctorDiagnostic.ts
  • src/utils/localLlm.ts
  • src/utils/model/configs.ts
  • src/utils/model/modelStrings.ts
  • src/utils/model/providers.ts
  • src/utils/settings/types.ts
  • src/utils/status.tsx
  • src/utils/swarm/teammateModel.ts
  • tests/integration/autonomy-lifecycle-user-flow.test.ts

@reikernodd
Author

I'll fix the errors soon.

@claude-code-best
Owner

@reikernodd Do you have verification screenshots?

@reikernodd
Author

Do you have verification screenshots?

Yes, I'm willing to.

@claude-code-best
Owner

@reikernodd Also, this CI build failed. You should merge main into your branch first and take a look; it may just be out of date.

@reikernodd
Author

@reikernodd Also, this CI build failed. You should merge main into your branch first and take a look; it may just be out of date.

OK, I'll look into it.

@reikernodd
Author

@claude-code-best I've completed all the fixes.

Contributor

@coderabbitai Bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (1)
README_EN.md (1)

11-11: ⚡ Quick win

Fix compound adjective hyphenation.

Use hyphens to join compound adjectives:

  • Line 11: "open source" → "open-source"
  • Line 158: "API compatible" → "API-compatible"
📝 Proposed fixes
-A source code decompilation/reverse engineering project of the official [Claude Code](https://docs.anthropic.com/en/docs/claude-code) CLI tool from Anthropic (aka "Old A"). The goal is to reproduce most of the features and engineering capabilities of Claude Code (the user says "Old Lafayette has already paid for it"). Although it's a bit awkward, it's called CCB (Cai Cai Bei / Step on the Back)... Moreover, we have implemented features that are usually limited to the Enterprise edition or require logging into a Claude account, achieving technology democratization.
+An open-source decompilation/reverse engineering project of the official [Claude Code](https://docs.anthropic.com/en/docs/claude-code) CLI tool from Anthropic (aka "Old A"). The goal is to reproduce most of the features and engineering capabilities of Claude Code (the user says "Old Lafayette has already paid for it"). Although it's a bit awkward, it's called CCB (Cai Cai Bei / Step on the Back)... Moreover, we have implemented features that are usually limited to the Enterprise edition or require logging into a Claude account, achieving technology democratization.
-> ℹ️ Supports all Anthropic API compatible services (e.g., OpenRouter, AWS Bedrock proxies, etc.), as long as the interface is compatible with the Messages API.
+> ℹ️ Supports all Anthropic API-compatible services (e.g., OpenRouter, AWS Bedrock proxies, etc.), as long as the interface is compatible with the Messages API.

Also applies to: 158-158

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@README_EN.md` at line 11, The phrases used as compound adjectives need
hyphenation: replace the occurrence of "open source" with "open-source" (from
the sentence "Which Claude do you like? The open source one is the best.") and
replace "API compatible" with "API-compatible" wherever it appears (e.g., the
"API compatible" instance noted), ensuring both are treated as single compound
adjectives.
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

Inline comments:
In `@README_EN.md`:
- Around line 164-167: The README has GEMINI_BASE_URL documented under the
"Local LLM" section alongside LOCAL_BASE_URL and LOCAL_MODEL; remove the
misplaced GEMINI_BASE_URL line from the Local LLM section and add (or ensure) a
corresponding entry for GEMINI_BASE_URL under the Gemini API configuration
section (the "Gemini" setup), leaving LOCAL_BASE_URL and LOCAL_MODEL only in the
Local LLM section to keep environment variable docs accurate.

---

Nitpick comments:
In `@README_EN.md`:
- Line 11: The phrases used as compound adjectives need hyphenation: replace the
occurrence of "open source" with "open-source" (from the sentence "Which Claude
do you like? The open source one is the best.") and replace "API compatible"
with "API-compatible" wherever it appears (e.g., the "API compatible" instance
noted), ensuring both are treated as single compound adjectives.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: de3bef54-31fc-4768-b815-b539d80e2066

📥 Commits

Reviewing files that changed from the base of the PR and between 0195760 and 75c230d.

📒 Files selected for processing (13)
  • .gitignore
  • README.md
  • README_EN.md
  • src/commands/provider.ts
  • src/components/ConsoleOAuthFlow.tsx
  • src/services/api/claude.ts
  • src/services/api/gemini/client.ts
  • src/services/api/gemini/google-oauth.ts
  • src/utils/doctorDiagnostic.ts
  • src/utils/localLlm.ts
  • src/utils/model/modelStrings.ts
  • src/utils/settings/types.ts
  • src/utils/swarm/teammateModel.ts
✅ Files skipped from review due to trivial changes (2)
  • .gitignore
  • README.md
🚧 Files skipped from review as they are similar to previous changes (8)
  • src/utils/model/modelStrings.ts
  • src/services/api/claude.ts
  • src/utils/swarm/teammateModel.ts
  • src/utils/localLlm.ts
  • src/commands/provider.ts
  • src/utils/doctorDiagnostic.ts
  • src/services/api/gemini/google-oauth.ts
  • src/components/ConsoleOAuthFlow.tsx

Comment thread README_EN.md
Comment on lines +164 to 167
- `LOCAL_BASE_URL`: The base URL for the local LLM runner (e.g., `http://localhost:11434`).
- `LOCAL_MODEL`: The model name for the local LLM (e.g., `llama3.1`). Overrides the default model when using the `local` provider.
- `GEMINI_BASE_URL`: Custom base URL for the Gemini API.

Contributor


⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Misplaced environment variable documentation.

Line 166 documents GEMINI_BASE_URL within the "Local LLM" setup section, but GEMINI_BASE_URL is specific to Gemini API configuration, not local LLM runners. This appears to be a copy-paste error.

Either remove this line from the Local LLM section or move it to the Gemini setup section (lines 150-152).

📝 Proposed fix

Remove the misplaced line:

 - `LOCAL_BASE_URL`: The base URL for the local LLM runner (e.g., `http://localhost:11434`).
 - `LOCAL_MODEL`: The model name for the local LLM (e.g., `llama3.1`). Overrides the default model when using the `local` provider.
-- `GEMINI_BASE_URL`: Custom base URL for the Gemini API.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@README_EN.md` around lines 164 - 167, The README has GEMINI_BASE_URL
documented under the "Local LLM" section alongside LOCAL_BASE_URL and
LOCAL_MODEL; remove the misplaced GEMINI_BASE_URL line from the Local LLM
section and add (or ensure) a corresponding entry for GEMINI_BASE_URL under the
Gemini API configuration section (the "Gemini" setup), leaving LOCAL_BASE_URL
and LOCAL_MODEL only in the Local LLM section to keep environment variable docs
accurate.
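The LOCAL_MODEL precedence described in this thread (it overrides the model passed in options when the provider is local) can be sketched as a tiny resolver. This is a hypothetical illustration, not the PR's actual code; `resolveLocalModel` and the final `llama3.1` fallback are assumptions:

```typescript
// Hypothetical resolver matching the documented precedence for the `local`
// provider: LOCAL_MODEL (env) overrides the caller-supplied model, which in
// turn overrides an assumed default.
function resolveLocalModel(optionsModel?: string): string {
  return process.env.LOCAL_MODEL ?? optionsModel ?? "llama3.1";
}
```

Documenting this ordering in the README's environment-variables section, as the openai/index.ts comment requests, would make the override behavior discoverable without reading the routing code.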

@claude-code-best
Owner

@reikernodd The type check didn't pass.
