Changes from 2 commits
65 changes: 40 additions & 25 deletions docs/en/configuration/providers.md
@@ -7,33 +7,34 @@ Kimi Code CLI supports multiple LLM platforms, which can be configured via confi
The easiest way to configure is to run the `/login` command (alias `/setup`) in shell mode and follow the wizard to select platform and model:

1. Select an API platform
2. Enter your API key
2. For **AWS Bedrock Mantle**, select an AWS Region, then enter your API key; for other platforms, enter your API key
3. Select a model from the available list

After configuration, Kimi Code CLI will automatically save settings to `~/.kimi/config.toml` and reload.

`/login` currently supports the following platforms:

| Platform | Description |
| --- | --- |
| Kimi Code | Kimi Code platform, supports search and fetch services |
| Moonshot AI Open Platform (moonshot.cn) | China region API endpoint |
| Moonshot AI Open Platform (moonshot.ai) | Global region API endpoint |
| Platform | Description |
| --------------------------------------- | ---------------------------------------------------------------------------- |
| AWS Bedrock Mantle (OpenAI-compatible) | Amazon Bedrock Mantle OpenAI API; uses `openai_legacy` and a Bedrock API key |
| Kimi Code | Kimi Code platform, supports search and fetch services |
| Moonshot AI Open Platform (moonshot.cn) | China region API endpoint |
| Moonshot AI Open Platform (moonshot.ai) | Global region API endpoint |

For other platforms, please manually edit the configuration file.
For other platforms, please manually edit the configuration file. See also [Bedrock Mantle example](../../../examples/bedrock-mantle.md).
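A manually configured platform follows the same shape that `/login` writes: a `providers` entry plus a `models` alias that points at it. A minimal sketch, with placeholder names, URL, and key:

```toml
default_model = "my-platform/my-model"

[providers.my-platform]
type = "openai_legacy"                  # one of the provider types documented below
base_url = "https://api.example.com/v1" # the platform's OpenAI-compatible endpoint
api_key = "sk-xxx"

[models."my-platform/my-model"]
provider = "my-platform"
model = "my-model"
max_context_size = 131072
capabilities = ["thinking"]
```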

## Provider types

The `type` field in `providers` configuration specifies the API provider type. Different types use different API protocols and client implementations.

| Type | Description |
| --- | --- |
| `kimi` | Kimi API |
| `openai_legacy` | OpenAI Chat Completions API |
| `openai_responses` | OpenAI Responses API |
| `anthropic` | Anthropic Claude API |
| `gemini` | Google Gemini API |
| `vertexai` | Google Vertex AI |
| Type | Description |
| ------------------ | --------------------------- |
| `kimi` | Kimi API |
| `openai_legacy` | OpenAI Chat Completions API |
| `openai_responses` | OpenAI Responses API |
| `anthropic` | Anthropic Claude API |
| `gemini` | Google Gemini API |
| `vertexai` | Google Vertex AI |

### `kimi`

@@ -57,6 +58,20 @@ base_url = "https://api.openai.com/v1"
api_key = "sk-xxx"
```

#### AWS Bedrock Mantle (OpenAI-compatible API)

[Bedrock Mantle](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html) exposes an OpenAI-compatible endpoint per AWS Region, for example:

`https://bedrock-mantle.<region>.api.aws/v1`

Use a **Bedrock API key** (not IAM access keys) with `type = "openai_legacy"`. Model IDs look like `moonshotai.kimi-k2.5` (catalog varies by region).

**`/login` flow:** choose **AWS Bedrock Mantle (OpenAI-compatible)**, pick a region, enter the API key, then select a model. This writes a managed provider `managed:bedrock-mantle` and clears Moonshot search/fetch (those tools are Kimi Code–specific).

**Environment overrides** (optional): when set, `OPENAI_BASE_URL` and `OPENAI_API_KEY` override the saved `base_url` and `api_key` for `openai_legacy` and `openai_responses` providers; they do not affect other provider types.
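The override precedence can be sketched in shell terms — an environment value, when present, replaces the saved one for that run (the saved URL below is illustrative):

```shell
# Value read from ~/.kimi/config.toml (illustrative).
saved_base_url="https://bedrock-mantle.eu-west-2.api.aws/v1"

# The environment variable, when set, takes precedence for this run.
effective_base_url="${OPENAI_BASE_URL:-$saved_base_url}"
echo "$effective_base_url"
```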
> **Reviewer comment (Contributor):** 🟡 Documentation incorrectly states env overrides apply to `openai_legacy` only, but code also applies them to `openai_responses`.
>
> The new documentation in three places claims that `OPENAI_BASE_URL` and `OPENAI_API_KEY` environment variable overrides apply exclusively to `openai_legacy` providers. However, `src/kimi_cli/llm.py:87` shows `case "openai_legacy" | "openai_responses":`, meaning these env vars also override `openai_responses` providers. The Chinese source-of-truth doc at `docs/zh/configuration/providers.md:71` explicitly states 不会影响其他供应商 ("will not affect other providers"), which is factually wrong for `openai_responses` users. The same incorrect claim appears in the English providers doc and `examples/bedrock-mantle.md:44`.



**Example:** see [`examples/bedrock-mantle.md`](../../../examples/bedrock-mantle.md).

### `openai_responses`

For OpenAI Responses API (newer API format).
@@ -106,12 +121,12 @@ env = { GOOGLE_CLOUD_PROJECT = "your-project-id" }

The `capabilities` field in model configuration declares the capabilities supported by the model. This affects feature availability in Kimi Code CLI.

| Capability | Description |
| --- | --- |
| `thinking` | Supports thinking mode (deep reasoning), can be toggled |
| `always_thinking` | Always uses thinking mode (cannot be disabled) |
| `image_in` | Supports image input |
| `video_in` | Supports video input |
| Capability | Description |
| ----------------- | ------------------------------------------------------- |
| `thinking` | Supports thinking mode (deep reasoning), can be toggled |
| `always_thinking` | Always uses thinking mode (cannot be disabled) |
| `image_in` | Supports image input |
| `video_in` | Supports video input |

```toml
[models.gemini-3-pro-preview]
@@ -143,9 +158,9 @@ The `SearchWeb` and `FetchURL` tools depend on external services, currently only

When selecting the Kimi Code platform using `/login`, search and fetch services are automatically configured.

| Service | Corresponding tool | Behavior when not configured |
| --- | --- | --- |
| `moonshot_search` | `SearchWeb` | Tool unavailable |
| `moonshot_fetch` | `FetchURL` | Falls back to local fetching |
| Service | Corresponding tool | Behavior when not configured |
| ----------------- | ------------------ | ---------------------------- |
| `moonshot_search` | `SearchWeb` | Tool unavailable |
| `moonshot_fetch` | `FetchURL` | Falls back to local fetching |

When using other platforms, the `FetchURL` tool is still available but will fall back to local fetching.
66 changes: 40 additions & 26 deletions docs/zh/configuration/providers.md
@@ -7,33 +7,34 @@ Kimi Code CLI 支持多种 LLM 平台,可以通过配置文件或 `/login` 命
最简单的配置方式是在 Shell 模式下运行 `/login` 命令(别名 `/setup`),按照向导完成平台和模型的选择:

1. 选择 API 平台
2. 输入 API 密钥
2. 若选择 **AWS Bedrock Mantle(OpenAI 兼容)**,先选择 AWS 区域,再输入 API 密钥;其他平台直接输入 API 密钥
3. 从可用模型列表中选择模型

配置完成后,Kimi Code CLI 会自动保存设置到 `~/.kimi/config.toml` 并重新加载。

`/login` 目前支持以下平台:

| 平台 | 说明 |
| --- | --- |
| Kimi Code | Kimi Code 平台,支持搜索和抓取服务 |
| Moonshot AI 开放平台 (moonshot.cn) | 中国区 API 端点 |
| Moonshot AI Open Platform (moonshot.ai) | 全球区 API 端点 |
| 平台 | 说明 |
| --------------------------------------- | ---------------------------------------------------------------------------------- |
| AWS Bedrock Mantle(OpenAI 兼容) | Amazon Bedrock Mantle 的 OpenAI 兼容 API;使用 `openai_legacy` 与 Bedrock API 密钥 |
| Kimi Code | Kimi Code 平台,支持搜索和抓取服务 |
| Moonshot AI 开放平台 (moonshot.cn) | 中国区 API 端点 |
| Moonshot AI Open Platform (moonshot.ai) | 全球区 API 端点 |

如需使用其他平台,请手动编辑配置文件。
如需使用其他平台,请手动编辑配置文件。示例见仓库内 [`examples/bedrock-mantle.md`](../../../examples/bedrock-mantle.md)。

## 供应商类型

`providers` 配置中的 `type` 字段指定 API 供应商类型。不同类型使用不同的 API 协议和客户端实现。

| 类型 | 说明 |
| --- | --- |
| `kimi` | Kimi API |
| `openai_legacy` | OpenAI Chat Completions API |
| `openai_responses` | OpenAI Responses API |
| `anthropic` | Anthropic Claude API |
| `gemini` | Google Gemini API |
| `vertexai` | Google Vertex AI |
| 类型 | 说明 |
| ------------------ | --------------------------- |
| `kimi` | Kimi API |
| `openai_legacy` | OpenAI Chat Completions API |
| `openai_responses` | OpenAI Responses API |
| `anthropic` | Anthropic Claude API |
| `gemini` | Google Gemini API |
| `vertexai` | Google Vertex AI |

### `kimi`

@@ -57,6 +58,20 @@ base_url = "https://api.openai.com/v1"
api_key = "sk-xxx"
```

#### AWS Bedrock Mantle(OpenAI 兼容 API)

[Bedrock Mantle](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html) 在每个 AWS 区域提供 OpenAI 兼容端点,例如:

`https://bedrock-mantle.<region>.api.aws/v1`

请使用 **Bedrock API 密钥**(不是 IAM 访问密钥),`type` 设为 `openai_legacy`。模型 ID 形如 `moonshotai.kimi-k2.5`(实际目录随区域变化)。

**`/login` 流程:** 选择 **AWS Bedrock Mantle (OpenAI-compatible)**,选择区域,输入 API 密钥,再选模型。将写入托管供应商 `managed:bedrock-mantle`,并清除 Moonshot 搜索/抓取配置(这些能力依赖 Kimi Code)。

**环境变量覆盖(可选):** 若设置了 `OPENAI_BASE_URL` 与 `OPENAI_API_KEY`,会覆盖已保存的 `openai_legacy` 与 `openai_responses` 供应商的 `base_url` 与 `api_key`;不影响其他类型的供应商。

**示例:** 见仓库 [`examples/bedrock-mantle.md`](../../../examples/bedrock-mantle.md)(英文说明)。

### `openai_responses`

用于 OpenAI Responses API(较新的 API 格式)。
@@ -106,12 +121,12 @@ env = { GOOGLE_CLOUD_PROJECT = "your-project-id" }

模型配置中的 `capabilities` 字段声明模型支持的能力。这会影响 Kimi Code CLI 的功能可用性。

| 能力 | 说明 |
| --- | --- |
| `thinking` | 支持 Thinking 模式(深度思考),可开关 |
| `always_thinking` | 始终使用 Thinking 模式(不可关闭) |
| `image_in` | 支持图片输入 |
| `video_in` | 支持视频输入 |
| 能力 | 说明 |
| ----------------- | -------------------------------------- |
| `thinking` | 支持 Thinking 模式(深度思考),可开关 |
| `always_thinking` | 始终使用 Thinking 模式(不可关闭) |
| `image_in` | 支持图片输入 |
| `video_in` | 支持视频输入 |

```toml
[models.gemini-3-pro-preview]
@@ -143,10 +158,9 @@ capabilities = ["thinking", "image_in"]

使用 `/login` 选择 Kimi Code 平台时,搜索和抓取服务会自动配置。

| 服务 | 对应工具 | 未配置时的行为 |
| --- | --- | --- |
| `moonshot_search` | `SearchWeb` | 工具不可用 |
| `moonshot_fetch` | `FetchURL` | 回退到本地抓取 |
| 服务 | 对应工具 | 未配置时的行为 |
| ----------------- | ----------- | -------------- |
| `moonshot_search` | `SearchWeb` | 工具不可用 |
| `moonshot_fetch` | `FetchURL` | 回退到本地抓取 |

使用其他平台时,`FetchURL` 工具仍可使用,但会回退到本地抓取。

53 changes: 53 additions & 0 deletions examples/bedrock-mantle.md
@@ -0,0 +1,53 @@
# AWS Bedrock Mantle with Kimi Code CLI

Use Kimi models through [Amazon Bedrock Mantle](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html)’s OpenAI-compatible API.

## Quick setup (recommended)

1. Create a Bedrock API key in the AWS console.
2. Start Kimi Code CLI shell mode and run `/login` (or `/setup`).
3. Choose **AWS Bedrock Mantle (OpenAI-compatible)**.
4. Pick an AWS Region that supports Mantle and lists the models you need (for Kimi, regions such as `eu-west-2` or `us-east-1` often expose `moonshotai.*` IDs; availability varies by region).
5. Paste your API key and select a model (for example `moonshotai.kimi-k2.5`).

Configuration is written to `~/.kimi/config.toml` under the managed provider `managed:bedrock-mantle`.

## Verify (non-interactive)

```sh
kimi --print --prompt "Say hello in one short sentence."
```

## Manual configuration

If you prefer not to use `/login`, use `openai_legacy` with the Mantle `base_url`:

```toml
default_model = "bedrock-mantle/moonshotai.kimi-k2.5"

[providers."managed:bedrock-mantle"]
type = "openai_legacy"
base_url = "https://bedrock-mantle.eu-west-2.api.aws/v1"
api_key = "ABSK..."

[models."bedrock-mantle/moonshotai.kimi-k2.5"]
provider = "managed:bedrock-mantle"
model = "moonshotai.kimi-k2.5"
max_context_size = 131072
capabilities = ["thinking", "image_in"]
```

Model alias keys must match what `/login` would generate (`<platform_id>/<model_id>`).

## Environment overrides

For any `openai_legacy` or `openai_responses` provider, Kimi CLI can override the saved URL and key from the environment:

- `OPENAI_BASE_URL` — replaces `base_url` when set.
- `OPENAI_API_KEY` — replaces `api_key` when set.

These apply per run and are useful for CI or switching regions without editing TOML.
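Inline assignments scope the override to a single command, which is handy for CI or quick region switches. A sketch (the region URL is a placeholder, and `printenv` stands in for the real `kimi --print --prompt ...` invocation):

```shell
# The assignments apply only to this one command, not the surrounding shell.
OPENAI_BASE_URL="https://bedrock-mantle.us-east-1.api.aws/v1" \
  printenv OPENAI_BASE_URL
# → https://bedrock-mantle.us-east-1.api.aws/v1
```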

## Search and fetch

Mantle setup does **not** configure Moonshot Search/Fetch. The `SearchWeb` and `FetchURL` tools behave like other non–Kimi Code providers (search unavailable; fetch may fall back locally). Use Kimi Code via `/login` if you need those services.