feat(llm): add multi-LLM provider support with API key validation #4

Merged
Alenryuichi merged 4 commits into main from fix/openmemory-api-config
Feb 3, 2026

Conversation

@Alenryuichi
Owner

Summary

Add multi-LLM provider support to openmemory-plus, allowing users to choose from 6 different LLM providers for the memory categorization feature.

Features

🚀 LLM Provider Support

| Provider | API Key Env | Default Model |
| --- | --- | --- |
| DeepSeek (recommended) | \ | \ |
| MiniMax | \ | \ |
| ZhiPu AI | \ | \ |
| Qwen | \ | \ |
| OpenAI | \ | \ |
| Ollama (local) | N/A | \ |
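The registry backing this table could look roughly like the sketch below. The `LLM_PROVIDERS` and `PROVIDER_CHOICES` names come from the L1 fix note in this PR; the field shapes, env var names, and default models are illustrative assumptions, since the table's actual values were not captured here.

```typescript
// Sketch of the provider registry (field names, env vars, and models
// are assumptions; only LLM_PROVIDERS / PROVIDER_CHOICES appear in the PR).
interface LlmProvider {
  name: string;          // display name shown in the CLI prompt
  envKey: string;        // env var holding the API key ("" = none needed)
  defaultModel: string;  // model used when the user does not override it
}

const LLM_PROVIDERS: Record<string, LlmProvider> = {
  deepseek: { name: "DeepSeek", envKey: "DEEPSEEK_API_KEY", defaultModel: "deepseek-chat" },
  minimax:  { name: "MiniMax",  envKey: "MINIMAX_API_KEY",  defaultModel: "abab6.5s-chat" },
  zhipu:    { name: "ZhiPu AI", envKey: "ZHIPU_API_KEY",    defaultModel: "glm-4-flash" },
  qwen:     { name: "Qwen",     envKey: "QWEN_API_KEY",     defaultModel: "qwen-turbo" },
  openai:   { name: "OpenAI",   envKey: "OPENAI_API_KEY",   defaultModel: "gpt-4o-mini" },
  ollama:   { name: "Ollama",   envKey: "",                 defaultModel: "llama3" },
};

// L1 fix: derive the CLI choices from the registry instead of
// maintaining a second, drift-prone list.
const PROVIDER_CHOICES = Object.keys(LLM_PROVIDERS);
```

Deriving `PROVIDER_CHOICES` from the registry means adding a seventh provider is a one-line change.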

💻 CLI Usage

```text
╔═══════════════════════════════════════════════════════════════╗
║                                                               ║
║        🧠 OpenMemory Plus - Agent Memory Management           ║
║                                                               ║
║     Give any AI Agent persistent memory in 5 minutes          ║
║                                                               ║
╚═══════════════════════════════════════════════════════════════╝

━━━ Step 1: Detect system dependencies ━━━

Current status:
🐳 Docker: ✓ running
🐳 Compose: ✓ available
🦙 Ollama: ✓ running
📦 Qdrant: ✓ running
🔤 BGE-M3: ✓ installed

✅ All dependencies are ready!

━━━ Step 2: Configure the project ━━━

Using default IDE: augment

📁 Creating config files...

✓ Created _omp/ (core directory)
✓ Created _omp/memory/project.yaml
✓ Created _omp/commands/ (1 command)
✓ Created _omp/workflows/ (9 steps)
✓ Created _omp/skills/ (memory-extraction)
✓ Configured augment (.augment/commands/)

━━━ Phase 3: MCP configuration & verification ━━━

━━━ Installation complete ━━━

🎉 OpenMemory Plus installed successfully!

✓ MCP auto-configured for Augment

💡 Next steps:

  1. Restart your IDE to load the MCP config
  2. Use /memory to open the memory management menu
  3. Pick an action or describe what you need in natural language
```

🔐 Security

  • Auto-update `.gitignore` to exclude `.env` files
  • Display security warning when API key is saved
  • Optional API key validation before saving
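The `.gitignore` auto-update (H3) could be as small as the sketch below. The function name and exact behavior are assumptions; the PR only states that `.gitignore` is updated to exclude `.env`.

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Sketch of the H3 .gitignore auto-update (name and details assumed):
// append a ".env" line unless one is already present.
// Returns true when a line was added, false when it was a no-op.
function ensureEnvIgnored(projectDir: string): boolean {
  const gitignorePath = path.join(projectDir, ".gitignore");
  const existing = fs.existsSync(gitignorePath)
    ? fs.readFileSync(gitignorePath, "utf8")
    : "";
  // Exact line match, so ".env.example" does not count as ignored.
  if (existing.split(/\r?\n/).includes(".env")) return false;
  const prefix =
    existing === "" || existing.endsWith("\n") ? existing : existing + "\n";
  fs.writeFileSync(gitignorePath, prefix + ".env\n");
  return true;
}
```

An idempotent check like this is what lets the installer print the security warning only when it actually changed something.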

Code Review Fixes

  • H1-H4: critical issue fixes
  • M1-M3: medium issue fixes
  • L1-L2: low-priority issue fixes

Testing

✅ 119 tests passed

## Features
- Add 6 LLM providers: DeepSeek, MiniMax, ZhiPu, Qwen, OpenAI, Ollama
- Add --llm CLI option for non-interactive provider selection
- Add Docker Volume mounting strategy for categorization.py patch
- Add optional API key validation before saving
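The optional validation step (L2 below) could be a synchronous format check before any network round-trip. The `validateApiKey` name comes from the PR; the rules sketched here, including the Ollama bypass, are assumptions.

```typescript
// Sketch of validateApiKey (name from the PR, checks assumed):
// reject empty or whitespace-containing keys, and skip validation
// entirely for Ollama, which runs locally without a key.
function validateApiKey(provider: string, apiKey: string): boolean {
  if (provider === "ollama") return true; // local provider, no key required
  const key = apiKey.trim();
  return key.length > 0 && !/\s/.test(key);
}
```

A cheap check like this catches pasted-with-newline keys before the installer writes them into `.env`.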

## Code Review Fixes
- H1: Add providers.test.ts with 22 test cases
- H2: Add --llm option tests in install.test.ts
- H3: Auto-update .gitignore and show security warning for .env
- H4: Fix Ollama envKey empty string bug in generateProviderEnv
- M1: Handle invalid provider in non-interactive mode
- M2: Add Ollama support in categorization.py
- M3: Use getMcpEnvForProvider in showMcpConfig
- L1: Auto-generate PROVIDER_CHOICES from LLM_PROVIDERS
- L2: Add validateApiKey function with tests
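The H4 fix above can be sketched as follows. The `generateProviderEnv` name and Ollama's empty `envKey` come from the PR; the spec shape, the `LLM_MODEL` variable, and the exact output format are assumptions.

```typescript
// Sketch of the H4 fix in generateProviderEnv (name from the PR,
// implementation assumed): when a provider has no API key env var
// (Ollama's envKey is the empty string), skip the key line instead
// of emitting a malformed "=<key>" entry in the generated .env.
interface ProviderEnvSpec {
  envKey: string;        // "" for local providers like Ollama
  defaultModel: string;
}

function generateProviderEnv(spec: ProviderEnvSpec, apiKey: string): string {
  const lines: string[] = [];
  if (spec.envKey !== "") {
    lines.push(`${spec.envKey}=${apiKey}`); // only when a key var exists
  }
  lines.push(`LLM_MODEL=${spec.defaultModel}`);
  return lines.join("\n") + "\n";
}
```

Without the guard, the Ollama path would have produced a line starting with a bare `=`, which dotenv parsers reject or silently drop.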

## Files
- cli/src/lib/providers.ts (new)
- cli/templates/patches/categorization.py (new)
- cli/templates/docker-compose.full.yml (new)
- cli/tests/providers.test.ts (new)
- cli/src/commands/install.ts (modified)
- cli/src/index.ts (modified)
- cli/tests/install.test.ts (modified)
- .env.example (modified)

Tests: 119 passed

Add generated templates for self-hosting openmemory-plus:
- .augment/commands/memory.md - Augment IDE memory command
- .augment/skills/memory-extraction/ - Memory extraction skill
- _omp/commands/ - Shared memory command
- _omp/skills/ - Shared skills
- _omp/workflows/memory/ - Memory workflow (9 steps)
- Merge main branch MCP auto-config features
- Re-add multi-LLM provider selection (DeepSeek, MiniMax, ZhiPu, Qwen, OpenAI, Ollama)
- Add patches directory copy for categorization.py
- Add API key validation and .env generation
- All 128 tests passing
@Alenryuichi Alenryuichi merged commit e0c5f1b into main Feb 3, 2026
3 checks passed
