# feat(llm): add multi-LLM provider support with API key validation #4

Merged: Alenryuichi merged 4 commits into main on Feb 3, 2026
## Features
- Add 6 LLM providers: DeepSeek, MiniMax, ZhiPu, Qwen, OpenAI, Ollama
- Add `--llm` CLI option for non-interactive provider selection
- Add Docker volume mounting strategy for the categorization.py patch
- Add optional API key validation before saving

## Code Review Fixes
- H1: Add providers.test.ts with 22 test cases
- H2: Add `--llm` option tests in install.test.ts
- H3: Auto-update .gitignore and show security warning for .env
- H4: Fix Ollama `envKey` empty-string bug in `generateProviderEnv`
- M1: Handle invalid provider in non-interactive mode
- M2: Add Ollama support in categorization.py
- M3: Use `getMcpEnvForProvider` in `showMcpConfig`
- L1: Auto-generate `PROVIDER_CHOICES` from `LLM_PROVIDERS`
- L2: Add `validateApiKey` function with tests

## Files
- cli/src/lib/providers.ts (new)
- cli/templates/patches/categorization.py (new)
- cli/templates/docker-compose.full.yml (new)
- cli/tests/providers.test.ts (new)
- cli/src/commands/install.ts (modified)
- cli/src/index.ts (modified)
- cli/tests/install.test.ts (modified)
- .env.example (modified)

Tests: 119 passed
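As a sketch, the L1 fix above (deriving `PROVIDER_CHOICES` from `LLM_PROVIDERS` instead of maintaining a second hand-written list) might look like this. Both names come from the PR; the record fields (`id`, `label`, `envKey`) are assumptions for illustration:

```typescript
// Hypothetical sketch of cli/src/lib/providers.ts.
// LLM_PROVIDERS and PROVIDER_CHOICES are named in the PR;
// the field shapes below are assumptions.
interface LlmProvider {
  id: string;     // value accepted by --llm
  label: string;  // shown in the interactive prompt
  envKey: string; // API key env var; empty for local providers
}

export const LLM_PROVIDERS: LlmProvider[] = [
  { id: "deepseek", label: "DeepSeek", envKey: "DEEPSEEK_API_KEY" },
  { id: "minimax",  label: "MiniMax",  envKey: "MINIMAX_API_KEY" },
  { id: "zhipu",    label: "ZhiPu",    envKey: "ZHIPU_API_KEY" },
  { id: "qwen",     label: "Qwen",     envKey: "QWEN_API_KEY" },
  { id: "openai",   label: "OpenAI",   envKey: "OPENAI_API_KEY" },
  { id: "ollama",   label: "Ollama",   envKey: "" }, // local, no key needed
];

// L1 fix: derive the CLI choices from the registry so the two
// lists can never drift apart.
export const PROVIDER_CHOICES: string[] = LLM_PROVIDERS.map((p) => p.id);
```

Adding a seventh provider then only touches `LLM_PROVIDERS`; the `--llm` choices follow automatically.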
Add generated templates for self-hosting openmemory-plus:
- .augment/commands/memory.md - Augment IDE memory command
- .augment/skills/memory-extraction/ - memory extraction skill
- _omp/commands/ - shared memory command
- _omp/skills/ - shared skills
- _omp/workflows/memory/ - memory workflow (9 steps)
- Merge main branch MCP auto-config features
- Re-add multi-LLM provider selection (DeepSeek, MiniMax, ZhiPu, Qwen, OpenAI, Ollama)
- Add patches directory copy for categorization.py
- Add API key validation and .env generation
- All 128 tests passing
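The .env generation mentioned above interacts with the H4 fix (Ollama's empty `envKey`). A minimal sketch, assuming `generateProviderEnv` takes the provider record and the entered key (the function name is from the PR; its signature and output format are assumptions):

```typescript
// Hypothetical sketch of the H4 fix in generateProviderEnv.
interface Provider {
  id: string;
  envKey: string; // "" for local providers such as Ollama
}

export function generateProviderEnv(provider: Provider, apiKey: string): string {
  const lines = [`LLM_PROVIDER=${provider.id}`];
  // H4 fix: when envKey is empty (Ollama), emit no key line at all
  // instead of a malformed `=<value>` entry in .env.
  if (provider.envKey) {
    lines.push(`${provider.envKey}=${apiKey}`);
  }
  return lines.join("\n") + "\n";
}
```

With this guard, selecting Ollama produces a .env containing only the provider line, and cloud providers get exactly one key entry.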
## Summary

Add multi-LLM provider support to openmemory-plus, letting users choose among 6 LLM providers for the memory categorization feature.
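The optional API key validation from the PR title could be sketched as follows. `validateApiKey` is named in the PR; the real implementation and provider endpoints are unknown, so here the probe request is injected, which also keeps the check testable offline:

```typescript
// Hypothetical sketch of validateApiKey. The probe would, in practice,
// hit something like the provider's /v1/models endpoint with a Bearer
// token; that endpoint choice is an assumption, not from the PR.
type Probe = (apiKey: string) => Promise<number>; // resolves to an HTTP status

export async function validateApiKey(apiKey: string, probe: Probe): Promise<boolean> {
  if (!apiKey.trim()) return false;        // empty keys never validate
  try {
    const status = await probe(apiKey);
    return status >= 200 && status < 300;  // 401/403 indicates a bad key
  } catch {
    return false;                          // network failure: treat as invalid
  }
}
```

Because validation is optional, the installer can still save an unverified key when the user skips the check or is offline.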
## Features

### 🚀 LLM Provider Support

### 💻 CLI Usage
```
╔═══════════════════════════════════════════════════════════════╗
║                                                               ║
║        🧠 OpenMemory Plus - Agent Memory Management           ║
║                                                               ║
║    Give any AI agent persistent memory within 5 minutes       ║
║                                                               ║
╚═══════════════════════════════════════════════════════════════╝

━━━ Step 1: Detect system dependencies ━━━

Current status:
  🐳 Docker:  ✓ running
  🐳 Compose: ✓ available
  🦙 Ollama:  ✓ running
  📦 Qdrant:  ✓ running
  🔤 BGE-M3:  ✓ installed

✅ All dependencies ready!

━━━ Step 2: Configure project ━━━

Using default IDE: augment

📁 Creating config files...
  ✓ Created _omp/ (core directory)
  ✓ Created _omp/memory/project.yaml
  ✓ Created _omp/commands/ (1 command)
  ✓ Created _omp/workflows/ (9 steps)
  ✓ Created _omp/skills/ (memory-extraction)
  ✓ Configured augment (.augment/commands/)

━━━ Step 3: MCP configuration & verification ━━━

━━━ Installation complete ━━━

🎉 OpenMemory Plus installed successfully!
  ✓ MCP auto-configured for Augment

💡 Next steps:
```
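For non-interactive runs, the `--llm` option from this PR pre-selects the provider and skips the prompt. A usage sketch (`--llm` and the provider ids are from the PR; the exact binary invocation is an assumption based on the project name):

```shell
# Non-interactive install with a pre-selected cloud provider;
# prompts for (and optionally validates) the matching API key.
npx openmemory-plus install --llm deepseek

# Local provider: no API key prompt, nothing key-related written to .env.
npx openmemory-plus install --llm ollama
```

An unknown value (e.g. `--llm foo`) aborts with an error in non-interactive mode (the M1 fix) rather than silently falling back to a default.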
### 🔐 Security

## Code Review Fixes

## Testing

✅ 119 tests passed