Personal knowledge wiki maintained by your AI tool via MCP.
Your AI tool (Claude Code, Cursor, etc.) maintains a Karpathy-style wiki through MCP. No separate LLM client. No API keys. Your existing AI tool IS the brain.
Requires Node.js 20+
```sh
npm install -g autopedia
autopedia init
```

`autopedia init` creates `~/.autopedia/` with:

```
wiki/     ← synthesized knowledge (AI-maintained)
sources/  ← raw inputs (URLs, text notes, files)
ops/      ← audit trail (log, metrics, queue)
schema/   ← your profile and rules
```
Add to your AI tool's MCP config (one-time setup):
Claude Code (`~/.claude.json`):

```json
{
  "mcpServers": {
    "autopedia": {
      "command": "autopedia",
      "args": ["serve"]
    }
  }
}
```

Cursor (`.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "autopedia": {
      "command": "autopedia",
      "args": ["serve"]
    }
  }
}
```

Verify the setup:

```sh
autopedia status
# Should show: Wiki pages: 1, Queued: 0
```

Start a new Claude Code or Cursor session. On first connection, autopedia interviews you (~30 seconds) to personalize your wiki. After that, it's silent until you need it.
Add stuff anytime (from any terminal):
```sh
autopedia add "GPU prices dropped 20% this quarter"
autopedia add https://example.com/article
autopedia add ~/research/notes.md
```

Process when ready (tell your AI tool):
```
You: "sync my wiki"
AI:  Processing 1/3: gpu-pricing-note → created gpu-pricing.md
     Processing 2/3: example.com/article → updated market-trends.md
     Processing 3/3: notes.md → created research-notes.md
     Done. Created 2 pages, updated 1.
```
Ask questions anytime:
```
You: "What do I know about GPU pricing?"
AI:  → answers from YOUR research, not training data
```
autopedia never hijacks your conversation. It's a quiet knowledge layer — there when you need it, invisible when you don't.
| Command | What it does |
|---|---|
| `autopedia init` | Create the `~/.autopedia/` directory structure |
| `autopedia add <source>` | Queue a URL, text note, file, folder, or repo |
| `autopedia add --repo <path>` | Scan a codebase and create an architectural bundle |
| `autopedia lint` | Check wiki health: orphans, stale pages, broken links |
| `autopedia remove <name>` | Remove a wiki page (or a source with `-s`) |
| `autopedia scan` | Detect files added outside autopedia (Obsidian, IDE) and queue them |
| `autopedia status` | Show wiki stats and unprocessed sources |
| `autopedia search <query>` | Search wiki pages from the terminal |
| `autopedia view` | Browse your wiki in a local dashboard |
| `autopedia export` | Export the wiki as a single markdown file |
| `autopedia serve` | Start the MCP server (used by AI tools, not run manually) |
```sh
autopedia add "GPU prices dropped 20% this quarter"  # text note
autopedia add https://example.com/article            # URL
autopedia add ~/research/gpu-report.pdf              # file
autopedia add ~/research/                            # whole folder
autopedia add ~/code/my-project/                     # auto-detect repo (.git/)
autopedia add --repo ~/code/my-project/              # explicit repo mode
```

Everything is saved instantly. Tell your AI tool "sync" to process.
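The repo auto-detection above can be sketched as a small check (an illustrative sketch based on the `.git/` comment in the example, not autopedia's actual code; `looksLikeRepo` is a hypothetical name):

```typescript
import { existsSync, statSync } from "node:fs";
import { join } from "node:path";

// Hypothetical sketch: treat a folder as a repo when it contains a .git/
// directory, mirroring the "auto-detect repo (.git/)" behavior above.
function looksLikeRepo(dir: string): boolean {
  const gitPath = join(dir, ".git");
  return existsSync(gitPath) && statSync(gitPath).isDirectory();
}
```

A folder that passes this check would take the `--repo` path; anything else is treated as a plain folder of notes.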
Run `autopedia view` to open a local dashboard:
- Wiki index with rendered markdown and clickable [[wikilinks]]
- Knowledge graph — force-directed visualization of page connections
- Backlinks — each page shows what links to it
- Source browser with content-derived titles
- Status — page count, queue, untracked files
- Light/dark theme with Newsreader + DM Sans typography
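The graph and backlinks views boil down to one idea: every `[[wikilink]]` is an edge between pages. A minimal sketch of building a backlink map (illustrative only; the regex and `buildBacklinks` name are assumptions, not autopedia's real parser):

```typescript
// Illustrative sketch: collect backlinks from [[wikilink]] references.
// `pages` maps page name → markdown body.
function buildBacklinks(pages: Map<string, string>): Map<string, string[]> {
  const backlinks = new Map<string, string[]>();
  for (const [name, body] of pages) {
    // Matches [[target]] and [[target|display label]].
    for (const m of body.matchAll(/\[\[([^\]|]+)(?:\|[^\]]*)?\]\]/g)) {
      const target = m[1].trim();
      if (!backlinks.has(target)) backlinks.set(target, []);
      backlinks.get(target)!.push(name);
    }
  }
  return backlinks;
}
```

Inverting the link direction like this is all a backlinks panel needs; the force-directed graph uses the same edges in both directions.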
Open `~/.autopedia/` as an Obsidian vault. Wikilinks, graph view, and backlinks work out of the box.

Drag-and-drop workflow: drop files into the vault via Obsidian, then run `autopedia scan` to queue them. Tell your AI tool "sync" to process.
Implements Karpathy's three wiki operations:
- INGEST — Fetch URLs, save notes, synthesize into wiki pages
- QUERY — Search and read, answer grounded in your research
- LINT — Find orphans, stale content, contradictions, fix them
| Tool | Operation | Purpose |
|---|---|---|
| `add_source` | INGEST | Fetch URL or save text (queue or ingest mode) |
| `apply_wiki_ops` | INGEST | Create/update wiki pages |
| `read_source` | QUERY | Read a saved source |
| `search` | QUERY | Search wiki pages |
| `read_page` | QUERY | Read a specific page |
| `get_status` | STATUS | Page count, queue, untracked files |
| `lint` | LINT | Orphans, stale pages, broken links, low cross-refs |
| `question_assumptions` | LINT | Challenge high-confidence claims |
| `complete_onboarding` | ONBOARDING | Write identity + interests |
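As one concrete illustration of the LINT operation, orphan detection is a pass over the wikilink graph: a page nothing links to is an orphan. This is a hypothetical sketch (the real `lint` also checks staleness, broken links, and cross-ref density):

```typescript
// Illustrative sketch: a page is an orphan if no other page links to it.
// `pages` maps page name → markdown body containing [[wikilinks]].
function findOrphans(pages: Map<string, string>): string[] {
  const linked = new Set<string>();
  for (const body of pages.values()) {
    for (const m of body.matchAll(/\[\[([^\]|]+)(?:\|[^\]]*)?\]\]/g)) {
      linked.add(m[1].trim());
    }
  }
  return [...pages.keys()].filter((name) => !linked.has(name));
}
```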
| Resource | What |
|---|---|
| `autopedia://prompt` | System prompt (auto-updates on upgrade) |
| `autopedia://identity` | Your profile |
| `autopedia://interests` | What you care about |
- Sacred boundary: the server writes only to `wiki/`, `ops/`, and `sources/agent/`. User content is never modified.
- Path traversal: `path.resolve()` + `startsWith()` + symlink chain validation
- SSRF protection: blocks localhost, private IPs, IPv6, metadata endpoints, redirect bypasses
- XSS prevention: all rendered content HTML-escaped, link text escaped, graph JSON escaped
- No API keys: the server makes zero LLM calls — your AI tool does all the thinking
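The path-traversal defense named above can be sketched roughly like this (a simplified illustration of the `path.resolve()` + `startsWith()` pattern; the real server additionally validates symlink chains, which this sketch omits):

```typescript
import path from "node:path";

// Illustrative sketch: reject any candidate path that resolves outside
// the allowed root, including "../" escapes and absolute paths.
function insideBoundary(root: string, candidate: string): boolean {
  const resolvedRoot = path.resolve(root);
  const resolved = path.resolve(resolvedRoot, candidate);
  return (
    resolved === resolvedRoot ||
    resolved.startsWith(resolvedRoot + path.sep)
  );
}
```

Appending `path.sep` before the prefix check matters: without it, a sibling directory like `wiki-evil/` would pass a bare `startsWith("…/wiki")` test.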
- `src/wiki.ts` — file I/O, boundary enforcement, wikilink graph, lint, scan
- `src/mcp.ts` — 9 MCP tools + 3 resources
- `src/cli.ts` — CLI: init, add, lint, scan, serve, status, view, search, export, remove
- Repo scanner — smart file discovery, role scoring, bundle formatting
- `src/dashboard.ts` — server-rendered HTML dashboard (graph, backlinks, source titles)
- `schema/prompt.md` — system prompt (served via MCP, auto-updates on upgrade)
7 runtime dependencies. No LLM SDK. No database. No Express.
```sh
git clone https://github.com/devp1/autopedia
cd autopedia
npm install
npm run build
npm test           # 259 tests
npm run typecheck
npm run lint
```