AI-powered deep-dive research engine — rewritten in pure Python
Build like Stark. 🚀 Research like a 100-person team.
| 💡 What It Does | 🔍 How It Works |
|---|---|
| 1. Explodes a single question into multiple high-quality SERP queries. | LLM (OpenAI o3-mini or DeepSeek R1) 📜 ➜ JSON-validated queries |
| 2. Scours the web in parallel | Firecrawl API ⚡ async + configurable concurrency |
| 3. Extracts smart learnings & next questions | Token-aware prompt trimming 🪄 |
| 4. Loops recursively (breadth ↓, depth ↓) | Unique learnings / URLs deduplicated |
| 5. Outputs either… | • a surgical one-liner answer ✂️ • a multi-page Markdown report 📑 |
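To make the loop concrete, here is a minimal, self-contained sketch of how the breadth/depth recursion fits together. The helper names (`generate_serp_queries`, `search_and_scrape`, `extract_learnings`) and the dummy data are placeholders for illustration only, not the project's actual API:

```python
import asyncio

# Placeholder helpers — illustrative stubs, not the real module's internals.
async def generate_serp_queries(question, learnings, n):
    return [f"{question} (angle {i + 1})" for i in range(n)]

async def search_and_scrape(query):            # stands in for the Firecrawl call
    return [{"url": f"https://example.com/{hash(query) % 1000}", "text": "…"}]

async def extract_learnings(question, pages):  # stands in for the LLM extraction step
    learnings = [f"learning about {question}"]
    follow_ups = [f"follow-up on {question}"]
    return learnings, follow_ups, {p["url"] for p in pages}

async def deep_research(question, breadth, depth, learnings=None, urls=None):
    """Recursively widen (breadth) and drill down (depth) on a question."""
    learnings, urls = learnings or [], urls or set()
    if depth == 0:
        return learnings, urls

    # 1. Explode the question into `breadth` SERP queries.
    queries = await generate_serp_queries(question, learnings, breadth)

    # 2. Scour the web in parallel.
    results = await asyncio.gather(*(search_and_scrape(q) for q in queries))

    # 3. Extract learnings + follow-up questions, dedupe learnings and URLs.
    for pages in results:
        new_learnings, follow_ups, new_urls = await extract_learnings(question, pages)
        learnings += [item for item in new_learnings if item not in learnings]
        urls |= new_urls

        # 4. Recurse with narrower breadth and one less level of depth.
        for follow_up in follow_ups:
            learnings, urls = await deep_research(
                follow_up, max(1, breadth // 2), depth - 1, learnings, urls
            )

    return learnings, urls

if __name__ == "__main__":
    print(asyncio.run(deep_research("autonomous trucks and jobs", breadth=2, depth=2)))
```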
Quick start:

```bash
# 1 · Clone & enter
git clone https://github.com/meltmagic/deep-research.git
cd deep-research

# 2 · Add your keys
cp .env.example .env                # edit FIRECRAWL_KEY, OPENAI_KEY or FIREWORKS_KEY

# 3 · Install & run CLI
pip install -r requirements.txt
python -m deep_research             # interactive wizard 🧙

# …or run the REST API instead
docker compose up                   # API 👉 http://localhost:8000/docs
```

Environment variables:

| Key | 🚦 Required | 📝 Description |
|---|---|---|
| FIRECRAWL_KEY | ✅ | Web search + scrape |
| OPENAI_KEY | ✅* | LLM (o3-mini) |
| FIREWORKS_KEY | ✅* | DeepSeek R1 alternative |
| CUSTOM_MODEL + OPENAI_ENDPOINT | ✅* | Any OpenAI-compatible endpoint |
| FIRECRAWL_CONCURRENCY | ▫️ | Parallel calls (default 2) |
| CONTEXT_SIZE | ▫️ | Max tokens to send to LLM (default 128 000) |

\* Provide one of the three LLM options.
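As an illustration of how the two optional knobs are typically consumed (a sketch only — everything beyond the env key names above is an assumption, not the project's code):

```python
import asyncio
import os

# Read the optional tuning knobs with the documented defaults.
FIRECRAWL_CONCURRENCY = int(os.getenv("FIRECRAWL_CONCURRENCY", "2"))
CONTEXT_SIZE = int(os.getenv("CONTEXT_SIZE", "128000"))

# Cap parallel Firecrawl calls with a semaphore.
firecrawl_semaphore = asyncio.Semaphore(FIRECRAWL_CONCURRENCY)

def trim_to_context(text: str, max_tokens: int = CONTEXT_SIZE) -> str:
    """Very rough token-aware trim (~4 characters per token heuristic)."""
    max_chars = max_tokens * 4
    return text if len(text) <= max_chars else text[:max_chars]
```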
```console
$ python -m deep_research
❓ What would you like to research? Impact of autonomous trucks on jobs
📏 Breadth (2-10) [4]:
📐 Depth (1-5) [2]:
📄 Generate report or answer? (report/answer) [report]:
```

Outputs land in the `reports/` folder: `report.md` 📓 or `answer.md` 💬
Running the Docker stack also exposes a small REST API:

| Verb | Endpoint | 🎯 Purpose |
|---|---|---|
| POST | /api/research | Concise answer + metadata |
| POST | /api/generate-report | Full Markdown report |
| GET | /api/healthz | Liveness ping |
Example 👇

```http
POST /api/research
Content-Type: application/json

{
  "query": "Why did the 2024 GPU shortage happen?",
  "breadth": 3,
  "depth": 2
}
```
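From Python, the same call might look like this (a sketch assuming the API started by `docker compose up` is listening on localhost:8000; the exact response schema isn't shown here):

```python
import requests

# Hit the local API started with `docker compose up`.
resp = requests.post(
    "http://localhost:8000/api/research",
    json={
        "query": "Why did the 2024 GPU shortage happen?",
        "breadth": 3,
        "depth": 2,
    },
    timeout=600,  # deep research can take a while
)
resp.raise_for_status()
print(resp.json())  # concise answer + metadata
```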
Under the hood:

```text
CLI / REST ↔️ Orchestrator 🔄
        ↙️          ↘️
  Firecrawl 🌐      LLM 🎓
        ↘️          ↙️
Markdown Report 📑 / Exact Answer ⚡
```
On the roadmap:

- 🔌 WebSocket progress stream
- 🛡️ RAG verification to curb hallucinations
- 🖼️ Streamlit front-end dashboard
- 🔐 Optional API-key auth middleware
📬 Open an issue to suggest features!
- Fork → `git checkout -b magic-feature`
- `poetry install` or `pip install -r requirements-dev.txt`
- Lint with `ruff`, type-check with `mypy`, test with `pytest -q`
- PR with emoji-rich description 😉
MIT © 2025 MeltMagic Agency — Fork it, ship it, make magic. ✨