A Model Context Protocol (MCP) server that provides AI-enhanced code search capabilities using Sourcegraph.
- Table of contents
- Overview
- Features
- Prerequisites
- Installation
- Configuration
- Usage with AI tools
- MCP tools
- Migrating from upstream
- Development
This MCP server integrates with Sourcegraph, a universal code search platform that enables searching across multiple repositories and codebases. It provides powerful search capabilities with advanced query syntax, making it ideal for AI assistants that need to find and understand code patterns across large codebases.
This is an actively maintained fork of divar-ir/sourcegraph-mcp, created to fix upstream bugs and keep the Sourcegraph MCP server supported.
Note
Key additions vs. upstream:
- Python 3.12+ support (compatible with 3.10+)
- Functional package structure
- Tunable config via env vars, rather than hardcoded paths
- FastMCP dependency upgraded to ≥2.11.2 (to integrate patches)
- Updated to decorator-based tool registration (to conform to the new version & for convenience)
- Code search: Search across codebases using Sourcegraph's powerful query language
- Advanced query language: Support for regex patterns, file filters, language filters, and boolean operators
- Repository discovery: Find repositories by name and explore their structure
- Content fetching: Browse repository files and directories
- AI integration: Designed for LLM integration with guided search prompts
- Python 3.10–3.12+ compatible: Fully tested and working on Python 3.10, 3.11, and 3.12+
- Sourcegraph Instance: Access to a Sourcegraph instance (either sourcegraph.com or self-hosted)
- Python 3.10+: Required for running the MCP server (Python 3.12+ fully supported)
- uv (optional): Modern Python package manager for easier dependency management
# Clone the repository
git clone https://github.com/akbad/sourcegraph-mcp.git
cd sourcegraph-mcp
# Install dependencies
uv sync
# Run the server
uv run python -m src.main

# Install directly from GitHub
pip install git+https://github.com/akbad/sourcegraph-mcp.git
# Or clone and install locally
git clone https://github.com/akbad/sourcegraph-mcp.git
cd sourcegraph-mcp
pip install -e .
# Run the server
python -m src.main

# Pull from GitHub Container Registry
docker pull ghcr.io/akbad/sourcegraph-mcp:latest
# Or build locally
git clone https://github.com/akbad/sourcegraph-mcp.git
cd sourcegraph-mcp
docker build -t sourcegraph-mcp .
# Run the container with default ports
docker run -p 8000:8000 -p 8080:8080 \
-e SRC_ENDPOINT=https://sourcegraph.com \
-e SRC_ACCESS_TOKEN=your-token \
ghcr.io/akbad/sourcegraph-mcp:latest
# Or run with custom ports
docker run -p 9000:9000 -p 9080:9080 \
-e SRC_ENDPOINT=https://sourcegraph.com \
-e SRC_ACCESS_TOKEN=your-token \
-e MCP_SSE_PORT=9000 \
-e MCP_STREAMABLE_HTTP_PORT=9080 \
  ghcr.io/akbad/sourcegraph-mcp:latest

SRC_ENDPOINT: Sourcegraph instance URL (e.g., https://sourcegraph.com)
SRC_ACCESS_TOKEN: Authentication token for private Sourcegraph instances
MCP_SSE_PORT: SSE server port (default: 8000)
MCP_STREAMABLE_HTTP_PORT: HTTP server port (default: 8080)
FASTMCP_SSE_PATH: SSE endpoint path (default: /sourcegraph/sse)
FASTMCP_MESSAGE_PATH: SSE messages endpoint path (default: /sourcegraph/messages/)
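Putting the variables above together, a typical .env file looks like this (the values shown are the documented defaults plus placeholder credentials — substitute your own endpoint and token):

```shell
SRC_ENDPOINT=https://sourcegraph.com
SRC_ACCESS_TOKEN=your-token
MCP_SSE_PORT=8000
MCP_STREAMABLE_HTTP_PORT=8080
FASTMCP_SSE_PATH=/sourcegraph/sse
FASTMCP_MESSAGE_PATH=/sourcegraph/messages/
```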
After running the MCP server, add the following to your .cursor/mcp.json file:
{
"mcpServers": {
"sourcegraph": {
"url": "http://localhost:8080/sourcegraph/mcp/"
}
}
}

After running the MCP server, add it to Claude Code using the claude CLI:
claude mcp add --transport http sourcegraph --scope user \
  http://localhost:8080/sourcegraph/mcp/

Verify the server was added:
claude mcp list

After running the MCP server, add the following to your ~/.codex/config.toml:
[mcp_servers.sourcegraph]
url = "http://localhost:8080/sourcegraph/mcp/"
transport = "http"

Verify the server is configured:
codex mcp list

After running the MCP server, add the following to your ~/.gemini/settings.json:
{
"mcpServers": {
"sourcegraph": {
"transport": "http",
"url": "http://localhost:8080/sourcegraph/mcp/"
}
}
}

Verify the server is configured:
gemini mcp list

Note: If you customized the port using MCP_STREAMABLE_HTTP_PORT, update the URLs above accordingly.
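All of the client configs above point at the same streamable-HTTP endpoint, so the URL can be derived from the port variable. A minimal sketch (the path segment `/sourcegraph/mcp/` is taken from the configs above):

```python
import os

# Build the MCP endpoint URL, honoring a customized port if one is set.
# Falls back to the documented default of 8080.
port = os.environ.get("MCP_STREAMABLE_HTTP_PORT", "8080")
url = f"http://localhost:{port}/sourcegraph/mcp/"
print(url)  # → http://localhost:8080/sourcegraph/mcp/ when the port is unset
```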
This server provides three powerful tools for AI assistants:
Search across codebases using Sourcegraph's advanced query syntax with support for regex, language filters, and boolean operators.
Example queries:
repo:github.com/kubernetes/kubernetes error handler
lang:python class UserService
file:\.go$ func SendMessage
Generate a context-aware guide for constructing effective search queries based on your specific objective. This tool helps AI assistants learn how to use Sourcegraph's query syntax effectively.
Parameters:
objective: What you're trying to find or accomplish
Retrieve file contents or explore directory structures from repositories.
Parameters:
repo: Repository path (e.g., "github.com/org/project")
path: File or directory path within the repository
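Under the hood, MCP clients invoke tools like this one via JSON-RPC 2.0 `tools/call` requests. A sketch of what such a payload looks like — the tool name `fetch_content` and argument keys mirror the parameters documented above, but a client would discover the server's exact tool names via `tools/list`:

```python
import json

# Illustrative JSON-RPC 2.0 payload for an MCP "tools/call" request.
# The "arguments" keys follow the repo/path parameters described above.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fetch_content",
        "arguments": {
            "repo": "github.com/org/project",
            "path": "README.md",
        },
    },
}
print(json.dumps(request, indent=2))
```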
How to switch to this maintained fork if you're currently using the upstream:
Important
Breaking changes:
- FastMCP version: Minimum version is now 2.11.2+ (was 2.4.0)
- Environment variables: Add these to your .env file:
  FASTMCP_SSE_PATH=/sourcegraph/sse
  FASTMCP_MESSAGE_PATH=/sourcegraph/messages/
- Python version: Ensure you're using Python 3.10+ (3.12+ fully supported)
- Update the git remote:
  cd /path/to/your/sourcegraph-mcp
  git remote set-url origin https://github.com/akbad/sourcegraph-mcp.git
- Pull the latest changes:
  git pull origin master
- Update dependencies:
  uv sync  # or: pip install --upgrade -e .
- Add the new environment variables to .env:
  echo "FASTMCP_SSE_PATH=/sourcegraph/sse" >> src/.env
  echo "FASTMCP_MESSAGE_PATH=/sourcegraph/messages/" >> src/.env
- Restart the Sourcegraph MCP server:
  uv run python -m src.main
# Check code style
uv run ruff check src/
# Format code
uv run ruff format src/
# Fix auto-fixable issues
uv run ruff check --fix src/