
Commit 1b80672

Merge pull request #1 from rocknroll17/feature/disable-embedding
feat: make embedding services optional
2 parents 08c246e + 32149f5 commit 1b80672

File tree: 3 files changed (+56, -22 lines)

- README.md
- src/config.py
- src/server.py

README.md

Lines changed: 41 additions & 12 deletions
@@ -25,7 +25,7 @@ The MCP MariaDB Server exposes a set of tools for interacting with MariaDB datab
 - Retrieving table schemas
 - Executing safe, read-only SQL queries
 - Creating and managing vector stores for embedding-based search
-- Integrating with embedding providers (currently OpenAI, Gemini, and HuggingFace)
+- Integrating with embedding providers (currently OpenAI, Gemini, and HuggingFace) (optional)
 
 ---
 
@@ -63,7 +63,9 @@ The MCP MariaDB Server exposes a set of tools for interacting with MariaDB datab
   - Creates a new database if it doesn't exist.
   - Parameters: `database_name` (string, required)
 
-### Vector Store & Embedding Tools
+### Vector Store & Embedding Tools (optional)
+
+**Note**: These tools are only available when `EMBEDDING_PROVIDER` is configured. If no embedding provider is set, these tools will be disabled.
 
 - **create_vector_store**
   - Creates a new vector store (table) for embeddings.
@@ -89,6 +91,10 @@ The MCP MariaDB Server exposes a set of tools for interacting with MariaDB datab
 
 ## Embeddings & Vector Store
 
+### Overview
+
+The MCP MariaDB Server provides **optional** embedding and vector store capabilities. These features can be enabled by configuring an embedding provider, or completely disabled if you only need standard database operations.
+
 ### Supported Providers
 
 - **OpenAI**
@@ -97,11 +103,10 @@ The MCP MariaDB Server exposes a set of tools for interacting with MariaDB datab
 
 ### Configuration
 
-- `EMBEDDING_PROVIDER`: Set to `openai` (default option), can change it to required providers
+- `EMBEDDING_PROVIDER`: Set to `openai`, `gemini`, `huggingface`, or leave unset to disable
 - `OPENAI_API_KEY`: Required if using OpenAI embeddings
-- GEMINI_API_KEY`: Required if using Gemini embeddings
-- Open models from HUGGINGFACE: Required open model currently provided option for "intfloat/multilingual-e5-large-instruct" & "BAAI/bge-m3"
-
+- `GEMINI_API_KEY`: Required if using Gemini embeddings
+- `HF_MODEL`: Required if using HuggingFace embeddings (e.g., "intfloat/multilingual-e5-large-instruct" or "BAAI/bge-m3")
 ### Model Selection
 
 - Default and allowed models are configurable in code (`DEFAULT_OPENAI_MODEL`, `ALLOWED_OPENAI_MODELS`)
@@ -130,13 +135,14 @@ All configuration is via environment variables (typically set in a `.env` file):
 | `DB_NAME` | Default database (optional; can be set per query) | No | |
 | `MCP_READ_ONLY` | Enforce read-only SQL mode (`true`/`false`) | No | `true` |
 | `MCP_MAX_POOL_SIZE` | Max DB connection pool size | No | `10` |
-| `EMBEDDING_PROVIDER` | Embedding provider (`openai`/`gemini`/`huggingface`) | No | `openai` |
-| `OPENAI_API_KEY` | API key for OpenAI embeddings | Yes (if using embeddings) | |
-| `GEMINII_API_KEY` | API key for Gemini embeddings | Yes (if using embeddings) | |
-| `HF_MODEL` | Open models from Huggingface | Yes (if using embeddings) | |
+| `EMBEDDING_PROVIDER` | Embedding provider (`openai`/`gemini`/`huggingface`) | No |`None`(Disabled)|
+| `OPENAI_API_KEY` | API key for OpenAI embeddings | Yes (if EMBEDDING_PROVIDER=openai) | |
+| `GEMINI_API_KEY` | API key for Gemini embeddings | Yes (if EMBEDDING_PROVIDER=gemini) | |
+| `HF_MODEL` | Open models from Huggingface | Yes (if EMBEDDING_PROVIDER=huggingface) | |
 
 #### Example `.env` file
 
+**With Embedding Support (OpenAI):**
 ```dotenv
 DB_HOST=localhost
 DB_USER=your_db_user
@@ -153,6 +159,17 @@ GEMINI_API_KEY=AI...
 HF_MODEL="BAAI/bge-m3"
 ```
 
+**Without Embedding Support:**
+```dotenv
+DB_HOST=localhost
+DB_USER=your_db_user
+DB_PASSWORD=your_db_password
+DB_PORT=3306
+DB_NAME=your_default_database
+MCP_READ_ONLY=true
+MCP_MAX_POOL_SIZE=10
+```
+
 ---
 
 ## Installation & Setup
@@ -244,9 +261,9 @@ HF_MODEL="BAAI/bge-m3"
 ```
 ---
 
-## Integration - Claude desktop/Cursor/Windsurf
+## Integration - Claude desktop/Cursor/Windsurf/VSCode
 
-```python
+```json
 {
   "mcpServers": {
     "MariaDB_Server": {
@@ -262,6 +279,18 @@ HF_MODEL="BAAI/bge-m3"
     }
   }
 }
 ```
+or
+**If already running MCP server**
+```json
+{
+  "servers": {
+    "mariadb-mcp-server": {
+      "url": "http://{host}:9001/sse",
+      "type": "sse"
+    }
+  }
+}
+```
 ---
 
 ## Logging
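
The updated configuration table pairs each provider value with exactly one required variable. As a rough illustration of those rules, here is a minimal, hypothetical helper (not part of the repository; only the environment variable names come from the README) that resolves the documented settings:

```python
import os

# Per the README table: each provider requires exactly one extra variable.
REQUIRED_BY_PROVIDER = {
    "openai": "OPENAI_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "huggingface": "HF_MODEL",
}

def resolve_embedding_config(env=os.environ):
    """Return (provider, required_value), or (None, None) when embeddings are disabled."""
    provider = (env.get("EMBEDDING_PROVIDER") or "").lower()
    required = REQUIRED_BY_PROVIDER.get(provider)
    if required is None:
        return None, None  # unset or unrecognised provider: vector-store tools stay off
    value = env.get(required)
    if not value:
        raise ValueError(f"{required} must be set when EMBEDDING_PROVIDER={provider}")
    return provider, value

# Disabled: no EMBEDDING_PROVIDER in the environment.
print(resolve_embedding_config({}))
# Enabled: OpenAI provider with its API key.
print(resolve_embedding_config({"EMBEDDING_PROVIDER": "openai", "OPENAI_API_KEY": "sk-..."}))
```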

src/config.py

Lines changed: 4 additions & 3 deletions
@@ -59,7 +59,8 @@
 
 # --- Embedding Configuration ---
 # Provider selection ('openai' or 'gemini' or 'huggingface')
-EMBEDDING_PROVIDER = os.getenv("EMBEDDING_PROVIDER", "openai").lower()
+EMBEDDING_PROVIDER = os.getenv("EMBEDDING_PROVIDER")
+EMBEDDING_PROVIDER = EMBEDDING_PROVIDER.lower() if EMBEDDING_PROVIDER else None
 # API Keys
 OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
 GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")
@@ -86,8 +87,8 @@
         logger.error("EMBEDDING_PROVIDER is 'huggingface' but HF_MODEL is missing.")
         raise ValueError("HuggingFace model is required when EMBEDDING_PROVIDER is 'huggingface'.")
 else:
-    logger.error(f"Invalid EMBEDDING_PROVIDER specified: '{EMBEDDING_PROVIDER}'. Use 'openai' or 'gemini' or 'huggingface'.")
-    raise ValueError(f"Invalid EMBEDDING_PROVIDER: '{EMBEDDING_PROVIDER}'.")
+    EMBEDDING_PROVIDER = None
+    logger.info(f"No EMBEDDING_PROVIDER selected or it is set to None. Disabling embedding features.")
 
 logger.info(f"Read-only mode: {MCP_READ_ONLY}")
 logger.info(f"Logging to console and to file: {LOG_FILE_PATH} (Level: {LOG_LEVEL}, MaxSize: {LOG_MAX_BYTES}B, Backups: {LOG_BACKUP_COUNT})")
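
The key behavioural change is in the first hunk: `os.getenv` no longer falls back to `"openai"`, so `EMBEDDING_PROVIDER` can legitimately be `None`, and the conditional `.lower()` avoids calling a method on `None`. A minimal, self-contained sketch of that normalisation step (the function name and sample values are illustrative only):

```python
def normalize_provider(raw_value):
    """Mirror the two-line pattern in src/config.py: lowercase only when a value was provided."""
    return raw_value.lower() if raw_value else None

# Unset (None) or empty string both disable embedding features downstream.
assert normalize_provider(None) is None
assert normalize_provider("") is None
# Any configured value is lowercased so the later if/elif provider checks match.
assert normalize_provider("OpenAI") == "openai"
assert normalize_provider("HUGGINGFACE") == "huggingface"
```

The second hunk then turns an unrecognised provider from a hard `ValueError` into a logged fallback to `None`, which is what lets the server start with embeddings disabled.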

src/server.py

Lines changed: 11 additions & 7 deletions
@@ -12,14 +12,17 @@
 # Import configuration settings
 from config import (
     DB_HOST, DB_PORT, DB_USER, DB_PASSWORD, DB_NAME,
-    MCP_READ_ONLY, MCP_MAX_POOL_SIZE, logger
+    MCP_READ_ONLY, MCP_MAX_POOL_SIZE, EMBEDDING_PROVIDER,
+    logger
 )
 
 # Import EmbeddingService for vector store creation
 from embeddings import EmbeddingService
 
 # Singleton instance for embedding service
-embedding_service = EmbeddingService()
+embedding_service = None
+if EMBEDDING_PROVIDER is not None:
+    embedding_service = EmbeddingService()
 
 from asyncmy.errors import Error as AsyncMyError
 
@@ -698,11 +701,12 @@ def register_tools(self):
         self.mcp.add_tool(self.get_table_schema)
         self.mcp.add_tool(self.execute_sql)
         self.mcp.add_tool(self.create_database)
-        self.mcp.add_tool(self.create_vector_store)
-        self.mcp.add_tool(self.list_vector_stores)
-        self.mcp.add_tool(self.delete_vector_store)
-        self.mcp.add_tool(self.insert_docs_vector_store)
-        self.mcp.add_tool(self.search_vector_store)
+        if EMBEDDING_PROVIDER is not None:
+            self.mcp.add_tool(self.create_vector_store)
+            self.mcp.add_tool(self.list_vector_stores)
+            self.mcp.add_tool(self.delete_vector_store)
+            self.mcp.add_tool(self.insert_docs_vector_store)
+            self.mcp.add_tool(self.search_vector_store)
         logger.info("Registered MCP tools explicitly.")
 
 # --- Async Main Server Logic ---
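
Both hunks apply the same guard: the `EmbeddingService` singleton is only constructed, and the five vector-store tools are only registered, when `EMBEDDING_PROVIDER` is set. The sketch below shows that conditional-registration pattern with a toy registry class; it does not use the real FastMCP API, and the class and standalone function names in it are illustrative only.

```python
class ToyRegistry:
    """Stand-in for the MCP tool registry; only the add_tool name echoes src/server.py."""

    def __init__(self):
        self.tools = []

    def add_tool(self, func):
        self.tools.append(func.__name__)

def get_table_schema(): ...
def execute_sql(): ...
def create_vector_store(): ...
def search_vector_store(): ...

def register_tools(registry, embedding_provider):
    # Core database tools are always exposed.
    registry.add_tool(get_table_schema)
    registry.add_tool(execute_sql)
    # Vector-store tools appear only when an embedding provider is configured.
    if embedding_provider is not None:
        registry.add_tool(create_vector_store)
        registry.add_tool(search_vector_store)

registry = ToyRegistry()
register_tools(registry, embedding_provider=None)
print(registry.tools)  # ['get_table_schema', 'execute_sql'] -> vector tools hidden

registry = ToyRegistry()
register_tools(registry, embedding_provider="openai")
print(registry.tools)  # all four tools registered
```

With this gating, clients simply never see the vector-store tools when embeddings are disabled, rather than receiving runtime errors from them.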
