LlamaIndex: Add example using MCP #1032
Merged

Commits (7 total; the diff below shows changes from 3 commits)
`a807a14` LlamaIndex: This and that. Naming things. (amotl)
`08e0167` LlamaIndex: Add example using MCP (amotl)
`6f7cd57` LlamaIndex: Make model configurable (amotl)
`11445fa` LlamaIndex: Optionally use debugging when configuring llm (amotl)
`a850d9f` LlamaIndex: Address review comments by CodeRabbit (amotl)
`d21b80d` LlamaIndex: Use `cratedb-about 0.0.6` (amotl)
`a43ee4e` LlamaIndex: Use OCI `cratedb-mcp:main` (amotl)
Files changed (5)

New file (57 lines): the `boot` module with the shared LLM configuration helper, imported by both demo programs.

```python
import os
from typing import Tuple

import openai
from langchain_openai import AzureOpenAIEmbeddings
from langchain_openai import OpenAIEmbeddings
from llama_index.core.base.embeddings.base import BaseEmbedding
from llama_index.core.llms import LLM
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.langchain import LangchainEmbedding

MODEL_NAME = "gpt-4o"


def configure_llm() -> Tuple[LLM, BaseEmbedding]:
    """
    Configure LLM. Use either vanilla OpenAI, or Azure OpenAI.
    """

    openai.api_type = os.getenv("OPENAI_API_TYPE")
    openai.azure_endpoint = os.getenv("OPENAI_AZURE_ENDPOINT")
    openai.api_version = os.getenv("OPENAI_AZURE_API_VERSION")
    openai.api_key = os.getenv("OPENAI_API_KEY")

    if openai.api_type == "openai":
        llm = OpenAI(
            model=MODEL_NAME,
            temperature=0.0,
            api_key=os.getenv("OPENAI_API_KEY"),
        )
    elif openai.api_type == "azure":
        llm = AzureOpenAI(
            model=MODEL_NAME,
            temperature=0.0,
            engine=os.getenv("LLM_INSTANCE"),
            azure_endpoint=os.getenv("OPENAI_AZURE_ENDPOINT"),
            api_key=os.getenv("OPENAI_API_KEY"),
            api_version=os.getenv("OPENAI_AZURE_API_VERSION"),
        )
    else:
        raise ValueError(f"OpenAI API type not defined or invalid: {openai.api_type}")

    if openai.api_type == "openai":
        embed_model = LangchainEmbedding(OpenAIEmbeddings(model=MODEL_NAME))
    elif openai.api_type == "azure":
        embed_model = LangchainEmbedding(
            AzureOpenAIEmbeddings(
                azure_endpoint=os.getenv("OPENAI_AZURE_ENDPOINT"),
                model=os.getenv("EMBEDDING_MODEL_INSTANCE"),
            )
        )
    else:
        embed_model = None

    return llm, embed_model
```
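A quick way to exercise the helper above is to point it at the vanilla OpenAI code path and inspect what it returns. The sketch below is illustrative only and not part of the changeset: the environment values are placeholders, and it assumes the module above is importable as `boot`, as the demo programs in this pull request do.

```python
import os

from boot import configure_llm

# Placeholder configuration for the vanilla OpenAI branch; replace the key
# with a real one. These values only illustrate which variables the helper
# reads and are not part of the changeset itself.
os.environ["OPENAI_API_TYPE"] = "openai"
os.environ["OPENAI_API_KEY"] = "sk-replace-me"

llm, embed_model = configure_llm()

# Expect a llama-index OpenAI LLM and a LangChain-backed embedding wrapper.
print(type(llm).__name__)
print(type(embed_model).__name__)
```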
New file `demo_mcp.py` (89 lines): a text-to-SQL agent that talks to the CrateDB MCP server.

````python
"""
Use an LLM to query a database in human language via MCP.
Example code using LlamaIndex with vanilla OpenAI and Azure OpenAI.

https://github.com/run-llama/llama_index/tree/main/llama-index-integrations/tools/llama-index-tools-mcp

## Start CrateDB MCP Server
```
export CRATEDB_CLUSTER_URL="http://localhost:4200/"
cratedb-mcp serve --transport=streamable-http
```

## Usage
```
source env.standalone
export OPENAI_API_KEY=sk-XJZ7pfog5Gp8Kus8D--invalid--0CJ5lyAKSefZLaV1Y9S1
python demo_mcp.py
```
"""
import asyncio
import os

from cratedb_about.instruction import Instructions

from dotenv import load_dotenv
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.core.llms import LLM
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

from boot import configure_llm


class Agent:

    def __init__(self, llm: LLM):
        self.llm = llm

    async def get_tools(self):
        # Connect to the CrateDB MCP server using `streamable-http` transport.
        mcp_url = os.getenv("CRATEDB_MCP_URL", "http://127.0.0.1:8000/mcp/")
        mcp_client = BasicMCPClient(mcp_url)
        mcp_tool_spec = McpToolSpec(
            client=mcp_client,
            # Optional: Filter the tools by name
            # allowed_tools=["tool1", "tool2"],
            # Optional: Include resources in the tool list
            # include_resources=True,
        )
        return await mcp_tool_spec.to_tool_list_async()

    async def get_agent(self):
        return FunctionAgent(
            name="Agent",
            description="CrateDB text-to-SQL agent",
            llm=self.llm,
            tools=await self.get_tools(),
            system_prompt=Instructions.full(),
        )

    async def aquery(self, query):
        return await (await self.get_agent()).run(query)

    def query(self, query):
        print("Inquiring MCP server")
        return asyncio.run(self.aquery(query))


def main():
    """
    Use an LLM to query a database in human language.
    """

    # Configure application.
    load_dotenv()
    llm, embed_model = configure_llm()

    # Use an agent that uses the CrateDB MCP server.
    agent = Agent(llm)

    # Invoke an inquiry.
    print("Running query")
    QUERY_STR = "What is the average value for sensor 1?"
    answer = agent.query(QUERY_STR)
    print("Query was:", QUERY_STR)
    print("Answer was:", answer)


if __name__ == "__main__":
    main()
````
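Before wiring the agent, it can help to confirm that the MCP endpoint is reachable and to see which tools it advertises. This is a small sketch that reuses the same client classes as above; reading `tool.metadata.name` and `tool.metadata.description` is an assumption about the tool objects returned by `to_tool_list_async()`.

```python
import asyncio
import os

from llama_index.tools.mcp import BasicMCPClient, McpToolSpec


async def list_mcp_tools():
    # Same endpoint default as in the example above.
    mcp_url = os.getenv("CRATEDB_MCP_URL", "http://127.0.0.1:8000/mcp/")
    tools = await McpToolSpec(client=BasicMCPClient(mcp_url)).to_tool_list_async()
    for tool in tools:
        # Assumes the returned tools carry llama-index tool metadata.
        print(f"{tool.metadata.name}: {tool.metadata.description}")


if __name__ == "__main__":
    asyncio.run(list_mcp_tools())
```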
New file (50 lines): a text-to-SQL example that uses LlamaIndex's `NLSQLTableQueryEngine` directly against CrateDB.

```python
"""
Use an LLM to query a database in human language via NLSQLTableQueryEngine.
Example code using LlamaIndex with vanilla OpenAI and Azure OpenAI.
"""

import os
import sqlalchemy as sa

from dotenv import load_dotenv
from llama_index.core.utilities.sql_wrapper import SQLDatabase
from llama_index.core.query_engine import NLSQLTableQueryEngine

from boot import configure_llm


def main():
    """
    Use an LLM to query a database in human language.
    """

    # Configure application.
    load_dotenv()
    llm, embed_model = configure_llm()

    # Configure database connection and query engine.
    print("Connecting to CrateDB")
    engine_crate = sa.create_engine(os.getenv("CRATEDB_SQLALCHEMY_URL"))
    engine_crate.connect()

    print("Creating LlamaIndex QueryEngine")
    sql_database = SQLDatabase(engine_crate, include_tables=[os.getenv("CRATEDB_TABLE_NAME")])
    query_engine = NLSQLTableQueryEngine(
        sql_database=sql_database,
        tables=[os.getenv("CRATEDB_TABLE_NAME")],
        llm=llm,
        embed_model=embed_model,
    )

    # Invoke an inquiry.
    print("Running query")
    QUERY_STR = "What is the average value for sensor 1?"
    answer = query_engine.query(QUERY_STR)
    print(answer.get_formatted_sources())
    print("Query was:", QUERY_STR)
    print("Answer was:", answer)
    print(answer.metadata)


if __name__ == "__main__":
    main()
```
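The question "What is the average value for sensor 1?" presupposes a populated `time_series_data` table. This changeset does not show a seeding step, so the following is a hypothetical sketch; the column names and types are assumptions and should be adapted to whatever schema you actually use.

```python
import os

import sqlalchemy as sa
from dotenv import load_dotenv

load_dotenv()

# Connection URL as used by the example; falls back to a local CrateDB.
engine = sa.create_engine(
    os.getenv("CRATEDB_SQLALCHEMY_URL", "crate://crate@localhost:4200/")
)

with engine.connect() as connection:
    # Assumed schema: a timestamp, a sensor label, and a numeric reading.
    connection.execute(sa.text("""
        CREATE TABLE IF NOT EXISTS time_series_data (
            ts TIMESTAMP WITH TIME ZONE,
            sensor_id TEXT,
            value DOUBLE PRECISION
        )
    """))
    connection.execute(sa.text(
        "INSERT INTO time_series_data (ts, sensor_id, value) "
        "VALUES (now(), 'sensor 1', 42.42)"
    ))
    # CrateDB refreshes tables periodically; an explicit refresh makes the
    # new row visible to the text-to-SQL query right away.
    connection.execute(sa.text("REFRESH TABLE time_series_data"))
```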
Changed file (8 lines): the environment template for Azure OpenAI now prefixes each assignment with `export`.

```diff
-OPENAI_API_KEY=TODO
-OPENAI_API_TYPE=azure
-OPENAI_AZURE_ENDPOINT=https://TODO.openai.azure.com
-OPENAI_AZURE_API_VERSION=2024-08-01-preview
-LLM_INSTANCE=TODO
-EMBEDDING_MODEL_INSTANCE=TODO
-CRATEDB_SQLALCHEMY_URL="crate://USER:PASSWORD@HOST:4200/?ssl=true"
-CRATEDB_TABLE_NAME=time_series_data
+export OPENAI_API_KEY=TODO
+export OPENAI_API_TYPE=azure
+export OPENAI_AZURE_ENDPOINT=https://TODO.openai.azure.com
+export OPENAI_AZURE_API_VERSION=2024-08-01-preview
+export LLM_INSTANCE=TODO
+export EMBEDDING_MODEL_INSTANCE=TODO
+export CRATEDB_SQLALCHEMY_URL="crate://USER:PASSWORD@HOST:4200/?ssl=true"
+export CRATEDB_TABLE_NAME=time_series_data
```
Changed file (4 lines): the environment template for vanilla OpenAI with a local CrateDB now prefixes each assignment with `export`.

```diff
 # OPENAI_API_KEY=sk-XJZ7pfog5Gp8Kus8D--invalid--0CJ5lyAKSefZLaV1Y9S1
-OPENAI_API_TYPE=openai
-CRATEDB_SQLALCHEMY_URL="crate://crate@localhost:4200/"
-CRATEDB_TABLE_NAME=time_series_data
+export OPENAI_API_TYPE=openai
+export CRATEDB_SQLALCHEMY_URL="crate://crate@localhost:4200/"
+export CRATEDB_TABLE_NAME=time_series_data
```
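Presumably the `export` prefixes were added so the same files can be `source`d from a shell, as the usage notes in `demo_mcp.py` do, while still being readable by `python-dotenv`, which accepts `export KEY=VALUE` lines. A small sketch, assuming the standalone file above is stored as `env.standalone` next to the script:

```python
from dotenv import dotenv_values

# `python-dotenv` tolerates the `export` prefix, so the shell-friendly file
# doubles as a dotenv file. The filename is an assumption based on the
# usage notes in the MCP example ("source env.standalone").
config = dotenv_values("env.standalone")
print(config.get("OPENAI_API_TYPE"))         # -> "openai"
print(config.get("CRATEDB_SQLALCHEMY_URL"))  # -> "crate://crate@localhost:4200/"
```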
Review conversation

Comment: This needs to be adjusted after the next release of `cratedb-mcp` (cratedb-mcp#50).

Reply: Resolved with a43ee4e.