diff --git a/docs/gram/api-clients/assets/configure-mcp.png b/docs/gram/api-clients/assets/configure-mcp.png
new file mode 100644
index 00000000..70b676eb
Binary files /dev/null and b/docs/gram/api-clients/assets/configure-mcp.png differ
diff --git a/docs/gram/api-clients/assets/create-toolset.png b/docs/gram/api-clients/assets/create-toolset.png
new file mode 100644
index 00000000..3b0cb6d9
Binary files /dev/null and b/docs/gram/api-clients/assets/create-toolset.png differ
diff --git a/docs/gram/api-clients/assets/fill-env-vars-toolset.png b/docs/gram/api-clients/assets/fill-env-vars-toolset.png
new file mode 100644
index 00000000..17385ff1
Binary files /dev/null and b/docs/gram/api-clients/assets/fill-env-vars-toolset.png differ
diff --git a/docs/gram/api-clients/assets/gram-new-project.png b/docs/gram/api-clients/assets/gram-new-project.png
new file mode 100644
index 00000000..3050bb00
Binary files /dev/null and b/docs/gram/api-clients/assets/gram-new-project.png differ
diff --git a/docs/gram/api-clients/assets/mcp-details.png b/docs/gram/api-clients/assets/mcp-details.png
new file mode 100644
index 00000000..23be43b2
Binary files /dev/null and b/docs/gram/api-clients/assets/mcp-details.png differ
diff --git a/docs/gram/api-clients/assets/set-server-url.png b/docs/gram/api-clients/assets/set-server-url.png
new file mode 100644
index 00000000..3855145a
Binary files /dev/null and b/docs/gram/api-clients/assets/set-server-url.png differ
diff --git a/docs/gram/api-clients/assets/toolset-created.png b/docs/gram/api-clients/assets/toolset-created.png
new file mode 100644
index 00000000..f7053831
Binary files /dev/null and b/docs/gram/api-clients/assets/toolset-created.png differ
diff --git a/docs/gram/api-clients/assets/upload-openapi-spec.png b/docs/gram/api-clients/assets/upload-openapi-spec.png
new file mode 100644
index 00000000..86cd7c63
Binary files /dev/null and b/docs/gram/api-clients/assets/upload-openapi-spec.png differ
diff --git a/docs/gram/api-clients/assets/vellum-workflow-diagram.png b/docs/gram/api-clients/assets/vellum-workflow-diagram.png
new file mode 100644
index 00000000..0f1b8e3b
Binary files /dev/null and b/docs/gram/api-clients/assets/vellum-workflow-diagram.png differ
diff --git a/docs/gram/api-clients/assets/vellum-workflow-output.png b/docs/gram/api-clients/assets/vellum-workflow-output.png
new file mode 100644
index 00000000..b450d1d4
Binary files /dev/null and b/docs/gram/api-clients/assets/vellum-workflow-output.png differ
diff --git a/docs/gram/api-clients/using-vellum-workflows-sdk-with-gram-mcp-servers.mdx b/docs/gram/api-clients/using-vellum-workflows-sdk-with-gram-mcp-servers.mdx
new file mode 100644
index 00000000..e03a4519
--- /dev/null
+++ b/docs/gram/api-clients/using-vellum-workflows-sdk-with-gram-mcp-servers.mdx
@@ -0,0 +1,331 @@
---
title: "Using Vellum Workflows with Gram-hosted MCP servers"
description: "Build a Vellum Workflow that connects to a Gram-hosted MCP server and uses natural language to query your APIs."
sidebar:
  order: 2
---

[Vellum](https://www.vellum.ai/) Workflows can connect to Model Context Protocol (MCP) servers to interact with external APIs and tools. This guide shows you how to connect a Vellum Workflow to a [Gram-hosted MCP server](/blog/release-gram-beta) using the Push Advisor API from the Gram [core concepts](/mcp/core-concepts) guide.

By the end, you'll have a Workflow that uses natural language to check whether it's safe to push to production.

Find the full code and OpenAPI document in the [Push Advisor API repository](https://github.com/ritza-co/gram-examples/tree/main/push-advisor-api).

## Prerequisites

To follow this tutorial, you need:

- A [Gram account](/product/gram)
- A [Vellum account](https://vellum.ai/) with an API key
- A Python environment set up on your machine
- The `uv` package manager installed on your machine
- Basic familiarity with making API requests

## Creating a Gram MCP server

If you already have a Gram MCP server configured, you can skip to [connecting Vellum to your Gram-hosted MCP server](#connecting-vellum-to-your-gram-hosted-mcp-server). For an in-depth guide to how Gram works and how to create a Gram-hosted MCP server, check out the [core concepts](/mcp/core-concepts) documentation.

### Setting up a Gram project

In the [Gram dashboard](https://app.getgram.ai), click **New Project** to create a new project. Enter a project name and click **Submit**.

![Screenshot of the Gram dashboard showing the New Project modal](./assets/gram-new-project.png)

Once you've created the project, click the **Get Started** button.

Choose **Start from API**. Gram then guides you through the following steps.

#### Step 1: Upload the OpenAPI document

Upload the [Push Advisor OpenAPI document](https://github.com/ritza-co/gram-examples/blob/main/push-advisor-api/openapi.yaml), enter the name of your API, and click **Continue**.

![Screenshot of the upload your OpenAPI document dialog](./assets/upload-openapi-spec.png)

#### Step 2: Create a toolset

Give your toolset a name (for example, `Push Advisor`) and click **Continue**.

![Screenshot of the create toolset dialog](./assets/create-toolset.png)

Notice that the **Name Your Toolset** dialog displays the names of the tools that Gram will generate from your OpenAPI document.

#### Step 3: Configure MCP

Enter a URL slug for the MCP server and click **Continue**.

![Screenshot of the configure MCP dialog](./assets/configure-mcp.png)

Gram will create a new toolset from the OpenAPI document.

Click **Toolsets** in the sidebar to view the Push Advisor toolset.

![Screenshot of the Gram dashboard showing the Push Advisor toolset](./assets/toolset-created.png)

### Configuring environment variables

[Environments](/docs/gram/concepts/environments) store API keys and configuration separately from your toolset logic.

In the **Environments** tab, click the **Default** environment. Then click **Fill for Toolset**, select the **Push Advisor** toolset, and click **Fill Variables** to automatically populate the required variables.

![Screenshot showing the fill for toolset dialog to automatically populate required variables](./assets/fill-env-vars-toolset.png)

The Push Advisor API is hosted at `https://canpushtoprod.abdulbaaridavids04.workers.dev`, so set the `_SERVER_URL` environment variable to `https://canpushtoprod.abdulbaaridavids04.workers.dev`. Click **Save**.

![Screenshot showing the server URL environment variable](./assets/set-server-url.png)

### Publishing an MCP server

Let's make the toolset available as an MCP server.

Go to the **MCP** tab, find the Push Advisor toolset, and click the title of the server.

On the **MCP Details** page, click **Enable**, then **Enable Server**.

![Screenshot of the MCP details page](./assets/mcp-details.png)

Take note of your MCP server URL in the **Hosted URL** section.

[Generate a Gram API key](/docs/gram/concepts/api-keys) in the **Settings** tab.
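
Before wiring the server into Vellum, you can optionally check that the hosted URL and API key work together. The sketch below sends an MCP `initialize` request with `curl`. The `your-server-slug` placeholder and the choice to pass the Gram API key directly in the `Authorization` header are assumptions that mirror the Vellum configuration later in this guide, so adjust them if your server is set up differently.

```bash
# Optional sanity check (illustrative): POST an MCP initialize request to the Gram-hosted server.
# Assumes your-server-slug is the slug from the Configure MCP step and GRAM_KEY holds your Gram API key.
curl -s https://app.getgram.ai/mcp/your-server-slug \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Authorization: $GRAM_KEY" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2025-03-26", "capabilities": {}, "clientInfo": {"name": "curl-check", "version": "0.0.1"}}}'
```

A JSON-RPC result describing the server's capabilities means the server is reachable and the key is accepted; an authentication error usually points to the key or header format.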

## Connecting Vellum to your Gram-hosted MCP server

This section walks you through creating a Vellum Workflow using the Workflow SDK. The Workflow will include an entry point, an agent node, and an output that determines whether today is a good day to push to production.

Run these commands to create the project directory and initialize a Python project:

```bash
# Create project directory
mkdir vellum-workflows-sdk
cd vellum-workflows-sdk

# Initialize with uv
uv init
```

Install the dependencies:

```bash
# Add Vellum SDK
uv add vellum-ai

# Add python-dotenv for environment variable management
uv add python-dotenv
```

Create a `.env` file in the project root with your API keys:

```txt
// .env

VELLUM_API_KEY=your-vellum-api-key-here
GRAM_KEY=your-gram-api-key-here
```

If you don't have a Vellum API key yet, create one by clicking your username (top right) in the dashboard and navigating to **Settings → API Keys**.

Test that your Vellum API key works:

```bash
export VELLUM_API_KEY=$(grep "^VELLUM_API_KEY=" .env | cut -d'=' -f2-)
uv run vellum ping
```

You should see your organization, workspace, and environment information printed to the console.

### Creating the Workflow

A Vellum Workflow has three main components:

- **Inputs** that define what data the Workflow accepts
- **Nodes** that process the data
- **Outputs** that the Workflow returns

For this Workflow, the Agent node can make multiple calls to the MCP server (as many as needed to answer the query).

![Diagram showing the Vellum Workflow with entry point, agent node, and output](./assets/vellum-workflow-diagram.png)

Inside the project directory, create a file called `workflow.py`. Start by defining the input structure:

```python
from vellum.workflows.inputs.base import BaseInputs

class Inputs(BaseInputs):
    """Workflow input variables."""
    query: str
```

This defines a single input field `query` that accepts a string containing the user's question.

Next, define the MCP server connection:

```python
from vellum.workflows.constants import AuthorizationType
from vellum.workflows.references import EnvironmentVariableReference
from vellum.workflows.types.definition import MCPServer

...

# MCP server configuration
mcp_server = MCPServer(
    name="push_advisor",
    url="https://app.getgram.ai/mcp/your_server_slug",
    authorization_type=AuthorizationType.API_KEY,
    api_key_header_key="Authorization",
    api_key_header_value=EnvironmentVariableReference(name="GRAM_KEY"),
)
```

Replace `your_server_slug` with your actual MCP server slug.

Now define the agent node. The `Agent` class is a `ToolCallingNode` that uses the MCP server:

```python
from vellum import ChatMessagePromptBlock, PlainTextPromptBlock, PromptParameters, RichTextPromptBlock
from vellum.workflows.nodes.displayable.tool_calling_node import ToolCallingNode

...

class Agent(ToolCallingNode):
    """Agent node that uses the push_advisor MCP server as a tool."""

    ml_model = "gpt-5-responses"
    prompt_inputs = {"query": Inputs.query}
    max_prompt_iterations = 25

    blocks = [
        ChatMessagePromptBlock(
            chat_role="SYSTEM",
            blocks=[
                RichTextPromptBlock(
                    blocks=[
                        PlainTextPromptBlock(
                            text="You are a helpful assistant with access to the push_advisor MCP server. When users ask questions about pushing to production, you must actively use the available MCP tools to check the current status and provide a direct, clear answer. Do not ask the user what they want - instead, automatically use the appropriate tools and provide a helpful response based on the tool results. Always give a definitive answer when possible."
                        )
                    ]
                )
            ],
        ),
        ChatMessagePromptBlock(
            chat_role="USER",
            blocks=[
                RichTextPromptBlock(
                    blocks=[
                        PlainTextPromptBlock(text="{{ query }}")
                    ]
                )
            ],
        ),
    ]

    parameters = PromptParameters(
        temperature=0,
        max_tokens=1000,
        custom_parameters={"json_mode": False},
    )

    settings = {"stream_enabled": False}

    functions = [mcp_server]
```

The `Agent` class defines a tool-calling node that uses GPT-5 with the `push_advisor` MCP server. The `blocks` list structures the conversation: a system message sets the assistant's role, and a user message injects the query using Jinja templating (`{{ query }}`). The `functions` list connects the MCP server, giving the agent access to its tools.

Create the output node to define how the Workflow returns results:

```python
from vellum.workflows.nodes.displayable.final_output_node import FinalOutputNode
from vellum.workflows.state.base import BaseState

...

class FinalOutput(FinalOutputNode[BaseState, str]):
    """Final output node that returns the agent's text response."""

    class Outputs(FinalOutputNode.Outputs):
        value = Agent.Outputs.text
```

This node extracts the text output from the agent node.

Finally, connect all components:

```python
from vellum.workflows.workflows.base import BaseWorkflow

class Workflow(BaseWorkflow[Inputs, BaseState]):
    """Vellum workflow with Agent node configured to use push_advisor MCP server."""

    graph = Agent >> FinalOutput

    class Outputs(BaseWorkflow.Outputs):
        final_output = FinalOutput.Outputs.value
```

The `graph` defines the execution flow: `Agent >> FinalOutput` means data flows from the `Agent` node to the `FinalOutput` node.

### Running the Workflow

Create a `run.py` file and add the following code:

```python
import os
import sys
from dotenv import load_dotenv
from workflow import Workflow, Inputs

load_dotenv()


def main():
    """Execute the workflow with the provided query."""
    if not os.getenv("VELLUM_API_KEY"):
        print("Error: VELLUM_API_KEY environment variable is not set")
        print("Please set it in your .env file or export it")
        sys.exit(1)

    query = sys.argv[1] if len(sys.argv) > 1 else "Can I push to production?"
    workflow = Workflow()

    print(f"Executing workflow with query: {query}")
    print("-" * 60)

    result = workflow.run(inputs=Inputs(query=query))

    if result.name == "workflow.execution.fulfilled":
        print("\n✓ Workflow completed successfully!")
        print("-" * 60)

        for output_descriptor, output_value in result.outputs:
            if output_descriptor.name == "final_output":
                print(f"\nOutput: {output_value}")
                return

        print("\nWarning: Could not find output. Full result:")
        print(result.outputs)
    else:
        print(f"\n✗ Workflow execution failed: {result.name}")
        if hasattr(result, "body") and hasattr(result.body, "error"):
            error = result.body.error
            print(f"Error: {error.message if hasattr(error, 'message') else str(error)}")
        sys.exit(1)


if __name__ == "__main__":
    main()
```

Run the Workflow with a query:

```bash
uv run python run.py "Is it safe to push to production today?"
```

The output shows the agent's response after it queries the MCP server and evaluates whether pushing to production is safe.
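
Because `run.py` reads the query from the command line, you can ask the same Workflow differently phrased questions without changing any code. These invocations are only illustrative; any natural-language question works:

```bash
# The first argument is passed straight through to the agent as the query
uv run python run.py "Can I push to production?"
uv run python run.py "We want to ship a hotfix this afternoon. Is that a good idea?"
```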

![Screenshot of the Workflow output showing the agent's response about pushing to production](./assets/vellum-workflow-output.png)

## What's next

You now have Vellum Workflows connected to your Gram-hosted MCP server, giving it access to your custom APIs and tools.

Ready to build your own MCP server? [Try Gram today](/product/gram) and see how easy it is to turn any API into agent-ready tools that work with both Anthropic and OpenAI models.
diff --git a/docs/gram/clients/assets/configure-mcp.png b/docs/gram/clients/assets/configure-mcp.png
new file mode 100644
index 00000000..d1ec2566
Binary files /dev/null and b/docs/gram/clients/assets/configure-mcp.png differ
diff --git a/docs/gram/clients/assets/create-toolset.png b/docs/gram/clients/assets/create-toolset.png
new file mode 100644
index 00000000..c2f4edee
Binary files /dev/null and b/docs/gram/clients/assets/create-toolset.png differ
diff --git a/docs/gram/clients/assets/fill-env-vars-toolset.png b/docs/gram/clients/assets/fill-env-vars-toolset.png
new file mode 100644
index 00000000..27fdaced
Binary files /dev/null and b/docs/gram/clients/assets/fill-env-vars-toolset.png differ
diff --git a/docs/gram/clients/assets/gram-new-project.png b/docs/gram/clients/assets/gram-new-project.png
new file mode 100644
index 00000000..56be3fc9
Binary files /dev/null and b/docs/gram/clients/assets/gram-new-project.png differ
diff --git a/docs/gram/clients/assets/mcp-details.png b/docs/gram/clients/assets/mcp-details.png
new file mode 100644
index 00000000..d44d3122
Binary files /dev/null and b/docs/gram/clients/assets/mcp-details.png differ
diff --git a/docs/gram/clients/assets/set-server-url.png b/docs/gram/clients/assets/set-server-url.png
new file mode 100644
index 00000000..af876aec
Binary files /dev/null and b/docs/gram/clients/assets/set-server-url.png differ
diff --git a/docs/gram/clients/assets/toolset-created.png b/docs/gram/clients/assets/toolset-created.png
new file mode 100644
index 00000000..2a45d57a
Binary files /dev/null and b/docs/gram/clients/assets/toolset-created.png differ
diff --git a/docs/gram/clients/assets/upload-openapi-spec.png b/docs/gram/clients/assets/upload-openapi-spec.png
new file mode 100644
index 00000000..b89c8483
Binary files /dev/null and b/docs/gram/clients/assets/upload-openapi-spec.png differ
diff --git a/docs/gram/clients/assets/vellum-add-agent-node.png b/docs/gram/clients/assets/vellum-add-agent-node.png
new file mode 100644
index 00000000..6771307d
Binary files /dev/null and b/docs/gram/clients/assets/vellum-add-agent-node.png differ
diff --git a/docs/gram/clients/assets/vellum-add-system-prompt.png b/docs/gram/clients/assets/vellum-add-system-prompt.png
new file mode 100644
index 00000000..da420efc
Binary files /dev/null and b/docs/gram/clients/assets/vellum-add-system-prompt.png differ
diff --git a/docs/gram/clients/assets/vellum-add-tool-button.png b/docs/gram/clients/assets/vellum-add-tool-button.png
new file mode 100644
index 00000000..43cb159e
Binary files /dev/null and b/docs/gram/clients/assets/vellum-add-tool-button.png differ
diff --git a/docs/gram/clients/assets/vellum-configure-mcp-server.png b/docs/gram/clients/assets/vellum-configure-mcp-server.png
new file mode 100644
index 00000000..33111dd9
Binary files /dev/null and b/docs/gram/clients/assets/vellum-configure-mcp-server.png differ
diff --git a/docs/gram/clients/assets/vellum-connect-workflow-nodes.png b/docs/gram/clients/assets/vellum-connect-workflow-nodes.png
new file mode 100644
index 00000000..faebc6b2
Binary files /dev/null and b/docs/gram/clients/assets/vellum-connect-workflow-nodes.png differ
diff --git a/docs/gram/clients/assets/vellum-create-gram-key-variable.png b/docs/gram/clients/assets/vellum-create-gram-key-variable.png
new file mode 100644
index 00000000..e440afca
Binary files /dev/null and b/docs/gram/clients/assets/vellum-create-gram-key-variable.png differ
diff --git a/docs/gram/clients/assets/vellum-edit-prompt-button.png b/docs/gram/clients/assets/vellum-edit-prompt-button.png
new file mode 100644
index 00000000..08371a02
Binary files /dev/null and b/docs/gram/clients/assets/vellum-edit-prompt-button.png differ
diff --git a/docs/gram/clients/assets/vellum-profile-settings.png b/docs/gram/clients/assets/vellum-profile-settings.png
new file mode 100644
index 00000000..c25c7dd3
Binary files /dev/null and b/docs/gram/clients/assets/vellum-profile-settings.png differ
diff --git a/docs/gram/clients/assets/vellum-select-mcp-server.png b/docs/gram/clients/assets/vellum-select-mcp-server.png
new file mode 100644
index 00000000..a5e512be
Binary files /dev/null and b/docs/gram/clients/assets/vellum-select-mcp-server.png differ
diff --git a/docs/gram/clients/assets/vellum-workflow-response.png b/docs/gram/clients/assets/vellum-workflow-response.png
new file mode 100644
index 00000000..15596440
Binary files /dev/null and b/docs/gram/clients/assets/vellum-workflow-response.png differ
diff --git a/docs/gram/clients/using-vellum-agents-with-gram-mcp-servers.mdx b/docs/gram/clients/using-vellum-agents-with-gram-mcp-servers.mdx
new file mode 100644
index 00000000..a0f33e22
--- /dev/null
+++ b/docs/gram/clients/using-vellum-agents-with-gram-mcp-servers.mdx
@@ -0,0 +1,191 @@
---
title: "Using Vellum agents with Gram-hosted MCP servers"
description: "Learn how to connect Vellum's agents to Gram-hosted MCP servers to enable advanced agentic workflows."
sidebar:
  order: 2
---

[Vellum](https://www.vellum.ai/) agents can connect to Model Context Protocol (MCP) servers to interact with external APIs and tools. This guide shows you how to connect a Vellum agent to a [Gram-hosted MCP server](/blog/release-gram-beta) using the Push Advisor API from the Gram [core concepts](/mcp/core-concepts) guide.

By the end, you'll have a Vellum Workflow that uses natural language to check whether it's safe to push to production.

Find the full code and OpenAPI document in the [Push Advisor API repository](https://github.com/ritza-co/gram-examples/tree/main/push-advisor-api).

## Prerequisites

To follow this tutorial, you need:

- A [Gram account](/product/gram)
- A [Vellum account](https://vellum.ai/)

## Creating a Gram MCP server

If you already have a Gram MCP server configured, you can skip to [connecting Vellum to your Gram-hosted MCP server](#connecting-vellum-to-your-gram-hosted-mcp-server). For an in-depth guide to how Gram works and more details on how to create a Gram-hosted MCP server, check out the [core concepts](/mcp/core-concepts) documentation.

### Setting up a Gram project

In the [Gram dashboard](https://app.getgram.ai), click **New Project** to create a new project. Enter a project name and click **Submit**.

![Screenshot of the Gram dashboard showing the New Project modal](./assets/gram-new-project.png)

Once you've created the project, click the **Get Started** button.

Choose **Start from API**. Gram then guides you through the following steps.

#### Step 1: Upload the OpenAPI document

Upload the [Push Advisor OpenAPI document](https://github.com/ritza-co/gram-examples/blob/main/push-advisor-api/openapi.yaml), enter the name of your API, and click **Continue**.

![Screenshot of the upload your OpenAPI document dialog](./assets/upload-openapi-spec.png)

#### Step 2: Create a toolset

Give your toolset a name (for example, `Push Advisor`) and click **Continue**.

![Screenshot of the create toolset dialog](./assets/create-toolset.png)

Notice that the **Name Your Toolset** dialog displays the names of the tools that Gram will generate from your OpenAPI document.

#### Step 3: Configure MCP

Enter a URL slug for the MCP server and click **Continue**.

![Screenshot of the configure MCP dialog](./assets/configure-mcp.png)

Gram will create a new toolset from the OpenAPI document.

Click **Toolsets** in the sidebar to view the Push Advisor toolset.

![Screenshot of the Gram dashboard showing the Push Advisor toolset](./assets/toolset-created.png)

### Configuring environment variables

[Environments](/docs/gram/concepts/environments) store API keys and configuration separately from your toolset logic.

In the **Environments** tab, click the **Default** environment. Then click **Fill for Toolset**, select the **Push Advisor** toolset, and click **Fill Variables** to automatically populate the required variables.

![Screenshot showing the fill for toolset dialog to automatically populate required variables](./assets/fill-env-vars-toolset.png)

The Push Advisor API is hosted at `https://canpushtoprod.abdulbaaridavids04.workers.dev`, so set the `_SERVER_URL` environment variable to `https://canpushtoprod.abdulbaaridavids04.workers.dev`. Click **Save**.

![Screenshot showing the server URL environment variable](./assets/set-server-url.png)

### Publishing an MCP server

Let's make the toolset available as an MCP server.

Go to the **MCP** tab, find the Push Advisor toolset, and click the title of the server.

On the **MCP Details** page, click **Enable**, then **Enable Server**.

![Screenshot of the MCP details page](./assets/mcp-details.png)

Take note of your MCP server URL in the **Hosted URL** section.

[Generate a Gram API key](/docs/gram/concepts/api-keys) in the **Settings** tab.

## Connecting Vellum to your Gram-hosted MCP server

With the Push Advisor MCP server ready, you can connect it to Vellum by configuring a tool in an agent node.

## Adding a secret environment variable to Vellum

Configuring the MCP server requires your Gram API key. To add it as a secret, navigate to **Profile → Settings → Secrets & Variables → + Add Environment Variable**.

![Screenshot of Vellum profile settings page](./assets/vellum-profile-settings.png)

Enter `GRAM_KEY` for the variable name, paste your Gram API key as the value, and click **Create Variable**.

![Screenshot of creating the GRAM_KEY environment variable in Vellum](./assets/vellum-create-gram-key-variable.png)

## Adding a Vellum agent

If you see a prompt input screen on the **New Workflow** creation screen, send Vellum the following prompt:

```
Create entrypoint, agent, and output nodes
```

Otherwise, use the **Create Workflow** button to create a new Workflow:

- Add an **Agent** node.

  ![Screenshot of adding an agent node to a Vellum Workflow](./assets/vellum-add-agent-node.png)

- Click the **Agent** node, then click **+ Tool**.

  ![Screenshot of the Add Tool button in the agent node](./assets/vellum-add-tool-button.png)

- Select **MCP Server** from the **Tool Type** options.

  ![Screenshot of selecting MCP Server from the tool options](./assets/vellum-select-mcp-server.png)

- Configure the MCP server as follows:

  - **Server Name:** Push Advisor
  - **Server URL:** `https://app.getgram.ai/mcp/your-server-slug` (your Gram MCP server URL)
  - **Authentication:** API Key
  - **API Key Header Name:** Authorization
  - **API Key Value:** GRAM_KEY

  ![Screenshot of the MCP server configuration modal in Vellum](./assets/vellum-configure-mcp-server.png)

- Click **Confirm** to save.

  Vellum automatically discovers the available tools. You should see the MCP server listed under **Tools**.

- Now go to the **Overview** tab and click **Edit Prompt** to add a prompt.

  ![Screenshot of the Edit Prompt button in the agent overview](./assets/vellum-edit-prompt-button.png)

- In the modal, enter this system prompt:

  ```
  Is it safe to push to production today?
  ```

  ![Screenshot of adding a system prompt to the agent](./assets/vellum-add-system-prompt.png)

- Connect the **Entrypoint** node to the **Agent** node, then connect the **Agent** node to the **Output** node. Click the **Output** node and, under **Overview → Output Value**, select **Agent** for the **Node Output** and **Text** for the **Output Name**.

  ![Screenshot of connecting Workflow nodes in Vellum](./assets/vellum-connect-workflow-nodes.png)

- Click **▶️ Run**. The Workflow should return a response indicating whether pushing to production is safe, based on the Push Advisor server's evaluation.

  ![Screenshot of the Workflow response showing the agent output](./assets/vellum-workflow-response.png)

## Troubleshooting

Let's go through some common issues and how to fix them.

### MCP client not connecting

If the MCP client can't connect to your server:

- Verify the server URL is correct.
- Check that the MCP server is published as public in Gram.
- For authenticated servers, ensure your Gram API key is valid.
- Test the connection using the Gram Playground first.

### Tool calls not working

If the AI agent isn't calling the MCP tools:

- Ensure the MCP server is properly configured in the Agent node.
- Check that your AI model has sufficient context about available tools.
- Try being more explicit in your prompts about using the Push Advisor tool.

### Authentication errors

For authenticated servers:

- Verify your Gram API key in the dashboard under **Settings → API Keys**.
- Ensure the authorization header format is correct.
- Check that environment variables are correctly set in Gram.

## What's next

You now have Vellum connected to a Gram-hosted MCP server, enabling AI-powered automation workflows with access to your APIs and tools.

Ready to build your own MCP server? [Try Gram today](/product/gram) and see how easy it is to turn any API into agent-ready tools.