Commit 209f0ea

docs

1 parent c38f116 commit 209f0ea

7 files changed: +62 -5 lines changed

CONTRIBUTING.md

Lines changed: 8 additions & 0 deletions
@@ -19,6 +19,8 @@ Install [Pants Build](https://www.pantsbuild.org/stable/docs/getting-started/ins
 
 On Linux, you can run `./get-pants.sh` available in the repo root, as described/recommended in the Pants Installation docs.
 
+Install [Podman](https://podman.io/) to be able to build images like the `Next Gen UI MCP Server`.
+
 ### VSCode
 
 Run Pants export to create a virtual env

@@ -88,6 +90,12 @@ pants run libs/next_gen_ui_llama_stack/agent_test.py
 
 # Run formatter, linter, check
 pants fmt lint check ::
+
+# Build all packages including python and docker
+pants package ::
+
+# Build only Python packages
+pants package --filter-target-type=python_distribution ::
 ```
 
 ### Dependency Management

docs/guide/ai_apps_binding/index.md

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ Binding UI Agent core functionality into your AI application (AI assistant backe
 
 ## AI Protocol bindings/servers
 
-* MCP - WIP
+* MCP - [MCP Library](mcp.md) and [MCP Container](mcp-containers.md)
 * A2A - WIP
 * [ACP](acp.md) - Tech Preview

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
+{%
+include-markdown "../../../libs/next_gen_ui_mcp/README-containers.md"
+%}

Lines changed: 3 additions & 0 deletions

@@ -0,0 +1,3 @@
+{%
+include-markdown "../../../libs/next_gen_ui_mcp/README.md"
+%}
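The two new docs pages above pull content in via `include-markdown` directives, which assumes the `mkdocs-include-markdown-plugin` is enabled in `mkdocs.yml`. A minimal fragment (illustrative; the plugin registration is not part of this commit's diff):

```yaml
plugins:
  - include-markdown
```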

libs/next_gen_ui_mcp/README-containers.md

Lines changed: 38 additions & 2 deletions
@@ -1,4 +1,4 @@
-# Next Gen UI MCP Server
+# Next Gen UI MCP Server Container
 
 Next Gen UI MCP Server container image.
 
@@ -32,7 +32,42 @@ The MCP server container can be configured via environment variables. All config
 | `NGUI_MODEL` | `gpt-4o` | Model name (required for non-MCP providers) |
 | `NGUI_PROVIDER_API_BASE_URL` | - | Base URL for OpenAI-compatible API |
 | `NGUI_PROVIDER_API_KEY` | - | API key for the LLM provider |
-| `NGUI_PROVIDER_LLAMA_URL` | - | LlamaStack server URL (if `llamastac` is used) |
+| `NGUI_PROVIDER_LLAMA_URL` | - | LlamaStack server URL (if `llamastack` is used) |
+
+### Providers
+
+The Next Gen UI MCP server supports three inference providers; the `NGUI_PROVIDER` environment variable selects which provider is used to generate UI components:
+
+#### Provider **`mcp`**
+
+Uses Model Context Protocol sampling to leverage the client's LLM capabilities. No additional configuration is required, as it uses the connected MCP client's model.
+
+#### Provider **`langchain`**
+
+Uses LangChain with OpenAI-compatible APIs.
+
+Requires:
+
+- `NGUI_MODEL`: Model name (e.g., `gpt-4o`, `llama3.2`)
+- `NGUI_PROVIDER_API_KEY`: API key for the provider
+- `NGUI_PROVIDER_API_BASE_URL` (optional): Custom base URL for OpenAI-compatible APIs like Ollama
+
+Examples:
+
+- OpenAI: `https://api.openai.com/v1` (default)
+- Ollama: `http://host.containers.internal:11434/v1`
+
+#### Provider **`llamastack`**
+
+Uses a LlamaStack server for inference.
+
+Requires:
+
+- `NGUI_MODEL`: Model name available on the LlamaStack server
+- `NGUI_PROVIDER_LLAMA_URL`: URL of the LlamaStack server
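The per-provider requirements added above amount to a small validation table. A minimal sketch of that logic, assuming a hypothetical `missing_vars` helper (not part of the server):

```python
# Required environment variables per NGUI_PROVIDER value, as documented above.
REQUIRED_VARS = {
    "mcp": [],  # uses the connected MCP client's model; no extra config
    "langchain": ["NGUI_MODEL", "NGUI_PROVIDER_API_KEY"],
    "llamastack": ["NGUI_MODEL", "NGUI_PROVIDER_LLAMA_URL"],
}


def missing_vars(provider: str, env: dict) -> list:
    """Return the documented required variables that are unset for the chosen provider."""
    if provider not in REQUIRED_VARS:
        raise ValueError(f"unknown NGUI_PROVIDER: {provider}")
    return [name for name in REQUIRED_VARS[provider] if not env.get(name)]
```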

 ### Usage Examples
 

@@ -84,6 +119,7 @@ podman run --rm -it -p 5000:5000 \
 ### Network Configuration
 
 For local development connecting to services running on the host machine:
+
 - Use `host.containers.internal` to access host services (works with Podman and Docker Desktop)
 - For Linux with Podman, you may need to use `host.docker.internal` or the host's IP address
 - Ensure the target services (like Ollama) are accessible from containers

libs/next_gen_ui_mcp/README.md

Lines changed: 7 additions & 2 deletions
@@ -1,4 +1,4 @@
-# Next Gen UI MCP Server
+# Next Gen UI MCP Server Library
 
 This package wraps our NextGenUI agent in a Model Context Protocol (MCP) tool using the standard MCP SDK. Since MCP adoption is strong these days and there is an appetite to use this protocol for agentic AI as well, we also wanted to deliver this way of consuming our agent. The most common way of utilising MCP tools is to provide them to an LLM to choose and execute with certain parameters. This approach doesn't make sense for the NextGenUI agent: you want to call it at a specific moment, after gathering the data for a response, and you don't want the LLM to pass the prompt and JSON content itself, as that may lead to unnecessary errors in the content. It's more natural and reliable to invoke this MCP tool directly, with its parameters, as part of your main application logic.
 
@@ -80,16 +80,21 @@ result = client.tool_runtime.invoke_tool(tool_name="generate_ui", kwargs=input_d
 ## Available MCP Tools
 
 ### `generate_ui`
-The main tool that wraps the entire Next Gen UI Agent functionality. This single tool handles:
+The main tool that wraps the entire Next Gen UI Agent functionality.
+
+This single tool handles:
+
 - Component selection based on user prompt and data
 - Data transformation to match selected components
 - Design system rendering to produce final UI
 
 **Parameters:**
+
 - `user_prompt` (str): User's prompt which we want to enrich with UI components
 - `input_data` (List[Dict]): List of input data to render within the UI components
 
 **Returns:**
+
 - List of rendered UI components ready for display
 
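Matching the `invoke_tool` call shown in the diff context above, a direct `generate_ui` invocation might be shaped as follows; the movie data is made up for illustration, and the client setup is assumed:

```python
# Arguments matching the documented generate_ui signature; data is illustrative.
kwargs = {
    "user_prompt": "Show me details of the movie Toy Story",  # str
    "input_data": [{"title": "Toy Story", "year": 1995}],     # List[Dict]
}

# With an already-configured LlamaStack client (not shown here):
# result = client.tool_runtime.invoke_tool(tool_name="generate_ui", kwargs=kwargs)
# result holds the rendered UI components ready for display
```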
## Available MCP Resources

mkdocs.yml

Lines changed: 2 additions & 0 deletions
@@ -80,6 +80,8 @@ nav:
 - Embedded Llama Stack Server: guide/ai_apps_binding/llamastack_embedded.md
 - BeeAI: guide/ai_apps_binding/beeai.md
 - ACP: guide/ai_apps_binding/acp.md
+- MCP Library: guide/ai_apps_binding/mcp-library.md
+- MCP Container: guide/ai_apps_binding/mcp-container.md
 - Python Library: guide/ai_apps_binding/pythonlib.md
 - Binding into UI:
 - guide/renderer/index.md
