Connect to Python agents from TypeScript - use powerful Python agents in your TypeScript apps.

The ConnectOnion TypeScript SDK lets you connect to and use AI agents built with Python. Build your agents in Python, where the ecosystem is rich, then use them seamlessly from TypeScript/JavaScript applications.
```typescript
// Connect to a Python agent and use it
import { connect } from 'connectonion';

// Connect to a remote agent by address
const agent = connect('0x3d4017c3e843895a92b70aa74d1b7ebc9c982ccf2ec4968cc0cd55f12af4660c');

// Use it like a local function
const result = await agent.input('Search for TypeScript tutorials');
console.log(result);
```

That's it. No server setup. No complex configuration. Just connect and use.
Python has the richest AI ecosystem - LangChain, LlamaIndex, transformers, and countless ML libraries. Build your agents where the tools are best.
Your web apps, React frontends, Node.js backends, and Electron apps are in TypeScript. Now you can use powerful Python agents directly.
No servers to manage. No API endpoints to deploy. Agents connect peer-to-peer through the relay network.
Ed25519 cryptographic addressing. No passwords. No auth tokens to leak. Just public/private key pairs.
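As an illustration of this style of addressing, here is a minimal sketch using Node's built-in `crypto` module: an Ed25519 public key rendered as a 0x-prefixed hex string. The derivation shown here is an assumption for illustration only; the SDK's actual key handling and address format may differ.

```typescript
import { generateKeyPairSync } from 'node:crypto';

// Generate an Ed25519 keypair: the private key signs, the public key
// identifies the agent -- no password or auth token involved.
const { publicKey } = generateKeyPairSync('ed25519');

// Export as DER; for Ed25519 the raw 32-byte public key is the last
// 32 bytes of the SPKI encoding.
const der = publicKey.export({ type: 'spki', format: 'der' });
const raw = der.subarray(der.length - 32);

// Render it as a 0x-prefixed hex address (illustrative derivation).
const address = '0x' + raw.toString('hex');

console.log(address);        // 0x... (64 hex chars)
console.log(address.length); // 66
```

Because the public key itself is the identity, anyone can verify a message signed by the agent without a shared secret ever crossing the wire.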
```bash
npm install connectonion
# or
yarn add connectonion
# or
pnpm add connectonion
```

```typescript
import { connect } from 'connectonion';

// Connect to a remote Python agent
const agent = connect('0x3d4017c3e843895a92b70aa74d1b7ebc9c982ccf2ec4968cc0cd55f12af4660c');

// Use it!
const response = await agent.input('Analyze this data and create a report');
console.log(response);
```

If you need to create your own agent in Python:
```python
# pip install connectonion
from connectonion import Agent, announce

def analyze_data(data: str) -> str:
    """Analyze data and create a report."""
    # Your Python logic with pandas, numpy, etc.
    return f"Analysis: {data}"

agent = Agent(
    name="data-analyst",
    tools=[analyze_data]
)

# Announce to the network
announce(agent)
# Prints: Agent address: 0x3d401...
```

Then connect from TypeScript as shown above!
```tsx
// React component using a Python ML agent
import { connect } from 'connectonion';
import { useState } from 'react';

function DataAnalyzer() {
  const [result, setResult] = useState('');
  const agent = connect('0xYourPythonMLAgent');

  const analyze = async () => {
    // Python agent has pandas, scikit-learn, matplotlib, etc.
    const response = await agent.input(
      'Analyze sales data and predict next quarter trends'
    );
    setResult(response);
  };

  return <button onClick={analyze}>Analyze Data</button>;
}
```

```typescript
// Express API using a Python agent for complex processing
import express from 'express';
import { connect } from 'connectonion';

const app = express();
app.use(express.json()); // parse JSON bodies so req.body.query is available

const pythonAgent = connect('0xYourPythonAgent');

app.post('/analyze', async (req, res) => {
  // Offload heavy processing to the Python agent
  const result = await pythonAgent.input(req.body.query);
  res.json({ result });
});

app.listen(3000);
```

```typescript
// Electron app using a Python agent for system operations
import { connect } from 'connectonion';

const systemAgent = connect('0xYourSystemAgent');

async function handleFileOperation() {
  // Python agent has full system access and libraries
  const result = await systemAgent.input(
    'Find all PDFs in Downloads, extract text, and summarize'
  );
  return result;
}
```

```typescript
// Connect to a local development relay
const agent = connect(
  '0xYourAgent',
  'ws://localhost:8000/ws/announce'
);

// Or use an environment variable
process.env.RELAY_URL = 'ws://localhost:8000/ws/announce';
const devAgent = connect('0xYourAgent'); // uses RELAY_URL
```

```typescript
// Adjust the timeout for long-running tasks
const result = await agent.input(
  'Process large dataset',
  60000 // 60 second timeout
);
```

```typescript
// Connect to different specialized agents
const mlAgent = connect('0xMLAgent');
const nlpAgent = connect('0xNLPAgent');
const visionAgent = connect('0xVisionAgent');

// Use them in parallel
const [analysis, sentiment, objects] = await Promise.all([
  mlAgent.input('Analyze time series'),
  nlpAgent.input('Extract sentiment from reviews'),
  visionAgent.input('Detect objects in image')
]);
```

- Getting Started Guide - Complete setup walkthrough
- Connect API - Remote agent connection details
- API Reference - Full API documentation
- Troubleshooting - Common issues & solutions
While we recommend building agents in Python, you can also build simple agents directly in TypeScript:
- Tool System - How to create tools in TypeScript
- Examples - TypeScript agent examples
Important Notes:
- TypeScript agent features are experimental and may have bugs
- Python agent features are well-tested and fully supported
- For complex agents with ML, data processing, or extensive Python libraries, use Python and connect via connect()
- Full TypeScript agent support planned for Q1 2026
If you encounter bugs building agents in TypeScript, please report them on GitHub.
```
┌──────────────────────────────────────────────────────────────────┐
│                   ConnectOnion TypeScript SDK                    │
│                                                                  │
│  ┌──────────────┐  ┌──────────────┐  ┌────────────────────────┐  │
│  │    Agent     │  │  connect()   │  │        llmDo()         │  │
│  │  (local AI)  │  │   (remote)   │  │  (one-shot LLM call)   │  │
│  └──────┬───────┘  └──────┬───────┘  └────────────┬───────────┘  │
│         │                 │                       │              │
│         ▼                 ▼                       │              │
│  ┌─────────────────────────────┐                  │              │
│  │         LLM Factory         │◀─────────────────┘              │
│  │      createLLM(model)       │                                 │
│  └──────────┬──────────────────┘                                 │
│      ┌──────┼──────────┬────────────┐                            │
│      ▼      ▼          ▼            ▼                            │
│  Anthropic  OpenAI    Gemini     OpenOnion                       │
│  (claude-*) (gpt-*)   (gemini-*) (co/*)                          │
│                                                                  │
│  ┌──────────────┐  ┌──────────┐  ┌───────────┐  ┌───────────┐    │
│  │ Tool System  │  │  Trust   │  │  Console  │  │   Xray    │    │
│  │ func→schema  │  │  Levels  │  │  Logging  │  │  Debugger │    │
│  └──────────────┘  └──────────┘  └───────────┘  └───────────┘    │
└──────────────────────────────────────────────────────────────────┘
```
```
agent.input("What is 2+2?")
         │
         ▼
┌─────────────────┐
│  Init messages  │  [system prompt] + [user message]
└────────┬────────┘
         │
         ▼
┌─────────────────────────────────────┐
│       Main Loop (max 10 iter)       │
│                                     │
│  LLM.complete(messages, tools)      │
│        │                            │
│        ├─ No tool calls ──▶ EXIT    │
│        │                            │
│        └─ Tool calls found:         │
│             Promise.all(            │
│               tool_1.run(args),     │
│               tool_2.run(args)      │
│             )                       │
│                │                    │
│                ▼                    │
│        Append results → LOOP        │
└─────────────────────────────────────┘
         │
         ▼
Return final text response
```
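The loop above can be sketched in code. This is a self-contained illustration with a hard-coded fake LLM and a single `add` tool; names like `fakeLLM` and the message shapes are invented for the sketch, not the SDK's internal types.

```typescript
type ToolCall = { name: string; args: Record<string, number> };
type LLMResult = { content: string; toolCalls: ToolCall[] };
type Message = { role: string; content: string };

// One registered tool, keyed by name.
const tools: Record<string, (args: Record<string, number>) => Promise<string>> = {
  add: async ({ a, b }) => String(a + b),
};

// Fake LLM: requests one tool call, then answers from the tool result.
function fakeLLM(messages: Message[]): LLMResult {
  const last = messages[messages.length - 1];
  if (last.role === 'tool') {
    return { content: `The answer is ${last.content}.`, toolCalls: [] };
  }
  return { content: '', toolCalls: [{ name: 'add', args: { a: 2, b: 2 } }] };
}

async function input(prompt: string): Promise<string> {
  // Init messages: [system prompt] + [user message]
  const messages: Message[] = [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: prompt },
  ];
  for (let i = 0; i < 10; i++) {                            // main loop (max 10 iter)
    const result = fakeLLM(messages);
    if (result.toolCalls.length === 0) return result.content; // no tool calls -> exit
    const outputs = await Promise.all(                      // run tool calls in parallel
      result.toolCalls.map((call) => tools[call.name](call.args))
    );
    outputs.forEach((out) => messages.push({ role: 'tool', content: out })); // append -> loop
  }
  return 'Reached max iterations';
}

input('What is 2+2?').then(console.log); // The answer is 4.
```

The iteration cap keeps a confused model from looping forever, and `Promise.all` lets independent tool calls from one LLM turn run concurrently.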
```
Your code                          SDK internals

function add(a: number,            Tool {
         b: number): number {  ──▶   name: "add",
  return a + b;                      description: "...",
}                                    run(args) → add(a, b),
                                     toFunctionSchema() → {
class API {                            type: "object",
  search(q: string) {}         ──▶     properties: {a: {type: "number"}, ...}
  fetch(id: number) {}               }
}                                  }
```
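Since TypeScript types are erased at runtime, a conversion like the one above needs the parameter types available at runtime somehow. Here is a sketch of the idea that passes the parameter types explicitly; `makeTool` and its signature are invented for this illustration, and the SDK's actual converter may work differently.

```typescript
type JSONSchema = {
  type: 'object';
  properties: Record<string, { type: string }>;
  required: string[];
};

interface Tool {
  name: string;
  description: string;
  run: (args: Record<string, unknown>) => unknown;
  toFunctionSchema: () => JSONSchema;
}

// Wrap a plain function as a Tool, given its parameter types explicitly.
function makeTool(
  fn: (...args: any[]) => unknown,
  params: Record<string, string>, // e.g. { a: 'number', b: 'number' }
  description = ''
): Tool {
  return {
    name: fn.name,
    description,
    // Map the named args object back to positional arguments.
    run: (args) => fn(...Object.keys(params).map((k) => args[k])),
    // Emit a JSON-Schema-style description of the parameters.
    toFunctionSchema: () => ({
      type: 'object',
      properties: Object.fromEntries(
        Object.entries(params).map(([k, t]) => [k, { type: t }])
      ),
      required: Object.keys(params),
    }),
  };
}

function add(a: number, b: number): number { return a + b; }
const addTool = makeTool(add, { a: 'number', b: 'number' }, 'Add two numbers');

console.log(addTool.toFunctionSchema()); // { type: 'object', properties: { a: ..., b: ... }, ... }
console.log(addTool.run({ a: 2, b: 3 })); // 5
```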
```
createLLM(model)
      │
      ├── "co/*"     ──▶ OpenAI LLM + OpenOnion baseURL
      ├── "claude-*" ──▶ Anthropic LLM (default)
      ├── "gpt-*"    ──▶ OpenAI LLM
      ├── "o*"       ──▶ OpenAI LLM
      ├── "gemini-*" ──▶ Gemini LLM
      └── (unknown)  ──▶ Anthropic (fallback) or NoopLLM
```
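The routing above boils down to prefix matching on the model name. A minimal sketch, returning provider labels instead of constructing real clients (the labels and function name are ours, only the prefix table comes from the diagram):

```typescript
// Route a model name to a provider by prefix, mirroring the table above.
// The real factory would also return NoopLLM when no API key is configured.
function routeModel(model: string): string {
  if (model.startsWith('co/')) return 'openai + openonion baseURL';
  if (model.startsWith('claude-')) return 'anthropic';
  if (model.startsWith('gpt-')) return 'openai';
  if (model.startsWith('o')) return 'openai';   // o1, o3, o4-mini, ...
  if (model.startsWith('gemini-')) return 'gemini';
  return 'anthropic';                           // fallback
}

console.log(routeModel('claude-sonnet-4'));  // anthropic
console.log(routeModel('gpt-4.1'));          // openai
console.log(routeModel('gemini-2.0-flash')); // gemini
console.log(routeModel('co/gpt-4o'));        // openai + openonion baseURL
```

Note the check order matters: `"co/*"` must be tested before the bare `"o"` prefix would otherwise never fire for OpenAI's o-series only because `co/` starts with `c`, so keeping the diagram's order is the safe choice.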
```
your-project/
├── src/
│   ├── agents/      # Your agent definitions
│   ├── tools/       # Custom tool implementations
│   └── index.ts     # Main entry point
├── .env             # API keys (never commit!)
├── package.json
└── tsconfig.json
```
```
src/
├── core/
│   └── agent.ts         # Main Agent class (orchestrator)
├── llm/
│   ├── index.ts         # LLM factory (routes model names)
│   ├── anthropic.ts     # Anthropic Claude provider (default)
│   ├── openai.ts        # OpenAI GPT/O-series provider
│   ├── gemini.ts        # Google Gemini provider
│   ├── noop.ts          # Fallback for missing config
│   └── llm-do.ts        # One-shot llmDo() helper
├── tools/
│   ├── tool-utils.ts    # Function → Tool conversion
│   ├── tool-executor.ts # Execution + trace recording
│   ├── xray.ts          # Debug context injection (@xray)
│   ├── replay.ts        # Replay decorator for debugging
│   └── email.ts         # Mock email tools for demos/tests
├── trust/
│   ├── index.ts         # Trust levels (open/careful/strict)
│   └── tools.ts         # Whitelist checks & verification
├── connect/
│   ├── index.ts         # connect() factory + re-exports
│   ├── types.ts         # ChatItem, Response, AgentStatus, ConnectOptions, etc.
│   ├── endpoint.ts      # resolveEndpoint, fetchAgentInfo, utils
│   └── remote-agent.ts  # RemoteAgent class
├── console.ts           # Dual logging (stderr + file)
├── types.ts             # Core TypeScript interfaces
└── index.ts             # Public API exports
```
Get help, share agents, and discuss with 1000+ builders in our active community.
If ConnectOnion helps you build better agents, give it a star! ⭐
It helps others discover the framework and motivates us to keep improving it.
We love contributions! See CONTRIBUTING.md for guidelines.
```bash
# Clone the repo
git clone https://github.com/openonion/connectonion-ts
cd connectonion-ts

# Install dependencies
npm install

# Run tests
npm test

# Build
npm run build
```

MIT © OpenOnion Team
- Python Version - Original Python SDK
- Discord Community - Get help & share ideas
- Blog - Tutorials and updates
- Rich AI Ecosystem: LangChain, transformers, pandas, scikit-learn, PyTorch, TensorFlow
- Data Processing: NumPy, SciPy, matplotlib for complex analysis
- Mature Libraries: Decades of proven Python libraries
- Simple Setup: pip install and you're ready
- Web & Mobile: React, Next.js, React Native, Electron
- Type Safety: Catch errors at compile time
- IDE Support: Unmatched IntelliSense and auto-completion
- NPM Ecosystem: Access to millions of UI/frontend packages
Build agents where the tools are rich (Python), use them where users are (TypeScript apps).
Built with ❤️ by developers, for developers