
An Open-source AI Chatbot Built With Next.js, Convex, and AI SDK.
Features · Model Providers · Deploy Your Own · Running locally
## Features

- Next.js App Router
  - Advanced routing for seamless navigation and performance
  - React Server Components (RSCs) for server-side rendering and performance improvements
- AI SDK
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Hooks for building dynamic chat and generative user interfaces
  - Support for code generation, images, text editing, data handling, and web search
- Redis
  - Persistent message streaming for real-time updates
- shadcn/ui
  - Styling with Tailwind CSS
  - Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
  - Convex for saving chat history and user data
  - Convex file storage for efficient file handling
  - Convex Vector Search for long-term memory storage and retrieval (see the schema sketch after this list)
- Convex Auth
  - Simple and secure authentication
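As a rough illustration of the Convex-backed persistence described above, a schema with a chat-history table and a vector-indexed memories table could look like the sketch below. The table names, fields, and embedding dimension are assumptions for illustration, not this repo's actual schema.

```ts
// convex/schema.ts — illustrative sketch only; table/field names and the
// embedding dimension are assumptions, not this repo's actual schema.
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  // Chat history saved per user and chat
  messages: defineTable({
    userId: v.string(),
    chatId: v.string(),
    role: v.string(),
    content: v.string(),
  }).index("by_chat", ["chatId"]),

  // Long-term memories retrieved with Convex vector search
  memories: defineTable({
    userId: v.string(),
    text: v.string(),
    embedding: v.array(v.float64()),
  }).vectorIndex("by_embedding", {
    vectorField: "embedding",
    dimensions: 1536, // assumes a 1536-dimension embedding model
    filterFields: ["userId"],
  }),
});
```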
## Model Providers

This app ships with OpenAI as the default provider. However, with the AI SDK you can switch to other LLM providers such as Ollama, Anthropic, Cohere, and many more with just a few lines of code, as sketched below.
- Mini model (`gpt-4o-mini`): A fast and efficient model suitable for simple tasks
- Large model (`gpt-4o`): A powerful model designed for complex tasks
- Reasoning model (`o4-mini`): An advanced model configured for multi-step reasoning tasks
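For example, a minimal sketch of switching to Anthropic with the AI SDK could look like this; the provider package, model ID, and prompt are illustrative assumptions, not this repo's actual configuration:

```ts
// Illustrative sketch only: swapping the default OpenAI provider for Anthropic.
// The provider package and model ID are assumptions, not this repo's setup.
import { anthropic } from "@ai-sdk/anthropic";
import { generateText } from "ai";

const { text } = await generateText({
  // Previously something like: model: openai("gpt-4o")
  model: anthropic("claude-3-5-sonnet-latest"),
  prompt: "Say hello to OpenChat!",
});

console.log(text);
```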
## Deploy Your Own

You can deploy your own version of OpenChat to Vercel with one click:
## Running locally

You will need to use the environment variables defined in `.env.example` to run OpenChat. It's recommended you use Vercel Environment Variables for this, but a `.env` file is all that is necessary.

Note: You should not commit your `.env` file, or you will expose secrets that allow others to control access to your OpenAI and authentication provider accounts.
1. Install Vercel CLI: `npm i -g vercel`
2. Link your local instance with your Vercel and GitHub accounts (creates a `.vercel` directory): `vercel link`
3. Download your environment variables: `vercel env pull`
```bash
bun install
bun dev
```
Your app should now be running on localhost:3000.