From idea to running code in minutes, not weeks. LocalCloud delivers developer-friendly PostgreSQL, MongoDB, vector databases, AI models, Redis cache, job queues, and S3-like storage instantly. No DevOps, no cloud bills, no infrastructure drama.
Programming Language Agnostic - Works seamlessly with Python, Node.js, Go, Java, Rust, PHP, .NET, or any other language. LocalCloud provides standard service APIs (PostgreSQL, MongoDB, Redis, S3, etc.) that integrate with your existing code regardless of technology stack.
- Bootstrapped Startups - Build MVPs with zero infrastructure costs during early development
- Privacy-First Enterprises - Run open-source AI models locally, keeping data in-house
- Corporate Developers - Skip IT approval queues and get PostgreSQL and Redis running now
- Demo Heroes - Tunnel your app to any device and present from an iPhone to a client's iPad instantly
- Remote Teams - Share running environments with frontend developers without deployment hassles
- Students & Learners - Master databases and AI without complex setup or cloud accounts
- Testing Pipelines - Integrate AI and databases in CI without external dependencies
- Prototype Speed - Spin up full-stack environments faster than booting a VM
- AI Assistant Users - Works seamlessly with Claude Code, Cursor, and Gemini CLI for AI-powered development
Choose your platform for one-command installation:
curl -fsSL https://localcloud.sh/install | bash
# Install
iwr -useb https://localcloud.sh/install.ps1 | iex
# Update/Reinstall (Invoke-Expression cannot take -ArgumentList; invoke a script block instead)
& ([scriptblock]::Create((iwr -useb https://localcloud.sh/install.ps1).Content)) -Force
# Homebrew (macOS/Linux)
brew install localcloud-sh/tap/localcloud
# Coming Soon
# winget install localcloud # Windows Package Manager
# choco install localcloud # Chocolatey
# scoop install localcloud # Scoop
# apt install localcloud # Debian/Ubuntu
# dnf install localcloud # Fedora
# pacman -S localcloud # Arch Linux
Alternative Installation Methods
Windows (PowerShell)
# Install (https://localcloud.sh/install.ps1)
iwr -useb https://localcloud.sh/install.ps1 | iex
# Update/Reinstall (Invoke-Expression cannot take -ArgumentList; invoke a script block instead)
& ([scriptblock]::Create((iwr -useb https://localcloud.sh/install.ps1).Content)) -Force
Manual Download:
- Download latest release from GitHub Releases
- Extract the archive for your platform
- Move binary to PATH directory
Development Build:
git clone https://github.com/localcloud-sh/localcloud
cd localcloud
go build -o localcloud ./cmd/localcloud
# Setup your project with an interactive wizard
lc setup
You'll see an interactive wizard:
? What would you like to build? (Use arrow keys)
❯ Chat Assistant - Conversational AI with memory
  RAG System - Document Q&A with vector search
  Custom - Select components manually
? Select components you need: (Press <space> to select, <enter> to confirm)
❯ ◯ [AI] LLM (Text generation) - Large language models for text generation, chat, and completion
  ◯ [AI] Embeddings (Semantic search) - Text embeddings for semantic search and similarity
  ◯ [Database] Database (PostgreSQL) - Standard relational database for data storage
  ◯ [Database] Vector Search (pgvector) - Add vector similarity search to PostgreSQL
  ◯ [Database] NoSQL Database (MongoDB) - Document-oriented database for flexible data storage
  ◯ [Infrastructure] Cache (Redis) - In-memory cache for temporary data and sessions
  ◯ [Infrastructure] Queue (Redis) - Reliable job queue for background processing
  ◯ [Infrastructure] Object Storage (MinIO) - S3-compatible object storage for files and media
Then start your services:
lc start
# Your infrastructure is now running!
# Check status: lc status
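Because every service listens on an ordinary TCP port, you can verify the stack from any language's standard library. A minimal Python sketch, assuming the default LocalCloud ports (adjust if you changed them in your config):

```python
import socket

def service_up(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something is accepting TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# LocalCloud defaults: PostgreSQL 5432, Redis 6379, Ollama 11434
for name, port in [("postgres", 5432), ("redis", 6379), ("ollama", 11434)]:
    print(name, "up" if service_up("localhost", port) else "down")
```

The same check works from Node.js, Go, or any other stack, which is the point: nothing here is LocalCloud-specific client code.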
AI assistants can set up projects with simple commands:
# Quick presets for common stacks
lc setup my-ai-app --preset=ai-dev --yes # AI + Database + Vector search
lc setup my-app --preset=full-stack --yes # Everything included
lc setup blog --preset=minimal --yes # Just AI models
# Or specify exact components
lc setup my-app --components=llm,database,storage --models=llama3.2:3b --yes
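The same non-interactive flags make LocalCloud scriptable in CI, matching the Testing Pipelines use case above. A hypothetical GitHub Actions step - the workflow syntax is standard, but a Docker-enabled runner and unattended install are assumptions:

```yaml
# Hypothetical CI sketch - assumes a Docker-enabled runner.
- name: Provision local stack
  run: |
    curl -fsSL https://localcloud.sh/install | bash
    lc setup ci-app --components=database,cache --yes
    lc start
    lc status
```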
Note: `lc` is the short alias for `localcloud` - use whichever you prefer!
- One-Command Setup: Create and configure projects with just `lc setup`
- Zero Cloud Costs: Everything runs locally - no API fees or usage limits
- Complete Privacy: Your data never leaves your machine
- Pre-built Templates: Production-ready backends for common AI use cases
- Optimized Models: Carefully selected models that run on 4GB RAM
- Developer Friendly: Simple CLI, clear errors, extensible architecture
- Docker-Based: Consistent environment across all platforms
- Mobile Ready: Built-in tunnel support for demos anywhere
- Export Tools: One-command migration to any cloud provider
- AI Assistant Ready: Non-interactive setup perfect for Claude Code, Cursor, and Gemini CLI
Make production infrastructure as simple as running a local web server.
LocalCloud eliminates the complexity and cost of infrastructure setup by providing a complete, local-first development environment. No cloud bills, no data privacy concerns, no complex configurations - just pure development productivity.
For AI coding assistants: Share this repository link to give your AI assistant complete context:
"I'm using LocalCloud for local AI development. Please review this repository to understand its capabilities: https://github.com/localcloud-sh/localcloud"
Your AI assistant will automatically understand all commands and help you build applications using LocalCloud's non-interactive setup options.
Waiting 3 weeks for cloud access approval? Your POC could be done by then. LocalCloud lets you build and demonstrate AI solutions immediately, no IT tickets required.
Present from your phone to any client's screen. Built-in tunneling means you can demo your AI app from anywhere - coffee shop WiFi, client office, or conference room.
We've all been there - spun up a demo, showed the client, forgot to tear it down. With LocalCloud, closing your laptop shuts down the infrastructure.
Healthcare, finance, government? Some data can't leave the building. LocalCloud keeps everything local while giving you cloud-level capabilities.
No API rate limits. No usage caps. No waiting for credits. Just pure development speed when every minute counts.
Your Own Cursor/Copilot: Build an AI code editor without $10k/month in API costs during development.
AI Mobile Apps: Develop and test your AI-powered iOS/Android app locally until you're ready to monetize.
SaaS MVP: Validate your AI startup idea without cloud bills - switch to cloud only after getting paying customers.
For Employers: Give candidates a pre-configured LocalCloud environment. No setup headaches, just coding skills evaluation.
For Candidates: Submit a fully-working AI application. While others struggle with API keys, you ship a complete solution.
AI Customer Support Trainer: Process your support tickets locally to train a custom assistant.
Code Review Bot: Build a team-specific code reviewer without sending code to external APIs.
Meeting Transcription System: Record, transcribe, and summarize meetings - all on company hardware.
"Hey Claude, build me a chatbot backend" → Your AI assistant runs `lc setup my-chatbot --preset=ai-dev --yes`, and in 60 seconds you have PostgreSQL, vector search, AI models, and Redis running locally - complete with database schema, API endpoints, and a working chat interface. By the time you finish your coffee, you're making API calls to your fully functional backend.
No cloud signup. No credit card. No infrastructure drama. Just pure AI-assisted development velocity.
During `lc setup`, you can choose from pre-configured templates or customize your own service selection:
lc setup my-assistant # Select "Chat Assistant" template
- Conversational AI with persistent memory
- PostgreSQL for conversation storage
- Streaming responses
- Model switching support
lc setup my-knowledge-base # Select "RAG System" template
- Document ingestion and embedding
- Vector search with pgvector
- Context-aware responses
- Scalable to millions of documents
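Conceptually, the vector search behind this template ranks documents by embedding similarity. A minimal pure-Python sketch of cosine-similarity ranking - pgvector does this in SQL, indexed and at scale; this is only an illustration of the idea:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query: list[float], docs: dict[str, list[float]], k: int = 2) -> list[str]:
    """Return ids of the k documents whose embeddings are closest to the query."""
    return sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)[:k]

docs = {"a": [1.0, 0.0], "b": [0.9, 0.1], "c": [0.0, 1.0]}
print(top_k([1.0, 0.0], docs, k=2))  # ranks "a" before "b"
```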
lc setup my-transcriber # Select "Speech/Whisper" template
- Audio transcription API
- Multiple language support
- Real-time processing
- Optimized Whisper models
lc setup my-project # Choose "Custom" and select individual services
- Pick only the services you need
- Configure each service individually
- Optimal resource usage
Note: MVP includes backend infrastructure only. Frontend applications coming in v2.
LocalCloud Project Structure:
├── .localcloud/       # Project configuration
│   └── config.yaml    # Service configurations and runtime settings
├── .gitignore         # Git ignore file (excludes .localcloud from version control)
└── your-app/          # Your application code goes here
- Setup: `lc setup [project-name]` creates the project and opens an interactive wizard:
  - Creates project structure (if new)
  - Choose a template or custom services
  - Select AI models
  - Configure ports and resources
- Start: `lc start` launches all services
- Develop: Services are ready for your application
| Service | Description | Default Port |
|---|---|---|
| AI/LLM | Ollama with selected models | 11434 |
| Database | PostgreSQL (optional pgvector extension) | 5432 |
| MongoDB | Document-oriented NoSQL database | 27017 |
| Cache | Redis for performance | 6379 |
| Queue | Redis for job processing | 6380 |
| Storage | MinIO (S3-compatible) | 9000/9001 |
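Each service speaks its standard wire protocol, so ordinary client libraries connect with ordinary URLs. A sketch of the default endpoints as connection strings - the `user`/`pass` placeholders and the database name `app` are illustrative, not LocalCloud's real defaults; check `lc status` and `.localcloud/config.yaml` for your actual values:

```python
# Default LocalCloud ports from the table above; credentials and the
# database name "app" are placeholders, not LocalCloud's real defaults.
DEFAULT_PORTS = {"postgres": 5432, "mongodb": 27017, "redis": 6379,
                 "queue": 6380, "minio": 9000, "ollama": 11434}

def conn_url(service: str, host: str = "localhost",
             user: str = "user", password: str = "pass") -> str:
    port = DEFAULT_PORTS[service]
    return {
        "postgres": f"postgresql://{user}:{password}@{host}:{port}/app",
        "mongodb": f"mongodb://{host}:{port}",
        "redis": f"redis://{host}:{port}/0",
        "queue": f"redis://{host}:{port}/0",
        "minio": f"http://{host}:{port}",
        "ollama": f"http://{host}:{port}",
    }[service]

print(conn_url("postgres"))  # postgresql://user:pass@localhost:5432/app
```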
- OS: macOS, Linux, Windows 10/11
- RAM: 4GB minimum (8GB recommended)
- Disk: 10GB free space
- Docker: Docker Desktop or Docker Engine
- CPU: x64 or ARM64 processor
Note: LocalCloud is written in Go for performance, but you don't need Go installed. The CLI is a single binary that works everywhere. Windows users can install via PowerShell - no WSL required.
Windows:
# Check if update is needed (will show current version)
iwr -useb https://localcloud.sh/install.ps1 | iex
# Force update/reinstall (Invoke-Expression cannot take -ArgumentList; invoke a script block instead)
& ([scriptblock]::Create((iwr -useb https://localcloud.sh/install.ps1).Content)) -Force
macOS/Linux (Homebrew):
brew upgrade localcloud-sh/tap/localcloud
Linux (without Homebrew):
# Re-run install script
curl -fsSL https://localcloud.sh/install | bash
# Create and configure new project
lc setup [project-name]
# Configure existing project (in current directory)
lc setup
# Add/remove components
lc setup --add llm
lc setup --add vector # Add vector search to existing database
lc setup --remove cache
lc setup --remove vector # Remove vector search, keep database
# Start all services
lc start
# Stop all services
lc stop
# View service status
lc status
# View logs
lc logs [service]
# List available models
lc models list
# Pull a new model
lc models pull llama3.2:3b
# Remove a model
lc models remove llama3.2:3b
# Show model information
lc models info qwen2.5:3b
LocalCloud uses a simple YAML configuration:
# .localcloud/config.yaml
project:
  name: my-assistant
  version: 1.0.0

models:
  default: qwen2.5:3b
  embeddings: nomic-embed-text

services:
  ai:
    memory_limit: 2GB
    gpu: false
  database:
    port: 5432
    extensions:
      - pgvector
# Check Docker status
docker info
# macOS/Windows: Start Docker Desktop
# Linux: sudo systemctl start docker
# Find process using port
lsof -i :3000
# Use different port
lc start --port 3001
# Check available space
df -h
# Clear unused models
lc models prune
# Check if PostgreSQL is running
lc status postgres
# View PostgreSQL logs
lc logs postgres
# Restart PostgreSQL
lc service restart postgres
LocalCloud includes a comprehensive test suite for validating all components work correctly:
cd test-components
# Test all components
./test-runner.sh
# Test specific components
./test-runner.sh --components database,vector,llm
# Test with verbose output and progress monitoring
./test-runner.sh --components llm --verbose
# GitHub Actions compatible output
./test-runner.sh --format junit --output ./reports
Key Features:
- ✅ Cross-platform: Works on macOS and Linux with automatic timeout handling
- ✅ Robust interruption: Proper Ctrl+C handling and process cleanup
- ✅ Smart monitoring: Event-driven readiness detection for all services
- ✅ CI/CD ready: JUnit XML output for GitHub Actions integration
- ✅ Dependency aware: Understands component relationships (database → vector)
For detailed testing documentation, see test-components/README.md.
We welcome contributions! See CONTRIBUTING.md for:
- Development setup
- Code style guidelines
- Testing requirements
- Pull request process
docs.localcloud.sh - Complete documentation, CLI reference, and examples
We're excited about the future of local-first AI development! Here are some ideas we're exploring:
- Multi-Language SDKs - Python, JavaScript, Go, and Rust client libraries
- Web Admin Panel - Visual service management and monitoring dashboard
- Model Fine-tuning - Train custom models on your local data
- Team Collaboration - Share projects and sync configurations across teams
- Performance Optimization - GPU acceleration and model quantization
- Enterprise Features - SSO, audit logs, and compliance tools
- Project Isolation - Currently, multiple projects share the same Docker containers (e.g., localcloud-mongodb, localcloud-postgres). Future releases will implement project-based container naming for complete isolation between projects
- Plugin System - Extend LocalCloud with custom services
- Alternative AI Providers - Support for Hugging Face Transformers, OpenAI-compatible APIs
- Cloud Sync - Seamlessly transition from local to cloud deployment
- Mobile Development - Native iOS/Android development tools
- Kubernetes Integration - Deploy LocalCloud setups to K8s clusters
- IDE Extensions - VS Code and JetBrains plugins for better DX
We'd love to hear your ideas! Share your thoughts:
- GitHub Discussions - Feature requests and community chat
- GitHub Issues - Bug reports and specific feature requests
- [email protected] - Direct feedback and collaboration
Your input helps us prioritize what matters most to developers building AI applications.
Licensed under Apache 2.0 - see LICENSE for details.
LocalCloud exists because of amazing open-source projects and communities:
- Ollama - Our AI model serving foundation, making local LLMs accessible to everyone
- Meta AI - Llama models available through Ollama
- Mistral AI - Mistral models available through Ollama
- Model creators - All the researchers and companies who open-source their models for Ollama
- PostgreSQL - The world's most advanced open source database
- pgvector - Vector similarity search for PostgreSQL
- MongoDB - Document database for modern applications
- Redis - In-memory data structure store
- MinIO - High-performance object storage
- Docker - Containerization that makes deployment simple
- Go - Fast, reliable, and efficient programming language
- Cobra - Powerful CLI framework for Go
- The Ollama team for creating such an elegant local AI solution
- Docker community for making containerization accessible
- All the model creators who chose to open-source their work
- Contributors and testers who help make LocalCloud better
Without these projects and their maintainers, LocalCloud wouldn't exist. We're proud to be part of the open-source ecosystem.
Website • Documentation • GitHub • Contact