# PocketLLM

PocketLLM is a cross-platform AI chat application built with Flutter that puts large language models in your pocket. What makes PocketLLM unique is its ability to download and run AI models directly on your mobile device, or to connect to locally hosted models on your computer.

## ✨ Key Features

- **📱 Mobile-First LLM Integration**
  - Download and run AI models directly on your mobile device
  - Optimized for mobile performance and battery life
  - Secure, private, and offline-capable operation
- **💻 Desktop Integration**
  - Seamless connection to locally hosted models via LLMStudio or Ollama
  - Easy model management and switching
  - Cross-device synchronization
- **🚀 Advanced Features**
  - Real-time chat interface with markdown support
  - Token usage tracking and cost estimation
  - Chat history management with local storage
  - Dynamic model switching
  - Clean, modern UI with dark mode support
  - Responsive design for all screen sizes

## 🤖 Supported Models

### Mobile-Compatible Models

- Optimized versions of popular open-source models
- Quantized models for efficient mobile execution
- Various size options (7B, 13B) for different device capabilities

### Desktop Integration

- Connect to locally hosted models via:
  - Ollama
  - LLMStudio
  - Custom API endpoints (see the example below)
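
For example, a locally hosted Ollama server exposes an HTTP API on port 11434 by default. A minimal sketch of the kind of endpoint PocketLLM can connect to, assuming Ollama's default configuration and an example model named `llama2` that has already been pulled:

```bash
# Request a completion from a locally hosted Ollama model
# (assumes the Ollama server is running on its default port, 11434)
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```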

## 🛠️ Technical Requirements

### Mobile

- Android 8.0 or later
- iOS 13.0 or later
- Minimum 4GB RAM recommended
- At least 2GB of free storage space

### Desktop

- Windows 10/11
- macOS 10.15 or later
- Linux (major distributions)
- Ollama or LLMStudio installed (for local model hosting)

## 📥 Installation

### Mobile Apps

Download links for Android and iOS are coming soon.

### Desktop Apps

1. Download the latest release for your platform from the releases page.
2. Install Ollama or LLMStudio if you want to host models locally (see the example below).
3. Launch PocketLLM and connect to your local model server.
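
If you choose Ollama for local hosting, downloading a model and starting the server looks like this (a sketch assuming Ollama is installed; the model name is illustrative and can be any model from the Ollama library):

```bash
# Download an example model
ollama pull llama2

# Start the Ollama server, if it isn't already running as a background service
ollama serve
```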

### Build from Source

```bash
# Clone the repository
git clone https://github.com/yourusername/pocketllm.git
cd pocketllm

# Install dependencies
flutter pub get

# Run the app
flutter run
```
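
To produce release binaries instead of a debug run, the standard Flutter build targets apply (a sketch; the available targets depend on the platforms enabled in your Flutter installation):

```bash
flutter build apk        # Android
flutter build ios        # iOS (requires macOS with Xcode)
flutter build windows    # Windows desktop
flutter build macos      # macOS desktop
flutter build linux      # Linux desktop
```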

## 📱 Getting Started

1. **Mobile Setup**
   - Launch PocketLLM
   - Browse available models in the Model Hub
   - Download your preferred model
   - Start chatting immediately
2. **Desktop Setup**
   - Install and configure Ollama or LLMStudio
   - Launch PocketLLM
   - Connect to your local model server (see the check below)
   - Select your model and begin chatting
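
Before connecting, you can verify that your local server is reachable and see which models it offers (a sketch assuming an Ollama server on its default port):

```bash
# List the models available on a local Ollama server
curl http://localhost:11434/api/tags
```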

## 🤝 Contributing

We welcome contributions! Please see our Contributing Guidelines for details.

## 📸 Screenshots

Mobile chat interface · Model selection · Desktop interface *(screenshots coming soon)*

## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

## 🙏 Acknowledgments

- The Flutter team for the amazing framework
- The Ollama project for local model hosting
- LLMStudio for desktop integration
- All open-source model providers

**Note:** This project is open source and in active development; a beta version will be released soon. If you'd like to contribute, feel free to clone the repository and submit a pull request. Screenshots and documentation will be updated as the project progresses.