πŸš€ A powerful Flutter-based AI chat application that lets you run LLMs directly on your mobile device or connect to local model servers. Features offline model execution, Ollama/LLMStudio integration, and a beautiful modern UI. Privacy-focused, cross-platform, and fully open source.

Mr-Dark-debug/PocketLLM

PocketLLM

PocketLLM is a cross-platform AI chat application built with Flutter that puts large language models in your pocket. What sets it apart is that it can download and run AI models directly on your mobile device, or connect to locally hosted models on your computer.


✨ Key Features

  • πŸ“± Mobile-First LLM Integration

    • Download and run AI models directly on your mobile device
    • Optimized for mobile performance and battery life
    • Secure, private, and offline-capable operation
  • πŸ’» Desktop Integration

    • Seamless connection to locally hosted models via LLMStudio or Ollama
    • Easy model management and switching
    • Cross-device synchronization
  • πŸš€ Advanced Features

    • Real-time chat interface with markdown support
    • Token usage tracking and cost estimation
    • Chat history management with local storage
    • Dynamic model switching
    • Clean, modern UI with dark mode support
    • Responsive design for all screen sizes

πŸ€– Supported Models

Mobile-Compatible Models

  • Optimized versions of popular open-source models
  • Quantized models for efficient mobile execution
  • Various size options (7B, 13B) for different device capabilities

Desktop Integration

  • Connect to locally hosted models via:
    • Ollama
    • LLMStudio
    • Custom API endpoints

πŸ› οΈ Technical Requirements

Mobile

  • Android 8.0 or later
  • iOS 13.0 or later
  • Minimum 4GB RAM recommended
  • At least 2GB free storage space

Desktop

  • Windows 10/11
  • macOS 10.15 or later
  • Linux (major distributions)
  • Ollama or LLMStudio installed (for local model hosting)

πŸ“₯ Installation

Mobile Apps

Mobile app downloads are coming soon.

Desktop Apps

  1. Download the latest release for your platform from the releases page
  2. Install Ollama or LLMStudio if you want to host models locally
  3. Launch PocketLLM and connect to your local model server

Build from Source

# Clone the repository
git clone https://github.com/Mr-Dark-debug/PocketLLM.git
cd PocketLLM

# Install dependencies
flutter pub get

# Run the app
flutter run

πŸ“± Getting Started

  1. Mobile Setup

    • Launch PocketLLM
    • Browse available models in the Model Hub
    • Download your preferred model
    • Start chatting immediately
  2. Desktop Setup

    • Install and configure Ollama or LLMStudio
    • Launch PocketLLM
    • Connect to your local model server
    • Select your model and begin chatting

🀝 Contributing

We welcome contributions! Please see our Contributing Guidelines for details.

πŸ“Έ Screenshots

  • Mobile Chat Interface
  • Model Selection
  • Desktop Interface

πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

  • Flutter team for the amazing framework
  • Ollama project for local model hosting
  • LLMStudio for desktop integration
  • All open-source model providers

Note: This project is open source and under active development; a beta version will be released soon. If you'd like to contribute, clone the repository and submit a pull request. Screenshots and documentation will be updated as the project progresses.
