Local Multi-Modal AI Chat

A privacy-focused, locally-run chat interface supporting multi-modal interactions with AI models. Built with Python and Ollama.

Demo

Features

  • 🔒 100% Local Execution - No data leaves your machine
  • 🖼️ Multi-Modal Support - Chat with images, documents, and text
  • ⚡ Real-time Streaming - Typewriter-style responses
  • 🧠 Model Management - Easily switch between local LLMs
  • 🔄 Context Aware - Maintains conversation history
  • 🛡️ Encrypted History - Secures chat sessions
  • 🎨 Rich CLI Interface - Beautiful terminal formatting
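The context-aware behavior above amounts to sending only the most recent turns to the model on each request. A minimal sketch of that idea, assuming a `max_history` cap like the one in the configuration below (the helper name `build_context` is hypothetical, not from the repository):

```python
from collections import deque

def build_context(history, user_message, max_history=6):
    """Return the message list sent to the model: the most recent
    max_history turns plus the new user message.
    Hypothetical helper -- the repository's actual code may differ."""
    recent = list(history)[-max_history:]
    return recent + [{"role": "user", "content": user_message}]

# A deque with maxlen keeps only the newest turns automatically,
# so the prompt stays inside the model's context window.
history = deque(maxlen=6)
history.append({"role": "user", "content": "Hello"})
history.append({"role": "assistant", "content": "Hi! How can I help?"})
messages = build_context(history, "Summarize our chat so far.")
```

Dropping the oldest turns (rather than truncating mid-message) keeps each remaining exchange intact.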

Installation

Prerequisites

  • Python 3.10+
  • Ollama installed locally
  • Supported models downloaded (e.g., llava, mistral)
# Clone repository
git clone https://github.com/ericz99/local-ai-chat.git
cd local-ai-chat

# Using Poetry
poetry shell

# Install packages
poetry install

# Run setup_env
sh scripts/setup_env.sh

# Run CLI
poetry run python main.py chat
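The typewriter-style streaming works because Ollama's chat endpoint returns newline-delimited JSON, one small fragment per line. A minimal sketch of parsing one such chunk (the function name `extract_token` is illustrative, not the repository's actual code):

```python
import json

def extract_token(line: bytes) -> str:
    """Parse one NDJSON line from Ollama's streaming /api/chat response
    and return the text fragment it carries (empty string once done)."""
    chunk = json.loads(line)
    if chunk.get("done"):
        return ""
    return chunk.get("message", {}).get("content", "")

# Printing each fragment as it arrives, without a newline,
# produces the typewriter effect in the terminal.
sample = b'{"message": {"role": "assistant", "content": "Hel"}, "done": false}'
print(extract_token(sample), end="", flush=True)
```

In the real client you would iterate over the HTTP response lines and feed each one through a parser like this.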

Configuration

# data/config/settings.yaml

model:
  default: "llava:latest"
  temperature: 0.7  # 0-1, creativity control
  gpu_layers: 20    # Use 0 for CPU-only
  max_history: 6    # Context window size

privacy:
  encrypt_history: true
  auto_clear_temp: true

ui:
  theme: "dark"     # dark/light
  response_color: "cyan"
