# local-llm-integration

Here are 9 public repositories matching this topic...

🚀 A powerful Flutter-based AI chat application that lets you run LLMs directly on your mobile device or connect to local model servers. Features offline model execution, Ollama/LLMStudio integration, and a beautiful modern UI. Privacy-focused, cross-platform, and fully open source.

  • Updated Feb 16, 2025
  • Dart
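The repository above connects to local model servers such as Ollama. As a hypothetical sketch (not code from that repository), this is roughly how a client builds a request for Ollama's HTTP API, which by default listens on `localhost:11434`; the model name and prompt here are illustrative assumptions.

```python
import json

# Ollama's default local endpoint for single-turn generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,    # e.g. "llama3" -- any model pulled locally
        "prompt": prompt,
        "stream": False,   # request one JSON reply instead of a stream
    }
    return json.dumps(payload).encode("utf-8")

body = build_generate_request("llama3", "Hello from a local LLM client!")

# Sending the request requires a running Ollama server:
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL, data=body,
#     headers={"Content-Type": "application/json"})
# reply = json.loads(urllib.request.urlopen(req).read())["response"]
```

Because the server runs on the user's own machine, prompts and responses never leave the device, which is the privacy property the description refers to.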
