A high-performance, privacy-first email client built with Rust and Slint, featuring deep local AI integration.
- Outlook-Inspired UI: A modern, responsive design with support for Light and Dark modes, resizable sidebars, and fluid navigation.
- Tejas AI Assistant: Chat with your entire 1000+ email inbox using local LLMs. Ask about action items, summarize themes, or find specific invoices without your data ever leaving your machine.
- Reply w/ AI ✨: Generate professional, context-aware email drafts instantly based on the active thread.
- Instant Greetings: Blazing-fast, time-aware conversational responses for simple greetings, bypassing the LLM for better responsiveness.
- Advanced Composition: Support for CC/BCC fields, local attachment management, and integrated email format validation.
- Privacy First: All emails are stored in a local SQLite database, and all AI processing is handled locally via Ollama.
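The Instant Greetings feature above can be sketched as a fast path that answers trivial inputs directly and only hands real questions to the LLM. This is an illustrative sketch, not the project's actual code: the function name, greeting list, and reply wording are all assumptions.

```rust
/// Returns a canned, time-aware reply for simple greetings, or None if the
/// message should be forwarded to the local LLM instead.
/// (Hypothetical sketch; names and wording are illustrative.)
fn greeting_fast_path(message: &str, hour: u32) -> Option<String> {
    const GREETINGS: &[&str] = &["hi", "hello", "hey", "good morning", "good evening"];
    let normalized = message
        .trim()
        .trim_end_matches(|c: char| "!.?".contains(c))
        .to_lowercase();
    if !GREETINGS.contains(&normalized.as_str()) {
        return None; // Not a simple greeting: hand the message to the LLM.
    }
    let time_of_day = match hour {
        5..=11 => "morning",
        12..=17 => "afternoon",
        _ => "evening",
    };
    Some(format!("Good {time_of_day}! How can I help with your inbox?"))
}
```

Because the fast path is a pure string check, it returns in microseconds, which is why simple greetings feel instant compared with a full LLM round-trip.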
- Core: Rust
- UI Framework: Slint
- Database: SQLite (via rusqlite)
- Local AI: Ollama (Llama 3.1)
- Networking: Reqwest, Tokio
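To illustrate how the pieces in this stack fit together: the AI layer talks to Ollama's local HTTP API (POST to /api/generate on localhost:11434, with "model", "prompt", and "stream" fields, per Ollama's documented API). The sketch below only assembles the request payload; the function name, prompt layout, and hand-rolled JSON are assumptions for illustration, and the real client would serialize with serde_json and send the request via Reqwest on the Tokio runtime.

```rust
/// Builds a non-streaming generate request for the llama3.1 model, embedding
/// the relevant emails into the prompt as context.
/// (Hypothetical sketch; a real implementation would use serde_json.)
fn build_ollama_request(question: &str, emails: &[&str]) -> String {
    let context = emails.join("\n---\n");
    let prompt = format!("Emails:\n{context}\n\nQuestion: {question}");
    // `{:?}` on a String produces a quoted, escaped literal, which is valid
    // JSON for ordinary ASCII text; proper serialization is still preferable.
    format!(
        "{{\"model\":\"llama3.1\",\"prompt\":{:?},\"stream\":false}}",
        prompt
    )
}
```

Setting "stream" to false asks Ollama to return the whole completion in one response body instead of a stream of chunks, which keeps the sketch simple.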
- Ollama: Install Ollama and download the Llama 3.1 model:
  ollama run llama3.1
- Rust: Ensure you have the latest Rust toolchain installed.
- Clone the repository:
  git clone <repo-url>
  cd email-client
- Run the application:
  cargo run --release
The project is structured into three main components:
- src/main.rs: Application logic, Slint callbacks, and event loop.
- src/ai.rs: Local LLM integration handling 32k context windows for deep email analysis.
- ui/app.slint: High-performance UI definitions and layout logic.
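Since src/ai.rs is described as handling 32k context windows over a 1000+ email inbox, one plausible approach is to greedily pack the newest emails into the token budget and drop the rest. This is a minimal sketch under assumed conventions: the function name, newest-first ordering, and the rough 4-characters-per-token heuristic are illustrative, not taken from the project.

```rust
/// Greedily selects the most recent emails that fit in the context budget.
/// `emails` is ordered newest-first; the kept slice preserves that order.
/// (Hypothetical sketch; not the project's actual packing logic.)
fn pack_context<'a>(emails: &[&'a str], max_tokens: usize) -> Vec<&'a str> {
    let mut used = 0;
    let mut kept = Vec::new();
    for email in emails {
        // Rough heuristic: ~4 characters per token for English text.
        let cost = email.len() / 4 + 1;
        if used + cost > max_tokens {
            break; // Budget exhausted: older emails are left out.
        }
        used += cost;
        kept.push(*email);
    }
    kept
}
```

A refinement would be to summarize the overflow instead of dropping it, but even this greedy cut keeps prompts safely inside the model's 32k window.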