AI on your machine.
Under your control.
The only local AI desktop app with built-in RAG, project organization, and multi-provider support — in English and Spanish. No subscriptions. No hidden cloud.
Free · Offline-ready · Windows · macOS · Linux
1. Local-first architecture with SQLite
2. RAG with Ollama embeddings...
Everything in one place.
No extensions. No integrations. No API keys required to get started.
Built-in RAG Engine
Upload documents and MIKA5 automatically chunks, embeds, and retrieves the most relevant fragments for every message. Per-project and per-chat knowledge bases.
Projects & Chats
Organize work into projects with isolated knowledge, system prompts, and chat history. Quick Chat for ad-hoc conversations without a project.
Vision Models
Attach images to any message. MIKA5 auto-detects vision-capable models (LLaVA, GLM-OCR, Gemma3, Qwen2.5-VL, GPT-4o, Claude) and warns when a model can't process images.
Cloud AI — Optional
Connect OpenAI, Anthropic, Groq, or Moonshot when you need them. API keys stored with AES-256-GCM encryption. A privacy indicator shows when cloud is active.
Model Tracking
Every message is tagged with the model that generated it. Model switches within a chat are highlighted inline so you always know which AI answered.
Export Anywhere
Export individual messages or full conversations to Markdown, plain text, HTML, Python, or PDF. Smart titles generated automatically from content.
Popular & everyday — local via Ollama
High-end — require 32–64 GB RAM and a 24 GB+ GPU
⚡ High-end models require significant hardware. Performance varies by PC specs. See Model Guide for detailed hardware requirements.
Local runtimes: Ollama · llama.cpp — Cloud providers: OpenAI · Anthropic · Groq · Moonshot
The cloud isn't neutral.
Subscription lock-in
Pricing and access can change overnight without your consent.
Token billing
You pay per thought. Creativity shouldn't have a meter running.
Data exposure
Prompts pass through servers you don't own or control.
Cloud AI flow
You → Browser → Vendor Cloud → Policies → Model → Response
# Your prompt stored, processed, potentially used for training
MIKA5 flow
You → MIKA5 Desktop → Local Ollama → Done
# Zero network calls. Your hardware. Your data.
Download MIKA5
Free. No account. No license key.
⚠ macOS: The app is not notarized. After installing from the .dmg, right-click MIKA5.app → Open on first launch to bypass Gatekeeper.
# Verify download integrity
PS> Get-FileHash "MIKA5-Setup-0.1.0-win-x64.exe" -Algorithm SHA256 # Windows
$ shasum -a 256 MIKA5-0.1.0-mac-arm64.dmg # macOS
$ sha256sum MIKA5-0.1.0-linux-x64.AppImage # Linux
# Compare with the SHA256 published in the GitHub release notes
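The comparison can be scripted as well. A minimal sketch for macOS/Linux — the filename matches the Linux release above, and `expected` is a placeholder you must fill in from the GitHub release notes:

```shell
# Sketch: automated checksum check (paste the real hash from the release notes)
file="MIKA5-0.1.0-linux-x64.AppImage"
expected="<sha256 from the GitHub release notes>"
actual=$(sha256sum "$file" | awk '{print $1}')
if [ "$actual" = "$expected" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH"
fi
```

Only launch the installer if the script prints `checksum OK`.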
All releases → github.com/mika5app/mika5/releases
Up and running in 3 minutes.
MIKA5 requires Ollama running locally. Install Ollama first, then follow these steps.
Install Ollama
Download from ollama.com and install it. It runs a local server at 127.0.0.1:11434.
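To confirm the server is up before launching MIKA5, you can query Ollama's version endpoint — a quick sanity check, not an official setup step:

```shell
# Prints Ollama's version JSON if the local server is reachable
curl -s http://127.0.0.1:11434/api/version || echo "Ollama is not running"
```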
Pull a model + the embedding model
$ ollama pull llama3.2
$ ollama pull nomic-embed-text
# nomic-embed-text powers the RAG engine
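To see what the embedding model produces, you can call Ollama's embeddings endpoint directly — this assumes the Ollama server is running and `nomic-embed-text` has been pulled; the prompt text is just an example:

```shell
# Returns a JSON object with an "embedding" array of floats
curl -s http://127.0.0.1:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "What does MIKA5 store locally?"}' \
  || echo "Ollama is not running"
```

MIKA5 generates vectors like this for every document chunk and query, then retrieves chunks by similarity.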
Install and launch MIKA5
Run the MIKA5-Setup-0.1.0-win-x64.exe installer, then open MIKA5 from the Start Menu or Desktop shortcut.
System Requirements
- Windows 10 or 11 (64-bit)
- 8 GB RAM minimum (16 GB recommended)
- 4 GB free disk (+ model files)
- Ollama installed and running
- GPU optional — CPU inference supported
Recommended starter models
Your data never leaves your machine.
No telemetry. No background sync. No account creation. No hidden calls home. All conversations and knowledge are stored in a local SQLite database on your computer.
Run AI on your terms.
v0.1.0 · Windows · macOS · Linux · Free & Open Source
MIKA5 Pro is coming
Team collaboration, cloud sync, advanced RAG, and more. Get notified when it launches.
No spam. Unsubscribe anytime.