Stina
A Local-First AI Assistant for Your Workday
Stina runs on your machine, keeps your data local, and connects to any AI provider you choose. Use Ollama for fully offline AI, OpenAI for cloud power, or build your own provider — it's your call.
What makes Stina different
Privacy First
All your data stays on your machine. No cloud sync, no telemetry, no surprises. You own your data.
Choose Your AI
Run Ollama for fully offline AI, connect to OpenAI for cloud models, or build your own provider extension.
Work Anywhere
Desktop app (Electron), web interface, or CLI/TUI — pick the interface that fits your workflow.
Extensible
Mail reader, work/todo manager, people registry, and more. Install extensions or build your own.
Natural Language
Just talk naturally. Tell Stina what you need in everyday language — no commands to memorize.
Self-Hostable
Run Stina on your own server with Docker Compose. Full control over your AI assistant infrastructure.
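To give a feel for the "build your own provider" idea, here is a minimal sketch of what a custom AI provider could look like. The `Provider` interface, method names, and the echo implementation are assumptions for illustration only — the actual Stina extension API may differ:

```typescript
// Hypothetical provider shape -- the real Stina extension API may differ.
interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

interface Provider {
  name: string;
  // Given the conversation so far, produce the assistant's reply.
  complete(messages: ChatMessage[]): Promise<string>;
}

// A toy offline provider that echoes the last user message,
// standing in for an Ollama- or OpenAI-backed implementation.
const echoProvider: Provider = {
  name: "echo",
  async complete(messages) {
    const users = messages.filter((m) => m.role === "user");
    const last = users[users.length - 1];
    return last ? `You said: ${last.content}` : "";
  },
};

// Usage: any code that talks to a Provider works the same way
// whether the model runs locally or in the cloud.
echoProvider
  .complete([{ role: "user", content: "hello" }])
  .then((reply) => console.log(reply));
```

The point of a provider abstraction like this is that the rest of the assistant never needs to know where the model runs — swapping Ollama for OpenAI is a configuration change, not a code change.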
Get started
Download the latest release for macOS, Windows, or Linux.
GitHub Releases →

Run Stina as a web app with Docker Compose.
curl -O https://raw.githubusercontent.com/einord/stina/main/docker-compose.yml
docker compose up -d

Open localhost:3002
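For orientation, a compose file for a setup like this might look roughly as follows. This is only a sketch — the image name and volume layout are assumptions, and the authoritative file is the docker-compose.yml fetched by the curl command above:

```yaml
# Hypothetical sketch -- use the docker-compose.yml from the repository instead.
services:
  stina:
    image: ghcr.io/einord/stina:latest   # assumed image name
    ports:
      - "3002:3002"                      # web UI on localhost:3002
    volumes:
      - stina-data:/data                 # keep all data on this host

volumes:
  stina-data:
```

Running `docker compose up -d` against a file like this starts the web app in the background, with all state kept in a local volume rather than any external service.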
Clone and build the project yourself.
git clone https://github.com/einord/stina.git
cd stina
pnpm install
pnpm build:packages
pnpm dev:web