Multi-user email archival system with search. Syncs emails from IMAP, POP3, and Gmail API accounts into a structured filesystem.
See Competitors for similar OSS projects.
Central hub for all email. Aggregates messages from every source into structured storage (filesystem, S3). Unstructured mail becomes normalized data ready for downstream systems.
- Multi-user — username/password registration (no email verification), optional OAuth2 (GitHub, Google, Facebook)
- Multi-account — each user manages their own email accounts
- Protocol support — IMAP, POP3, Gmail API (OAuth flow incomplete)
- PST/OST import — upload Outlook archive files (10GB+), streamed with progress
- Deduplication — SHA-256 content checksums prevent duplicate storage (see the sketch after this list)
- Search — keyword search (DuckDB + Parquet) and similarity search (Qdrant + Ollama)
- Live sync — cancel running syncs, real-time progress, auto-reindex every 5s
- Date preservation — file mtime set from email Date/Received headers
- UUIDv7 IDs — time-ordered identifiers for all entities
- Raw storage — emails preserved as .eml files (RFC 822), readable by any mail client
- Per-user isolation — all data under users/{uuid}/
- Mobile-first UI — bottom nav, infinite scroll, swipe between emails
- High-performance search results — virtual list (viewport-only rendering), custom scroll bar, throttled scroll, CSS containment for smooth UX with 30k+ emails
- PWA — installable on desktop and mobile (standalone app, offline shell for static assets)
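A minimal sketch of how the checksum-based deduplication and UUIDv7 identifiers fit together, assuming github.com/google/uuid for the UUIDv7 part; the in-memory map stands in for the real per-user sync database:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"

	"github.com/google/uuid" // assumption: any library with UUIDv7 support works
)

// seen maps SHA-256 content checksums to message IDs; the service keeps
// equivalent state in its per-user sync database.
var seen = map[string]string{}

// storeIfNew returns the existing ID for a duplicate message, or a fresh
// time-ordered UUIDv7 when the raw content has not been archived yet.
func storeIfNew(raw []byte) (id string, isNew bool) {
	sum := sha256.Sum256(raw)
	key := hex.EncodeToString(sum[:])
	if existing, ok := seen[key]; ok {
		return existing, false
	}
	v7, err := uuid.NewV7()
	if err != nil {
		panic(err) // NewV7 only fails if clock/randomness is unavailable
	}
	seen[key] = v7.String()
	return v7.String(), true
}

func main() {
	raw := []byte("From: a@example.com\r\n\r\nHello")
	id, fresh := storeIfNew(raw)
	fmt.Println(id, fresh) // first call: new UUIDv7, true
	_, fresh = storeIfNew(raw)
	fmt.Println(fresh) // identical content on the second call: false
}
```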
Note: The service never deletes or marks emails as read.
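On the IMAP side this archival-only behavior corresponds to fetching bodies with a peek, so the server never sets the \Seen flag. A minimal sketch assuming the emersion/go-imap v1 client; the actual sync code may differ:

```go
package main

import (
	"log"

	"github.com/emersion/go-imap"
	"github.com/emersion/go-imap/client"
)

func main() {
	c, err := client.DialTLS("imap.example.com:993", nil)
	if err != nil {
		log.Fatal(err)
	}
	defer c.Logout()
	if err := c.Login("user", "password"); err != nil {
		log.Fatal(err)
	}
	// Select read-only so flags cannot be changed even by accident.
	if _, err := c.Select("INBOX", true); err != nil {
		log.Fatal(err)
	}

	seqset := new(imap.SeqSet)
	seqset.AddRange(1, 10)

	// Peek: true issues BODY.PEEK[] instead of BODY[], leaving \Seen untouched.
	section := &imap.BodySectionName{Peek: true}
	messages := make(chan *imap.Message, 10)
	done := make(chan error, 1)
	go func() {
		done <- c.Fetch(seqset, []imap.FetchItem{section.FetchItem()}, messages)
	}()
	for msg := range messages {
		_ = msg.GetBody(section) // raw RFC 822 literal, ready to write out as .eml
	}
	if err := <-done; err != nil {
		log.Fatal(err)
	}
}
```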
# 1. Clone and build
git clone https://github.com/eSlider/mails.git
cd mails
go build ./cmd/mails
# 2. Run
./mails serve
# 3. Open browser and register
open http://localhost:8090

For Docker:
cp .env.example .env # optional: configure OAuth providers
docker compose up -d
open http://localhost:8090

Docker images are published to ghcr.io/eSlider/mail-archive with SemVer tags (e.g. v1.0.1, 1.0.1) and latest. Use a specific version for production:
docker pull ghcr.io/eslider/mail-archive:v1.0.1

| Variable | Default | Description |
|---|---|---|
| LISTEN_ADDR | :8090 | HTTP listen address |
| DATA_DIR | ./users | Base directory for user data |
| BASE_URL | http://localhost:8090 | Public URL for OAuth callbacks |
| GITHUB_CLIENT_ID | — | GitHub OAuth app client ID |
| GITHUB_CLIENT_SECRET | — | GitHub OAuth app client secret |
| GOOGLE_CLIENT_ID | — | Google OAuth app client ID |
| GOOGLE_CLIENT_SECRET | — | Google OAuth app client secret |
| FACEBOOK_CLIENT_ID | — | Facebook OAuth app client ID |
| FACEBOOK_CLIENT_SECRET | — | Facebook OAuth app client secret |
| QDRANT_URL | — | Qdrant gRPC address for similarity search |
| OLLAMA_URL | — | Ollama API URL for embeddings |
| EMBED_MODEL | all-minilm | Embedding model name |
| S3_ENDPOINT | — | S3-compatible storage endpoint (e.g. MinIO) |
| S3_ACCESS_KEY_ID | — | S3 access key |
| S3_SECRET_ACCESS_KEY | — | S3 secret key |
| S3_BUCKET | mails | S3 bucket name |
| S3_USE_SSL | true | Use HTTPS for S3 endpoint |
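Configuration is read straight from the environment. A minimal sketch of applying the defaults from the table above (the helper and struct names are illustrative, not the actual code):

```go
package config

import "os"

// getenv returns the value of key, or def when the variable is unset.
func getenv(key, def string) string {
	if v := os.Getenv(key); v != "" {
		return v
	}
	return def
}

// Config mirrors the table above; only a few fields are shown.
type Config struct {
	ListenAddr string
	DataDir    string
	BaseURL    string
	S3Bucket   string
}

// Load builds a Config with the documented defaults.
func Load() Config {
	return Config{
		ListenAddr: getenv("LISTEN_ADDR", ":8090"),
		DataDir:    getenv("DATA_DIR", "./users"),
		BaseURL:    getenv("BASE_URL", "http://localhost:8090"),
		S3Bucket:   getenv("S3_BUCKET", "mails"),
	}
}
```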
OAuth is not required. Users can register with email + password by default.
To enable OAuth login, configure one or more providers:
- GitHub: Create an OAuth App at github.com/settings/applications/new. Set callback URL to {BASE_URL}/auth/github/callback.
- Google: Create credentials at console.cloud.google.com. Set redirect URI to {BASE_URL}/auth/google/callback.
- Facebook: Create an app at developers.facebook.com. Set redirect URI to {BASE_URL}/auth/facebook/callback.
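Each provider maps onto a standard OAuth2 authorization-code flow with the callback URLs listed above. A hedged sketch for the GitHub case using golang.org/x/oauth2 (the service's actual wiring may differ):

```go
package main

import (
	"fmt"
	"os"

	"golang.org/x/oauth2"
	"golang.org/x/oauth2/github"
)

func main() {
	baseURL := os.Getenv("BASE_URL") // e.g. http://localhost:8090
	conf := &oauth2.Config{
		ClientID:     os.Getenv("GITHUB_CLIENT_ID"),
		ClientSecret: os.Getenv("GITHUB_CLIENT_SECRET"),
		RedirectURL:  baseURL + "/auth/github/callback",
		Scopes:       []string{"user:email"},
		Endpoint:     github.Endpoint,
	}
	// Redirect the browser to this URL; the callback handler then exchanges
	// the returned code for a token with conf.Exchange(ctx, code).
	fmt.Println(conf.AuthCodeURL("random-state"))
}
```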
When S3 env vars are set, user.json, accounts.yml, sessions.json, and .eml files are stored in S3. SQLite and Parquet stay on the local filesystem.
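A minimal sketch of the S3 half of that split, assuming the MinIO Go client (github.com/minio/minio-go/v7); the object key is illustrative and mirrors the per-user layout shown below:

```go
package main

import (
	"context"
	"log"
	"os"
	"strings"

	"github.com/minio/minio-go/v7"
	"github.com/minio/minio-go/v7/pkg/credentials"
)

func main() {
	// minio.New expects host:port without a scheme; scheme handling is
	// simplified here compared to a real implementation.
	endpoint := strings.TrimPrefix(strings.TrimPrefix(os.Getenv("S3_ENDPOINT"), "https://"), "http://")
	mc, err := minio.New(endpoint, &minio.Options{
		Creds:  credentials.NewStaticV4(os.Getenv("S3_ACCESS_KEY_ID"), os.Getenv("S3_SECRET_ACCESS_KEY"), ""),
		Secure: os.Getenv("S3_USE_SSL") != "false",
	})
	if err != nil {
		log.Fatal(err)
	}

	// Raw messages go to S3; the Parquet index and SQLite sync state stay local.
	raw := strings.NewReader("From: a@example.com\r\n\r\nHello")
	_, err = mc.PutObject(context.Background(),
		os.Getenv("S3_BUCKET"),
		"users/<user-uuid>/gmail.com/eslider/inbox/<message-id>.eml", // illustrative key
		raw, int64(raw.Len()),
		minio.PutObjectOptions{ContentType: "message/rfc822"},
	)
	if err != nil {
		log.Fatal(err)
	}
}
```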
users/
019c56a4-a9ef-79bd-b53a-ef7a080d9c90/
user.json # User metadata
accounts.yml # Email account configs
sync.sqlite # Sync state database
logs/ # Structured sync logs
gmail.com/
eslider/
inbox/
a1b2c3d4e5f67890-12345.eml
gmail/
sent/
b2c3d4e5f6789012-12346.eml
index.parquet # Search index
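How a message path under this layout might be assembled; the filename format is taken from the example above and is only illustrative:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// emlPath joins the per-user components into the on-disk location of a stored
// message: {DATA_DIR}/{user uuid}/{mail domain}/{account}/{folder}/{name}.eml
func emlPath(dataDir, userID, domain, account, folder, name string) string {
	return filepath.Join(dataDir, userID, domain, account, folder, name+".eml")
}

func main() {
	fmt.Println(emlPath("./users",
		"019c56a4-a9ef-79bd-b53a-ef7a080d9c90",
		"gmail.com", "eslider", "inbox",
		"a1b2c3d4e5f67890-12345"))
	// users/019c56a4-a9ef-79bd-b53a-ef7a080d9c90/gmail.com/eslider/inbox/a1b2c3d4e5f67890-12345.eml
}
```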
cmd/mails/ → Entry point, CLI (serve, fix-dates, version)
internal/
auth/ → OAuth2 (GitHub, Google, Facebook), sessions
storage/ → Blob store (FS or S3) for user data
user/ → User storage (users/{uuid}/)
account/ → Email account CRUD (accounts.yml)
model/ → Shared types (User, Account, SyncJob)
sync/ → Sync orchestration, live indexing, cancel support
imap/ → IMAP protocol sync (UID-based, context-aware)
pop3/ → POP3 protocol sync
gmail/ → Gmail API sync
pst/ → PST/OST file import (go-pst; readpst fallback for newer OST)
search/
eml/ → .eml parser (charset, MIME, fuzzy date parsing)
index/ → DuckDB search index → Parquet (with cache cleanup)
vector/ → Qdrant similarity search
web/ → Chi router, HTTP handlers
web/
static/ → CSS, JS, Vue templates (all local, no CDN, no build step)
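The eml/ package's date handling can be illustrated with the standard library alone: parse the Date header and stamp it onto the file, which is what the fix-dates command relies on. A minimal sketch (the real parser additionally handles charsets, MIME parts, and fuzzy date formats):

```go
package main

import (
	"log"
	"net/mail"
	"os"
)

// applyMessageDate sets the .eml file's modification time to the value of
// its Date header, mirroring the fix-dates behavior.
func applyMessageDate(path string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	defer f.Close()

	msg, err := mail.ReadMessage(f)
	if err != nil {
		return err
	}
	date, err := msg.Header.Date() // parses the RFC 5322 Date header
	if err != nil {
		return err
	}
	return os.Chtimes(path, date, date)
}

func main() {
	if err := applyMessageDate("inbox/a1b2c3d4e5f67890-12345.eml"); err != nil {
		log.Fatal(err)
	}
}
```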
# Build
go build ./cmd/mails
# Run locally
DATA_DIR=./users ./mails serve
# Fix file timestamps on all .eml files
./mails fix-dates
# Run unit tests
go test ./...
# Run e2e tests (requires GreenMail + Qdrant + Ollama)
docker compose --profile test up -d greenmail
go test -tags e2e -v ./tests/e2e/
# Run S3 storage integration tests (requires MinIO)
docker compose --profile s3 up -d minio
S3_ENDPOINT=http://localhost:9900 S3_ACCESS_KEY_ID=minioadmin S3_SECRET_ACCESS_KEY=minioadmin S3_BUCKET=mails-test S3_USE_SSL=false go test -v ./internal/storage/
# Docker dev mode (auto-rebuild on changes)
docker compose watch

| Component | Technology |
|---|---|
| Backend | Go 1.24+ |
| Frontend | Vue.js 3.5, native fetch (no build step, no CDN) |
| Search index | DuckDB (in-memory) + Parquet (persistence) |
| Vector search | Qdrant + Ollama embeddings |
| Sync state | SQLite (per-user) |
| Auth | bcrypt passwords, optional OAuth2 |
| Container | Docker + Docker Compose |
| License | MIT |
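The keyword-search path in that stack can be sketched with the DuckDB Go driver querying a per-user Parquet index directly (assuming github.com/marcboeker/go-duckdb; the column names and path are illustrative):

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/marcboeker/go-duckdb" // registers the "duckdb" driver
)

func main() {
	// Empty DSN = in-memory DuckDB instance, matching the table above.
	db, err := sql.Open("duckdb", "")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Query the Parquet file without loading it into a table first.
	rows, err := db.Query(
		`SELECT subject, "from"
		   FROM read_parquet('users/<user-uuid>/index.parquet')
		  WHERE subject ILIKE '%' || ? || '%'
		  LIMIT 20`, "invoice")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var subject, from string
		if err := rows.Scan(&subject, &from); err != nil {
			log.Fatal(err)
		}
		fmt.Println(subject, from)
	}
}
```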
The frontend uses Vue.js 3 and native fetch with zero build tooling — no webpack, no Vite, no TypeScript. All vendor libraries are committed locally under web/static/js/vendor/ (no CDN dependency).
The app is installable as a Progressive Web App. Use your browser’s install option (Chrome/Edge: menu → “Install…”; Safari iOS: Share → “Add to Home Screen”). Requires HTTPS in production (localhost works for development).
- Bottom navigation — Search, Accounts, Import tabs (viewport < 768px)
- Infinite scroll — "Load more" button and auto-load when scrolling to the bottom (mouse wheel and drag)
- Email detail — Prev/next buttons, swipe left/right, position count (e.g. "3 of 50"), back to search results
- Virtual list — renders only items in the viewport (~40 DOM nodes instead of thousands)
- Custom scroll bar — drag-to-scroll, click track to jump, progress fill shows load state
- Hidden native scrollbar — clean UI; scroll via mouse wheel, touch, or custom bar
Vue templates are stored as standalone .vue files containing raw HTML with Vue directives. At startup, main.js fetches the template asynchronously before creating the Vue app:
// main.js — async bootstrap
(async function () {
var res = await fetch("/static/js/app/main.template.vue");
var template = await res.text();
var App = {
template: template,
data: function () {
return {
/* ... */
};
},
methods: {
/* ... */
},
};
Vue.createApp(App).mount("#app");
})();

This keeps the template editable as a proper .vue file (with IDE syntax highlighting and linting) while avoiding any compile/transpile step. The pattern is inspired by the Produktor UI approach to dynamic component loading.
web/static/
css/app.css # Application styles (dark theme, responsive)
favicon.svg
manifest.webmanifest # PWA manifest
sw.js # Service worker (served at /sw.js)
js/
vendor/
vue-3.5.13.global.prod.js # Vue.js (local copy)
app/
main.js # App logic (native fetch, ES6+)
main.template.vue # Vue template (HTML with directives)
sw-register.js # Service worker registration
Store user data on S3 when S3_ENDPOINT and credentials are set:
docker compose --profile s3 up -d minio
export S3_ENDPOINT=http://localhost:9900
export S3_ACCESS_KEY_ID=minioadmin
export S3_SECRET_ACCESS_KEY=minioadmin
export S3_BUCKET=mails
export S3_USE_SSL=false
./mails serve

See TODO.md
- Advanced analytics — Conversation timeline on the map; history scroller with progress bar
- Callback registration — Search filters that trigger callbacks when matching emails arrive
See CONTRIBUTING.md for the full API reference.
# Health check (no auth)
curl http://localhost:8090/health
# Search (requires session cookie)
curl -b cookies.txt "http://localhost:8090/api/search?q=invoice&limit=20"
# List accounts
curl -b cookies.txt http://localhost:8090/api/accounts
# Trigger sync
curl -b cookies.txt -X POST http://localhost:8090/api/sync
# Stop a running sync
curl -b cookies.txt -X POST http://localhost:8090/api/sync/stop -H 'Content-Type: application/json' -d '{"account_id":"..."}'
# Import PST/OST file
curl -b cookies.txt -X POST http://localhost:8090/api/import/pst -F "file=@archive.pst" -F "title=My Outlook Archive"
# Check import progress
curl -b cookies.txt http://localhost:8090/api/import/status/{job_id}
# For newer Outlook OST files, install pst-utils: apt install pst-utils

MIT — see LICENSE.