Meet ModelDock – the local AI cockpit for power users. One clean chat UI, every major LLM. ChatGPT, Claude, Gemini, Grok, Qwen, Mistral, Z.ai, even local Ollama – all docked in a single interface and driven by real Chrome sessions, not fragile APIs. Forget juggling tabs, logins, and keys; ModelDock reuses your own browser sessions with Puppeteer, defeats typical bot detection, and streams responses back into a unified chat with voice support and full local privacy. Use the built‑in developer API to script, automate, and wire every provider into your tools from http://localhost:3000/docs/api without sending a single token to a third‑party server.
- Unified Chat Interface: Interact with multiple LLMs from a single UI.
- No API Keys Required: Uses your existing browser sessions/accounts to communicate with models.
- Cross-Platform: Supports Windows, macOS, and Linux out of the box with auto-detected Chrome installations.
- Secure Credentials: Stores session cookies and data securely using `keytar` and SQLite.
- Voice Support: Built-in voice features and customizable styling.
- Privacy-First: Runs entirely locally on your machine.
- ChatGPT (OpenAI)
- Claude (Anthropic)
- Gemini (Google)
- Grok (xAI)
- Z.ai
- Qwen
- Mistral
- Ollama (Local Models)
- Node.js: v18 or newer
- Google Chrome: Installed on your system (paths auto-detected for Windows, macOS, and Linux).
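The Chrome auto-detection mentioned above boils down to a per-platform path lookup. A minimal sketch (the paths below are the standard default install locations; this is illustrative, not ModelDock's actual detection code, which may probe additional locations):

```javascript
// Illustrative per-platform default Chrome path lookup.
// Not ModelDock's real detection logic, which may check more paths.
function defaultChromePath(platform = process.platform) {
  switch (platform) {
    case "win32":
      return "C:\\Program Files\\Google\\Chrome\\Application\\chrome.exe";
    case "darwin":
      return "/Applications/Google Chrome.app/Contents/MacOS/Google Chrome";
    case "linux":
      return "/usr/bin/google-chrome";
    default:
      throw new Error(`Unsupported platform: ${platform}`);
  }
}
```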
1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/ModelDock.git
   cd ModelDock
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

   (Note: This project uses `npm` but also includes a `bun.lock` if you prefer using Bun.)

3. Start the development server:

   ```bash
   npm run dev
   ```

4. Access the application: Open `http://localhost:3000` in your web browser.
ModelDock uses `puppeteer-real-browser` to launch a Chrome instance (headless or visible). When you send a message to a specific provider, the backend navigates to that provider's web interface, injects your session cookies (configured in the settings), and interacts with the chat DOM elements entirely through the browser, scraping the AI's responses and piping them back into your unified chat window.
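The cookie-injection step amounts to translating locally stored session records into the object shape that Puppeteer's `page.setCookie()` expects. A hedged sketch (the input record fields here are a hypothetical storage shape, not ModelDock's actual schema):

```javascript
// Convert locally stored session records into Puppeteer cookie objects.
// The input record shape is an illustrative assumption, not ModelDock's schema.
function toPuppeteerCookies(records, domain) {
  return records.map(({ name, value, path = "/", expires }) => ({
    name,
    value,
    domain, // e.g. the provider's cookie domain
    path,
    httpOnly: true,
    secure: true,
    ...(expires ? { expires } : {}),
  }));
}

// Usage inside the provider automation, once a page is open:
//   await page.setCookie(...toPuppeteerCookies(storedRecords, ".example.com"));
```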
Navigate to the settings icon in the sidebar to configure individual providers, manage cookies, and adjust your theme/voice preferences.
ModelDock now includes a developer-facing local API so you can talk to every supported provider without using the chat UI directly.
- Docs page: Open `http://localhost:3000/docs/api`
- Provider discovery: `GET /api/v1/providers`
- Session setup: `POST /api/v1/providers/:provider/session`
- Chat endpoint: `POST /api/v1/chat`
The public API is designed for local scripts, CLIs, automations, and companion tools. Browser-backed providers can reuse your existing web sessions by importing cookies from a local browser or by posting cookies directly to the session endpoint. If you want to protect the API, set `MODELDOCK_API_KEY` in your environment and send it as a Bearer token or in an `x-modeldock-api-key` header.
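Calling the local API from a script is a plain HTTP exchange. A minimal sketch of request builders for the session and chat endpoints (the endpoint paths and auth headers come from the docs above; the JSON body fields `cookies`, `provider`, and `message` are assumptions about the payload shape):

```javascript
// Build fetch() requests for the local ModelDock API.
// Endpoint paths and auth headers follow the API docs; the JSON body
// shapes are illustrative assumptions.
const BASE = "http://localhost:3000";

function authHeaders(apiKey) {
  // Bearer auth shown here; the x-modeldock-api-key header also works.
  return apiKey ? { Authorization: `Bearer ${apiKey}` } : {};
}

function buildSessionRequest(provider, cookies, apiKey) {
  return {
    url: `${BASE}/api/v1/providers/${provider}/session`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json", ...authHeaders(apiKey) },
      body: JSON.stringify({ cookies }),
    },
  };
}

function buildChatRequest(provider, message, apiKey) {
  return {
    url: `${BASE}/api/v1/chat`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json", ...authHeaders(apiKey) },
      body: JSON.stringify({ provider, message }),
    },
  };
}

// Usage:
//   const { url, options } = buildChatRequest("claude", "Hello!", process.env.MODELDOCK_API_KEY);
//   const res = await fetch(url, options);
```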
We welcome contributions! Please see the CONTRIBUTING.md file for guidelines on how to get started.
This project is licensed under the MIT License - see the LICENSE.md file for details.
