A personal AI assistant built on LangGraph. Cerebro connects to your local-brain knowledge base via MCP, giving the agent direct access to your todos, notes, projects, and daily logs through a terminal chat interface.
- Terminal chat (TUI) — clean chat interface built with Textual, streaming responses token by token
- Brain tools — reads and writes todos, notes, and projects via the `brain-mcp` MCP server
- Web search — falls back to Tavily for current events and general knowledge
- Persistent history — conversations are saved to SQLite; each day resumes the same thread automatically
- LangGraph Studio — open the graph visually with `langgraph dev` for debugging
- local-brain — install and configure your brain, then make sure `brain-mcp` is available in your `$PATH`
- Anthropic API key — Cerebro defaults to Claude Haiku; any `anthropic/` model works
- Python ≥ 3.11 and uv
```shell
git clone https://github.com/sandermoonemans/cerebro
cd cerebro
cp .env.example .env
# Add your API key(s) to .env
uv sync
uv run cerebro
```

| Key | Action |
|---|---|
| `Enter` | Send message |
| `Ctrl+N` | Start a new thread |
| `Ctrl+C` | Quit |
Chat history is stored in `~/.local/share/cerebro/chat.db`. Each calendar day gets its own thread by default; `Ctrl+N` starts a fresh one at any time.
`.env` file:

```shell
ANTHROPIC_API_KEY=your-key-here  # required
TAVILY_API_KEY=your-key-here     # optional, enables web search
```

The model and system prompt can also be overridden via environment variables (`MODEL`, `SYSTEM_PROMPT`) or by editing `src/cerebro/context.py`.
```shell
# Unit tests
uv run pytest tests/unit_tests/

# LangGraph Studio (visual graph debugger)
uv run langgraph dev
```

```
src/cerebro/
├── graph.py     # create_graph() factory + default compiled graph
├── chat.py      # Textual TUI entry point
├── context.py   # Runtime config (model, system prompt, search results)
├── prompts.py   # Default system prompt
├── state.py     # Agent state schema
├── tools.py     # web_search tool (Tavily)
└── utils.py     # load_chat_model helper
```
The agent loop: `call_model → [tools →] call_model → response`
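In outline, the loop can be sketched like this — plain-Python stand-ins for the LangGraph nodes, not the real graph, which wires `call_model` and the tools node together with conditional edges:

```python
def agent_loop(call_model, execute_tools, messages: list) -> dict:
    # call_model returns an assistant message; a non-empty "tool_calls"
    # field routes back through the tools node, otherwise we respond.
    while True:
        response = call_model(messages)
        messages.append(response)
        tool_calls = response.get("tool_calls")
        if not tool_calls:
            return response  # final answer, loop ends
        # Tool results are appended so the next model call sees them.
        messages.extend(execute_tools(tool_calls))
```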
Brain tools (todos, notes, projects, search within brain, etc.) are loaded at TUI startup from the `brain-mcp` MCP server and injected into the graph. If `brain-mcp` is not found in `$PATH`, the agent falls back to web search only.
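The fallback check itself is simple. A sketch, assuming the lookup is a plain `$PATH` probe (hypothetical helper — the actual startup code may differ):

```python
import shutil

def select_tools(web_search, load_brain_tools):
    # If brain-mcp isn't on $PATH, the MCP server can't be started,
    # so only the web_search tool is handed to the graph.
    if shutil.which("brain-mcp") is None:
        return [web_search]
    return [web_search, *load_brain_tools()]
```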