Connect any Postgres database to an MCP server and a chat UI designed for tool-building, observability, and safe data exploration.
- Live demo: chat.anashel.com
This repo is a toolkit + reference implementation for running:
- `pg-mcp`: an MCP server (Cloudflare Worker) exposing Postgres-focused tools (not just “run SQL”)
- `llm-app`: a lightweight chat UI + LLM orchestrator (Cloudflare Worker + Vite frontend) built to inspect, debug, and measure every tool interaction
The demo ships with the public NY SPARCS Hospital Inpatient Discharges (2021) dataset (≈2.13M rows) so you can try it immediately.
Most “LLM + Postgres” projects are thin SQL wrappers. This toolkit intentionally pushes the LLM toward business intelligence-style tools:
- compare / segment / pivot / summarize
- scoped exploration with row caps
- tool-by-tool profiling (latency + concurrency)
You can still run specific queries, but if you inspect the LLM↔MCP exchange you’ll typically see tool-driven analysis rather than “generate one giant SQL query”.
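For a concrete sense of what “tool-driven analysis” looks like on the wire, here is a hypothetical sketch of a single MCP JSON-RPC `tools/call` request. The tool name (`pivot`), table, and argument names are illustrative assumptions, not the repo’s actual tool schema:

```typescript
// Sketch of one LLM→MCP tool call. The tool name and argument shape below
// are illustrative assumptions, not the repo's actual schema.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Instead of generating one giant SQL string, the model asks for a scoped pivot:
const call = buildToolCall(1, "pivot", {
  table: "sparcs_discharges",   // hypothetical table name
  rows: "facility_name",
  values: "total_charges",
  agg: "avg",
  limit: 50,                    // row caps are also enforced server-side
});

console.log(JSON.stringify(call, null, 2));
```

Each such call (payload, response, timing) is what the UI surfaces when you inspect the exchange.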
- Full tool transparency: inspect every MCP call, payload, response, and timing
- SSE conversation inspection: view the full streaming exchange end-to-end
- Latency + concurrency testing: measure parallel tool calls and overhead
- Table UX: export tables, open results in modals, copy JSON
- Multilingual chat: ships with 5 languages; add more easily
- On-the-fly glossary: generates a glossary to explain unfamiliar fields/values
- Row-Level Security (RLS) friendly: user sessions map to a “security scope”; the MCP should only pivot data the user is allowed to see
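The latency + concurrency testing mentioned above can be sketched in a few lines. `timedCall`, `profileParallel`, and the `sleep`-based fake tools below are stand-ins for real MCP calls, not the app’s actual instrumentation:

```typescript
// Minimal sketch of per-tool latency measurement under concurrency.
// The fake tools (sleep) stand in for real MCP round-trips.
interface Timing { name: string; ms: number }

async function timedCall<T>(
  name: string,
  fn: () => Promise<T>
): Promise<{ result: T; timing: Timing }> {
  const start = performance.now();
  const result = await fn();
  return { result, timing: { name, ms: performance.now() - start } };
}

async function profileParallel(calls: Array<[string, () => Promise<unknown>]>) {
  const wall = performance.now();
  const settled = await Promise.all(calls.map(([name, fn]) => timedCall(name, fn)));
  return {
    perTool: settled.map((s) => s.timing),
    // Wall time ≈ the slowest call, not the sum — that gap is the win
    // from parallel tool calls.
    wallMs: performance.now() - wall,
  };
}

const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

profileParallel([
  ["summarize", () => sleep(30)],
  ["segment", () => sleep(50)],
]).then((p) => console.log(p.perTool.length, "tools,", Math.round(p.wallMs), "ms wall"));
```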
- Runtime: Cloudflare Workers (2 workers) + Cloudflare Hyperdrive (Postgres connection)
- State: Durable Objects (auth, chat sessions, scoped dataset handles)
- Observability: Cloudflare Logs + Traces enabled in `wrangler.toml`
- Security posture (default): API-key protected, fail-closed, with extra CORS + schema filtering options
```mermaid
flowchart LR
    U[User / Browser] -->|HTTPS| APP["llm-app Worker<br/>(API + UI)"]
    subgraph CF[Cloudflare]
        APP -->|serves| ASSETS["Static assets<br/>(Vite build via Workers Assets)"]
        APP -->|reads/writes| DO_AUTH["Durable Object<br/>AuthConfig"]
        APP -->|reads/writes| DO_CHAT["Durable Object<br/>ChatSession"]
        APP -->|MCP JSON-RPC<br/>X-API-Key| MCP["pg-mcp Worker"]
        MCP -->|stores dataset handles| DO_DATA["Durable Object<br/>DatasetStore"]
        MCP -->|Hyperdrive| HD[(Hyperdrive)]
    end
    HD -->|Postgres| PG[(PostgreSQL)]
    APP -->|LLM API| LLM["OpenAI-compatible LLM endpoint"]
```
What’s in the diagram
- Durable Objects:
  - llm-app: `AuthConfig`, `ChatSession`
  - pg-mcp: `DatasetStore`
- KV/R2/D1: not used in the current implementation (the “KV” you may see in the UI code is just a key/value markdown rendering feature, not Cloudflare KV).
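To make the `DatasetStore` idea concrete: instead of handing raw result sets around, the MCP side can hand out opaque, scope-bound handles. Below is a loose, in-memory TypeScript sketch of that pattern — the real `DatasetStore` is a Durable Object backed by Cloudflare’s storage API, and its actual interface will differ:

```typescript
// In-memory sketch of a DatasetStore-style handle registry. The real thing
// is a Durable Object; this only illustrates the scope-bound-handle pattern.
import { randomUUID } from "node:crypto";

interface DatasetHandle {
  id: string;
  scope: string;   // the user's security scope (e.g. tenant or role)
  sql: string;     // the scoped query this handle represents
  maxRows: number; // row cap applied when the handle is materialized
}

class DatasetStore {
  private handles = new Map<string, DatasetHandle>();

  create(scope: string, sql: string, maxRows = 1000): string {
    const id = randomUUID();
    this.handles.set(id, { id, scope, sql, maxRows });
    return id;
  }

  // Look-ups are scope-checked, so a handle leaked across sessions is useless.
  get(id: string, scope: string): DatasetHandle | undefined {
    const h = this.handles.get(id);
    return h && h.scope === scope ? h : undefined;
  }
}

const store = new DatasetStore();
const id = store.create("tenant-a", "SELECT * FROM discharges", 500);
console.log(store.get(id, "tenant-a")?.maxRows); // 500
console.log(store.get(id, "tenant-b"));          // undefined (wrong scope)
```

This is the same principle as the RLS-friendly “security scope” mentioned in the features list: the handle, not the caller, carries the scope.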
This repo is very close to being turnkey, but it’s not “one command and you’re live” yet.
- Some setup still requires manual steps (Cloudflare login, Hyperdrive creation, SPARCS CSV download, setting OpenAI key).
- I’m actively working on simplifying install/deploy (and adding a Docker/local wrapper) this week.
That said, there is already a working setup flow (doctor, setup, deploy:all) and the demo is running in production.
- Node.js >= 20 (Wrangler v4 requirement)
- A Postgres database
- A Cloudflare account with Workers + (optionally) Hyperdrive access
- The SPARCS CSV (for the demo dataset): see `sql/dataset/README.md`
cd mcp-postgress
```sh
npm install
npm run doctor
```
The importer reads `sql/setup/.env`. If it doesn’t exist, it will be created from `sql/setup/env.template`.
```sh
npm run setup:db -- --schema-only
```
Then edit: `sql/setup/.env` → set `DATABASE_URL=postgres://...`
To import a safe sample (recommended on first run):
```sh
npm run setup:db -- --max-rows 250000 --index-mode minimal
```
To import the full dataset:
```sh
npm run setup:db -- --full
```
Create a Hyperdrive config using the repo-pinned Wrangler (v4):
```sh
npm run wrangler -- hyperdrive create my-hyperdrive --connection-string="postgres://..."
```
Then set the returned Hyperdrive ID in:
`workers/pg-mcp/wrangler.toml` → `[[hyperdrive]] id = "YOUR_HYPERDRIVE_ID"`
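For reference, a `[[hyperdrive]]` block in `wrangler.toml` typically looks like the fragment below. The binding name `HYPERDRIVE` is an assumption — check the repo’s actual `workers/pg-mcp/wrangler.toml` for the name it uses:

```toml
# workers/pg-mcp/wrangler.toml (fragment)
[[hyperdrive]]
binding = "HYPERDRIVE"     # name exposed on env.* in the Worker (assumed here)
id = "YOUR_HYPERDRIVE_ID"  # paste the ID returned by `hyperdrive create`
```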
First, log in to Cloudflare (using the repo-pinned Wrangler):
```sh
npm run wrangler -- login
```
Generate + upload the shared MCP API key to both workers:
```sh
npm run setup:secrets
```
Set your OpenAI key for llm-app (required for chat):
```sh
npm run wrangler -- secret put OPENAI_API_KEY --config workers/llm-app/wrangler.toml
```
This deploys pg-mcp, detects its URL, wires llm-app to it, then deploys llm-app:
```sh
npm run deploy:all
```
```sh
# UI dev (remote backend)
npm run dev

# Run the MCP worker locally
npm run dev:mcp

# Run the full app in “local” mode (see llm-app scripts)
npm run dev:local
```
This repo includes a template `cursor-mcp.json` you can copy into your MCP client config.
- Template: `cursor-mcp.json`
- Fill in:
  - `url`: your deployed `pg-mcp` RPC endpoint
  - `X-API-Key`: the `MCP_API_KEY` you generated
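A rough sketch of what the filled-in client config might look like — the exact key names (`mcpServers`, `headers`) vary by MCP client, and the URL here is a placeholder; treat the repo’s `cursor-mcp.json` template as the source of truth:

```json
{
  "mcpServers": {
    "pg-mcp": {
      "url": "https://pg-mcp.YOUR_SUBDOMAIN.workers.dev/YOUR_RPC_ENDPOINT",
      "headers": {
        "X-API-Key": "YOUR_MCP_API_KEY"
      }
    }
  }
}
```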
pg-mcp is designed to be fail-closed:
- If `MCP_API_KEY` is missing, it rejects requests by default.
- There is an `ALLOW_UNAUTHENTICATED` flag in `workers/pg-mcp/wrangler.toml` intended only for local development.
- CORS controls and schema filtering are available via `CORS_*`, `EXCLUDED_SCHEMAS`, and `ALLOWED_SOURCES`.
- Multiple “row cap” safeguards exist (`MAX_*_ROWS`) to prevent accidental large pulls.
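The fail-closed behavior can be illustrated with a small, hypothetical authorization check — this is not the actual pg-mcp middleware, just the pattern it describes:

```typescript
// Sketch of a fail-closed API-key check: a missing MCP_API_KEY secret
// rejects every request rather than silently allowing them.
interface Env {
  MCP_API_KEY?: string;
  ALLOW_UNAUTHENTICATED?: string;
}

function authorize(headerKey: string | null, env: Env): { ok: boolean; status: number } {
  // Local-dev escape hatch, mirroring the ALLOW_UNAUTHENTICATED flag.
  if (env.ALLOW_UNAUTHENTICATED === "true") return { ok: true, status: 200 };
  // Fail closed: no configured secret means nobody gets in.
  if (!env.MCP_API_KEY) return { ok: false, status: 503 };
  if (headerKey !== env.MCP_API_KEY) return { ok: false, status: 401 };
  return { ok: true, status: 200 };
}

console.log(authorize("secret", { MCP_API_KEY: "secret" })); // { ok: true, status: 200 }
console.log(authorize("secret", {}));                        // { ok: false, status: 503 }
console.log(authorize("wrong", { MCP_API_KEY: "secret" }));  // { ok: false, status: 401 }
```

The key design choice is the 503 branch: misconfiguration (no secret set) is treated as an outage, not as “open to everyone”.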
- `workers/pg-mcp/`: MCP server Worker
- `workers/llm-app/`: chat UI + LLM orchestration Worker + frontend
- `shared/`: shared utilities (telemetry, middleware)
- `sql/`: demo schema/import/indexes/rollups + dataset docs
- `postman/`: Postman collection + environment template
- A truly turnkey install/deploy flow
- A Docker/local wrapper so it can run anywhere (not just Workers)
- More prebuilt tool packs and safer “cap + scope” patterns for production data
If you’re building MCP tools and want a UI where you can see everything (payloads, SSE, concurrency, latency) while staying secure by default, this is exactly what I wanted to exist.