flow + prompt = flompt
Turn any AI prompt into a visual flow. Decompose, edit as a flowchart, recompile.
flompt is a visual prompt engineering tool that transforms how you write AI prompts.
Instead of writing one long block of text, flompt lets you:
- Decompose — Paste any prompt and let AI break it into structured blocks
- Edit visually — Drag, connect, and reorder blocks in a flowchart editor
- Recompile — Generate a Claude-optimized, machine-ready prompt from your flow
Think of it as Figma for prompts — visual, structured, and built for Claude.
11 specialized blocks that map directly to Claude's prompt engineering best practices:
| Block | Purpose | Claude XML |
|---|---|---|
| Role | AI persona & expertise | `<role>` |
| Context | Background information | `<context>` |
| Objective | What you want to achieve | `<objective>` |
| Input | Data you're providing | `<input>` |
| Constraints | Rules & limitations | `<constraints>` |
| Output Format | Expected output structure | `<output_format>` |
| Examples | Few-shot demonstrations | `<examples><example>` |
| Chain of Thought | Reasoning steps | `<thinking>` |
| Document | External content grounding | `<documents><document>` |
| Format Control | Claude-specific directives (tone, verbosity, markdown) | `<format_instructions>` |
| Language | Response language | `<language>` |
Blocks are automatically ordered following Anthropic's recommended prompt structure.
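The ordering step can be sketched roughly as follows (an illustrative Python sketch, not flompt's actual implementation; the `CANONICAL_ORDER` list is an assumption based on the block table above):

```python
# Hypothetical canonical ordering — flompt's real ordering may differ.
CANONICAL_ORDER = [
    "role", "context", "objective", "input", "constraints",
    "examples", "document", "thinking", "output_format",
    "format_instructions", "language",
]

def compile_blocks(blocks: list[dict]) -> str:
    """Sort blocks by canonical position and wrap each in its XML tag."""
    ordered = sorted(blocks, key=lambda b: CANONICAL_ORDER.index(b["type"]))
    return "\n".join(f"<{b['type']}>{b['content']}</{b['type']}>" for b in ordered)

print(compile_blocks([
    {"type": "objective", "content": "Parse JSON safely."},
    {"type": "role", "content": "You are a Python expert."},
]))
# → <role>You are a Python expert.</role>
#   <objective>Parse JSON safely.</objective>
```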
→ flompt.dev — No account needed. Free & open-source.
Use flompt directly inside ChatGPT, Claude, and Gemini — without leaving your tab.
- ✦ Enhance button injected into the AI chat input
- Bidirectional sync between the sidebar and the chat
- Works on ChatGPT · Claude · Gemini
→ Download from GitHub Releases
flompt exposes its core capabilities as native tools inside Claude Code via the Model Context Protocol (MCP).
Once configured, you can call decompose_prompt, compile_prompt, and list_block_types directly from any Claude Code conversation — no browser, no copy-paste.
Option 1 — CLI (recommended):
```shell
claude mcp add --transport http --scope user flompt https://flompt.dev/mcp/
```
The `--scope user` flag makes flompt available in all your Claude Code projects.
Option 2 — ~/.claude.json:
```json
{
  "mcpServers": {
    "flompt": {
      "type": "http",
      "url": "https://flompt.dev/mcp/"
    }
  }
}
```
Once connected, three tools are available in Claude Code:
`decompose_prompt` breaks a raw prompt down into structured blocks (role, objective, context, constraints, etc.).
- Uses Claude or GPT on the server if an API key is configured
- Falls back to keyword-based heuristic analysis otherwise
- Returns a list of typed blocks plus the full JSON to pass to `compile_prompt`
Input: `"You are a Python expert. Write a function that parses JSON and handles errors."`

Output:
```text
✅ 3 blocks extracted:
[ROLE] You are a Python expert.
[OBJECTIVE] Write a function that parses JSON…
[CONSTRAINTS] handles errors
📋 Full blocks JSON: [{"id": "...", "type": "role", ...}, ...]
```
`compile_prompt` compiles a list of blocks into a Claude-optimized XML prompt.
- Takes the JSON from `decompose_prompt` (or manually crafted blocks)
- Reorders blocks following Anthropic's recommended structure
- Returns the final XML prompt with an estimated token count
Input: `[{"type": "role", "content": "You are a Python expert", ...}, ...]`

Output: ✅ Prompt compiled (142 estimated tokens):
```xml
<role>You are a Python expert.</role>
<objective>Write a function that parses JSON and handles errors.</objective>
```
`list_block_types` lists all 11 available block types with their descriptions and the recommended canonical ordering. Useful when crafting blocks manually.
1. `decompose_prompt("your raw prompt here")` → get structured blocks as JSON
2. (Optionally) edit the JSON to add/remove/modify blocks
3. `compile_prompt("<json from step 1>")` → get a Claude-optimized XML prompt, ready to use
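Under the hood, the stateless HTTP transport means each step is a plain JSON-RPC 2.0 request. A minimal sketch of building such a request body (the `tools/call` method and `name`/`arguments` params follow the MCP spec; the `prompt` argument name is an assumption, not confirmed by this document):

```python
import json

def mcp_tool_call(tool: str, arguments: dict, req_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 'tools/call' request body for an MCP endpoint."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# POST this body to https://flompt.dev/mcp/ with headers:
#   Content-Type: application/json
#   Accept: application/json, text/event-stream
body = mcp_tool_call("decompose_prompt", {"prompt": "You are a Python expert."})
```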
| Property | Value |
|---|---|
| Transport | Streamable HTTP (POST) |
| Endpoint | https://flompt.dev/mcp/ |
| Session | Stateless (each call is independent) |
| Auth | None required |
| DNS rebinding protection | Enabled (flompt.dev explicitly allowed) |
- Python 3.12+
- Node.js 18+
- An Anthropic or OpenAI API key (optional — heuristic fallback works without one)
Backend:
```shell
cd backend
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
cp .env.example .env  # add your API key
uvicorn app.main:app --reload --port 8000
```
App (Frontend):
```shell
cd app
cp .env.example .env  # optional: add PostHog key
npm install
npm run dev
```
Blog:
```shell
cd blog
npm install
npm run dev  # available at http://localhost:3000/blog
```
| Service | URL |
|---|---|
| App | http://localhost:5173 |
| Backend API | http://localhost:8000 |
| API Docs (Swagger) | http://localhost:8000/docs |
| MCP endpoint | http://localhost:8000/mcp/ |
flompt supports multiple AI providers. Copy backend/.env.example to backend/.env:
```shell
# Anthropic (recommended)
ANTHROPIC_API_KEY=sk-ant-...
AI_PROVIDER=anthropic
AI_MODEL=claude-3-5-haiku-20241022

# or OpenAI
OPENAI_API_KEY=sk-...
AI_PROVIDER=openai
AI_MODEL=gpt-4o-mini
```
No API key? No problem — flompt falls back to a heuristic decomposer (keyword-based) and structured XML compilation.
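The heuristic fallback can be imagined as something like the following (a hypothetical sketch only; flompt's real keyword rules are certainly more elaborate, and the `RULES` patterns here are invented for illustration):

```python
import re

# Hypothetical keyword → block-type rules, checked in order.
RULES = [
    (re.compile(r"\byou are\b", re.I), "role"),
    (re.compile(r"\b(write|create|generate|build)\b", re.I), "objective"),
    (re.compile(r"\b(must|never|always)\b", re.I), "constraints"),
]

def decompose_heuristic(prompt: str) -> list[dict]:
    """Assign each sentence to the first matching block type (default: context)."""
    blocks = []
    for sentence in filter(None, (s.strip() for s in prompt.split("."))):
        block_type = next((t for rx, t in RULES if rx.search(sentence)), "context")
        blocks.append({"type": block_type, "content": sentence})
    return blocks
```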
This section documents the exact production setup running at flompt.dev. Everything lives in /projects/flompt.
```text
Internet
  │
  ▼
Caddy (auto-TLS, reverse proxy) ← port 443/80
  ├── /app*  → Vite SPA static files (app/dist/)
  ├── /blog* → Next.js static export (blog/out/)
  ├── /api/* → FastAPI backend (localhost:8000)
  ├── /mcp/* → FastAPI MCP server (localhost:8000, no buffering)
  ├── /docs* → Reverse proxy to GitBook
  └── /      → Static landing page (landing/)
          ↓
  FastAPI (uvicorn, port 8000)
          ↓
  Anthropic / OpenAI API
```
Both Caddy and the FastAPI backend are managed by supervisord, itself watched by a keepalive loop.
```shell
# Python 3.12+ with pip
python --version

# Node.js 18+
node --version

# Caddy binary placed at /projects/flompt/caddy
# (not committed to git — download from https://caddyserver.com/download)
curl -o caddy "https://caddyserver.com/api/download?os=linux&arch=amd64"
chmod +x caddy

# supervisor installed in a Python virtualenv
pip install supervisor
```
Backend (backend/.env):
```shell
ANTHROPIC_API_KEY=sk-ant-...          # or OPENAI_API_KEY
AI_PROVIDER=anthropic                 # or: openai
AI_MODEL=claude-3-5-haiku-20241022    # model to use for decompose/compile
```
App frontend (app/.env):
```shell
VITE_POSTHOG_KEY=phc_...   # optional analytics
VITE_POSTHOG_HOST=https://eu.i.posthog.com
```
Blog (blog/.env.local):
```shell
NEXT_PUBLIC_POSTHOG_KEY=phc_...
NEXT_PUBLIC_POSTHOG_HOST=https://eu.i.posthog.com
```
All assets must be built before starting services. Use the deploy script, or build manually:
Full deploy (build + restart + health check):
```shell
cd /projects/flompt
./deploy.sh
```
Build only (no service restart):
```shell
./deploy.sh --build-only
```
Restart only (no rebuild):
```shell
./deploy.sh --restart-only
```
Manual build steps:
```shell
# 1. Vite SPA → app/dist/
cd /projects/flompt/app
npm run build
# Output: app/dist/ (pre-compressed with gzip, served by Caddy)

# 2. Next.js blog → blog/out/
cd /projects/flompt/blog
rm -rf .next out  # clear cache to avoid stale builds
npm run build
# Output: blog/out/ (full static export, no Node server needed)
```
Production processes are managed by supervisord (supervisord.conf):
| Program | Command | Port | Log |
|---|---|---|---|
| flompt-backend | `uvicorn app.main:app --host 0.0.0.0 --port 8000` | 8000 | /tmp/flompt-backend.log |
| flompt-caddy | `caddy run --config /projects/flompt/Caddyfile` | 443/80 | /tmp/flompt-caddy.log |
Both programs have autorestart=true and startretries=5 — they automatically restart on crash.
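For reference, a minimal supervisord.conf matching the table above might look like this (program names, commands, flags, and log paths come from this document; the remaining directives are illustrative defaults, not the actual production file):

```ini
[supervisord]
logfile=/tmp/flompt-supervisord.log

[program:flompt-backend]
command=uvicorn app.main:app --host 0.0.0.0 --port 8000
directory=/projects/flompt/backend
autorestart=true
startretries=5
stdout_logfile=/tmp/flompt-backend.log
redirect_stderr=true

[program:flompt-caddy]
command=/projects/flompt/caddy run --config /projects/flompt/Caddyfile
autorestart=true
startretries=5
stdout_logfile=/tmp/flompt-caddy.log
redirect_stderr=true
```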
Start supervisord (first boot or after a full restart):
```shell
supervisord -c /projects/flompt/supervisord.conf
```
Common supervisorctl commands:
```shell
# Check status of all programs
supervisorctl -c /projects/flompt/supervisord.conf status

# Restart backend only (e.g. after a code change)
supervisorctl -c /projects/flompt/supervisord.conf restart flompt-backend

# Restart Caddy only (e.g. after a Caddyfile change)
supervisorctl -c /projects/flompt/supervisord.conf restart flompt-caddy

# Restart everything
supervisorctl -c /projects/flompt/supervisord.conf restart all

# Stop everything
supervisorctl -c /projects/flompt/supervisord.conf stop all

# Read real-time logs
tail -f /tmp/flompt-backend.log
tail -f /tmp/flompt-caddy.log
tail -f /tmp/flompt-supervisord.log
```
keepalive.sh is an infinite bash loop (running as a background process) that:
- Checks every 30 seconds whether supervisord is alive
- If supervisord is down, kills any zombie process occupying port 8000 (via inode lookup in `/proc/net/tcp`)
- Restarts supervisord
- Logs all events to `/tmp/flompt-keepalive.log`
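The port-8000 inode lookup can be illustrated in Python (keepalive.sh itself is bash; this is only a sketch of the `/proc/net/tcp` parsing idea, where each row lists a socket's local address as hex IP:port and its inode, which can then be matched against `/proc/<pid>/fd` symlinks):

```python
def inodes_on_port(proc_net_tcp: str, port: int) -> list[str]:
    """Return socket inodes bound to `port`, parsed from /proc/net/tcp content.

    Row layout: sl, local_address (hexIP:hexPORT), rem_address, st,
    tx/rx queues, timers, retrnsmt, uid, timeout, inode, ...
    """
    inodes = []
    for line in proc_net_tcp.splitlines()[1:]:  # skip header row
        fields = line.split()
        if len(fields) > 9:
            local_port = int(fields[1].split(":")[1], 16)  # hex port → int
            if local_port == port:
                inodes.append(fields[9])
    return inodes
```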
Start keepalive (should be running at all times):
```shell
nohup /projects/flompt/keepalive.sh >> /tmp/flompt-keepalive.log 2>&1 &
echo $!  # note the PID
```
Check if keepalive is running:
```shell
ps aux | grep keepalive.sh
tail -f /tmp/flompt-keepalive.log
```
Note: `keepalive.sh` uses the same Python virtualenv path as supervisord. If you reinstall supervisor in a different venv, update the `SUPERVISORD` and `SUPERVISORCTL` paths at the top of `keepalive.sh`.
Caddyfile handles all routing for flompt.dev. Key rules (in priority order):
- `/blog*` → static Next.js export at blog/out/
- `/api/*` → FastAPI backend at localhost:8000
- `/health` → FastAPI health check
- `/mcp/*` → FastAPI MCP server (`flush_interval -1` for streaming)
- `/mcp` → 308 redirect to /mcp/ (avoids upstream 307 issues)
- `/docs*` → reverse proxy to GitBook (external)
- `/app*` → Vite SPA at app/dist/ (gzip precompressed)
- `/` → static landing page at landing/
Reload Caddy after a Caddyfile change:
```shell
supervisorctl -c /projects/flompt/supervisord.conf restart flompt-caddy
# or directly:
/projects/flompt/caddy reload --config /projects/flompt/Caddyfile
```
Caddy auto-manages TLS certificates via Let's Encrypt — no manual SSL setup needed.
The deploy script runs these checks automatically. You can run them manually:
```shell
# Backend API
curl -s https://flompt.dev/health
# → {"status":"ok","service":"flompt-api"}

# Landing page
curl -s -o /dev/null -w "%{http_code}" https://flompt.dev/
# → 200

# Vite SPA
curl -s -o /dev/null -w "%{http_code}" https://flompt.dev/app
# → 200

# Blog
curl -s -o /dev/null -w "%{http_code}" https://flompt.dev/blog/en
# → 200

# MCP endpoint (requires Accept header)
curl -s -o /dev/null -w "%{http_code}" \
  -X POST https://flompt.dev/mcp/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","method":"tools/list","id":1}'
# → 200
```
After a backend code change:
```shell
cd /projects/flompt
git pull
supervisorctl -c supervisord.conf restart flompt-backend
```
After a frontend change:
```shell
cd /projects/flompt
git pull
cd app && npm run build
# No service restart needed — Caddy serves static files directly
```
After a blog change:
```shell
cd /projects/flompt
git pull
cd blog && rm -rf .next out && npm run build
# No service restart needed
```
After a Caddyfile change:
```shell
supervisorctl -c /projects/flompt/supervisord.conf restart flompt-caddy
```
Full redeploy from scratch:
```shell
cd /projects/flompt && ./deploy.sh
```
| File | Content |
|---|---|
| /tmp/flompt-backend.log | FastAPI/uvicorn stdout + stderr |
| /tmp/flompt-caddy.log | Caddy access + error logs |
| /tmp/flompt-supervisord.log | supervisord daemon logs |
| /tmp/flompt-keepalive.log | keepalive watchdog events |
| Layer | Technology |
|---|---|
| Frontend | React 18, TypeScript, React Flow v11, Zustand, Vite |
| Backend | FastAPI, Python 3.12, Uvicorn |
| MCP Server | FastMCP (streamable HTTP transport) |
| AI | Anthropic Claude / OpenAI GPT (pluggable) |
| Reverse Proxy | Caddy (auto-TLS via Let's Encrypt) |
| Process Manager | Supervisord + keepalive watchdog |
| Blog | Next.js 15 (static export), Tailwind CSS |
| Extension | Chrome MV3 (content script + sidebar) |
| i18n | English & French |
- 🎨 Visual flowchart editor — Drag-and-drop blocks with React Flow
- 🤖 AI-powered decomposition — Paste a prompt, get structured blocks
- ⚡ Async job queue — Non-blocking decomposition with live progress tracking
- 🦾 Claude-optimized output — XML structured following Anthropic best practices
- 🧩 Chrome extension — Enhance button inside ChatGPT, Claude & Gemini
- 🤖 Claude Code MCP — Native tool integration via Model Context Protocol
- 📱 Responsive — Full touch support, tap-to-connect
- 🌙 Dark theme — Mermaid-inspired warm dark UI
- 🌐 Bilingual — English & French interface
- 💾 Auto-save — Local persistence with Zustand
- ⌨️ Keyboard shortcuts — Power-user friendly
- 📋 Export — Copy, download as TXT or JSON
- 🔓 Open-source — MIT licensed, self-hostable
⭐ Star this repo if flompt helps you write better prompts!