The AI-native Markdown editor.
OpenMarkdown is an MIT-licensed desktop Markdown editor built for the agent era. It is designed around a simple idea:
Markdown is becoming the default working format for AI.
Prompts, notes, specs, memory, research, agent context, product docs, personal knowledge, and lightweight publishing all work better when the source format is structured, portable, and easy for both humans and models to understand.
OpenMarkdown is built to make that workflow fast, local-first, and practical on real files, including very large Markdown documents that traditional editors struggle with.
GitHub:
- Repository: https://github.com/ToBeWin/openmarkdown
- Releases: https://github.com/ToBeWin/openmarkdown/releases
- Issues: https://github.com/ToBeWin/openmarkdown/issues
Quick glance:
- 🤖 AI-native Markdown editing
- 📂 Local-first workflows
- 🚀 100MB+ large-file path
- 🦀 Rust core + native desktop runtime
- 🖥️ macOS / Windows / Linux
- ⚖️ MIT licensed
OpenMarkdown is not just another Markdown editor.
It is intended to be:
- AI-native: AI works directly on the current note, section, or selection and writes back into the document.
- Local-first: your notes stay on your machine by default; local RAG and local models are first-class.
- Built for large files: the editor is designed to open and edit 100MB+ Markdown files without freezing the UI.
- Cross-platform: native desktop runtime with a Rust core and a Svelte frontend.
- Open source: MIT licensed, inspectable, extensible, and community-friendly.
The product principle is:
simple by default, powerful when needed.
That means:
- the editor remains the primary surface
- advanced controls stay secondary
- AI, knowledge, tools, and extensions always serve the current Markdown task
- low-frequency complexity does not crowd the default workflow
Most Markdown editors were built for a pre-agent workflow:
- write text
- preview text
- export text
That is no longer enough.
Today, Markdown is increasingly used as:
- AI prompt source
- AI memory format
- personal knowledge format
- spec and documentation format
- project planning format
- note-taking format
- agent context format
In this environment, the editor itself needs to understand:
- the current note
- the current section
- the current selection
- the local workspace
- the user's preferred model
- the difference between cloud and local workflows
OpenMarkdown is built around that reality.
OpenMarkdown does not treat AI as a side chatbot bolted onto an editor.
AI can work on:
- the full note
- the current section
- the current selection
AI outputs can:
- replace a selection
- replace a section
- update the whole note
- insert or append content
- flow back into the current Markdown task
Inline AI editing supports:
- polish
- expand
- shorten
- structure
- translate to English
- translate to Chinese
- patch-style preview before apply
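The write-back step behind these inline actions can be pictured as a byte-range splice over the document. The sketch below is illustrative only; `apply_selection` is a hypothetical helper name, not OpenMarkdown's actual API:

```rust
/// Splice replacement text (e.g. an AI rewrite) into a document over the
/// current selection. `start..end` is a byte range; callers must pass
/// char-boundary-safe offsets or the slice will panic on multi-byte UTF-8.
fn apply_selection(doc: &str, start: usize, end: usize, replacement: &str) -> String {
    let mut out = String::with_capacity(doc.len() - (end - start) + replacement.len());
    out.push_str(&doc[..start]);
    out.push_str(replacement);
    out.push_str(&doc[end..]);
    out
}

fn main() {
    let doc = "# Notes\n\nDraft paragraph.\n";
    // Replace the word "Draft" (bytes 9..14) with a polished phrase.
    let updated = apply_selection(doc, 9, 14, "Polished");
    println!("{updated}"); // prints "# Notes\n\nPolished paragraph.\n"
}
```

A patch-style preview would diff `doc` against the returned string before committing it, which is why the splice returns a new string instead of mutating in place.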
OpenMarkdown is designed so that important workflows can stay local:
- local files
- local workspace browsing
- local RAG over your notes
- local model support through Ollama
- local export
- local image asset storage
If you want cloud models, you can use them. If you want local models, they are first-class.
Large Markdown files are not treated as a separate product.
OpenMarkdown includes a large-file engine integrated into the normal Writer flow. Open a large Markdown file and the editor switches to the large-file path automatically.
Current large-file behavior includes:
- async indexing
- range-based reads
- large-file editing path inside Writer
- temporary open for dragged-in Markdown files
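The core idea behind range-based reads can be sketched with plain standard-library IO: only the requested byte window of a large document is ever pulled into memory. The real engine presumably layers async IO (Tokio) and an index on top; this helper and its name are assumptions for illustration:

```rust
use std::fs::File;
use std::io::{Read, Seek, SeekFrom};
use std::path::Path;

/// Read only the bytes `offset..offset + len` from a Markdown file,
/// so a 100MB+ document never has to be loaded wholesale.
fn read_range(path: &Path, offset: u64, len: u64) -> std::io::Result<String> {
    let mut file = File::open(path)?;
    file.seek(SeekFrom::Start(offset))?;
    let mut buf = String::new();
    // `take` caps the read at `len` bytes; `read_to_string` loops until
    // that cap (or EOF), so short reads are handled for us.
    file.take(len).read_to_string(&mut buf)?;
    Ok(buf)
}

fn main() -> std::io::Result<()> {
    // Write a small sample "document" so the example is self-contained.
    let path = std::env::temp_dir().join("om_sample.md");
    std::fs::write(&path, "# Title\n\nBody text.\n")?;
    println!("{}", read_range(&path, 0, 7)?); // prints "# Title"
    Ok(())
}
```

An async index over heading offsets would let the editor translate "current section" into such a byte range without scanning the whole file.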
Benchmark artifacts live in `docs/benchmarks/latest-large-file.json`.
The right-side Workbench is not meant to be a second app.
Its job is to support the current Markdown editing task.
Workbench includes:
- AI Agent
- Knowledge
- Tools
- Extensions
Workbench results can:
- write back into the current Markdown document
- send results into AI for further transformation
This turns the editor into a single working system instead of multiple disconnected feature pages.
Core editing features include:
- WYSIWYG-style hybrid editing
  - write Markdown and see structure render in place
  - source-visible when you need it, clean document flow when you do not
- Source mode
- Line numbers
- Outline
- Quick Open
- Command Palette
- Zen mode
- Table formatting tools
- section-aware editing behaviors
- slash commands for inserts
OpenMarkdown currently supports the primary Markdown writing workflow plus rich blocks commonly used in modern Markdown:
- headings
- unordered lists
- ordered lists
- task lists
- blockquotes
- fenced code blocks
- tables
- image links
- horizontal rules
- LaTeX formulas
- Mermaid diagrams
- ECharts blocks
- media embed blocks
Images can be inserted by:
- choosing files from disk
- pasting
- dragging and dropping
Imported images are stored beside the note in `.openmarkdown-assets/`.
Markdown uses relative links so notes remain portable.
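The portability claim follows from the link shape: an imported asset is referenced relative to the note, so moving the note folder moves the images with it. The helper below is a hypothetical illustration (the `.openmarkdown-assets/` folder name comes from the text above; `asset_link` is not OpenMarkdown's actual API):

```rust
/// Build a portable Markdown image link for an asset stored in the
/// note-adjacent `.openmarkdown-assets/` folder.
fn asset_link(alt: &str, file_name: &str) -> String {
    format!("![{alt}](./.openmarkdown-assets/{file_name})")
}

fn main() {
    println!("{}", asset_link("diagram", "diagram.png"));
    // prints "![diagram](./.openmarkdown-assets/diagram.png)"
}
```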
OpenMarkdown includes a multi-model adapter with support for:
- Ollama
- OpenAI
- Anthropic
- Gemini
- DeepSeek
- Qwen
- MiniMax
- GLM
- Kimi
Provider UX includes:
- local / cloud readiness state
- model fetching
- fallback handling
- conversation history
- timestamps
- attachment preview
- model identity on assistant replies
- configurable chat-history folder
Knowledge features include:
- local RAG index over your notes
- workspace-aware retrieval
- result write-back into the current note
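The ranking step at the heart of any local RAG index can be sketched as cosine similarity between a query vector and stored chunk vectors. Real embeddings would come from a model (local via Ollama or a cloud provider); the vectors and helper names below are placeholders:

```rust
/// Cosine similarity between two embedding vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm = |v: &[f32]| v.iter().map(|x| x * x).sum::<f32>().sqrt();
    let (na, nb) = (norm(a), norm(b));
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Return the note chunk whose embedding is closest to the query.
fn best_chunk<'a>(query: &[f32], chunks: &'a [(&'a str, Vec<f32>)]) -> Option<&'a str> {
    chunks
        .iter()
        .max_by(|x, y| cosine(query, &x.1).total_cmp(&cosine(query, &y.1)))
        .map(|(text, _)| *text)
}

fn main() {
    let chunks = vec![
        ("## Setup notes", vec![0.9, 0.1]),
        ("## Release checklist", vec![0.1, 0.9]),
    ];
    let query = vec![0.2, 0.8]; // closer to the second chunk
    println!("{:?}", best_chunk(&query, &chunks)); // prints Some("## Release checklist")
}
```

The retrieved chunk is what flows into write-back: it can be inserted into the current note or handed to the AI agent for further transformation.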
Built-in local tools include:
- folder reading
- file-range reading
- Python execution
- URL fetching
- extension sandbox
- plugin execution
- editor-context-aware payloads
Pandoc-based local export queue supports:
- DOCX
- LaTeX
- HTML
- Reveal.js slides
- PNG
OpenMarkdown keeps the editor at the center and routes AI, retrieval, tools, export, and large-file handling around the current Markdown task.
```mermaid
flowchart LR
    user["User"] --> writer["Writer<br/>Hybrid / Source / Large File"]
    writer --> workbench["Workbench<br/>AI Agent / Knowledge / Tools / Extensions"]
    writer --> editor_state["Editor Context<br/>note / section / selection"]
    editor_state --> workbench
    workbench --> writeback["Write Back to Markdown"]
    writeback --> writer
    writer --> tauri["Tauri v2 Desktop Shell"]
    tauri --> rust["Rust Core"]
    rust --> file_engine["Large-file Engine"]
    rust --> ai_router["AI Router"]
    rust --> rag["Local RAG Index"]
    rust --> exportq["Export Queue"]
    rust --> skills["Built-in Skills / MCP Layer"]
    rust --> plugins["Plugin Host"]
    ai_router --> ollama["Ollama"]
    ai_router --> cloud["OpenAI / Anthropic / Gemini / DeepSeek / Qwen / MiniMax / GLM / Kimi"]
    exportq --> pandoc["Pandoc"]
```
- Tauri v2
  - native desktop packaging
  - file dialogs
  - local file persistence
- Rust
  - Tokio
  - async IO
  - provider routing
  - export queue
  - RAG indexing
  - built-in tools
- Svelte 5
  - Tailwind CSS
  - hybrid editor UI
  - Workbench UI
  - local state management
```
src-tauri/src/
  main.rs          Tauri commands and native integration
  ai_router.rs     Multi-provider AI routing and streaming
  file_engine.rs   Large-file engine
  export_queue.rs  Export queue
  rag_index.rs     Local RAG index
  mcp.rs           Built-in tools / skills
  plugin_host.rs   Plugin sandbox host

src/lib/
  ai/       AI Agent, RAG, MCP client UI
  editor/   Hybrid editor, large-file editor, export helpers
  plugins/  Extension center
  ui/       Workspace shell and desktop UI
```
OpenMarkdown is designed to compete on both experience and capability.
- AI is integrated into editing, not bolted on
- local models are first-class
- local knowledge workflows are built in
- very large Markdown files are part of the main product path
- Workbench results can flow directly back into the note
- Rust core with a native desktop runtime instead of a browser-heavy stack
- smaller packaging footprint than many Electron-class desktop apps
- lower baseline runtime overhead from the native stack
- local-first
- no forced cloud note storage
- direct file ownership
- desktop-native performance
- portable Markdown output instead of proprietary documents
- native bundle size and runtime profile designed to stay lean
OpenMarkdown is built around local ownership.
- notes live on your machine
- local RAG can run on your machine
- local models through Ollama are supported directly
- cloud providers are optional, not mandatory
When you choose a cloud model, the request necessarily goes to that provider. When you choose a local model, the note can stay on your machine.
OpenMarkdown is designed for:
- macOS
- Windows
- Linux
The native stack is based on Tauri v2 with platform-specific adaptation paths.
Current packaging direction:
- macOS: `.app`, `.dmg`
- Windows: native Tauri desktop bundle targets
- Linux: native Tauri desktop bundle targets
Quick start:

```sh
npm install
npm run dev
npm run tauri:dev
```

Full script list:

```sh
npm run dev
npm run build
npm run preview
npm run check
npm run qa:buttons
npm run smoke
npm run benchmark:large
npm run release:audit
npm run release:manifest
npm run release:gate
npm run tauri:dev
npm run tauri:build
npm run tauri:build:full
```

`npm run release:gate` runs the local release gate, including:
- build
- smoke checks
- benchmark generation
- release audit
Standalone checks: `npm run smoke`, `npm run qa:buttons`.

`npm run tauri:build` is the stable packaging path for this repository. `npm run tauri:build:full` runs the full Tauri build directly.
Actual output format depends on:
- current operating system
- installed native toolchains
- bundling/signing environment
Release documentation and benchmark artifacts:
- `docs/BENCHMARK.md`
- `docs/RELEASE_NOTES_v1.0.0.md`
- `docs/benchmarks/latest-large-file.json`
- `docs/release/manifest-v1.0.0.json`
Internal release and launch playbooks live under:
- `docs/internal/release/`
- `docs/internal/launch/`
OpenMarkdown uses semantic versioning.
The public release format should be `v1.0.0`, `v1.0.1`, `v1.1.0`, `v2.0.0`, and so on.
Release date can still be shown in release notes and changelogs, but the product version should remain semver-based.
OpenMarkdown is released under the MIT License.
- License file: `LICENSE`
This means:
- commercial use is allowed
- modification is allowed
- redistribution is allowed
- private use is allowed
subject to the standard MIT license terms.
Current public release target:
- v1.0.0
OpenMarkdown is already usable as a real desktop Markdown editor with AI-native workflows, but the long-term goal is larger:
the best AI-native Markdown editor of the agent era


