ToBeWin/openmarkdown


OpenMarkdown

The AI-native Markdown editor.

OpenMarkdown is an MIT-licensed desktop Markdown editor built for the agent era. It is designed around a simple idea:

Markdown is becoming the default working format for AI.

Prompts, notes, specs, memory, research, agent context, product docs, personal knowledge, and lightweight publishing all work better when the source format is structured, portable, and easy for both humans and models to understand.

OpenMarkdown is built to make that workflow fast, local-first, and practical on real files, including very large Markdown documents that traditional editors struggle with.

Quick glance:

  • 🤖 AI-native Markdown editing
  • 📂 Local-first workflows
  • 🚀 100MB+ large-file path
  • 🦀 Rust core + native desktop runtime
  • 🖥️ macOS / Windows / Linux
  • ⚖️ MIT licensed

Compass Positioning

OpenMarkdown is not just another Markdown editor.

It is intended to be:

  • AI-native: AI works directly on the current note, section, or selection and writes back into the document.
  • Local-first: your notes stay on your machine by default; local RAG and local models are first-class.
  • Built for large files: the editor is designed to open and edit 100MB+ Markdown without freezing the UI.
  • Cross-platform: native desktop runtime with a Rust core and a Svelte frontend.
  • Open source: MIT licensed, inspectable, extensible, and community-friendly.

The product principle is:

simple by default, powerful when needed.

That means:

  • the editor remains the primary surface
  • advanced controls stay secondary
  • AI, knowledge, tools, and extensions always serve the current Markdown task
  • low-frequency complexity does not crowd the default workflow

Why Now

Most Markdown editors were built for a pre-agent workflow:

  1. write text
  2. preview text
  3. export text

That is no longer enough.

Today, Markdown is increasingly used as:

  • AI prompt source
  • AI memory format
  • personal knowledge format
  • spec and documentation format
  • project planning format
  • note-taking format
  • agent context format

In this environment, the editor itself needs to understand:

  • the current note
  • the current section
  • the current selection
  • the local workspace
  • the user's preferred model
  • the difference between cloud and local workflows

OpenMarkdown is built around that reality.

Why It Is Different

1. AI-native editing

OpenMarkdown does not treat AI as a side chatbot bolted onto an editor.

AI can work on:

  • the full note
  • the current section
  • the current selection

AI outputs can:

  • replace a selection
  • replace a section
  • update the whole note
  • insert or append content
  • flow back into the current Markdown task

Inline AI editing supports:

  • polish
  • expand
  • shorten
  • structure
  • translate to English
  • translate to Chinese
  • patch-style preview before apply
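The write-back model above boils down to splicing AI output into a range of the note. The sketch below is illustrative only: `apply_edit`, its name, and its byte-offset signature are assumptions, not OpenMarkdown's actual editing API.

```rust
/// Illustrative sketch of AI write-back: splice a model's output into a
/// byte range of the note (a selection, a section, or the whole file).
/// The function and offsets are assumptions, not the app's real API.
fn apply_edit(note: &str, start: usize, end: usize, replacement: &str) -> String {
    let mut out = String::with_capacity(note.len() + replacement.len());
    out.push_str(&note[..start]); // text before the selection
    out.push_str(replacement);    // AI output replaces the range
    out.push_str(&note[end..]);   // text after the selection
    out
}

fn main() {
    let note = "# Draft\n\nthis sentnce needs polish\n";
    // Pretend the AI polished bytes 9..34 (the body line).
    let polished = apply_edit(note, 9, 34, "This sentence is polished.");
    println!("{polished}");
}
```

The same primitive covers "replace a selection", "replace a section", and "update the whole note": only the range differs.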

2. Local-first workflows

OpenMarkdown is designed so that important workflows can stay local:

  • local files
  • local workspace browsing
  • local RAG over your notes
  • local model support through Ollama
  • local export
  • local image asset storage

If you want cloud models, you can use them. If you want local models, they are first-class.

3. Large-file architecture

Large Markdown files are not treated as a separate product.

OpenMarkdown includes a large-file engine integrated into the normal Writer flow. Open a large Markdown file and the editor switches to the large-file path automatically.

Current large-file behavior includes:

  • async indexing
  • range-based reads
  • large-file editing path inside Writer
  • temporary open for dragged-in Markdown files

Benchmark artifacts live in:

  • docs/benchmarks/latest-large-file.json
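The combination of async indexing and range-based reads can be sketched as follows: index line-start offsets once, then serve any line with a cheap slice instead of rescanning the whole document. The names here are illustrative, not the real `file_engine.rs` interface.

```rust
/// Build a byte-offset index of line starts. Done once (asynchronously
/// in the real engine), it lets any line of a 100MB+ file be fetched
/// without scanning the document again.
fn build_line_index(text: &[u8]) -> Vec<usize> {
    let mut starts = vec![0];
    for (i, &b) in text.iter().enumerate() {
        if b == b'\n' && i + 1 < text.len() {
            starts.push(i + 1);
        }
    }
    starts
}

/// Range-based read: fetch line `n` using the precomputed index.
fn read_line<'a>(text: &'a [u8], index: &[usize], n: usize) -> &'a [u8] {
    let start = index[n];
    let end = index.get(n + 1).copied().unwrap_or(text.len());
    &text[start..end]
}

fn main() {
    let doc = b"# Title\n\nbody text\n";
    let index = build_line_index(doc);
    println!("{}", String::from_utf8_lossy(read_line(doc, &index, 2)));
}
```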

4. Workbench that serves Markdown

The right-side Workbench is not meant to be a second app.

Its job is to support the current Markdown editing task.

Workbench includes:

  • AI Agent
  • Knowledge
  • Tools
  • Extensions

Workbench results can:

  • write back into the current Markdown document
  • send results into AI for further transformation

This turns the editor into a single working system instead of multiple disconnected feature pages.

Key Capabilities

Writing Experience

  • WYSIWYG-style hybrid editing
  • write Markdown and see structure render in place
  • source-visible when you need it, clean document flow when you do not
  • Source mode
  • Line numbers
  • Outline
  • Quick Open
  • Command Palette
  • Zen mode
  • Table formatting tools
  • section-aware editing behaviors
  • slash commands for inserts

Markdown and Rich Content

OpenMarkdown currently supports the primary Markdown writing workflow plus rich blocks commonly used in modern Markdown:

  • headings
  • unordered lists
  • ordered lists
  • task lists
  • blockquotes
  • fenced code blocks
  • tables
  • image links
  • horizontal rules
  • LaTeX formulas
  • Mermaid diagrams
  • ECharts blocks
  • media embed blocks

Images and Asset Handling

Images can be inserted by:

  • choosing files from disk
  • paste
  • drag and drop

Imported images are stored beside the note in:

  • .openmarkdown-assets/

Markdown uses relative links so notes remain portable.
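For illustration, a portable link produced by the import step looks like the output below; the folder name comes from this README, but the helper itself is a sketch, not the app's real code.

```rust
/// Illustrative helper: turn an imported image's file name into the
/// relative Markdown link that keeps the note portable. Only the
/// `.openmarkdown-assets/` folder name is taken from the app.
fn asset_link(alt: &str, file_name: &str) -> String {
    format!("![{alt}](.openmarkdown-assets/{file_name})")
}

fn main() {
    println!("{}", asset_link("architecture diagram", "diagram.png"));
}
```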

AI Models

OpenMarkdown includes a multi-model adapter with support for:

  • Ollama
  • OpenAI
  • Anthropic
  • Gemini
  • DeepSeek
  • Qwen
  • MiniMax
  • GLM
  • Kimi

Provider UX includes:

  • local / cloud readiness state
  • model fetching
  • fallback handling
  • conversation history
  • timestamps
  • attachment preview
  • model identity on assistant replies
  • configurable chat-history folder
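The readiness and fallback behavior above can be sketched as "pick the first ready provider, fall back to the last configured one." The types and the boolean readiness probe are assumptions for illustration, not the real `ai_router.rs` interface.

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum Provider {
    Ollama,
    OpenAI,
    Anthropic,
}

/// Pick the first provider that reports ready; fall back to the last
/// configured entry so a request is never silently dropped.
fn route(providers: &[(Provider, bool)]) -> Option<Provider> {
    providers
        .iter()
        .find(|p| p.1)            // first ready provider wins
        .or_else(|| providers.last()) // otherwise fall back
        .map(|p| p.0)
}

fn main() {
    let configured = [(Provider::Ollama, true), (Provider::OpenAI, false)];
    println!("{:?}", route(&configured));
}
```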

Knowledge / Tools / Extensions

Knowledge

  • local RAG index over your notes
  • workspace-aware retrieval
  • result write-back into the current note
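Workspace-aware retrieval can be sketched with a simple term-overlap score; this is a stand-in for whatever scoring the real index in `rag_index.rs` uses, and the function name is an assumption.

```rust
/// Term-overlap retrieval sketch: score each chunk by how many query
/// words it contains and return the best match. A stand-in for the
/// app's real local RAG scoring.
fn retrieve<'a>(query: &str, chunks: &[&'a str]) -> Option<&'a str> {
    let terms: Vec<String> = query
        .to_lowercase()
        .split_whitespace()
        .map(str::to_string)
        .collect();
    chunks
        .iter()
        .max_by_key(|c| {
            let lc = c.to_lowercase();
            terms.iter().filter(|t| lc.contains(t.as_str())).count()
        })
        .copied()
}

fn main() {
    let chunks = ["export queue internals", "local model support via Ollama"];
    println!("{:?}", retrieve("local model", &chunks));
}
```

The retrieved chunk is what would be written back into the current note or handed to the AI Agent for further transformation.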

Tools / Skills

Built-in local tools include:

  • folder reading
  • file-range reading
  • Python execution
  • URL fetching

Extensions

  • extension sandbox
  • plugin execution
  • editor-context-aware payloads

Export

Pandoc-based local export queue supports:

  • PDF
  • DOCX
  • LaTeX
  • HTML
  • Reveal.js slides
  • PNG
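Pandoc infers the target format from the output file's extension, so one export job reduces to an argument list like the sketch below; the queue wiring and the `pandoc_args` helper are assumptions, not the real `export_queue.rs` code.

```rust
/// Build the argument list for one export job. pandoc infers the output
/// format from the output extension; --standalone produces a complete
/// document rather than a fragment.
fn pandoc_args(input: &str, output: &str) -> Vec<String> {
    vec![
        "--standalone".into(),
        input.into(),
        "-o".into(),
        output.into(),
    ]
}

fn main() {
    let args = pandoc_args("note.md", "note.pdf");
    // The real queue would spawn this asynchronously, e.g. via
    // std::process::Command::new("pandoc").args(&args).status();
    println!("pandoc {}", args.join(" "));
}
```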

Architecture

OpenMarkdown keeps the editor at the center and routes AI, retrieval, tools, export, and large-file handling around the current Markdown task.

flowchart LR
    user["User"] --> writer["Writer<br/>Hybrid / Source / Large File"]
    writer --> workbench["Workbench<br/>AI Agent / Knowledge / Tools / Extensions"]
    writer --> editor_state["Editor Context<br/>note / section / selection"]
    editor_state --> workbench
    workbench --> writeback["Write Back to Markdown"]
    writeback --> writer

    writer --> tauri["Tauri v2 Desktop Shell"]
    tauri --> rust["Rust Core"]

    rust --> file_engine["Large-file Engine"]
    rust --> ai_router["AI Router"]
    rust --> rag["Local RAG Index"]
    rust --> exportq["Export Queue"]
    rust --> skills["Built-in Skills / MCP Layer"]
    rust --> plugins["Plugin Host"]

    ai_router --> ollama["Ollama"]
    ai_router --> cloud["OpenAI / Anthropic / Gemini / DeepSeek / Qwen / MiniMax / GLM / Kimi"]
    exportq --> pandoc["Pandoc"]

Desktop runtime

  • Tauri v2
  • native desktop packaging
  • file dialogs
  • local file persistence

Core engine

  • Rust
  • Tokio
  • async IO
  • provider routing
  • export queue
  • RAG indexing
  • built-in tools

Frontend

  • Svelte 5
  • Tailwind CSS
  • hybrid editor UI
  • Workbench UI
  • local state management

Major modules

src-tauri/src/
  main.rs              Tauri commands and native integration
  ai_router.rs         Multi-provider AI routing and streaming
  file_engine.rs       Large-file engine
  export_queue.rs      Export queue
  rag_index.rs         Local RAG index
  mcp.rs               Built-in tools / skills
  plugin_host.rs       Plugin sandbox host

src/lib/
  ai/                  AI Agent, RAG, MCP client UI
  editor/              Hybrid editor, large-file editor, export helpers
  plugins/             Extension center
  ui/                  Workspace shell and desktop UI

Product Advantages

OpenMarkdown is designed to compete on both experience and capability.

Compared to traditional Markdown editors

  • AI is integrated into editing, not bolted on
  • local models are first-class
  • local knowledge workflows are built in
  • very large Markdown files are part of the main product path
  • Workbench results can flow directly back into the note
  • Rust core with a native desktop runtime instead of a browser-heavy stack
  • smaller packaging footprint than many Electron-class desktop apps
  • lower baseline runtime overhead from the native stack

Compared to cloud-heavy AI writing tools

  • local-first
  • no forced cloud note storage
  • direct file ownership
  • desktop-native performance
  • portable Markdown output instead of proprietary documents
  • native bundle size and runtime profile designed to stay lean

Privacy and Ownership

OpenMarkdown is built around local ownership.

  • notes live on your machine
  • local RAG can run on your machine
  • local models through Ollama are supported directly
  • cloud providers are optional, not mandatory

When you choose a cloud model, the request necessarily goes to that provider. When you choose a local model, the note can stay on your machine.

Supported Platforms

OpenMarkdown is designed for:

  • macOS
  • Windows
  • Linux

The native stack is based on Tauri v2 with platform-specific adaptation paths.

Current packaging direction:

  • macOS: .app, .dmg
  • Windows: native Tauri desktop bundle targets
  • Linux: native Tauri desktop bundle targets

Install and Run

Install dependencies

npm install

Frontend only

npm run dev

Desktop app

npm run tauri:dev

Production build

npm run build
npm run tauri:build

Full Tauri build

npm run tauri:build:full

Frontend preview

npm run preview

Full local check

npm run check

Large-file benchmark

npm run benchmark:large

Release audit

npm run release:audit

Release manifest

npm run release:manifest

Available npm scripts

npm run dev
npm run build
npm run preview
npm run check
npm run qa:buttons
npm run smoke
npm run benchmark:large
npm run release:audit
npm run release:manifest
npm run release:gate
npm run tauri:dev
npm run tauri:build
npm run tauri:build:full

Release and QA

Release gate

npm run release:gate

This runs the local release gate, including:

  • build
  • smoke checks
  • benchmark generation
  • release audit

Smoke

npm run smoke

Button audit

npm run qa:buttons

Packaging notes

npm run tauri:build is the stable packaging path for this repository.

npm run tauri:build:full runs the full Tauri build directly.

Actual output format depends on:

  • current operating system
  • installed native toolchains
  • bundling/signing environment

Repository Docs

  • docs/BENCHMARK.md
  • docs/RELEASE_NOTES_v1.0.0.md
  • docs/benchmarks/latest-large-file.json
  • docs/release/manifest-v1.0.0.json

Internal release and launch playbooks live under:

  • docs/internal/release/
  • docs/internal/launch/

Versioning

OpenMarkdown uses semantic versioning.

Public release tags follow this format:

  • v1.0.0
  • v1.0.1
  • v1.1.0
  • v2.0.0

Release date can still be shown in release notes and changelogs, but the product version should remain semver-based.

License

OpenMarkdown is released under the MIT License.

  • License file: LICENSE

This means:

  • commercial use is allowed
  • modification is allowed
  • redistribution is allowed
  • private use is allowed

subject to the standard MIT license terms.

Status

Current public release target:

  • v1.0.0

OpenMarkdown is already usable as a real desktop Markdown editor with AI-native workflows, but the long-term goal is larger:

the best AI-native Markdown editor of the agent era
