llm-docker provides a secure, sandboxed environment for running OpenCode and Claude Code with complete data isolation and privacy on macOS.
Stop worrying about local dependencies or data leaks. Run your AI agents in a controlled Docker environment that automatically manages keys, configs, and persistence.
Run these commands instead of the native tools:

- OpenCode: `ocd`
- Claude Code: `cld`

Simple as that. The environment handles the rest.

You can also pass a working directory and extra flags, e.g. `ocd ./my-path` or `ocd ./my-path -c`; anything after the path is forwarded to the tool inside the container.
Create a .env file in the llm_docker directory with your API keys and settings:
```shell
cat > .env << 'EOF'
# ============================================
# OpenCode API Keys
# ============================================
OPENAI_API_KEY=sk-your-openai-key-here
ZAI_API_KEY=your-zai-key-here

# ============================================
# Claude Code API Keys
# ============================================
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key-here
CLAUDE_CODE_MAX_OUTPUT_TOKENS=48000
EOF
```

Edit `opencode.config.jsonc` to customize your OpenCode model preferences.
Ensure Docker is installed and accessible:

```shell
docker --version
docker compose version
```

Then build the image:

```shell
docker compose build
```
Link the wrapper scripts onto your PATH:

```shell
sudo ln -sf $(pwd)/ocd /usr/local/bin/ocd
sudo ln -sf $(pwd)/cld /usr/local/bin/cld
```

- `ocd`: spins up (or reuses) the Docker container, drops you into its shell, and launches OpenCode.
- `cld`: spins up (or reuses) the Docker container, drops you into its shell, and launches Claude Code.
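Conceptually, each wrapper does little more than the following. This is a simplified, hypothetical sketch, not the repo's actual script; the service name `llm-docker` and the compose workflow are assumptions, and the real scripts also auto-start Docker Desktop and handle directory mapping:

```shell
# Hypothetical sketch of an ocd/cld-style wrapper (not the repo's real script).
launch_tool() {
  tool="$1"; shift                              # "opencode" or "claude"
  docker compose up -d                          # spin up, or reuse, the container
  docker compose exec llm-docker "$tool" "$@"   # forward any extra flags
  docker compose exec llm-docker sh             # drop to a shell after the tool exits
}
```

For example, `launch_tool opencode -c` would mirror `ocd -c`.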
The llm-docker container includes:

- Base Image: `node:18` (with Python 3.11+ support)
- OpenCode CLI: globally installed via `npm install -g opencode-ai`
- Claude Code CLI: globally installed via `npm install -g @anthropic-ai/claude-code`
- Development Tools: Python, pip, git, curl, wget, vim
- Security: dropped capabilities, no-new-privileges, restricted file access
- Network: host mode for seamless connectivity
- Volume Mounts:
  - `~/Projects` → `/root/Projects` (your projects)
  - `~/.llm_docker/opencode` → `/root` (persistent OpenCode data)
  - `~/.llm_docker/claude` → `/root_claude` (persistent Claude Code data)
  - `opencode.config.jsonc` → `/tmp/opencode.config.jsonc` (config file)
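Put together, the mounts and hardening options above would look roughly like this in a compose file. This is an illustrative sketch, not the repo's actual `docker-compose.yml`; the service name and exact option spellings are assumptions:

```yaml
# Hypothetical docker-compose.yml fragment illustrating the mounts and
# security options listed above — not the repo's actual file.
services:
  llm-docker:
    build: .
    hostname: llm-docker
    network_mode: host
    cap_drop:
      - ALL
    security_opt:
      - no-new-privileges:true
    env_file:
      - .env
    volumes:
      - ~/Projects:/root/Projects
      - ~/.llm_docker/opencode:/root
      - ~/.llm_docker/claude:/root_claude
      - ./opencode.config.jsonc:/tmp/opencode.config.jsonc
```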
- ✅ Auto-start OpenCode/Claude Code - Launches automatically when the container starts
- ✅ Smart directory detection - Starts in your current directory
- ✅ Interactive shell access - Drop to a shell after the tool exits
- ✅ Data persistence - All sessions, API keys, and config saved to `~/.llm_docker/opencode` (OpenCode) or `~/.llm_docker/claude` (Claude Code)
- ✅ Complete isolation - Separate from native macOS installations (privacy-focused)
- ✅ Auto-start Docker - Automatically starts Docker Desktop on macOS
- ✅ Dual tool support - Run both OpenCode and Claude Code from the same Docker image
- ✅ Restricted file access - Only `~/Projects` is accessible
- ✅ Dropped capabilities - Minimal container privileges
- ✅ No new privileges - Security hardening enabled
- ✅ Isolated data - Tool data completely separate from the host
- ✅ Environment variables - API keys read from the `.env` file (`OPENAI_API_KEY`, `ZAI_API_KEY`, `ANTHROPIC_API_KEY`)
- ✅ Config file support - JSONC format with comments for OpenCode
- ✅ Model customization - Configure agents and models to your needs
- ✅ Custom hostname - Easy identification (`llm-docker`)
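The "smart directory detection" above presumably amounts to mapping the host path under `~/Projects` to the matching path under `/root/Projects` inside the container. A minimal, hypothetical sketch of that mapping (the function name and fallback behavior are assumptions):

```shell
# Hypothetical helper: map a host directory to its in-container path.
# Paths outside ~/Projects fall back to the mount root, since only
# ~/Projects is mounted into the container.
host_to_container() {
  case "$1" in
    "$HOME/Projects"*) printf '/root/Projects%s\n' "${1#"$HOME/Projects"}" ;;
    *)                 printf '/root/Projects\n' ;;
  esac
}

host_to_container "$HOME/Projects/my-app"   # -> /root/Projects/my-app
```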
- `ocd`/`cld` `--params`: pass parameters through from `ocd`/`cld` to `opencode`/`claude` inside the container
- Server Mode: run OpenCode/Claude Code as a server for IDE integration (port 49455)
- SSH/Git: securely forward your SSH/Git credentials to the container
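SSH/Git credential forwarding is commonly done by mounting the host's SSH agent socket into the container. A hypothetical compose-style sketch of that approach (the repo's actual mechanism may differ):

```yaml
# Hypothetical fragment: forward the host SSH agent into the container.
services:
  llm-docker:
    environment:
      - SSH_AUTH_SOCK=/ssh-agent
    volumes:
      - ${SSH_AUTH_SOCK}:/ssh-agent        # host agent socket
      - ~/.gitconfig:/root/.gitconfig:ro   # read-only git identity
```

On macOS, Docker Desktop typically exposes the host agent at `/run/host-services/ssh-auth.sock`, so that path would usually be mounted instead of `${SSH_AUTH_SOCK}` directly.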


Example `opencode.config.jsonc`:

```jsonc
{
  "model": "openai/gpt-5",
  "small_model": "zai/glm-4.5-flash",
  "agent": {
    "build": {
      "model": "openai/gpt-5"
    }
    // ... more agent configurations. I left a 6 agent madness for you ;)
  }
}
```