⚠️ WARNING: Large-scale migrations (especially logs/experiments) can be extremely expensive and operationally risky. This tool includes streaming + resumable migration for high-volume event streams, but TB-scale migrations have not been fully soak-tested in production-like conditions. Use with caution and test on a subset first.
A Python CLI & library for migrating Braintrust organizations with maximum fidelity, using direct HTTP requests (via httpx) against the Braintrust REST API.
This tool provides migration capabilities for Braintrust organizations, handling everything from AI provider credentials to project-level data. It is best suited for small-scale migrations, such as moving POC/test data to a new deployment.
- Organization administrators migrating between environments (dev → staging → prod)
- Teams consolidating multiple organizations
- Enterprises setting up new Braintrust instances
- Developers contributing to migration tooling
- Resource Coverage: Migrates most Braintrust resources including AI secrets, datasets, prompts, functions, experiments, and more
- Dependency Resolution: Handles resource dependencies (e.g., functions referenced by prompts, datasets referenced by experiments)
- Organization vs Project Scope: Org-level resources are migrated once, project-level resources per project
- Real-time Progress: Live progress indicators and detailed migration reports
- High-volume Streaming: Logs, experiment events, and dataset events are migrated via BTQL sorted pagination (by `_pagination_key`) with bounded insert batches
- Resume + Idempotency: Per-resource/per-experiment checkpoints plus a SQLite "seen ids" store enable safe resume and help avoid duplicate inserts/overwrites
- Rate Limit Resilience: Automatic LIMIT backoff on 500/504 errors (retries with progressively smaller page sizes: 1000 → 500 → 250 → ...)
- Dependency-Aware Migration: Resources are migrated in an order that respects dependencies (see below)
- Organization Scoping: AI secrets, roles, and groups migrated once at org level
- Batch Processing: Configurable batch sizes for optimal performance
- Retry Logic: Adaptive retries with exponential backoff + jitter; respects `Retry-After` when rate-limited (429)
- Validation: Pre-flight connectivity and permission checks
- Error Recovery: Detailed error reporting with actionable guidance
- Real-time Progress: Live updates on what's being created, skipped, or failed
- Comprehensive Reporting: JSON + human-readable migration summaries
- Structured Logging: JSON and text formats with configurable detail levels
- Skip Analysis: Detailed breakdowns of why resources were skipped
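The retry and page-size behavior described above can be sketched in a few lines of Python. Helper names like `next_page_limit` and `backoff_delay` are illustrative, not the tool's actual internals:

```python
import random

def next_page_limit(current: int, floor: int = 125) -> int:
    """Halve the BTQL page size after a 500/504, but never go below `floor`."""
    return max(current // 2, floor)

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Exponential backoff with full jitter: a random delay in
    [0, min(cap, base * 2**attempt)] seconds."""
    return random.uniform(0, min(cap, base * (2 ** attempt)))

# The documented LIMIT ladder: 1000 -> 500 -> 250 -> ...
limits = [1000]
while limits[-1] > 125:
    limits.append(next_page_limit(limits[-1]))
```

A `Retry-After` header on a 429 response would simply override the computed delay.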
- Python 3.8+ (3.12+ recommended)
- API Keys for source and destination Braintrust organizations
- Network Access to Braintrust API endpoints
# Clone the repository
git clone https://github.com/braintrustdata/braintrust-migrate
cd braintrust-migrate
# Install uv if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh
# Install with uv (recommended)
uv sync --all-extras
source .venv/bin/activate
# Or install with pip
pip install -e .
# Verify installation
braintrust-migrate --help

# Install development dependencies
uv sync --all-extras --dev
# Install pre-commit hooks
pre-commit install
# Run tests to verify setup
pytest

All options can be set via environment variables or CLI flags. CLI flags take precedence over environment variables.
| Environment Variable | CLI Flag | Default | Description |
|---|---|---|---|
| `BT_SOURCE_API_KEY` | — | — | **Required.** API key for source organization |
| `BT_SOURCE_URL` | — | `https://api.braintrust.dev` | Source Braintrust API URL |
| `BT_DEST_API_KEY` | — | — | **Required.** API key for destination organization |
| `BT_DEST_URL` | — | `https://api.braintrust.dev` | Destination Braintrust API URL |
| Environment Variable | CLI Flag | Default | Description |
|---|---|---|---|
| `MIGRATION_RESOURCES` | `--resources`, `-r` | `all` | Comma-separated list of resources to migrate. Options: `all`, `ai_secrets`, `roles`, `groups`, `datasets`, `project_tags`, `span_iframes`, `functions`, `prompts`, `project_scores`, `experiments`, `logs`, `views` |
| `MIGRATION_PROJECTS` | `--projects`, `-p` | (all projects) | Comma-separated list of project names to migrate |
| `MIGRATION_CREATED_AFTER` | `--created-after` | (none) | Only migrate data created on or after this date (inclusive: `>=`). Format: `YYYY-MM-DD` or ISO-8601 |
| `MIGRATION_CREATED_BEFORE` | `--created-before` | (none) | Only migrate data created before this date (exclusive: `<`). Format: `YYYY-MM-DD` or ISO-8601 |
| Environment Variable | CLI Flag | Default | Description |
|---|---|---|---|
| `LOG_LEVEL` | `--log-level`, `-l` | `INFO` | Log verbosity. Options: `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL` |
| `LOG_FORMAT` | `--log-format`, `-f` | `text` | Log output format. Options: `json`, `text` |
| Environment Variable | CLI Flag | Default | Description |
|---|---|---|---|
| `MIGRATION_STATE_DIR` | `--state-dir`, `-s` | `./checkpoints` | Directory for migration state and checkpoints. Can be a root dir, run dir, or project dir (see Resume section) |
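One way to picture the three accepted `--state-dir` shapes, based purely on the timestamped run-directory naming used in the examples later in this README. This is a hypothetical heuristic for illustration, not the tool's actual resolution logic:

```python
import re
from pathlib import Path

RUN_DIR_RE = re.compile(r"^\d{8}_\d{6}$")  # e.g. 20260113_212530

def classify_state_dir(path: str) -> str:
    """Guess how --state-dir would be interpreted (illustrative only)."""
    parts = Path(path).parts
    if parts and RUN_DIR_RE.match(parts[-1]):
        return "run"      # resumes this run
    if len(parts) >= 2 and RUN_DIR_RE.match(parts[-2]):
        return "project"  # resumes the run and infers this project
    return "root"         # a new timestamped run dir is created underneath
```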
| Environment Variable | CLI Flag | Default | Description |
|---|---|---|---|
| `MIGRATION_BATCH_SIZE` | — | `100` | Number of resources to process per batch |
| `MIGRATION_RETRY_ATTEMPTS` | — | `3` | Number of retry attempts for failed operations (`0` = no retries) |
| `MIGRATION_RETRY_DELAY` | — | `1.0` | Initial retry delay in seconds (exponential backoff) |
| `MIGRATION_MAX_CONCURRENT` | — | `10` | Maximum concurrent operations |
| `MIGRATION_CHECKPOINT_INTERVAL` | — | `50` | Write checkpoint every N successful operations |
These settings control BTQL-based streaming for high-volume resources.
| Environment Variable | CLI Flag | Default | Description |
|---|---|---|---|
| `MIGRATION_EVENTS_FETCH_LIMIT` | — | `1000` | BTQL fetch page size (rows per query) |
| `MIGRATION_EVENTS_INSERT_BATCH_SIZE` | — | `200` | Events per insert API call |
| `MIGRATION_EVENTS_USE_SEEN_DB` | — | `true` | Use SQLite store for deduplication |
| `MIGRATION_LOGS_FETCH_LIMIT` | `--logs-fetch-limit` | (inherits) | Override fetch limit for logs only |
| `MIGRATION_LOGS_INSERT_BATCH_SIZE` | `--logs-insert-batch-size` | (inherits) | Override insert batch size for logs only |
Resource-specific overrides follow the pattern `MIGRATION_{RESOURCE}_FETCH_LIMIT`, `MIGRATION_{RESOURCE}_INSERT_BATCH_SIZE`, and `MIGRATION_{RESOURCE}_USE_SEEN_DB`, where `{RESOURCE}` is `LOGS`, `EXPERIMENT_EVENTS`, or `DATASET_EVENTS`.
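The documented precedence (per-resource override first, then the shared `MIGRATION_EVENTS_*` value, then the default) can be expressed as a small lookup. This is a sketch of the rule, not the tool's actual code:

```python
import os

def resolve_fetch_limit(resource, env=None):
    """Per-resource override first, then the shared events setting,
    then the documented default of 1000."""
    env = os.environ if env is None else env
    specific = env.get(f"MIGRATION_{resource}_FETCH_LIMIT")
    shared = env.get("MIGRATION_EVENTS_FETCH_LIMIT")
    return int(specific or shared or 1000)
```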
| Environment Variable | CLI Flag | Default | Description |
|---|---|---|---|
| `MIGRATION_INSERT_MAX_REQUEST_BYTES` | — | `6291456` (6 MB) | Maximum HTTP request payload size |
| `MIGRATION_INSERT_REQUEST_HEADROOM_RATIO` | — | `0.75` | Target ratio of max size (`0.75` of 6 MB ≈ 4.5 MB effective limit) |
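The two settings combine into an effective payload cap, under which events are greedily packed into insert batches. A sketch of that packing (the `chunk_events` helper is illustrative, not the tool's implementation):

```python
import json

MAX_REQUEST_BYTES = 6_291_456  # MIGRATION_INSERT_MAX_REQUEST_BYTES (6 MB)
HEADROOM_RATIO = 0.75          # MIGRATION_INSERT_REQUEST_HEADROOM_RATIO
EFFECTIVE_LIMIT = int(MAX_REQUEST_BYTES * HEADROOM_RATIO)  # ~4.5 MB

def chunk_events(events, limit=EFFECTIVE_LIMIT):
    """Greedily pack events into batches whose serialized size stays under
    `limit`; an oversized single event still gets a batch of its own."""
    batch, size, out = [], 0, []
    for ev in events:
        ev_bytes = len(json.dumps(ev).encode())
        if batch and size + ev_bytes > limit:
            out.append(batch)
            batch, size = [], 0
        batch.append(ev)
        size += ev_bytes
    if batch:
        out.append(batch)
    return out
```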
| Environment Variable | CLI Flag | Default | Description |
|---|---|---|---|
| — | `--dry-run`, `-n` | `false` | Validate configuration without making changes |
| — | `--config`, `-c` | (none) | Path to YAML/JSON configuration file |
| `MIGRATION_COPY_ATTACHMENTS` | — | `false` | Copy Braintrust-managed attachments between orgs |
| `MIGRATION_ATTACHMENT_MAX_BYTES` | — | `52428800` (50 MB) | Maximum attachment size to copy |
Create a .env file with your configuration:
# Copy the example file
cp .env.example .env

Example `.env` file:
# Required: API keys
BT_SOURCE_API_KEY=your_source_api_key_here
BT_DEST_API_KEY=your_destination_api_key_here
# Optional: Custom URLs (defaults to https://api.braintrust.dev)
# BT_SOURCE_URL=https://api.braintrust.dev
# BT_DEST_URL=https://api.braintrust.dev
# Optional: Logging
LOG_LEVEL=INFO
LOG_FORMAT=text
# Optional: Date filtering
# MIGRATION_CREATED_AFTER=2026-01-01
# MIGRATION_CREATED_BEFORE=2026-02-01

- Log into Braintrust → Go to your organization settings
- Navigate to API Keys → Usually under Settings or Developer section
- Generate New Key → Create with appropriate permissions:
- Source: Read permissions for all resource types
- Destination: Write permissions for resource creation
- Copy Keys → Add to your `.env` file

Permission Requirements:
- Source org: `read:all` or specific resource read permissions
- Destination org: `write:all` or specific resource write permissions
Validate Configuration:
# Test connectivity and permissions
braintrust-migrate validate

Complete Migration:
# Migrate all resources
braintrust-migrate migrate

Selective Migration:
# Migrate specific resource types
braintrust-migrate migrate --resources ai_secrets,datasets,prompts
# Migrate specific projects only
braintrust-migrate migrate --projects "Project A","Project B"

Resume Migration:
# Resume from last checkpoint (automatic)
#
# `--state-dir` (or MIGRATION_STATE_DIR) can be:
# - a root directory (creates a new timestamped run dir under it)
# - a run directory (resumes that run)
# - a project directory within a run (resumes that run and infers the project)
#
# Root checkpoints dir (new run):
braintrust-migrate migrate --state-dir ./checkpoints
#
# Resume from a specific run:
braintrust-migrate migrate --state-dir ./checkpoints/20260113_212530
#
# Resume just one project from a run:
braintrust-migrate migrate --state-dir ./checkpoints/20260113_212530/langgraph-supervisor --resources logs

Custom Configuration:
braintrust-migrate migrate \
--state-dir ./production-migration \
--log-level DEBUG \
--log-format text \
--batch-size 50

Dry Run (Validation Only):
braintrust-migrate migrate --dry-run

Time-based Filtering:
# Migrate data from a specific date onward (inclusive: >=)
braintrust-migrate migrate --created-after 2026-01-15
# Migrate data before a specific date (exclusive: <)
braintrust-migrate migrate --created-before 2026-02-01
# Date range: migrate all of January 2026
braintrust-migrate migrate --created-after 2026-01-01 --created-before 2026-02-01

Semantics:
`--created-after` is inclusive (`>=`), `--created-before` is exclusive (`<`). This half-open interval `[after, before)` makes date ranges intuitive: for example, "all of January" is `--created-after 2026-01-01 --created-before 2026-02-01`.
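The half-open interval check is simple enough to state directly (a sketch mirroring the documented semantics):

```python
from datetime import datetime

def in_window(created, after=None, before=None):
    """Keep a row when after <= created < before (either bound may be absent)."""
    if after is not None and created < after:
        return False
    if before is not None and created >= before:
        return False
    return True

jan = (datetime(2026, 1, 1), datetime(2026, 2, 1))  # "all of January 2026"
```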
# General help
braintrust-migrate --help
# Command-specific help
braintrust-migrate migrate --help
braintrust-migrate validate --help

The migration follows a dependency-aware order:
- AI Secrets - AI provider credentials (OpenAI, Anthropic, etc.)
- Roles - Organization-level role definitions
- Groups - Organization-level user groups
- Datasets - Training and evaluation data
- Project Tags - Project-level metadata tags
- Span Iframes - Custom span visualization components
- Functions - Tools, scorers, tasks, and LLMs (migrated before prompts)
- Prompts - Template definitions that can use functions as tools
- Project Scores - Scoring configurations
- Experiments - Experiment metadata + event streams (BTQL sorted pagination)
- Logs - Project logs / traces (BTQL sorted pagination)
- Views - Custom project views
- Functions are migrated before prompts to ensure all function references in prompts can be resolved.
- Experiments handle dependencies on datasets and other experiments (via `base_exp_id`) in a single pass with dependency-aware ordering.
- ID mapping and dependency resolution are used throughout to ensure references are updated to the new organization/project.
- Prompts are migrated in a single pass; prompt origins and tool/function references are remapped via ID mappings.
- ACLs: Support is present in the codebase but may be experimental or disabled by default.
- Agents and users: Not supported for migration (users are org-specific; agents are not present in the codebase).
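The dependency-aware ordering for experiments amounts to a topological sort on `base_exp_id`. A minimal sketch (field access here is illustrative; actual record shapes may differ):

```python
from graphlib import TopologicalSorter  # Python 3.9+

def migration_order(experiments):
    """Order experiments so any base experiment (base_exp_id) is created
    before the experiments derived from it."""
    ids = {e["id"] for e in experiments}
    ts = TopologicalSorter()
    for e in experiments:
        base = e.get("base_exp_id")
        # Only depend on bases that are part of this migration.
        ts.add(e["id"], *([base] if base in ids else []))
    return list(ts.static_order())
```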
Real-time Updates:
2024-01-15 10:30:45 [info] Starting organization-scoped resource migration
2024-01-15 10:30:46 [info] ✅ Created AI secret: 'OpenAI API Key' (src-123 → dest-456)
2024-01-15 10:30:47 [info] ⏭️ Skipped role: 'Admin' (already exists)
2024-01-15 10:30:48 [info] Starting project-scoped resource migration
2024-01-15 10:30:49 [info] ✅ Created dataset: 'Training Data' (src-789 → dest-012)
Comprehensive Reporting: After migration, you'll get:
- JSON Report (`migration_report.json`) - Machine-readable detailed results
- Human Summary (`migration_summary.txt`) - Readable overview with skip analysis
- Checkpoint Files - Resume state for interrupted migrations
The tool uses two-level checkpointing for streaming resources (logs, experiments, datasets):
Level 1: Resource Metadata
- `{project}/experiments_state.json` - tracks which experiments are created
- Contains `completed_ids`, `failed_ids`, and `id_mapping` (source → dest)
- On resume: skips experiments in `completed_ids`
Level 2: Event Streaming State (per-experiment/dataset)
- `{project}/experiment_events/{exp_id}_state.json` - BTQL pagination position
- `{project}/experiment_events/{exp_id}_seen.sqlite3` - deduplication store
- Tracks `btql_min_pagination_key` (resume point) and counters (fetched/inserted)
- On resume: continues from the last `_pagination_key`
Example: Suppose you migrate 100 experiments and the process crashes at a point where:
- ✅ Metadata created for experiments 1-50
- ✅ All events migrated for experiments 1-30
- ⚠️ 50% of events migrated for experiment 31 (crashed mid-stream)
On resume: skips 1-30 (done), resumes experiment 31 from the saved `_pagination_key`, then continues with 32-100.
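Putting the two checkpoint levels together, the streaming loop looks roughly like this. Here `fetch_page` and `insert_batch` stand in for the real BTQL query and insert calls; this is a sketch of the documented behavior, not the actual implementation:

```python
def stream_events(fetch_page, insert_batch, state, seen):
    """Fetch pages sorted by _pagination_key, skip already-seen ids, and
    checkpoint the resume point after every page."""
    while True:
        rows = fetch_page(state.get("btql_min_pagination_key"), limit=1000)
        if not rows:
            break
        fresh = [r for r in rows if r["id"] not in seen]
        if fresh:
            insert_batch(fresh)
            seen.update(r["id"] for r in fresh)
        state["btql_min_pagination_key"] = rows[-1]["_pagination_key"]
        state["fetched"] = state.get("fetched", 0) + len(rows)
        state["inserted"] = state.get("inserted", 0) + len(fresh)
```

In the real tool, `state` is persisted to `{exp_id}_state.json` and `seen` is backed by the `_seen.sqlite3` store, so a crash mid-stream can resume from the last checkpointed page without re-inserting rows.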
The following resource types are supported:
- AI Secrets (organization-scoped)
- Roles (organization-scoped)
- Groups (organization-scoped)
- Datasets
- Project Tags
- Span Iframes
- Functions
- Prompts
- Project Scores
- Experiments
- Logs
- Views
- ACLs (experimental; may be disabled)
Note: Agents and users are not supported for migration.
1. Authentication Errors
# Verify API keys
braintrust-migrate validate
# Check key permissions
curl -H "Authorization: Bearer $BT_SOURCE_API_KEY" \
https://api.braintrust.dev/v1/organization

2. Dependency Errors
- Circular Dependencies: If you hit a dependency loop, try migrating the involved resource types separately (or re-run; idempotent resources will skip)
- Missing Resources: Check source organization for required dependencies
- Permission Issues: Ensure API keys have read/write access
3. Performance Issues
# Reduce batch size
export MIGRATION_BATCH_SIZE=25
# Increase retry delay
export MIGRATION_RETRY_DELAY=2.0
# Migrate incrementally
braintrust-migrate migrate --resources ai_secrets,datasets
braintrust-migrate migrate --resources prompts,functions

4. Network Issues
- Timeouts: Increase retry attempts and delay
- Rate Limits: Reduce batch size and concurrent operations; the client respects `Retry-After` when throttled (429)
- Connectivity: Verify firewall and proxy settings
Tip: If you want rate-limit retries/backoff to actually happen, ensure `MIGRATION_RETRY_ATTEMPTS` is greater than 0. If it is set to `0`, the tool fails fast on 429/5xx errors without retrying.
Enable detailed logging for troubleshooting:
# Maximum verbosity
braintrust-migrate migrate \
--log-level DEBUG \
--log-format text
# Focus on specific issues
export LOG_LEVEL=DEBUG
braintrust-migrate validate

Resume Interrupted Migration:
# Automatic resume (recommended)
braintrust-migrate migrate
# Manual checkpoint specification (run directory)
braintrust-migrate migrate --state-dir ./checkpoints/20240115_103045
# Or point directly at a single project's checkpoint directory
braintrust-migrate migrate --state-dir ./checkpoints/20240115_103045/ProjectA --resources logs

Partial Re-migration:
# Re-migrate specific resource types
braintrust-migrate migrate --resources experiments,logs
# Re-migrate specific projects
braintrust-migrate migrate --projects "Failed Project"

braintrust_migrate/
├── __init__.py # Package initialization
├── config.py # Configuration models (Pydantic)
├── client.py # Braintrust API client wrapper
├── orchestration.py # Migration orchestrator & reporting
├── cli.py # Command-line interface (Typer)
├── resources/ # Resource-specific migrators
│ ├── __init__.py
│ ├── base.py # Abstract base migrator class
│ ├── ai_secrets.py # AI provider credentials
│ ├── datasets.py # Training/evaluation data
│ ├── prompts.py # Prompt templates
│ ├── functions.py # Tools, scorers, tasks
│ ├── experiments.py # Evaluation runs
│ ├── logs.py # Execution traces
│ ├── roles.py # Organization roles
│ ├── groups.py # Organization groups
│ └── views.py # Project views
└── checkpoints/ # Migration state (created at runtime)
├── organization/ # Org-scoped resource checkpoints
└── project_name/ # Project-scoped checkpoints
tests/
├── unit/ # Unit tests (fast)
├── integration/ # Integration tests (API mocking)
└── e2e/ # End-to-end tests (real API)
We welcome contributions! Here's how to get started:
1. Setup Development Environment:
# Fork and clone the repository
git clone https://github.com/yourusername/migration-tool.git
cd migration-tool
# Install development dependencies
uv sync --all-extras --dev
# Install pre-commit hooks
pre-commit install

2. Development Workflow:
# Create feature branch
git checkout -b feature/your-feature-name
# Make changes and test
pytest # Run tests
ruff check --fix # Lint and format
mypy braintrust_migrate # Type checking
# Commit with pre-commit hooks
git commit -m "feat: add your feature"

3. Testing:
# Run all tests
pytest
# Run with coverage
pytest --cov=braintrust_migrate --cov-report=html
# Run specific test categories
pytest tests/unit/ # Fast unit tests
pytest tests/integration/ # Integration tests
pytest tests/e2e/ # End-to-end tests

- Type Hints: All functions must have type annotations
- Documentation: Docstrings for public APIs
- Testing: New features require tests
- Linting: Code must pass `ruff` checks
- Formatting: Automatic formatting with `ruff format`
To add support for a new Braintrust resource type:
- Create Migrator Class in `braintrust_migrate/resources/new_resource.py`
- Extend Base Class from `ResourceMigrator[ResourceType]`
- Implement Required Methods: `list_source_resources`, `migrate_resource`, etc.
- Add to Orchestration in the appropriate scope (organization vs project)
- Write Tests covering the new functionality
- Update Documentation, including this README
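To make the checklist concrete, here is a toy migrator following that shape. The base class below is a stand-in inferred from this README (`list_source_resources`, `migrate_resource`); the real interface in `braintrust_migrate/resources/base.py` may differ, and the "widgets" resource is purely hypothetical:

```python
class ResourceMigrator:
    """Stand-in for the real base class (hypothetical interface)."""

    def run(self, source, dest):
        # Migrate every source resource and collect the new destination ids.
        return [self.migrate_resource(r, dest)
                for r in self.list_source_resources(source)]

class WidgetMigrator(ResourceMigrator):
    """Toy migrator for a hypothetical 'widgets' resource type."""

    def list_source_resources(self, source):
        return source.get("widgets", [])

    def migrate_resource(self, resource, dest):
        # Create the resource in the destination and record the id mapping.
        new_id = f"dest-{resource['id']}"
        dest.setdefault("widgets", []).append({**resource, "id": new_id})
        return new_id
```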
# Setup environment for dev → prod migration
cat > .env << EOF
BT_SOURCE_API_KEY="dev_org_api_key_here"
BT_SOURCE_URL="https://api.braintrust.dev"
BT_DEST_API_KEY="prod_org_api_key_here"
BT_DEST_URL="https://api.braintrust.dev"
LOG_LEVEL=INFO
EOF
# Validate before migrating
braintrust-migrate validate
# Run complete migration
braintrust-migrate migrate

# Phase 1: Setup and data
braintrust-migrate migrate --resources ai_secrets,datasets
# Phase 2: Logic and templates
braintrust-migrate migrate --resources prompts,functions
# Phase 3: Experiments and results
braintrust-migrate migrate --resources experiments,logs

# Migrate only specific projects
braintrust-migrate migrate --projects "Customer Analytics","Model Evaluation"
# Later migrate remaining projects
braintrust-migrate migrate

# If migration fails partway through:
braintrust-migrate migrate
# Automatically resumes from last checkpoint
# Or specify checkpoint directory:
braintrust-migrate migrate --state-dir ./checkpoints/20240115_103045

- Config Models: See `braintrust_migrate/config.py` for configuration options
- Resource Migrators: Base classes in `braintrust_migrate/resources/base.py`
- Client Wrapper: API helpers in `braintrust_migrate/client.py`
- Check Documentation: Start with this README and inline code documentation
- Review Logs: Enable debug logging for detailed troubleshooting information
- Validate Setup: Use `braintrust-migrate validate` to test configuration
- Check Issues: Search existing GitHub issues for similar problems
- Create Issue: Open a new issue with detailed information including:
- Error messages and logs
- Configuration (sanitized)
- Migration command used
- Environment details
Before Migration:
- Test with a small subset of data first
- Backup critical data in source organization
- Verify API key permissions
- Plan for AI secret reconfiguration
During Migration:
- Monitor progress through logs
- Don't interrupt during critical operations
- Keep network connection stable
After Migration:
- Verify migrated data completeness
- Reconfigure AI provider credentials
- Test functionality in destination organization
- Archive migration reports for compliance
This project is licensed under the MIT License. See the LICENSE file for details.
braintrust-migrate validate # Test setup
braintrust-migrate migrate # Full migration
braintrust-migrate migrate --dry-run # Validation only
braintrust-migrate migrate --resources ai_secrets,datasets # Selective migration

- `.env` - Configuration
- `checkpoints/` - Migration state
- `migration_report.json` - Detailed results
- `migration_summary.txt` - Human-readable summary
- AI Secrets: Only metadata migrated; manually configure actual API keys
- Dependency Order: Functions are migrated before prompts; all dependencies are resolved via ID mapping
- Organization Scope: Some resources migrated once, others per project
- Resume Capability: Interrupted migrations automatically resume from checkpoints
- Not for Large-Scale Data: This tool is not thoroughly tested for large-scale logs or experiments. Use for POC/test data only.