A reusable platform for running protein design competitions using AlphaFold3 structure prediction.
If you're in the SeokLab GitHub organization and want to run your own competition, the self-hosted runner on galaxy4 is already available for all seoklab repos.
- Create your repository under the `seoklab` org
- Clone this template and update paths (see Setup Guide below)
- Replace target problems with your actual targets (see Replacing Target Problems)
- Set up Netlify for the submission form
- Enable GitHub Pages for the public site
The platform comes with example problems. Here's how to replace them with your actual competition targets.
| File | Purpose |
|---|---|
| `docs/targets/config.json` | Problem definitions (names, types, residue counts) |
| `docs/targets/*.pdb` | Reference structures for evaluation |
| `docs/targets/*.a3m` | Pre-computed MSA files (for binder targets) |
| `.github/workflows/check_completion.yml` | Problem-to-reference mapping for evaluation |
| `docs/index.html` | Problem descriptions shown to participants |
For each problem, prepare a backbone-only PDB file:

```bash
# Extract backbone atoms (N, CA, C, O) from your reference structure
# Name convention: {problem_id}_bb.pdb or a descriptive name

# For monomer problems:
grep -E "^ATOM.*( N | CA | C | O )" full_structure.pdb > problem_1_bb.pdb

# For binder problems, include both target and binder chains:
# Chain A = participant's binder, Chain B = given target
```

Place all PDB files in `docs/targets/`.
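If you prefer filtering on the fixed PDB atom-name columns instead of a regex, here is a minimal Python sketch (file names are placeholders; adapt them to your structures):

```python
# backbone_filter.py - keep only backbone atoms (N, CA, C, O) from ATOM records.
BACKBONE = {"N", "CA", "C", "O"}

with open("full_structure.pdb") as src, open("problem_1_bb.pdb", "w") as dst:
    for line in src:
        # The atom name occupies columns 13-16 of a PDB ATOM record.
        if line.startswith("ATOM") and line[12:16].strip() in BACKBONE:
            dst.write(line)
    dst.write("END\n")
```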
Edit `docs/targets/config.json`:

```json
{
"problems": [
{
"id": "problem_1",
"name": "Problem 1 - Your Title",
"description": "Description shown to participants",
"target_file": "your_target_1.pdb",
"residue_count": 50,
"type": "monomer",
"msa_mode": "none"
},
{
"id": "problem_2",
"name": "Problem 2 - Binder Design",
"description": "Design a binder for target X",
"type": "binder",
"target_file": "binder_problem_x.pdb",
"residue_count": 80,
"target_sequence": "MKTAYIAK...",
"target_msa_file": "/full/path/to/target_x.a3m",
"participant_msa_mode": "none",
"expected_binder_length": [50, 100]
}
]
}
```

Problem Types:
- `monomer`: Participant designs a sequence to fold into the target structure
- `binder`: Participant designs a binder sequence for a given target protein
MSA Modes:
- `none`: No MSA search (faster, for de novo design)
- `search`: Run MSA search (default AF3 behavior)
- `precomputed`: Use a pre-calculated MSA file
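Before committing, it can help to sanity-check the config against the fields used above. A small sketch, assuming the field names shown in the example config rather than a formal schema:

```python
# check_config.py - sanity-check docs/targets/config.json against the example fields.
import json
import pathlib

targets = pathlib.Path("docs/targets")
config = json.loads((targets / "config.json").read_text())

for p in config["problems"]:
    # Fields every problem needs.
    for key in ("id", "name", "description", "target_file", "residue_count", "type"):
        assert key in p, f"{p.get('id', '?')}: missing {key}"
    # The reference structure must actually be in docs/targets/.
    assert (targets / p["target_file"]).exists(), f"{p['id']}: {p['target_file']} not found"
    # Binder problems additionally carry the target sequence and expected length range.
    if p["type"] == "binder":
        for key in ("target_sequence", "expected_binder_length"):
            assert key in p, f"{p['id']}: binder problem missing {key}"

print("config.json looks consistent")
```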
Edit `.github/workflows/check_completion.yml` and find the `PROBLEM_REFS` section (~lines 224-230):

```bash
declare -A PROBLEM_REFS=(
["problem_1"]="your_target_1.pdb:monomer"
["problem_2"]="binder_problem_x.pdb:binder"
["problem_3"]="your_target_3.pdb:monomer"
# Add/remove as needed
)
```

Format: `["problem_id"]="reference_file.pdb:problem_type"`
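To keep this mapping in sync with `config.json`, one option is to generate the block from the config and paste it into the workflow. A sketch assuming the field names from the example config:

```python
# gen_problem_refs.py - print a PROBLEM_REFS block built from config.json,
# ready to paste into check_completion.yml.
import json

config = json.load(open("docs/targets/config.json"))

print("declare -A PROBLEM_REFS=(")
for p in config["problems"]:
    print(f'  ["{p["id"]}"]="{p["target_file"]}:{p["type"]}"')
print(")")
```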
Edit `docs/index.html` to update the problem descriptions shown to participants:

```html
<div class="problem-section">
<h3>Problem 1: Your Title</h3>
<p>Description of what participants should design...</p>
<p><strong>Target length:</strong> 50 residues</p>
</div>
```
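If you would rather not hand-edit these blocks, they can be generated from `config.json` and pasted into `docs/index.html`. A convenience sketch, using the example config fields:

```python
# gen_problem_html.py - print problem-section divs from config.json for docs/index.html.
import json

config = json.load(open("docs/targets/config.json"))

for p in config["problems"]:
    print('<div class="problem-section">')
    print(f'  <h3>{p["name"]}</h3>')
    print(f'  <p>{p["description"]}</p>')
    print(f'  <p><strong>Target length:</strong> {p["residue_count"]} residues</p>')
    print('</div>')
```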
If your binder problem uses a pre-computed MSA for the target:

```bash
# Generate MSA using your preferred tool (e.g., jackhmmer, hhblits)
# Save as .a3m format
# Reference the full path in the config.json target_msa_file field
```
Finally, remove the example evaluation results and reset the leaderboard:

```bash
# Remove old evaluation results
rm -rf docs/results/*/
# Reset leaderboard
echo '{"problems": {}, "overall_rankings": [], "last_updated": ""}' > docs/leaderboard_data.json
```

The template ships with these example problems:

| Problem | Type | Reference | Residues |
|---|---|---|---|
| Problem 1 | monomer | 3v86_bb.pdb | 27 |
| Problem 2 | monomer | 4r80_bb.pdb | 76 |
| Problem 3 | monomer | 1qys_bb.pdb | 91 |
| Problem 4 | monomer | 6wi5_bb.pdb | 92 |
| Problem 5 | binder | binder_problem_9bk5.pdb | 78 |
```bash
git clone https://github.com/seoklab/design-test.git my-competition
cd my-competition
git remote set-url origin https://github.com/seoklab/my-competition.git
```

Replace the paths with your username in these files:
`.github/workflows/process_submission.yml` (~lines 15, 58):

```yaml
SUBMISSIONS_BASE: /data/galaxy4/user/YOUR_USER/my-competition/submissions
QUEUE_DIR: /data/galaxy4/user/YOUR_USER/job_queue
```

`.github/workflows/check_completion.yml` (~lines 15-18):

```yaml
SUBMISSIONS_BASE: /data/galaxy4/user/YOUR_USER/my-competition/submissions
PUBLIC_RESULTS: /data/galaxy4/user/YOUR_USER/my-competition/public_results
SITE_URL: https://your-site.netlify.app
ADMIN_EMAIL: your-email@example.com
```

Create the required directories on galaxy4:

```bash
mkdir -p /data/galaxy4/user/YOUR_USER/my-competition/submissions
mkdir -p /data/galaxy4/user/YOUR_USER/my-competition/public_results
mkdir -p /data/galaxy4/user/YOUR_USER/job_queue
chmod 777 /data/galaxy4/user/YOUR_USER/my-competition/submissions
chmod 777 /data/galaxy4/user/YOUR_USER/my-competition/public_results
```

Add to the crontab on galaxy4 (`crontab -e`):
```
*/15 * * * * for f in /data/galaxy4/user/YOUR_USER/job_queue/*.sh; do [ -f "$f" ] && sbatch "$f" && mv "$f" "$f.done"; done
```

Set up Netlify for the submission form:

- Create an account at netlify.com
- Add new site → Import an existing project → Connect to GitHub
- Select your repository
- Add environment variables in Site settings:
  - `GITHUB_TOKEN`: Personal access token with `repo` scope
  - `GITHUB_OWNER`: `seoklab`
  - `GITHUB_REPO`: Your repo name
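For reference, these three variables are what an external service needs in order to trigger a GitHub Actions run. An illustrative Python equivalent of such a call is shown below; the actual `submit.js` function may use a different event type and payload shape.

```python
# dispatch_example.py - illustrative only: forward a submission to GitHub Actions
# via the repository_dispatch API. Event name and payload fields are hypothetical.
import json
import os
import urllib.request

token = os.environ["GITHUB_TOKEN"]   # PAT with `repo` scope
owner = os.environ["GITHUB_OWNER"]   # e.g. "seoklab"
repo = os.environ["GITHUB_REPO"]     # your competition repo

payload = {
    "event_type": "new_submission",  # hypothetical event name
    "client_payload": {"email": "participant@example.com",
                       "problem_id": "problem_1",
                       "sequence": "MKTAYIAK..."},
}

req = urllib.request.Request(
    f"https://api.github.com/repos/{owner}/{repo}/dispatches",
    data=json.dumps(payload).encode(),
    headers={"Authorization": f"Bearer {token}",
             "Accept": "application/vnd.github+json"},
    method="POST",
)
urllib.request.urlopen(req)  # 204 No Content on success
```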
Edit docs/index.html:
```js
const SUBMIT_URL = 'https://your-site.netlify.app/api/submit';
```

Enable GitHub Pages for the public site:

- Settings → Pages
- Source: Deploy from a branch
- Branch: `main`, Folder: `/docs`
```
User → Web Form (GitHub Pages) → Netlify Function → GitHub Actions
                                                           ↓
                                                  Process Submission
                                         (Self-hosted runner on galaxy4)
                                                           ↓
                                            SLURM Job Queue → AlphaFold3
                                                           ↓
                                          Check Completion (every minute)
                                                           ↓
                                          Evaluate + Update Leaderboard
                                                           ↓
                                           Email + Viewer Link → User
```
```
├── .github/workflows/
│ ├── process_submission.yml # Handles new submissions
│ ├── check_completion.yml # Checks jobs, evaluates, updates leaderboard
│ └── end_competition.yml # Final evaluation workflow
├── docs/ # Public site (GitHub Pages)
│ ├── index.html # Submission form
│ ├── viewer.html # Mol* structure viewer
│ ├── leaderboard.html # Competition leaderboard
│ ├── leaderboard_data.json # Leaderboard data
│ ├── targets/ # Reference structures
│ │ ├── config.json # Problem definitions
│ │ └── *.pdb # Reference PDB files
│ └── results/ # Packaged results (per token)
├── netlify/functions/
│ └── submit.js # Form submission API
├── scripts/
│ ├── process_multi_submission.py # Parse submission
│ ├── run_af3.py # Generate SLURM script
│ ├── package_results.py # Package results with token
│ ├── evaluate_structure.py # TMalign/lDDT evaluation
│ └── update_leaderboard.py # Aggregate rankings
└── submissions/ # (gitignored) On galaxy4
```
| Metric | Description |
|---|---|
| BB-lDDT | Backbone local distance difference test (primary) |
| RMSD | Root mean square deviation after alignment |
| TM-score | Template modeling score |
| pTM | AlphaFold predicted TM-score |
| pLDDT | AlphaFold predicted lDDT |
| Category | Metrics |
|---|---|
| Binder Only | BB-lDDT, RMSD, TM-score (binder chain vs reference) |
| Complex | iLDDT, TM-score, RMSD (full complex alignment) |
| AlphaFold3 | ipTM, pTM, pLDDT |
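For intuition about the alignment-based metrics, here is a minimal, illustrative CA-RMSD computation using the Kabsch superposition. The actual scoring is done by `scripts/evaluate_structure.py` (TMalign/lDDT); this sketch only shows what "RMSD after alignment" means.

```python
# rmsd_sketch.py - illustrative CA-RMSD after optimal superposition (Kabsch).
import numpy as np

def ca_coords(pdb_path, chain=None):
    """Extract CA coordinates (Nx3) from ATOM records of a PDB file."""
    coords = []
    for line in open(pdb_path):
        if line.startswith("ATOM") and line[12:16].strip() == "CA":
            if chain is None or line[21] == chain:
                coords.append([float(line[30:38]), float(line[38:46]), float(line[46:54])])
    return np.array(coords)

def kabsch_rmsd(P, Q):
    """RMSD between equal-length Nx3 coordinate sets after optimal rotation."""
    assert len(P) == len(Q), "structures must have the same number of CA atoms"
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)          # SVD of the covariance matrix
    d = np.sign(np.linalg.det(U @ Vt))          # guard against reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt         # optimal rotation
    diff = P @ R - Q
    return float(np.sqrt((diff ** 2).sum() / len(P)))

# Example (placeholder file names):
# print(kabsch_rmsd(ca_coords("prediction.pdb"), ca_coords("docs/targets/1qys_bb.pdb")))
```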
In .github/workflows/check_completion.yml:
```yaml
on:
  schedule:
    - cron: '* * * * *'  # Comment out to disable
  workflow_dispatch:     # Keep for manual triggers
```

To trigger it manually: Actions → Check Job Completion → Run workflow
To auto-deploy when results are ready:
- In Netlify: Site settings → Build hooks → Add hook
- Add it as a GitHub secret: `NETLIFY_BUILD_HOOK`
- Add a step in `check_completion.yml` after the commit step:
```yaml
- name: Trigger Netlify deploy
  if: steps.check.outputs.results_updated == 'true'
  run: curl -X POST -d '{}' ${{ secrets.NETLIFY_BUILD_HOOK }}
```

Common issues:

| Issue | Solution |
|---|---|
| Git push rejected | git pull --rebase && git push |
| Permission denied | Run chmod 777 on submission directories |
| Form not submitting | Check Netlify env vars and function logs |
| No emails | Check sendmail config, may have delay (5-10 min normal) |
| Workflow not running | Check Actions tab, verify runner is online |
| Wrong evaluation scores | Ensure reference PDB residue numbering starts at 1 |
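For the last item, a quick renumbering sketch, assuming a single-chain reference PDB (file names are placeholders):

```python
# renumber_pdb.py - renumber residues of a single-chain PDB so they start at 1,
# which the evaluation expects.
new_lines, mapping = [], {}

for line in open("reference_original.pdb"):
    if line.startswith(("ATOM", "TER")) and len(line) > 26:
        old = line[22:26]  # original residue-number field (columns 23-26)
        if old.strip():
            mapping.setdefault(old, len(mapping) + 1)
            line = line[:22] + f"{mapping[old]:>4}" + line[26:]
    new_lines.append(line)

with open("reference_renumbered.pdb", "w") as out:
    out.writelines(new_lines)
```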
- AlphaFold3 - Structure prediction
- TMalign/USalign - Structure alignment
- PDBe-Molstar - 3D visualization
- SeokLab