Half your team's time goes to finding things that already exist — in code, in tickets, in docs. Engineers context-switch between GitHub, Jira, Slack, and Confluence just to answer "where is this implemented?" or "what's the status of that feature?"
Probe fixes this. It's an AI agent that connects to your codebase, tickets, docs, and tools — then answers questions, explores code, makes changes, and automates workflows. You define everything in YAML. No custom code, no vendor lock-in, any LLM provider.
This quickstart gives you a working assistant in under a minute. Here's the entire config:
```yaml
version: "1.0"

# Import the assistant engine (intent classification, skill activation, tools)
imports:
  - https://raw.githubusercontent.com/probelabs/visor-ee/master/workflows/assistant.yaml

# Slack config (remove if CLI-only)
slack:
  version: "v1"
  mentions: all
  threads: required

checks:
  chat:
    type: workflow
    workflow: assistant
    assume: ["true"]
    args:
      question: "{{ conversation.current.text }}"
      system_prompt: |
        You are a Probe Labs assistant helping developers understand and build
        AI assistants with Visor. You can explore code, explain Visor concepts,
        and demonstrate how skills, intents, and tools work together.

      # Intents — broad request categories for routing
      intents:
        - id: chat
          description: General Q&A, follow-up questions, small talk
        - id: code_help
          description: Questions about code, implementation, or architecture
        - id: task
          description: Create, update, or execute something

      # Skills — capabilities that activate based on what the user asks
      skills:
        # Inline knowledge (no tools)
        - id: capabilities
          description: user asks what this assistant can do
          knowledge: |
            I can explain Visor, explore code across repos, and make changes via PRs.

        # Knowledge loaded from file via {% readfile %}
        - id: visor-guide
          description: questions about how Visor works, skills, intents, tools, or YAML config
          knowledge: |
            {% readfile "docs/visor-overview.md" %}

        # Workflow tool — code search across repos
        - id: code-explorer
          description: needs codebase exploration, code search, or implementation details
          tools:
            code-explorer:
              workflow: code-talk
              inputs:
                projects:
                  - name: quickstart
                    path: .
                  - name: visor
                    repo: probelabs/visor
              allowed_commands: ['git:log:*', 'git:show:*', 'git:diff:*']

        # Skill with dependency — auto-activates code-explorer
        - id: engineer
          description: user wants code changes, a PR, or a feature implemented
          requires: [code-explorer]
          tools:
            engineer:
              workflow: engineer
              inputs: {}
              allowed_commands: ['git:*', 'npm:*']
              disallowed_commands: ['git:push:--force', 'git:reset:--hard']

        # MCP tool (uncomment + set JIRA_* in .env)
        # - id: jira
        #   description: user mentions Jira or ticket IDs like PROJ-123
        #   tools:
        #     jira:
        #       command: uvx
        #       args: ["mcp-atlassian"]
        #       env:
        #         JIRA_URL: "${JIRA_URL}"
        #         JIRA_API_TOKEN: "${JIRA_API_TOKEN}"
        #       allowedMethods: [jira_get_issue, jira_search, jira_create_issue]
```

That's it. One file. Skills, tools, knowledge, and routing — all declared in YAML.
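Stripped to a skeleton, every assistant built this way follows the same shape. The placeholder ids below are illustrative, and the nesting mirrors the quickstart config rather than a verified schema:

```yaml
checks:
  chat:
    type: workflow
    workflow: assistant            # the imported engine does the routing
    args:
      question: "{{ conversation.current.text }}"
      system_prompt: |
        Who the assistant is and how it should behave.
      intents:                     # broad categories for classifying requests
        - id: some-intent
          description: when to file a request under this intent
      skills:                      # capabilities matched against each request
        - id: some-skill
          description: when this skill should activate
          knowledge: |
            Facts injected into context when the skill activates.
```

Everything else in the quickstart (tools, dependencies, MCP integrations) hangs off entries in `skills`.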
```bash
git clone https://github.com/probelabs/visor-quickstart.git
cd visor-quickstart
cp .env.example .env
# Edit .env — uncomment and set ANTHROPIC_API_KEY (or another provider)
```

Launch the interactive TUI (recommended):

```bash
npx -y @probelabs/visor@latest run assistant.yaml --tui
```

The TUI gives you a full chat interface with real-time visibility into what the assistant is doing. Press Shift+Tab to cycle through three views:
- Chat — the conversation with your assistant
- Logs — runtime logs showing intent classification, skill activation, and tool calls
- Trace — detailed execution trace for debugging pipelines and workflows
Or send a single message from the command line with `--message`:

```bash
npx -y @probelabs/visor@latest run assistant.yaml --message "What can you do?"
```

When you sent that message, Probe ran this pipeline:

- Intent classification — determined the request type (`chat`, `code_help`, or `task`)
- Skill selection — matched relevant skills based on their `description` fields
- Dependency expansion — skills with `requires` pulled in other skills automatically
- Knowledge + tool injection — activated skills' knowledge and tools were added to the AI's context
- Response — the AI answered using the assembled context, calling tools if needed
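To make the routing concrete, here is a hypothetical pairing (the `deploy-helper` skill is not part of the quickstart): the intent gives the classifier a category, the skill's `description` is its activation trigger, and `requires` pulls in a dependency automatically.

```yaml
# Hypothetical example — deploy-helper does not exist in the quickstart config
intents:
  - id: task
    description: Create, update, or execute something

skills:
  - id: deploy-helper
    description: user asks to deploy, roll back, or check release status
    requires: [code-explorer]   # code search activates alongside this skill
    knowledge: |
      Deploys run through CI; rollbacks re-deploy the previous tag.
```

A request like "roll back the last release" would classify as `task`, activate `deploy-helper` via its description, and pull in `code-explorer` through `requires`.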
Everything above is defined in `assistant.yaml`. Open it and read along.
Each message activates a different skill. Try them in the TUI (`--tui`) or pass them with `--message`:
```bash
# capabilities — inline knowledge, no tools
npx -y @probelabs/visor@latest run assistant.yaml --message "What can you help me with?"

# visor-guide — knowledge loaded from docs/visor-overview.md
npx -y @probelabs/visor@latest run assistant.yaml --message "How do skills work in Visor?"

# code-explorer — searches code across repos
npx -y @probelabs/visor@latest run assistant.yaml --message "Show me what's in assistant.yaml"

# engineer — requires code-explorer, so both activate
npx -y @probelabs/visor@latest run assistant.yaml --message "Add a comment to the top of README.md"
```

The `--message` flag is useful for scripting, CI pipelines, or quick one-off questions. For interactive exploration, use `--tui` instead.
Change the identity — edit `system_prompt` in `assistant.yaml`:

```yaml
system_prompt: |
  You are an assistant for Acme Corp's engineering team.
```

Add your repos — add entries to the code-explorer skill's `projects`:
```yaml
projects:
  - name: backend
    path: /path/to/backend
    description: Backend API server
  - name: frontend
    repo: myorg/frontend
    ref: main
    description: React frontend
```

Add a knowledge skill — load docs from a file:
```yaml
- id: onboarding
  description: questions about onboarding, setup, or getting started
  knowledge: |
    {% readfile "docs/onboarding-guide.md" %}
```

Control bash commands per skill — restrict what each skill can run:
```yaml
- id: devops
  description: user needs container management or deployment help
  allowed_commands: ['docker:*', 'kubectl:get:*']
  disallowed_commands: ['docker:rm:--force', 'docker:system:prune', 'kubectl:delete:*']
```

Add an MCP tool — uncomment the `jira` skill in `assistant.yaml` and set credentials in `.env`.
Turn this assistant into a Slack bot your whole team can talk to.
1. Create a Slack app at api.slack.com/apps:
- Enable Socket Mode (Settings → Socket Mode → Enable)
   - Under OAuth & Permissions, add these Bot Token Scopes: `app_mentions:read`, `channels:history`, `groups:history`, `im:history`, `mpim:history`, `chat:write`, `reactions:read`, `reactions:write`, `im:read`, `im:write`
   - Under Event Subscriptions, subscribe to bot events: `app_mention`, `message.channels`, `message.groups`, `message.im`, `message.mpim`
   - Install the app to your workspace
2. Grab two tokens and add them to `.env`:

   ```bash
   # Bot token (OAuth & Permissions → Bot User OAuth Token)
   SLACK_BOT_TOKEN=xoxb-your-bot-token
   # App token (Basic Information → App-Level Tokens → create one with connections:write scope)
   SLACK_APP_TOKEN=xapp-your-app-token
   ```

3. Run it:

   ```bash
   npx -y @probelabs/visor@latest run assistant.yaml --slack
   ```

Mention your bot in any Slack thread and it will respond. The `slack` section in `assistant.yaml` controls behavior — `threads: required` means it only responds in threads, not top-level channel messages.
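For reference, here is the `slack` block from the quickstart config with each knob annotated. Only the values that actually appear in the config are assumed valid; other settings may exist but are not shown here:

```yaml
slack:
  version: "v1"
  mentions: all        # respond whenever the bot is @mentioned
  threads: required    # reply only inside threads, never to top-level messages
```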
For a full walkthrough, see the Build a Slack Bot guide.
Run tests to verify that intent classification routes to the correct skills:
```bash
npx -y @probelabs/visor@latest test assistant.yaml
```

Each test case in the `tests` section of `assistant.yaml` sends a mock message and asserts which intents and skills activate. Use this to catch routing regressions when you add or edit skills.
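A test case might look like the sketch below. The field names (`message`, `expect`, and so on) are assumptions, not taken from Visor's schema, so check the `tests` section in `assistant.yaml` for the real shape:

```yaml
# Hypothetical shape — field names are illustrative only
tests:
  - name: code questions route to code-explorer
    message: "Where is the retry logic implemented?"
    expect:
      intent: code_help
      skills: [code-explorer]
```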
Lint your configuration to catch YAML syntax errors, missing fields, and invalid references before running:
```bash
npx -y @probelabs/visor@latest lint assistant.yaml
```

Lint checks for common issues like misspelled skill IDs in `requires`, missing `description` fields, and invalid workflow references. Run it after every config change — it's instant and saves debugging time.
Validate your configuration for schema errors and missing fields:
```bash
npx -y @probelabs/visor@latest validate assistant.yaml
```

| File | What it shows |
|---|---|
| `assistant.yaml` | Full-featured assistant (start here) |
| `examples/minimal.yaml` | Simplest possible assistant (~25 lines) |
| `examples/with-jira.yaml` | External MCP tool integration |
| `examples/multi-repo.yaml` | Code exploration across multiple repos |
Your API key isn't set. Check that `.env` has an uncommented key:

```bash
grep API_KEY .env
```

The imports URL must be accessible. To work offline:

```bash
curl -o workflows/assistant.yaml https://raw.githubusercontent.com/probelabs/visor-ee/master/workflows/assistant.yaml
```

Then change `imports` to `imports: ["./workflows/assistant.yaml"]`.
You forgot `--message` in CLI mode:

```bash
npx -y @probelabs/visor@latest run assistant.yaml --message "Hello"
```

Make the skill's `description` more specific about when it should trigger:
```yaml
# Too vague:
description: ticket stuff

# Better:
description: user mentions Jira, ticket IDs like PROJ-123, or needs ticket information
```

If the Slack bot doesn't respond:

- Check both tokens in `.env`: `SLACK_BOT_TOKEN` and `SLACK_APP_TOKEN`
- Make sure you're mentioning the bot in a thread (not a channel message)
- Run with:

  ```bash
  npx -y @probelabs/visor@latest run assistant.yaml --slack
  ```
The ProbeLabs assistant — the AI bot that powers the ProbeLabs engineering team's Slack — is itself an open-source project built on this same quickstart pattern. It has 7 skills covering code exploration, GitHub access, CI/CD debugging, and automated PR creation across 15 repositories.
Browse the source to see how a production assistant is structured: probelabs/probelabs-assistant
It's a great reference for:
- Organizing skills and knowledge files across `config/` and `docs/` directories
- Wiring up workflow tools like `code-talk` and `engineer`
- Connecting MCP tools and skill dependencies (`requires`)
- Writing a system prompt for a team-specific assistant
- Probe Labs — the platform
- Visor documentation — workflow engine reference
- visor-ee workflows — the assistant engine this quickstart imports
- Questions? Open an issue or contact hello@probelabs.com
MIT