A collection of quick-start examples, experiments, and reference implementations for the Vercel AI SDK. Each example is kept small and focused so you can copy, run, and adapt it to your needs.
Autonomous agents capable of multi-step reasoning and tool use.
- MCP Tool Calling Agent – Hybrid agent combining local file system control with GitHub MCP capabilities (running via Docker).
- File System Agent – Autonomous agent capable of creating, editing, and managing files in a sandboxed environment.
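As a rough sketch of the tool loop these agents build on (assuming AI SDK 5; the model id, tool set, and prompt are illustrative, not the exact code in the examples):

```ts
import { generateText, tool, stepCountIs } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { z } from 'zod';
import { readFile, writeFile } from 'node:fs/promises';

// Illustrative sandboxed file tools; the real agent adds path checks and more tools.
const result = await generateText({
  model: anthropic('claude-sonnet-4-20250514'), // model id is an assumption
  tools: {
    readFile: tool({
      description: 'Read a UTF-8 file from the sandbox',
      inputSchema: z.object({ path: z.string() }),
      execute: async ({ path }) => readFile(path, 'utf8'),
    }),
    writeFile: tool({
      description: 'Write a UTF-8 file into the sandbox',
      inputSchema: z.object({ path: z.string(), content: z.string() }),
      execute: async ({ path, content }) => {
        await writeFile(path, content, 'utf8');
        return `wrote ${path}`;
      },
    }),
  },
  // Let the model call tools and reason across multiple steps before stopping.
  stopWhen: stepCountIs(10),
  prompt: 'Create notes.md summarising the files in ./docs.',
});

console.log(result.text);
```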
Tools and patterns for evaluating LLM outputs using Evalite.
- LLM as a Judge – Using a smaller model (Claude 3.5 Haiku) to grade complex answers (e.g., verifying citations in a paper).
- Deterministic Evals – Running regex checks, length constraints, and other rule-based evaluations.
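The deterministic checks boil down to plain functions that map an output to a 0–1 score. A minimal, framework-agnostic sketch is below; the actual examples wire checks like these into Evalite scorers, so see their code for the exact API:

```ts
// Rule-based 0-1 scorers; the regex and length budget are illustrative.
type Scorer = (output: string) => number;

// 1 if the output contains at least one bracketed citation like [1] or [12].
const hasCitation: Scorer = (output) => (/\[\d+\]/.test(output) ? 1 : 0);

// 1 if the output stays within a character budget.
const withinLength =
  (max: number): Scorer =>
  (output) =>
    output.length <= max ? 1 : 0;

// Average the individual checks into a single deterministic grade.
const score = (output: string): number => {
  const scorers = [hasCitation, withinLength(500)];
  return scorers.reduce((sum, s) => sum + s(output), 0) / scorers.length;
};

console.log(score('Recall improves by 4% over the baseline [2].')); // -> 1
```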
Full-stack Next.js applications demonstrating UI patterns.
- Streaming to UI – Next.js 16 chat UI that streams Anthropic responses via the AI SDK.
- Multi-Modal Chat – Chat interface supporting text, image previews, and file attachments with context preservation.
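The server side of the streaming pattern reduces to a small route handler. A sketch assuming AI SDK 5 and the Anthropic provider (the file path and model id are assumptions):

```ts
// app/api/chat/route.ts (assumed location)
import { streamText, convertToModelMessages, type UIMessage } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const result = streamText({
    model: anthropic('claude-sonnet-4-20250514'), // model id is an assumption
    messages: convertToModelMessages(messages),
  });

  // Streams tokens to the client as they arrive.
  return result.toUIMessageStreamResponse();
}
```

On the client, the `useChat` hook from `@ai-sdk/react` consumes this stream and re-renders the message list as chunks arrive.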
Focused scripts demonstrating core LLM concepts.
- Context Window – Managing and visualizing context window limits.
- Tokens – Understanding tokenization and its impact on costs and limits.
- Prompt Caching – Techniques for caching prompts to reduce latency and cost.
- Data Representation – Strategies for representing structured data as tokens.
- Usage Tracking – Monitoring and calculating token usage.
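For instance, usage tracking comes down to reading the token counts returned with each call. The sketch below uses AI SDK 5 field names (`inputTokens`/`outputTokens`) and placeholder per-token prices:

```ts
import { generateText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

const { text, usage } = await generateText({
  model: anthropic('claude-3-5-haiku-latest'), // model id is an assumption
  prompt: 'Explain prompt caching in two sentences.',
});

// Token counts for this single call (fields may be undefined for some providers).
const inputTokens = usage.inputTokens ?? 0;
const outputTokens = usage.outputTokens ?? 0;

// Placeholder prices per million tokens; look up the real rates for your model.
const costUsd = (inputTokens * 0.8 + outputTokens * 4.0) / 1_000_000;

console.log({ text, inputTokens, outputTokens, costUsd });
```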
Patterns for structuring prompts and context.
- Web Agent Context – Example system prompt and context configuration for a web-capable agent.
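In rough form (assuming AI SDK 5; the prompt wording and model id are illustrative), the pattern is a system prompt that pins down the agent's role and rules, passed via the `system` option alongside whatever tools provide web access:

```ts
import { streamText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

// Illustrative system prompt; the example's actual wording will differ.
const WEB_AGENT_SYSTEM_PROMPT = `
You are a research assistant that browses the web with the tools provided.
- Cite the URL of every page you rely on.
- If a page fails to load, say so rather than guessing.
- Keep answers under 300 words unless asked otherwise.
`.trim();

const result = streamText({
  model: anthropic('claude-sonnet-4-20250514'), // model id is an assumption
  system: WEB_AGENT_SYSTEM_PROMPT,
  // tools: { fetchPage, search, ... } would give the agent its web access
  prompt: 'Summarise what the AI SDK docs say about tool calling.',
});

for await (const chunk of result.textStream) process.stdout.write(chunk);
```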
To run any of the examples:

- Choose an example folder from the list above.
- Install dependencies: `pnpm install`
- Configure the environment:
  - Copy `.env.example` to `.env` (or `.env.local`, depending on the project).
  - Add the required keys (usually `ANTHROPIC_API_KEY`).
- Run the project:
  - For Next.js apps: `pnpm dev`
  - For scripts: `pnpm start` or `npx tsx src/index.ts` (check `package.json`).
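For the script-style examples, the entry point is typically a single file executed with tsx. A minimal sketch of what `src/index.ts` might look like, assuming the key is loaded from `.env` via dotenv (details vary per example):

```ts
// src/index.ts; run with `npx tsx src/index.ts` or `pnpm start`.
import 'dotenv/config'; // loads ANTHROPIC_API_KEY from .env
import { generateText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

if (!process.env.ANTHROPIC_API_KEY) {
  throw new Error('Set ANTHROPIC_API_KEY in .env before running.');
}

const { text } = await generateText({
  model: anthropic('claude-3-5-haiku-latest'), // model id is an assumption
  prompt: 'Say hello from the AI SDK examples.',
});

console.log(text);
```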
MIT