feat: add Vercel AI SDK Sentry instrumentation and document OPENAI_AP… #130
Conversation
…I_KEY
- Add vercelAIIntegration to Sentry for both Node.js and Cloudflare Workers
- Enable experimental_telemetry in AI SDK calls for automatic span tracking
- Track token usage, model info, latency, and errors for AI operations
- Document OPENAI_API_KEY in env.example, deployment.md, and CLAUDE.md
- Clarify secret configuration: use wrangler CLI for Cloudflare Workers
- Add AI Features Configuration section to project guidelines

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
@sentry review
Pull request overview
This PR adds Sentry instrumentation for AI-powered features using the Vercel AI SDK, enabling automatic tracking of OpenAI API calls for category suggestions. The changes include configuring the Sentry Vercel AI integration, enabling telemetry in AI SDK calls, and documenting the OPENAI_API_KEY environment variable for local, Cloudflare, and CI/CD deployments.
Changes:
- Integrated Sentry's vercelAIIntegration in Node.js and Cloudflare Workers entry points to track AI SDK calls
- Enabled experimental_telemetry in the AI category suggester service to capture token usage, latency, model info, and errors
- Added OPENAI_API_KEY documentation to env.example, deployment.md, and CLAUDE.md with setup instructions for different deployment methods
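The telemetry change in the second bullet can be sketched as follows. The helper name `buildTelemetry` and the span tag `"ai-category-suggester"` are illustrative (not from the PR), but the option names (`isEnabled`, `functionId`, `recordInputs`, `recordOutputs`) match the Vercel AI SDK's `TelemetrySettings` shape:

```typescript
// Hypothetical helper (not in the PR) that builds the experimental_telemetry
// options accepted by Vercel AI SDK calls such as generateText/generateObject.
interface TelemetrySettings {
  isEnabled: boolean;      // opt this call in to OpenTelemetry span emission
  functionId?: string;     // tags the span so it is identifiable in Sentry traces
  recordInputs?: boolean;  // capture the prompt in span attributes
  recordOutputs?: boolean; // capture the model output in span attributes
}

function buildTelemetry(functionId: string): TelemetrySettings {
  return { isEnabled: true, functionId, recordInputs: true, recordOutputs: true };
}

// Usage inside the category suggester (sketch; model and schema are assumptions):
// const { object } = await generateObject({
//   model: openai("gpt-4o-mini"),
//   schema: categorySchema,
//   prompt,
//   experimental_telemetry: buildTelemetry("ai-category-suggester"),
// });
```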
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 5 comments.
Show a summary per file
| File | Description |
|---|---|
| packages/api/src/services/ai-category-suggester.ts | Added experimental_telemetry configuration to AI SDK call for automatic span tracking |
| packages/api/src/entries/node.ts | Configured Sentry vercelAIIntegration with recordInputs/recordOutputs enabled |
| packages/api/src/entries/cloudflare.ts | Added vercelAIIntegration to Sentry config and enabled AI tracking in initialization logs |
| packages/api/src/config/sentry.ts | Added enableAIIntegration configuration field and comments about AI SDK integration |
| env.example | Documented OPENAI_API_KEY environment variable with setup instructions |
| docs/deployment.md | Added OPENAI_API_KEY configuration for Docker and Cloudflare deployments |
| CLAUDE.md | Added comprehensive AI Features Configuration section with Sentry instrumentation details |
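The entry-point change described for packages/api/src/entries/node.ts can be sketched roughly as below. The `dsn` and `tracesSampleRate` values are placeholders, not the PR's actual configuration; `vercelAIIntegration` and its `recordInputs`/`recordOutputs` options come from `@sentry/node`:

```typescript
// Sketch of the Node.js entry-point Sentry setup (values are placeholders).
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: process.env.SENTRY_DSN,
  tracesSampleRate: 1.0, // AI spans are only recorded when tracing is enabled
  integrations: [
    Sentry.vercelAIIntegration({
      recordInputs: true,  // attach prompts to AI spans
      recordOutputs: true, // attach model responses to AI spans
    }),
  ],
});
```

Per-call `experimental_telemetry` settings on individual AI SDK calls can still override these integration-level defaults.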
```typescript
// Note: Input/output recording is controlled via experimental_telemetry in AI SDK calls
config.integrations = [Sentry.vercelAIIntegration()];
```
Copilot (AI) · Jan 12, 2026
There's an inconsistency between Node.js and Cloudflare configurations: Node.js sets recordInputs: true, recordOutputs: true at the integration level, while Cloudflare doesn't pass these parameters to vercelAIIntegration(). Both rely on experimental_telemetry settings in individual AI SDK calls. For consistency and clarity, consider either: (1) removing the parameters from Node.js integration to match Cloudflare's approach and only control recording via experimental_telemetry, or (2) adding the same parameters to Cloudflare's integration. The current mixed approach may cause confusion about which settings take precedence.
Suggested change:

```diff
-// Note: Input/output recording is controlled via experimental_telemetry in AI SDK calls
-config.integrations = [Sentry.vercelAIIntegration()];
+// Note: Input/output recording is enabled here but still controlled per call via experimental_telemetry in AI SDK usage
+config.integrations = [
+  Sentry.vercelAIIntegration({
+    recordInputs: true,
+    recordOutputs: true,
+  }),
+];
```
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
This pull request introduces optional AI-powered features to TuvixRSS, specifically AI-based category suggestions for RSS feeds using OpenAI and the Vercel AI SDK. It also adds comprehensive Sentry instrumentation for monitoring AI usage, latency, and errors. The changes include updates to documentation, environment configuration, and code to support secure and observable AI feature rollout.
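The gating this summary describes can be sketched as a small availability check. The function name and parameters are hypothetical, modeled on the aiEnabled flag and OPENAI_API_KEY mentioned in this PR:

```typescript
// Hypothetical availability check (names assumed from this summary): AI
// category suggestions should run only when the aiEnabled flag is on and an
// OpenAI key is configured, so deployments without OPENAI_API_KEY degrade
// cleanly instead of failing at request time.
function aiFeaturesAvailable(aiEnabled: boolean, openaiApiKey?: string): boolean {
  return aiEnabled && typeof openaiApiKey === "string" && openaiApiKey.length > 0;
}
```

A route handler could call `aiFeaturesAvailable(config.aiEnabled, process.env.OPENAI_API_KEY)` before invoking the suggester.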
AI Features & Configuration:
AI category suggestions are gated by configuration (the aiEnabled flag, a Pro/Enterprise plan, and OPENAI_API_KEY). Documentation and environment examples were updated for local, Cloudflare, and CI/CD deployments. [1] [2] [3] [4]
Observability & Sentry Integration: