This GitHub Action provides automated pull request reviews, combining static analysis, security heuristics, and LLM-based reasoning to improve code quality and security.
To use this action in your repository, create a file named `.github/workflows/pr-review.yml` with the following content:

```yaml
name: PR Review Bot

on:
  pull_request:
    types: [opened, synchronize, reopened]

permissions:
  contents: read
  pull-requests: write
  security-events: read

concurrency:
  group: ${{ github.head_ref }}
  cancel-in-progress: true

jobs:
  review:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Run PR Review
        uses: your-username/your-repo-name@v1 # TODO: Update with the correct action name
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          openai_api_key: ${{ secrets.OPENAI_API_KEY }} # Optional
```

The action performs the following steps on each pull request:
- Fetches PR Data: Retrieves the diff, changed files, and repository languages.
- Builds a Repo Fingerprint: Analyzes the existing codebase to learn its conventions (e.g., naming, test structure). This is cached for 7 days.
- Runs Linters: Executes lightweight static analysis tools for the detected languages.
- Performs Heuristic Scans: Checks for common security issues like exposed secrets and dangerous API usage.
- LLM-Powered Review: Uses a Large Language Model (e.g., GPT-4o) to perform a deeper, context-aware review of the code changes.
- Posts Comments: Submits inline review comments and a summary comment to the pull request.
The action is designed to work out-of-the-box with zero configuration. However, you can customize its behavior by creating a `.pr-review.yml` file in the root of your repository. See `.pr-review.yml.example` for all available options.
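As a rough illustration only, such a file might look like the sketch below. The option names here are hypothetical placeholders; consult `.pr-review.yml.example` for the keys the action actually supports.

```yaml
# .pr-review.yml — hypothetical option names, for illustration only
model: gpt-4o            # which LLM to use for the deep review
max_comments: 20         # cap on inline comments per pull request
ignore_paths:            # globs the reviewer should skip
  - "dist/**"
  - "**/*.min.js"
```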
- `GITHUB_TOKEN` (required): The default GitHub token provided to the action. No special permissions are needed beyond what is defined in the workflow.
- `OPENAI_API_KEY` (optional): Your OpenAI API key, required for the LLM-based review. The action can function without it, but the review quality will be lower.
- `LLM_PROVIDER_URL` (optional): The base URL for a self-hosted or alternative LLM provider that is compatible with the OpenAI API.
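For illustration, the optional values might be wired into the workflow step as shown below. The `llm_provider_url` input name is an assumption made for this sketch; check the action's documented inputs before relying on it.

```yaml
      - name: Run PR Review
        uses: your-username/your-repo-name@v1
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          openai_api_key: ${{ secrets.OPENAI_API_KEY }}
          # Assumed input name: points the review at a self-hosted,
          # OpenAI-compatible endpoint so code stays within your infrastructure.
          llm_provider_url: ${{ secrets.LLM_PROVIDER_URL }}
```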
- The action never logs raw diffs or code that might contain secrets.
- Secrets are redacted before being sent to an external LLM.
- You can use a self-hosted LLM to keep all code within your infrastructure.