
Token Based Rate Limit policy fails when not all optional parameters are configured #1230

@IsuruGunarathne

Please select the area the issue is related to

Area/Policies (Policies, Policy Hub, Policy Engine etc)

Please select the aspect the issue is related to

Aspect/API (API backends, definitions, contracts, interfaces, OpenAPI)

Description

When the Token Based Rate Limit policy is configured with only one of the following optional parameters:

(Screenshot: optional token limit parameters)

The policy initialises successfully, but fails at runtime when trying to delegate to advanced-ratelimit.
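For illustration only, the sketch below is a hypothetical Python reduction of the failure pattern described above, not the actual policy-engine code: a partial config is accepted at initialisation, but the delegation step assumes every limit is present. The field and function names (the completion/total limits, the delegate helper) are assumptions, not taken from the gateway source.

```python
# Hypothetical illustration of the reported behaviour; the names
# completion_token_limit, total_token_limit and
# delegate_to_advanced_ratelimit are assumptions for this sketch.
from dataclasses import dataclass
from typing import Optional


@dataclass
class TokenRateLimitConfig:
    # Only promptTokenLimits is set in the reproduction scenario.
    prompt_token_limit: Optional[int] = None
    completion_token_limit: Optional[int] = None
    total_token_limit: Optional[int] = None


def delegate_to_advanced_ratelimit(config: TokenRateLimitConfig) -> None:
    # Initialisation accepts the partial config, but the delegate expects
    # every limit to be present, so the unset ones fail only at request time.
    for name, value in (
        ("prompt", config.prompt_token_limit),
        ("completion", config.completion_token_limit),
        ("total", config.total_token_limit),
    ):
        if value is None:
            raise ValueError(f"{name} token limit is not configured")


config = TokenRateLimitConfig(prompt_token_limit=1000)  # passes initialisation
delegate_to_advanced_ratelimit(config)                  # fails on the first request
```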

Steps to Reproduce

  1. Deploy a self-hosted ai-gateway with debug logs enabled for the router, controller and policy-engine
  2. Deploy an OpenAI LLM provider
  3. Add the Token Based Rate Limit policy to the provider with only promptTokenLimits set
  4. Save and redeploy the LLM provider
  5. Check the gateway logs; the policy chain will have been updated successfully
  6. Make an API call to the provider (see the sketch after this list)
  7. Note the error logs in the console

(Screenshot: runtime error logs from the gateway)
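For step 6, a request like the following is enough to trigger the runtime failure. The gateway host, route, model name and API key are placeholders for this sketch, not values from the report; substitute the ones from your own deployment.

```python
# Placeholder values only: replace GATEWAY_URL and API_KEY with the
# endpoint and key of the deployed LLM provider in your ai-gateway.
import requests

GATEWAY_URL = "https://<gateway-host>/<llm-provider-route>/chat/completions"
API_KEY = "<test-api-key>"

response = requests.post(
    GATEWAY_URL,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=30,
)

# With only promptTokenLimits configured, this call surfaces the
# delegation error in the gateway logs instead of being rate limited.
print(response.status_code, response.text)
```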

Severity Level of the Issue

Severity/Major (Important functionality is broken. Should be prioritized. Doesn't need immediate attention)

Environment Details (with versions)

No response
