Open
Labels
Area/AIPolicies (policies, guardrails in AI Gateway) · Area/Policies (any policy, policy hub, policy engine etc.) · Aspect/API (API definitions, contracts, OpenAPI, interfaces) · Severity/Major · Type/Bug
Description
Please select the area the issue is related to
Area/Policies (Policies, Policy Hub, Policy Engine etc)
Please select the aspect the issue is related to
Aspect/API (API backends, definitions, contracts, interfaces, OpenAPI)
Description
When the Token Based Rate Limit policy is configured with only one of its optional parameters, the policy initialises successfully but fails at runtime when it tries to delegate to advanced-ratelimit.
Steps to Reproduce
- Deploy a self-hosted ai-gateway with debug logs enabled for the router, controller and policy-engine
- Deploy an OpenAI LLM provider
- Add a Token Based Ratelimit policy to the provider with only promptTokenLimits set
- Save and redeploy the LLM provider
- Observe in the gateway logs that the policy chain has been updated successfully
- Make an API call to the provider
- Observe the error logs in the console
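To illustrate the failing configuration, a minimal policy sketch is shown below. Only promptTokenLimits comes from this report; the surrounding structure and the names of the other optional token-limit parameters are assumptions for illustration, not the gateway's actual schema:

```yaml
# Hypothetical Token Based Ratelimit policy attachment (structure assumed).
# Setting only promptTokenLimits reproduces the issue: initialisation
# succeeds, but the runtime delegation to advanced-ratelimit fails.
policy:
  name: TokenBasedRatelimit
  parameters:
    promptTokenLimits: 1000
    # The other optional limits (e.g. completion/total token limits,
    # names illustrative) are deliberately left unset.
```

A policy like this should either be rejected at configuration time if the parameter combination is invalid, or be handled at runtime without erroring.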
Severity Level of the Issue
Severity/Major (Important functionality is broken. Should be prioritized. Doesn't need immediate attention)
Environment Details (with versions)
No response