Conversation

@minpeter minpeter commented Jan 22, 2026

Summary

  • Add a configurable vLLM group port to RLConfig.
  • Pass the group port into the VLLMClient constructor.

Testing

  • Not run.

Fixes #760
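
Since the diff itself isn't shown in this conversation view, here is a minimal sketch of what the described change likely looks like. Only RLConfig, VLLMClient, and the idea of a configurable group port come from the PR; the field name vllm_group_port, the defaults, the neighboring fields, and the wiring function are illustrative assumptions, and the VLLMClient stub merely stands in for the real class so the sketch runs on its own.

```python
from dataclasses import dataclass, field


class VLLMClient:
    # Stand-in for the real client; the only assumption made here is that
    # the real __init__ accepts a group_port keyword argument.
    def __init__(self, host: str, server_port: int, group_port: int):
        self.host = host
        self.server_port = server_port
        self.group_port = group_port


@dataclass
class RLConfig:
    # Existing vLLM connection settings (illustrative names and defaults).
    vllm_server_host: str = "127.0.0.1"
    vllm_server_port: int = 8000
    # New: expose the weight-sync group port instead of hard-coding it,
    # so multiple trainers on one host don't collide on the same port.
    vllm_group_port: int = field(
        default=51216,
        metadata={"help": "Port used for the vLLM weight-sync process group."},
    )


def make_vllm_client(config: RLConfig) -> VLLMClient:
    # The second half of the PR: thread the configured port through to the
    # client at construction time rather than relying on a fixed default.
    return VLLMClient(
        host=config.vllm_server_host,
        server_port=config.vllm_server_port,
        group_port=config.vllm_group_port,
    )


if __name__ == "__main__":
    client = make_vllm_client(RLConfig(vllm_group_port=51300))
    print(client.group_port)  # 51300
```

Presumably the motivation behind #760 is that a hard-coded group port makes it impossible to run more than one trainer (or colocated vLLM server) on the same machine; surfacing the port in RLConfig sidesteps that clash.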

CLAassistant commented Jan 22, 2026

CLA assistant check
All committers have signed the CLA.

Development

Successfully merging this pull request may close these issues.

Feature Request: Make group_port configurable in RLConfig (#760)
