Fix HTTPAgent to support OpenAI-compatible API responses (vLLM) #210
Problem
HTTPAgent was returning the entire OpenAI API response JSON as a string instead of extracting the actual message content. This caused agent validation failures when using OpenAI-compatible servers like vLLM.
Current Behavior

The agent returns the raw response JSON serialized as a string:

```
{'id': 'chatcmpl-xxx', 'object': 'chat.completion', 'choices': [...]}
```

Expected Behavior

The agent should return `choices[0].message.content` for OpenAI-format responses.

Solution
This PR modifies `HTTPAgent.inference()` to intelligently handle OpenAI-compatible API responses:

- Detects the `choices` field in the response
- Extracts `choices[0].message.content`
- Falls back to the existing `return_format` behavior otherwise

Changes
- `src/client/agents/http_agent.py`

Effects
- Enables vLLM Integration
- Eliminates Agent Validation Failures
- Backward Compatibility
- Broader Compatibility
Implementation Details
The fix adds intelligent response parsing:
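The parsing described above can be sketched roughly as follows (an illustrative helper, not the PR's actual code — the real logic lives inside `HTTPAgent.inference()`, and the function name here is hypothetical):

```python
import json

def extract_message_content(response_text: str) -> str:
    """Return the assistant message from an OpenAI-format response,
    falling back to the raw text for any other payload shape."""
    try:
        data = json.loads(response_text)
    except json.JSONDecodeError:
        # Not JSON at all: pass the body through unchanged.
        return response_text

    # OpenAI chat-completion format:
    # {"choices": [{"message": {"content": ...}}], ...}
    if isinstance(data, dict) and "choices" in data:
        try:
            return data["choices"][0]["message"]["content"]
        except (IndexError, KeyError, TypeError):
            pass  # malformed "choices" -> fall through to the old behavior

    # Anything else: keep the pre-fix behavior (return the raw string).
    return response_text
```

Feeding it a vLLM-style chat-completion body yields only the message content, while non-OpenAI payloads and plain text come back unchanged.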
This approach ensures that OpenAI-format responses yield the assistant message content, while responses without a `choices` field continue to be handled exactly as before.
Use Case
This fix is particularly important for researchers and practitioners who serve models through OpenAI-compatible inference engines such as vLLM rather than the OpenAI API itself.
Related
This addresses a common pain point when using AgentBench with modern open-source inference engines. The OpenAI API format has become the de facto standard for LLM serving, and this fix ensures AgentBench works seamlessly with that ecosystem.
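As an end-to-end sanity check of the behavior this PR describes, one can simulate the JSON body a vLLM OpenAI-compatible server would return and confirm that only the message content comes back (`parse_agent_reply` and the sample payload below are hypothetical stand-ins, not the PR's code):

```python
import json

# A body shaped like vLLM's OpenAI-compatible /v1/chat/completions output.
vllm_style_body = json.dumps({
    "id": "chatcmpl-xxx",
    "object": "chat.completion",
    "choices": [
        {"index": 0,
         "message": {"role": "assistant", "content": "Action: look around"},
         "finish_reason": "stop"}
    ],
})

def parse_agent_reply(body: str) -> str:
    """Hypothetical stand-in for the patched HTTPAgent behavior."""
    data = json.loads(body)
    if "choices" in data:
        return data["choices"][0]["message"]["content"]
    return body  # non-OpenAI payloads keep the old pass-through behavior

print(parse_agent_reply(vllm_style_body))  # -> Action: look around
```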