
Python: Agent.run(stream=True) loses response_format from default_options — AgentResponse.value is always None #4437

@pamelafox

Description


Bug Description

When using Agent.run(stream=True) with response_format set in default_options, the resulting AgentResponse.value is always None. The same agent works correctly in non-streaming mode (stream=False).

This causes failures in DevUI (which uses stream=True) for any workflow that relies on structured outputs via response_format.

Root Cause

In Agent.run() (Python, _agents.py), the streaming path constructs a ResponseStream with a finalizer that captures the runtime options parameter rather than the merged options (which include default_options):

# Streaming path — uses `options` (the runtime param, often None)
return (
    ResponseStream
    .from_awaitable(_get_stream())
    .map(
        transform=partial(map_chat_to_agent_update, agent_name=self.name),
        finalizer=partial(
            self._finalize_response_updates,
            response_format=options.get("response_format") if options else None  # BUG: should use merged options
        ),
    )
    # ...
)

The non-streaming path correctly uses the merged ctx["chat_options"]:

# Non-streaming path — correctly uses merged options
response_format = ctx["chat_options"].get("response_format")

Since ctx["chat_options"] merges default_options with runtime options, the non-streaming path works. But the streaming path only looks at the runtime options parameter (which is None when callers rely on default_options), so response_format is lost.
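The difference between the two lookup paths can be sandboxed with plain dicts standing in for the framework's option objects (DummyFormat and the merge expression here are illustrative, not framework code):

```python
# Plain-dict sketch of the two lookup paths described above.
# `DummyFormat` stands in for a Pydantic model; the merge mirrors how
# ctx["chat_options"] layers runtime options over default_options.
class DummyFormat:
    pass

default_options = {"response_format": DummyFormat}
options = None  # runtime options — None when callers rely on defaults

# Non-streaming path: reads the merged options, so the format survives.
merged = {**default_options, **(options or {})}
assert merged.get("response_format") is DummyFormat

# Streaming path as currently written: reads only the runtime param.
lost = options.get("response_format") if options else None
assert lost is None  # response_format is silently dropped
```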

Reproduction

from pydantic import BaseModel
from typing import Literal
from agent_framework import Agent
from agent_framework.openai import OpenAIChatClient

class ClassifyResult(BaseModel):
    category: Literal["Question", "Complaint", "Feedback"]
    original_message: str
    reasoning: str

client = OpenAIChatClient(...)

agent = Agent(
    client=client,
    name="Classifier",
    instructions="Classify the message. Return JSON with category, original_message, reasoning.",
    default_options={"response_format": ClassifyResult},
)

# Non-streaming — works ✅
response = await agent.run("How do I reset my password?")
print(response.value)  # ClassifyResult(category='Question', ...)

# Streaming — broken ❌
stream = agent.run("How do I reset my password?", stream=True)
async for _ in stream:
    pass
response = await stream.get_final_response()
print(response.value)  # None — should be ClassifyResult

Impact

Any workflow using structured outputs (response_format in default_options) fails in DevUI / streaming mode because AgentResponse.value is None. The raw JSON text is present in response.text, but the Pydantic parsing never runs.

Error seen in DevUI:

Error in workflow execution: 'NoneType' object has no attribute 'category'

Suggested Fix

In Agent.run(), the streaming finalizer should use the merged options. For example, capture default_options in the closure:

merged_response_format = (
    (options.get("response_format") if options else None)
    or self.default_options.get("response_format")
)

return (
    ResponseStream
    .from_awaitable(_get_stream())
    .map(
        transform=partial(map_chat_to_agent_update, agent_name=self.name),
        finalizer=partial(
            self._finalize_response_updates,
            response_format=merged_response_format,
        ),
    )
    # ...
)

Or, ideally, defer to ctx["chat_options"] the same way the non-streaming path does.
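The binding itself can be verified in isolation with nothing but functools.partial (the finalizer below is a stand-in that just returns what was bound, not the framework's real implementation):

```python
from functools import partial

def _finalize_response_updates(updates, *, response_format):
    # Stand-in: the real finalizer parses accumulated text into
    # response_format; here we only expose what got bound.
    return response_format

default_options = {"response_format": "ClassifyResult"}
options = None  # runtime options, absent when callers rely on defaults

merged_response_format = (
    (options.get("response_format") if options else None)
    or default_options.get("response_format")
)

finalizer = partial(_finalize_response_updates, response_format=merged_response_format)
assert finalizer([]) == "ClassifyResult"  # the merged format reaches the finalizer
```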

Version

  • agent-framework-core: 1.0.0rc1
  • agent-framework-orchestrations: 1.0.0b260219
  • Python 3.12

Workaround

Manually parse response.text with the response_format model whenever value is None:

result = response.agent_response.value
if result is None:
    result = ClassifyResult.model_validate_json(response.agent_response.text)
