Describe the bug
When using AgentScope-Java's tracing feature with TelemetryTracer, the gen_ai.request.temperature attribute is not recorded in the trace spans, even when the model is configured with a temperature or when users expect the temperature to be tracked.
According to the OpenTelemetry GenAI Semantic Conventions, the gen_ai.request.temperature attribute should be captured for LLM requests.
Expected attributes that are missing:
- gen_ai.request.temperature - the temperature parameter for text generation
- agentscope.function.input.temperature (if applicable in the function input JSON)
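For reference, this is roughly how such an attribute is written with the OpenTelemetry Java API. The helper below is only an illustration of the convention, not AgentScope-Java code:
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.trace.Span;

// Illustration only: recording the GenAI request temperature on a span.
final class GenAiAttributesExample {
    static final AttributeKey<Double> GEN_AI_REQUEST_TEMPERATURE =
            AttributeKey.doubleKey("gen_ai.request.temperature");

    static void recordTemperature(Span span, Double temperature) {
        if (temperature != null) { // attribute is simply absent when the value is null
            span.setAttribute(GEN_AI_REQUEST_TEMPERATURE, temperature);
        }
    }
}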
Root Cause Analysis
After investigating the code, the root cause has been identified:
1. ReActAgent.buildGenerateOptions() does not include temperature
@Override
protected GenerateOptions buildGenerateOptions() {
GenerateOptions.Builder builder = GenerateOptions.builder();
if (modelExecutionConfig != null) {
builder.executionConfig(modelExecutionConfig);
}
return builder.build(); // ❌ No temperature, topP, maxTokens, etc.
}
The buildGenerateOptions() method only sets executionConfig (timeout/retry settings), but does NOT set any generation parameters like temperature, topP, maxTokens, etc.
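A quick way to see the consequence (a minimal sketch that only uses the builder methods and getters already referenced in this issue):
// With the current implementation, nothing ever sets temperature:
GenerateOptions opts = GenerateOptions.builder()
        .executionConfig(modelExecutionConfig)
        .build();

System.out.println(opts.getTemperature()); // prints null, so the span attribute is never emitted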
2. ReActAgent.Builder lacks generateOptions configuration
In ReActAgent.Builder, there is no generateOptions(GenerateOptions) method to configure generation parameters at the agent level.
Available configurations:
- ✅ modelExecutionConfig(ExecutionConfig) - timeout, retry, backoff
- ❌ generateOptions(GenerateOptions) - MISSING - temperature, topP, maxTokens, etc.
3. Flow of options in reasoning phase
In ReActAgent.reasoning() method:
private Mono<Msg> reasoning(int iter, boolean ignoreMaxIters) {
// ...
return checkInterruptedAsync()
.then(notifyPreReasoningEvent(prepareMessages()))
.flatMapMany(
event -> {
GenerateOptions options =
event.getEffectiveGenerateOptions() != null
? event.getEffectiveGenerateOptions()
: buildGenerateOptions(); // ← Returns options WITHOUT temperature
return model.stream(
event.getInputMessages(),
toolkit.getToolSchemas(),
options) // ← Options passed to model.stream()
.concatMap(chunk -> checkInterruptedAsync().thenReturn(chunk));
})
// ...
}
The options passed to model.stream() come from buildGenerateOptions(), which returns a GenerateOptions object with temperature = null.
4. TelemetryTracer correctly checks for temperature
In AttributesExtractors.getLLMRequestAttributes():
static Attributes getLLMRequestAttributes(..., GenerateOptions options) {
// ...
if (options != null) {
internalSet(builder, GEN_AI_REQUEST_TEMPERATURE, options.getTemperature()); // ← null
// ...
}
// ...
}
Since options.getTemperature() returns null, the attribute is not set (due to internalSet skipping null values).
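For context, a null-guarding setter like internalSet typically looks like the sketch below (illustrative only, not the actual AttributesExtractors source):
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.AttributesBuilder;

// Sketch: the attribute is only written when the value is non-null, which is why a
// GenerateOptions with temperature == null produces no gen_ai.request.temperature
// attribute on the span.
static <T> void internalSet(AttributesBuilder builder, AttributeKey<T> key, T value) {
    if (value != null) {
        builder.put(key, value);
    }
}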
To Reproduce
Steps to reproduce the behavior:
- Set up TelemetryTracer with Langfuse:
TelemetryTracer tracer = TelemetryTracer.builder()
.endpoint("https://us.cloud.langfuse.com/api/public/otel/v1/traces")
.addHeader("Authorization", "Basic " + Base64.getEncoder()
.encodeToString((publicKey + ":" + secretKey).getBytes()))
.build();
TracerRegistry.register(tracer);
- Create an agent (no way to set temperature):
// Note: There is no generateOptions() method in the Builder!
ReActAgent agent = ReActAgent.builder()
.name("TestAgent")
.model(chatModel)
// .generateOptions(GenerateOptions.builder().temperature(0.7).build()) // ❌ This method doesn't exist!
.build();
agent.call(Msg.userMsg("Hello")).block();- Check the trace in Langfuse/Jaeger/etc.:
- Navigate to the trace viewer
- Inspect the chat span attributes
- Observe that gen_ai.request.temperature is not present
Expected behavior
- ReActAgent.Builder should have a generateOptions(GenerateOptions) method (see the sketch after this list)
- ReActAgent.buildGenerateOptions() should return the configured generation options
- The trace span for model calls should include:
  - gen_ai.request.temperature: 0.7
  - gen_ai.request.top_p: 0.9
  - gen_ai.request.max_tokens: 1000
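A sketch of what this would look like from the caller's side, assuming the proposed generateOptions() builder method (it does not exist in the current API):
// Proposed API, not available today: generateOptions() on ReActAgent.Builder.
GenerateOptions options = GenerateOptions.builder()
        .temperature(0.7)
        .topP(0.9)
        .maxTokens(1000)
        .build();

ReActAgent agent = ReActAgent.builder()
        .name("TestAgent")
        .model(chatModel)
        .generateOptions(options) // proposed method
        .build();

// With this in place, the chat span would carry:
//   gen_ai.request.temperature = 0.7
//   gen_ai.request.top_p       = 0.9
//   gen_ai.request.max_tokens  = 1000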
Suggested Fix
Option 1: Add generateOptions to ReActAgent.Builder
// In ReActAgent.Builder
private GenerateOptions generateOptions;
public Builder generateOptions(GenerateOptions generateOptions) {
this.generateOptions = generateOptions;
return this;
}

// In ReActAgent
private final GenerateOptions generateOptions;
@Override
protected GenerateOptions buildGenerateOptions() {
GenerateOptions.Builder builder = GenerateOptions.builder();
// Merge with user-provided generateOptions
if (generateOptions != null) {
if (generateOptions.getTemperature() != null) {
builder.temperature(generateOptions.getTemperature());
}
if (generateOptions.getTopP() != null) {
builder.topP(generateOptions.getTopP());
}
if (generateOptions.getMaxTokens() != null) {
builder.maxTokens(generateOptions.getMaxTokens());
}
// ... other options
}
if (modelExecutionConfig != null) {
builder.executionConfig(modelExecutionConfig);
}
return builder.build();
}
Option 2: Use Model's default options
Alternatively, retrieve default options from the Model itself if configured there.
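A possible shape for that merge, as a sketch only: the resolveGenerateOptions name, the modelDefaults source, and the Double/Integer parameter types are assumptions used to illustrate the precedence (agent-level options over model defaults), not the current AgentScope-Java API.
// Hypothetical merge helper; agent-level options win over model-level defaults.
private GenerateOptions resolveGenerateOptions(GenerateOptions agentOptions,
                                               GenerateOptions modelDefaults) {
    if (agentOptions == null) return modelDefaults;
    if (modelDefaults == null) return agentOptions;

    GenerateOptions.Builder merged = GenerateOptions.builder();
    Double temperature = agentOptions.getTemperature() != null
            ? agentOptions.getTemperature() : modelDefaults.getTemperature();
    if (temperature != null) {
        merged.temperature(temperature);
    }
    Double topP = agentOptions.getTopP() != null
            ? agentOptions.getTopP() : modelDefaults.getTopP();
    if (topP != null) {
        merged.topP(topP);
    }
    Integer maxTokens = agentOptions.getMaxTokens() != null
            ? agentOptions.getMaxTokens() : modelDefaults.getMaxTokens();
    if (maxTokens != null) {
        merged.maxTokens(maxTokens);
    }
    // ... other options follow the same pattern
    return merged.build();
}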
Environment
- AgentScope-Java Version: [e.g. 1.0.8]
- Java Version: 17
- OS: macOS
Additional context
Related code references:
| File | Location | Description |
|---|---|---|
| ReActAgent.java | buildGenerateOptions() | Returns GenerateOptions without temperature |
| ReActAgent.java | reasoning() | Where options are built and passed to model |
| ReActAgent.java | Builder class | Missing generateOptions() method |
| TelemetryTracer.java | callModel() | Options passed to getLLMRequestAttributes() |
| AttributesExtractors.java | Line 176 | Temperature attribute extraction (skipped when null) |
| GenAiIncubatingAttributes.java | Attribute key | gen_ai.request.temperature definition |