The SDK provides several methods to annotate spans with inputs, outputs, metrics, and metadata.
Use the annotateIO() member method of the LLMObsSpan interface to add structured input and output data to an LLMObsSpan. The method accepts optional arguments that are either plain strings or LLM message objects.
Arguments
If an argument is null or empty, nothing happens. For example, if inputData is a non-empty string while outputData is null, then only inputData is recorded.
inputData - optional - String or List<LLMObs.LLMMessage>
Either a string (for non-LLM spans) or a list of LLMObs.LLMMessage objects (for LLM spans).
outputData - optional - String or List<LLMObs.LLMMessage>
Either a string (for non-LLM spans) or a list of LLMObs.LLMMessage objects (for LLM spans).
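Because null or empty arguments are ignored, a span can be annotated with only its input, and the output can be attached in a later call once a response is available. The following is a minimal sketch using the span and message APIs from this page; whether repeated annotateIO() calls accumulate rather than overwrite is an assumption to verify against your tracer version:
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsSpan;
import java.util.Arrays;

public class PartialAnnotationExample {
  public void annotateInTwoSteps(String userInput) {
    LLMObsSpan llmSpan = LLMObs.startLLMSpan("my-llm-span-name", "my-llm-model", "my-company", "maybe-ml-app-override", "session-141");
    // Record only the input: the null outputData argument is ignored.
    llmSpan.annotateIO(Arrays.asList(LLMObs.LLMMessage.from("user", userInput)), null);
    // Placeholder standing in for the application's LLM call.
    String assistantReply = "Hello! How can I help?";
    // Record only the output: the null inputData argument is ignored.
    llmSpan.annotateIO(null, Arrays.asList(LLMObs.LLMMessage.from("assistant", assistantReply)));
    llmSpan.finish();
  }
}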
LLM Messages
LLM spans must be annotated with LLM messages using the LLMObs.LLMMessage object. The LLMObs.LLMMessage object can be instantiated by calling LLMObs.LLMMessage.from() with the following arguments:
role - required - String
A string describing the role of the author of the message.
content - required - String
A string containing the content of the message.
Example
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsSpan;
import java.util.Arrays;

public class MyJavaClass {
  public String invokeChat(String userInput) {
    LLMObsSpan llmSpan = LLMObs.startLLMSpan("my-llm-span-name", "my-llm-model", "my-company", "maybe-ml-app-override", "session-141");
    String systemMessage = "You are a helpful assistant";
    Response chatResponse = ... // user application logic to invoke LLM
    // Annotate the span with the input messages and the model's response.
    llmSpan.annotateIO(
        Arrays.asList(
            LLMObs.LLMMessage.from("user", userInput),
            LLMObs.LLMMessage.from("system", systemMessage)
        ),
        Arrays.asList(
            LLMObs.LLMMessage.from(chatResponse.role, chatResponse.content)
        )
    );
    llmSpan.finish();
    return chatResponse.content;
  }
}
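The example above annotates an LLM span with message lists. For non-LLM spans, annotateIO() takes plain strings instead. Below is a minimal sketch, assuming a workflow span created with LLMObs.startWorkflowSpan; that factory method and its signature come from the span-creation API rather than this section, so treat them as an assumption:
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsSpan;

public class MyWorkflowClass {
  public String runWorkflow(String userQuery) {
    // Assumption: a non-LLM (workflow) span created through the span-creation API.
    LLMObsSpan workflowSpan = LLMObs.startWorkflowSpan("my-workflow-span-name", "maybe-ml-app-override", "session-141");
    // Placeholder standing in for the application's workflow logic.
    String result = "workflow result";
    // Non-LLM spans take plain strings rather than LLMObs.LLMMessage lists.
    workflowSpan.annotateIO(userQuery, result);
    workflowSpan.finish();
    return result;
  }
}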
Adding metrics
Bulk add metrics
The setMetrics() member method of the LLMObsSpan interface accepts the following arguments to attach multiple metrics in bulk:
Arguments
metrics - required - Map<String, Number>
A map of JSON-serializable keys and numeric values that users can add to record metrics relevant to the operation described by the span (for example, input_tokens, output_tokens, or total_tokens).
Add a single metric
The setMetric() member method of the LLMObsSpan interface accepts the following arguments to attach a single metric:
Arguments
key - required - CharSequence
The name of the metric.
value - required - int, long, or double
The value of the metric.
Examples
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsSpan;
import java.util.Map;

public class MyJavaClass {
  public String invokeChat(String userInput) {
    LLMObsSpan llmSpan = LLMObs.startLLMSpan("my-llm-span-name", "my-llm-model", "my-company", "maybe-ml-app-override", "session-141");
    String chatResponse = ... // user application logic to invoke LLM
    // Attach several token and latency metrics in one call.
    llmSpan.setMetrics(Map.of(
        "input_tokens", 617,
        "output_tokens", 338,
        "time_per_output_token", 0.1773
    ));
    // Attach individual metrics one at a time.
    llmSpan.setMetric("total_tokens", 955);
    llmSpan.setMetric("time_to_first_token", 0.23);
    llmSpan.finish();
    return chatResponse;
  }
}
Adding tags
For more information about tags, see Getting Started with Tags.
Bulk add tags
The setTags() member method of the LLMObsSpan interface accepts the following arguments to attach multiple tags in bulk:
Arguments
tags - required - Map<String, Object>
A map of JSON-serializable key-value pairs that users can add as tags to describe the span’s context (for example, session, environment, system, or version).
Add a single tag
The setTag() member method of the LLMObsSpan interface accepts the following arguments to attach a single tag:
Arguments
key - required - String
The key of the tag.
value - required - int, long, double, boolean, or String
The value of the tag.
Examples
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsSpan;
import java.util.Map;

public class MyJavaClass {
  public String invokeChat(String userInput) {
    LLMObsSpan llmSpan = LLMObs.startLLMSpan("my-llm-span-name", "my-llm-model", "my-company", "maybe-ml-app-override", "session-141");
    String chatResponse = ... // user application logic to invoke LLM
    // Attach several tags in one call.
    llmSpan.setTags(Map.of(
        "chat_source", "web",
        "users_in_chat", 3
    ));
    // Attach a single tag.
    llmSpan.setTag("is_premium_user", true);
    llmSpan.finish();
    return chatResponse;
  }
}
Annotating errors
Adding a Throwable (recommended)
The addThrowable() member method of the LLMObsSpan interface accepts the following argument to attach a throwable with a stack trace:
Arguments
throwable - required - Throwable
The throwable or exception that occurred.
Adding an error message
The setErrorMessage() member method of the LLMObsSpan interface accepts the following argument to attach an error string:
Arguments
errorMessage - required - String
The message of the error.
Setting an error flag
The setError() member method of the LLMObsSpan interface accepts the following argument to indicate an error with the operation:
Arguments
error - required - boolean
true if the span errored.
Examples
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsSpan;

public class MyJavaClass {
  public String invokeChat(String userInput) {
    LLMObsSpan llmSpan = LLMObs.startLLMSpan("my-llm-span-name", "my-llm-model", "my-company", "maybe-ml-app-override", "session-141");
    String chatResponse = "N/A";
    try {
      chatResponse = ... // user application logic to invoke LLM
    } catch (Exception e) {
      // Record the exception and its stack trace on the span.
      llmSpan.addThrowable(e);
      throw new RuntimeException(e);
    } finally {
      // Finish the span whether or not the call succeeded.
      llmSpan.finish();
    }
    return chatResponse;
  }
}
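When no Throwable is available, for example when a provider returns an error response rather than throwing, the error flag and error message setters can be used instead. Here is a minimal sketch; the HTTP status handling is illustrative and not part of the SDK:
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsSpan;

public class MyJavaClass {
  public String invokeChat(String userInput) {
    LLMObsSpan llmSpan = LLMObs.startLLMSpan("my-llm-span-name", "my-llm-model", "my-company", "maybe-ml-app-override", "session-141");
    // Placeholder values standing in for the provider's HTTP response.
    int statusCode = 429;
    String responseBody = "rate limited";
    if (statusCode >= 400) {
      // No exception was thrown, so flag the error and record a human-readable message.
      llmSpan.setError(true);
      llmSpan.setErrorMessage("LLM provider returned HTTP " + statusCode + ": " + responseBody);
    }
    llmSpan.finish();
    return responseBody;
  }
}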
Adding metadata
The setMetadata() member method of the LLMObsSpan interface accepts the following arguments:
Arguments
metadata - required - Map<String, Object>
A map of JSON-serializable key-value pairs that contains metadata relevant to the input or output operation described by the span.
Example
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsSpan;
import java.util.Map;

public class MyJavaClass {
  public String invokeChat(String userInput) {
    LLMObsSpan llmSpan = LLMObs.startLLMSpan("my-llm-span-name", "my-llm-model", "my-company", "maybe-ml-app-override", "session-141");
    // Attach request parameters and other metadata to the span.
    llmSpan.setMetadata(
        Map.of(
            "temperature", 0.5,
            "is_premium_member", true,
            "class", "e1"
        )
    );
    String chatResponse = ... // user application logic to invoke LLM
    llmSpan.finish();
    return chatResponse;
  }
}