LLM Observability Instrumentation


To start sending data to LLM Observability, instrument your application automatically with the LLM Observability Python or Node.js SDK, or manually by using the LLM Observability API.
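As a minimal sketch of what SDK instrumentation can look like with the Python SDK (`ddtrace`): enable LLM Observability once at startup, then use decorators to mark the functions that should appear as spans. The application name `my-chat-app`, the `invoke_model` helper, and the placeholder completion are illustrative assumptions, not part of this page; sending data also requires a valid Datadog API key, so this is a setup sketch rather than a runnable example.

```python
from ddtrace.llmobs import LLMObs
from ddtrace.llmobs.decorators import workflow, llm

# Enable LLM Observability. In agentless mode the SDK sends data directly
# to Datadog using your API key (typically read from the DD_API_KEY
# environment variable).
LLMObs.enable(
    ml_app="my-chat-app",  # hypothetical application name
    agentless_enabled=True,
)

@llm(model_name="gpt-4", model_provider="openai")
def invoke_model(prompt: str) -> str:
    # Call your model provider here; annotate() attaches the span's
    # input and output so they appear in the trace view.
    completion = "..."  # placeholder for the provider response
    LLMObs.annotate(input_data=prompt, output_data=completion)
    return completion

@workflow
def handle_request(user_message: str) -> str:
    # Each request handled by the workflow becomes one trace; the
    # nested LLM call above becomes a child span within it.
    return invoke_model(user_message)
```

With this in place, each call to `handle_request` produces one trace containing a workflow span and a nested LLM span.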

You can visualize the interactions and performance data of your LLM applications on the LLM Observability Traces page, where each request fulfilled by your application is represented as a trace.

Figure: An LLM Observability trace displaying each span of a request.