This guide uses the LLM Observability SDK for Python. If your application is written in another language, you can create traces by calling the API instead.
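If you go the API route, the sketch below shows roughly what a direct span submission could look like in Python. It is only a sketch: the intake endpoint path, payload shape, and field names shown here are assumptions that you should verify against the LLM Observability API reference, and the IDs, messages, and timings are placeholder values:

import os
import time
import uuid

import requests

# Assumed agentless intake endpoint; confirm the path for your Datadog site
# in the LLM Observability API reference before relying on it.
DD_SITE = os.environ.get("DD_SITE", "datadoghq.com")
URL = f"https://api.{DD_SITE}/api/intake/llm-obs/v1/trace/spans"

payload = {
    "data": {
        "type": "span",
        "attributes": {
            "ml_app": "onboarding-quickstart",
            "spans": [
                {
                    # Placeholder IDs generated for illustration only.
                    "trace_id": uuid.uuid4().hex,
                    "span_id": uuid.uuid4().hex,
                    "parent_id": "undefined",
                    "name": "chat_completion",
                    "meta": {
                        "kind": "llm",
                        "input": {"value": "I'd like to buy a chair for my living room."},
                        "output": {"value": "Happy to help! What style are you looking for?"},
                    },
                    "start_ns": time.time_ns(),
                    "duration": 350_000_000,  # span duration in nanoseconds
                }
            ],
        },
    }
}

response = requests.post(
    URL,
    json=payload,
    headers={"DD-API-KEY": os.environ["DD_API_KEY"]},
)
response.raise_for_status()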
To better understand LLM Observability terms and concepts, you can explore the examples in the LLM Observability Jupyter Notebooks repository. These notebooks provide a hands-on experience, and allow you to apply these concepts in real time.
To generate an LLM Observability trace, you can run a Python script.
The script below uses OpenAI, so you need an OpenAI API key stored in your environment as OPENAI_API_KEY. To create one, see Account Setup and Set up your API key in the official OpenAI documentation.
Install the SDK by adding the ddtrace and openai packages:
pip install ddtrace
pip install openai
Create a Python script and save it as quickstart.py. This Python script makes a single OpenAI call.
quickstart.py
import os
from openai import OpenAI

# Read the OpenAI API key from the environment and create a client.
oai_client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

# Make a single chat completion call; ddtrace-run captures it as an LLM span.
completion = oai_client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful customer assistant for a furniture store."},
        {"role": "user", "content": "I'd like to buy a chair for my living room."},
    ],
)
Run the Python script with the following shell command. This sends a trace of the OpenAI call to Datadog.
DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=onboarding-quickstart \
DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_SITE=<YOUR_DATADOG_SITE> \
DD_LLMOBS_AGENTLESS_ENABLED=1 ddtrace-run python quickstart.py
For more information about required environment variables, see the SDK documentation.
Note: DD_LLMOBS_AGENTLESS_ENABLED is only required if you do not have the Datadog Agent running. If the Agent is running in your production environment, make sure this environment variable is unset.
View the trace of your LLM call on the Traces tab of the LLM Observability page in Datadog.
The trace you see is composed of a single LLM span. The ddtrace-run command automatically traces your LLM calls from Datadog's list of supported integrations.
If your application involves more elaborate prompting, or complex chains or workflows of LLM calls, you can trace it by following the Setup documentation and the SDK documentation.
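As a rough illustration, the sketch below uses the SDK's decorators to nest the same OpenAI call inside a workflow span and enables LLM Observability in code instead of through ddtrace-run environment variables. Treat the ml_app value, the function name, and the sample handling as placeholders, and check the exact enable(), decorator, and annotate() arguments against the SDK documentation:

import os

from openai import OpenAI
from ddtrace.llmobs import LLMObs
from ddtrace.llmobs.decorators import workflow

# Enable LLM Observability in code; agentless_enabled is only needed
# when no Datadog Agent is running.
LLMObs.enable(
    ml_app="onboarding-quickstart",
    agentless_enabled=True,
)

oai_client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))


@workflow(name="recommend_furniture")
def recommend_furniture(question: str) -> str:
    # The OpenAI call is auto-instrumented as an LLM span nested under
    # the workflow span created by the decorator.
    completion = oai_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a helpful customer assistant for a furniture store."},
            {"role": "user", "content": question},
        ],
    )
    answer = completion.choices[0].message.content
    # Record the workflow's own input and output on its span.
    LLMObs.annotate(input_data=question, output_data=answer)
    return answer


if __name__ == "__main__":
    print(recommend_furniture("I'd like to buy a chair for my living room."))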
Additional helpful documentation, links, and articles: