Send your logs to the Datadog platform over HTTP. For more information, see the Log Management page.
POST https://http-intake.logs.datadoghq.com/v1/input
POST https://http-intake.logs.datadoghq.eu/v1/input
POST https://http-intake.logs.ddog-gov.com/v1/input
POST https://http-intake.logs.us3.datadoghq.com/v1/input
POST https://http-intake.logs.us5.datadoghq.com/v1/input
POST https://http-intake.logs.ap1.datadoghq.com/v1/input
Send your logs to your Datadog platform over HTTP. Limits per HTTP request are:

- Maximum content size per payload (uncompressed): 5MB
- Maximum size for a single log: 1MB
- Maximum array size if sending multiple logs in an array: 1000 entries

Any log exceeding 1MB is accepted and truncated by Datadog:

- For a single log request, the API truncates the log at 1MB and returns a 2xx.
- For a multi-logs request, the API processes all logs, truncates only logs larger than 1MB, and returns a 2xx.

Datadog recommends sending your logs compressed. Add the `Content-Encoding: gzip` header to the request when sending compressed logs.

The status codes returned by the HTTP API are listed in the Response section below.
Query Strings

| Name | Type | Description |
|------|------|-------------|
| ddtags | string | Log tags can be passed as query parameters with `text/plain` content type. |

Headers

| Name | Type | Description |
|------|------|-------------|
| Content-Encoding | string | HTTP header used to compress the media-type. |
Log to send (JSON format).
[
{
"message": "Example-Log",
"ddtags": "host:ExampleLog"
}
]
Response from server (always 200 empty JSON).
{}
Unexpected error
Invalid query performed.
{
"code": 0,
"message": "Your browser sent an invalid request."
}
Too many requests
Error response object.
{
"errors": [
"Bad Request"
]
}
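Putting the request pieces above together, here is a minimal Python sketch of gzip-compressing a JSON payload and submitting it with the documented headers. The `send_logs` helper uses the third-party `requests` package; the endpoint, headers, and payload shape come from the documentation above, while the function names are illustrative:

```python
import gzip
import json


def compress_payload(logs):
    """Serialize a list of log items to JSON and gzip-compress it."""
    return gzip.compress(json.dumps(logs).encode("utf-8"))


def send_logs(logs, api_key):
    """POST gzip-compressed logs to the v1 intake endpoint.

    Requires the third-party `requests` package (`pip install requests`).
    """
    import requests  # third-party; imported lazily so the rest is stdlib-only

    return requests.post(
        "https://http-intake.logs.datadoghq.com/v1/input",
        data=compress_payload(logs),
        headers={
            "Content-Type": "application/json",
            "Content-Encoding": "gzip",
            "DD-API-KEY": api_key,
        },
    )


if __name__ == "__main__":
    body = compress_payload([{"message": "Example-Log", "ddtags": "host:ExampleLog"}])
    # Round-trip check: decompressing yields the original JSON array.
    print(json.loads(gzip.decompress(body)))
```

Compressing client-side and setting `Content-Encoding: gzip` is exactly what the curl examples below do with `gzip | curl --data-binary @-`.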
For deflate-compressed requests, see one of the other client libraries for an example; the curl examples below show gzip-compressed and uncompressed submissions.
## Multi JSON Messages
# Pass multiple log objects at once.
# Curl command
echo $(cat << EOF
[
{
"message": "Example-Log",
"ddtags": "host:ExampleLog"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Simple JSON Message
# Log attributes can be passed as `key:value` pairs in valid JSON messages.
# Curl command
echo $(cat << EOF
[
{
"message": "Example-Log",
"ddtags": "host:ExampleLog"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: application/json;simple" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Multi Logplex Messages
# Submit log messages.
# Curl command
echo $(cat << EOF
[
{
"message": "Example-Log",
"ddtags": "host:ExampleLog"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: application/logplex-1" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Simple Logplex Message
# Submit log string.
# Curl command
echo $(cat << EOF
[
{
"message": "Example-Log",
"ddtags": "host:ExampleLog"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: application/logplex-1" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Multi Raw Messages
# Submit log string.
# Curl command
echo $(cat << EOF
[
{
"message": "Example-Log",
"ddtags": "host:ExampleLog"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: text/plain" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Simple Raw Message
# Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`.
# Curl command
echo $(cat << EOF
[
{
"message": "Example-Log",
"ddtags": "host:ExampleLog"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: text/plain" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Multi JSON Messages
# Pass multiple log objects at once.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
[
{
"message": "hello"
},
{
"message": "world"
}
]
EOF
## Simple JSON Message
# Log attributes can be passed as `key:value` pairs in valid JSON messages.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: application/json;simple" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
{
"ddsource": "agent",
"ddtags": "env:prod,user:joe.doe",
"hostname": "fa1e1e739d95",
"message": "hello world"
}
EOF
## Multi Logplex Messages
# Submit log messages.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: application/logplex-1" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
hello
world
EOF
## Simple Logplex Message
# Submit log string.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: application/logplex-1" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
hello world
EOF
## Multi Raw Messages
# Submit log string.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: text/plain" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
hello
world
EOF
## Simple Raw Message
# Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/v1/input" \
-H "Accept: application/json" \
-H "Content-Type: text/plain" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
hello world
EOF
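The query-parameter form described above (`?host=...&service=...&ddsource=...&ddtags=...`) should be URL-encoded when built programmatically. A minimal Python sketch using the standard library; the parameter names come from the documentation, the values are illustrative:

```python
from urllib.parse import urlencode


def intake_url(base="https://http-intake.logs.datadoghq.com/v1/input", **params):
    """Append log attributes (host, service, ddsource, ddtags, ...)
    as a URL-encoded query string to the intake endpoint."""
    return base + "?" + urlencode(params) if params else base


url = intake_url(
    host="my-hostname",
    service="my-service",
    ddsource="my-source",
    ddtags="env:prod,user:my-user",
)
print(url)
```

Note that `urlencode` percent-encodes the `:` and `,` characters inside `ddtags`, which the intake endpoint decodes back.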
// Send deflate logs returns "Response from server (always 200 empty JSON)." response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)
func main() {
body := []datadogV1.HTTPLogItem{
{
Message: "Example-Log",
Ddtags: datadog.PtrString("host:ExampleLog"),
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV1.NewLogsApi(apiClient)
resp, r, err := api.SubmitLog(ctx, body, *datadogV1.NewSubmitLogOptionalParameters().WithContentEncoding(datadogV1.CONTENTENCODING_DEFLATE))
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent)
}
// Send gzip logs returns "Response from server (always 200 empty JSON)." response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)
func main() {
body := []datadogV1.HTTPLogItem{
{
Message: "Example-Log",
Ddtags: datadog.PtrString("host:ExampleLog"),
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV1.NewLogsApi(apiClient)
resp, r, err := api.SubmitLog(ctx, body, *datadogV1.NewSubmitLogOptionalParameters().WithContentEncoding(datadogV1.CONTENTENCODING_GZIP))
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent)
}
// Send logs returns "Response from server (always 200 empty JSON)." response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)
func main() {
body := []datadogV1.HTTPLogItem{
{
Message: "Example-Log",
Ddtags: datadog.PtrString("host:ExampleLog"),
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV1.NewLogsApi(apiClient)
resp, r, err := api.SubmitLog(ctx, body, *datadogV1.NewSubmitLogOptionalParameters())
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent)
}
First install the library and its dependencies, then save the example to main.go and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" go run "main.go"
// Send deflate logs returns "Response from server (always 200 empty JSON)." response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.LogsApi;
import com.datadog.api.client.v1.api.LogsApi.SubmitLogOptionalParameters;
import com.datadog.api.client.v1.model.ContentEncoding;
import com.datadog.api.client.v1.model.HTTPLogItem;
import java.util.Collections;
import java.util.List;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
List<HTTPLogItem> body =
Collections.singletonList(
new HTTPLogItem().message("Example-Log").ddtags("host:ExampleLog"));
try {
apiInstance.submitLog(
body, new SubmitLogOptionalParameters().contentEncoding(ContentEncoding.DEFLATE));
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#submitLog");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
// Send gzip logs returns "Response from server (always 200 empty JSON)." response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.LogsApi;
import com.datadog.api.client.v1.api.LogsApi.SubmitLogOptionalParameters;
import com.datadog.api.client.v1.model.ContentEncoding;
import com.datadog.api.client.v1.model.HTTPLogItem;
import java.util.Collections;
import java.util.List;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
List<HTTPLogItem> body =
Collections.singletonList(
new HTTPLogItem().message("Example-Log").ddtags("host:ExampleLog"));
try {
apiInstance.submitLog(
body, new SubmitLogOptionalParameters().contentEncoding(ContentEncoding.GZIP));
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#submitLog");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
// Send logs returns "Response from server (always 200 empty JSON)." response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.LogsApi;
import com.datadog.api.client.v1.model.HTTPLogItem;
import java.util.Collections;
import java.util.List;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
List<HTTPLogItem> body =
Collections.singletonList(
new HTTPLogItem().message("Example-Log").ddtags("host:ExampleLog"));
try {
apiInstance.submitLog(body);
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#submitLog");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
First install the library and its dependencies, then save the example to Example.java and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" java "Example.java"
"""
Send deflate logs returns "Response from server (always 200 empty JSON)." response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_api import LogsApi
from datadog_api_client.v1.model.content_encoding import ContentEncoding
from datadog_api_client.v1.model.http_log import HTTPLog
from datadog_api_client.v1.model.http_log_item import HTTPLogItem
body = HTTPLog(
[
HTTPLogItem(
message="Example-Log",
ddtags="host:ExampleLog",
),
]
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.submit_log(content_encoding=ContentEncoding.DEFLATE, body=body)
print(response)
"""
Send gzip logs returns "Response from server (always 200 empty JSON)." response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_api import LogsApi
from datadog_api_client.v1.model.content_encoding import ContentEncoding
from datadog_api_client.v1.model.http_log import HTTPLog
from datadog_api_client.v1.model.http_log_item import HTTPLogItem
body = HTTPLog(
[
HTTPLogItem(
message="Example-Log",
ddtags="host:ExampleLog",
),
]
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.submit_log(content_encoding=ContentEncoding.GZIP, body=body)
print(response)
"""
Send logs returns "Response from server (always 200 empty JSON)." response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_api import LogsApi
from datadog_api_client.v1.model.http_log import HTTPLog
from datadog_api_client.v1.model.http_log_item import HTTPLogItem
body = HTTPLog(
[
HTTPLogItem(
message="Example-Log",
ddtags="host:ExampleLog",
),
]
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.submit_log(body=body)
print(response)
First install the library and its dependencies, then save the example to example.py and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" python3 "example.py"
# Send deflate logs returns "Response from server (always 200 empty JSON)." response
require "datadog_api_client"
api_instance = DatadogAPIClient::V1::LogsAPI.new
body = [
DatadogAPIClient::V1::HTTPLogItem.new({
message: "Example-Log",
ddtags: "host:ExampleLog",
}),
]
opts = {
  content_encoding: DatadogAPIClient::V1::ContentEncoding::DEFLATE,
}
p api_instance.submit_log(body, opts)
# Send gzip logs returns "Response from server (always 200 empty JSON)." response
require "datadog_api_client"
api_instance = DatadogAPIClient::V1::LogsAPI.new
body = [
DatadogAPIClient::V1::HTTPLogItem.new({
message: "Example-Log",
ddtags: "host:ExampleLog",
}),
]
opts = {
  content_encoding: DatadogAPIClient::V1::ContentEncoding::GZIP,
}
p api_instance.submit_log(body, opts)
# Send logs returns "Response from server (always 200 empty JSON)." response
require "datadog_api_client"
api_instance = DatadogAPIClient::V1::LogsAPI.new
body = [
DatadogAPIClient::V1::HTTPLogItem.new({
message: "Example-Log",
ddtags: "host:ExampleLog",
}),
]
p api_instance.submit_log(body)
First install the library and its dependencies, then save the example to example.rb and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" ruby "example.rb"
// Send deflate logs returns "Response from server (always 200 empty JSON)."
// response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_logs::LogsAPI;
use datadog_api_client::datadogV1::api_logs::SubmitLogOptionalParams;
use datadog_api_client::datadogV1::model::ContentEncoding;
use datadog_api_client::datadogV1::model::HTTPLogItem;
use std::collections::BTreeMap;
#[tokio::main]
async fn main() {
let body = vec![HTTPLogItem::new("Example-Log".to_string())
.ddtags("host:ExampleLog".to_string())
.additional_properties(BTreeMap::from([]))];
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api
.submit_log(
body,
SubmitLogOptionalParams::default().content_encoding(ContentEncoding::DEFLATE),
)
.await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
// Send gzip logs returns "Response from server (always 200 empty JSON)." response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_logs::LogsAPI;
use datadog_api_client::datadogV1::api_logs::SubmitLogOptionalParams;
use datadog_api_client::datadogV1::model::ContentEncoding;
use datadog_api_client::datadogV1::model::HTTPLogItem;
use std::collections::BTreeMap;
#[tokio::main]
async fn main() {
let body = vec![HTTPLogItem::new("Example-Log".to_string())
.ddtags("host:ExampleLog".to_string())
.additional_properties(BTreeMap::from([]))];
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api
.submit_log(
body,
SubmitLogOptionalParams::default().content_encoding(ContentEncoding::GZIP),
)
.await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
// Send logs returns "Response from server (always 200 empty JSON)." response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_logs::LogsAPI;
use datadog_api_client::datadogV1::api_logs::SubmitLogOptionalParams;
use datadog_api_client::datadogV1::model::HTTPLogItem;
use std::collections::BTreeMap;
#[tokio::main]
async fn main() {
let body = vec![HTTPLogItem::new("Example-Log".to_string())
.ddtags("host:ExampleLog".to_string())
.additional_properties(BTreeMap::from([]))];
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api
.submit_log(body, SubmitLogOptionalParams::default())
.await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
First install the library and its dependencies, then save the example to src/main.rs and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" cargo run
/**
* Send deflate logs returns "Response from server (always 200 empty JSON)." response
*/
import { client, v1 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v1.LogsApi(configuration);
const params: v1.LogsApiSubmitLogRequest = {
body: [
{
message: "Example-Log",
ddtags: "host:ExampleLog",
},
],
contentEncoding: "deflate",
};
apiInstance
.submitLog(params)
.then((data: any) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
/**
* Send gzip logs returns "Response from server (always 200 empty JSON)." response
*/
import { client, v1 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v1.LogsApi(configuration);
const params: v1.LogsApiSubmitLogRequest = {
body: [
{
message: "Example-Log",
ddtags: "host:ExampleLog",
},
],
contentEncoding: "gzip",
};
apiInstance
.submitLog(params)
.then((data: any) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
/**
* Send logs returns "Response from server (always 200 empty JSON)." response
*/
import { client, v1 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v1.LogsApi(configuration);
const params: v1.LogsApiSubmitLogRequest = {
body: [
{
message: "Example-Log",
ddtags: "host:ExampleLog",
},
],
};
apiInstance
.submitLog(params)
.then((data: any) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
First install the library and its dependencies, then save the example to example.ts and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" tsc "example.ts"
POST https://http-intake.logs.datadoghq.com/api/v2/logs
POST https://http-intake.logs.datadoghq.eu/api/v2/logs
POST https://http-intake.logs.ddog-gov.com/api/v2/logs
POST https://http-intake.logs.us3.datadoghq.com/api/v2/logs
POST https://http-intake.logs.us5.datadoghq.com/api/v2/logs
POST https://http-intake.logs.ap1.datadoghq.com/api/v2/logs
Send your logs to your Datadog platform over HTTP. Limits per HTTP request are:

- Maximum content size per payload (uncompressed): 5MB
- Maximum size for a single log: 1MB
- Maximum array size if sending multiple logs in an array: 1000 entries

Any log exceeding 1MB is accepted and truncated by Datadog:

- For a single log request, the API truncates the log at 1MB and returns a 2xx.
- For a multi-logs request, the API processes all logs, truncates only logs larger than 1MB, and returns a 2xx.

Datadog recommends sending your logs compressed. Add the `Content-Encoding: gzip` header to the request when sending compressed logs.

Log events can be submitted with a timestamp that is up to 18 hours in the past.

The status codes returned by the HTTP API are listed in the Response section below.
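The request limits and the 18-hour timestamp window can be enforced client-side before submitting. A minimal Python sketch; the 1000-entry array cap is taken from the documented per-request limits, and the helper names are illustrative:

```python
from datetime import datetime, timedelta, timezone

MAX_LOGS_PER_REQUEST = 1000  # documented array-size cap per request


def chunk_logs(logs, size=MAX_LOGS_PER_REQUEST):
    """Split a list of log items into request-sized batches."""
    return [logs[i:i + size] for i in range(0, len(logs), size)]


def timestamp_accepted(ts, now=None):
    """True if `ts` falls within the documented 18-hour ingestion window."""
    now = now or datetime.now(timezone.utc)
    return now - timedelta(hours=18) <= ts <= now


batches = chunk_logs([{"message": f"log-{i}"} for i in range(2500)])
print([len(b) for b in batches])  # [1000, 1000, 500]
```

Each batch can then be submitted as one request body, keeping every payload under the documented limits.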
Query Strings

| Name | Type | Description |
|------|------|-------------|
| ddtags | string | Log tags can be passed as query parameters with `text/plain` content type. |

Headers

| Name | Type | Description |
|------|------|-------------|
| Content-Encoding | string | HTTP header used to compress the media-type. |
Log to send (JSON format).
[
{
"ddsource": "nginx",
"ddtags": "env:staging,version:5.1",
"hostname": "i-012345678",
"message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
"service": "payment"
}
]
[
{
"ddsource": "nginx",
"ddtags": "env:staging,version:5.1",
"hostname": "i-012345678",
"message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
"service": "payment",
"status": "info"
}
]
Request accepted for processing (always 202 empty JSON).
{}
Invalid query performed. The 400 Bad Request, 401 Unauthorized, 403 Forbidden, 408 Request Timeout, 413 Payload Too Large, 429 Too Many Requests, 500 Internal Server Error, and 503 Service Unavailable responses all share the same error response object:

| Field | Type | Description |
|-------|------|-------------|
| errors | [object] | Structured errors. |
| detail | string | Error message. |
| status | string | Error code. |
| title | string | Error title. |

{
  "errors": [
    {
      "detail": "Malformed payload",
      "status": "400",
      "title": "Bad Request"
    }
  ]
}
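A client can surface these structured errors uniformly. A minimal Python sketch that renders the `errors` array from an error response body into readable lines (the response shape is as shown above; the function name is illustrative):

```python
def format_errors(response_body):
    """Render the structured `errors` array into human-readable lines."""
    return [
        f'{e.get("status", "?")} {e.get("title", "")}: {e.get("detail", "")}'
        for e in response_body.get("errors", [])
    ]


body = {"errors": [{"detail": "Malformed payload", "status": "400", "title": "Bad Request"}]}
print(format_errors(body))  # ['400 Bad Request: Malformed payload']
```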
For deflate-compressed requests, see one of the other client libraries for an example; the curl examples below show gzip-compressed submissions.
## Multi JSON Messages
# Pass multiple log objects at once.
# Curl command
echo $(cat << EOF
[
{
"ddsource": "nginx",
"ddtags": "env:staging,version:5.1",
"hostname": "i-012345678",
"message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
"service": "payment"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Simple JSON Message
# Log attributes can be passed as `key:value` pairs in valid JSON messages.
# Curl command
echo $(cat << EOF
[
{
"ddsource": "nginx",
"ddtags": "env:staging,version:5.1",
"hostname": "i-012345678",
"message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
"service": "payment"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Multi Logplex Messages
# Submit log messages.
# Curl command
echo $(cat << EOF
[
{
"ddsource": "nginx",
"ddtags": "env:staging,version:5.1",
"hostname": "i-012345678",
"message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
"service": "payment"
}
]
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: application/logplex-1" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Simple Logplex Message
# Submit log string.
# Curl command
(cat << EOF
2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: application/logplex-1" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Multi Raw Messages
# Submit log string.
# Curl command
(cat << EOF
2019-11-19T14:37:58,995 INFO [process.name][20081] Hello
2019-11-19T14:37:58,995 INFO [process.name][20081] World
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: text/plain" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Simple Raw Message
# Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`.
# Curl command
(cat << EOF
2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World
EOF
) | gzip | curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: text/plain" \
-H "Content-Encoding: gzip" \
-H "DD-API-KEY: ${DD_API_KEY}" \
--data-binary @-
## Multi JSON Messages
# Pass multiple log objects at once.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
[
{
"ddsource": "nginx",
"ddtags": "env:staging,version:5.1",
"hostname": "i-012345678",
"message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello",
"service": "payment"
},
{
"ddsource": "nginx",
"ddtags": "env:staging,version:5.1",
"hostname": "i-012345679",
"message": "2019-11-19T14:37:58,995 INFO [process.name][20081] World",
"service": "payment"
}
]
EOF
## Simple JSON Message
# Log attributes can be passed as `key:value` pairs in valid JSON messages.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
{
"ddsource": "nginx",
"ddtags": "env:staging,version:5.1",
"hostname": "i-012345678",
"message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
"service": "payment"
}
EOF
## Multi Logplex Messages
# Submit log messages.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: application/logplex-1" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
2019-11-19T14:37:58,995 INFO [process.name][20081] Hello
2019-11-19T14:37:58,995 INFO [process.name][20081] World
EOF
## Simple Logplex Message
# Submit log string.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: application/logplex-1" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World
EOF
## Multi Raw Messages
# Submit log string.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: text/plain" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
2019-11-19T14:37:58,995 INFO [process.name][20081] Hello
2019-11-19T14:37:58,995 INFO [process.name][20081] World
EOF
## Simple Raw Message
# Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`.
# Curl command
curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
-H "Accept: application/json" \
-H "Content-Type: text/plain" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-d @- << EOF
2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World
EOF
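The compressed curl examples earlier gzip the payload in the shell; the same request preparation can be sketched in Python. The endpoint path, headers, and DD_API_KEY environment variable mirror the curl commands above, but the helper name is illustrative only, not part of any Datadog client library.

```python
import gzip
import json
import os

def build_gzip_log_request(logs):
    """Prepare a gzip-compressed body plus headers for the v2 log intake,
    mirroring the `Content-Encoding: gzip` curl examples above."""
    body = gzip.compress(json.dumps(logs).encode("utf-8"))
    headers = {
        "Content-Type": "application/json",
        "Content-Encoding": "gzip",
        "DD-API-KEY": os.environ.get("DD_API_KEY", ""),
    }
    return body, headers

body, headers = build_gzip_log_request(
    [{"ddsource": "nginx", "message": "Hello World", "service": "payment"}]
)
# Decompressing the body recovers the original JSON payload.
assert json.loads(gzip.decompress(body))[0]["service"] == "payment"
```

POST the resulting body with these headers to https://http-intake.logs.datadoghq.com/api/v2/logs using any HTTP client.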
// Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)
func main() {
body := []datadogV2.HTTPLogItem{
{
Ddsource: datadog.PtrString("nginx"),
Ddtags: datadog.PtrString("env:staging,version:5.1"),
Hostname: datadog.PtrString("i-012345678"),
Message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
Service: datadog.PtrString("payment"),
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV2.NewLogsApi(apiClient)
resp, r, err := api.SubmitLog(ctx, body, *datadogV2.NewSubmitLogOptionalParameters().WithContentEncoding(datadogV2.CONTENTENCODING_DEFLATE))
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent)
}
// Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)
func main() {
body := []datadogV2.HTTPLogItem{
{
Ddsource: datadog.PtrString("nginx"),
Ddtags: datadog.PtrString("env:staging,version:5.1"),
Hostname: datadog.PtrString("i-012345678"),
Message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
Service: datadog.PtrString("payment"),
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV2.NewLogsApi(apiClient)
resp, r, err := api.SubmitLog(ctx, body, *datadogV2.NewSubmitLogOptionalParameters().WithContentEncoding(datadogV2.CONTENTENCODING_GZIP))
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent)
}
// Send logs returns "Request accepted for processing (always 202 empty JSON)." response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)
func main() {
body := []datadogV2.HTTPLogItem{
{
Ddsource: datadog.PtrString("nginx"),
Ddtags: datadog.PtrString("env:staging,version:5.1"),
Hostname: datadog.PtrString("i-012345678"),
Message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
Service: datadog.PtrString("payment"),
AdditionalProperties: map[string]string{
"status": "info",
},
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV2.NewLogsApi(apiClient)
resp, r, err := api.SubmitLog(ctx, body, *datadogV2.NewSubmitLogOptionalParameters())
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent)
}
First install the library and its dependencies, then save the example to main.go
and run the following command, setting DD_SITE to your Datadog site:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" go run "main.go"
// Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.LogsApi;
import com.datadog.api.client.v2.api.LogsApi.SubmitLogOptionalParameters;
import com.datadog.api.client.v2.model.ContentEncoding;
import com.datadog.api.client.v2.model.HTTPLogItem;
import java.util.Collections;
import java.util.List;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
List<HTTPLogItem> body =
Collections.singletonList(
new HTTPLogItem()
.ddsource("nginx")
.ddtags("env:staging,version:5.1")
.hostname("i-012345678")
.message("2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World")
.service("payment"));
try {
apiInstance.submitLog(
body, new SubmitLogOptionalParameters().contentEncoding(ContentEncoding.DEFLATE));
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#submitLog");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
// Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.LogsApi;
import com.datadog.api.client.v2.api.LogsApi.SubmitLogOptionalParameters;
import com.datadog.api.client.v2.model.ContentEncoding;
import com.datadog.api.client.v2.model.HTTPLogItem;
import java.util.Collections;
import java.util.List;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
List<HTTPLogItem> body =
Collections.singletonList(
new HTTPLogItem()
.ddsource("nginx")
.ddtags("env:staging,version:5.1")
.hostname("i-012345678")
.message("2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World")
.service("payment"));
try {
apiInstance.submitLog(
body, new SubmitLogOptionalParameters().contentEncoding(ContentEncoding.GZIP));
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#submitLog");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
// Send logs returns "Request accepted for processing (always 202 empty JSON)." response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.LogsApi;
import com.datadog.api.client.v2.model.HTTPLogItem;
import java.util.Collections;
import java.util.List;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
List<HTTPLogItem> body =
Collections.singletonList(
new HTTPLogItem()
.ddsource("nginx")
.ddtags("env:staging,version:5.1")
.hostname("i-012345678")
.message("2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World")
.service("payment")
.putAdditionalProperty("status", "info"));
try {
apiInstance.submitLog(body);
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#submitLog");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
First install the library and its dependencies, then save the example to Example.java
and run the following command, setting DD_SITE to your Datadog site:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" java "Example.java"
"""
Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_api import LogsApi
from datadog_api_client.v2.model.content_encoding import ContentEncoding
from datadog_api_client.v2.model.http_log import HTTPLog
from datadog_api_client.v2.model.http_log_item import HTTPLogItem
body = HTTPLog(
[
HTTPLogItem(
ddsource="nginx",
ddtags="env:staging,version:5.1",
hostname="i-012345678",
message="2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service="payment",
),
]
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.submit_log(content_encoding=ContentEncoding.DEFLATE, body=body)
print(response)
"""
Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_api import LogsApi
from datadog_api_client.v2.model.content_encoding import ContentEncoding
from datadog_api_client.v2.model.http_log import HTTPLog
from datadog_api_client.v2.model.http_log_item import HTTPLogItem
body = HTTPLog(
[
HTTPLogItem(
ddsource="nginx",
ddtags="env:staging,version:5.1",
hostname="i-012345678",
message="2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service="payment",
),
]
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.submit_log(content_encoding=ContentEncoding.GZIP, body=body)
print(response)
"""
Send logs returns "Request accepted for processing (always 202 empty JSON)." response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_api import LogsApi
from datadog_api_client.v2.model.http_log import HTTPLog
from datadog_api_client.v2.model.http_log_item import HTTPLogItem
body = HTTPLog(
[
HTTPLogItem(
ddsource="nginx",
ddtags="env:staging,version:5.1",
hostname="i-012345678",
message="2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service="payment",
status="info",
),
]
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.submit_log(body=body)
print(response)
First install the library and its dependencies, then save the example to example.py
and run the following command, setting DD_SITE to your Datadog site:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" python3 "example.py"
# Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response
require "datadog_api_client"
api_instance = DatadogAPIClient::V2::LogsAPI.new
body = [
DatadogAPIClient::V2::HTTPLogItem.new({
ddsource: "nginx",
ddtags: "env:staging,version:5.1",
hostname: "i-012345678",
message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service: "payment",
}),
]
opts = {
content_encoding: ContentEncoding::DEFLATE,
}
p api_instance.submit_log(body, opts)
# Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response
require "datadog_api_client"
api_instance = DatadogAPIClient::V2::LogsAPI.new
body = [
DatadogAPIClient::V2::HTTPLogItem.new({
ddsource: "nginx",
ddtags: "env:staging,version:5.1",
hostname: "i-012345678",
message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service: "payment",
}),
]
opts = {
content_encoding: ContentEncoding::GZIP,
}
p api_instance.submit_log(body, opts)
# Send logs returns "Request accepted for processing (always 202 empty JSON)." response
require "datadog_api_client"
api_instance = DatadogAPIClient::V2::LogsAPI.new
body = [
DatadogAPIClient::V2::HTTPLogItem.new({
ddsource: "nginx",
ddtags: "env:staging,version:5.1",
hostname: "i-012345678",
message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service: "payment",
status: "info",
}),
]
p api_instance.submit_log(body)
First install the library and its dependencies, then save the example to example.rb
and run the following command, setting DD_SITE to your Datadog site:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" ruby "example.rb"
// Send deflate logs returns "Request accepted for processing (always 202 empty
// JSON)." response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_logs::LogsAPI;
use datadog_api_client::datadogV2::api_logs::SubmitLogOptionalParams;
use datadog_api_client::datadogV2::model::ContentEncoding;
use datadog_api_client::datadogV2::model::HTTPLogItem;
use std::collections::BTreeMap;
#[tokio::main]
async fn main() {
let body = vec![HTTPLogItem::new(
"2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World".to_string(),
)
.ddsource("nginx".to_string())
.ddtags("env:staging,version:5.1".to_string())
.hostname("i-012345678".to_string())
.service("payment".to_string())
.additional_properties(BTreeMap::from([]))];
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api
.submit_log(
body,
SubmitLogOptionalParams::default().content_encoding(ContentEncoding::DEFLATE),
)
.await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
// Send gzip logs returns "Request accepted for processing (always 202 empty
// JSON)." response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_logs::LogsAPI;
use datadog_api_client::datadogV2::api_logs::SubmitLogOptionalParams;
use datadog_api_client::datadogV2::model::ContentEncoding;
use datadog_api_client::datadogV2::model::HTTPLogItem;
use std::collections::BTreeMap;
#[tokio::main]
async fn main() {
let body = vec![HTTPLogItem::new(
"2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World".to_string(),
)
.ddsource("nginx".to_string())
.ddtags("env:staging,version:5.1".to_string())
.hostname("i-012345678".to_string())
.service("payment".to_string())
.additional_properties(BTreeMap::from([]))];
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api
.submit_log(
body,
SubmitLogOptionalParams::default().content_encoding(ContentEncoding::GZIP),
)
.await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
// Send logs returns "Request accepted for processing (always 202 empty JSON)."
// response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_logs::LogsAPI;
use datadog_api_client::datadogV2::api_logs::SubmitLogOptionalParams;
use datadog_api_client::datadogV2::model::HTTPLogItem;
use std::collections::BTreeMap;
#[tokio::main]
async fn main() {
let body = vec![HTTPLogItem::new(
"2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World".to_string(),
)
.ddsource("nginx".to_string())
.ddtags("env:staging,version:5.1".to_string())
.hostname("i-012345678".to_string())
.service("payment".to_string())
.additional_properties(BTreeMap::from([("status".to_string(), "info".to_string())]))];
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api
.submit_log(body, SubmitLogOptionalParams::default())
.await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
First install the library and its dependencies, then save the example to src/main.rs
and run the following command, setting DD_SITE to your Datadog site:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" cargo run
/**
* Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response
*/
import { client, v2 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v2.LogsApi(configuration);
const params: v2.LogsApiSubmitLogRequest = {
body: [
{
ddsource: "nginx",
ddtags: "env:staging,version:5.1",
hostname: "i-012345678",
message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service: "payment",
},
],
contentEncoding: "deflate",
};
apiInstance
.submitLog(params)
.then((data: any) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
/**
* Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response
*/
import { client, v2 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v2.LogsApi(configuration);
const params: v2.LogsApiSubmitLogRequest = {
body: [
{
ddsource: "nginx",
ddtags: "env:staging,version:5.1",
hostname: "i-012345678",
message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service: "payment",
},
],
contentEncoding: "gzip",
};
apiInstance
.submitLog(params)
.then((data: any) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
/**
* Send logs returns "Request accepted for processing (always 202 empty JSON)." response
*/
import { client, v2 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v2.LogsApi(configuration);
const params: v2.LogsApiSubmitLogRequest = {
body: [
{
ddsource: "nginx",
ddtags: "env:staging,version:5.1",
hostname: "i-012345678",
message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
service: "payment",
additionalProperties: {
status: "info",
},
},
],
};
apiInstance
.submitLog(params)
.then((data: any) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
First install the library and its dependencies, then save the example to example.ts
and run the following command, setting DD_SITE to your Datadog site:
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" tsc "example.ts"
POST https://api.datadoghq.com/api/v2/logs/analytics/aggregate
(Other sites: https://api.ap1.datadoghq.com, https://api.datadoghq.eu, https://api.ddog-gov.com, https://api.us3.datadoghq.com, https://api.us5.datadoghq.com)
The API endpoint to aggregate events into buckets and compute metrics and timeseries.
This endpoint requires the logs_read_data permission.
Field
Type
Description
compute
[object]
The list of metrics or timeseries to compute for the retrieved buckets.
aggregation [required]
enum
An aggregation function
Allowed enum values: count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median
interval
string
The time buckets' size (only used for type=timeseries). Defaults to a resolution of 150 points.
metric
string
The metric to use
type
enum
The type of compute
Allowed enum values: timeseries,total
default: total
filter
object
The search and filter query settings
from
string
The minimum time for the requested logs, supports date math and regular timestamps (milliseconds).
default: now-15m
indexes
[string]
For customers with multiple indexes, the indexes to search. Defaults to ['*'] which means all indexes.
default: *
query
string
The search query - following the log search syntax.
default: *
storage_tier
enum
Specifies storage type as indexes, online-archives or flex
Allowed enum values: indexes,online-archives,flex
default: indexes
to
string
The maximum time for the requested logs, supports date math and regular timestamps (milliseconds).
default: now
group_by
[object]
The rules for the group by
facet [required]
string
The name of the facet to use (required)
histogram
object
Used to perform a histogram computation (only for measure facets). Note: at most 100 buckets are allowed; the number of buckets is (max - min)/interval.
interval [required]
double
The bin size of the histogram buckets
max [required]
double
The maximum value for the measure used in the histogram (values greater than this one are filtered out)
min [required]
double
The minimum value for the measure used in the histogram (values smaller than this one are filtered out)
limit
int64
The maximum number of buckets to return for this group by. Note: at most 10000 buckets are allowed. If grouping by multiple facets, the product of limits must not exceed 10000.
default: 10
missing
<oneOf>
The value to use for logs that don't have the facet used to group by
Option 1
string
The missing value to use if there is a string valued facet.
Option 2
double
The missing value to use if there is a number valued facet.
sort
object
A sort rule
aggregation
enum
An aggregation function
Allowed enum values: count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median
metric
string
The metric to sort by (only used for type=measure)
order
enum
The order to use, ascending or descending
Allowed enum values: asc,desc
type
enum
The type of sorting algorithm
Allowed enum values: alphabetical,measure
default: alphabetical
total
<oneOf>
A resulting object to put the given computes in over all the matching records.
Option 1
boolean
If set to true, creates an additional bucket labeled "$facet_total"
Option 2
string
A string to use as the key value for the total bucket
Option 3
double
A number to use as the key value for the total bucket
options
object
Global query options that are used during the query. Note: you should supply either timezone or time offset, but not both. Otherwise, the query will fail.
timeOffset
int64
The time offset (in seconds) to apply to the query.
timezone
string
The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York).
default: UTC
page
object
Paging settings
cursor
string
The returned paging point to use to get the next results. Note: at most 1000 results can be paged.
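The histogram rule above (number of buckets = (max - min)/interval, capped at 100) can be sanity-checked before sending a request. A small helper, illustrative only and not part of any Datadog client:

```python
def histogram_bucket_count(min_val, max_val, interval):
    """Buckets a histogram group_by produces: (max - min) / interval.
    Raises if the request would exceed the documented 100-bucket cap."""
    if interval <= 0 or max_val <= min_val:
        raise ValueError("need max > min and a positive interval")
    buckets = (max_val - min_val) / interval
    if buckets > 100:
        raise ValueError("at most 100 histogram buckets are allowed")
    return int(buckets)

# A histogram over @duration from 0 to 1000 in bins of 50 gives 20 buckets.
print(histogram_bucket_count(0, 1000, 50))  # → 20
```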
{
"compute": [
{
"aggregation": "count",
"interval": "5m",
"type": "timeseries"
}
],
"filter": {
"from": "now-15m",
"indexes": [
"main"
],
"query": "*",
"to": "now"
}
}
{
"compute": [
{
"aggregation": "count",
"interval": "5m",
"type": "timeseries"
}
],
"filter": {
"from": "now-15m",
"indexes": [
"main"
],
"query": "*",
"to": "now"
},
"group_by": [
{
"facet": "host",
"missing": "miss",
"sort": {
"type": "measure",
"order": "asc",
"aggregation": "pc90",
"metric": "@duration"
}
}
]
}
{
"filter": {
"from": "now-15m",
"indexes": [
"main"
],
"query": "*",
"to": "now"
}
}
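The options object accepts either timezone or timeOffset, never both. An illustrative request fragment pinning results to New York time:

```json
{
  "filter": {
    "from": "now-15m",
    "query": "*",
    "to": "now"
  },
  "options": {
    "timezone": "America/New_York"
  }
}
```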
OK
The response object for the logs aggregate API endpoint
Field
Type
Description
data
object
The query results
buckets
[object]
The list of matching buckets, one item per bucket
by
object
The key, value pairs for each group by
<any-key>
The values for each group by
computes
object
A map of the metric name -> value for regular compute or list of values for a timeseries
<any-key>
<oneOf>
A bucket value, can be either a timeseries or a single value
Option 1
string
A single string value
Option 2
double
A single number value
Option 3
[object]
A timeseries array
time
string
The time value for this point
value
double
The value for this point
meta
object
The metadata associated with a request
elapsed
int64
The time elapsed in milliseconds
page
object
Paging attributes.
after
string
The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of the page[cursor].
request_id
string
The identifier of the request
status
enum
The status of the response
Allowed enum values: done,timeout
warnings
[object]
A list of warnings (non-fatal errors) encountered; partial results might be returned if warnings are present in the response.
code
string
A unique code for this type of warning
detail
string
A detailed explanation of this specific warning
title
string
A short human-readable summary of the warning
{
"data": {
"buckets": [
{
"by": {
"<any-key>": "undefined"
},
"computes": {
"<any-key>": {
"description": "undefined",
"type": "undefined"
}
}
}
]
},
"meta": {
"elapsed": 132,
"page": {
"after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ=="
},
"request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR",
"status": "done",
"warnings": [
{
"code": "unknown_index",
"detail": "indexes: foo, bar",
"title": "One or several indexes are missing or invalid, results hold data from the other indexes"
}
]
}
}
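The meta.page.after cursor shown in the response above drives pagination: resend the same request with the page cursor set to the returned value until no cursor comes back. A minimal sketch of that loop, where fetch is a hypothetical stand-in for one aggregate request and not part of any Datadog client:

```python
def collect_all_buckets(fetch):
    """Accumulate data.buckets across pages by following meta.page.after
    cursors until the API stops returning one."""
    buckets = []
    cursor = None
    while True:
        resp = fetch(cursor)  # one POST to /api/v2/logs/analytics/aggregate
        buckets.extend(resp.get("data", {}).get("buckets", []))
        cursor = resp.get("meta", {}).get("page", {}).get("after")
        if not cursor:
            return buckets
```

Here fetch(cursor) would merge {"page": {"cursor": cursor}} into the request body whenever cursor is not None.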
Bad Request
API error response.
{
"errors": [
"Bad Request"
]
}
Not Authorized
API error response.
{
"errors": [
"Bad Request"
]
}
Too many requests
API error response.
{
"errors": [
"Bad Request"
]
}
# Curl command
curl -X POST "https://api.datadoghq.com/api/v2/logs/analytics/aggregate" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
-d @- << EOF
{
"compute": [
{
"aggregation": "count",
"interval": "5m",
"type": "timeseries"
}
],
"filter": {
"from": "now-15m",
"indexes": [
"main"
],
"query": "*",
"to": "now"
}
}
EOF
# Curl command
curl -X POST "https://api.datadoghq.com/api/v2/logs/analytics/aggregate" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
-d @- << EOF
{
"compute": [
{
"aggregation": "count",
"interval": "5m",
"type": "timeseries"
}
],
"filter": {
"from": "now-15m",
"indexes": [
"main"
],
"query": "*",
"to": "now"
},
"group_by": [
{
"facet": "host",
"missing": "miss",
"sort": {
"type": "measure",
"order": "asc",
"aggregation": "pc90",
"metric": "@duration"
}
}
]
}
EOF
# Curl command
curl -X POST "https://api.datadoghq.com/api/v2/logs/analytics/aggregate" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
-d @- << EOF
{
"filter": {
"from": "now-15m",
"indexes": [
"main"
],
"query": "*",
"to": "now"
}
}
EOF
// Aggregate compute events returns "OK" response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)
func main() {
body := datadogV2.LogsAggregateRequest{
Compute: []datadogV2.LogsCompute{
{
Aggregation: datadogV2.LOGSAGGREGATIONFUNCTION_COUNT,
Interval: datadog.PtrString("5m"),
Type: datadogV2.LOGSCOMPUTETYPE_TIMESERIES.Ptr(),
},
},
Filter: &datadogV2.LogsQueryFilter{
From: datadog.PtrString("now-15m"),
Indexes: []string{
"main",
},
Query: datadog.PtrString("*"),
To: datadog.PtrString("now"),
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV2.NewLogsApi(apiClient)
resp, r, err := api.AggregateLogs(ctx, body)
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.AggregateLogs`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.AggregateLogs`:\n%s\n", responseContent)
}
// Aggregate compute events with group by returns "OK" response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)
func main() {
body := datadogV2.LogsAggregateRequest{
Compute: []datadogV2.LogsCompute{
{
Aggregation: datadogV2.LOGSAGGREGATIONFUNCTION_COUNT,
Interval: datadog.PtrString("5m"),
Type: datadogV2.LOGSCOMPUTETYPE_TIMESERIES.Ptr(),
},
},
Filter: &datadogV2.LogsQueryFilter{
From: datadog.PtrString("now-15m"),
Indexes: []string{
"main",
},
Query: datadog.PtrString("*"),
To: datadog.PtrString("now"),
},
GroupBy: []datadogV2.LogsGroupBy{
{
Facet: "host",
Missing: &datadogV2.LogsGroupByMissing{
LogsGroupByMissingString: datadog.PtrString("miss")},
Sort: &datadogV2.LogsAggregateSort{
Type: datadogV2.LOGSAGGREGATESORTTYPE_MEASURE.Ptr(),
Order: datadogV2.LOGSSORTORDER_ASCENDING.Ptr(),
Aggregation: datadogV2.LOGSAGGREGATIONFUNCTION_PERCENTILE_90.Ptr(),
Metric: datadog.PtrString("@duration"),
},
Total: &datadogV2.LogsGroupByTotal{
LogsGroupByTotalString: datadog.PtrString("recall")},
},
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV2.NewLogsApi(apiClient)
resp, r, err := api.AggregateLogs(ctx, body)
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.AggregateLogs`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.AggregateLogs`:\n%s\n", responseContent)
}
// Aggregate events returns "OK" response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)
func main() {
body := datadogV2.LogsAggregateRequest{
Filter: &datadogV2.LogsQueryFilter{
From: datadog.PtrString("now-15m"),
Indexes: []string{
"main",
},
Query: datadog.PtrString("*"),
To: datadog.PtrString("now"),
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV2.NewLogsApi(apiClient)
resp, r, err := api.AggregateLogs(ctx, body)
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.AggregateLogs`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.AggregateLogs`:\n%s\n", responseContent)
}
First install the library and its dependencies, then save the example to main.go
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" go run "main.go"
// Aggregate compute events returns "OK" response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.LogsApi;
import com.datadog.api.client.v2.model.LogsAggregateRequest;
import com.datadog.api.client.v2.model.LogsAggregateResponse;
import com.datadog.api.client.v2.model.LogsAggregationFunction;
import com.datadog.api.client.v2.model.LogsCompute;
import com.datadog.api.client.v2.model.LogsComputeType;
import com.datadog.api.client.v2.model.LogsQueryFilter;
import java.util.Collections;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
LogsAggregateRequest body =
new LogsAggregateRequest()
.compute(
Collections.singletonList(
new LogsCompute()
.aggregation(LogsAggregationFunction.COUNT)
.interval("5m")
.type(LogsComputeType.TIMESERIES)))
.filter(
new LogsQueryFilter()
.from("now-15m")
.indexes(Collections.singletonList("main"))
.query("*")
.to("now"));
try {
LogsAggregateResponse result = apiInstance.aggregateLogs(body);
System.out.println(result);
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#aggregateLogs");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
// Aggregate compute events with group by returns "OK" response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.LogsApi;
import com.datadog.api.client.v2.model.LogsAggregateRequest;
import com.datadog.api.client.v2.model.LogsAggregateResponse;
import com.datadog.api.client.v2.model.LogsAggregateSort;
import com.datadog.api.client.v2.model.LogsAggregateSortType;
import com.datadog.api.client.v2.model.LogsAggregationFunction;
import com.datadog.api.client.v2.model.LogsCompute;
import com.datadog.api.client.v2.model.LogsComputeType;
import com.datadog.api.client.v2.model.LogsGroupBy;
import com.datadog.api.client.v2.model.LogsGroupByMissing;
import com.datadog.api.client.v2.model.LogsGroupByTotal;
import com.datadog.api.client.v2.model.LogsQueryFilter;
import com.datadog.api.client.v2.model.LogsSortOrder;
import java.util.Collections;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
LogsAggregateRequest body =
new LogsAggregateRequest()
.compute(
Collections.singletonList(
new LogsCompute()
.aggregation(LogsAggregationFunction.COUNT)
.interval("5m")
.type(LogsComputeType.TIMESERIES)))
.filter(
new LogsQueryFilter()
.from("now-15m")
.indexes(Collections.singletonList("main"))
.query("*")
.to("now"))
.groupBy(
Collections.singletonList(
new LogsGroupBy()
.facet("host")
.missing(new LogsGroupByMissing("miss"))
.sort(
new LogsAggregateSort()
.type(LogsAggregateSortType.MEASURE)
.order(LogsSortOrder.ASCENDING)
.aggregation(LogsAggregationFunction.PERCENTILE_90)
.metric("@duration"))
.total(new LogsGroupByTotal("recall"))));
try {
LogsAggregateResponse result = apiInstance.aggregateLogs(body);
System.out.println(result);
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#aggregateLogs");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
// Aggregate events returns "OK" response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.LogsApi;
import com.datadog.api.client.v2.model.LogsAggregateRequest;
import com.datadog.api.client.v2.model.LogsAggregateResponse;
import com.datadog.api.client.v2.model.LogsQueryFilter;
import java.util.Collections;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
LogsAggregateRequest body =
new LogsAggregateRequest()
.filter(
new LogsQueryFilter()
.from("now-15m")
.indexes(Collections.singletonList("main"))
.query("*")
.to("now"));
try {
LogsAggregateResponse result = apiInstance.aggregateLogs(body);
System.out.println(result);
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#aggregateLogs");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
First install the library and its dependencies, then save the example to Example.java
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" java "Example.java"
"""
Aggregate compute events returns "OK" response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_api import LogsApi
from datadog_api_client.v2.model.logs_aggregate_request import LogsAggregateRequest
from datadog_api_client.v2.model.logs_aggregation_function import LogsAggregationFunction
from datadog_api_client.v2.model.logs_compute import LogsCompute
from datadog_api_client.v2.model.logs_compute_type import LogsComputeType
from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter
body = LogsAggregateRequest(
compute=[
LogsCompute(
aggregation=LogsAggregationFunction.COUNT,
interval="5m",
type=LogsComputeType.TIMESERIES,
),
],
filter=LogsQueryFilter(
_from="now-15m",
indexes=[
"main",
],
query="*",
to="now",
),
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.aggregate_logs(body=body)
print(response)
"""
Aggregate compute events with group by returns "OK" response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_api import LogsApi
from datadog_api_client.v2.model.logs_aggregate_request import LogsAggregateRequest
from datadog_api_client.v2.model.logs_aggregate_sort import LogsAggregateSort
from datadog_api_client.v2.model.logs_aggregate_sort_type import LogsAggregateSortType
from datadog_api_client.v2.model.logs_aggregation_function import LogsAggregationFunction
from datadog_api_client.v2.model.logs_compute import LogsCompute
from datadog_api_client.v2.model.logs_compute_type import LogsComputeType
from datadog_api_client.v2.model.logs_group_by import LogsGroupBy
from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter
from datadog_api_client.v2.model.logs_sort_order import LogsSortOrder
body = LogsAggregateRequest(
compute=[
LogsCompute(
aggregation=LogsAggregationFunction.COUNT,
interval="5m",
type=LogsComputeType.TIMESERIES,
),
],
filter=LogsQueryFilter(
_from="now-15m",
indexes=[
"main",
],
query="*",
to="now",
),
group_by=[
LogsGroupBy(
facet="host",
missing="miss",
sort=LogsAggregateSort(
type=LogsAggregateSortType.MEASURE,
order=LogsSortOrder.ASCENDING,
aggregation=LogsAggregationFunction.PERCENTILE_90,
metric="@duration",
),
total="recall",
),
],
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.aggregate_logs(body=body)
print(response)
"""
Aggregate events returns "OK" response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_api import LogsApi
from datadog_api_client.v2.model.logs_aggregate_request import LogsAggregateRequest
from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter
body = LogsAggregateRequest(
filter=LogsQueryFilter(
_from="now-15m",
indexes=[
"main",
],
query="*",
to="now",
),
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.aggregate_logs(body=body)
print(response)
First install the library and its dependencies, then save the example to example.py
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" python3 "example.py"
# Aggregate compute events returns "OK" response
require "datadog_api_client"
api_instance = DatadogAPIClient::V2::LogsAPI.new
body = DatadogAPIClient::V2::LogsAggregateRequest.new({
compute: [
DatadogAPIClient::V2::LogsCompute.new({
aggregation: DatadogAPIClient::V2::LogsAggregationFunction::COUNT,
interval: "5m",
type: DatadogAPIClient::V2::LogsComputeType::TIMESERIES,
}),
],
filter: DatadogAPIClient::V2::LogsQueryFilter.new({
from: "now-15m",
indexes: [
"main",
],
query: "*",
to: "now",
}),
})
p api_instance.aggregate_logs(body)
# Aggregate compute events with group by returns "OK" response
require "datadog_api_client"
api_instance = DatadogAPIClient::V2::LogsAPI.new
body = DatadogAPIClient::V2::LogsAggregateRequest.new({
compute: [
DatadogAPIClient::V2::LogsCompute.new({
aggregation: DatadogAPIClient::V2::LogsAggregationFunction::COUNT,
interval: "5m",
type: DatadogAPIClient::V2::LogsComputeType::TIMESERIES,
}),
],
filter: DatadogAPIClient::V2::LogsQueryFilter.new({
from: "now-15m",
indexes: [
"main",
],
query: "*",
to: "now",
}),
group_by: [
DatadogAPIClient::V2::LogsGroupBy.new({
facet: "host",
missing: "miss",
sort: DatadogAPIClient::V2::LogsAggregateSort.new({
type: DatadogAPIClient::V2::LogsAggregateSortType::MEASURE,
order: DatadogAPIClient::V2::LogsSortOrder::ASCENDING,
aggregation: DatadogAPIClient::V2::LogsAggregationFunction::PERCENTILE_90,
metric: "@duration",
}),
total: "recall",
}),
],
})
p api_instance.aggregate_logs(body)
# Aggregate events returns "OK" response
require "datadog_api_client"
api_instance = DatadogAPIClient::V2::LogsAPI.new
body = DatadogAPIClient::V2::LogsAggregateRequest.new({
filter: DatadogAPIClient::V2::LogsQueryFilter.new({
from: "now-15m",
indexes: [
"main",
],
query: "*",
to: "now",
}),
})
p api_instance.aggregate_logs(body)
First install the library and its dependencies, then save the example to example.rb
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" ruby "example.rb"
// Aggregate compute events returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_logs::LogsAPI;
use datadog_api_client::datadogV2::model::LogsAggregateRequest;
use datadog_api_client::datadogV2::model::LogsAggregationFunction;
use datadog_api_client::datadogV2::model::LogsCompute;
use datadog_api_client::datadogV2::model::LogsComputeType;
use datadog_api_client::datadogV2::model::LogsQueryFilter;
#[tokio::main]
async fn main() {
let body = LogsAggregateRequest::new()
.compute(vec![LogsCompute::new(LogsAggregationFunction::COUNT)
.interval("5m".to_string())
.type_(LogsComputeType::TIMESERIES)])
.filter(
LogsQueryFilter::new()
.from("now-15m".to_string())
.indexes(vec!["main".to_string()])
.query("*".to_string())
.to("now".to_string()),
);
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api.aggregate_logs(body).await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
// Aggregate compute events with group by returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_logs::LogsAPI;
use datadog_api_client::datadogV2::model::LogsAggregateRequest;
use datadog_api_client::datadogV2::model::LogsAggregateSort;
use datadog_api_client::datadogV2::model::LogsAggregateSortType;
use datadog_api_client::datadogV2::model::LogsAggregationFunction;
use datadog_api_client::datadogV2::model::LogsCompute;
use datadog_api_client::datadogV2::model::LogsComputeType;
use datadog_api_client::datadogV2::model::LogsGroupBy;
use datadog_api_client::datadogV2::model::LogsGroupByMissing;
use datadog_api_client::datadogV2::model::LogsGroupByTotal;
use datadog_api_client::datadogV2::model::LogsQueryFilter;
use datadog_api_client::datadogV2::model::LogsSortOrder;
#[tokio::main]
async fn main() {
let body = LogsAggregateRequest::new()
.compute(vec![LogsCompute::new(LogsAggregationFunction::COUNT)
.interval("5m".to_string())
.type_(LogsComputeType::TIMESERIES)])
.filter(
LogsQueryFilter::new()
.from("now-15m".to_string())
.indexes(vec!["main".to_string()])
.query("*".to_string())
.to("now".to_string()),
)
.group_by(vec![LogsGroupBy::new("host".to_string())
.missing(LogsGroupByMissing::LogsGroupByMissingString(
"miss".to_string(),
))
.sort(
LogsAggregateSort::new()
.aggregation(LogsAggregationFunction::PERCENTILE_90)
.metric("@duration".to_string())
.order(LogsSortOrder::ASCENDING)
.type_(LogsAggregateSortType::MEASURE),
)
.total(LogsGroupByTotal::LogsGroupByTotalString(
"recall".to_string(),
))]);
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api.aggregate_logs(body).await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
// Aggregate events returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_logs::LogsAPI;
use datadog_api_client::datadogV2::model::LogsAggregateRequest;
use datadog_api_client::datadogV2::model::LogsQueryFilter;
#[tokio::main]
async fn main() {
let body = LogsAggregateRequest::new().filter(
LogsQueryFilter::new()
.from("now-15m".to_string())
.indexes(vec!["main".to_string()])
.query("*".to_string())
.to("now".to_string()),
);
let configuration = datadog::Configuration::new();
let api = LogsAPI::with_config(configuration);
let resp = api.aggregate_logs(body).await;
if let Ok(value) = resp {
println!("{:#?}", value);
} else {
println!("{:#?}", resp.unwrap_err());
}
}
First install the library and its dependencies, then save the example to src/main.rs
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" cargo run
/**
* Aggregate compute events returns "OK" response
*/
import { client, v2 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v2.LogsApi(configuration);
const params: v2.LogsApiAggregateLogsRequest = {
body: {
compute: [
{
aggregation: "count",
interval: "5m",
type: "timeseries",
},
],
filter: {
from: "now-15m",
indexes: ["main"],
query: "*",
to: "now",
},
},
};
apiInstance
.aggregateLogs(params)
.then((data: v2.LogsAggregateResponse) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
/**
* Aggregate compute events with group by returns "OK" response
*/
import { client, v2 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v2.LogsApi(configuration);
const params: v2.LogsApiAggregateLogsRequest = {
body: {
compute: [
{
aggregation: "count",
interval: "5m",
type: "timeseries",
},
],
filter: {
from: "now-15m",
indexes: ["main"],
query: "*",
to: "now",
},
groupBy: [
{
facet: "host",
missing: "miss",
sort: {
type: "measure",
order: "asc",
aggregation: "pc90",
metric: "@duration",
},
total: "recall",
},
],
},
};
apiInstance
.aggregateLogs(params)
.then((data: v2.LogsAggregateResponse) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
/**
* Aggregate events returns "OK" response
*/
import { client, v2 } from "@datadog/datadog-api-client";
const configuration = client.createConfiguration();
const apiInstance = new v2.LogsApi(configuration);
const params: v2.LogsApiAggregateLogsRequest = {
body: {
filter: {
from: "now-15m",
indexes: ["main"],
query: "*",
to: "now",
},
},
};
apiInstance
.aggregateLogs(params)
.then((data: v2.LogsAggregateResponse) => {
console.log(
"API called successfully. Returned data: " + JSON.stringify(data)
);
})
.catch((error: any) => console.error(error));
First install the library and its dependencies, then save the example to example.ts
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" tsc "example.ts"
POST https://api.ap1.datadoghq.com/api/v1/logs-queries/list
POST https://api.datadoghq.eu/api/v1/logs-queries/list
POST https://api.ddog-gov.com/api/v1/logs-queries/list
POST https://api.datadoghq.com/api/v1/logs-queries/list
POST https://api.us3.datadoghq.com/api/v1/logs-queries/list
POST https://api.us5.datadoghq.com/api/v1/logs-queries/list
The list endpoint returns logs that match a log search query. Results are paginated.
If you are considering archiving logs for your organization, use Datadog's archive capabilities instead of the log list API. See the Datadog Logs Archive documentation.
This endpoint requires the logs_read_data permission.
Logs filter
Field
Type
Description
index
string
The log index on which the request is performed. For multi-index organizations, the default is all live indexes. Historical indexes of rehydrated logs must be specified.
limit
int32
Number of logs returned in the response.
query
string
The search query - following the log search syntax.
sort
enum
Time-ascending asc or time-descending desc results.
Allowed enum values: asc, desc
startAt
string
Hash identifier of the first log to return in the list, available in a log's id attribute. This parameter is used for the pagination feature.
Note: This parameter is ignored if the corresponding log is out of the scope of the specified time window.
time [required]
object
Timeframe to retrieve the logs from.
from [required]
date-time
Minimum timestamp for requested logs.
timezone
string
Timezone can be specified both as an offset (for example "UTC+03:00") or a regional zone (for example "Europe/Paris").
to [required]
date-time
Maximum timestamp for requested logs.
{
"index": "main",
"query": "host:Test*",
"sort": "asc",
"time": {
"from": "2021-11-11T10:11:11+00:00",
"timezone": "Europe/Paris",
"to": "2021-11-11T11:11:11+00:00"
}
}
OK
Response object with all logs matching the request and pagination information.
Field
Type
Description
logs
[object]
Array of logs matching the request, and the nextLogId if sent.
content
object
JSON object containing all log attributes and their associated values.
attributes
object
JSON object of attributes from your log.
host
string
Name of the machine from where the logs are being sent.
message
string
The message reserved attribute of your log. By default, Datadog ingests the value of the message attribute as the body of the log entry. That value is then highlighted and displayed in the Logstream, where it is indexed for full text search.
service
string
The name of the application or service generating the log events. It is used to switch from Logs to APM, so make sure you define the same value when you use both products.
tags
[string]
Array of tags associated with your log.
timestamp
date-time
Timestamp of your log.
id
string
ID of the Log.
nextLogId
string
Hash identifier of the next log to return in the list. This parameter is used for the pagination feature.
status
string
Status of the response.
{
"logs": [
{
"content": {
"attributes": {
"customAttribute": 123,
"duration": 2345
},
"host": "i-0123",
"message": "Host connected to remote",
"service": "agent",
"tags": [
"team:A"
],
"timestamp": "2020-05-26T13:36:14Z"
},
"id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA"
}
],
"nextLogId": "string",
"status": "string"
}
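startAt and nextLogId together implement cursor pagination: pass the nextLogId of one response as the startAt of the next request, and stop when nextLogId is absent. A minimal sketch of that loop, where fetch_page is a hypothetical stand-in for the actual API call (it receives the cursor to send as startAt and returns the parsed response):

```python
from typing import Callable, Optional


def collect_all_logs(fetch_page: Callable[[Optional[str]], dict]) -> list:
    """Drain a cursor-paginated log list by chaining nextLogId into startAt."""
    logs, cursor = [], None
    while True:
        page = fetch_page(cursor)
        logs.extend(page.get("logs", []))
        cursor = page.get("nextLogId")
        if not cursor:  # last page: nextLogId is null or absent
            return logs


# Hypothetical two-page response sequence for illustration:
pages = {
    None: {"logs": [{"id": "log-1"}, {"id": "log-2"}], "nextLogId": "log-2"},
    "log-2": {"logs": [{"id": "log-3"}], "nextLogId": None},
}
print(collect_all_logs(lambda cursor: pages[cursor]))
# → [{'id': 'log-1'}, {'id': 'log-2'}, {'id': 'log-3'}]
```

Note that, per the startAt description above, the cursor is ignored if the corresponding log falls outside the requested time window, so keep the time filter fixed across pages.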
Bad Request
Response returned by the Logs API when errors occur.
Field
Type
Description
error
object
Error returned by the Logs API
code
string
Code identifying the error
details
[object]
Additional error details
message
string
Error message
{
"error": {
"code": "string",
"details": [],
"message": "string"
}
}
Authentication error
Error response object.
{
"errors": [
"Bad Request"
]
}
Too many requests
Error response object.
{
"errors": [
"Bad Request"
]
}
# Curl command
curl -X POST "https://api.datadoghq.com/api/v1/logs-queries/list" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
-d @- << EOF
{
"index": "main",
"query": "host:Test*",
"sort": "asc",
"time": {
"from": "2021-11-11T10:11:11+00:00",
"timezone": "Europe/Paris",
"to": "2021-11-11T11:11:11+00:00"
}
}
EOF
// Search test logs returns "OK" response
package main
import (
"context"
"encoding/json"
"fmt"
"os"
"time"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)
func main() {
body := datadogV1.LogsListRequest{
Index: datadog.PtrString("main"),
Query: datadog.PtrString("host:Test*"),
Sort: datadogV1.LOGSSORT_TIME_ASCENDING.Ptr(),
Time: datadogV1.LogsListRequestTime{
From: time.Now().Add(time.Hour * -1),
Timezone: datadog.PtrString("Europe/Paris"),
To: time.Now(),
},
}
ctx := datadog.NewDefaultContext(context.Background())
configuration := datadog.NewConfiguration()
apiClient := datadog.NewAPIClient(configuration)
api := datadogV1.NewLogsApi(apiClient)
resp, r, err := api.ListLogs(ctx, body)
if err != nil {
fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.ListLogs`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
}
responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `LogsApi.ListLogs`:\n%s\n", responseContent)
}
First install the library and its dependencies, then save the example to main.go
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" go run "main.go"
// Search test logs returns "OK" response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.LogsApi;
import com.datadog.api.client.v1.model.LogsListRequest;
import com.datadog.api.client.v1.model.LogsListRequestTime;
import com.datadog.api.client.v1.model.LogsListResponse;
import com.datadog.api.client.v1.model.LogsSort;
import java.time.OffsetDateTime;
public class Example {
public static void main(String[] args) {
ApiClient defaultClient = ApiClient.getDefaultApiClient();
LogsApi apiInstance = new LogsApi(defaultClient);
LogsListRequest body =
new LogsListRequest()
.index("main")
.query("host:Test*")
.sort(LogsSort.TIME_ASCENDING)
.time(
new LogsListRequestTime()
.from(OffsetDateTime.now().plusHours(-1))
.timezone("Europe/Paris")
.to(OffsetDateTime.now()));
try {
LogsListResponse result = apiInstance.listLogs(body);
System.out.println(result);
} catch (ApiException e) {
System.err.println("Exception when calling LogsApi#listLogs");
System.err.println("Status code: " + e.getCode());
System.err.println("Reason: " + e.getResponseBody());
System.err.println("Response headers: " + e.getResponseHeaders());
e.printStackTrace();
}
}
}
First install the library and its dependencies, then save the example to Example.java
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" java "Example.java"
"""
Search test logs returns "OK" response
"""
from datetime import datetime
from dateutil.relativedelta import relativedelta
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_api import LogsApi
from datadog_api_client.v1.model.logs_list_request import LogsListRequest
from datadog_api_client.v1.model.logs_list_request_time import LogsListRequestTime
from datadog_api_client.v1.model.logs_sort import LogsSort
body = LogsListRequest(
index="main",
query="host:Test*",
sort=LogsSort.TIME_ASCENDING,
time=LogsListRequestTime(
_from=(datetime.now() + relativedelta(hours=-1)),
timezone="Europe/Paris",
to=datetime.now(),
),
)
configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = LogsApi(api_client)
response = api_instance.list_logs(body=body)
print(response)
First install the library and its dependencies, then save the example to example.py
and run the following command:
DD_SITE="datadoghq.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" python3 "example.py"
# Search test logs returns "OK" response