You can use account-level log subscriptions in your AWS environment to automatically forward all of your CloudWatch logs to Datadog. With an account-level log subscription, you don’t need to manually configure log forwarding when you have a new log source, or when AWS releases a new service. You can also define your own selection criteria or filter pattern, for more control over which logs are forwarded.
There are two ways to create an account-level log subscription: with CloudFormation or with a manual setup. For the simplest setup, use CloudFormation to create an Amazon Data Firehose and the associated resources in each of your selected regions.
To set up the account-level log subscription with CloudFormation:
1. Create a stack from the template at https://datadog-cloudformation-template.s3.amazonaws.com/aws_account_level_logs/main.yaml, choosing With new resources (standard).
2. Name the stack, for example datadog-account-level-logs-stack.
3. Enter a comma-separated list of regions (for example, us-east-1) corresponding to the regions to include for the account-level log subscription.
4. Check I acknowledge that AWS CloudFormation might create IAM resources with custom names, then create the stack.
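If you prefer to launch the template from the AWS CLI instead of the console, a minimal sketch is shown below. The Regions parameter key is an assumption; check the template's Parameters section for the exact key name. The --capabilities flag corresponds to acknowledging that the stack may create IAM resources with custom names.

# Sketch only: verify the parameter key name against the template before running
aws cloudformation create-stack \
  --stack-name datadog-account-level-logs-stack \
  --template-url https://datadog-cloudformation-template.s3.amazonaws.com/aws_account_level_logs/main.yaml \
  --capabilities CAPABILITY_NAMED_IAM \
  --parameters ParameterKey=Regions,ParameterValue=us-east-1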
To set up the account-level subscription manually with the Datadog Forwarder Lambda function, first grant CloudWatch Logs permission to invoke the Forwarder. Run the following command, replacing <REGION> with the region containing your Datadog Forwarder Lambda function and <ACCOUNT_ID> with your 12-digit AWS account ID (excluding dashes):

aws lambda add-permission \
    --region "<REGION>" \
    --function-name "forwarder-function" \
    --statement-id "forwarder-function" \
    --principal "logs.amazonaws.com" \
    --action "lambda:InvokeFunction" \
    --source-arn "arn:aws:logs:<REGION>:<ACCOUNT_ID>:log-group:*" \
    --source-account "<ACCOUNT_ID>"
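To confirm that the permission statement was added, one option is to read back the function's resource-based policy. This is only a verification sketch and assumes the same forwarder-function name and region used above.

# Print the resource-based policy attached to the Forwarder Lambda
aws lambda get-policy \
    --function-name "forwarder-function" \
    --region "<REGION>"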
Next, create the account-level subscription filter policy. In the example below, all log events are streamed to the Forwarder because the filter pattern is empty. Replace <FORWARDER_ARN> with the ARN of the Datadog Forwarder Lambda function:

aws logs put-account-policy \
    --policy-name "ExamplePolicyLambda" \
    --policy-type "SUBSCRIPTION_FILTER_POLICY" \
    --policy-document '{"DestinationArn":"<FORWARDER_ARN>", "FilterPattern": "", "Distribution": "Random"}' \
    --scope "ALL"

Note: To exclude certain log groups (for example, LogGroupToExclude1 and LogGroupToExclude2) from log forwarding, use the --selection-criteria option as outlined in the command reference.
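For illustration, a version of the policy that skips the two example log groups might look like the following sketch. The selection-criteria expression follows the AWS CLI put-account-policy reference, and the log group names are placeholders.

# Sketch: exclude specific log groups from the account-level subscription
aws logs put-account-policy \
    --policy-name "ExamplePolicyLambda" \
    --policy-type "SUBSCRIPTION_FILTER_POLICY" \
    --policy-document '{"DestinationArn":"<FORWARDER_ARN>", "FilterPattern": "", "Distribution": "Random"}' \
    --selection-criteria 'LogGroupName NOT IN ["LogGroupToExclude1", "LogGroupToExclude2"]' \
    --scope "ALL"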
Alternatively, you can set up the account-level subscription manually with Amazon Data Firehose. The following steps guide you through creating an S3 bucket and an IAM role. This role grants Amazon Data Firehose permission to put data into your Amazon S3 bucket in case of delivery failures.
Run the following command to create the bucket, replacing <BUCKET_NAME> with the name for your S3 bucket and <REGION> with the region for your S3 bucket:

aws s3api create-bucket \
    --bucket <BUCKET_NAME> \
    --create-bucket-configuration LocationConstraint=<REGION>
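One caveat worth noting: if you create the bucket in us-east-1, the S3 API does not accept a LocationConstraint for that region, so the configuration flag is omitted:

# us-east-1 is the default location and must not be passed as a LocationConstraint
aws s3api create-bucket --bucket <BUCKET_NAME>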
Create a TrustPolicyForFirehose.json file with the following statement:

{
"Statement": {
"Effect": "Allow",
"Principal": { "Service": "firehose.amazonaws.com" },
"Action": "sts:AssumeRole"
}
}
Then create the IAM role, specifying the trust policy file:

aws iam create-role \
--role-name FirehosetoS3Role \
--assume-role-policy-document file://./TrustPolicyForFirehose.json
Create a PermissionsForFirehose.json file with the following statement, replacing <BUCKET_NAME> with the name of your S3 bucket:

{
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:AbortMultipartUpload",
"s3:GetBucketLocation",
"s3:GetObject",
"s3:ListBucket",
"s3:ListBucketMultipartUploads",
"s3:PutObject" ],
"Resource": [
"arn:aws:s3:::<BUCKET_NAME>",
"arn:aws:s3:::<BUCKET_NAME>/*" ]
}
]
}
Then associate the permissions policy with the role:

aws iam put-role-policy \
--role-name FirehosetoS3Role \
--policy-name Permissions-Policy-For-Firehose \
--policy-document file://./PermissionsForFirehose.json
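As an optional verification sketch, you can read the inline policy back to confirm it is attached to the role; the names match those used above.

# Display the inline policy attached to FirehosetoS3Role
aws iam get-role-policy \
    --role-name FirehosetoS3Role \
    --policy-name Permissions-Policy-For-Firehose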
The following steps guide you through creating and configuring an Amazon Data Firehose delivery stream.
In the Amazon Data Firehose console, create a Firehose stream with the following settings:
1. For the source, choose Amazon Kinesis Data Streams if your logs are coming from a Kinesis Data Stream, or Direct PUT if your logs are coming directly from a CloudWatch log group.
2. For the destination, choose Datadog.
3. If you chose Amazon Kinesis Data Streams as the source, select your Kinesis data stream under Source settings.
4. Use GZIP content encoding, and set the buffer size to 2 MiB if the logs are single-line messages.
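A later step needs the ARN of this delivery stream. If you did not note it at creation time, one way to retrieve it is sketched below; <DELIVERY_STREAM_NAME> is the name you gave the stream.

# Print the delivery stream ARN for use in the subscription filter policy
aws firehose describe-delivery-stream \
    --delivery-stream-name "<DELIVERY_STREAM_NAME>" \
    --query 'DeliveryStreamDescription.DeliveryStreamARN' \
    --output text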
The following steps guide you through creating an IAM role for CloudWatch Logs. This role grants CloudWatch Logs permission to put data into your Firehose delivery stream.

Create a TrustPolicyForCWL.json file with the following statement, replacing <ACCOUNT_ID> with your 12-digit AWS account ID (excluding dashes) and <REGION> with the region of your CloudWatch logs:

{
"Statement": {
"Effect": "Allow",
"Principal": { "Service": "logs.amazonaws.com" },
"Action": "sts:AssumeRole",
"Condition": {
"StringLike": {
"aws:SourceArn": "arn:aws:logs:<REGION>:<ACCOUNT_ID>:*"
}
}
}
}
Then create the IAM role, specifying the trust policy file:

aws iam create-role \
--role-name CWLtoKinesisFirehoseRole \
--assume-role-policy-document file://./TrustPolicyForCWL.json
Note: The returned Role.Arn value is used in a later step.
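If you did not record the Role.Arn value from the create-role output, a quick way to retrieve it again is sketched below.

# Look up the ARN of the CloudWatch Logs role created above
aws iam get-role \
    --role-name CWLtoKinesisFirehoseRole \
    --query 'Role.Arn' \
    --output text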
Create a PermissionsForCWL.json file with the following statement, replacing <REGION> with the region containing your Amazon Data Firehose delivery stream, <ACCOUNT_ID> with your 12-digit AWS account ID (excluding dashes), and <DELIVERY_STREAM_NAME> with the name of your delivery stream:

{
"Statement":[
{
"Effect":"Allow",
"Action":["firehose:PutRecord"],
"Resource":[
"arn:aws:firehose:<REGION>:<ACCOUNT_ID>:deliverystream/<DELIVERY_STREAM_NAME>"]
}
]
}
Then associate the permissions policy with the role:

aws iam put-role-policy \
--role-name CWLtoKinesisFirehoseRole \
--policy-name Permissions-Policy-For-CWL \
--policy-document file://./PermissionsForCWL.json
Before completing this step, the Amazon Data Firehose delivery stream must be in the Active state.
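You can check the stream state from the CLI with a sketch like the one below; the API reports the state as ACTIVE (the console displays it as Active).

# Wait until this prints ACTIVE before creating the subscription filter policy
aws firehose describe-delivery-stream \
    --delivery-stream-name "<DELIVERY_STREAM_NAME>" \
    --query 'DeliveryStreamDescription.DeliveryStreamStatus' \
    --output text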
Create the account-level subscription filter policy, replacing <POLICY_NAME> with a name for the subscription filter policy, <CLOUDWATCH_LOGS_ROLE> with the ARN of the CloudWatch Logs role, and <DELIVERY_STREAM_ARN> with the ARN of the Amazon Data Firehose delivery stream:

aws logs put-account-policy \
--policy-name "<POLICY_NAME>" \
--policy-type "SUBSCRIPTION_FILTER_POLICY" \
--policy-document '{"RoleArn":"<CLOUDWATCH_LOGS_ROLE>", "DestinationArn":"<DELIVERY_STREAM_ARN>", "FilterPattern": "", "Distribution": "Random"}' \
--scope "ALL"
Note: To exclude certain log groups from log forwarding, use the --selection-criteria
option as outlined in the command reference.
To validate your setup, go to the Log Explorer and enter the search query @aws.firehose.arn:"<FIREHOSE_ARN>", replacing <FIREHOSE_ARN> with the ARN of the log-streaming Firehose, to view logs forwarded by the Amazon Data Firehose.