Logs may contain sensitive data, and should be handled carefully. If you are ingesting sensitive data into Datadog, consider the following:
Controlling all of your data can be challenging, especially on a large and collaborative platform. This guide walks you through the different options to discover and manage sensitive data that is ingested into Datadog.
Sensitive Data Scanner is a stream-based, pattern matching service that you can use to identify, tag, and optionally redact or hash sensitive data. With this capability, your security and compliance teams can introduce a line of defense in preventing sensitive data from leaking outside your organization. Sensitive Data Scanner is available in your Organization Settings.
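The stream-based, pattern-matching behavior can be pictured as log lines matched against regular expressions and redacted in place. Here is a minimal illustrative sketch in Python; the email pattern, function name, and placeholder text are our own assumptions, not part of Sensitive Data Scanner:

```python
import re

# Hypothetical scanning rule: a pattern plus a redaction placeholder.
# Sensitive Data Scanner applies this kind of matching to the log stream.
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def redact(line: str, placeholder: str = "[REDACTED_EMAIL]") -> str:
    """Replace every pattern match in a log line with a placeholder."""
    return EMAIL_PATTERN.sub(placeholder, line)

print(redact("user alice@example.com logged in"))
```

In the product, rules like this are configured in Organization Settings rather than in code; the sketch only conveys the match-and-redact model.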
If you have already indexed logs that contain sensitive data, then follow these three steps:
When you set up or edit a scanner rule, the Action on Match section lets you apply the mask action to matched sensitive data. The mask action obfuscates the sensitive data, but users with the Data Scanner Unmask permission can de-obfuscate (unmask) and view the data in Datadog.
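To illustrate what masking (as opposed to outright deletion) looks like, here is a hedged sketch that hides all but the last four characters of a value; the actual mask format Sensitive Data Scanner applies may differ:

```python
def mask(value: str, keep: int = 4, mask_char: str = "X") -> str:
    """Illustrative mask: obfuscate all but the last `keep` characters.

    For unmasking to be possible, the original value must be retained
    elsewhere; this sketch only shows the obfuscation step.
    """
    if len(value) <= keep:
        return mask_char * len(value)
    return mask_char * (len(value) - keep) + value[-keep:]

print(mask("4111111111111111"))  # XXXXXXXXXXXX1111
```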
Notes:
To unmask sensitive data, navigate to the Summary page, click on a scanning rule, and then click on a log. If you have permission to see masked data, that data has an eye icon next to it. Click the eye icon to reveal the data. You can also use the Log Explorer to view your masked log data.
First, define a query that outlines the sensitive data. That query will return all logs with sensitive data.
Queries such as `version:x.y.z source:python status:debug` are likely to match that expectation. Refer to the Log Search Syntax documentation if you need more advanced operators (wildcards, Boolean operators, and so on).
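For instance, a sensitive outline query can combine those operators; in these hypothetical examples the facet values are placeholders:

```
source:python status:debug version:1.*
service:web-store AND (env:prod OR env:staging)
```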
This guide refers to this example query as the sensitive outline query.
Once sensitive data in logs is sent to the Datadog platform, it may exist in a number of places. As a result, check each of the following (ordered from most likely to have sensitive data to least likely):
Datadog Indexes are where logs are stored in Datadog until they age out according to index retention. Focus on indexes first, as the other locations are less likely to be a compliance concern. Check index filters and exclusion filters to see if logs with sensitive data are indexed.
Log Archives, which is where Datadog sends logs to be stored. Set up Archive Filters to see if your archive contains sensitive logs.
Metrics generated from logs, which store aggregated metrics. Sensitive data may have been discarded during aggregation. Check custom metrics filters to see if logs with sensitive data are processed.
Log monitor notifications, when they include log samples. Check specifically for monitors triggered during the period when sensitive data was being sent.
Live Tail, where logs are viewed in real time by your organization's users. There is no persistence beyond the 50 logs cached in browsers, and for broader queries, results can be heavily sampled.
Use out-of-the-box or custom rules to identify and redact other kinds of sensitive data still coming into your logs.
If you’re not using Sensitive Data Scanner, determine whether you want to exclude any new logs containing sensitive data from being indexed entirely. You’ll still need to address the logs containing sensitive data already indexed in Datadog.
If certain kinds of sensitive data are prohibited from leaving your environment and being ingested into Datadog, then add scrubbing rules at source collection.
If you have no way to change the loggers themselves, Datadog provides solutions to prevent compliance-sensitive data from being sent outside of your platform when using the Datadog Agent for log collection:
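As a sketch, scrubbing at collection time can be done with the Agent's `log_processing_rules` and a `mask_sequences` rule in a log collection configuration; the path, service name, and regex below are illustrative placeholders, so adapt them to your environment:

```yaml
logs:
  - type: file
    path: /var/log/myapp/app.log   # illustrative path
    service: myapp                 # placeholder service name
    source: python
    log_processing_rules:
      - type: mask_sequences
        name: mask_credit_cards
        replace_placeholder: "[masked_credit_card]"
        # illustrative pattern for 16-digit card numbers
        pattern: \b\d{4}(?:[\s-]?\d{4}){3}\b
```

Rules of type `exclude_at_match` can instead drop matching logs entirely before they leave the host.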
Similar scrubbing capabilities exist for the Serverless Forwarder.
Take the following steps according to your compliance requirements. You might not need to complete all steps.
This step makes logs with sensitive data, both logs that were already sent and logs that might keep flowing in, unqueryable in Datadog (Explorer, Dashboards, and Live Tail).
Use the Data Access configuration page and a sensitive outline query to define a restriction and apply it to roles in your organization. For example, `version:x.y.z source:python status:debug`. You can also restrict over a time period with the `@timestamp` attribute, for example `@timestamp:[1731597125165 TO 1731597125200]`.
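The bounds in the `@timestamp` range are Unix epoch milliseconds. A small sketch for producing them from human-readable times (the helper name is ours, not a Datadog API):

```python
from datetime import datetime, timezone

def to_epoch_millis(dt: datetime) -> int:
    """Convert a timezone-aware datetime to Unix epoch milliseconds."""
    return int(dt.timestamp() * 1000)

# Hypothetical one-second window during which sensitive data flowed in.
start = to_epoch_millis(datetime(2024, 11, 14, 15, 12, 5, tzinfo=timezone.utc))
end = to_epoch_millis(datetime(2024, 11, 14, 15, 12, 6, tzinfo=timezone.utc))
print(f"@timestamp:[{start} TO {end}]")
```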
Note: Using NOT in the sensitive outline query restricts users from seeing logs that match the query, while still allowing them to see logs that do not match it.
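With the example sensitive outline query above, a restriction that hides matching logs would look like:

```
NOT (version:x.y.z source:python status:debug)
```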
If you have to patch your archives to remove sensitive data, refer to the documentation on the format of archives generated by Datadog.
If you have specific compliance questions or need help, contact Datadog support. When you contact support, it is helpful to have the following information available:
Additional helpful documentation, links, and articles: