
Processing

Overview

To access the configuration panel, open the Logs menu on the left, then select the Configuration submenu.

The log configuration page gives you full control over how your logs are processed with Datadog Pipelines and Processors.

Pipelines and Processors can be applied to any type of log.

Therefore, you don't need to change how you log, and you don't need to deploy changes to any server-side processing rules. Everything happens, and can be configured, in the Datadog processing page.

Another benefit of implementing a log processing strategy is that you can enforce an attribute naming convention for your organization.

Log Processing

Integration logs

For integration logs, an Integration Pipeline is automatically installed that takes care of parsing your logs and adds the corresponding facets to your Logs Explorer. See the ELB logs example below:

Consult the current list of supported integrations.

Custom logs

Log formats can be completely custom, which is why you can define custom processing rules. With any log syntax, you can extract all your attributes and, when necessary, remap them to more global or canonical attributes.

For instance, with custom processing rules you can transform a raw, unstructured log into a structured one with extracted attributes.
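As an illustrative sketch, here is how a grok-style parsing rule conceptually turns a raw access log line into structured attributes. The regex and attribute names below are hypothetical examples; in Datadog you would define this extraction as a Grok Parser processor in a pipeline rather than in application code.

```python
import json
import re

# A raw, unstructured access log line (illustrative example).
raw_log = '127.0.0.1 - frank [13/Jul/2016:10:55:36 +0000] "GET /apache_pb.gif HTTP/1.0" 200 2326'

# A grok-style extraction expressed as a Python regex for illustration only;
# the named groups become the extracted attributes.
pattern = re.compile(
    r'(?P<network_client_ip>\S+) \S+ (?P<user>\S+) '
    r'\[(?P<date_access>[^\]]+)\] '
    r'"(?P<http_method>\S+) (?P<http_url>\S+) (?P<http_version>[^"]+)" '
    r'(?P<http_status_code>\d+) (?P<network_bytes_written>\d+)'
)

match = pattern.match(raw_log)
attributes = dict(match.groupdict())

# Cast numeric fields so they can be used in measures and filters.
attributes["http_status_code"] = int(attributes["http_status_code"])
attributes["network_bytes_written"] = int(attributes["network_bytes_written"])

print(json.dumps(attributes, indent=2))
```

The structured result is what makes faceted search and analytics possible on fields like the HTTP status code or the client IP.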

Consult the Pipelines documentation page to learn more on how to perform actions only on some subset of your logs with the Pipeline filters.

To discover the full list of Processors available, refer to the dedicated Processor documentation page.

To learn more about the parsing capabilities of the Datadog application, follow the parsing training guide. There are also parsing best practices and a parsing troubleshooting guide.

Reserved attributes

If your logs are formatted as JSON, be aware that some attributes are reserved for use by Datadog:

date attribute

By default, Datadog generates a timestamp and appends it in a date attribute when logs are received. However, if a JSON formatted log file includes one of the following attributes, Datadog interprets its value as the log's official date:

  • @timestamp
  • timestamp
  • _timestamp
  • Timestamp
  • eventTime
  • date
  • published_date
  • syslog.timestamp

You can also specify alternate attributes to use as the source of a log’s date by setting a log date remapper processor.

Note: Datadog rejects a log entry if its official date is more than 6 hours in the past.

The recognized date formats are ISO8601, UNIX (the milliseconds epoch format), and RFC3164.
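As a minimal sketch, the following JSON-formatted log carries its own date in the recognized `@timestamp` attribute, using ISO8601. The field values are illustrative; a log shaped like this would have its official date taken from `@timestamp` rather than the ingestion time.

```python
import json
from datetime import datetime, timezone

# A JSON log that supplies its own date via a recognized attribute
# (@timestamp, here in ISO8601). The timestamp value is an example.
log_entry = {
    "@timestamp": datetime(2019, 3, 12, 14, 30, 5, tzinfo=timezone.utc).isoformat(),
    "message": "User logged in",
}

print(json.dumps(log_entry))
```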

message attribute

By default, Datadog ingests the message value as the body of the log entry. That value is then highlighted and displayed in the logstream, where it is indexed for full text search.

status attribute

Each log entry may specify a status level, which is made available for faceted search within Datadog. If a JSON formatted log file includes one of the following attributes, Datadog interprets its value as the log's official status:

  • status
  • severity
  • level
  • syslog.severity

If you would like to remap a status existing in the status attribute, you can do so with the log status remapper.
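To see why remapping is useful, suppose an application emits a custom numeric severity in a hypothetical `sev` attribute instead of a named level. A log status remapper can point Datadog at such an attribute; the sketch below only illustrates the idea of the mapping (using syslog-style numeric severities), not Datadog's internal implementation.

```python
import json

# Illustrative mapping from syslog-style numeric severities to
# named statuses; the "sev" attribute name is a made-up example.
SYSLOG_SEVERITY_TO_STATUS = {
    0: "emergency", 1: "alert", 2: "critical", 3: "error",
    4: "warning", 5: "notice", 6: "info", 7: "debug",
}

raw = '{"sev": 3, "message": "Payment gateway timeout"}'
entry = json.loads(raw)

# Replace the custom numeric severity with a standard status value.
entry["status"] = SYSLOG_SEVERITY_TO_STATUS[entry.pop("sev")]

print(json.dumps(entry))
```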

host attribute

Using the Datadog Agent or the RFC5424 format automatically sets the host value on your logs. However, if a JSON formatted log file includes one of the following attributes, Datadog interprets its value as the log's host:

  • host
  • hostname
  • syslog.hostname

source attribute

If a JSON formatted log file includes the ddsource attribute, Datadog interprets its value as the log’s source. To use the same source names Datadog uses, see the Integration Pipeline Reference.

service attribute

Using the Datadog Agent or the RFC5424 format automatically sets the service value on your logs. However, if a JSON formatted log file includes one of the following attributes, Datadog interprets its value as the log's service:

  • service
  • syslog.appname

trace_id attribute

By default, Datadog tracers can automatically inject trace and span IDs in the logs. However, if a JSON formatted log includes one of the following attributes, Datadog interprets its value as the log's trace_id:

  • dd.trace_id
  • contextMap.dd.trace_id
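Putting the reserved attributes together, a fully self-describing JSON log might look like the sketch below. Every value is illustrative; the point is that a log shaped this way sets its own host, source, service, status, and trace ID without any extra pipeline configuration.

```python
import json

# A JSON log that populates several reserved attributes itself.
# All values are made-up examples for illustration.
log_entry = {
    "timestamp": "2019-03-12T14:30:05Z",   # official date
    "host": "i-0123456789abcdef0",         # host
    "ddsource": "python",                  # source
    "service": "payments-api",             # service
    "status": "info",                      # status
    "dd.trace_id": "1234567890123456789",  # trace correlation
    "message": "Charge created",           # log body
}

print(json.dumps(log_entry))
```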

Edit reserved attributes

You can control the global hostname, service, timestamp, and status mappings that are applied before the processing Pipelines. This is useful when logs are sent as JSON or from an external Agent.

To change the default values for each of the reserved attributes, go to the Configuration page and edit the Reserved Attribute mapping.

Further Reading