---
title: Log Collection
description: Collect logs from files with the OpenTelemetry Collector and send them to Datadog.
breadcrumbs: Docs > OpenTelemetry in Datadog > OpenTelemetry Configuration > Log Collection
---

# Log Collection

{% alert level="info" %}
The Datadog Agent logs pipeline is enabled by default in the Datadog Exporter as of v0.108.0. This can be a breaking change when upgrading if [`logs::dump_payloads`](https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/b52e760f184b77c6e1a9ccc5121ff7b88d2b8f75/exporter/datadogexporter/examples/collector.yaml#L456-L463) is in use, because that option is invalid while the Datadog Agent logs pipeline is enabled. To avoid the issue, remove the `logs::dump_payloads` config option or temporarily disable the `exporter.datadogexporter.UseLogsAgentExporter` feature gate.
{% /alert %}
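
If you need to keep `logs::dump_payloads` temporarily while upgrading, you can disable the feature gate at Collector startup. A minimal sketch, assuming a standard Collector binary named `otelcol-contrib` (the `-` prefix disables a gate):

```shell
# Disable the Datadog Agent logs pipeline, restoring the legacy exporter behavior
otelcol-contrib --config=config.yaml \
  --feature-gates=-exporter.datadogexporter.UseLogsAgentExporter
```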

## Overview{% #overview %}

{% image
   source="https://datadog-docs.imgix.net/images/opentelemetry/collector_exporter/log_collection.fac82a9987836f4208e9959813078603.png?auto=format"
   alt="An information log sent from OpenTelemetry" /%}

To collect logs from files, configure the [filelog receiver](https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/filelogreceiver) in your Collector configuration alongside the Datadog Exporter.

For more information, see the OpenTelemetry project documentation for the [filelog receiver](https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/filelogreceiver).

## Setup{% #setup %}

{% tab title="Host" %}
For a collector deployed on the same host as the log files to be collected, specify the paths of the log files to collect in your Collector configuration:

```yaml
receivers:
  filelog:
    include_file_path: true
    poll_interval: 500ms
    include:
      - /var/log/*/app.log
    operators:
      - type: json_parser
      # The layout must match the timestamp format in your logs. If this operator is
      # removed, the timestamp defaults to the time of log intake by Datadog.
      - type: time_parser
        parse_from: attributes.time
        layout: '%Y-%m-%dT%H:%M:%S%z'
```
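
The receiver fragment above must be wired into a logs pipeline with the Datadog Exporter before logs are shipped. A minimal sketch, with the API key read from an environment variable (the `DD_API_KEY` variable name and `datadoghq.com` site are placeholders to adjust for your account):

```yaml
receivers:
  filelog:
    include:
      - /var/log/*/app.log

exporters:
  datadog:
    api:
      site: datadoghq.com
      key: ${env:DD_API_KEY}

service:
  pipelines:
    logs:
      receivers: [filelog]
      exporters: [datadog]
```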

{% /tab %}

{% tab title="Kubernetes" %}
Add the following lines to `values.yaml`:

```yaml
presets:
  logsCollection:
    enabled: true
    includeCollectorLogs: true
```

The filelog receiver needs access to the file paths. The preset mounts the necessary volumes to the collector container for `/var/log/pods` and collects all logs from `/var/log/pods/*/*/*.log`. See [Important components for Kubernetes](https://opentelemetry.io/docs/kubernetes/collector/components/#filelog-receiver) for a full list of settings set by the preset.
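
Applying the preset with the chart from the `open-telemetry` Helm repository might look like this (the release name and namespace are placeholders):

```shell
helm repo add open-telemetry https://open-telemetry.github.io/opentelemetry-helm-charts
helm upgrade --install otel-collector open-telemetry/opentelemetry-collector \
  --namespace otel --create-namespace \
  -f values.yaml
```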

The Collector configuration sets up a list of operators that parse the logs based on different formats:

```yaml
receivers:
  filelog:
    include:
      - /var/log/pods/*/*/*.log
    exclude:
      - /var/log/pods/abc/*.log
    operators:
      - type: json_parser
      - type: trace_parser
        trace_id:
          parse_from: attributes.trace_id
        span_id:
          parse_from: attributes.span_id
        trace_flags:
          parse_from: attributes.trace_flags
      - type: time_parser
        parse_from: attributes.time
        layout: '%Y-%m-%dT%H:%M:%S%z'
```
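
For reference, a JSON log line that these operators would parse might look like the following. The field names are illustrative; `time`, `trace_id`, `span_id`, and `trace_flags` must match the `parse_from` paths configured above:

```json
{"time": "2023-11-20T13:01:46+0000", "message": "payment processed", "trace_id": "4bf92f3577b34da6a3ce929d0e0e4736", "span_id": "00f067aa0ba902b7", "trace_flags": "01"}
```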

{% /tab %}

### Custom tags{% #custom-tags %}

To add custom Datadog tags to logs, set the `ddtags` attribute on the logs. For example, you can do this with the [transform processor](https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/processor/transformprocessor):

```yaml
processors:
  transform:
    log_statements:
      - context: log
        statements:
          - set(attributes["ddtags"], "first_custom:tag, second_custom:tag")
```
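
The processor only takes effect once it is added to the logs pipeline. A minimal sketch of the service section, assuming the filelog receiver and Datadog exporter from the earlier examples:

```yaml
service:
  pipelines:
    logs:
      receivers: [filelog]
      processors: [transform]
      exporters: [datadog]
```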

## Data collected{% #data-collected %}

Log records read from the configured files, including file metadata attributes such as `log.file.path` when `include_file_path` is enabled.

## Full example configuration{% #full-example-configuration %}

For a full working example configuration with the Datadog exporter, see [`logs.yaml`](https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/exporter/datadogexporter/examples/logs.yaml).

## Example logging output{% #example-logging-output %}

```text
ResourceLog #0
Resource SchemaURL: https://opentelemetry.io/schemas/1.6.1
Resource attributes:
     -> k8s.container.name: Str(loadgenerator)
     -> k8s.namespace.name: Str(otel-staging)
     -> k8s.pod.name: Str(opentelemetry-demo-loadgenerator-d8c4d699d-ztt98)
     -> k8s.container.restart_count: Str(1)
     -> k8s.pod.uid: Str(92bf09ed-0db9-4f69-a9d6-1dadf12e01aa)
     -> k8s.pod.ip: Str(192.168.55.78)
     -> cloud.provider: Str(aws)
     -> cloud.platform: Str(aws_ec2)
     -> cloud.region: Str(us-east-1)
     -> cloud.account.id: Str(XXXXXXXXX)
     -> cloud.availability_zone: Str(us-east-1c)
     -> host.id: Str(i-0368add8e328c28f7)
     -> host.image.id: Str(ami-08a2e6a8e82737230)
     -> host.type: Str(m5.large)
     -> host.name: Str(ip-192-168-53-115.ec2.internal)
     -> os.type: Str(linux)
     -> k8s.daemonset.uid: Str(6d6fef61-d4c7-4226-9b7b-7d6b893cb31d)
     -> k8s.daemonset.name: Str(opentelemetry-collector-agent)
     -> k8s.node.name: Str(ip-192-168-53-115.ec2.internal)
     -> kube_app_name: Str(opentelemetry-collector)
     -> kube_app_instance: Str(opentelemetry-collector)
     -> k8s.pod.start_time: Str(2023-11-20T12:53:23Z)
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope
LogRecord #0
ObservedTimestamp: 2023-11-20 13:02:04.332021519 +0000 UTC
Timestamp: 2023-11-20 13:01:46.095736502 +0000 UTC
SeverityText:
SeverityNumber: Unspecified(0)
Body: Str( return wrapped_send(self, request, **kwargs))
Attributes:
     -> log.file.path: Str(/var/log/pods/otel-staging_opentelemetry-demo-loadgenerator-d8c4d699d-ztt98_92bf09ed-0db9-4f69-a9d6-1dadf12e01aa/loadgenerator/1.log)
     -> time: Str(2023-11-20T13:01:46.095736502Z)
     -> logtag: Str(F)
     -> log.iostream: Str(stderr)
Trace ID:
Span ID:
Flags: 0
```

## Further reading{% #further-reading %}

- [Setting Up the OpenTelemetry Collector](https://docs.datadoghq.com/opentelemetry/collector_exporter/)
- [Correlate OpenTelemetry Traces and Logs](https://docs.datadoghq.com/opentelemetry/correlate/logs_and_traces/)
