---
title: Processors
description: Parse your logs using the Grok Processor
breadcrumbs: Docs > Log Management > Log Configuration > Processors
---

# Processors

## Overview{% #overview %}

{% alert level="info" %}
The processors outlined in this documentation are specific to cloud-based logging environments. To parse, structure, and enrich on-premises logs, see [Observability Pipelines](https://docs.datadoghq.com/observability_pipelines/processors/).
{% /alert %}

A processor executes within a [Pipeline](https://docs.datadoghq.com/logs/log_configuration/pipelines/) to complete a data-structuring action and generate attributes to enrich your logs.

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/processor_overview.a024cfa4baa0bae398110da4c3a70c29.png?auto=format"
   alt="Processors" /%}

In [log configuration settings](https://docs.datadoghq.com/logs/log_configuration/pipelines/), you can configure processors such as the Grok parser or date remapper to help extract, create, and remap attributes to enrich your logs and enhance faceted search.

**Notes**:

- Structured logs should be shipped in a valid format. If the structure contains invalid characters for parsing, these should be stripped at the Agent level using the [mask_sequences](https://docs.datadoghq.com/agent/logs/advanced_log_collection/?tab=configurationfile#scrub-sensitive-data-from-your-logs) feature.

- As a best practice, use at most 20 processors per pipeline.

## Grok parser{% #grok-parser %}

Create custom grok rules to parse the full message or a specific attribute of your raw event. As a best practice, limit your grok parser to 10 parsing rules. For more information on Grok syntax and parsing rules, see [Parsing](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=matchers).

{% image
   source="https://datadog-docs.imgix.net/images/logs/processing/processors/define_parsing_rules_syntax_suggestions.da3279fb1301ddf071644d28ec79a2c6.png?auto=format"
   alt="Grok parser syntax suggestions in the UI" /%}

{% tab title="UI" %}
Define the Grok processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines). To configure Grok parsing rules:

1. Click **Parse my logs** to automatically generate a set of three parsing rules based on the logs flowing through the pipeline. **Note**: This feature requires that the corresponding logs are indexed and actively flowing in. You can temporarily deactivate or sample down exclusion filters to allow the feature to detect logs.
1. **Log Samples**: Add up to five sample logs (up to 5000 characters each) to test your parsing rules.
1. **Define parsing rules**: Write your parsing rules in the rule editor. As you define rules, the Grok parser provides syntax assistance:
   - **Matcher suggestions**: Type a rule name followed by `%{`. A dropdown appears with available matchers (such as `word`, `integer`, `ip`, `date`). Select a matcher from the list to insert it into your rule.
     ```
     MyParsingRule %{
     ```
   - **Filter suggestions**: When adding a filter with `:`, a dropdown shows compatible filters for the selected matcher.
1. **Test your rules**: Click a sample to evaluate it against your parsing rules and display the result at the bottom of the screen. Each sample shows a status (`match` or `no match`) indicating whether one of the grok parser's rules matches it.
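
For example, a rule that extracts the user and connection date from a log such as `john connected on 11/08/2017` (the rule and attribute names are illustrative):

```text
MyParsingRule %{word:user} connected on %{date("MM/dd/yyyy"):connect_date}
```

The `word` matcher captures `john` into the `user` attribute, and the `date` matcher parses `11/08/2017` with the given format into `connect_date`.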

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following Grok parser JSON payload:

```json
{
  "type": "grok-parser",
  "name": "Parsing Log message",
  "is_enabled": true,
  "source": "message",
  "samples": ["sample log 1", "sample log 2"],
  "grok": {"support_rules": "<SUPPORT_RULES>", "match_rules": "<MATCH_RULES>"}
}
```

| Parameter            | Type             | Required | Description                                             |
| -------------------- | ---------------- | -------- | ------------------------------------------------------- |
| `type`               | String           | Yes      | Type of the processor.                                  |
| `name`               | String           | No       | Name of the processor.                                  |
| `is_enabled`         | Boolean          | No       | If the processor is enabled or not. Default: `false`.   |
| `source`             | String           | Yes      | Name of the log attribute to parse. Default: `message`. |
| `samples`            | Array of strings | No       | List of (up to 5) sample logs for this grok parser.     |
| `grok.support_rules` | String           | Yes      | List of Support rules for your grok parser.             |
| `grok.match_rules`   | String           | Yes      | List of Match rules for your grok parser.               |
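
As an illustration, a hypothetical payload that parses logs such as `john connected on 11/08/2017` (sample and rule contents are assumptions for the example) might look like:

```json
{
  "type": "grok-parser",
  "name": "Parsing Log message",
  "is_enabled": true,
  "source": "message",
  "samples": ["john connected on 11/08/2017"],
  "grok": {
    "support_rules": "",
    "match_rules": "connect_rule %{word:user} connected on %{date(\"MM/dd/yyyy\"):connect_date}"
  }
}
```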

{% /tab %}

## Log date remapper{% #log-date-remapper %}

As Datadog receives logs, it timestamps them using the value(s) from any of these default attributes:

- `timestamp`
- `date`
- `_timestamp`
- `Timestamp`
- `eventTime`
- `published_date`

If your logs store dates in an attribute that is not in this list, use the log date remapper processor to define that attribute as the official log timestamp:

{% alert level="info" %}
The recognized date formats are: [ISO8601](https://www.iso.org/iso-8601-date-and-time-format.html), [UNIX (the milliseconds EPOCH format)](https://en.wikipedia.org/wiki/Unix_time), and [RFC3164](https://www.ietf.org/rfc/rfc3164.txt).
{% /alert %}

If your logs don't have a timestamp that conforms to the formats listed above, use the grok processor to extract the epoch time from the timestamp into a new attribute. The log date remapper then uses the newly defined attribute.

To see how a custom date and time format can be parsed in Datadog, see [Parsing dates](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=matchers#parsing-dates).

**Notes**:

- Log events can be submitted up to 18 hours in the past and two hours in the future.
- As of ISO 8601-1:2019, the basic format is `T[hh][mm][ss]` and the extended format is `T[hh]:[mm]:[ss]`. Earlier versions omitted the T (representing time) in both formats.
- If your logs don't contain any of the default attributes and you haven't defined your own date attribute, Datadog timestamps the logs with the date it received them.
- If multiple log date remapper processors are applied to a given log within the pipeline, the last one (according to the pipeline's order) is taken into account.

{% tab title="UI" %}
Define the log date remapper processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines):

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/date_remapper.8543cb319df2db268ec8d6a391c48087.png?auto=format"
   alt="Define a date attribute" /%}

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/date_remapper_example.eda94fbeec94cd76bc77744e260cc13c.png?auto=format"
   alt="Date and time in the Log Explorer side panel" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following log date remapper JSON payload:

```json
{
  "type": "date-remapper",
  "name": "Define <SOURCE_ATTRIBUTE> as the official Date of the log",
  "is_enabled": false,
  "sources": ["<SOURCE_ATTRIBUTE_1>"]
}
```

| Parameter    | Type             | Required | Description                                            |
| ------------ | ---------------- | -------- | ------------------------------------------------------ |
| `type`       | String           | Yes      | Type of the processor.                                 |
| `name`       | String           | No       | Name of the processor.                                 |
| `is_enabled` | Boolean          | No       | If the processor is enabled or not. Default: `false`.  |
| `sources`    | Array of strings | Yes      | Array of source attributes.                            |

{% /tab %}

## Log status remapper{% #log-status-remapper %}

Use the status remapper processor to assign attributes as an official status to your logs. For example, add a log severity level to your logs with the status remapper.

Each incoming status value is mapped as follows:

- Integers from 0 to 7 map to the [Syslog severity standards](https://en.wikipedia.org/wiki/Syslog#Severity_level)
- Strings beginning with **emerg** or **f** (case-insensitive) map to **emerg (0)**
- Strings beginning with **a** (case-insensitive) map to **alert (1)**
- Strings beginning with **c** (case-insensitive) map to **critical (2)**
- Strings beginning with **err** (case-insensitive) map to **error (3)**
- Strings beginning with **w** (case-insensitive) map to **warning (4)**
- Strings beginning with **n** (case-insensitive) map to **notice (5)**
- Strings beginning with **i** (case-insensitive) map to **info (6)**
- Strings beginning with **d**, **t**, **v**, **trace**, or **verbose** (case-insensitive) map to **debug (7)**
- Strings beginning with **o** or **s**, or matching **OK** or **Success** (case-insensitive) map to **OK**
- All others map to **info (6)**
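
These rules can be sketched in Python (illustrative only; `remap_status` is a hypothetical helper, not Datadog's implementation):

```python
# Syslog severities, indexed by their numeric code 0-7.
SYSLOG = ["emerg", "alert", "critical", "error", "warning", "notice", "info", "debug"]

def remap_status(value):
    """Sketch of the documented status-mapping rules (illustrative only)."""
    if isinstance(value, int) and 0 <= value <= 7:
        return SYSLOG[value]  # integers 0-7 follow the Syslog severity standards
    s = str(value).lower()  # string prefixes are matched case-insensitively
    rules = [
        (("emerg", "f"), "emerg"),
        (("a",), "alert"),
        (("c",), "critical"),
        (("err",), "error"),
        (("w",), "warning"),
        (("n",), "notice"),
        (("i",), "info"),
        (("d", "t", "v"), "debug"),
        (("o", "s"), "OK"),
    ]
    for prefixes, status in rules:
        if s.startswith(prefixes):
            return status
    return "info"  # all others map to info (6)
```

Note that the checks run in the documented order, so `error` is caught by the `err` rule before ever reaching the generic fallback.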

**Note**: If multiple log status remapper processors are applied to a log within a pipeline, only the first one in the pipeline's order is considered. Additionally, for all pipelines that match the log, only the first status remapper encountered (from all applicable pipelines) is applied.

{% tab title="UI" %}
Define the log status remapper processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines):

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/severity_remapper.413764ff01203652dd49afddd35ab332.png?auto=format"
   alt="Log severity remapping" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following log status remapper JSON payload:

```json
{
  "type": "status-remapper",
  "name": "Define <SOURCE_ATTRIBUTE> as the official status of the log",
  "is_enabled": true,
  "sources": ["<SOURCE_ATTRIBUTE>"]
}
```

| Parameter    | Type             | Required | Description                                            |
| ------------ | ---------------- | -------- | ------------------------------------------------------ |
| `type`       | String           | Yes      | Type of the processor.                                 |
| `name`       | String           | No       | Name of the processor.                                 |
| `is_enabled` | Boolean          | No       | If the processor is enabled or not. Default: `false`.  |
| `sources`    | Array of strings | Yes      | Array of source attributes.                            |

{% /tab %}

## Service remapper{% #service-remapper %}

The service remapper processor assigns one or more attributes to your logs as the official service.

**Note**: If multiple service remapper processors are applied to a given log within the pipeline, only the first one (according to the pipeline's order) is taken into account.

{% tab title="UI" %}
Define the log service remapper processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines):

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/service_remapper.6d8fe070a5a0aa20851e06695ef54613.png?auto=format"
   alt="Service remapper processor" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following log service remapper JSON payload:

```json
{
  "type": "service-remapper",
  "name": "Define <SOURCE_ATTRIBUTE> as the official log service",
  "is_enabled": true,
  "sources": ["<SOURCE_ATTRIBUTE>"]
}
```

| Parameter    | Type             | Required | Description                                            |
| ------------ | ---------------- | -------- | ------------------------------------------------------ |
| `type`       | String           | Yes      | Type of the processor.                                 |
| `name`       | String           | No       | Name of the processor.                                 |
| `is_enabled` | Boolean          | No       | If the processor is enabled or not. Default: `false`.  |
| `sources`    | Array of strings | Yes      | Array of source attributes.                            |

{% /tab %}

## Log message remapper{% #log-message-remapper %}

`message` is a key attribute in Datadog. Its value is displayed in the **Content** column of the Log Explorer to provide context on the log. You can use the search bar to find a log by the log message.

Use the log message remapper processor to define one or more attributes as the official log message. Define more than one attribute for cases where the attributes might not exist and an alternative is available. For example, if the defined message attributes are `attribute1`, `attribute2`, and `attribute3`, and `attribute1` does not exist, then `attribute2` is used. Similarly, if `attribute2` does not exist, then `attribute3` is used.

To define message attributes, first use the string builder processor to create a new string attribute for each of the attributes you want to use. Then, use the log message remapper to remap the string attributes as the message.

**Note**: If multiple log message remapper processors are applied to a given log within the pipeline, only the first one (according to the pipeline order) is taken into account.
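
The fallback behavior described above can be sketched as follows (`resolve_message` is a hypothetical helper for illustration):

```python
def resolve_message(log, sources):
    """Return the value of the first source attribute present in the log."""
    for attribute in sources:
        if attribute in log:
            return log[attribute]
    return None  # no source attribute exists; nothing is remapped
```

With `sources = ["attribute1", "attribute2", "attribute3"]`, a log missing `attribute1` falls back to `attribute2`, exactly as described above.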

{% tab title="UI" %}
Define the log message remapper processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines):

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/message_processor.aa47e90a017fc654f646784b0cb0654c.png?auto=format"
   alt="Message processor" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following log message remapper JSON payload:

```json
{
  "type": "message-remapper",
  "name": "Define <SOURCE_ATTRIBUTE> as the official message of the log",
  "is_enabled": true,
  "sources": ["msg"]
}
```

| Parameter    | Type             | Required | Description                                            |
| ------------ | ---------------- | -------- | ------------------------------------------------------ |
| `type`       | String           | Yes      | Type of the processor.                                 |
| `name`       | String           | No       | Name of the processor.                                 |
| `is_enabled` | Boolean          | No       | If the processor is enabled or not. Default: `false`.  |
| `sources`    | Array of strings | Yes      | Array of source attributes. Default: `msg`.            |

{% /tab %}

## Remapper{% #remapper %}

The remapper processor remaps one or more source attributes or tags to a different target attribute or tag. For example, you can remap the `user` attribute to `firstname` to normalize log data in the Log Explorer.

If the remapper target is an attribute, the processor can also try to cast the value to a new type (`String`, `Integer`, or `Double`). If the cast fails, the original value and type are preserved.

**Note**: The decimal separator for `Double` values must be `.`.
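
The cast-with-fallback behavior can be sketched as follows (`try_cast` is a hypothetical helper, not the actual implementation):

```python
def try_cast(value, target_format):
    """Cast value to the requested type; keep the original if the cast fails."""
    casters = {"string": str, "integer": int, "double": float}
    caster = casters.get(target_format)
    if caster is None:  # "auto": no cast is applied
        return value
    try:
        return caster(value)
    except (TypeError, ValueError):
        return value  # cast failed: original value and type are preserved
```

For example, casting `"42"` to `integer` yields `42`, while casting `"3,5"` to `double` fails (the separator must be `.`) and the original string is kept.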

### Naming constraints{% #naming-constraints %}

Characters `:` and `,` are not allowed in the target attribute or tag names. Additionally, tag and attribute names must follow the conventions outlined in [Attributes and Aliasing](https://docs.datadoghq.com/logs/log_configuration/attributes_naming_convention/).

### Reserved attributes{% #reserved-attributes %}

The Remapper processor **cannot be used to remap Datadog reserved attributes**.

- The `host` attribute cannot be remapped.
- The following attributes cannot be remapped with the generic remapper; use the corresponding dedicated processor instead:
  - `message`: Log message remapper
  - `service`: Service remapper
  - `status`: Log status remapper
  - `date`: Log date remapper
  - `trace_id`: Trace remapper
  - `span_id`: Span remapper

{% tab title="UI" %}
Define the remapper processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines). For example, remap `user` to `user.firstname`.

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/remapper.4ed74daa20b10135620936d0f5410b4c.png?auto=format"
   alt="Attribute remapper processor" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following Remapper JSON payload:

```json
{
  "type": "attribute-remapper",
  "name": "Remap <SOURCE_ATTRIBUTE> to <TARGET_ATTRIBUTE>",
  "is_enabled": true,
  "source_type": "attribute",
  "sources": ["<SOURCE_ATTRIBUTE>"],
  "target": "<TARGET_ATTRIBUTE>",
  "target_type": "tag",
  "target_format": "integer",
  "preserve_source": false,
  "override_on_conflict": false
}
```

| Parameter              | Type             | Required | Description                                                                                                                                                              |
| ---------------------- | ---------------- | -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `type`                 | String           | Yes      | Type of the processor.                                                                                                                                                   |
| `name`                 | String           | No       | Name of the processor.                                                                                                                                                   |
| `is_enabled`           | Boolean          | No       | If the processor is enabled or not. Default: `false`.                                                                                                    |
| `source_type`          | String           | No       | Defines if the sources are from log `attribute` or `tag`. Default: `attribute`.                                                                                          |
| `sources`              | Array of strings | Yes      | Array of source attributes or tags                                                                                                                                       |
| `target`               | String           | Yes      | Final attribute or tag name to remap the sources to.                                                                                                                     |
| `target_type`          | String           | No       | Defines if the target is a log `attribute` or a `tag`. Default: `attribute`.                                                                                             |
| `target_format`        | String           | No       | Defines if the attribute value should be cast to another type. Possible values: `auto`, `string`, `integer`, or `double`. Default: `auto`. When set to `auto`, no cast is applied. |
| `preserve_source`      | Boolean          | No       | Remove or preserve the remapped source element. Default: `false`.                                                                                                        |
| `override_on_conflict` | Boolean          | No       | Override or not the target element if already set. Default: `false`.                                                                                                     |

{% /tab %}

## URL parser{% #url-parser %}

The URL parser processor extracts query parameters and other important parameters from a URL. When set up, it produces the following attributes:

{% image
   source="https://datadog-docs.imgix.net/images/logs/processing/processors/url_processor.a1170ed0086a36a18af3a229fc9bfe5d.png?auto=format"
   alt="Url Processor" /%}

{% tab title="UI" %}
Define the URL parser processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines):

{% image
   source="https://datadog-docs.imgix.net/images/logs/processing/processors/url_processor.a1170ed0086a36a18af3a229fc9bfe5d.png?auto=format"
   alt="Url Processor Tile" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following URL parser JSON payload:

```json
{
  "type": "url-parser",
  "name": "Parse the URL from http.url attribute.",
  "is_enabled": true,
  "sources": ["http.url"],
  "target": "http.url_details"
}
```

| Parameter    | Type             | Required | Description                                                                                                           |
| ------------ | ---------------- | -------- | --------------------------------------------------------------------------------------------------------------------- |
| `type`       | String           | Yes      | Type of the processor.                                                                                                |
| `name`       | String           | No       | Name of the processor.                                                                                                |
| `is_enabled` | Boolean          | No       | If the processor is enabled or not. Default: `false`.                                                                 |
| `sources`    | Array of strings | No       | Array of source attributes. Default: `http.url`.                                                                      |
| `target`     | String           | Yes      | Name of the parent attribute that contains all the extracted details from the `sources`. Default: `http.url_details`. |

{% /tab %}

## User-Agent parser{% #user-agent-parser %}

The user-agent parser processor takes a `useragent` attribute and extracts OS, browser, device, and other user data. When set up, the following attributes are produced:

{% image
   source="https://datadog-docs.imgix.net/images/logs/processing/processors/useragent_processor.b8d700926426232c3637bd41a16f0b76.png?auto=format"
   alt="Useragent Processor" /%}

**Note**: If your logs contain encoded user-agents (for example, IIS logs), configure this processor to **decode the URL** before parsing it.

{% tab title="UI" %}
Define the user-agent processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines):

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/useragent_processor.0fde354cb9b6de42ac7df192f02df08e.png?auto=format"
   alt="Useragent Processor tile" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following user-agent parser JSON payload:

```json
{
  "type": "user-agent-parser",
  "name": "Parses <SOURCE_ATTRIBUTE> to extract all its User-Agent information",
  "is_enabled": true,
  "sources": ["http.useragent"],
  "target": "http.useragent_details",
  "is_encoded": false
}
```

| Parameter    | Type             | Required | Description                                                                                                                 |
| ------------ | ---------------- | -------- | --------------------------------------------------------------------------------------------------------------------------- |
| `type`       | String           | Yes      | Type of the processor.                                                                                                      |
| `name`       | String           | No       | Name of the processor.                                                                                                      |
| `is_enabled` | Boolean          | No       | If the processor is enabled or not. Default: `false`.                                                                       |
| `sources`    | Array of strings | No       | Array of source attributes. Default: `http.useragent`.                                                                      |
| `target`     | String           | Yes      | Name of the parent attribute that contains all the extracted details from the `sources`. Default: `http.useragent_details`. |
| `is_encoded` | Boolean          | No       | Define if the source attribute is url encoded or not. Default: `false`.                                                     |

{% /tab %}

## Category processor{% #category-processor %}

Use the category processor to add a new attribute (without spaces or special characters in the new attribute name) to a log matching a provided search query. Then, use categories to create groups for an analytical view (for example, URL groups, machine groups, environments, and response time buckets).

**Notes**:

- The syntax of the query is the one in the [Log Explorer](https://docs.datadoghq.com/logs/search_syntax/) search bar. This query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query.
- A log stops being checked after it matches one of the processor's queries, so make sure the queries are properly ordered in case a log could match several of them.
- The names of the categories must be unique.
- Once defined in the category processor, you can map categories to log status using the log status remapper.

{% tab title="UI" %}
Define the category processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines). For example, to categorize your web access logs based on the status code range value ("OK" for a response code between 200 and 299, "Notice" for a response code between 300 and 399, and so on), add this processor:

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/category_processor.021b511f06b4c1450e6af7ad34f6c4bc.png?auto=format"
   alt="category processor" /%}

This processor produces the following result:

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/category_processor_result.9540302809d1572f6ca9c3ae7cb268e2.png?auto=format"
   alt="category processor result" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following category processor JSON payload:

```json
{
  "type": "category-processor",
  "name": "Assign a custom value to the <TARGET_ATTRIBUTE> attribute",
  "is_enabled": true,
  "categories": [
    {"filter": {"query": "<QUERY_1>"}, "name": "<VALUE_TO_ASSIGN_1>"},
    {"filter": {"query": "<QUERY_2>"}, "name": "<VALUE_TO_ASSIGN_2>"}
  ],
  "target": "<TARGET_ATTRIBUTE>"
}
```

| Parameter    | Type            | Required | Description                                                                                                |
| ------------ | --------------- | -------- | ---------------------------------------------------------------------------------------------------------- |
| `type`       | String          | Yes      | Type of the processor.                                                                                     |
| `name`       | String          | No       | Name of the processor.                                                                                     |
| `is_enabled` | Boolean         | No       | If the processor is enabled or not. Default: `false`.                                                      |
| `categories` | Array of Object | Yes      | Array of objects, each pairing a `filter` query with the `name` value to assign when a log matches.        |
| `target`     | String          | Yes      | Name of the target attribute whose value is defined by the matching category.                              |
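
For instance, the web access log categorization from the UI tab could be expressed as the following payload (the `@http.status_code` facet and `http.status_category` target names are assumptions about your log schema):

```json
{
  "type": "category-processor",
  "name": "Categorize status codes",
  "is_enabled": true,
  "categories": [
    {"filter": {"query": "@http.status_code:[200 TO 299]"}, "name": "OK"},
    {"filter": {"query": "@http.status_code:[300 TO 399]"}, "name": "Notice"},
    {"filter": {"query": "@http.status_code:[400 TO 499]"}, "name": "Warning"},
    {"filter": {"query": "@http.status_code:[500 TO 599]"}, "name": "Error"}
  ],
  "target": "http.status_category"
}
```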

{% /tab %}

## Arithmetic processor{% #arithmetic-processor %}

Use the arithmetic processor to add a new attribute (without spaces or special characters in the new attribute name) to a log with the result of the provided formula. Use it to remap time attributes with different units into a single attribute, or to compute operations on attributes within the same log.

An arithmetic processor formula can use parentheses and basic arithmetic operators: `-`, `+`, `*`, `/`.

By default, a calculation is skipped if an attribute is missing. Select *Replace missing attribute by 0* to automatically populate missing attribute values with 0 to ensure that the calculation is done.

**Notes**:

- An attribute may be listed as missing if it is not found in the log attributes, or if it cannot be converted to a number.
- When using the operator `-`, add spaces around it because attribute names like `start-time` may contain dashes. For example, the following formula must include spaces around the `-` operator: `(end-time - start-time) / 1000`.
- If the target attribute already exists, it is overwritten by the result of the formula.
- Results are rounded to the 9th decimal. For example, if the result of the formula is `0.1234567891`, the actual value stored for the attribute is `0.123456789`.
- If you need to scale a unit of measure, use the scale filter.
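
The behavior described in these notes can be sketched in Python (a toy illustration; `evaluate` is a hypothetical helper, and the real processor does not use `eval`):

```python
import re

def evaluate(expression, attributes, is_replace_missing=False):
    """Toy sketch of the arithmetic processor's documented behavior."""
    def lookup(match):
        value = attributes.get(match.group(0))
        try:
            return repr(float(value))
        except (TypeError, ValueError):
            # Attribute is missing or cannot be converted to a number.
            if is_replace_missing:
                return "0"
            raise KeyError(match.group(0))

    # Attribute names start with a letter and may contain dots and dashes,
    # which is why operators such as `-` must be surrounded by spaces.
    try:
        substituted = re.sub(r"[A-Za-z_][\w.-]*", lookup, expression)
    except KeyError:
        return None  # calculation skipped
    return round(eval(substituted), 9)  # results rounded to the 9th decimal
```

For example, `evaluate("(end-time - start-time) / 1000", {"end-time": 2000, "start-time": 1000})` yields `1.0`, while a missing attribute skips the calculation unless `is_replace_missing` is set.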

{% tab title="UI" %}
Define the arithmetic processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines):

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/arithmetic_processor.b866facee14ced43cb90f984d2136080.png?auto=format"
   alt="Arithmetic Processor" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following arithmetic processor JSON payload:

```json
{
  "type": "arithmetic-processor",
  "name": "<PROCESSOR_NAME>",
  "is_enabled": true,
  "expression": "<ARITHMETIC_OPERATION>",
  "target": "<TARGET_ATTRIBUTE>",
  "is_replace_missing": false
}
```

| Parameter            | Type    | Required | Description                                                                                                                                  |
| -------------------- | ------- | -------- | -------------------------------------------------------------------------------------------------------------------------------------------- |
| `type`               | String  | Yes      | Type of the processor.                                                                                                                       |
| `name`               | String  | No       | Name of the processor.                                                                                                                       |
| `is_enabled`         | Boolean | No       | If the processor is enabled or not. Default: `false`.                                                                                        |
| `expression`         | String  | Yes      | Arithmetic operation between one or more log attributes.                                                                                     |
| `target`             | String  | Yes      | Name of the attribute that contains the result of the arithmetic operation.                                                                  |
| `is_replace_missing` | Boolean | No       | If `true`, missing attributes in `expression` are replaced by 0; if `false`, the operation is skipped when an attribute is missing. Default: `false`. |

{% /tab %}

## String builder processor{% #string-builder-processor %}

Use the string builder processor to add a new attribute (without spaces or special characters) to a log with the result of the provided template. This enables aggregation of different attributes or raw strings into a single attribute.

The template is defined by both raw text and blocks with the syntax `%{attribute_path}`.

**Notes**:

- This processor only accepts attributes with values, or an array of values, in the block (see the examples in the UI section below).
- If an attribute cannot be used (an object or an array of objects), it is replaced by an empty string, or the entire operation is skipped, depending on your selection.
- If a target attribute already exists, it is overwritten by the result of the template.
- Results of a template cannot exceed 256 characters.

{% tab title="UI" %}
Define the string builder processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines):

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/stringbuilder_processor.7323e62cd6f9bda3ae17d8a5af674ddc.png?auto=format"
   alt="String builder processor" /%}

For example, apply the template `Request %{http.method} %{http.url} was answered with response %{http.status_code}` to the following log:

```json
{
  "http": {
    "method": "GET",
    "status_code": 200,
    "url": "https://app.datadoghq.com/users"
  },
  "array_ids": [123, 456, 789],
  "array_users": [
    {"first_name": "John", "last_name": "Doe"},
    {"first_name": "Jack", "last_name": "London"}
  ]
}
```

The template returns the following:

```text
Request GET https://app.datadoghq.com/users was answered with response 200
```

**Note**: `http` is an object and cannot be used in a block (`%{http}` fails), whereas `%{http.method}`, `%{http.status_code}`, or `%{http.url}` returns the corresponding value. Blocks can be used on arrays of values or on a specific attribute within an array.

- For example, adding the block `%{array_ids}` returns:

  ```text
  123,456,789
  ```

- `%{array_users}` does not return anything because it is a list of objects. However, `%{array_users.first_name}` returns a list of `first_name`s contained in the array:

  ```text
  John,Jack
  ```
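
The rendering rules above can be sketched in Python. This is an illustrative model of the template logic, not Datadog's implementation:

```python
import re

def string_builder(log, template, is_replace_missing=False, limit=256):
    """Render %{attribute.path} blocks against a nested log (illustrative)."""
    def lookup(path):
        value = log
        for key in path.split("."):
            if isinstance(value, list):
                # a path into an array of objects collects each element's value
                value = [item.get(key) for item in value if isinstance(item, dict)]
            elif isinstance(value, dict) and key in value:
                value = value[key]
            else:
                return None
        if isinstance(value, dict):
            return None  # objects cannot be used in a block
        if isinstance(value, list):
            if any(isinstance(v, dict) for v in value):
                return None  # arrays of objects cannot be used either
            return ",".join(str(v) for v in value)
        return str(value)

    def substitute(match):
        value = lookup(match.group(1))
        if value is None:
            if is_replace_missing:
                return ""  # missing attribute replaced by an empty string
            raise KeyError(match.group(1))
        return value

    try:
        return re.sub(r"%\{([^}]+)\}", substitute, template)[:limit]
    except KeyError:
        return None  # skip the operation entirely
```

Applied to the log above, the `Request ...` template yields the result shown, `%{array_ids}` yields `123,456,789`, and `%{array_users.first_name}` yields `John,Jack`.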

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following string builder processor JSON payload:

```json
{
  "type": "string-builder-processor",
  "name": "<PROCESSOR_NAME>",
  "is_enabled": true,
  "template": "<STRING_BUILDER_TEMPLATE>",
  "target": "<TARGET_ATTRIBUTE>",
  "is_replace_missing": true
}
```

| Parameter            | Type    | Required | Description                                                                                                                                               |
| -------------------- | ------- | -------- | --------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `type`               | String  | Yes      | Type of the processor.                                                                                                                                    |
| `name`               | String  | No       | Name of the processor.                                                                                                                                    |
| `is_enabled`         | Boolean | No       | Whether the processor is enabled. Default: `false`.                                                                                       |
| `template`           | String  | Yes      | A formula with one or more attributes and raw text.                                                                                                       |
| `target`             | String  | Yes      | The name of the attribute that contains the result of the template.                                                                                       |
| `is_replace_missing` | Boolean | No       | If `true`, it replaces all missing attributes of `template` by an empty string. If `false`, skips the operation for missing attributes. Default: `false`. |

{% /tab %}

## GeoIP parser{% #geoip-parser %}

The geoIP parser takes an IP address attribute and extracts continent, country, subdivision, or city information (if available) in the target attribute path.

{% tab title="UI" %}

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/geoip_processor.ee9ab4000a95fc2f22ada17a8b4ecf63.png?auto=format"
   alt="GeoIP Processor" /%}

Most elements contain a `name` and `iso_code` (or `code` for continent) attribute. `subdivision` is the first level of subdivision that the country uses, such as "States" for the United States or "Departments" for France.

For example, the geoIP parser extracts location from the `network.client.ip` attribute and stores it into the `network.client.geoip` attribute:

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/geoip_example_blurred.34a8188d395cd82d0968d935fb99c041.png?auto=format"
   alt="GeoIP example" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following geoIP parser JSON payload:

```json
{
  "type": "geo-ip-parser",
  "name": "Parse the geolocation elements from network.client.ip attribute.",
  "is_enabled": true,
  "sources": ["network.client.ip"],
  "target": "network.client.geoip"
}
```

| Parameter    | Type             | Required | Description                                                                                                               |
| ------------ | ---------------- | -------- | ------------------------------------------------------------------------------------------------------------------------- |
| `type`       | String           | Yes      | Type of the processor.                                                                                                    |
| `name`       | String           | No       | Name of the processor.                                                                                                    |
| `is_enabled` | Boolean          | No       | Whether the processor is enabled. Default: `false`.                                                                       |
| `sources`    | Array of strings | No       | Array of source attributes. Default: `network.client.ip`.                                                                 |
| `target`     | String           | Yes      | Name of the parent attribute that contains all the extracted details from the `sources`. Default: `network.client.geoip`. |

{% /tab %}

## Lookup processor{% #lookup-processor %}

Use the lookup processor to define a mapping between a log attribute and a human readable value saved in a [Reference Table](https://docs.datadoghq.com/integrations/guide/reference-tables/) or the processors mapping table.

For example, you can use the lookup processor to map an internal service ID into a human readable service name. Alternatively, you can use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines.
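The mapping behavior can be sketched in Python. This is an illustrative model only, not Datadog's implementation, and the `service_id` attribute names in the example below are hypothetical:

```python
def lookup_processor(log, source, target, lookup_table, default_lookup=None):
    """Map a source attribute through "key,value" rows (illustrative model)."""
    mapping = dict(row.split(",", 1) for row in lookup_table)
    if source not in log:
        return log  # no source attribute: nothing to do
    key = str(log[source])
    if key in mapping:
        log[target] = mapping[key]
    elif default_lookup is not None:
        log[target] = default_lookup  # fallback when the key is not mapped
    return log

# Hypothetical example: map an internal service ID to a readable name.
log = lookup_processor({"service_id": "s42"}, "service_id", "service_name",
                       ["s42,checkout", "s43,billing"], "unknown")
```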

{% tab title="UI" %}
The lookup processor performs the following actions:

- Checks whether the current log contains the source attribute.
- Checks whether the source attribute value exists in the mapping table.
  - If it does, creates the target attribute with the corresponding value from the table.

  - Optionally, if it does not find the value in the mapping table, creates the target attribute with the default fallback value set in the `fallbackValue` field. You can manually enter a list of `source_key,target_value` pairs or upload a CSV file on the **Manual Mapping** tab.

    {% image
       source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/lookup_processor_manual_mapping.eb881e927a82dca8304b96bfbe19b0a7.png?auto=format"
       alt="Lookup processor" /%}

  - Optionally, if it does not find the value in the mapping table, creates the target attribute with the value from a [Reference Table](https://docs.datadoghq.com/integrations/guide/reference-tables/). You can select a value from a Reference Table on the **Reference Table** tab.

    {% image
       source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/lookup_processor_reference_table.730697ba145319e482bb685bf14e119a.png?auto=format"
       alt="Lookup processor" /%}

**Note**: The size limit for the mapping table is 100Kb. This limit applies across all lookup processors on the platform; Reference Tables, however, support larger file sizes.

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following lookup processor JSON payload:

```json
{
  "type": "lookup-processor",
  "name": "<PROCESSOR_NAME>",
  "is_enabled": true,
  "source": "<SOURCE_ATTRIBUTE>",
  "target": "<TARGET_ATTRIBUTE>",
  "lookup_table": ["key1,value1", "key2,value2"],
  "default_lookup": "<DEFAULT_TARGET_VALUE>"
}
```

| Parameter        | Type             | Required | Description                                                                                                                                                               |
| ---------------- | ---------------- | -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `type`           | String           | Yes      | Type of the processor.                                                                                                                                                    |
| `name`           | String           | No       | Name of the processor.                                                                                                                                                    |
| `is_enabled`     | Boolean          | Yes      | Whether the processor is enabled. Default: `false`.                                                                                                       |
| `source`         | String           | Yes      | Source attribute used to perform the lookup.                                                                                                                              |
| `target`         | String           | Yes      | Name of the attribute that contains the corresponding value in the mapping list or the `default_lookup` if not found in the mapping list.                                 |
| `lookup_table`   | Array of strings | Yes      | Mapping table of values for the source attribute and their associated target attribute values, formatted as [ "source_key1,target_value1", "source_key2,target_value2" ]. |
| `default_lookup` | String           | No       | Value to set the target attribute to if the source value is not found in the list.                                                                        |

{% /tab %}

## Trace remapper{% #trace-remapper %}

There are two ways to define correlation between application traces and logs:

1. Follow the documentation on [how to inject a Trace ID in the application logs](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/). Log integrations automatically handle all remaining setup steps by default.

1. Use the trace remapper processor to define a log attribute as its associated trace ID.

{% tab title="UI" %}
Define the trace remapper processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines). Enter the Trace ID attribute path in the processor tile as follows:

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/trace_processor.dd6e6d9572b5c0b01250716efd981389.png?auto=format"
   alt="Trace ID processor" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following trace remapper JSON payload:

```json
{
  "type": "trace-id-remapper",
  "name": "Define dd.trace_id as the official trace ID associated with this log",
  "is_enabled": true,
  "sources": ["dd.trace_id"]
}
```

| Parameter    | Type             | Required | Description                                            |
| ------------ | ---------------- | -------- | ------------------------------------------------------ |
| `type`       | String           | Yes      | Type of the processor.                                 |
| `name`       | String           | No       | Name of the processor.                                 |
| `is_enabled` | Boolean          | No       | Whether the processor is enabled. Default: `false`.     |
| `sources`    | Array of strings | No       | Array of source attributes. Default: `dd.trace_id`.    |

{% /tab %}

**Note**: Trace IDs and span IDs are not displayed in your logs or log attributes in the UI.

## Span remapper{% #span-remapper %}

There are two ways to define correlation between application spans and logs:

1. Follow the documentation on [how to inject a Span ID in the application logs](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/). Log integrations automatically handle all remaining setup steps by default.

1. Use the span remapper processor to define a log attribute as its associated span ID.

{% tab title="UI" %}
Define the span remapper processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines). Enter the Span ID attribute path in the processor tile as follows:

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/span_id_remapper.eb62b4c420a48147f378389a50079ec0.png?auto=format"
   alt="Span ID processor" /%}

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following span remapper JSON payload:

```json
{
  "type": "span-id-remapper",
  "name": "Define dd.span_id as the official span ID associated with this log",
  "is_enabled": true,
  "sources": ["dd.span_id"]
}
```

| Parameter    | Type             | Required | Description                                                   |
| ------------ | ---------------- | -------- | ------------------------------------------------------------- |
| `type`       | String           | Yes      | Type of the processor.                                        |
| `name`       | String           | No       | Name of the processor.                                        |
| `is_enabled` | Boolean          | No       | Indicates whether the processor is enabled. Default: `false`. |
| `sources`    | Array of strings | No       | Array of source attributes. Default: `dd.span_id`.            |

{% /tab %}

**Note**: Trace IDs and span IDs are not displayed in your logs or log attributes in the UI.

## Array processor{% #array-processor %}

Use the array processor to extract, aggregate, or transform values from JSON arrays within your logs.

Supported operations include:

- **Select value from a matching element**
- **Compute the length of an array**
- **Append a value to an array**

Each operation is configured through a dedicated processor.

Define the array processor on the [**Pipelines** page](https://app.datadoghq.com/logs/pipelines).

### Select value from matching element{% #select-value-from-matching-element %}

Extract a specific value from an object inside an array when it matches a condition.

{% tab title="UI" %}

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/array_processor_select_value.a3b73e7ca447d0c0dda4f0a124130ce0.png?auto=format"
   alt="Array processor - Select value from element" /%}

**Example input:**

```json
{
  "httpRequest": {
    "headers": [
      {"name": "Referrer", "value": "https://example.com"},
      {"name": "Accept", "value": "application/json"}
    ]
  }
}
```

**Configuration steps:**

- **Array path**: `httpRequest.headers`
- **Condition**: `name:Referrer`
- **Extract value of**: `value`
- **Target attribute**: `referrer`

**Result:**

```json
{
  "httpRequest": {
    "headers": [...]
  },
  "referrer": "https://example.com"
}
```
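
The steps above can be sketched in Python. This is an illustrative model of the `select` operation, assuming a `key:value` filter with exact matching, not Datadog's implementation:

```python
def select_from_array(log, source, target, filter_expr, value_to_extract):
    """Illustrative model of the "select" operation; filter is "key:value"."""
    key, _, expected = filter_expr.partition(":")
    array = log
    for part in source.split("."):  # walk the dotted path to the array
        array = array.get(part, {}) if isinstance(array, dict) else {}
    if not isinstance(array, list):
        return log  # the path does not point at an array: skip
    for element in array:
        if isinstance(element, dict) and str(element.get(key)) == expected:
            log[target] = element.get(value_to_extract)
            break  # the first matching element is selected
    return log
```

Applied to the example input with `filter_expr="name:Referrer"` and `value_to_extract="value"`, this sets `referrer` to `https://example.com`.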

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following array processor JSON payload:

```json
{
  "type": "array-processor",
  "name": "Extract Referrer URL",
  "is_enabled": true,
  "operation": {
    "type": "select",
    "source": "httpRequest.headers",
    "target": "referrer",
    "filter": "name:Referrer",
    "value_to_extract": "value"
  }
}
```

| Parameter                    | Type    | Required | Description                                                                   |
| ---------------------------- | ------- | -------- | ----------------------------------------------------------------------------- |
| `type`                       | String  | Yes      | Type of the processor.                                                        |
| `name`                       | String  | No       | Name of the processor.                                                        |
| `is_enabled`                 | Boolean | No       | Whether the processor is enabled. Default: `false`.                           |
| `operation.type`             | String  | Yes      | Type of array processor operation.                                            |
| `operation.source`           | String  | Yes      | Path of the array you want to select from.                                    |
| `operation.target`           | String  | Yes      | Target attribute.                                                             |
| `operation.filter`           | String  | Yes      | Expression to match an array element. The first matching element is selected. |
| `operation.value_to_extract` | String  | Yes      | Attribute to read in the selected element.                                    |

{% /tab %}

### Array length{% #array-length %}

Compute the number of elements in an array.

{% tab title="UI" %}

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/array_processor_length.f5538cc8be08a22cb4aba580720df7c3.png?auto=format"
   alt="Array processor - Length" /%}

**Example input:**

```json
{
  "tags": ["prod", "internal", "critical"]
}
```

**Configuration steps:**

- **Array attribute**: `tags`
- **Target attribute**: `tagCount`

**Result:**

```json
{
  "tags": ["prod", "internal", "critical"],
  "tagCount": 3
}
```

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following array processor JSON payload:

```json
{
  "type": "array-processor",
  "name": "Compute number of tags",
  "is_enabled": true,
  "operation": {
    "type": "length",
    "source": "tags",
    "target": "tagCount"
  }
}
```

| Parameter          | Type    | Required | Description                                         |
| ------------------ | ------- | -------- | --------------------------------------------------- |
| `type`             | String  | Yes      | Type of the processor.                              |
| `name`             | String  | No       | Name of the processor.                              |
| `is_enabled`       | Boolean | No       | Whether the processor is enabled. Default: `false`. |
| `operation.type`   | String  | Yes      | Type of array processor operation.                  |
| `operation.source` | String  | Yes      | Path of the array to extract the length of.         |
| `operation.target` | String  | Yes      | Target attribute.                                   |

{% /tab %}

### Append to array{% #append-to-array %}

Add an attribute value to the end of a target array attribute in the log.

**Note**: If the target array attribute does not exist in the log, it is automatically created.

{% tab title="UI" %}

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/array_processor_append.e11f47cf7e633a5dc975fdb206dd74ce.png?auto=format"
   alt="Array processor - Append" /%}

**Example input:**

```json
{
  "network": {
    "client": {
      "ip": "198.51.100.23"
    }
  },
  "sourceIps": ["203.0.113.1"]
}
```

**Configuration steps:**

- **Attribute to append**: `"network.client.ip"`
- **Array attribute to append to**: `sourceIps`

**Result:**

```json
{
  "network": {
    "client": {
      "ip": "198.51.100.23"
    }
  },
  "sourceIps": ["203.0.113.1", "198.51.100.23"]
}
```
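
The append behavior can be sketched in Python. This is an illustrative model only (the `preserve_source` option is omitted for brevity), not Datadog's implementation:

```python
def append_to_array(log, source, target):
    """Illustrative model of the "append" operation."""
    value = log
    for part in source.split("."):  # walk the dotted path to the source value
        if not isinstance(value, dict) or part not in value:
            return log  # source attribute missing: skip
        value = value[part]
    log.setdefault(target, []).append(value)  # target array created if absent
    return log
```

Applied to the example input with `source="network.client.ip"` and `target="sourceIps"`, this appends `198.51.100.23` to `sourceIps`.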

{% /tab %}

{% tab title="API" %}
Use the [Datadog Log Pipeline API endpoint](https://docs.datadoghq.com/api/v1/logs-pipelines/) with the following array processor JSON payload:

```json
{
  "type": "array-processor",
  "name": "Append client IP to sourceIps",
  "is_enabled": true,
  "operation": {
    "type": "append",
    "source": "network.client.ip",
    "target": "sourceIps"
  }
}
```

| Parameter                   | Type    | Required | Description                                                                |
| --------------------------- | ------- | -------- | -------------------------------------------------------------------------- |
| `type`                      | String  | Yes      | Type of the processor.                                                     |
| `name`                      | String  | No       | Name of the processor.                                                     |
| `is_enabled`                | Boolean | No       | Whether the processor is enabled. Default: `false`.                        |
| `operation.type`            | String  | Yes      | Type of array processor operation.                                         |
| `operation.source`          | String  | Yes      | Attribute to append.                                                       |
| `operation.target`          | String  | Yes      | Array attribute to append to.                                              |
| `operation.preserve_source` | Boolean | No       | Whether to preserve the original source after remapping. Default: `false`. |

{% /tab %}

## Decoder processor{% #decoder-processor %}

The Decoder processor translates binary-to-text encoded string fields (such as Base64 or Hex/Base16) into their original representation. This allows the data to be interpreted in its native context, whether as a UTF-8 string, ASCII command, or a numeric value (for example, an integer derived from a hex string). The Decoder processor is especially useful for analyzing encoded commands, logs from specific systems, or evasion techniques used by threat actors.

**Notes**:

- Truncated strings: The processor handles partially truncated Base64/Base16 strings gracefully by trimming or padding as needed.

- Hex format: Hex input can be decoded into either a string (UTF-8) or an integer.

- Failure handling: If decoding fails (because of invalid input), the processor skips the transformation, and the log remains unchanged.
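
The decoding behavior can be sketched in Python. This is an illustrative model only (flat attribute names for brevity), not Datadog's implementation:

```python
import base64
import binascii

def decode_field(log, source, target, encoding="base64", output="string"):
    """Illustrative decoder model; skips the log on invalid input."""
    raw = log.get(source)
    if not isinstance(raw, str):
        return log
    try:
        if encoding == "base64":
            padded = raw + "=" * (-len(raw) % 4)  # repair truncated padding
            log[target] = base64.b64decode(padded, validate=True).decode("utf-8")
        elif encoding == "base16":
            if output == "integer":
                log[target] = int(raw, 16)  # hex decoded into an integer
            else:
                log[target] = bytes.fromhex(raw).decode("utf-8")
    except (binascii.Error, ValueError, UnicodeDecodeError):
        pass  # decoding failed: the log remains unchanged
    return log
```

For example, `decode_field({"msg": "aGVsbG8"}, "msg", "decoded")` pads the truncated Base64 string and stores `hello` in `decoded`, while invalid input leaves the log untouched.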

{% tab title="UI" %}

1. Set the source attribute: Provide the attribute path that contains the encoded string, such as `encoded.base64`.
1. Select the source encoding: Choose the binary-to-text encoding of the source: `base64` or `base16/hex`.
1. For `Base16/Hex`: Choose the output format: `string (UTF-8)` or `integer`.
1. Set the target attribute: Enter the attribute path to store the decoded result.

{% image
   source="https://datadog-docs.imgix.net/images/logs/log_configuration/processor/decoder-processor.ea3c692612ee012b78b495739a3dc162.png?auto=format"
   alt="Decoder processor - Append" /%}

{% /tab %}

## Threat intel processor{% #threat-intel-processor %}

Add the threat intel processor to evaluate logs against a threat intelligence table using a specific Indicator of Compromise (IoC) key, such as an IP address. If a match is found, the log is enriched with relevant Threat Intelligence (TI) attributes from the table, which enhances detection, investigation, and response.

For more information, see [Threat Intelligence](https://docs.datadoghq.com/security/threat_intelligence/).

## OCSF processor{% #ocsf-processor %}

Use the OCSF processor to normalize your security logs according to the [Open Cybersecurity Schema Framework (OCSF)](https://docs.datadoghq.com/security/cloud_siem/ingest_and_enrich/open_cybersecurity_schema_framework/). The OCSF processor allows you to create custom mappings that remap your log attributes to OCSF schema classes and their corresponding attributes, including enumerated (ENUM) attributes.

The processor enables you to:

- Map source log attributes to OCSF target attributes
- Configure ENUM attributes with specific numerical values
- Create sub-pipelines for different OCSF target event classes
- Pre-process logs before OCSF remapping

For detailed setup instructions, configuration examples, and troubleshooting guidance, see [OCSF Processor](https://docs.datadoghq.com/security/cloud_siem/ingest_and_enrich/open_cybersecurity_schema_framework/ocsf_processor/).

## Further Reading{% #further-reading %}

- [Discover Datadog Pipelines](https://docs.datadoghq.com/logs/log_configuration/pipelines)
- [Logging without Limits*](https://docs.datadoghq.com/logs/logging_without_limits/)
- [Learn how to explore your logs](https://docs.datadoghq.com/logs/explorer/)
- [Tips and tricks: Add business data to logs from retail endpoints](https://www.youtube.com/watch?v=OztSU3JzfC8&list=PLdh-RwQzDsaM9Sq_fi-yXuzhmE7nOlqLE&index=4&t=245s)
\*Logging without Limits is a trademark of Datadog, Inc.