---
title: Grok Parser Processor
description: Parse logs in Observability Pipelines with the Grok Parser processor.
breadcrumbs: Docs > Observability Pipelines > Processors > Grok Parser Processor
---

# Grok Parser Processor

{% callout %}
# Important note for users on the following Datadog sites: app.ddog-gov.com

{% alert level="danger" %}
This product is not supported for your selected [Datadog site](https://docs.datadoghq.com/getting_started/site).
{% /alert %}

{% /callout %}
Available for: {% icon name="icon-logs" /%} Logs
## Overview{% #overview %}

{% alert level="warning" %}
To process logs with the Grok Parser processor, the logs must have a `source` field with the source name. If this field is not added when the log is sent to the Observability Pipelines Worker, you can use the [Add field](https://docs.datadoghq.com/observability_pipelines/processors/edit_fields#add-field) processor to add it.
{% /alert %}

The Grok Parser processor parses logs using the grok parsing rules available for a set of sources. If the `source` field of a log matches one of the grok parsing rule sets, the log's `message` field is checked against those rules. If a rule matches, the resulting parsed data is added in the `message` field as a JSON object, overwriting the original `message`.

If there isn't a `source` field on the log, or no rule matches the log `message`, then no changes are made to the log and it is sent to the next step in the pipeline.
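The match-or-pass-through behavior described above can be sketched in Python. This is a minimal illustration, not the Observability Pipelines Worker's actual implementation: plain named-group regexes stand in for compiled grok patterns, and the `GROK_RULES` table and its `nginx` rule are hypothetical.

```python
import re

# Hypothetical rule sets keyed by source. Each entry is a named-group
# regex standing in for a compiled grok parsing rule (illustration only).
GROK_RULES = {
    "nginx": [
        re.compile(r'(?P<ip>\S+) - - \[(?P<date>[^\]]+)\] "(?P<request>[^"]+)"'),
    ],
}

def grok_parse(log: dict) -> dict:
    """Mimic the processor: parse `message` only when the log's `source`
    has a rule set and one of its rules matches; otherwise return the
    log unchanged so it continues to the next step in the pipeline."""
    rules = GROK_RULES.get(log.get("source"))
    if not rules:
        return log
    for rule in rules:
        match = rule.match(log.get("message", ""))
        if match:
            # The parsed fields replace the original `message` as an object.
            return {**log, "message": match.groupdict()}
    return log
```

A log whose `source` is `nginx` and whose `message` matches the rule comes back with `message` replaced by the extracted fields; any other log passes through untouched.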

Datadog's Grok implementation differs from standard Grok by providing:

- Matchers that include options for how you define parsing rules
- Filters for post-processing of extracted data
- A set of built-in patterns tailored to common log formats

See [Parsing](https://docs.datadoghq.com/logs/log_configuration/parsing/) for more information on Datadog's Grok patterns.
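For example, a parsing rule combines a rule name with matchers (and optional filters). The rule below, adapted from the example in Datadog's parsing documentation, extracts a user name and a date from a log line such as `john connected on 11/08/2017`:

```text
MyParsingRule %{word:user} connected on %{date("MM/dd/yyyy"):connect_date}
```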

## Setup{% #setup %}

To set up the grok parser, define a **filter query**. Only logs that match the specified filter query are processed. All logs, regardless of whether they match the filter query, are sent to the next step in the pipeline. See [Search Syntax](https://docs.datadoghq.com/observability_pipelines/search_syntax/logs/) for more information.

To test log samples for out-of-the-box rules:

1. Click the **Preview Library Rules** button.
1. Search or select a source in the dropdown menu.
1. Enter a log sample to test the parsing rules for that source.

To add a custom parsing rule:

1. Click **Add Custom Rule**.
1. If you want to clone a library rule, select **Clone library rule** and then the library source from the dropdown menu.
1. If you want to create a custom rule, select **Custom** and then enter the `source`. The parsing rules are applied to logs with that `source`.
1. Enter log samples to test the parsing rules.
1. Enter the rules for parsing the logs. See [Parsing](https://docs.datadoghq.com/logs/log_configuration/parsing/) for more information on writing parsing rules with Datadog Grok patterns. **Note**: The `url`, `useragent`, and `csv` filters are not available.
1. Click **Advanced Settings** if you want to add helper rules. See [Using helper rules to reuse common patterns](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=matchers#using-helper-rules-to-reuse-common-patterns) for more information.
1. Click **Add Rule**.
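As a sketch of what these steps might produce, the hypothetical rules below (rule names and attribute paths are illustrative) pair a helper rule, defined under **Advanced Settings**, with a parsing rule that references it by name:

```text
# Helper rule (Advanced Settings): reusable pattern for a user name
user %{word:user.name}

# Parsing rule applied to logs with the chosen custom source
connect_rule %{user} connected on %{date("MM/dd/yyyy"):connect_date}
```

Tested against a sample such as `john connected on 11/08/2017`, the rule extracts `user.name` and `connect_date` into the parsed `message` object.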

## Further reading{% #further-reading %}

- [Simplify log collection and aggregation for MSSPs with Datadog Observability Pipelines](https://www.datadoghq.com/blog/observability-pipelines-mssp)
