---
title: Processors
description: >-
  Parse, enrich, and structure your logs using processors in Datadog Log
  Management
breadcrumbs: Docs > Log Management > Log Configuration > Processors
---

# Processors

## Overview{% #overview %}

{% alert level="info" %}
The processors outlined in this documentation are specific to cloud-based logging environments. To parse, structure, and enrich on-premises logs, see [Observability Pipelines](https://docs.datadoghq.com/observability_pipelines/processors.md).
{% /alert %}

A processor executes within a [Pipeline](https://docs.datadoghq.com/logs/log_configuration/pipelines.md) to complete a data-structuring action and generate attributes to enrich your logs.

{% image
   source="https://docs.dd-static.net/images/logs/log_configuration/processor/processor_overview.a024cfa4baa0bae398110da4c3a70c29.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/logs/log_configuration/processor/processor_overview.a024cfa4baa0bae398110da4c3a70c29.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="Processors" /%}

In [log configuration settings](https://docs.datadoghq.com/logs/log_configuration/pipelines.md), you can configure processors such as the [Grok parser](https://docs.datadoghq.com/logs/log_configuration/processors/grok_parser.md) or [date remapper](https://docs.datadoghq.com/logs/log_configuration/processors/log_date_remapper.md) to help extract, create, and remap attributes to enrich your logs and enhance faceted search.

**Notes**:

- Structured logs should be shipped in a valid format. If the structure contains characters that are invalid for parsing, strip them at the Agent level using the [mask_sequences](https://docs.datadoghq.com/agent/logs/advanced_log_collection.md?tab=configurationfile#scrub-sensitive-data-from-your-logs) feature.

- As a best practice, use at most 20 processors per pipeline.
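
For example, a `mask_sequences` rule is configured under `log_processing_rules` in a log collection configuration file. The sketch below assumes a hypothetical file-based log source (`/var/log/myapp/app.log`, service `myapp`); adapt the path, service, and pattern to your environment:

```yaml
logs:
  - type: file
    path: /var/log/myapp/app.log   # hypothetical log file
    service: myapp
    source: custom
    log_processing_rules:
      - type: mask_sequences
        name: mask_invalid_sequences
        # Replace each matched sequence before the log is shipped
        replace_placeholder: "[masked]"
        # Example pattern; replace with the characters or sequences to strip
        pattern: "user_email=[^\\s]+"
```

Because masking happens at the Agent level, the invalid or sensitive sequences never reach Datadog, so downstream processors always receive parseable content.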

## Processor types{% #processor-types %}

- [**Arithmetic Processor**: Add a new attribute to a log with the result of a formula applied to existing numeric attributes.](https://docs.datadoghq.com/logs/log_configuration/processors/arithmetic_processor.md)
- [**Array Processor**: Extract, aggregate, or transform values from JSON arrays in your logs.](https://docs.datadoghq.com/logs/log_configuration/processors/array_processor.md)
- [**Attribute Remapper**: Remap source attributes or tags to another target attribute or tag.](https://docs.datadoghq.com/logs/log_configuration/processors/remapper.md)
- [**Category Processor**: Add a new attribute to a log based on a search query match, for grouping and classification.](https://docs.datadoghq.com/logs/log_configuration/processors/category_processor.md)
- [**Decoder Processor**: Translate binary-to-text encoded fields (such as Base64 or Hex) into their original representation.](https://docs.datadoghq.com/logs/log_configuration/processors/decoder_processor.md)
- [**GeoIP Parser**: Extract continent, country, subdivision, or city information from an IP address attribute.](https://docs.datadoghq.com/logs/log_configuration/processors/geoip_parser.md)
- [**Grok Parser**: Create custom parsing rules to extract and structure data from log messages or specific attributes.](https://docs.datadoghq.com/logs/log_configuration/processors/grok_parser.md)
- [**Log Date Remapper**: Assign one or more attributes as the official timestamp for your logs.](https://docs.datadoghq.com/logs/log_configuration/processors/log_date_remapper.md)
- [**Log Message Remapper**: Assign one or more attributes as the official message for your logs.](https://docs.datadoghq.com/logs/log_configuration/processors/log_message_remapper.md)
- [**Log Status Remapper**: Assign one or more attributes as the official severity status for your logs.](https://docs.datadoghq.com/logs/log_configuration/processors/log_status_remapper.md)
- [**Lookup Processor**: Map a log attribute to a human-readable value from a Reference Table or mapping table.](https://docs.datadoghq.com/logs/log_configuration/processors/lookup_processor.md)
- [**OCSF Processor**: Normalize security logs to the Open Cybersecurity Schema Framework (OCSF).](https://docs.datadoghq.com/logs/log_configuration/processors/ocsf_processor.md)
- [**Service Remapper**: Assign one or more attributes as the official service for your logs.](https://docs.datadoghq.com/logs/log_configuration/processors/service_remapper.md)
- [**Span Remapper**: Define a correlation between application spans and logs.](https://docs.datadoghq.com/logs/log_configuration/processors/span_remapper.md)
- [**String Builder Processor**: Build a new attribute from a template of existing attributes and raw strings.](https://docs.datadoghq.com/logs/log_configuration/processors/string_builder_processor.md)
- [**Threat Intel Processor**: Enrich logs with Threat Intelligence attributes by matching against an indicator of compromise (IoC) table.](https://docs.datadoghq.com/logs/log_configuration/processors/threat_intel_processor.md)
- [**Trace Remapper**: Define a correlation between application traces and logs.](https://docs.datadoghq.com/logs/log_configuration/processors/trace_remapper.md)
- [**URL Parser**: Extract query parameters and other components from a URL attribute.](https://docs.datadoghq.com/logs/log_configuration/processors/url_parser.md)
- [**User-Agent Parser**: Parse a user-agent attribute to extract OS, browser, device, and other user data.](https://docs.datadoghq.com/logs/log_configuration/processors/user_agent_parser.md)
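
To illustrate how a processor is defined, the sketch below shows a Grok Parser processor as it might appear in a pipeline definition (for example, when created through the Logs Pipelines API). The rule name `connect_rule` and the sample log line are hypothetical:

```json
{
  "type": "grok-parser",
  "name": "Parse user connection events",
  "is_enabled": true,
  "source": "message",
  "samples": ["john connected on 11/08/2017"],
  "grok": {
    "support_rules": "",
    "match_rules": "connect_rule %{word:user} connected on %{date(\"MM/dd/yyyy\"):connect_date}"
  }
}
```

Here `source` names the attribute to parse (`message` for the raw log line), and each match rule extracts named attributes (`user`, `connect_date`) that become available for faceted search.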

## Further reading{% #further-reading %}

- [Discover Datadog Pipelines](https://docs.datadoghq.com/logs/log_configuration/pipelines.md)
- [Logging without Limits*](https://docs.datadoghq.com/logs/logging_without_limits.md)
- [Learn how to explore your logs](https://docs.datadoghq.com/logs/explorer.md)
- [Tips and tricks: Add business data to logs from retail endpoints](https://www.youtube.com/watch?v=OztSU3JzfC8&list=PLdh-RwQzDsaM9Sq_fi-yXuzhmE7nOlqLE&index=4&t=245s)
- [Build and Manage Log Pipelines](https://learn.datadoghq.com/courses/log-pipelines)
- [Process Logs Out of the Box with Integration Pipelines](https://learn.datadoghq.com/courses/integration-pipelines)
