---
title: Extractions
description: >-
  Extract values from your logs at query time using Grok patterns in the Log
  Explorer.
breadcrumbs: Docs > Log Management > Log Explorer > Calculated Fields > Extractions
---

# Extractions

{% callout %}
##### Calculated Fields Extractions is in Preview

Use Calculated Fields Extractions to extract values from your logs in the Log Explorer at query time using Grok patterns.

[Request Access](https://docs.google.com/forms/d/e/1FAIpQLSffBg9ph2zl-jTGzvgBUcXSifOjvPdRh8vJjzTMIclSB2ZLIw/viewform)
{% /callout %}

## Overview{% #overview %}

Calculated Fields Extractions lets you apply Grok parsing rules at query time in the Log Explorer, enabling you to extract values from raw log messages or attributes without modifying pipelines or re-ingesting data. You can generate extraction rules automatically with AI-powered parsing, or manually define your own Grok patterns to match your specific needs.

To create an extraction calculated field, see [Create a calculated field](https://docs.datadoghq.com/logs/explorer/calculated_fields/#create-a-calculated-field).

## Automatic parsing{% #automatic-parsing %}

Use AI-powered automatic parsing to generate Grok rules from your log data. Datadog analyzes the content of your log message and automatically generates an extraction rule, eliminating the need to manually write Grok patterns.

{% image
   source="https://datadog-docs.imgix.net/images/logs/explorer/calculated_fields/extractions/calculated_fields_parse_ai.035e9a766cf17a72695fcd1eb90adaf6.png?auto=format"
   alt="Example of AI-powered Grok parsing in Datadog Calculated Fields" /%}

There are two ways to access automatic parsing from the log side panel:

1. Click the **AI** button {% icon name="icon-bits-ai" /%} next to the copy button.
1. Highlight a specific portion of the log message and click the **AI** button {% icon name="icon-bits-ai" /%} in the popup menu.

When you click the **AI** button, Datadog automatically populates the Calculated Field form:

1. **Extract from**: Defaults to the full log message. You can change the dropdown to parse individual attributes instead.
1. **Log sample**: Automatically populated with your selected log.
1. **Parsing rule**: Automatically generated from the log sample.

Review and modify the generated rule as needed. You can edit it manually or click **Generate a new rule** for Datadog to try again. You can also modify, insert, or replace the log sample to test your rule against different log formats.
Use the thumbs up or thumbs down buttons to provide inline feedback and help improve the feature.

## Syntax{% #syntax %}

Extraction fields use Grok patterns to identify and capture values from a log attribute. A Grok pattern is composed of one or more tokens in the form:

```
%{PATTERN_NAME:field_name}
```

- `PATTERN_NAME`: A Grok matcher.
- `field_name`: The name of the extracted calculated field.

You can chain multiple patterns together to parse complex log messages.
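
For example, a hypothetical rule that chains two tokens with literal text between them:

```
user=%{word:user} latency=%{number:latency_ms}
```

Against a log line such as `user=alice latency=12.5`, this rule would extract `user` and `latency_ms` as separate calculated fields. (The field names and log format here are illustrative, not from a real pipeline.)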

## Supported matchers and filters at query time{% #supported-matchers-and-filters-at-query-time %}

{% alert level="warning" %}
Grok parsing features available at *query-time* (in the [Log Explorer](https://docs.datadoghq.com/logs/explorer/calculated_fields/)) support a limited subset of matchers (**data**, **integer**, **notSpace**, **number**, and **word**) and filters (**number** and **integer**). For long-term parsing needs, define a log pipeline.
{% /alert %}

Query-time Grok parsing in the Log Explorer supports a limited subset of matchers and filters. Each matcher or filter is used in a Grok pattern with the format:

```
%{MATCHER:field_name}
```

### Matchers{% #matchers %}

| Matcher    | Description                             | Example Grok Pattern            |
| ---------- | --------------------------------------- | ------------------------------- |
| `data`     | Any sequence of characters (non-greedy) | `status=%{data:status}`         |
| `word`     | Alphanumeric characters                 | `country=%{word:country}`       |
| `number`   | Floating-point numbers                  | `value=%{number:float_val}`     |
| `integer`  | Integer values                          | `count=%{integer:count}`        |
| `notSpace` | Non-whitespace characters               | `path=%{notSpace:request_path}` |
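
To build intuition for what each matcher captures, the subset above can be roughly approximated with plain regular expressions. The sketch below is an illustration only: the `MATCHERS` mapping and the `grok_to_regex` helper are assumptions for demonstration, not Datadog's implementation, and literal text in the rule is assumed to be regex-safe.

```python
import re

# Rough regex equivalents of the query-time matcher subset.
# These are illustrative assumptions, not Datadog's actual definitions.
MATCHERS = {
    "data": r".*?",                 # any characters, non-greedy
    "word": r"\w+",                 # alphanumeric characters
    "number": r"-?\d+(?:\.\d+)?",   # floating-point numbers
    "integer": r"-?\d+",            # integer values
    "notSpace": r"\S+",             # non-whitespace characters
}

def grok_to_regex(pattern: str) -> str:
    """Convert a %{MATCHER:field_name} Grok pattern into a regex with named groups."""
    def repl(m: re.Match) -> str:
        matcher, field = m.group(1), m.group(2)
        return f"(?P<{field}>{MATCHERS[matcher]})"
    return re.sub(r"%\{(\w+):(\w+)\}", repl, pattern)

rule = "country=%{word:country} path=%{notSpace:request_path}"
match = re.match(grok_to_regex(rule), "country=Brazil path=/index.html")
print(match.groupdict())  # {'country': 'Brazil', 'request_path': '/index.html'}
```

Each `%{MATCHER:field_name}` token becomes a named capture group, which is why chained tokens can pull several fields out of one log line.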

### Filters{% #filters %}

Apply filters to cast extracted values into numeric types. Filters use the same pattern syntax as matchers.

| Filter    | Description                        | Example Grok Pattern          |
| --------- | ---------------------------------- | ----------------------------- |
| `number`  | Parses numeric strings as numbers  | `latency=%{number:lat}`       |
| `integer` | Parses numeric strings as integers | `users=%{integer:user_count}` |

### Example{% #example %}

Use this feature to analyze log fields on demand without modifying your ingestion pipeline.

**Log line**:

```
country=Brazil duration=123ms path=/index.html status=200 OK
```

**Extraction grok rule**:

```
country=%{word:country} duration=%{integer:duration} path=%{notSpace:request_path} status=%{data:status}
```

**Resulting calculated fields**:

- `#country = Brazil`
- `#duration = 123`
- `#request_path = /index.html`
- `#status = 200 OK`

## Further reading{% #further-reading %}

- [Learn more about Calculated Fields](https://docs.datadoghq.com/logs/explorer/calculated_fields/)
