Calculated Fields Extractions is in Preview
Use Calculated Fields Extractions to extract values from your logs in the Log Explorer at query time using Grok patterns.
Overview
Calculated Fields Extractions lets you apply Grok parsing rules at query time in the Log Explorer, enabling you to extract values from raw log messages or attributes without modifying pipelines or re-ingesting data. You can generate extraction rules automatically with AI-powered parsing, or manually define your own Grok patterns to match your specific needs.
To create an extraction calculated field, see Create a calculated field.
Automatic parsing
Use AI-powered automatic parsing to generate Grok rules from your log data. Datadog analyzes the content of your log message and automatically generates an extraction rule, eliminating the need to manually write Grok patterns.
There are two ways to access automatic parsing from the log side panel:
- Click the AI button next to the copy button.
- Highlight a specific portion of the log message and click the AI button in the popup menu.
When you click the AI button, Datadog automatically populates the Calculated Field form:
- Extract from: Defaults to the full log message. You can change the dropdown to parse individual attributes instead.
- Log sample: Automatically populated with your selected log.
- Parsing rule: Automatically generated from the log sample.
Review and modify the generated rule as needed. You can edit it manually or click Generate a new rule for Datadog to try again. You can also modify, insert, or replace the log sample to test your rule against different log formats.
Use the thumbs up or thumbs down buttons to provide inline feedback and help improve the feature.
Syntax
Extraction fields use Grok patterns to identify and capture values from a log attribute. A Grok pattern is composed of one or more tokens in the form:
%{PATTERN_NAME:field_name}
- PATTERN_NAME: A Grok matcher.
- field_name: The name of the extracted calculated field.
You can chain multiple patterns together to parse complex log messages.
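Conceptually, each Grok token compiles down to a named capture group in a regular expression. The following Python sketch illustrates that mapping; the GROK_PATTERNS regexes are simplified approximations for illustration, not Datadog's actual matcher definitions:

```python
import re

# Illustrative regex equivalents for a few query-time matchers
# (approximations, not Datadog's exact definitions).
GROK_PATTERNS = {
    "word": r"\w+",
    "integer": r"-?\d+",
    "notSpace": r"\S+",
    "data": r".*?",
}

def grok_to_regex(rule: str) -> str:
    """Translate %{matcher:field_name} tokens into named capture groups."""
    return re.sub(
        r"%\{(\w+):(\w+)\}",
        lambda m: f"(?P<{m.group(2)}>{GROK_PATTERNS[m.group(1)]})",
        rule,
    )

# Two chained tokens parse two fields from one line.
regex = grok_to_regex("country=%{word:country} status=%{integer:code}")
match = re.match(regex, "country=Brazil status=200")
print(match.groupdict())  # {'country': 'Brazil', 'code': '200'}
```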
Supported matchers and filters at query time
Grok parsing features available at query time in the Log Explorer support a limited subset of matchers (data, integer, notSpace, number, and word) and filters (number and integer). For long-term parsing needs, define a log pipeline.
Each matcher or filter is used in a Grok pattern with the format %{PATTERN_NAME:field_name}.
Matchers
| Matcher | Description | Example Grok Pattern |
|---|---|---|
| data | Any sequence of characters (non-greedy) | status=%{data:status} |
| word | Alphanumeric characters | country=%{word:country} |
| number | Floating-point numbers | value=%{number:float_val} |
| integer | Integer values | count=%{integer:count} |
| notSpace | Non-whitespace characters | path=%{notSpace:request_path} |
Filters
Apply filters to cast extracted values into numeric types. Filters use the same pattern syntax as matchers.
| Filter | Description | Example Grok Pattern |
|---|---|---|
| number | Parses numeric strings as numbers | latency=%{number:lat} |
| integer | Parses numeric strings as integers | users=%{integer:user_count} |
Example
Use this feature to analyze log fields on-demand without modifying your ingestion pipeline.
Log line:
country=Brazil duration=123ms path=/index.html status=200 OK
Extraction grok rule:
country=%{word:country} duration=%{integer:duration} path=%{notSpace:request_path} status=%{data:status}
Resulting calculated fields:
#country = Brazil
#duration = 123
#request_path = /index.html
#status = 200 OK
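The extraction above can be checked with plain Python regexes, using the same illustrative matcher approximations as before (a sketch, not Datadog's parser):

```python
import re

log_line = "country=Brazil duration=123ms path=/index.html status=200 OK"

# Regex equivalent of the Grok rule above (illustrative approximations):
rule = (
    r"country=(?P<country>\w+) "       # %{word:country}
    r"duration=(?P<duration>-?\d+)"    # %{integer:duration} stops at the first non-digit
    r"ms path=(?P<request_path>\S+) "  # %{notSpace:request_path}
    r"status=(?P<status>.*)"           # %{data:status} captures the rest of the line
)

fields = re.match(rule, log_line).groupdict()
print(fields)
# {'country': 'Brazil', 'duration': '123', 'request_path': '/index.html', 'status': '200 OK'}
```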