An Observability Pipelines Worker configuration defines how the Worker collects, transforms, and routes your logs from any source to any destination. The configuration file supports YAML, TOML, and JSON formats. The three main configuration components are sources, transforms, and sinks.
Source components define how the Observability Pipelines Worker collects or receives data from observability data sources.
Create a configuration file and add the following source example, shown here in YAML, TOML, and JSON:
YAML:
sources:
  generate_syslog:
    type: demo_logs
    format: syslog
    count: 100

TOML:
[sources.generate_syslog]
type = "demo_logs"
format = "syslog"
count = 100

JSON:
"sources": {
  "generate_syslog": {
    "type": "demo_logs",
    "format": "syslog",
    "count": 100
  }
}
This source component has a unique ID of generate_syslog. This unique ID is important for transforming and routing the data with the sink component.
type is the source type from which the Observability Pipelines Worker collects observability data. This example uses a demo_logs source, which creates sample log data that enables you to simulate different types of events in various formats. The format option tells the demo_logs source which type of logs to emit, in this case Syslog format. The count option tells the demo_logs source how many lines to emit.
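Because demo_logs generates its own sample data, no external input is required. As a purely illustrative example (not literal Worker output), a Syslog-format sample line emitted into the event's message field might resemble:

```
<13>1 2024-01-01T12:00:00.000Z host1 app 1234 ID1 - An example syslog message
```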
See all supported sources in the Sources documentation.
Use the following example to define a transform component that manipulates the data collected from the demo_logs source.
YAML:
transforms:
  remap_syslog:
    inputs:
      - generate_syslog
    type: remap
    source: |
      structured = parse_syslog!(.message)
      . = merge(., structured)

TOML:
[transforms.remap_syslog]
inputs = ["generate_syslog"]
type = "remap"
source = '''
structured = parse_syslog!(.message)
. = merge(., structured)
'''

JSON:
"transforms": {
  "remap_syslog": {
    "inputs": [
      "generate_syslog"
    ],
    "type": "remap",
    "source": "  structured = parse_syslog!(.message)\n  . = merge(., structured)\n"
  }
}
In this transforms.remap_syslog component, the inputs option is set to generate_syslog, which means it receives events from the previously defined generate_syslog source. The transform's component type is remap.
The source option contains the list of remapping transformations to apply to each event that the Observability Pipelines Worker receives. In this example, only one operation, parse_syslog, is performed, but multiple operations can be added.
The parse_syslog function receives a single field called message, which contains the Syslog event generated by the generate_syslog source. This function parses the content of the Syslog-formatted message and emits it as a structured event.
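To illustrate the effect (the field names follow common Syslog conventions, and the values here are hypothetical, not literal Worker output): an event whose message field holds a raw Syslog line is expanded into structured fields by parse_syslog!, and merge then folds those fields back into the event.

```
Before: {"message": "<13>1 2024-01-01T12:00:00Z host1 app 1234 - - Hello world"}
After:  {"message": "Hello world", "hostname": "host1", "appname": "app", "procid": 1234, "timestamp": "2024-01-01T12:00:00Z"}
```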
This transform example showcases only a portion of the Observability Pipelines Worker's ability to shape and transform your data. See the Transforms documentation for all supported transforms, including sampling, filtering, enrichment, and more.
With the data parsed in the transform component, use the following sink example to route the data to a destination.
YAML:
sinks:
  emit_syslog:
    inputs:
      - remap_syslog
    type: console
    encoding:
      codec: json

TOML:
[sinks.emit_syslog]
inputs = ["remap_syslog"]
type = "console"
[sinks.emit_syslog.encoding]
codec = "json"

JSON:
"sinks": {
  "emit_syslog": {
    "inputs": [
      "remap_syslog"
    ],
    "type": "console",
    "encoding": {
      "codec": "json"
    }
  }
}
This sink (or destination) component has the ID of emit_syslog. The inputs option specifies that the events generated by the remap_syslog transform are processed with this sink. The encoding option tells the sink to emit the events in JSON format.
See the Sinks documentation for all supported sinks.
With these three basic components (a source, a transform, and a sink), you now have a working Observability Pipelines configuration file:
YAML:
sources:
  generate_syslog:
    type: demo_logs
    format: syslog
    count: 100

transforms:
  remap_syslog:
    inputs:
      - generate_syslog
    type: remap
    source: |
      structured = parse_syslog!(.message)
      . = merge(., structured)

sinks:
  emit_syslog:
    inputs:
      - remap_syslog
    type: console
    encoding:
      codec: json

TOML:
[sources.generate_syslog]
type = "demo_logs"
format = "syslog"
count = 100

[transforms.remap_syslog]
inputs = ["generate_syslog"]
type = "remap"
source = '''
structured = parse_syslog!(.message)
. = merge(., structured)
'''

[sinks.emit_syslog]
inputs = ["remap_syslog"]
type = "console"
[sinks.emit_syslog.encoding]
codec = "json"

JSON:
{
  "sources": {
    "generate_syslog": {
      "type": "demo_logs",
      "format": "syslog",
      "count": 100
    }
  },
  "transforms": {
    "remap_syslog": {
      "inputs": [
        "generate_syslog"
      ],
      "type": "remap",
      "source": "  structured = parse_syslog!(.message)\n  . = merge(., structured)\n"
    }
  },
  "sinks": {
    "emit_syslog": {
      "inputs": [
        "remap_syslog"
      ],
      "type": "console",
      "encoding": {
        "codec": "json"
      }
    }
  }
}
Run the following command to compile and run this configuration:
vector --config ./<configuration_filename>
If the setup is successful, the parsed demo logs are printed in JSON format.
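As a rough illustration only (the demo_logs source generates randomized sample content, so your actual output differs), each printed event is a single JSON object along these lines:

```
{"appname": "app", "hostname": "host1", "message": "An example syslog message", "severity": "notice", "timestamp": "2024-01-01T12:00:00Z"}
```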