Events

Getting started

Events are records of notable changes relevant for managing and troubleshooting IT operations, such as code deployments, service health, configuration changes, or monitoring alerts.

Datadog Events gives you a consolidated interface to search, analyze, and filter events from any source in one place.

Without any additional setup, Datadog Events automatically gathers events that are collected by the Agent and installed integrations.

More than 100 Datadog integrations support event collection, including Kubernetes, Docker, Jenkins, Chef, Puppet, Amazon ECS, AWS Auto Scaling, Sentry, and Nagios.

Sending custom events to Datadog

You can also submit your own custom events using the Datadog API, a custom Agent check, DogStatsD, or the Events Email API.
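As a minimal sketch of the API route, the snippet below builds the JSON body that the v1 Events endpoint (`POST /api/v1/events`) accepts. It uses only the standard library; the field names shown (`title`, `text`, `tags`, `alert_type`, `priority`) are the commonly used ones, and the `build_event_payload` helper is ours, not part of any Datadog client.

```python
import json

def build_event_payload(title, text, tags=None, alert_type="info", priority="normal"):
    """Build the JSON body for Datadog's v1 Events API (POST /api/v1/events).

    alert_type is one of "error", "warning", "info", "success";
    priority is "normal" or "low".
    """
    return json.dumps({
        "title": title,
        "text": text,
        "tags": tags or [],
        "alert_type": alert_type,
        "priority": priority,
    })

# Example: record a deployment event for a hypothetical "web-store" service
body = build_event_payload(
    "Deploy finished",
    "web-store v2.4.1 deployed to production",
    tags=["service:web-store", "env:production"],
)
```

You would then POST `body` to the endpoint with your `DD-API-KEY` header set, for example with `requests` or `urllib.request`.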

Exploring Datadog Events

Events stream

The Datadog Events Stream shows a real-time view of events from your infrastructure and services, helping you troubleshoot issues happening now or in the past.

The event stream displays the most recent events generated by your infrastructure and the associated monitors.

Events explorer and analytics

The Events Explorer and the features listed below are in private beta. To request access, contact Datadog Support.

Use the Events Explorer to aggregate and view events coming into Datadog. Group or filter events by attribute and graphically represent them with event analytics. Use the query syntax to filter events using Boolean and wildcard operators.
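To make the filtering behavior concrete, here is a small stdlib-only sketch of how a wildcard tag filter combined with a Boolean AND behaves, in the spirit of an Explorer query like `service:web-* AND env:prod`. The `matches` helper and the sample events are illustrative, not part of the Datadog query engine.

```python
import fnmatch

def matches(event_tags, key, pattern):
    """True if any `key:value` tag on the event has a value matching
    the wildcard pattern (e.g. key="service", pattern="web-*")."""
    return any(
        fnmatch.fnmatch(value, pattern)
        for k, value in (tag.split(":", 1) for tag in event_tags)
        if k == key
    )

events = [
    {"title": "deploy finished", "tags": ["service:web-store", "env:prod"]},
    {"title": "pod OOM-killed",  "tags": ["service:db", "env:prod"]},
]

# Boolean AND of two tag filters, the first using a wildcard
hits = [
    e for e in events
    if matches(e["tags"], "service", "web-*") and matches(e["tags"], "env", "prod")
]
```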

Events as a source in dashboard widgets

You can use events as a data source in graph widgets. You can build timeseries, tables, and top list widgets of your event search queries.

For example, check out the Monitor Notifications Overview dashboard, which analyzes monitor alert event trends to help you improve your configuration and reduce alert fatigue.

Generate custom metrics from events

Generate metrics with 15-month retention from any event search query to track historical event trends and alert on them.

Normalize and enrich events with processing pipelines

A processor runs data-structuring actions on event attributes when they are ingested. A pipeline is composed of one or more processors executed sequentially. With event processing pipelines, you can:

  • Normalize disparate sources of events by remapping attributes. For example, use the same reserved service tags everywhere.
  • Enrich events with external data saved in an enrichment table (beta). For example, map a service name with your service directory to enrich events with team ownership information, links to dashboards, or links to documentation.
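The two pipeline actions above can be sketched as plain functions: one remaps attribute keys so disparate sources use the same reserved tags, the other merges fields from an enrichment table keyed on the service name. The function names, the `svc` source attribute, and the service directory shown are all hypothetical, assumed only for illustration.

```python
def remap_attributes(event, mapping):
    """Return a copy of the event with attribute keys renamed per `mapping`;
    keys not in the mapping are kept as-is."""
    return {mapping.get(k, k): v for k, v in event.items()}

def enrich(event, table, key="service"):
    """Merge extra fields from an enrichment table, looked up by `key`."""
    extra = table.get(event.get(key), {})
    return {**event, **extra}

# Normalize: a source that emits "svc" instead of the reserved "service" tag
raw = {"svc": "web-store", "msg": "deploy finished"}
normalized = remap_attributes(raw, {"svc": "service", "msg": "message"})

# Enrich: a hypothetical service directory mapping services to owning teams
directory = {"web-store": {"team": "storefront"}}
enriched = enrich(normalized, directory)
```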

We are working to support more processor types. For more details, contact Support.

Learn more about processing pipelines.

Further Reading

Additional helpful documentation, links, and articles: