Why do my logs not have the expected timestamp?


By default, Datadog generates a timestamp and appends it in a date attribute when logs are received on the intake API.

However, this default timestamp does not always reflect the actual value that might be contained in the log itself. This article describes how to override the default timestamp.

Example of log with timestamp
  1. Displayed timestamp. The first thing to understand is how the log timestamp (visible in the Log Explorer and in the top section of the log's contextual panel) is generated.

    Timestamps are stored in UTC and displayed in the user's local timezone. In the above screenshot, the local profile is set to UTC+1, so the log was actually received at 11:06:16.807 UTC.

    Check your user settings to see whether an unexpected timestamp is simply caused by an incorrect timezone on your profile:

    User setting
    If the timezone is correct but the timestamp is still wrong, you can extract the timestamp from the log message itself and use it to override the log date, for both raw and JSON logs.
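    For illustration, the examples in the next section assume a raw log whose message carries its own timestamp, in a hypothetical format such as:

        2019-02-14 11:06:16.807 INFO Connection established to 10.0.0.1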

  2. Raw logs.

    2.1 Extract the timestamp value with a parser. While writing a parsing rule for your logs, extract the timestamp into a dedicated attribute. See the date parsing examples. For the above log, you would use the following rule with the date() matcher to extract the date and pass it into a custom date attribute:

    Parsing date
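    For the hypothetical log shown earlier, a rule along the following lines would work; the rule name, the date pattern, and the severity and msg attributes are assumptions to adapt to your actual format:

        MyParsingRule %{date("yyyy-MM-dd HH:mm:ss.SSS"):date} %{word:severity} %{data:msg}

    The date("yyyy-MM-dd HH:mm:ss.SSS") matcher parses the leading timestamp and stores it in the date attribute, while severity and msg capture the rest of the line.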

    2.2 Define a Log Date Remapper. The date attribute now holds the extracted timestamp value. Add a Log Date Remapper to make sure this value overrides the official log timestamp.

    Log date remapper
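    If you manage pipelines through the Logs Pipelines API rather than the UI, the remapper corresponds to a processor of type date-remapper. A minimal sketch (the processor name is an arbitrary choice):

        {
          "type": "date-remapper",
          "name": "Define date as the official timestamp",
          "is_enabled": true,
          "sources": ["date"]
        }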
    Note: Any modification to a pipeline only impacts new logs, as all processing is done at ingestion. The following log, generated at 06:01:03 EST (which corresponds to 11:01:03 UTC), is correctly displayed as 12:01:03 because the display timezone is UTC+1 in this case.
    Log post processing with new timestamp

  3. JSON logs.

    3.1 Supported Date formats. JSON logs are automatically parsed in Datadog. The log date attribute is one of the reserved attributes in Datadog, which means that JSON logs using these attributes have their values treated specially, in this case to derive the log's date. You can change the default remapping for these attributes at the top of your pipeline. Imagine that the actual timestamp of the log is contained in the attribute mytimestamp.

    log with mytimestamp attribute
    To make sure this attribute's value is used to override the log date, add it to the list of date attributes. The date remapper looks for each of the reserved attributes in the order in which they are configured in the reserved attribute mapping, so to ensure the date comes from the mytimestamp attribute, place it first in the list. Note: Any modification to the pipeline only impacts new logs, as all processing is done at ingestion. The remapping only works with specific date formats: ISO 8601, UNIX (the millisecond EPOCH format), and RFC 3164, as illustrated below. If your format is not one of these (and your logs therefore still do not have the right timestamp), see the next section.
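    For example, each of the following hypothetical JSON logs carries mytimestamp in a recognized format (ISO 8601, millisecond epoch, and RFC 3164, respectively):

        {"message": "user logged in", "mytimestamp": "2019-02-14T11:06:16.807Z"}
        {"message": "user logged in", "mytimestamp": 1550142376807}
        {"message": "user logged in", "mytimestamp": "Feb 14 11:06:16"}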

    3.2 Custom Date format. If the format is not supported by the remapper by default, parse it and convert it to a supported format. To do this, use a Grok parser processor that applies only to the mytimestamp attribute. If you do not yet have a pipeline filtered on those logs, create one and add the processor. Note: Under the processor's advanced settings, set it to apply only to the custom mytimestamp attribute.

    Advanced settings date Processor
    Then define the right parsing rule for your date format (see the date parsing examples), and add a Log Date Remapper so that new logs get the correct timestamp.
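    For example, if mytimestamp held a value such as 14/02/2019 11:06:16 (an assumed, unsupported format), a rule along these lines, applied to the mytimestamp attribute, would convert it into a date attribute that the remapper can then use:

        MyDateRule %{date("dd/MM/yyyy HH:mm:ss"):date}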
    Pipeline example
    Log post processing after previous pipeline