---
title: Logs Not Showing the Expected Timestamp
description: Learn how to override the default log timestamp in Datadog with the timestamp contained in the log itself.
breadcrumbs: Docs > Log Management > Logs Guides > Logs Not Showing the Expected Timestamp
---

# Logs Not Showing the Expected Timestamp

By default, when logs are received by the Datadog intake API, a timestamp is generated and appended as a date attribute. However, this default timestamp does not always reflect the actual timestamp that might be contained in the log itself. This guide walks you through how to override the default timestamp with the actual timestamp.

{% image
   source="https://docs.dd-static.net/images/logs/guide/log_timestamp_1.9fd3d285e8d9d31470629cdabc722135.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/logs/guide/log_timestamp_1.9fd3d285e8d9d31470629cdabc722135.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="Log panel showing the log timestamp that is different from the timestamp in the message" /%}

## Displayed timestamp{% #displayed-timestamp %}

The log timestamp is located at the top section of the log panel. Timestamps are stored in UTC and displayed in the user's local timezone. In the above screenshot, the local profile is set to `UTC+1`, therefore the time the log was received is `11:06:16.807 UTC`.

The timestamp may not show the expected value because the timezone is incorrectly set. To check if this is the case, go to [Preferences](https://app.datadoghq.com/account/preferences) and look at the **Time zone** section.

If the timezone is correct, extract the timestamp from the message to override the log timestamp being shown.

## Raw logs{% #raw-logs %}

If your raw logs are not showing the expected timestamp in Datadog, extract the correct log timestamp from the raw logs and then remap it.

### Extract the timestamp value with a parser{% #extract-the-timestamp-value-with-a-parser %}

1. Navigate to [Logs Pipelines](https://app.datadoghq.com/logs/pipelines/) and click on the pipeline processing the logs.
1. Click **Add Processor**.
1. Select **Grok Parser** for the processor type.
1. Use the [date() matcher](https://docs.datadoghq.com/logs/log_configuration/parsing.md) to extract the date and pass it into a custom date attribute. See the below example, as well as [parsing dates examples](https://docs.datadoghq.com/logs/log_configuration/parsing.md#parsing-dates), for details.

For a log example like this:

```
2017-12-13 11:01:03 EST | INFO | (tagger.go:80 in Init) | starting the tagging system
```

Add a parsing rule like:

```
MyParsingRule %{date("yyyy-MM-dd HH:mm:ss z"):date} \| %{word:severity} \| \(%{notSpace:logger.name}:%{integer:logger.line}[^)]*\) \|.*
```

The output for `MyParsingRule`'s extraction:

```
{
  "date": 1513180863000,
  "logger": {
    "line": 80,
    "name": "tagger.go"
  },
  "severity": "INFO"
}
```

The extracted timestamp is stored in the `date` attribute as epoch milliseconds.
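As an illustration outside the pipeline, the following hypothetical Python sketch reproduces what the `date()` matcher computes for the sample log above: the `EST` timestamp converted to epoch milliseconds.

```python
from datetime import datetime, timezone, timedelta

# EST is UTC-5; modeled here as a fixed offset for this example.
EST = timezone(timedelta(hours=-5), "EST")

raw = "2017-12-13 11:01:03"
dt = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S").replace(tzinfo=EST)

# Milliseconds since the Unix epoch, as stored in the `date` attribute.
epoch_ms = int(dt.timestamp() * 1000)
print(epoch_ms)  # 1513180863000
```

This matches the `"date": 1513180863000` value shown in the extraction output above.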

### Define a Log Date Remapper{% #define-a-log-date-remapper %}

Add a [Log Date Remapper](https://docs.datadoghq.com/logs/log_configuration/processors.md?tabs=ui#log-date-remapper) to make sure that the value of the `date` attribute overrides the current log timestamp.

1. Navigate to [Logs Pipelines](https://app.datadoghq.com/logs/pipelines/) and click on the pipeline processing the logs.
1. Click **Add Processor**.
1. Select **Date remapper** as the processor type.
1. Enter a name for the processor.
1. Add `date` to the **Set date attribute(s)** section.
1. Click **Create**.

The following log, generated at `06:01:03 EST` (which corresponds to `11:01:03 UTC`), is correctly displayed as `12:01:03` (the display timezone is `UTC+1` in this case).

{% image
   source="https://docs.dd-static.net/images/logs/guide/log_timestamp_5.ea0283a5bf36688944d943b280e0c76d.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/logs/guide/log_timestamp_5.ea0283a5bf36688944d943b280e0c76d.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="Log panel showing the correct timestamp" /%}

**Note**: Any modification to a pipeline only impacts new logs, because all processing is done at ingestion.

## JSON logs{% #json-logs %}

JSON logs are automatically parsed in Datadog. The log `date` attribute is a [reserved attribute](https://docs.datadoghq.com/logs/log_configuration/pipelines.md?tab=date#preprocessing), so it goes through preprocessing operations for JSON logs.

In the below example, the actual timestamp of the log is the value of the `mytimestamp` attribute and not the log timestamp `Dec 13, 2017 at 14:16:45.158`.

{% image
   source="https://docs.dd-static.net/images/logs/guide/log_timestamp_6.b89c1385ded6f5d55f55ef4ebfbbeebf.png?auto=format&fit=max&w=850 1x, https://docs.dd-static.net/images/logs/guide/log_timestamp_6.b89c1385ded6f5d55f55ef4ebfbbeebf.png?auto=format&fit=max&w=850&dpr=2 2x"
   alt="Log panel showing the log timestamp which is different from the mytimestamp attribute value in the message" /%}
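Such a JSON log could look like the following (attribute names and values are hypothetical):

```json
{
  "message": "user login succeeded",
  "mytimestamp": "2017-12-13T19:16:45.158Z",
  "service": "auth"
}
```

Because the log is JSON, `mytimestamp` is automatically parsed into a standalone attribute, but it does not override the displayed timestamp until it is declared as a date attribute.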

### Supported date formats{% #supported-date-formats %}

To make sure the `mytimestamp` attribute value overrides the current log timestamp being shown, you must add it as a date attribute.

1. Go to your [Logs Pipeline](https://app.datadoghq.com/logs/pipelines/).
1. Hover over **Preprocessing for JSON logs** and click the pencil icon.
1. Add `mytimestamp` to the list of date attributes. The date remapper looks for each of the reserved attributes in the order they are listed. To ensure the date comes from the `mytimestamp` attribute, place it first in the list.
1. Click **Save**.

There are specific date formats to follow for the remapping to work. The recognized date formats are: [ISO8601](https://www.iso.org/iso-8601-date-and-time-format.html), [UNIX (the milliseconds EPOCH format)](https://en.wikipedia.org/wiki/Unix_time), and [RFC3164](https://www.ietf.org/rfc/rfc3164.txt).
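For reference, this hypothetical Python sketch produces the same instant in two of the recognized formats, ISO8601 and UNIX epoch milliseconds:

```python
from datetime import datetime, timezone

# A hypothetical log event time.
dt = datetime(2017, 12, 13, 16, 1, 3, tzinfo=timezone.utc)

# ISO8601, recognized by the date remapper.
iso8601 = dt.strftime("%Y-%m-%dT%H:%M:%SZ")

# UNIX epoch in milliseconds, also recognized.
unix_ms = int(dt.timestamp() * 1000)

print(iso8601)  # 2017-12-13T16:01:03Z
print(unix_ms)  # 1513180863000
```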

If a different date format is being used, see [Custom date format](#custom-date-format).

**Note**: Any modification to the pipeline only impacts new logs, because all processing is done at ingestion.

### Custom date format{% #custom-date-format %}

If the date format is not supported by the remapper by default, you can parse the date using a [Grok parser](https://docs.datadoghq.com/logs/log_configuration/processors.md?tabs=ui#grok-parser) and then convert it to a supported format.

1. Go to the [Pipeline](https://app.datadoghq.com/logs/pipelines/) that is processing the logs. If you do not have a Pipeline configured for those logs yet, create a new Pipeline for it.
1. Click **Add Processor**.
1. Select **Grok Parser** for the processor type.
1. Define the parsing rule based on your date format. See these [parsing dates examples](https://docs.datadoghq.com/logs/log_configuration/parsing.md#parsing-dates) for details.
1. In the Advanced Settings section, add `mytimestamp` to the `Extract from` section so that this parser is applied only to the custom `mytimestamp` attribute.
1. Click **Create**.
1. Add a [Log Date Remapper](https://docs.datadoghq.com/logs/log_configuration/processors.md?tabs=ui#log-date-remapper) to map the correct timestamp to the new logs.
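The kind of conversion the Grok parser performs in these steps can be sketched standalone in Python (format and values are hypothetical): an unsupported date format is normalized to ISO8601, which the date remapper recognizes.

```python
from datetime import datetime

# Hypothetical unsupported format, e.g. Apache-style access-log dates.
raw = "13/Dec/2017:16:01:03 +0000"
dt = datetime.strptime(raw, "%d/%b/%Y:%H:%M:%S %z")

# Normalized to ISO8601, one of the formats the date remapper recognizes.
iso8601 = dt.isoformat()
print(iso8601)  # 2017-12-13T16:01:03+00:00
```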

- [Learn how to process your logs](https://docs.datadoghq.com/logs/log_configuration/processors.md)
- [Learn more about parsing](https://docs.datadoghq.com/logs/log_configuration/parsing.md)
- [How to investigate a log parsing issue?](https://docs.datadoghq.com/logs/faq/how-to-investigate-a-log-parsing-issue.md)
