---
title: Go Log Collection
description: Collect Go logs by writing JSON logs to a file and tailing that file with the Datadog Agent.
---

# Go Log Collection

To send your Go logs to Datadog, log to a file and then [tail](https://docs.datadoghq.com/glossary/#tail) that file with your Datadog Agent. You can use the following setup with [logrus](https://github.com/sirupsen/logrus), an open source logging library.

Datadog strongly encourages setting up your logging library to produce your logs in JSON to avoid the need for [custom parsing rules](https://docs.datadoghq.com/logs/log_configuration/parsing).

## Configure your logger{% #configure-your-logger %}

For a classic Go configuration, open a `main.go` file and paste in the following code:

```go
package main

import (
    log "github.com/sirupsen/logrus"
)

func main() {
    // Use the JSON formatter so attributes can be parsed automatically.
    log.SetFormatter(&log.JSONFormatter{})

    // Log an event as usual with logrus.
    log.WithFields(log.Fields{"string": "foo", "int": 1, "float": 1.1}).Info("My first event from golang to stdout")
}
```

You can add metadata to any log by passing the fields you want to see in the log event as a JSON object.

This metadata can include `hostname`, `username`, `customers`, `metric`, or any other information that helps you troubleshoot and understand what is happening in your Go application.

```go
package main

import (
    log "github.com/sirupsen/logrus"
)

func main() {
    // Use the JSON formatter so attributes can be parsed automatically.
    log.SetFormatter(&log.JSONFormatter{})

    // Log an event with logrus.
    log.WithFields(log.Fields{"string": "foo", "int": 1, "float": 1.1}).Info("My first event from golang to stdout")

    // For metadata, a common pattern is to create a contextual logger once
    // and reuse its fields across logging statements.
    contextualizedLog := log.WithFields(log.Fields{
        "hostname": "staging-1",
        "appname":  "foo-app",
        "session":  "1ce3f6v",
    })

    contextualizedLog.Info("Simple event with global metadata")
}
```

## Configure your Datadog Agent{% #configure-your-datadog-agent %}

Once [log collection is enabled](https://docs.datadoghq.com/agent/logs/?tab=tailfiles#activate-log-collection), set up [custom log collection](https://docs.datadoghq.com/agent/logs/?tab=tailfiles#custom-log-collection) to tail your log files and send new logs to Datadog.

1. Create a `go.d/` folder in the `conf.d/` [Agent configuration directory](https://docs.datadoghq.com/agent/configuration/agent-configuration-files/?tab=agentv6v7#agent-configuration-directory).

1. Create a `conf.yaml` file in `go.d/` with the following content:

   ```yaml
   ## Log section
   logs:

     - type: file
       path: "<path_to_your_go_log>.log"
       service: <service_name>
       source: go
       sourcecategory: sourcecode
   ```

1. [Restart the Agent](https://docs.datadoghq.com/agent/configuration/agent-commands/?tab=agentv6v7#restart-the-agent).

1. Run the [Agent's status subcommand](https://docs.datadoghq.com/agent/configuration/agent-commands/?tab=agentv6v7#agent-status-and-information) and look for `go` under the `Checks` section to confirm logs are successfully submitted to Datadog.

If logs are in JSON format, Datadog automatically [parses the log messages](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=matchers) to extract log attributes. Use the [Log Explorer](https://docs.datadoghq.com/logs/explorer/#overview) to view and troubleshoot your logs.

## Connect logs and traces{% #connect-logs-and-traces %}

If APM is enabled for this application, the correlation between application logs and traces can be improved by following the [APM Go logging documentation](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/go/) to automatically add trace and span IDs in your logs.

## Best practices{% #best-practices %}

- Give each logger a name that corresponds to the relevant functionality or service.
- Use the `DEBUG`, `INFO`, `WARNING`, and `FATAL` log levels. In Datadog, Go's `FATAL` maps to a severity level of `Emergency`.
- Start by logging the most important information, then expand the comprehensiveness of your logging in later iterations.
- Use metadata to add context to any log. This lets you quickly filter by users, customers, business-centric attributes, and so on.

## Further Reading{% #further-reading %}

- [How to collect, standardize, and centralize Golang logs](https://www.datadoghq.com/blog/go-logging/)
- [Learn how to process your logs](https://docs.datadoghq.com/logs/log_configuration/processors)
- [Learn more about parsing](https://docs.datadoghq.com/logs/log_configuration/parsing)
- [Learn how to explore your logs](https://docs.datadoghq.com/logs/explorer/)
- [Perform Log Analytics](https://docs.datadoghq.com/logs/explorer/#visualize)
- [Log Collection Troubleshooting Guide](https://docs.datadoghq.com/logs/faq/log-collection-troubleshooting-guide/)
- [Glossary entry for "tail"](https://docs.datadoghq.com/glossary/#tail)
