Python log collection



Use your favorite Python logger to log into a file on your host. Then monitor the file with the Datadog Agent to send your logs to Datadog.


Python logs can be complex to handle, mainly because of tracebacks: they span multiple lines, which makes them difficult to associate with the original log event. To address this, it is strongly recommended to use a JSON formatter when logging in order to:

  • Ensure each stack_trace is properly wrapped into the correct log
  • Ensure all the attributes of a log event are properly extracted (severity, logger name, thread name, etc.)
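To illustrate the idea, here is a minimal stdlib-only sketch of such a formatter (the class name `MinimalJsonFormatter` is hypothetical; the libraries shown below do this more completely):

```python
import json
import logging

# Illustrative sketch only: a JSON formatter built on the stdlib.
# Libraries such as json_log_formatter or python-json-logger provide
# more complete implementations of the same idea.
class MinimalJsonFormatter(logging.Formatter):
    def format(self, record):
        payload = {
            'message': record.getMessage(),
            'levelname': record.levelname,
            'name': record.name,
            'threadName': record.threadName,
        }
        if record.exc_info:
            # The whole traceback is serialized into one JSON string,
            # so the multi-line stack trace stays inside a single log line.
            payload['stack_trace'] = self.formatException(record.exc_info)
        return json.dumps(payload)
```

With this formatter attached to a handler, a logged exception produces one line of JSON whose `stack_trace` attribute holds the full traceback, instead of spilling the traceback across several raw lines.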

Setup examples are given below for the JSON-log-formatter and python-json-logger libraries.

Inject trace IDs in your logs

If APM is enabled for this application, improve the correlation between application logs and traces by automatically injecting trace and span IDs into your logs.

Once this is done, the log should have the following format:

2019-01-07 15:20:15,972 DEBUG [] [] [dd.trace_id=5688176451479556031 dd.span_id=4663104081780224235] - this is an example
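As an illustrative, stdlib-only sketch of that format (the filter and its hard-coded IDs are placeholders; in a real setup ddtrace populates `dd.trace_id` and `dd.span_id` for you):

```python
import logging

# Sketch only: a logging.Filter that stamps placeholder trace/span IDs
# onto each record. In a real application, ddtrace injects the actual
# dd.trace_id / dd.span_id values.
class PlaceholderTraceFilter(logging.Filter):
    def filter(self, record):
        record.__dict__['dd.trace_id'] = 5688176451479556031  # placeholder
        record.__dict__['dd.span_id'] = 4663104081780224235   # placeholder
        return True

FORMAT = ('%(asctime)s %(levelname)s '
          '[dd.trace_id=%(dd.trace_id)s dd.span_id=%(dd.span_id)s] '
          '- %(message)s')

handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter(FORMAT))
logger = logging.getLogger('trace_demo')
logger.addHandler(handler)
logger.addFilter(PlaceholderTraceFilter())
logger.setLevel(logging.DEBUG)

logger.debug('this is an example')
```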

Then configure the Datadog Agent to collect python logs from the file.

Log into a file

Usage example with JSON-log-formatter:

    import logging

    import json_log_formatter

    formatter = json_log_formatter.JSONFormatter()

    json_handler = logging.FileHandler(filename='/var/log/my-log.json')
    json_handler.setFormatter(formatter)

    logger = logging.getLogger('my_json')
    logger.addHandler(json_handler)
    logger.setLevel(logging.INFO)

'Sign up', extra={'referral_code': '52d6ce'})

The log file contains the following log record (inline):

  "message": "Sign up",
  "time": "2015-09-01T06:06:26.524448",
  "referral_code": "52d6ce"

Usage example with Python-json-logger:

    import logging
    from pythonjsonlogger import jsonlogger

    logger = logging.getLogger()

    logHandler = logging.StreamHandler()
    formatter = jsonlogger.JsonFormatter()
    logHandler.setFormatter(formatter)
    logger.addHandler(logHandler)

Once the handler is configured, the log file contains the following log record (inline):

  "threadName": "MainThread",
  "name": "root",
  "thread": 140735202359648,
  "created": 1336281068.506248,
  "process": 41937,
  "processName": "MainProcess",
  "relativeCreated": 9.100914001464844,
  "module": "tests",
  "funcName": "testFormatKeys",
  "levelno": 20,
  "msecs": 506.24799728393555,
  "pathname": "tests/",
  "lineno": 60,
  "asctime": ["12-05-05 22:11:08,506248"],
  "message": "testing logging format",
  "filename": "",
  "levelname": "INFO",
  "special": "value",
  "run": 12

Configure the Datadog Agent

Create a file conf.yaml in the Agent’s conf.d/python.d/ directory with the following content:



## Log section
logs:

  - type: file
    path: "<PATH_TO_PYTHON_LOG>.log"
    service: "<YOUR_APPLICATION>"
    source: python
    sourcecategory: sourcecode
    # For multiline logs, if they start with a date in the yyyy-mm-dd format, uncomment the following processing rule:
    # log_processing_rules:
    #   - type: multi_line
    #     name: new_log_start_with_date
    #     pattern: \d{4}\-(0?[1-9]|1[012])\-(0?[1-9]|[12][0-9]|3[01])
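To sanity-check the multiline pattern above, the same regular expression can be exercised directly in Python:

```python
import re

# The multi_line pattern from the config above: a new log event is
# expected to start with a yyyy-mm-dd date.
pattern = re.compile(r'\d{4}\-(0?[1-9]|1[012])\-(0?[1-9]|[12][0-9]|3[01])')

# A line that begins a new log event matches the pattern...
print(bool(pattern.match('2019-01-07 15:20:15,972 DEBUG - this is an example')))
# ...while a traceback continuation line does not, so the Agent appends
# it to the previous event.
print(bool(pattern.match('Traceback (most recent call last):')))
```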

Restart the Agent to apply the configuration changes.

Further Reading