---
title: Instrumenting a Python Container App In-Container
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: >-
  Docs > Serverless > Azure Container Apps > In-Container Instrumentation >
  Instrumenting a Python Container App In-Container
---

# Instrumenting a Python Container App In-Container

## Setup{% #setup %}

1. **Install the Datadog Python tracer**.

Add `ddtrace` to your `requirements.txt` or `pyproject.toml`. You can find the latest version on [PyPI](https://pypi.org/project/ddtrace/):

In `requirements.txt`:

```text
ddtrace==<VERSION>
```

Alternatively, you can install the tracer in your Dockerfile:

In your `Dockerfile`:

```dockerfile
RUN pip install ddtrace
```

Then, wrap your start command with `ddtrace-run`:

```dockerfile
CMD ["ddtrace-run", "python", "app.py"]
```

For more information, see [Tracing Python applications](https://docs.datadoghq.com/tracing/trace_collection/automatic_instrumentation/dd_libraries/python).

1. **Install serverless-init**.

Datadog publishes new releases of the `serverless-init` container image to Google's gcr.io, AWS's ECR, and Docker Hub:

| hub.docker.com          | gcr.io                           | public.ecr.aws                         |
| ----------------------- | -------------------------------- | -------------------------------------- |
| datadog/serverless-init | gcr.io/datadoghq/serverless-init | public.ecr.aws/datadog/serverless-init |

Images are tagged based on semantic versioning, with each new version receiving three relevant tags:

   - `1`, `1-alpine`: use these to track the latest minor releases, without breaking changes
   - `1.x.x`, `1.x.x-alpine`: use these to pin to a precise version of the library
   - `latest`, `latest-alpine`: use these to follow the latest version release, which may include breaking changes

Add the following instructions and arguments to your Dockerfile.

   ```dockerfile
   COPY --from=datadog/serverless-init:<YOUR_TAG> /datadog-init /app/datadog-init
   ENTRYPOINT ["/app/datadog-init"]
   CMD ["ddtrace-run", "python", "path/to/your/python/app.py"]
   ```

   {% collapsible-section %}
   **Alternative configuration:** Datadog expects `serverless-init` to be the top-level application, with the rest of your app's command line passed in for `serverless-init` to execute.

If you already have an entrypoint defined inside your Dockerfile, you can instead modify the CMD argument.

   ```dockerfile
   CMD ["/app/datadog-init", "ddtrace-run", "python", "path/to/your/python/app.py"]
   ```

If you require your entrypoint to be instrumented as well, you can instead swap your entrypoint and CMD arguments.

   ```dockerfile
   ENTRYPOINT ["/app/datadog-init"]
   CMD ["/your_entrypoint.sh", "ddtrace-run", "python", "path/to/your/python/app.py"]
   ```

As long as your command to run is passed as an argument to `datadog-init`, you will receive full instrumentation.
   {% /collapsible-section %}
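
Putting the tracer and `serverless-init` steps together, a minimal Dockerfile might look like the following sketch (the base image, file layout, and `app.py` entry point are illustrative assumptions):

```dockerfile
FROM python:3.12-slim

WORKDIR /app

# Install dependencies, including ddtrace
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the serverless-init binary from Datadog's published image
COPY --from=datadog/serverless-init:1 /datadog-init /app/datadog-init

COPY . .

# serverless-init wraps the traced start command
ENTRYPOINT ["/app/datadog-init"]
CMD ["ddtrace-run", "python", "app.py"]
```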

1. **Set up logs**.

To enable logging, set the environment variable `DD_LOGS_ENABLED=true`. This allows `serverless-init` to read logs from stdout and stderr.

Datadog also recommends the following environment variables:

   - `ENV PYTHONUNBUFFERED=1`: Ensure Python outputs appear immediately in container logs instead of being buffered.
   - `ENV DD_LOGS_INJECTION=true`: Enable log/trace correlation for supported loggers.
   - `ENV DD_SOURCE=python`: Enable advanced Datadog log parsing.
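
One way to apply these recommendations is to bake them into your Dockerfile:

```dockerfile
# Forward stdout/stderr logs and correlate them with traces
ENV DD_LOGS_ENABLED=true
ENV PYTHONUNBUFFERED=1
ENV DD_LOGS_INJECTION=true
ENV DD_SOURCE=python
```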

If you want multiline logs to be preserved in a single log message, Datadog recommends writing your logs in JSON format. For example, you can use a third-party logging library such as `structlog`:

```python
import sys

import structlog
from ddtrace import tracer


def tracer_injection(logger, log_method, event_dict):
    # Attach the active trace and span IDs to every log entry
    event_dict.update(tracer.get_log_correlation_context())
    return event_dict


structlog.configure(
    processors=[
        tracer_injection,
        structlog.processors.EventRenamer("msg"),
        structlog.processors.JSONRenderer(),
    ],
    logger_factory=structlog.WriteLoggerFactory(file=sys.stdout),
)

logger = structlog.get_logger()

logger.info("Hello world!")
```

For more information, see [Correlating Python Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/python/).
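
If you prefer the standard library to a third-party logger such as `structlog`, a minimal JSON formatter sketch (the class name and field names below are illustrative) keeps each record, including multiline tracebacks, on a single line:

```python
import json
import logging
import sys


class JsonFormatter(logging.Formatter):
    """Render each log record as a single-line JSON object so that
    multiline messages (such as tracebacks) stay in one log entry."""

    def format(self, record):
        entry = {
            "msg": record.getMessage(),
            "level": record.levelname,
            "logger": record.name,
        }
        if record.exc_info:
            entry["error.stack"] = self.formatException(record.exc_info)
        return json.dumps(entry)


handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JsonFormatter())
logging.basicConfig(level=logging.INFO, handlers=[handler])

logging.getLogger("app").info("Hello world!")
```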

1. **Configure your application**.

After the container is built and pushed to your registry, set the required environment variables for the Datadog Agent:

   - `DD_API_KEY`: Your [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys), used to send data to your Datadog account. For privacy and safety, store this API key as a secret rather than as a plain-text environment variable.
   - `DD_SITE`: Your [Datadog site](https://docs.datadoghq.com/getting_started/site/). For example, `datadoghq.com`.

For more environment variables, see [Environment variables](#environment-variables) on this page.
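
As a sketch, these variables can be set at deploy time with the Azure CLI; the app and resource group names below are placeholders, and storing the API key as a Container Apps secret keeps it out of plain-text configuration:

```shell
# Store the API key as a Container Apps secret
az containerapp secret set --name <APP_NAME> --resource-group <RESOURCE_GROUP> \
  --secrets dd-api-key=<DATADOG_API_KEY>

# Reference the secret and set the Datadog site
az containerapp update --name <APP_NAME> --resource-group <RESOURCE_GROUP> \
  --set-env-vars DD_API_KEY=secretref:dd-api-key DD_SITE=datadoghq.com
```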

1. **Send custom metrics**.

To send custom metrics, [install the DogStatsD client](https://docs.datadoghq.com/extend/dogstatsd/?tab=python#install-the-dogstatsd-client) and [view code examples](https://docs.datadoghq.com/metrics/custom_metrics/dogstatsd_metrics_submission/?tab=python#code-examples-5). In serverless, only the *distribution* metric type is supported.
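
Under the hood, DogStatsD clients send plain-text datagrams over UDP to the Agent, which `serverless-init` exposes on `localhost:8125` by default. The official client is the `datadog` package; the following dependency-free sketch of the distribution wire format (the function names are illustrative) shows what a submission looks like:

```python
import socket


def format_distribution(name, value, tags=None):
    """Build a DogStatsD datagram for a distribution metric:
    <name>:<value>|d|#<tag1>,<tag2>"""
    datagram = f"{name}:{value}|d"
    if tags:
        datagram += "|#" + ",".join(tags)
    return datagram


def send_distribution(name, value, tags=None, host="127.0.0.1", port=8125):
    # UDP is fire-and-forget: sending does not block the request path
    payload = format_distribution(name, value, tags).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))


send_distribution("checkout.cart_value", 19.99, tags=["env:prod"])
```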

1. **Enable profiling (preview)**.

To enable the [Continuous Profiler](https://docs.datadoghq.com/profiler/), set the environment variable `DD_PROFILING_ENABLED=true`.

{% alert level="info" %}
Datadog's Continuous Profiler is available in preview for Azure Container Apps.
{% /alert %}

### Environment variables{% #environment-variables %}

| Variable                   | Description                                                                                                                                                                                                                                                        |
| -------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `DD_API_KEY`               | [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys) - **Required**                                                                                                                                                                         |
| `DD_SITE`                  | [Datadog site](https://docs.datadoghq.com/getting_started/site/) - **Required**                                                                                                                                                                                    |
| `DD_SERVICE`               | Datadog Service name. **Required**                                                                                                                                                                                                                                 |
| `DD_AZURE_SUBSCRIPTION_ID` | Azure Subscription ID. **Required**                                                                                                                                                                                                                                |
| `DD_AZURE_RESOURCE_GROUP`  | Azure Resource Group name. **Required**                                                                                                                                                                                                                            |
| `DD_LOGS_ENABLED`          | When true, send logs (stdout and stderr) to Datadog. Defaults to false.                                                                                                                                                                                            |
| `DD_LOGS_INJECTION`        | When true, enrich all logs with trace data for supported loggers. See [Correlate Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) for more information.                                                               |
| `DD_VERSION`               | See [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/).                                                                                                                                                        |
| `DD_ENV`                   | See [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/).                                                                                                                                                        |
| `DD_SOURCE`                | Set the log source to enable a [Log Pipeline](https://docs.datadoghq.com/logs/log_configuration/pipelines) for advanced parsing. To automatically apply language-specific parsing rules, set to `python`, or use your custom pipeline. Defaults to `containerapp`. |
| `DD_TAGS`                  | Add custom tags to your logs, metrics, and traces. Tags should be comma separated in key/value format (for example: `key1:value1,key2:value2`).                                                                                                                    |

**Do not set** the following environment variables in your serverless environment. They should only be set in non-serverless environments.

- `DD_AGENT_HOST`
- `DD_TRACE_AGENT_URL`

## Troubleshooting{% #troubleshooting %}

This integration depends on your runtime having a full SSL implementation. If you are using a slim image, you may need to add the following command to your Dockerfile to include certificates:

```dockerfile
RUN apt-get update && apt-get install -y ca-certificates
```

To have your Azure Container Apps appear in the [software catalog](https://docs.datadoghq.com/internal_developer_portal/software_catalog/), you must set the `DD_SERVICE`, `DD_VERSION`, and `DD_ENV` environment variables.

## Further reading{% #further-reading %}

- [Tracing Python Applications](https://docs.datadoghq.com/tracing/trace_collection/automatic_instrumentation/dd_libraries/python/)
- [Correlating Python Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/python/)
