---
title: Instrumenting a Node.js Cloud Run Container In-Container
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: >-
  Docs > Serverless > Google Cloud Run > Choosing an Instrumentation Method for
  Containers > In-Container Instrumentation > Instrumenting a Node.js Cloud Run
  Container In-Container
---

# Instrumenting a Node.js Cloud Run Container In-Container

## Setup{% #setup %}

{% alert level="info" %}
A sample application is [available on GitHub](https://github.com/DataDog/serverless-gcp-sample-apps/tree/main/cloud-run/in-container/node).
{% /alert %}

1. **Install the Datadog Node.js tracer**.

   1. In your main application, install the `dd-trace` package.

      ```shell
      npm install dd-trace
      ```

   1. Initialize the Node.js tracer with the `NODE_OPTIONS` environment variable:

      ```dockerfile
      ENV NODE_OPTIONS="--require dd-trace/init"
      ```

For more information, see [Tracing Node.js applications](https://docs.datadoghq.com/tracing/trace_collection/automatic_instrumentation/dd_libraries/nodejs/).

1. **Install serverless-init**.

Datadog publishes new releases of the `serverless-init` container image to Google's gcr.io, AWS ECR, and Docker Hub:

| hub.docker.com          | gcr.io                           | public.ecr.aws                         |
| ----------------------- | -------------------------------- | -------------------------------------- |
| datadog/serverless-init | gcr.io/datadoghq/serverless-init | public.ecr.aws/datadog/serverless-init |

Images are tagged based on semantic versioning, with each new version receiving three relevant tags:

   - `1`, `1-alpine`: use these to track the latest minor releases, without breaking changes
   - `1.x.x`, `1.x.x-alpine`: use these to pin to a precise version of the library
   - `latest`, `latest-alpine`: use these to follow the latest version release, which may include breaking changes

Add the following instructions and arguments to your Dockerfile.

   ```dockerfile
   COPY --from=datadog/serverless-init:<YOUR_TAG> /datadog-init /app/datadog-init
   ENTRYPOINT ["/app/datadog-init"]
   CMD ["/nodejs/bin/node", "/path/to/your/app.js"]
   ```

   {% collapsible-section %}
   Alternative configurations:

Datadog expects `serverless-init` to be the top-level application, with the rest of your app's command line passed in for `serverless-init` to execute.

If you already have an entrypoint defined inside your Dockerfile, you can instead modify the CMD argument.

   ```dockerfile
   CMD ["/app/datadog-init", "/nodejs/bin/node", "/path/to/your/app.js"]
   ```

If you require your entrypoint to be instrumented as well, swap your ENTRYPOINT and CMD arguments instead.

   ```dockerfile
   ENTRYPOINT ["/app/datadog-init"]
   CMD ["/your_entrypoint.sh", "/nodejs/bin/node", "/path/to/your/app.js"]
   ```

As long as your command to run is passed as an argument to `datadog-init`, you will receive full instrumentation.
   {% /collapsible-section %}
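
Putting the steps so far together, a minimal Dockerfile might look like the following sketch. The base image, file names, and `serverless-init` tag are assumptions; adjust them to match your project.

   ```dockerfile
   # Hypothetical example; adjust the base image, paths, and tag for your project.
   FROM node:20-slim

   WORKDIR /app
   COPY package*.json ./
   RUN npm install
   COPY . .

   # Load the Datadog tracer in every Node.js process
   ENV NODE_OPTIONS="--require dd-trace/init"

   # Copy serverless-init and make it the top-level process
   COPY --from=datadog/serverless-init:1 /datadog-init /app/datadog-init
   ENTRYPOINT ["/app/datadog-init"]
   CMD ["node", "app.js"]
   ```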

1. **Set up logs**.

To enable logging, set the environment variable `DD_LOGS_ENABLED=true`. This allows `serverless-init` to read logs from stdout and stderr.

Datadog also recommends setting the environment variables `DD_LOGS_INJECTION=true` and `DD_SOURCE=nodejs` to enable advanced Datadog log parsing.

If you want multiline logs to be preserved in a single log message, Datadog recommends writing your logs in JSON format. For example, you can use a third-party logging library such as `winston`:

   ```javascript
   const { createLogger, format, transports } = require('winston');
   
   const logger = createLogger({
     level: 'info',
     exitOnError: false,
     format: format.json(),
     transports: [
       new transports.Console()
     ],
   });
   
   logger.info('Hello world!');
   ```

For more information, see [Correlating Node.js Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/nodejs/).
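
When `DD_LOGS_INJECTION=true` is set and `dd-trace` is loaded, the tracer adds a `dd` object (trace and span IDs, plus service, env, and version) to each log line from supported loggers such as `winston`. The sketch below hand-writes that shape for illustration only; the field values are made up, and in practice the tracer injects them for you:

   ```javascript
   // Illustrative only: with DD_LOGS_INJECTION=true, dd-trace injects a "dd"
   // object like this into each JSON log line; the values below are made up.
   const logLine = {
     level: 'info',
     message: 'order processed',
     dd: {
       trace_id: '1234567890123456789', // joins this log line to its APM trace
       span_id: '9876543210987654321',
       service: 'order-processor', // from DD_SERVICE
       env: 'prod',                // from DD_ENV
       version: '1.0.0'            // from DD_VERSION
     }
   };

   // Writing the line as single-line JSON on stdout is what lets
   // serverless-init pick it up and Datadog parse it.
   process.stdout.write(JSON.stringify(logLine) + '\n');
   ```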

1. **Configure your application**.

After the container is built and pushed to your registry, set the required environment variables for the Datadog Agent:

   - `DD_API_KEY`: Your [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys), used to send data to your Datadog account. For privacy and safety, configure this API key as a Google Cloud Secret.
   - `DD_SITE`: Your [Datadog site](https://docs.datadoghq.com/getting_started/site/). For example, `datadoghq.com`.

For more environment variables, see the Environment variables section on this page.

The following command deploys the service and allows any external connection to reach it. In this example, the service listens on port 8080. Ensure that this port number matches the port exposed inside your Dockerfile.

   ```shell
   gcloud run deploy <APP_NAME> \
     --image=gcr.io/<YOUR_PROJECT>/<APP_NAME> \
     --port=8080 \
     --allow-unauthenticated \
     --update-env-vars=DD_API_KEY=$DD_API_KEY \
     --update-env-vars=DD_SITE=$DD_SITE
   ```

**Add a service label in Google Cloud**. In your Cloud Run service's info panel, add a label with the following key and value:

| Key       | Value                                                                                          |
| --------- | ---------------------------------------------------------------------------------------------- |
| `service` | The name of your service. Matches the value provided as the `DD_SERVICE` environment variable. |

See [Configure labels for services](https://cloud.google.com/run/docs/configuring/services/labels) in the Cloud Run documentation for instructions.

**Send custom metrics**.

To send custom metrics, [view code examples](https://docs.datadoghq.com/metrics/custom_metrics/dogstatsd_metrics_submission/?tab=nodejs#code-examples-5). In serverless, only the *distribution* metric type is supported.

**Enable profiling (preview)**.

To enable the [Continuous Profiler](https://docs.datadoghq.com/profiler/), set the environment variable `DD_PROFILING_ENABLED=true`.

{% alert level="info" %}
Datadog's Continuous Profiler is available in preview for Google Cloud Run services.
{% /alert %}

### Environment variables{% #environment-variables %}

| Variable            | Description                                                                                                                                                                                                                                                    |
| ------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `DD_API_KEY`        | [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys) - **Required**                                                                                                                                                                     |
| `DD_SITE`           | [Datadog site](https://docs.datadoghq.com/getting_started/site/) - **Required**                                                                                                                                                                                |
| `DD_SERVICE`        | Datadog Service name. **Required**                                                                                                                                                                                                                             |
| `DD_LOGS_ENABLED`   | When true, send logs (stdout and stderr) to Datadog. Defaults to false.                                                                                                                                                                                        |
| `DD_LOGS_INJECTION` | When true, enrich all logs with trace data for supported loggers. See [Correlate Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) for more information.                                                           |
| `DD_VERSION`        | See [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/).                                                                                                                                                    |
| `DD_ENV`            | See [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/).                                                                                                                                                    |
| `DD_SOURCE`         | Set the log source to enable a [Log Pipeline](https://docs.datadoghq.com/logs/log_configuration/pipelines) for advanced parsing. To automatically apply language-specific parsing rules, set to `nodejs`, or use your custom pipeline. Defaults to `cloudrun`. |
| `DD_TAGS`           | Add custom tags to your logs, metrics, and traces. Tags should be comma separated in key/value format (for example: `key1:value1,key2:value2`).                                                                                                                |

**Do not set** the following environment variables in your serverless environment. They should only be set in non-serverless environments.

- `DD_AGENT_HOST`
- `DD_TRACE_AGENT_URL`

## Distributed tracing with Pub/Sub{% #distributed-tracing-with-pubsub %}

To get end-to-end distributed traces between Pub/Sub producers and Cloud Run services, configure your push subscriptions with the `--push-no-wrapper` and `--push-no-wrapper-write-metadata` flags. This moves message attributes from the JSON body to HTTP headers, allowing Datadog to extract producer trace context and create proper span links.

For more information, see [Producer-aware tracing for Google Cloud Pub/Sub and Cloud Run](https://www.datadoghq.com/blog/pubsub-cloud-run-tracing/) and [Payload unwrapping](https://cloud.google.com/pubsub/docs/payload-unwrapping) in the Google Cloud documentation.
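
To make the effect concrete, here is a hand-rolled handler sketch using only the Node.js standard library. With payload unwrapping enabled, the raw message data arrives as the HTTP body, and message attributes (including any trace context the producer's tracer injected, such as an `x-datadog-trace-id` attribute) arrive as HTTP headers. The names here are illustrative; with `dd-trace` loaded, context extraction happens automatically and you do not need to read these headers yourself:

```javascript
// Sketch of an unwrapped Pub/Sub push handler (stdlib only, names illustrative).
function handlePubSubPush(req, res) {
  let body = '';
  req.on('data', (chunk) => { body += chunk; });
  req.on('end', () => {
    // With --push-no-wrapper, the body is the raw message data (not JSON-wrapped);
    // with --push-no-wrapper-write-metadata, message attributes become headers.
    const traceId = req.headers['x-datadog-trace-id']; // injected by the producer's tracer
    console.log(JSON.stringify({ data: body, traceId }));
    res.statusCode = 204; // acknowledge the message
    res.end();
  });
}

module.exports = handlePubSubPush; // e.g. http.createServer(handlePubSubPush).listen(8080)
```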

### Configure push subscriptions for full trace visibility{% #configure-push-subscriptions-for-full-trace-visibility %}

**Create a new push subscription:**

```shell
gcloud pubsub subscriptions create order-processor-sub \
  --topic=orders \
  --push-endpoint=https://order-processor-xyz.run.app/pubsub \
  --push-no-wrapper \
  --push-no-wrapper-write-metadata
```

**Update an existing push subscription:**

```shell
gcloud pubsub subscriptions update order-processor-sub \
  --push-no-wrapper \
  --push-no-wrapper-write-metadata
```

### Configure Eventarc Pub/Sub triggers{% #configure-eventarc-pubsub-triggers %}

Eventarc Pub/Sub triggers use push subscriptions as the underlying delivery mechanism. When you create an Eventarc trigger, GCP automatically creates a managed push subscription. However, Eventarc does not expose `--push-no-wrapper-write-metadata` as a trigger creation parameter, so you must manually update the auto-created subscription.

1. **Create the Eventarc trigger:**

   ```shell
   gcloud eventarc triggers create order-processor-trigger \
     --destination-run-service=order-processor \
     --destination-run-region=us-central1 \
     --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
     --event-filters="topic=projects/my-project/topics/orders" \
     --location=us-central1
   ```

1. **Find the auto-created subscription:**

   ```shell
   gcloud pubsub subscriptions list \
     --filter="topic:projects/my-project/topics/orders" \
     --format="table(name,pushConfig.pushEndpoint)"
   ```

Example output:

   ```
   NAME                                                          PUSH_ENDPOINT
   eventarc-us-central1-order-processor-trigger-abc-sub-def      https://order-processor-xyz.run.app
   ```

1. **Update the subscription for trace propagation:**

   ```shell
   gcloud pubsub subscriptions update \
     eventarc-us-central1-order-processor-trigger-abc-sub-def \
     --push-no-wrapper \
     --push-no-wrapper-write-metadata
   ```

## Troubleshooting{% #troubleshooting %}

This integration depends on your runtime having a full SSL implementation. If you are using a slim image, you may need to add the following command to your Dockerfile to include certificates:

```dockerfile
RUN apt-get update && apt-get install -y ca-certificates
```

To have your Cloud Run services appear in the [Software Catalog](https://docs.datadoghq.com/software_catalog/), you must set the `DD_SERVICE`, `DD_VERSION`, and `DD_ENV` environment variables.

## Further reading{% #further-reading %}

- [Tracing Node.js Applications](https://docs.datadoghq.com/tracing/trace_collection/automatic_instrumentation/dd_libraries/nodejs/)
- [Correlating Node.js Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/nodejs/)
