---
title: Instrumenting a Node.js Cloud Run Function
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: >-
  Docs > Serverless > Google Cloud Run > Instrumenting Cloud Run Functions >
  Instrumenting a Node.js Cloud Run Function
---

# Instrumenting a Node.js Cloud Run Function

{% alert level="info" %}
A sample application is [available on GitHub](https://github.com/DataDog/serverless-gcp-sample-apps/tree/main/cloud-run-functions/node).
{% /alert %}

## Setup{% #setup %}

1. **Install the Datadog Node.js tracer**.

   1. In your main application, install the `dd-trace` package.

      ```shell
      npm install dd-trace
      ```

   1. Initialize the Node.js tracer with the `NODE_OPTIONS` environment variable:

      ```dockerfile
      ENV NODE_OPTIONS="--require dd-trace/init"
      ```

For more information, see [Tracing Node.js applications](https://docs.datadoghq.com/tracing/trace_collection/automatic_instrumentation/dd_libraries/nodejs/).
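Putting both sub-steps together, a minimal Dockerfile might look like the following. The base image, file names, and entry point are assumptions; adjust them for your application:

```dockerfile
# Hypothetical example; adjust the base image and entry point for your app.
FROM node:20-slim
WORKDIR /app
COPY package*.json ./
# Installs dd-trace along with your other dependencies
RUN npm install
COPY . .
# Loads the tracer before any application code runs
ENV NODE_OPTIONS="--require dd-trace/init"
CMD ["node", "index.js"]
```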

1. **Install serverless-init as a sidecar**.

   {% tab title="Datadog CLI" %}
   **Setup**: Install the Datadog CLI.

   ```shell
   npm install -g @datadog/datadog-ci @datadog/datadog-ci-plugin-cloud-run
   ```

Install the [gcloud CLI](https://cloud.google.com/sdk/docs/install) and authenticate with `gcloud auth login`.

   **Configuration**: Configure the [Datadog site](https://docs.datadoghq.com/getting_started/site/) and Datadog API key, and define the service name to use in Datadog.

   ```shell
   export DATADOG_SITE="<DATADOG_SITE>"
   export DD_API_KEY="<DD_API_KEY>"
   export DD_SERVICE="<SERVICE_NAME>"
   ```

   **Instrument**: If you are new to Datadog serverless monitoring, launch the Datadog CLI in interactive mode for a guided first installation.

   ```shell
   datadog-ci cloud-run instrument -i
   ```

To set up the Datadog sidecar for your applications, run the `instrument` command *after* your normal deployment. You can specify multiple services to instrument by passing multiple `--service` flags.

   ```shell
   datadog-ci cloud-run instrument --project <GCP-PROJECT-ID> --service <CLOUD-RUN-SERVICE-NAME> --region <GCP-REGION>
   ```
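For example, to instrument two services in one run, repeat the `--service` flag. The project, region, and service names below are hypothetical:

```shell
datadog-ci cloud-run instrument \
  --project my-gcp-project \
  --region us-central1 \
  --service checkout-service \
  --service payment-service
```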

You can pin to a specific image with the `--sidecar-image` flag. See the [latest releases on Docker Hub](https://hub.docker.com/r/datadog/serverless-init).

Additional parameters can be found in the [CLI documentation](https://github.com/DataDog/datadog-ci/tree/master/packages/plugin-cloud-run#arguments).
   {% /tab %}

   {% tab title="Terraform" %}
The [Datadog Terraform module for Google Cloud Run](https://github.com/DataDog/terraform-google-cloud-run-datadog) wraps the [`google_cloud_run_v2_service`](https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/cloud_run_v2_service) resource and automatically configures your Cloud Run app for Datadog Serverless Monitoring by adding required environment variables and the serverless-init sidecar.

If you don't already have Terraform set up, [install Terraform](https://developer.hashicorp.com/terraform/install), create a new directory, and make a file called `main.tf`.

Then, add the following to your Terraform configuration, updating it as necessary based on your needs:

   ```tf
   variable "datadog_api_key" {
     description = "Your Datadog API key"
     type        = string
     sensitive   = true
   }
   
   module "my-cloud-run-app" {
     source  = "DataDog/cloud-run-datadog/google"
     version = "~> 1.0"
   
     project  = "my-gcp-project"
     name     = "my-cloud-run-app"
     location = "us-central1"
   
     datadog_api_key = var.datadog_api_key
     datadog_service = "test-service" // your application service
     datadog_version = "0.0.0" // your code version
     datadog_env     = "prod" // your application environment
     
     datadog_enable_logging = true
   
     deletion_protection = false
     build_config = {
       function_target          = "helloHttp" // your function entry point
       image_uri                = "us-docker.pkg.dev/cloudrun/container/hello"
       base_image               = "us-central1-docker.pkg.dev/serverless-runtimes/google-22-full/runtimes/your-runtime" // base image for your runtime
       enable_automatic_updates = true
     }
     template = {
       containers = [
         {
           name  = "main"
           image = "us-docker.pkg.dev/cloudrun/container/hello"
           base_image_uri = "us-central1-docker.pkg.dev/serverless-runtimes/google-22-full/runtimes/your-runtime" // base image for your runtime
           resources = {
             limits = {
               cpu    = "1"
               memory = "512Mi"
             }
           }
           ports = {
             container_port = 8080
           }
           env = [
             { name = "DD_TRACE_ENABLED", value = "true" },
           ]
         },
       ]
     }
   }
   ```

See [Environment variables](#environment-variables) for more information on the configuration options available through the `env` block.

Ensure the container port for the main container is the same as the one exposed in your Dockerfile/service.

If you haven't already, initialize your Terraform project:

   ```shell
   terraform init
   ```

To deploy your app, run:

   ```shell
   terraform apply
   ```

   {% /tab %}

   {% tab title="Other" %}
After deploying your Cloud Run app, you can manually modify your app's settings to enable Datadog monitoring.

   1. Create a **Volume** with `In-Memory` volume type.

   1. Add a **new container** with image URL: `gcr.io/datadoghq/serverless-init:<YOUR_TAG>`. See the [latest releases on Docker Hub](https://hub.docker.com/r/datadog/serverless-init) to pin a specific version.

   1. Add the volume mount to every container in your application. Choose a path such as `/shared-volume`, and remember it for the next step.

   1. Add the following environment variables to your `serverless-init` sidecar container:

      - `DD_SERVICE`: A name for your service. For example, `gcr-sidecar-test`.
      - `DD_ENV`: A name for your environment. For example, `dev`.
      - `DD_SERVERLESS_LOG_PATH`: Your log path. For example, `/shared-volume/logs/*.log`. The path must begin with the mount path you defined in the previous step.
      - `DD_API_KEY`: Your [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys).
      - `FUNCTION_TARGET`: The entry point of your function. For example, `Main`.

For a list of all environment variables, including additional tags, see [Environment variables](#environment-variables).
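The manual steps above can also be expressed declaratively. A sketch of the equivalent Cloud Run service YAML, assuming the v1 (`serving.knative.dev/v1`) schema; the service name, images, tag, and values are placeholders:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: my-service
spec:
  template:
    spec:
      volumes:
        - name: shared-volume
          emptyDir:
            medium: Memory   # In-Memory volume type
      containers:
        - name: main
          image: us-docker.pkg.dev/my-project/my-repo/my-app
          ports:
            - containerPort: 8080
          volumeMounts:
            - name: shared-volume
              mountPath: /shared-volume
        - name: serverless-init
          image: gcr.io/datadoghq/serverless-init:<YOUR_TAG>
          env:
            - name: DD_SERVICE
              value: gcr-sidecar-test
            - name: DD_ENV
              value: dev
            - name: DD_SERVERLESS_LOG_PATH
              value: /shared-volume/logs/*.log
            - name: DD_API_KEY
              value: <DD_API_KEY>
            - name: FUNCTION_TARGET
              value: Main
          volumeMounts:
            - name: shared-volume
              mountPath: /shared-volume
```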

   {% /tab %}

1. **Set up logs**.

In the previous step, you created a shared volume. You may have also set the `DD_SERVERLESS_LOG_PATH` environment variable, which defaults to `/shared-volume/logs/app.log`.

In this step, configure your logging library to write logs to the file set in `DD_SERVERLESS_LOG_PATH`. In Node.js, Datadog recommends writing logs in JSON format. For example, you can use a third-party logging library such as `winston`:

   ```javascript
   const { createLogger, format, transports } = require('winston');
   
   const LOG_FILE = "/shared-volume/logs/app.log"
   
   const logger = createLogger({
     level: 'info',
     exitOnError: false,
     format: format.json(),
     transports: [
       new transports.File({ filename: LOG_FILE }),
       new transports.Console()
     ],
   });
   
   logger.info('Hello world!');
   ```



Datadog recommends setting the environment variables `DD_LOGS_INJECTION=true` (in your main container) and `DD_SOURCE=nodejs` (in your sidecar container) to enable advanced Datadog log parsing.

For more information, see [Correlating Node.js Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/nodejs/).

1. **Add a service label in Google Cloud**. In your Cloud Run service's info panel, add a label with the following key and value:

| Key       | Value                                                                                          |
| --------- | ---------------------------------------------------------------------------------------------- |
| `service` | The name of your service. Matches the value provided as the `DD_SERVICE` environment variable. |

See [Configure labels for services](https://cloud.google.com/run/docs/configuring/services/labels) in the Cloud Run documentation for instructions.

1. **Send custom metrics**.

To send custom metrics, [view code examples](https://docs.datadoghq.com/metrics/custom_metrics/dogstatsd_metrics_submission/?tab=nodejs#code-examples-5). In Serverless Monitoring, only the *distribution* metric type is supported.

1. **Enable profiling (preview)**.

To enable the [Continuous Profiler](https://docs.datadoghq.com/profiler/), set the environment variable `DD_PROFILING_ENABLED=true` in your application container.

{% alert level="info" %}
Datadog's Continuous Profiler is available in preview for 2nd gen Cloud Run functions.
{% /alert %}

### Environment variables{% #environment-variables %}

| Variable                 | Description                                                                                                                                                                                                                                                    | Container             |
| ------------------------ | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | --------------------- |
| `DD_API_KEY`             | [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys) - **Required**                                                                                                                                                                     | Sidecar container     |
| `DD_SITE`                | [Datadog site](https://docs.datadoghq.com/getting_started/site/) - **Required**                                                                                                                                                                                | Sidecar container     |
| `DD_SERVICE`             | Datadog Service name. **Required**                                                                                                                                                                                                                             | Both containers       |
| `DD_SERVERLESS_LOG_PATH` | The path where the sidecar should tail logs from. Recommended to set to `/shared-volume/logs/app.log`.                                                                                                                                                         | Sidecar container     |
| `DD_LOGS_INJECTION`      | When true, enrich all logs with trace data for supported loggers. See [Correlate Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) for more information.                                                           | Application container |
| `DD_VERSION`             | See [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/).                                                                                                                                                    | Both containers       |
| `DD_ENV`                 | See [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/).                                                                                                                                                    | Both containers       |
| `DD_SOURCE`              | Set the log source to enable a [Log Pipeline](https://docs.datadoghq.com/logs/log_configuration/pipelines) for advanced parsing. To automatically apply language-specific parsing rules, set to `nodejs`, or use your custom pipeline. Defaults to `cloudrun`. | Sidecar container     |
| `DD_TAGS`                | Add custom tags to your logs, metrics, and traces. Tags should be comma separated in key/value format (for example: `key1:value1,key2:value2`).                                                                                                                | Sidecar container     |
| `FUNCTION_TARGET`        | Required for correct tagging. The entry point of your function. For example, `Main`. You can also find `FUNCTION_TARGET` on the source tab inside Google console: `Function entry point`.                                                                      | Sidecar container     |

**Do not set** the following environment variables in your serverless environment. They should only be set in non-serverless environments.

- `DD_AGENT_HOST`
- `DD_TRACE_AGENT_URL`

## Distributed tracing with Pub/Sub{% #distributed-tracing-with-pubsub %}

To get end-to-end distributed traces between Pub/Sub producers and Cloud Run functions, configure your push subscriptions with the `--push-no-wrapper` and `--push-no-wrapper-write-metadata` flags. This moves message attributes from the JSON body to HTTP headers, allowing Datadog to extract producer trace context and create proper span links.

For more information, see [Producer-aware tracing for Google Cloud Pub/Sub and Cloud Run](https://www.datadoghq.com/blog/pubsub-cloud-run-tracing/) and [Payload unwrapping](https://cloud.google.com/pubsub/docs/payload-unwrapping) in the Google Cloud documentation.

### Configure push subscriptions for full trace visibility{% #configure-push-subscriptions-for-full-trace-visibility %}

**Create a new push subscription:**

```shell
gcloud pubsub subscriptions create order-processor-sub \
  --topic=orders \
  --push-endpoint=https://order-processor-xyz.run.app/pubsub \
  --push-no-wrapper \
  --push-no-wrapper-write-metadata
```

**Update an existing push subscription:**

```shell
gcloud pubsub subscriptions update order-processor-sub \
  --push-no-wrapper \
  --push-no-wrapper-write-metadata
```

### Configure Eventarc Pub/Sub triggers{% #configure-eventarc-pubsub-triggers %}

Eventarc Pub/Sub triggers use push subscriptions as the underlying delivery mechanism. When you create an Eventarc trigger, GCP automatically creates a managed push subscription. However, Eventarc does not expose `--push-no-wrapper-write-metadata` as a trigger creation parameter, so you must manually update the auto-created subscription.

1. **Create the Eventarc trigger:**

   ```shell
   gcloud eventarc triggers create order-processor-trigger \
     --destination-run-service=order-processor \
     --destination-run-region=us-central1 \
     --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
     --event-filters="topic=projects/my-project/topics/orders" \
     --location=us-central1
   ```

1. **Find the auto-created subscription:**

   ```shell
   gcloud pubsub subscriptions list \
     --filter="topic:projects/my-project/topics/orders" \
     --format="table(name,pushConfig.pushEndpoint)"
   ```

Example output:

   ```
   NAME                                                          PUSH_ENDPOINT
   eventarc-us-central1-order-processor-trigger-abc-sub-def      https://order-processor-xyz.run.app
   ```

1. **Update the subscription for trace propagation:**

   ```shell
   gcloud pubsub subscriptions update \
     eventarc-us-central1-order-processor-trigger-abc-sub-def \
     --push-no-wrapper \
     --push-no-wrapper-write-metadata
   ```

## Troubleshooting{% #troubleshooting %}

This integration depends on your runtime having a full SSL implementation. If you are using a slim image, you may need to add the following command to your Dockerfile to include certificates:

```dockerfile
RUN apt-get update && apt-get install -y ca-certificates
```

To have your Cloud Run services appear in the [software catalog](https://docs.datadoghq.com/software_catalog/), you must set the `DD_SERVICE`, `DD_VERSION`, and `DD_ENV` environment variables.

## Further reading{% #further-reading %}

- [Tracing Node.js Applications](https://docs.datadoghq.com/tracing/trace_collection/automatic_instrumentation/dd_libraries/nodejs/)
- [Correlating Node.js Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/nodejs/)
