---
title: Instrumenting a Java Cloud Run Job
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: >-
  Docs > Serverless > Google Cloud Run > Instrumenting Cloud Run Jobs >
  Instrumenting a Java Cloud Run Job
---

# Instrumenting a Java Cloud Run Job

## Setup{% #setup %}

{% alert level="info" %}
A sample application is [available on GitHub](https://github.com/DataDog/serverless-gcp-sample-apps/tree/main/cloud-run/in-container/java).
{% /alert %}

{% alert level="info" %}
For full visibility and access to all Datadog features in Cloud Run Jobs, ensure you've [installed the Google Cloud integration](https://docs.datadoghq.com/integrations/google_cloud_platform/) and are using [serverless-init version 1.9.0 or later](https://hub.docker.com/r/datadog/serverless-init).
{% /alert %}

1. **Install the Datadog Java tracer**.

   1. Add the Datadog Java tracer to your Dockerfile:

      ```dockerfile
      ADD 'https://dtdg.co/latest-java-tracer' agent.jar
      ENV JAVA_TOOL_OPTIONS="-javaagent:agent.jar"
      ```

   1. Add the tracer artifacts.

      {% tab title="Maven" %}

      ```xml
      <dependency>
        <groupId>com.datadoghq</groupId>
        <artifactId>dd-trace-api</artifactId>
        <version>DD_TRACE_JAVA_VERSION_HERE</version>
      </dependency>
      ```

      {% /tab %}

      {% tab title="Gradle" %}

      ```groovy
      implementation 'com.datadoghq:dd-trace-api:DD_TRACE_JAVA_VERSION_HERE'
      ```

      {% /tab %}



See [dd-trace-java releases](https://github.com/DataDog/dd-trace-java/releases) for the latest tracer version.

**Note**: Cloud Run Jobs run to completion rather than serving requests, so auto instrumentation won't create a top-level "job" span. For end-to-end visibility, create your own root span. See the [Java Custom Instrumentation](https://docs.datadoghq.com/tracing/trace_collection/custom_instrumentation/java/dd-api#adding-spans) instructions.
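As a minimal sketch, the `@Trace` annotation from `dd-trace-api` can wrap the job's work in a span; because the job handles no incoming request, this span becomes the root of the trace. The class, operation, and resource names below are illustrative:

```java
import datadog.trace.api.Trace;

public class App {
    // With the Java agent attached (via JAVA_TOOL_OPTIONS), @Trace creates a
    // span around this method. Since no request span exists in a job, this
    // becomes the top-level span for the execution.
    @Trace(operationName = "cloud-run-job", resourceName = "App.runJob")
    public static void runJob() {
        // job logic here
    }

    public static void main(String[] args) {
        runJob();
    }
}
```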

For more information, see [Tracing Java Applications](https://docs.datadoghq.com/tracing/trace_collection/automatic_instrumentation/dd_libraries/java/).

1. **Install serverless-init**.

   {% alert level="info" %}
   Serverless-init automatically creates a span for the duration of a task, even if the tracer is not installed. You can disable this by setting `DD_APM_ENABLED=false`. However, tracing is **recommended** because it is required for task-level visibility.
   {% /alert %}

   {% alert level="info" %}
   Cloud Run Jobs requires `serverless-init` version 1.9.0 or later.
   {% /alert %}

   Datadog publishes new releases of the `serverless-init` container image to Google's gcr.io, AWS's ECR, and Docker Hub:
| hub.docker.com          | gcr.io                           | public.ecr.aws                         |
| ----------------------- | -------------------------------- | -------------------------------------- |
| datadog/serverless-init | gcr.io/datadoghq/serverless-init | public.ecr.aws/datadog/serverless-init |

Images are tagged based on semantic versioning, with each new version receiving three relevant tags:

   - `1`, `1-alpine`: use these to track the latest minor releases, without breaking changes
   - `1.x.x`, `1.x.x-alpine`: use these to pin to a precise version of the library
   - `latest`, `latest-alpine`: use these to follow the latest version release, which may include breaking changes

Add the following instructions and arguments to your Dockerfile.

   ```dockerfile
   COPY --from=datadog/serverless-init:<YOUR_TAG> /datadog-init /app/datadog-init
   ENTRYPOINT ["/app/datadog-init"]
   CMD ["./mvnw", "spring-boot:run"]
   ```

   {% collapsible-section %}
   Alternative configuration:

   Datadog expects `serverless-init` to be the top-level application, with the rest of your app's command line passed in for `serverless-init` to execute.

If you already have an entrypoint defined inside your Dockerfile, you can instead modify the CMD argument.

   ```dockerfile
   CMD ["/app/datadog-init", "./mvnw", "spring-boot:run"]
   ```

If you require your entrypoint to be instrumented as well, you can instead swap your entrypoint and CMD arguments.

   ```dockerfile
   ENTRYPOINT ["/app/datadog-init"]
   CMD ["/your_entrypoint.sh", "./mvnw", "spring-boot:run"]
   ```

As long as your command to run is passed as an argument to `datadog-init`, you will receive full instrumentation.
   {% /collapsible-section %}
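Putting the two Dockerfile changes together, a minimal image for a Maven-based Spring Boot job might look like the following sketch. The base image, working directory, and serverless-init tag are illustrative; adjust them for your project:

```dockerfile
FROM eclipse-temurin:17-jdk

WORKDIR /app
COPY . .

# Datadog Java tracer, loaded automatically via JAVA_TOOL_OPTIONS
ADD 'https://dtdg.co/latest-java-tracer' /app/agent.jar
ENV JAVA_TOOL_OPTIONS="-javaagent:/app/agent.jar"

# serverless-init wraps the job command as the top-level process
COPY --from=datadog/serverless-init:1 /datadog-init /app/datadog-init
ENTRYPOINT ["/app/datadog-init"]
CMD ["./mvnw", "spring-boot:run"]
```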

1. **Set up logs**.

To enable logging, set the environment variable `DD_LOGS_ENABLED=true`. This allows `serverless-init` to read logs from stdout and stderr.

Datadog also recommends setting the environment variables `DD_LOGS_INJECTION=true` and `DD_SOURCE=java` to enable advanced Datadog log parsing.

If you want multiline logs to be preserved in a single log message, Datadog recommends writing your logs in *compact* JSON format. For example, you can use a third-party logging library such as `Log4j 2`:

   ```java
   import org.apache.logging.log4j.LogManager;
   import org.apache.logging.log4j.Logger;

   private static final Logger logger = LogManager.getLogger(App.class);
   logger.info("Hello World!");
   ```

In the `resources/log4j2.xml` file:

   ```xml
   <Configuration>
     <Appenders>
       <Console name="Console"><JsonLayout compact="true" eventEol="true" properties="true"/></Console>
     </Appenders>
     <Loggers><Root level="info"><AppenderRef ref="Console"/></Root></Loggers>
   </Configuration>
   ```

For more information, see [Correlating Java Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/java/).

1. **Configure your application**.

After the container is built and pushed to your registry, set the required environment variables for the Datadog Agent:

   - `DD_API_KEY`: Your [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys), used to send data to your Datadog account. For privacy and safety, configure this API key as a Google Cloud Secret.
   - `DD_SITE`: Your [Datadog site](https://docs.datadoghq.com/getting_started/site/). For example, `datadoghq.com`.

For more environment variables, see the [Environment variables](#environment-variables) section on this page.

The following command deploys the job:

   ```shell
   gcloud run jobs deploy <JOB_NAME> \
     --image=gcr.io/<YOUR_PROJECT>/<APP_NAME> \
     --set-env-vars=DD_API_KEY=$DD_API_KEY \
     --set-env-vars=DD_SITE=$DD_SITE
   ```
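To avoid passing the API key as a plain environment variable, you can store it in Secret Manager and reference it with the `--set-secrets` flag instead, then trigger an execution. The secret name below is illustrative:

```shell
# Reference a Secret Manager secret instead of a plaintext env var
gcloud run jobs deploy <JOB_NAME> \
  --image=gcr.io/<YOUR_PROJECT>/<APP_NAME> \
  --set-secrets=DD_API_KEY=<DD_API_KEY_SECRET_NAME>:latest \
  --set-env-vars=DD_SITE=$DD_SITE

# Run the job
gcloud run jobs execute <JOB_NAME>
```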

**Add a service label in Google Cloud**. In your Cloud Run service's info panel, add a label with the following key and value:

| Key       | Value                                                                                          |
| --------- | ---------------------------------------------------------------------------------------------- |
| `service` | The name of your service. Matches the value provided as the `DD_SERVICE` environment variable. |

See [Configure labels for services](https://cloud.google.com/run/docs/configuring/services/labels) in the Cloud Run documentation for instructions.

**Set up a retention filter for Cloud Run Jobs traces**. Datadog relies on traces to display executions and tasks in the UI. To ensure traces are retained, create a [retention filter](https://docs.datadoghq.com/tracing/trace_pipeline/trace_retention/#create-your-own-retention-filter) with the query `@origin:cloudrunjobs` and set the span retention rate to 100%.



**Send custom metrics**.

To send custom metrics, [install the DogStatsD client](https://docs.datadoghq.com/extend/dogstatsd/?tab=java#install-the-dogstatsd-client) and [view code examples](https://docs.datadoghq.com/metrics/custom_metrics/dogstatsd_metrics_submission/?tab=java#code-examples-5). In serverless, only the *distribution* metric type is supported.
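For example, here is a minimal sketch using the `com.datadoghq:java-dogstatsd-client` library; the metric prefix, metric name, and tag are illustrative:

```java
import com.timgroup.statsd.NonBlockingStatsDClientBuilder;
import com.timgroup.statsd.StatsDClient;

public class Metrics {
    public static void main(String[] args) {
        // serverless-init receives DogStatsD traffic on localhost:8125
        StatsDClient statsd = new NonBlockingStatsDClientBuilder()
                .prefix("myjob")
                .hostname("127.0.0.1")
                .port(8125)
                .build();

        // In serverless, only the distribution metric type is supported
        statsd.recordDistributionValue("records.processed", 42, "env:prod");
        statsd.close();
    }
}
```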

### Environment variables{% #environment-variables %}

| Variable            | Description                                                                                                                                                                                                                                                  |
| ------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `DD_API_KEY`        | [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys) - **Required**                                                                                                                                                                   |
| `DD_SITE`           | [Datadog site](https://docs.datadoghq.com/getting_started/site/) - **Required**                                                                                                                                                                              |
| `DD_SERVICE`        | Datadog service name - **Required**                                                                                                                                                                                                                          |
| `DD_LOGS_ENABLED`   | When true, send logs (stdout and stderr) to Datadog. Defaults to false.                                                                                                                                                                                      |
| `DD_LOGS_INJECTION` | When true, enrich all logs with trace data for supported loggers. See [Correlate Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) for more information.                                                         |
| `DD_VERSION`        | See [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/).                                                                                                                                                  |
| `DD_ENV`            | See [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging/).                                                                                                                                                  |
| `DD_SOURCE`         | Set the log source to enable a [Log Pipeline](https://docs.datadoghq.com/logs/log_configuration/pipelines) for advanced parsing. To automatically apply language-specific parsing rules, set to `java`, or use your custom pipeline. Defaults to `cloudrun`. |
| `DD_TAGS`           | Add custom tags to your logs, metrics, and traces. Tags should be comma separated in key/value format (for example: `key1:value1,key2:value2`).                                                                                                              |
| `JAVA_TOOL_OPTIONS` | **Required** for tracing. The path to the Datadog Java agent. For example, `-javaagent:/path/to/dd-java-agent.jar`.                                                                                                                                          |

**Do not set** the following environment variables in your serverless environment. They should only be set in non-serverless environments.

- `DD_AGENT_HOST`
- `DD_TRACE_AGENT_URL`

## Troubleshooting{% #troubleshooting %}

This integration depends on your runtime having a full SSL implementation. If you are using a slim image, you may need to add the following command to your Dockerfile to include certificates:

```dockerfile
RUN apt-get update && apt-get install -y ca-certificates
```

To have your Cloud Run services appear in the [Software Catalog](https://docs.datadoghq.com/software_catalog/), you must set the `DD_SERVICE`, `DD_VERSION`, and `DD_ENV` environment variables.
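For example, the unified service tagging variables can be added to an existing job without redeploying the image; the values below are illustrative:

```shell
gcloud run jobs update <JOB_NAME> \
  --update-env-vars=DD_SERVICE=my-job,DD_VERSION=1.0.0,DD_ENV=prod
```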

## Further reading{% #further-reading %}

- [Tracing Java Applications](https://docs.datadoghq.com/tracing/trace_collection/automatic_instrumentation/dd_libraries/java/)
- [Correlating Java Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/java/)
