---
title: Getting Started with Datadog
description: Datadog, the leading service for cloud-scale monitoring.
breadcrumbs: Docs > Infrastructure > Datadog Resource Catalog
---

# aws_bedrock_model_invocation_job{% #aws_bedrock_model_invocation_job %}

## `account_id`{% #account_id %}

**Type**: `STRING`

## `client_request_token`{% #client_request_token %}

**Type**: `STRING`

**Provider name**: `clientRequestToken`

**Description**: A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see [Ensuring idempotency](https://docs.aws.amazon.com/AWSEC2/latest/APIReference/Run_Instance_Idempotency.html).
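As a minimal sketch of how an idempotency token is typically used (every name, ARN, model ID, and bucket path below is a placeholder), a fresh UUID makes a convenient `clientRequestToken`: reuse the same parameter dict when retrying, so the token stays identical and Amazon Bedrock treats the retry as the same request.

```python
import uuid

def build_invocation_job_params(job_name, model_id, role_arn, s3_in, s3_out):
    """Assemble request parameters for a batch inference job.

    A fresh UUID is a safe idempotency token: retrying with the same
    token is ignored by Amazon Bedrock rather than creating a
    duplicate job. All values passed in are illustrative placeholders.
    """
    return {
        "jobName": job_name,
        "modelId": model_id,
        "roleArn": role_arn,
        "clientRequestToken": str(uuid.uuid4()),
        "inputDataConfig": {"s3InputDataConfig": {"s3Uri": s3_in}},
        "outputDataConfig": {"s3OutputDataConfig": {"s3Uri": s3_out}},
    }

params = build_invocation_job_params(
    "nightly-batch",
    "anthropic.claude-3-haiku-20240307-v1:0",
    "arn:aws:iam::123456789012:role/BedrockBatchRole",
    "s3://my-bucket/input/records.jsonl",
    "s3://my-bucket/output/",
)
# On retry, pass the SAME params object so the token is unchanged:
# bedrock.create_model_invocation_job(**params)
```

The commented-out call shows where a `bedrock` boto3 client would consume these parameters; building the dict separately keeps the token stable across retries.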

## `end_time`{% #end_time %}

**Type**: `TIMESTAMP`

**Provider name**: `endTime`

**Description**: The time at which the batch inference job ended.

## `input_data_config`{% #input_data_config %}

**Type**: `STRUCT`

**Provider name**: `inputDataConfig`

**Description**: Details about the location of the input to the batch inference job.

- `s3_input_data_config`
  **Type**: `STRUCT`
  **Provider name**: `s3InputDataConfig`
  **Description**: Contains the configuration of the S3 location of the input data.
  - `s3_bucket_owner`
    **Type**: `STRING`
    **Provider name**: `s3BucketOwner`
    **Description**: The ID of the Amazon Web Services account that owns the S3 bucket containing the input data.
  - `s3_input_format`
    **Type**: `STRING`
    **Provider name**: `s3InputFormat`
    **Description**: The format of the input data.
  - `s3_uri`
    **Type**: `STRING`
    **Provider name**: `s3Uri`
    **Description**: The S3 location of the input data.
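The nested structure above maps onto a plain dictionary in a batch inference request. A minimal sketch, with placeholder bucket name and account ID (`"JSONL"` reflects the input format named in the AWS guides; verify the accepted values against the current API reference):

```python
# Shape of inputDataConfig as it appears in a Bedrock batch inference
# request or response. Bucket name and account ID are placeholders.
input_data_config = {
    "s3InputDataConfig": {
        "s3Uri": "s3://my-input-bucket/batch/records.jsonl",
        "s3InputFormat": "JSONL",         # input format (assumed value)
        "s3BucketOwner": "123456789012",  # owner of a cross-account bucket
    }
}
```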

## `job_arn`{% #job_arn %}

**Type**: `STRING`

**Provider name**: `jobArn`

**Description**: The Amazon Resource Name (ARN) of the batch inference job.

## `job_expiration_time`{% #job_expiration_time %}

**Type**: `TIMESTAMP`

**Provider name**: `jobExpirationTime`

**Description**: The time at which the batch inference job times out or timed out.

## `job_name`{% #job_name %}

**Type**: `STRING`

**Provider name**: `jobName`

**Description**: The name of the batch inference job.

## `last_modified_time`{% #last_modified_time %}

**Type**: `TIMESTAMP`

**Provider name**: `lastModifiedTime`

**Description**: The time at which the batch inference job was last modified.

## `message`{% #message %}

**Type**: `STRING`

**Provider name**: `message`

**Description**: If the batch inference job failed, this field contains a message describing why the job failed.

## `model_id`{% #model_id %}

**Type**: `STRING`

**Provider name**: `modelId`

**Description**: The unique identifier of the foundation model used for model inference.

## `output_data_config`{% #output_data_config %}

**Type**: `STRUCT`

**Provider name**: `outputDataConfig`

**Description**: Details about the location of the output of the batch inference job.

- `s3_output_data_config`
  **Type**: `STRUCT`
  **Provider name**: `s3OutputDataConfig`
  **Description**: Contains the configuration of the S3 location of the output data.
  - `s3_bucket_owner`
    **Type**: `STRING`
    **Provider name**: `s3BucketOwner`
    **Description**: The ID of the Amazon Web Services account that owns the S3 bucket containing the output data.
  - `s3_encryption_key_id`
    **Type**: `STRING`
    **Provider name**: `s3EncryptionKeyId`
    **Description**: The unique identifier of the key that encrypts the S3 location of the output data.
  - `s3_uri`
    **Type**: `STRING`
    **Provider name**: `s3Uri`
    **Description**: The S3 location of the output data.
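The output side has the same dictionary shape; this sketch uses placeholder bucket, account, and KMS key values (the encryption key is optional and shown only to illustrate the field):

```python
# Shape of outputDataConfig for a Bedrock batch inference job.
# All identifiers below are placeholders.
output_data_config = {
    "s3OutputDataConfig": {
        "s3Uri": "s3://my-output-bucket/batch/out/",
        "s3EncryptionKeyId": (
            "arn:aws:kms:us-east-1:123456789012:key/example-key-id"
        ),
        "s3BucketOwner": "123456789012",
    }
}
```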

## `role_arn`{% #role_arn %}

**Type**: `STRING`

**Provider name**: `roleArn`

**Description**: The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at [Create a service role for batch inference](https://docs.aws.amazon.com/bedrock/latest/userguide/batch-iam-sr.html).
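A service role of this kind needs a trust policy that lets the Bedrock service assume it. The following is a hedged sketch of such a policy (the account ID is a placeholder, and the `Condition` block is a common scoping pattern; confirm the exact keys against the linked guide):

```python
import json

ACCOUNT_ID = "123456789012"  # placeholder account ID

# Trust policy allowing Amazon Bedrock to assume the service role.
# The Condition scopes assumption to your own account, a common
# confused-deputy mitigation; verify details in the AWS guide above.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "bedrock.amazonaws.com"},
            "Action": "sts:AssumeRole",
            "Condition": {"StringEquals": {"aws:SourceAccount": ACCOUNT_ID}},
        }
    ],
}
print(json.dumps(trust_policy, indent=2))
```

The printed JSON can be passed as `AssumeRolePolicyDocument` when creating the role with IAM.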

## `status`{% #status %}

**Type**: `STRING`

**Provider name**: `status`

**Description**: The status of the batch inference job. The following statuses are possible:

- Submitted – This job has been submitted to a queue for validation.
- Validating – This job is being validated for the requirements described in [Format and upload your batch inference data](https://docs.aws.amazon.com/bedrock/latest/userguide/batch-inference-data.html). The criteria include the following:
  - Your IAM service role has access to the Amazon S3 buckets containing your files.
  - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the `modelInput` value matches the request body for the model.
  - Your files fulfill the requirements for file size and number of records. For more information, see [Quotas for Amazon Bedrock](https://docs.aws.amazon.com/bedrock/latest/userguide/quotas.html).
- Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
- Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
- InProgress – This job has begun. You can start viewing the results in the output S3 location.
- Completed – This job has successfully completed. View the output files in the output S3 location.
- PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
- Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the [Amazon Web Services Support Center](https://console.aws.amazon.com/support/home/).
- Stopped – This job was stopped by a user.
- Stopping – This job is being stopped by a user.
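When polling a job, it helps to distinguish the statuses above that are terminal from those where the job is still moving. A small helper, derived directly from the list above (the grouping is an assumption based on the status descriptions, not a documented API field):

```python
# Statuses after which the job makes no further progress, per the
# descriptions above; the remaining statuses mean work is ongoing.
TERMINAL_STATUSES = {"Completed", "PartiallyCompleted", "Failed", "Stopped", "Expired"}

def is_terminal(status: str) -> bool:
    """True when polling can stop for this batch inference job."""
    return status in TERMINAL_STATUSES

def describe(job: dict) -> str:
    """One-line summary; surfaces the failure message only on Failed."""
    if job["status"] == "Failed":
        return f"Failed: {job.get('message', 'no message provided')}"
    return job["status"]
```

A poll loop would call `is_terminal` on each response from the service and log `describe(job)` when it exits.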

## `submit_time`{% #submit_time %}

**Type**: `TIMESTAMP`

**Provider name**: `submitTime`

**Description**: The time at which the batch inference job was submitted.

## `tags`{% #tags %}

**Type**: `UNORDERED_LIST_STRING`

## `timeout_duration_in_hours`{% #timeout_duration_in_hours %}

**Type**: `INT32`

**Provider name**: `timeoutDurationInHours`

**Description**: The number of hours after which the batch inference job was set to time out.

## `vpc_config`{% #vpc_config %}

**Type**: `STRUCT`

**Provider name**: `vpcConfig`

**Description**: The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see [Protect batch inference jobs using a VPC](https://docs.aws.amazon.com/bedrock/latest/userguide/batch-vpc).

- `security_group_ids`
  **Type**: `UNORDERED_LIST_STRING`
  **Provider name**: `securityGroupIds`
  **Description**: An array of IDs for each security group in the VPC to use.
- `subnet_ids`
  **Type**: `UNORDERED_LIST_STRING`
  **Provider name**: `subnetIds`
  **Description**: An array of IDs for each subnet in the VPC to use.
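As a sketch, the VPC configuration is a dictionary with the two list fields described above; every ID here is a placeholder:

```python
# Shape of vpcConfig for a batch inference job. Subnet and security
# group IDs are placeholders for resources in your own VPC.
vpc_config = {
    "subnetIds": ["subnet-0abc1234", "subnet-0def5678"],
    "securityGroupIds": ["sg-0123456789abcdef0"],
}
```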
