account_id
Type: STRING
client_request_token
Type: STRING
Provider name: clientRequestToken
Description: A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
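A minimal sketch of how this token is typically supplied when a batch inference job is created, assuming the boto3 `bedrock` control-plane client; the job name, model ID, role ARN, and S3 URIs below are hypothetical:

```python
import uuid

import boto3

bedrock = boto3.client("bedrock")  # Bedrock control-plane client

# Reusing the same token on a retry makes the create call idempotent:
# if Bedrock already accepted a request with this token, it ignores the
# duplicate instead of starting a second job.
token = str(uuid.uuid4())

response = bedrock.create_model_invocation_job(
    jobName="nightly-batch-scoring",                                  # hypothetical
    clientRequestToken=token,
    modelId="anthropic.claude-3-haiku-20240307-v1:0",                 # hypothetical
    roleArn="arn:aws:iam::123456789012:role/BedrockBatchRole",        # hypothetical
    inputDataConfig={"s3InputDataConfig": {"s3Uri": "s3://example-input/records.jsonl"}},
    outputDataConfig={"s3OutputDataConfig": {"s3Uri": "s3://example-output/"}},
)
print(response["jobArn"])
```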
end_time
Type: TIMESTAMP
Provider name: endTime
Description: The time at which the batch inference job ended.
input_data_config
Type: STRUCT
Provider name: inputDataConfig
Description: Details about the location of the input to the batch inference job.
s3_input_data_config
Type: STRUCT
Provider name: s3InputDataConfig
s3_bucket_owner
Type: STRING
Provider name: s3BucketOwner
s3_input_format
Type: STRING
Provider name: s3InputFormat
s3_uri
Type: STRING
Provider name: s3Uri
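These nested fields mirror the provider-side `s3InputDataConfig` object. A sketch of that shape, with placeholder values (bucket, key, and account ID are hypothetical):

```python
# inputDataConfig as passed to the provider API; all values are placeholders.
input_data_config = {
    "s3InputDataConfig": {
        "s3Uri": "s3://example-input/records.jsonl",  # location of the input records
        "s3InputFormat": "JSONL",                     # format of the input records
        "s3BucketOwner": "123456789012",              # account that owns the input bucket
    }
}
```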
job_arn
Type: STRING
Provider name: jobArn
Description: The Amazon Resource Name (ARN) of the batch inference job.
job_expiration_time
Type: TIMESTAMP
Provider name: jobExpirationTime
Description: The time at which the batch inference job timed out, or is set to time out.
job_name
Type: STRING
Provider name: jobName
Description: The name of the batch inference job.
last_modified_time
Type: TIMESTAMP
Provider name: lastModifiedTime
Description: The time at which the batch inference job was last modified.
message
Type: STRING
Provider name: message
Description: If the batch inference job failed, this field contains a message describing why the job failed.
model_id
Type: STRING
Provider name: modelId
Description: The unique identifier of the foundation model used for model inference.
output_data_config
Type: STRUCT
Provider name: outputDataConfig
Description: Details about the location of the output of the batch inference job.
s3_output_data_config
Type: STRUCT
Provider name: s3OutputDataConfig
s3_bucket_owner
Type: STRING
Provider name: s3BucketOwner
s3_encryption_key_id
Type: STRING
Provider name: s3EncryptionKeyId
s3_uri
Type: STRING
Provider name: s3Uri
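Likewise, these nested fields mirror the provider-side `s3OutputDataConfig` object. A sketch with placeholder values (bucket, KMS key, and account ID are hypothetical):

```python
# outputDataConfig as passed to the provider API; all values are placeholders.
output_data_config = {
    "s3OutputDataConfig": {
        "s3Uri": "s3://example-output/batch-results/",  # destination prefix for results
        "s3EncryptionKeyId": "arn:aws:kms:us-east-1:123456789012:key/11111111-2222-3333-4444-555555555555",
        "s3BucketOwner": "123456789012",                # account that owns the output bucket
    }
}
```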
role_arn
Type: STRING
Provider name: roleArn
Description: The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at Create a service role for batch inference.
status
Type: STRING
Provider name: status
Description: The status of the batch inference job.
submit_time
Type: TIMESTAMP
Provider name: submitTime
Description: The time at which the batch inference job was submitted.
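The status and the timestamp fields above mirror what the provider returns when the job is described. A minimal polling sketch, assuming the boto3 `bedrock` client, a hypothetical job ARN, and the provider's documented terminal status names:

```python
import time

import boto3

bedrock = boto3.client("bedrock")
job_arn = "arn:aws:bedrock:us-east-1:123456789012:model-invocation-job/example"  # hypothetical

# Poll until the job reaches a terminal status (assumed terminal names).
while True:
    job = bedrock.get_model_invocation_job(jobIdentifier=job_arn)
    print(job["status"], job.get("submitTime"), job.get("message"))
    if job["status"] in ("Completed", "PartiallyCompleted", "Failed", "Stopped", "Expired"):
        break
    time.sleep(60)
```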
tags
Type: UNORDERED_LIST_STRING
timeout_duration_in_hours
Type: INT32
Provider name: timeoutDurationInHours
Description: The number of hours after which the batch inference job was set to time out.
vpc_config
Type: STRUCT
Provider name: vpcConfig
Description: The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see Protect batch inference jobs using a VPC.
security_group_ids
Type: UNORDERED_LIST_STRING
Provider name: securityGroupIds
subnet_ids
Type: UNORDERED_LIST_STRING
Provider name: subnetIds
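These fields map to the provider-side `vpcConfig` object supplied when the job is created. A sketch with hypothetical security group and subnet IDs:

```python
# vpcConfig as passed to the provider API; IDs are placeholders.
vpc_config = {
    "securityGroupIds": ["sg-0123456789abcdef0"],
    "subnetIds": ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
}

# Supplied alongside the other job settings, for example:
# bedrock.create_model_invocation_job(..., vpcConfig=vpc_config, timeoutDurationInHours=72)
```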