Bedrock System-Defined Inference Profile

A system-defined inference profile is predefined by Amazon Bedrock for running inference with a foundation model. These profiles can route inference requests across AWS Regions, letting users deploy and run model inference with predictable throughput and resilience without manually tuning infrastructure. They simplify selecting the right environment for a workload and keep model deployments consistent.
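As a concrete starting point, the sketch below pages through the inference profiles visible to an account. It is written against the shape of the Bedrock `ListInferenceProfiles` API (`typeEquals`, `inferenceProfileSummaries`, `nextToken`); the `client` argument is expected to behave like boto3's `bedrock` client, e.g. `boto3.client("bedrock", region_name="us-east-1")`. Treat the method and field names as assumptions to verify against the current API reference.

```python
# Hedged sketch: collect all inference profile summaries of one type.
# `client` should quack like boto3's "bedrock" client:
#     import boto3
#     client = boto3.client("bedrock", region_name="us-east-1")
# Method/field names below follow the published Bedrock API
# (ListInferenceProfiles) and are assumptions, not guarantees.

def list_profiles(client, profile_type="SYSTEM_DEFINED"):
    """Return every inference profile summary of the given type,
    following nextToken pagination until the service stops returning one."""
    profiles, token = [], None
    while True:
        kwargs = {"typeEquals": profile_type}
        if token:
            kwargs["nextToken"] = token
        resp = client.list_inference_profiles(**kwargs)
        profiles.extend(resp.get("inferenceProfileSummaries", []))
        token = resp.get("nextToken")
        if not token:
            return profiles
```

Passing `profile_type="APPLICATION"` would list user-created profiles instead; the pagination logic is identical.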

`aws.bedrock_system_defined_inference_profile`

Fields

| ID | Type | Data Type | Description |
|---|---|---|---|
| _key | core | string | |
| account_id | core | string | |
| created_at | core | timestamp | The time at which the inference profile was created. |
| description | core | string | The description of the inference profile. |
| inference_profile_arn | core | string | The Amazon Resource Name (ARN) of the inference profile. |
| inference_profile_id | core | string | The unique identifier of the inference profile. |
| inference_profile_name | core | string | The name of the inference profile. |
| models | core | json | A list of information about each model in the inference profile. |
| status | core | string | The status of the inference profile. `ACTIVE` means the inference profile is ready to be used. |
| tags | core | hstore | |
| type | core | string | The type of the inference profile. `SYSTEM_DEFINED` – the inference profile is defined by Amazon Bedrock; you can route inference requests across Regions with these profiles. `APPLICATION` – the inference profile was created by a user; it can track metrics and costs when invoking the model in it, and may route requests to one or multiple Regions. |
| updated_at | core | timestamp | The time at which the inference profile was last updated. |
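The `models` field above is a JSON list with one entry per model in the profile. A minimal sketch of working with it, assuming each entry carries a `modelArn` string (as in the Bedrock API) in the standard ARN layout `arn:partition:service:region:account:resource`, is to pull out the Regions a profile references, which is useful for seeing where a cross-region system-defined profile can route requests:

```python
# Sketch: list the distinct Regions referenced by a profile's `models` field.
# Assumption: each entry looks like
#   {"modelArn": "arn:aws:bedrock:us-west-2::foundation-model/..."}
# where the Region is the fourth colon-separated ARN component.

def model_regions(models):
    """Return the distinct Regions found in the models' ARNs,
    in first-seen order; entries without a parseable ARN are skipped."""
    regions = []
    for entry in models:
        parts = entry.get("modelArn", "").split(":")
        if len(parts) > 3 and parts[3] and parts[3] not in regions:
            regions.append(parts[3])
    return regions
```

For a single-Region application profile this yields one Region; for a system-defined cross-region profile it yields each Region the profile can route to.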