We recommend new projects start with resources from the AWS provider.
AWS Cloud Control v1.26.0 published on Wednesday, Mar 12, 2025 by Pulumi
aws-native.bedrock.getApplicationInferenceProfile
Definition of AWS::Bedrock::ApplicationInferenceProfile Resource Type
Using getApplicationInferenceProfile
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
TypeScript:

function getApplicationInferenceProfile(args: GetApplicationInferenceProfileArgs, opts?: InvokeOptions): Promise<GetApplicationInferenceProfileResult>
function getApplicationInferenceProfileOutput(args: GetApplicationInferenceProfileOutputArgs, opts?: InvokeOptions): Output<GetApplicationInferenceProfileResult>

Python:

def get_application_inference_profile(inference_profile_identifier: Optional[str] = None,
                                      opts: Optional[InvokeOptions] = None) -> GetApplicationInferenceProfileResult
def get_application_inference_profile_output(inference_profile_identifier: Optional[pulumi.Input[str]] = None,
                                             opts: Optional[InvokeOptions] = None) -> Output[GetApplicationInferenceProfileResult]

Go:

func LookupApplicationInferenceProfile(ctx *Context, args *LookupApplicationInferenceProfileArgs, opts ...InvokeOption) (*LookupApplicationInferenceProfileResult, error)
func LookupApplicationInferenceProfileOutput(ctx *Context, args *LookupApplicationInferenceProfileOutputArgs, opts ...InvokeOption) LookupApplicationInferenceProfileResultOutput

> Note: This function is named LookupApplicationInferenceProfile in the Go SDK.
C#:

public static class GetApplicationInferenceProfile
{
    public static Task<GetApplicationInferenceProfileResult> InvokeAsync(GetApplicationInferenceProfileArgs args, InvokeOptions? opts = null)
    public static Output<GetApplicationInferenceProfileResult> Invoke(GetApplicationInferenceProfileInvokeArgs args, InvokeOptions? opts = null)
}

Java:

public static CompletableFuture<GetApplicationInferenceProfileResult> getApplicationInferenceProfile(GetApplicationInferenceProfileArgs args, InvokeOptions options)
public static Output<GetApplicationInferenceProfileResult> getApplicationInferenceProfile(GetApplicationInferenceProfileArgs args, InvokeOptions options)
YAML:

fn::invoke:
  function: aws-native:bedrock:getApplicationInferenceProfile
  arguments:
    # arguments dictionary

The following arguments are supported:
C#:
- InferenceProfileIdentifier string
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.

Go:
- InferenceProfileIdentifier string
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.

Java:
- inferenceProfileIdentifier String
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.

JavaScript/TypeScript:
- inferenceProfileIdentifier string
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.

Python:
- inference_profile_identifier str
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.

YAML:
- inferenceProfileIdentifier String
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
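Because the identifier argument accepts either a system-defined profile ID or a full ARN, it can help to see what each form looks like. A minimal sketch (the helper name and the example ARN layout are illustrative, not part of the SDK):

```typescript
// Distinguish the two identifier forms that inferenceProfileIdentifier accepts:
// a full inference profile ARN, or a plain system-defined profile ID.
function identifierForm(identifier: string): "arn" | "id" {
    // ARNs always begin with the "arn:" prefix, e.g.
    // arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/<id>
    return identifier.startsWith("arn:") ? "arn" : "id";
}

console.log(identifierForm("abc123xyz"));                                 // id
console.log(identifierForm(
    "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/abc123xyz",
));                                                                       // arn
```

Either form can be passed straight through as the `inferenceProfileIdentifier` argument; no conversion is needed.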
getApplicationInferenceProfile Result
The following output properties are available:
C#:
- CreatedAt string
- Timestamp.
- InferenceProfileArn string
- The Amazon Resource Name (ARN) of the inference profile.
- InferenceProfileId string
- The unique identifier of the inference profile.
- InferenceProfileIdentifier string
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- Models List<Pulumi.AwsNative.Bedrock.Outputs.ApplicationInferenceProfileInferenceProfileModel>
- List of model configurations.
- Status Pulumi.AwsNative.Bedrock.ApplicationInferenceProfileInferenceProfileStatus
- The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- Tags List<Pulumi.AwsNative.Outputs.Tag>
- List of tags.
- Type Pulumi.AwsNative.Bedrock.ApplicationInferenceProfileInferenceProfileType
- The type of the inference profile. The following types are possible:
  - SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
  - APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
- UpdatedAt string
- Timestamp.
Go:
- CreatedAt string
- Timestamp.
- InferenceProfileArn string
- The Amazon Resource Name (ARN) of the inference profile.
- InferenceProfileId string
- The unique identifier of the inference profile.
- InferenceProfileIdentifier string
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- Models []ApplicationInferenceProfileInferenceProfileModel
- List of model configurations.
- Status ApplicationInferenceProfileInferenceProfileStatus
- The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- Tags []Tag
- List of tags.
- Type ApplicationInferenceProfileInferenceProfileType
- The type of the inference profile. The following types are possible:
  - SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
  - APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
- UpdatedAt string
- Timestamp.
Java:
- createdAt String
- Timestamp.
- inferenceProfileArn String
- The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId String
- The unique identifier of the inference profile.
- inferenceProfileIdentifier String
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models List<ApplicationInferenceProfileInferenceProfileModel>
- List of model configurations.
- status ApplicationInferenceProfileInferenceProfileStatus
- The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- tags List<Tag>
- List of tags.
- type ApplicationInferenceProfileInferenceProfileType
- The type of the inference profile. The following types are possible:
  - SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
  - APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
- updatedAt String
- Timestamp.
JavaScript/TypeScript:
- createdAt string
- Timestamp.
- inferenceProfileArn string
- The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId string
- The unique identifier of the inference profile.
- inferenceProfileIdentifier string
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models ApplicationInferenceProfileInferenceProfileModel[]
- List of model configurations.
- status ApplicationInferenceProfileInferenceProfileStatus
- The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- tags Tag[]
- List of tags.
- type ApplicationInferenceProfileInferenceProfileType
- The type of the inference profile. The following types are possible:
  - SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
  - APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
- updatedAt string
- Timestamp.
Python:
- created_at str
- Timestamp.
- inference_profile_arn str
- The Amazon Resource Name (ARN) of the inference profile.
- inference_profile_id str
- The unique identifier of the inference profile.
- inference_profile_identifier str
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models Sequence[ApplicationInferenceProfileInferenceProfileModel]
- List of model configurations.
- status ApplicationInferenceProfileInferenceProfileStatus
- The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- tags Sequence[root_Tag]
- List of tags.
- type ApplicationInferenceProfileInferenceProfileType
- The type of the inference profile. The following types are possible:
  - SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
  - APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
- updated_at str
- Timestamp.
YAML:
- createdAt String
- Timestamp.
- inferenceProfileArn String
- The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId String
- The unique identifier of the inference profile.
- inferenceProfileIdentifier String
- Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models List<Property Map>
- List of model configurations.
- status "ACTIVE"
- The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- tags List<Property Map>
- List of tags.
- type "APPLICATION" | "SYSTEM_DEFINED"
- The type of the inference profile. The following types are possible:
  - SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.
  - APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it. The inference profile may route requests to one or multiple regions.
- updatedAt String
- Timestamp.
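The result shape above lends itself to a small consumer. The sketch below mirrors the documented fields in local interfaces (an assumption about the TypeScript SDK's property names; the real types live in @pulumi/aws-native) and extracts model ARNs only when the profile's status is ACTIVE:

```typescript
// Local mirror of the documented result fields (sketch only).
interface InferenceProfileModel {
    modelArn?: string;
}
interface ApplicationInferenceProfileResult {
    status?: "ACTIVE";
    type?: "APPLICATION" | "SYSTEM_DEFINED";
    models?: InferenceProfileModel[];
}

// Return the model ARNs of a profile, but only once it is ready to be used.
function readyModelArns(result: ApplicationInferenceProfileResult): string[] {
    if (result.status !== "ACTIVE") {
        return []; // profile not ready; nothing to route to yet
    }
    return (result.models ?? [])
        .map(m => m.modelArn)
        .filter((arn): arn is string => arn !== undefined);
}
```

A SYSTEM_DEFINED profile flows through the same helper unchanged; only the status gate matters here.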
Supporting Types
ApplicationInferenceProfileInferenceProfileModel

C#:
- ModelArn string
- ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.

Go:
- ModelArn string
- ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.

Java:
- modelArn String
- ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.

JavaScript/TypeScript:
- modelArn string
- ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.

Python:
- model_arn str
- ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.

YAML:
- modelArn String
- ARN for Foundation Models in Bedrock. These models can be used as base models for model customization jobs.
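Since each model entry carries only a modelArn, pulling the model ID back out of the ARN is a one-liner. A sketch, assuming the standard foundation-model ARN layout (arn:aws:bedrock:&lt;region&gt;::foundation-model/&lt;model-id&gt;):

```typescript
// Extract the trailing model ID from a Bedrock foundation-model ARN.
// Returns undefined for ARNs that are not foundation-model ARNs.
function foundationModelId(modelArn: string): string | undefined {
    const marker = ":foundation-model/";
    const at = modelArn.indexOf(marker);
    return at === -1 ? undefined : modelArn.slice(at + marker.length);
}

console.log(foundationModelId(
    "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0",
)); // anthropic.claude-3-sonnet-20240229-v1:0
```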
ApplicationInferenceProfileInferenceProfileStatus     
ApplicationInferenceProfileInferenceProfileType     
Tag
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0