aws-native.sagemaker.getInferenceExperiment
We recommend new projects start with resources from the AWS provider.
Resource Type definition for AWS::SageMaker::InferenceExperiment
Using getInferenceExperiment
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getInferenceExperiment(args: GetInferenceExperimentArgs, opts?: InvokeOptions): Promise<GetInferenceExperimentResult>
function getInferenceExperimentOutput(args: GetInferenceExperimentOutputArgs, opts?: InvokeOptions): Output<GetInferenceExperimentResult>
def get_inference_experiment(name: Optional[str] = None,
                             opts: Optional[InvokeOptions] = None) -> GetInferenceExperimentResult
def get_inference_experiment_output(name: Optional[pulumi.Input[str]] = None,
                             opts: Optional[InvokeOptions] = None) -> Output[GetInferenceExperimentResult]
func LookupInferenceExperiment(ctx *Context, args *LookupInferenceExperimentArgs, opts ...InvokeOption) (*LookupInferenceExperimentResult, error)
func LookupInferenceExperimentOutput(ctx *Context, args *LookupInferenceExperimentOutputArgs, opts ...InvokeOption) LookupInferenceExperimentResultOutput
> Note: This function is named LookupInferenceExperiment in the Go SDK.
public static class GetInferenceExperiment 
{
    public static Task<GetInferenceExperimentResult> InvokeAsync(GetInferenceExperimentArgs args, InvokeOptions? opts = null)
    public static Output<GetInferenceExperimentResult> Invoke(GetInferenceExperimentInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetInferenceExperimentResult> getInferenceExperiment(GetInferenceExperimentArgs args, InvokeOptions options)
public static Output<GetInferenceExperimentResult> getInferenceExperiment(GetInferenceExperimentArgs args, InvokeOptions options)
fn::invoke:
  function: aws-native:sagemaker:getInferenceExperiment
  arguments:
    # arguments dictionary
The following arguments are supported:
- Name string
- The name for the inference experiment.
- Name string
- The name for the inference experiment.
- name String
- The name for the inference experiment.
- name string
- The name for the inference experiment.
- name str
- The name for the inference experiment.
- name String
- The name for the inference experiment.
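A minimal TypeScript sketch of both invocation forms; the experiment name "my-inference-experiment" is a placeholder, not a value from this page:
import * as aws_native from "@pulumi/aws-native";

// Output form: Input-wrapped arguments, Output-wrapped result with lifted properties.
const experiment = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-inference-experiment",
});
export const experimentArn = experiment.arn;

// Direct form: plain arguments, Promise-wrapped result.
const result: Promise<aws_native.sagemaker.GetInferenceExperimentResult> =
    aws_native.sagemaker.getInferenceExperiment({ name: "my-inference-experiment" });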
getInferenceExperiment Result
The following output properties are available:
- Arn string
- The Amazon Resource Name (ARN) of the inference experiment.
- CreationTime string
- The timestamp at which you created the inference experiment.
- DataStorageConfig Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- Description string
- The description of the inference experiment.
- DesiredState Pulumi.AwsNative.SageMaker.InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- EndpointMetadata Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentEndpointMetadata
- LastModifiedTime string
- The timestamp at which you last modified the inference experiment.
- ModelVariants List<Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentModelVariantConfig>
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- Schedule Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- ShadowModeConfig Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentShadowModeConfig
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- Status Pulumi.AwsNative.SageMaker.InferenceExperimentStatus
- The status of the inference experiment.
- StatusReason string
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- Tags List<Pulumi.AwsNative.Outputs.Tag>
- An array of key-value pairs to apply to this resource.
- Arn string
- The Amazon Resource Name (ARN) of the inference experiment.
- CreationTime string
- The timestamp at which you created the inference experiment.
- DataStorageConfig InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- Description string
- The description of the inference experiment.
- DesiredState InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- EndpointMetadata InferenceExperimentEndpointMetadata
- LastModifiedTime string
- The timestamp at which you last modified the inference experiment.
- ModelVariants []InferenceExperimentModelVariantConfig
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- Schedule InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- ShadowModeConfig InferenceExperimentShadowModeConfig
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- Status InferenceExperimentStatus
- The status of the inference experiment.
- StatusReason string
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- Tags []Tag
- An array of key-value pairs to apply to this resource.
- arn String
- The Amazon Resource Name (ARN) of the inference experiment.
- creationTime String
- The timestamp at which you created the inference experiment.
- dataStorageConfig InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- description String
- The description of the inference experiment.
- desiredState InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- endpointMetadata InferenceExperimentEndpointMetadata
- lastModifiedTime String
- The timestamp at which you last modified the inference experiment.
- modelVariants List<InferenceExperimentModelVariantConfig>
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- schedule InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadowModeConfig InferenceExperimentShadowModeConfig
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- status InferenceExperimentStatus
- The status of the inference experiment.
- statusReason String
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- tags List<Tag>
- An array of key-value pairs to apply to this resource.
- arn string
- The Amazon Resource Name (ARN) of the inference experiment.
- creationTime string
- The timestamp at which you created the inference experiment.
- dataStorageConfig InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- description string
- The description of the inference experiment.
- desiredState InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- endpointMetadata InferenceExperimentEndpointMetadata
- lastModifiedTime string
- The timestamp at which you last modified the inference experiment.
- modelVariants InferenceExperimentModelVariantConfig[]
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- schedule InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadowModeConfig InferenceExperimentShadowModeConfig
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- status InferenceExperimentStatus
- The status of the inference experiment.
- statusReason string
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- tags Tag[]
- An array of key-value pairs to apply to this resource.
- arn str
- The Amazon Resource Name (ARN) of the inference experiment.
- creation_time str
- The timestamp at which you created the inference experiment.
- data_storage_config InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- description str
- The description of the inference experiment.
- desired_state InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- endpoint_metadata InferenceExperimentEndpointMetadata
- last_modified_time str
- The timestamp at which you last modified the inference experiment.
- model_variants Sequence[InferenceExperimentModelVariantConfig]
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- schedule InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadow_mode_config InferenceExperimentShadowModeConfig
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- status InferenceExperimentStatus
- The status of the inference experiment.
- status_reason str
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- tags Sequence[root_Tag]
- An array of key-value pairs to apply to this resource.
- arn String
- The Amazon Resource Name (ARN) of the inference experiment.
- creationTime String
- The timestamp at which you created the inference experiment.
- dataStorageConfig Property Map
- The Amazon S3 location and configuration for storing inference request and response data.
- description String
- The description of the inference experiment.
- desiredState "Running" | "Completed" | "Cancelled"
- The desired state of the experiment after starting or stopping operation.
- endpointMetadata Property Map
- lastModifiedTime String
- The timestamp at which you last modified the inference experiment.
- modelVariants List<Property Map>
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- schedule Property Map
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadowModeConfig Property Map
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- status "Creating" | "Created" | "Updating" | "Starting" | "Stopping" | "Running" | "Completed" | "Cancelled"
- The status of the inference experiment.
- statusReason String
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- tags List<Property Map>
- An array of key-value pairs to apply to this resource.
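A short TypeScript sketch of reading result properties, including nested ones typed by the supporting types below (the experiment name is again a placeholder):
import * as aws_native from "@pulumi/aws-native";

const experiment = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-inference-experiment",
});

// Scalar outputs can be exported directly.
export const experimentStatus = experiment.status;

// Nested outputs are traversed with apply(); here, each variant's name and model.
export const variants = experiment.modelVariants.apply(vs =>
    (vs ?? []).map(v => ({ variant: v.variantName, model: v.modelName })));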
Supporting Types
InferenceExperimentCaptureContentTypeHeader     
- CsvContentTypes List<string>
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- JsonContentTypes List<string>
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- CsvContentTypes []string
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- JsonContentTypes []string
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- csvContentTypes List<String>
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- jsonContentTypes List<String>
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- csvContentTypes string[]
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- jsonContentTypes string[]
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- csv_content_types Sequence[str]
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- json_content_types Sequence[str]
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- csvContentTypes List<String>
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- jsonContentTypes List<String>
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
InferenceExperimentDataStorageConfig    
- Destination string
- The Amazon S3 bucket where the inference request and response data is stored.
- ContentType Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- KmsKey string
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- Destination string
- The Amazon S3 bucket where the inference request and response data is stored.
- ContentType InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- KmsKey string
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- destination String
- The Amazon S3 bucket where the inference request and response data is stored.
- contentType InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- kmsKey String
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- destination string
- The Amazon S3 bucket where the inference request and response data is stored.
- contentType InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- kmsKey string
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- destination str
- The Amazon S3 bucket where the inference request and response data is stored.
- content_type InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- kms_key str
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- destination String
- The Amazon S3 bucket where the inference request and response data is stored.
- contentType Property Map
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- kmsKey String
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
InferenceExperimentDesiredState   
InferenceExperimentEndpointMetadata   
- EndpointName string
- The name of the endpoint.
- EndpointConfigName string
- The name of the endpoint configuration.
- EndpointStatus Pulumi.AwsNative.SageMaker.InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For possible values of the status of an endpoint.
- EndpointName string
- The name of the endpoint.
- EndpointConfigName string
- The name of the endpoint configuration.
- EndpointStatus InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For possible values of the status of an endpoint.
- endpointName String
- The name of the endpoint.
- endpointConfigName String
- The name of the endpoint configuration.
- endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For possible values of the status of an endpoint.
- endpointName string
- The name of the endpoint.
- endpointConfigName string
- The name of the endpoint configuration.
- endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For possible values of the status of an endpoint.
- endpoint_name str
- The name of the endpoint.
- endpoint_config_name str
- The name of the endpoint configuration.
- endpoint_status InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For possible values of the status of an endpoint.
- endpointName String
- The name of the endpoint.
- endpointConfigName String
- The name of the endpoint configuration.
- endpointStatus "Creating" | "Updating" | "SystemUpdating" | "RollingBack" | "InService" | "OutOfService" | "Deleting" | "Failed"
- The status of the endpoint. For possible values of the status of an endpoint.
InferenceExperimentEndpointMetadataEndpointStatus     
InferenceExperimentModelInfrastructureConfig    
- InfrastructureType Pulumi.AwsNative.SageMaker.InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- RealTimeInferenceConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- InfrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- RealTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- infrastructure_type InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- real_time_inference_config InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- infrastructureType "RealTimeInference"
- The type of the inference experiment that you want to run.
- realTimeInferenceConfig Property Map
- The infrastructure configuration for deploying the model to real-time inference.
InferenceExperimentModelInfrastructureConfigInfrastructureType      
InferenceExperimentModelVariantConfig    
- InfrastructureConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- ModelName string
- The name of the Amazon SageMaker Model entity.
- VariantName string
- The name of the variant.
- InfrastructureConfig InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- ModelName string
- The name of the Amazon SageMaker Model entity.
- VariantName string
- The name of the variant.
- infrastructureConfig InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- modelName String
- The name of the Amazon SageMaker Model entity.
- variantName String
- The name of the variant.
- infrastructureConfig InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- modelName string
- The name of the Amazon SageMaker Model entity.
- variantName string
- The name of the variant.
- infrastructure_config InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- model_name str
- The name of the Amazon SageMaker Model entity.
- variant_name str
- The name of the variant.
- infrastructureConfig Property Map
- The configuration for the infrastructure that the model will be deployed to.
- modelName String
- The name of the Amazon SageMaker Model entity.
- variantName String
- The name of the variant.
InferenceExperimentRealTimeInferenceConfig     
- InstanceCount int
- The number of instances of the type specified by InstanceType.
- InstanceType string
- The instance type the model is deployed to.
- InstanceCount int
- The number of instances of the type specified by InstanceType.
- InstanceType string
- The instance type the model is deployed to.
- instanceCount Integer
- The number of instances of the type specified by InstanceType.
- instanceType String
- The instance type the model is deployed to.
- instanceCount number
- The number of instances of the type specified by InstanceType.
- instanceType string
- The instance type the model is deployed to.
- instance_count int
- The number of instances of the type specified by InstanceType.
- instance_type str
- The instance type the model is deployed to.
- instanceCount Number
- The number of instances of the type specified by InstanceType.
- instanceType String
- The instance type the model is deployed to.
InferenceExperimentSchedule  
- end_time str
- The timestamp at which the inference experiment ended or will end.
- start_time str
- The timestamp at which the inference experiment started or will start.
InferenceExperimentShadowModeConfig    
- ShadowModelVariants List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModelVariantConfig>
- List of shadow variant configurations.
- SourceModelVariantName string
- The name of the production variant, which takes all the inference requests.
- ShadowModelVariants []InferenceExperimentShadowModelVariantConfig
- List of shadow variant configurations.
- SourceModelVariantName string
- The name of the production variant, which takes all the inference requests.
- shadowModelVariants List<InferenceExperimentShadowModelVariantConfig>
- List of shadow variant configurations.
- sourceModelVariantName String
- The name of the production variant, which takes all the inference requests.
- shadowModelVariants InferenceExperimentShadowModelVariantConfig[]
- List of shadow variant configurations.
- sourceModelVariantName string
- The name of the production variant, which takes all the inference requests.
- shadow_model_variants Sequence[InferenceExperimentShadowModelVariantConfig]
- List of shadow variant configurations.
- source_model_variant_name str
- The name of the production variant, which takes all the inference requests.
- shadowModelVariants List<Property Map>
- List of shadow variant configurations.
- sourceModelVariantName String
- The name of the production variant, which takes all the inference requests.
InferenceExperimentShadowModelVariantConfig     
- SamplingPercentage int
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- ShadowModelVariantName string
- The name of the shadow variant.
- SamplingPercentage int
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- ShadowModelVariantName string
- The name of the shadow variant.
- samplingPercentage Integer
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadowModelVariantName String
- The name of the shadow variant.
- samplingPercentage number
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadowModelVariantName string
- The name of the shadow variant.
- sampling_percentage int
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadow_model_variant_name str
- The name of the shadow variant.
- samplingPercentage Number
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadowModelVariantName String
- The name of the shadow variant.
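A hedged TypeScript sketch of walking the shadow mode configuration returned by the function, mapping each shadow variant to the percentage of requests SageMaker replicates to it (placeholder experiment name):
import * as aws_native from "@pulumi/aws-native";

const experiment = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-inference-experiment",
});

// shadowModeConfig and its shadowModelVariants follow the types documented above.
export const shadowSampling = experiment.shadowModeConfig.apply(cfg =>
    (cfg?.shadowModelVariants ?? []).map(v => ({
        variant: v.shadowModelVariantName,
        samplingPercentage: v.samplingPercentage,
    })));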
InferenceExperimentStatus  
Tag
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0