We recommend new projects start with resources from the AWS provider.
aws-native.mwaa.Environment
Resource schema for AWS::MWAA::Environment
Create Environment Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new Environment(name: string, args?: EnvironmentArgs, opts?: CustomResourceOptions);

@overload
def Environment(resource_name: str,
                args: Optional[EnvironmentArgs] = None,
                opts: Optional[ResourceOptions] = None)
@overload
def Environment(resource_name: str,
                opts: Optional[ResourceOptions] = None,
                airflow_configuration_options: Optional[Any] = None,
                airflow_version: Optional[str] = None,
                dag_s3_path: Optional[str] = None,
                endpoint_management: Optional[EnvironmentEndpointManagement] = None,
                environment_class: Optional[str] = None,
                execution_role_arn: Optional[str] = None,
                kms_key: Optional[str] = None,
                logging_configuration: Optional[EnvironmentLoggingConfigurationArgs] = None,
                max_webservers: Optional[int] = None,
                max_workers: Optional[int] = None,
                min_webservers: Optional[int] = None,
                min_workers: Optional[int] = None,
                name: Optional[str] = None,
                network_configuration: Optional[EnvironmentNetworkConfigurationArgs] = None,
                plugins_s3_object_version: Optional[str] = None,
                plugins_s3_path: Optional[str] = None,
                requirements_s3_object_version: Optional[str] = None,
                requirements_s3_path: Optional[str] = None,
                schedulers: Optional[int] = None,
                source_bucket_arn: Optional[str] = None,
                startup_script_s3_object_version: Optional[str] = None,
                startup_script_s3_path: Optional[str] = None,
                tags: Optional[Any] = None,
                webserver_access_mode: Optional[EnvironmentWebserverAccessMode] = None,
                weekly_maintenance_window_start: Optional[str] = None)

func NewEnvironment(ctx *Context, name string, args *EnvironmentArgs, opts ...ResourceOption) (*Environment, error)

public Environment(string name, EnvironmentArgs? args = null, CustomResourceOptions? opts = null)
public Environment(String name, EnvironmentArgs args)
public Environment(String name, EnvironmentArgs args, CustomResourceOptions options)
type: aws-native:mwaa:Environment
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
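
As a sketch, the YAML skeleton above might be filled in like this. The bucket and role ARNs reuse the examples from the property descriptions below; the environment name, subnet IDs, and security group IDs are placeholders, not working values:

```yaml
type: aws-native:mwaa:Environment
properties:
  name: my-airflow-environment          # placeholder name
  airflowVersion: "2.9.2"
  environmentClass: mw1.small
  dagS3Path: dags
  sourceBucketArn: arn:aws:s3:::my-airflow-bucket-unique-name
  executionRoleArn: arn:aws:iam::123456789:role/my-execution-role
  networkConfiguration:
    securityGroupIds: ["sg-0123456789abcdef0"]
    subnetIds: ["subnet-0123456789abcdef0", "subnet-0123456789abcdef1"]
  webserverAccessMode: PRIVATE_ONLY
```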
Parameters
- name string
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Environment Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
The Environment resource accepts the following input properties:
- AirflowConfigurationOptions object
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section. For example, [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- AirflowVersion string
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- DagS3Path string
- The relative path to the DAGs folder in your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- EndpointManagement Pulumi.AwsNative.Mwaa.EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA creates and manages the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- EnvironmentClass string
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- ExecutionRoleArn string
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- KmsKey string
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- LoggingConfiguration Pulumi.AwsNative.Mwaa.Inputs.EnvironmentLoggingConfiguration
- The Apache Airflow logs sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- MaxWebservers int
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA increases the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- MaxWorkers int
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- MinWebservers int
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- MinWorkers int
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- Name string
- The name of your Amazon MWAA environment.
- NetworkConfiguration Pulumi.AwsNative.Mwaa.Inputs.EnvironmentNetworkConfiguration
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- PluginsS3ObjectVersion string
- The version of the plugins.zip file in your Amazon S3 bucket. To learn more, see Installing custom plugins.
- PluginsS3Path string
- The relative path to the plugins.zip file in your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- RequirementsS3ObjectVersion string
- The version of the requirements.txt file in your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- RequirementsS3Path string
- The relative path to the requirements.txt file in your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- Schedulers int
- The number of schedulers that you want to run in your environment. Valid values: v2 - accepts between 2 and 5, defaults to 2; v1 - accepts 1.
- SourceBucketArn string
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- StartupScriptS3ObjectVersion string
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- StartupScriptS3Path string
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- Tags object
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- WebserverAccessMode Pulumi.AwsNative.Mwaa.EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- WeeklyMaintenanceWindowStart string
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
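
The AirflowConfigurationOptions property above flattens Airflow's [section] key = value configuration style into section-prefixed keys. A minimal sketch of that flattening in Python (the helper name is illustrative, not part of the provider):

```python
def flatten_airflow_options(sections: dict) -> dict:
    """Flatten {"core": {"dags_folder": "..."}} into MWAA's
    section-prefixed form: {"core.dags_folder": "..."}."""
    return {
        f"{section}.{key}": value
        for section, options in sections.items()
        for key, value in options.items()
    }

# The [core] dags_folder example from the description above:
opts = flatten_airflow_options({
    "core": {"dags_folder": "{AIRFLOW_HOME}/dags"},
})
# opts == {"core.dags_folder": "{AIRFLOW_HOME}/dags"}
```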
 
- AirflowConfigurationOptions interface{}
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section. For example, [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- AirflowVersion string
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- DagS3Path string
- The relative path to the DAGs folder in your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- EndpointManagement EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA creates and manages the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- EnvironmentClass string
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- ExecutionRoleArn string
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- KmsKey string
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- LoggingConfiguration EnvironmentLoggingConfigurationArgs
- The Apache Airflow logs sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- MaxWebservers int
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA increases the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- MaxWorkers int
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- MinWebservers int
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- MinWorkers int
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- Name string
- The name of your Amazon MWAA environment.
- NetworkConfiguration EnvironmentNetworkConfigurationArgs
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- PluginsS3ObjectVersion string
- The version of the plugins.zip file in your Amazon S3 bucket. To learn more, see Installing custom plugins.
- PluginsS3Path string
- The relative path to the plugins.zip file in your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- RequirementsS3ObjectVersion string
- The version of the requirements.txt file in your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- RequirementsS3Path string
- The relative path to the requirements.txt file in your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- Schedulers int
- The number of schedulers that you want to run in your environment. Valid values: v2 - accepts between 2 and 5, defaults to 2; v1 - accepts 1.
- SourceBucketArn string
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- StartupScriptS3ObjectVersion string
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- StartupScriptS3Path string
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- Tags interface{}
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- WebserverAccessMode EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- WeeklyMaintenanceWindowStart string
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
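
A WeeklyMaintenanceWindowStart value can be checked locally before a deployment; a small sketch in Python whose pattern mirrors the supported-input expression quoted in the description (the helper name is illustrative):

```python
import re

# DAY:HH:MM in 30-minute increments, per the supported-input pattern above
MAINTENANCE_WINDOW = re.compile(
    r"^(MON|TUE|WED|THU|FRI|SAT|SUN):([01]\d|2[0-3]):(00|30)$"
)

def is_valid_window(value: str) -> bool:
    """Return True if value matches the documented DAY:HH:MM format."""
    return MAINTENANCE_WINDOW.match(value) is not None

print(is_valid_window("TUE:03:30"))  # → True (the documented example)
print(is_valid_window("TUE:03:15"))  # → False (not a 30-minute increment)
```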
 
- airflowConfigurationOptions Object
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section. For example, [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- airflowVersion String
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- dagS3Path String
- The relative path to the DAGs folder in your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- endpointManagement EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA creates and manages the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- environmentClass String
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- executionRoleArn String
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- kmsKey String
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- loggingConfiguration EnvironmentLoggingConfiguration
- The Apache Airflow logs sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- maxWebservers Integer
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA increases the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- maxWorkers Integer
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- minWebservers Integer
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: for environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- minWorkers Integer
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name String
- The name of your Amazon MWAA environment.
- networkConfiguration EnvironmentNetworkConfiguration
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- pluginsS3ObjectVersion String
- The version of the plugins.zip file in your Amazon S3 bucket. To learn more, see Installing custom plugins.
- pluginsS3Path String
- The relative path to the plugins.zip file in your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirementsS3ObjectVersion String
- The version of the requirements.txt file in your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirementsS3Path String
- The relative path to the requirements.txt file in your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers Integer
- The number of schedulers that you want to run in your environment. Valid values: v2 - accepts between 2 and 5, defaults to 2; v1 - accepts 1.
- sourceBucketArn String
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startupScriptS3ObjectVersion String
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- startupScriptS3Path String
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags Object
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- webserverAccessMode EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weeklyMaintenanceWindowStart String
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
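
The webserver scaling bounds described above (2 to 5 for environments larger than mw1.micro, and the minimum never exceeding the maximum) can be sanity-checked before creating the resource. A hedged Python sketch, not provider logic; the assumption that mw1.micro accepts only 1 is inferred from its documented default:

```python
def check_webserver_counts(min_webservers: int, max_webservers: int,
                           environment_class: str = "mw1.small") -> None:
    """Raise ValueError if the counts violate the documented ranges."""
    # Assumption: mw1.micro allows only 1; larger classes allow 2-5.
    low, high = (1, 1) if environment_class == "mw1.micro" else (2, 5)
    for name, value in (("MinWebservers", min_webservers),
                        ("MaxWebservers", max_webservers)):
        if not low <= value <= high:
            raise ValueError(f"{name} must be between {low} and {high}")
    if min_webservers > max_webservers:
        raise ValueError("MinWebservers cannot exceed MaxWebservers")

check_webserver_counts(2, 5)  # within range for mw1.small
```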
 
- airflowConfiguration anyOptions 
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section: - [core] dags_folder={AIRFLOW_HOME}/dags- Would be represented as - "core.dags_folder": "{AIRFLOW_HOME}/dags" - Search the CloudFormation User Guide for - AWS::MWAA::Environmentfor more information about the expected schema for this property.
- airflowVersion string
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. - If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. - Allowed Values : - 1.10.12|- 2.0.2|- 2.2.2|- 2.4.3|- 2.5.1|- 2.6.3|- 2.7.2|- 2.8.1|- 2.9.2(latest)
- dagS3Path string
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs .
- endpointManagement EnvironmentEndpoint Management 
- Defines whether the VPC endpoints configured for the environment are created, and managed, by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set toCUSTOMER, you must create, and manage, the VPC endpoints in your VPC.
- environmentClass string
- The environment class type. Valid values: mw1.small,mw1.medium,mw1.large. To learn more, see Amazon MWAA environment class .
- executionRole stringArn 
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role .
- kmsKey string
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- loggingConfiguration EnvironmentLogging Configuration 
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs,SchedulerLogs,TaskLogs,WebserverLogs,WorkerLogs.
- maxWebservers number
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for - MaxWebserverswhen you interact with your Apache Airflow environment using Apache Airflow REST API, or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in- MaxWebserers. As TPS rates decrease Amazon MWAA disposes of the additional web servers, and scales down to the number set in- MinxWebserers.- Valid values: For environments larger than mw1.micro, accepts values from - 2to- 5. Defaults to- 2for all environment sizes except mw1.micro, which defaults to- 1.
- maxWorkers number
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkersfield. For example,20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify inMinWorkers.
- minWebservers number
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- minWorkers number
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name string
- The name of your Amazon MWAA environment.
- networkConfiguration EnvironmentNetworkConfiguration
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- pluginsS3ObjectVersion string
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- pluginsS3Path string
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirementsS3ObjectVersion string
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirementsS3Path string
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers number
- The number of schedulers that you want to run in your environment. Valid values:
- v2 - Accepts values from 2 to 5. Defaults to 2.
- v1 - Accepts 1.

- sourceBucketArn string
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startupScriptS3ObjectVersion string
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- startupScriptS3Path string
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags any
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- webserverAccessMode EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weeklyMaintenanceWindowStart string
- The day and time of the week to start weekly maintenance updates of your environment, in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input matches the pattern: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30)
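The maintenance-window pattern above can be checked locally before deploying. This is a sketch using Python's standard re module; the is_valid_window helper name is illustrative, not part of any SDK:

```python
import re

# Pattern from the weeklyMaintenanceWindowStart documentation above:
# a three-letter weekday, then HH:MM restricted to 30-minute increments.
WINDOW_PATTERN = re.compile(r"^(MON|TUE|WED|THU|FRI|SAT|SUN):([01]\d|2[0-3]):(00|30)$")

def is_valid_window(value: str) -> bool:
    """Return True if value is an acceptable maintenance window start."""
    return WINDOW_PATTERN.fullmatch(value) is not None

print(is_valid_window("TUE:03:30"))  # True
print(is_valid_window("TUE:03:15"))  # False: minutes must be 00 or 30
```

Validating the value in code this way surfaces a bad window at preview time rather than during the cloud-side update.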
 
- airflow_configuration_options Any
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section. For example, [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- airflow_version str
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest)
- dag_s3_path str
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- endpoint_management EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- environment_class str
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- execution_role_arn str
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- kms_key str
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- logging_configuration EnvironmentLoggingConfigurationArgs
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- max_webservers int
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- max_workers int
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- min_webservers int
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- min_workers int
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name str
- The name of your Amazon MWAA environment.
- network_configuration EnvironmentNetworkConfigurationArgs
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- plugins_s3_object_version str
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- plugins_s3_path str
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirements_s3_object_version str
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirements_s3_path str
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers int
- The number of schedulers that you want to run in your environment. Valid values:
- v2 - Accepts values from 2 to 5. Defaults to 2.
- v1 - Accepts 1.

- source_bucket_arn str
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startup_script_s3_object_version str
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- startup_script_s3_path str
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags Any
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- webserver_access_mode EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weekly_maintenance_window_start str
- The day and time of the week to start weekly maintenance updates of your environment, in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input matches the pattern: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30)
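The section-prefixing rule for airflow_configuration_options can be sketched with the standard library's configparser; the to_mwaa_options helper below is illustrative, not part of any SDK:

```python
import configparser

def to_mwaa_options(ini_text: str) -> dict:
    """Flatten Airflow-style INI settings into the 'section.key' map
    shape that airflow_configuration_options expects."""
    # interpolation=None so values like {AIRFLOW_HOME}/dags pass through untouched
    parser = configparser.ConfigParser(interpolation=None)
    parser.read_string(ini_text)
    return {
        f"{section}.{key}": value
        for section in parser.sections()
        for key, value in parser.items(section)
    }

print(to_mwaa_options("[core]\ndags_folder = {AIRFLOW_HOME}/dags\n"))
# {'core.dags_folder': '{AIRFLOW_HOME}/dags'}
```

Note that configparser lowercases keys by default, which matches the lowercase section.key form shown in the description above.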
 
- airflowConfigurationOptions Any
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section. For example, [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- airflowVersion String
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest)
- dagS3Path String
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- endpointManagement "CUSTOMER" | "SERVICE"
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- environmentClass String
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- executionRoleArn String
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- kmsKey String
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- loggingConfiguration Property Map
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- maxWebservers Number
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- maxWorkers Number
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- minWebservers Number
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- minWorkers Number
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name String
- The name of your Amazon MWAA environment.
- networkConfiguration Property Map
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- pluginsS3ObjectVersion String
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- pluginsS3Path String
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirementsS3ObjectVersion String
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirementsS3Path String
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers Number
- The number of schedulers that you want to run in your environment. Valid values:
- v2 - Accepts values from 2 to 5. Defaults to 2.
- v1 - Accepts 1.

- sourceBucketArn String
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startupScriptS3ObjectVersion String
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- startupScriptS3Path String
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags Any
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- webserverAccessMode "PRIVATE_ONLY" | "PUBLIC_ONLY"
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weeklyMaintenanceWindowStart String
- The day and time of the week to start weekly maintenance updates of your environment, in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input matches the pattern: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30)
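Taken together, the properties above can be sketched as a minimal Pulumi program. This is an untested illustration, not a definitive recipe: the bucket ARN, role ARN, subnet IDs, and security group IDs are placeholders, and the Args class names follow the supporting types listed below.

```python
import pulumi
import pulumi_aws_native as aws_native

# Minimal environment sketch; all ARNs and network IDs are placeholders
# you would replace with references to real resources.
env = aws_native.mwaa.Environment(
    "example",
    name="my-airflow-environment",
    airflow_version="2.9.2",
    environment_class="mw1.small",
    dag_s3_path="dags",
    source_bucket_arn="arn:aws:s3:::my-airflow-bucket-unique-name",
    execution_role_arn="arn:aws:iam::123456789012:role/my-execution-role",
    min_workers=1,
    max_workers=10,
    webserver_access_mode=aws_native.mwaa.EnvironmentWebserverAccessMode.PUBLIC_ONLY,
    network_configuration=aws_native.mwaa.EnvironmentNetworkConfigurationArgs(
        security_group_ids=["sg-0123456789abcdef0"],
        # two private subnets in the same VPC, in different availability zones
        subnet_ids=["subnet-aaaa1111", "subnet-bbbb2222"],
    ),
    logging_configuration=aws_native.mwaa.EnvironmentLoggingConfigurationArgs(
        task_logs=aws_native.mwaa.EnvironmentModuleLoggingConfigurationArgs(
            enabled=True,
            log_level=aws_native.mwaa.EnvironmentLoggingLevel.INFO,
        ),
    ),
)

# Output properties such as webserver_url become available after creation.
pulumi.export("webserver_url", env.webserver_url)
```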
 
Outputs
All input properties are implicitly available as output properties. Additionally, the Environment resource produces the following output properties:
- Arn string
- The ARN for the Amazon MWAA environment.
- CeleryExecutorQueue string
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- DatabaseVpcEndpointService string
- The VPC endpoint for the environment's Amazon RDS database.
- Id string
- The provider-assigned unique ID for this managed resource.
- WebserverUrl string
- The URL of your Apache Airflow UI.
- WebserverVpcEndpointService string
- The VPC endpoint for the environment's web server.
- Arn string
- The ARN for the Amazon MWAA environment.
- CeleryExecutorQueue string
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- DatabaseVpcEndpointService string
- The VPC endpoint for the environment's Amazon RDS database.
- Id string
- The provider-assigned unique ID for this managed resource.
- WebserverUrl string
- The URL of your Apache Airflow UI.
- WebserverVpcEndpointService string
- The VPC endpoint for the environment's web server.
- arn String
- The ARN for the Amazon MWAA environment.
- celeryExecutorQueue String
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- databaseVpcEndpointService String
- The VPC endpoint for the environment's Amazon RDS database.
- id String
- The provider-assigned unique ID for this managed resource.
- webserverUrl String
- The URL of your Apache Airflow UI.
- webserverVpcEndpointService String
- The VPC endpoint for the environment's web server.
- arn string
- The ARN for the Amazon MWAA environment.
- celeryExecutorQueue string
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- databaseVpcEndpointService string
- The VPC endpoint for the environment's Amazon RDS database.
- id string
- The provider-assigned unique ID for this managed resource.
- webserverUrl string
- The URL of your Apache Airflow UI.
- webserverVpcEndpointService string
- The VPC endpoint for the environment's web server.
- arn str
- The ARN for the Amazon MWAA environment.
- celery_executor_queue str
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- database_vpc_endpoint_service str
- The VPC endpoint for the environment's Amazon RDS database.
- id str
- The provider-assigned unique ID for this managed resource.
- webserver_url str
- The URL of your Apache Airflow UI.
- webserver_vpc_endpoint_service str
- The VPC endpoint for the environment's web server.
- arn String
- The ARN for the Amazon MWAA environment.
- celeryExecutorQueue String
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- databaseVpcEndpointService String
- The VPC endpoint for the environment's Amazon RDS database.
- id String
- The provider-assigned unique ID for this managed resource.
- webserverUrl String
- The URL of your Apache Airflow UI.
- webserverVpcEndpointService String
- The VPC endpoint for the environment's web server.
Supporting Types
EnvironmentEndpointManagement, EnvironmentEndpointManagementArgs      
- Customer
- CUSTOMER
- Service
- SERVICE
- EnvironmentEndpointManagementCustomer
- CUSTOMER
- EnvironmentEndpointManagementService
- SERVICE
- Customer
- CUSTOMER
- Service
- SERVICE
- Customer
- CUSTOMER
- Service
- SERVICE
- CUSTOMER
- CUSTOMER
- SERVICE
- SERVICE
- "CUSTOMER"
- CUSTOMER
- "SERVICE"
- SERVICE
EnvironmentLoggingConfiguration, EnvironmentLoggingConfigurationArgs      
- DagProcessingLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- SchedulerLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- TaskLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- WebserverLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- WorkerLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- DagProcessingLogs EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- SchedulerLogs EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- TaskLogs EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- WebserverLogs EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- WorkerLogs EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- dagProcessingLogs EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- schedulerLogs EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- taskLogs EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- webserverLogs EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- workerLogs EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- dagProcessingLogs EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- schedulerLogs EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- taskLogs EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- webserverLogs EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- workerLogs EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- dag_processing_logs EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- scheduler_logs EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- task_logs EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- webserver_logs EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- worker_logs EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- dagProcessingLogs Property Map
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- schedulerLogs Property Map
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- taskLogs Property Map
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- webserverLogs Property Map
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- workerLogs Property Map
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
EnvironmentLoggingLevel, EnvironmentLoggingLevelArgs      
- Critical
- CRITICAL
- Error
- ERROR
- Warning
- WARNING
- Info
- INFO
- Debug
- DEBUG
- EnvironmentLoggingLevelCritical
- CRITICAL
- EnvironmentLoggingLevelError
- ERROR
- EnvironmentLoggingLevelWarning
- WARNING
- EnvironmentLoggingLevelInfo
- INFO
- EnvironmentLoggingLevelDebug
- DEBUG
- Critical
- CRITICAL
- Error
- ERROR
- Warning
- WARNING
- Info
- INFO
- Debug
- DEBUG
- Critical
- CRITICAL
- Error
- ERROR
- Warning
- WARNING
- Info
- INFO
- Debug
- DEBUG
- CRITICAL
- CRITICAL
- ERROR
- ERROR
- WARNING
- WARNING
- INFO
- INFO
- DEBUG
- DEBUG
- "CRITICAL"
- CRITICAL
- "ERROR"
- ERROR
- "WARNING"
- WARNING
- "INFO"
- INFO
- "DEBUG"
- DEBUG
EnvironmentModuleLoggingConfiguration, EnvironmentModuleLoggingConfigurationArgs        
- CloudWatchLogGroupArn string
- The ARN of the CloudWatch Logs log group for each Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- Enabled bool
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- LogLevel Pulumi.AwsNative.Mwaa.EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- CloudWatchLogGroupArn string
- The ARN of the CloudWatch Logs log group for each Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- Enabled bool
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- LogLevel EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- cloudWatchLogGroupArn String
- The ARN of the CloudWatch Logs log group for each Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- enabled Boolean
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- logLevel EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- cloudWatchLogGroupArn string
- The ARN of the CloudWatch Logs log group for each Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- enabled boolean
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- logLevel EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- cloud_watch_log_group_arn str
- The ARN of the CloudWatch Logs log group for each Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- enabled bool
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- log_level EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- cloudWatchLogGroupArn String
- The ARN of the CloudWatch Logs log group for each Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- enabled Boolean
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- logLevel "CRITICAL" | "ERROR" | "WARNING" | "INFO" | "DEBUG"
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
EnvironmentNetworkConfiguration, EnvironmentNetworkConfigurationArgs      
- SecurityGroupIds List<string>
- A list of security groups to use for the environment.
- SubnetIds List<string>
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- SecurityGroupIds []string
- A list of security groups to use for the environment.
- SubnetIds []string
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- securityGroupIds List<String>
- A list of security groups to use for the environment.
- subnetIds List<String>
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- securityGroupIds string[]
- A list of security groups to use for the environment.
- subnetIds string[]
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- security_group_ids Sequence[str]
- A list of security groups to use for the environment.
- subnet_ids Sequence[str]
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- securityGroupIds List<String>
- A list of security groups to use for the environment.
- subnetIds List<String>
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
EnvironmentWebserverAccessMode, EnvironmentWebserverAccessModeArgs        
- PrivateOnly 
- PRIVATE_ONLY
- PublicOnly 
- PUBLIC_ONLY
- EnvironmentWebserverAccessModePrivateOnly
- PRIVATE_ONLY
- EnvironmentWebserverAccessModePublicOnly
- PUBLIC_ONLY
- PrivateOnly 
- PRIVATE_ONLY
- PublicOnly 
- PUBLIC_ONLY
- PrivateOnly 
- PRIVATE_ONLY
- PublicOnly 
- PUBLIC_ONLY
- PRIVATE_ONLY
- PRIVATE_ONLY
- PUBLIC_ONLY
- PUBLIC_ONLY
- "PRIVATE_ONLY"
- PRIVATE_ONLY
- "PUBLIC_ONLY"
- PUBLIC_ONLY
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0