Description
AWS recently announced account-regional namespaces for Amazon S3 general purpose buckets (March 12, 2026). This feature allows creating S3 buckets in an account-regional namespace (e.g., `mybucket-123456789012-us-east-1-an`) instead of the global namespace, eliminating the need for globally unique bucket names.
`S3Hook.create_bucket()` and `S3CreateBucketOperator` should support creating buckets in the account-regional namespace. The underlying boto3 `create_bucket` API now accepts a `BucketInfo` parameter with `Type` and `DataRedundancy` fields for this purpose.
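As a rough illustration of the requested plumbing, a helper like the following could build the `create_bucket` kwargs that the hook would forward to boto3. The `BucketInfo` shape (`Type`/`DataRedundancy` field names and the `"AccountRegional"` / `"MultiAvailabilityZone"` values) is an assumption based on the announcement wording, not a confirmed API:

```python
def build_create_bucket_params(
    bucket_name: str,
    region: str = "us-east-1",
    account_regional: bool = False,
    data_redundancy: str = "MultiAvailabilityZone",
) -> dict:
    """Build kwargs for a boto3 S3 ``create_bucket`` call.

    The ``BucketInfo`` block below is a sketch of the new parameter
    described in the AWS announcement; its exact field names and
    allowed values are assumptions, not confirmed API.
    """
    params: dict = {"Bucket": bucket_name}
    config: dict = {}
    if region != "us-east-1":
        # boto3 requires LocationConstraint for non-default regions
        config["LocationConstraint"] = region
    if account_regional:
        config["BucketInfo"] = {
            "Type": "AccountRegional",          # assumed value
            "DataRedundancy": data_redundancy,  # assumed value
        }
    if config:
        params["CreateBucketConfiguration"] = config
    return params

# Usage (sketch):
#   import boto3
#   boto3.client("s3").create_bucket(
#       **build_create_bucket_params("mybucket", "eu-west-1", account_regional=True)
#   )
```

The hook could expose this via a new keyword argument (e.g. `bucket_info=...`) that defaults to `None`, preserving the current global-namespace behavior.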
Use case/motivation
Users who want to create S3 buckets in their account-regional namespace using Airflow DAGs currently have no way to pass the required `BucketInfo` parameter through `S3Hook.create_bucket()` or `S3CreateBucketOperator`.
With the new S3 account-regional namespace, organizations can:
- Create buckets without worrying about global name uniqueness
- Use IAM policies and SCPs with the `s3:x-amz-bucket-namespace` condition key to enforce that buckets are only created in the account-regional namespace
- Build workloads that use a bucket per customer, team, or dataset with predictable naming
Related issues
- AWS announcement: https://aws.amazon.com/about-aws/whats-new/2026/03/amazon-s3-account-regional-namespaces/
- AWS blog post: https://aws.amazon.com/blogs/aws/introducing-account-regional-namespaces-for-amazon-s3-general-purpose-buckets/
- Terraform AWS provider tracking issue: hashicorp/terraform-provider-aws#46902 (`r/aws_s3_bucket`: Account regional namespaces for general purpose buckets)
Are you willing to submit a PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct