I'm going to build on the other answers here for completeness.
I have moved my bucket to a subdomain so that the contents can be cached by Cloudflare.
- Old S3 Bucket Name: autoauctions
- New S3 Bucket Name: img.autoauctions.io
- CNAME record: img.autoauctions.io → img.autoauctions.io.s3.amazonaws.com
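The reason the new bucket must be named exactly after the subdomain: with virtual-hosted-style requests, S3 reads the bucket name from the request's hostname, so the CNAME only works if the two match. A minimal sketch of how the URL is formed (the `copart/example.jpg` key is a hypothetical placeholder):

```python
def s3_virtual_hosted_url(bucket: str, key: str) -> str:
    # Virtual-hosted-style URL: the bucket name is the hostname prefix.
    # This is why the bucket must be named exactly like the subdomain
    # that the CNAME record points at S3.
    return f"https://{bucket}.s3.amazonaws.com/{key}"

print(s3_virtual_hosted_url("img.autoauctions.io", "copart/example.jpg"))
```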
Now you'll need to copy all of your objects to the new bucket, since you cannot rename a bucket. Here's how to do that with the AWS CLI:
pip install awscli
aws configure
Now you'll copy your old bucket contents to your new bucket.
aws s3 sync s3://autoauctions s3://img.autoauctions.io
I found this to be too slow for the 1TB of images I needed to copy, so I increased the number of concurrent requests and re-ran the sync from an EC2 instance:
aws configure set default.s3.max_concurrent_requests 400
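Under the hood, `aws configure set` just writes the value into `~/.aws/config`; the resulting entry looks roughly like this (a sketch of the `s3` settings section for the `default` profile):

```ini
[default]
s3 =
    max_concurrent_requests = 400
```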
Then re-run the sync:
aws s3 sync s3://autoauctions s3://img.autoauctions.io
Want to make folders within your bucket public? Create a bucket policy like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObjectCopart",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::img.autoauctions.io/copart/*"
    },
    {
      "Sid": "PublicReadGetObjectIaai",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::img.autoauctions.io/iaai/*"
    }
  ]
}
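If you have more than a couple of prefixes to expose, you can generate the policy instead of hand-editing it. A small sketch (the `public_read_policy` helper is my own, not part of any AWS SDK; it just builds the same JSON shape as above, giving each statement a unique Sid):

```python
import json

def public_read_policy(bucket: str, prefixes: list[str]) -> dict:
    """Build an S3 bucket policy granting anonymous s3:GetObject on each prefix.

    Each statement gets its own Sid (e.g. PublicReadGetObjectCopart),
    since Sids must be unique within a single bucket policy.
    """
    statements = [
        {
            "Sid": f"PublicReadGetObject{prefix.capitalize()}",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/{prefix}/*",
        }
        for prefix in prefixes
    ]
    return {"Version": "2012-10-17", "Statement": statements}

policy = public_read_policy("img.autoauctions.io", ["copart", "iaai"])
print(json.dumps(policy, indent=2))
```

You can paste the printed JSON straight into the bucket's Permissions → Bucket policy editor.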
And now the image loads from img.autoauctions.io via Cloudflare's cache.
Hope this helps some people!