(s3-deployment): Support large s3 bucket deployments
We want to enable users to deploy large amounts of data to S3 buckets. Currently, such deployments frequently fail because we are limited by the Lambda execution environment:
- Deployment size must be no more than 500MB, because that is the amount of disk space available on the /tmp mount of the Lambda.
- Sync duration must be less than 15 minutes, because that is the hard execution timeout for Lambda.
See https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html
Use Case
Users use the s3-deployment module to deploy static websites to an S3 bucket. On many occasions, the contents of the website directory can be quite large and contain many files.
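For context, a typical use of the module looks like the sketch below (aws-cdk-lib v2 API; the bucket and source path names are illustrative):

```typescript
import { Stack } from 'aws-cdk-lib';
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as s3deploy from 'aws-cdk-lib/aws-s3-deployment';

declare const stack: Stack; // an existing stack

// Destination bucket for the static website.
const websiteBucket = new s3.Bucket(stack, 'WebsiteBucket');

// Deploy the local build output into the bucket. Behind the scenes this
// runs a Lambda-backed custom resource, which is where the /tmp-size and
// 15-minute limits described above come into play.
new s3deploy.BucketDeployment(stack, 'DeployWebsite', {
  sources: [s3deploy.Source.asset('./website-dist')],
  destinationBucket: websiteBucket,
});
```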
See:
- https://github.com/aws/aws-cdk/issues/7571
- https://github.com/aws/aws-cdk/issues/7316
- https://github.com/aws/aws-cdk/issues/6893
Proposed Solution
There is no clear solution to this problem yet; this issue was created so we can discuss various approaches.
- 👋 I may be able to implement this feature request
- ⚠️ This feature might incur a breaking change
This is a 🚀 Feature Request
Issue Analytics
- Created: 3 years ago
- Reactions: 24
- Comments: 10 (3 by maintainers)
@ababakanian the memory limit needs to be increased on the custom resource that deploys to the S3 bucket behind your CloudFront distribution. You want the last line in that: `memoryLimit`.

Some work has been done to solve the 500MB limitation by allowing an EFS volume to be attached. See https://github.com/aws/aws-cdk/pull/12361. That PR has gone stale and is now closed, but if anyone wants to pick it up I'll be happy to merge it.
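A sketch of both workarounds mentioned above, assuming a recent aws-cdk-lib v2: `memoryLimit` is a documented `BucketDeployment` prop, and `useEfs` (mount an EFS volume instead of relying on /tmp) was eventually added to the library in later versions, so check that your CDK version supports it before relying on this:

```typescript
import { Stack } from 'aws-cdk-lib';
import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as s3deploy from 'aws-cdk-lib/aws-s3-deployment';

declare const stack: Stack;           // an existing stack
declare const websiteBucket: s3.Bucket; // an existing destination bucket

new s3deploy.BucketDeployment(stack, 'DeployLargeWebsite', {
  sources: [s3deploy.Source.asset('./website-dist')],
  destinationBucket: websiteBucket,
  // Raise the deployment Lambda's memory (MiB) for large syncs.
  memoryLimit: 1024,
  // Mount an EFS volume so staging is not bounded by /tmp disk space.
  // Requires placing the deployment Lambda in a VPC.
  useEfs: true,
  vpc: new ec2.Vpc(stack, 'DeploymentVpc'),
});
```

Note that neither setting changes the 15-minute Lambda timeout, so very long syncs can still fail on duration.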