
(s3-deployment): Support large s3 bucket deployments

See original GitHub issue

We want to enable users to deploy large amounts of data onto s3 buckets. Currently, such deployments frequently fail because we are limited by the lambda execution environment:

  • Deployment size should be no more than 500MB, because that is the amount of disk space available on the /tmp mount of the Lambda.
  • Sync duration must be less than 15 minutes, because that is the hard execution timeout for Lambda.

See https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html
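Until the construct itself handles this, one common workaround is to split the content across several smaller BucketDeployment constructs, so each handler invocation stays under the disk and timeout limits on its own. A rough sketch, assuming the site has already been pre-split into per-part directories (the names and paths here are illustrative) and relying on the construct's prune: false option so the deployments do not delete each other's objects:

import * as s3 from 'aws-cdk-lib/aws-s3';
import * as deployment from 'aws-cdk-lib/aws-s3-deployment';

// Inside a Stack or Construct:
const bucket = new s3.Bucket(this, 'SiteBucket');

// Hypothetical pre-split source directories, each small enough on its own.
const parts = ['part-0', 'part-1', 'part-2'];
for (const part of parts) {
  new deployment.BucketDeployment(this, `Deploy-${part}`, {
    destinationBucket: bucket,
    sources: [deployment.Source.asset(`./site/${part}`)],
    prune: false, // don't let one deployment delete files uploaded by another
  });
}

Each deployment gets its own Lambda invocation, so the 500MB and 15-minute limits apply per part rather than to the whole site.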

Use Case

Users use the s3-deployment module to deploy static websites to an S3 bucket. The contents of the website directory can often be quite large and contain many files.

Proposed Solution

There is no clear solution to this problem yet; this issue was created so we can discuss various approaches.

  • 👋 I may be able to implement this feature request
  • ⚠️ This feature might incur a breaking change

This is a 🚀 Feature Request

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Reactions: 24
  • Comments: 10 (3 by maintainers)

Top GitHub Comments

1 reaction
aiden-sobey commented, Nov 21, 2022

@ababakanian the memory limit needs to be increased on the custom resource that deploys to the S3 bucket behind your CloudFront distribution.

import * as deployment from 'aws-cdk-lib/aws-s3-deployment';

new deployment.BucketDeployment(this, 'deploy', {
  destinationBucket: this.bucket,
  distribution: this.distribution,
  distributionPaths: ['/*'], // Invalidate the CloudFront cache on upload
  sources: [
    deployment.Source.asset(buildPath),
  ],
  memoryLimit: 1024, // MiB for the deployment handler (default is 128)
});

The key property is the last one: memoryLimit.
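Note that raising memoryLimit also gives the handler proportionally more CPU and network throughput, which speeds up large syncs, but it does not change the size of /tmp (512 MB by default). Newer versions of the construct also expose an ephemeralStorageSize property for that; treat its availability as version-dependent.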

1 reaction
iliapolo commented, May 17, 2021

Some work has been done to solve the 500MB limitation by allowing an EFS volume to be attached. See https://github.com/aws/aws-cdk/pull/12361.

That PR has gone stale and is now closed, but if anyone wants to pick it up I'll be happy to merge it.
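For anyone landing here later: an EFS-backed option did eventually ship in the construct as a useEfs flag, which mounts an EFS file system as the handler's scratch space instead of /tmp. A minimal sketch, assuming a recent aws-cdk-lib where useEfs is available; the VPC and bucket here are placeholders, and useEfs requires a vpc to be supplied:

import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as deployment from 'aws-cdk-lib/aws-s3-deployment';

// Inside a Stack or Construct:
const vpc = new ec2.Vpc(this, 'DeploymentVpc', { maxAzs: 2 });
const bucket = new s3.Bucket(this, 'SiteBucket');

new deployment.BucketDeployment(this, 'LargeDeploy', {
  destinationBucket: bucket,
  sources: [deployment.Source.asset('./dist')],
  useEfs: true, // back the handler's working directory with EFS instead of /tmp
  vpc,          // required when useEfs is true
  memoryLimit: 1024,
});

This removes the 500MB disk ceiling, though the 15-minute Lambda timeout still applies.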

Top Results From Across the Web

aws-cdk/aws-s3-deployment module - AWS Documentation
This library allows populating an S3 bucket with the contents of .zip files from other S3 buckets or from local disk. The following...

@aws-cdk/aws-s3-deployment | Yarn - Package Manager
Constructs for deploying contents to S3 buckets ... allows populating an S3 bucket with the contents of .zip files from other S3 buckets...

b-cfn-s3-large-deployment - PyPI
BucketDeploymentSource - uses another S3 bucket object(-s) as source for the deployment to a destination bucket. Only files up to 5TB are supported...

Upload to AWS S3 template - Octopus Deploy
Octopus supports uploading an entire package, or the files contained within a package, through the Upload a package to an AWS S3...

How To Deploy Your React App to AWS S3 - Andela
Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, ...
