
aws-s3-deployment timeout on larger dataset


🐛 Bug Report

What is the problem?

Deployment to an S3 bucket with the s3-deployment module times out on a moderately sized dataset containing 6 zip files with a combined size of 271 MB. The error messages differ between runs; I got these:

  • Failed to create resource. Command '['python3', '/var/task/aws', 's3', 'cp', 's3://cdktoolkit-stagingbucket-mhglm1j9x6gh/assets/[...]f5.zip', '/tmp/tmp8p62yjtl/archive.zip']' died with <Signals.SIGKILL: 9>
  • Custom Resource failed to stabilize in expected time

Syncing the same contents with the AWS CLI works in about 4 minutes.

Running the same code as quoted below with small test files works fine, and the files appear in the destination bucket.

I did not see any timeout or data-size warnings in the docs, other than the Lambda execution limit of 15 minutes, but I am not sure that limit applies here.

Reproduction Steps

I used the Python code below with AWS CDK 1.8.0 (build 5244f97).

    # Assumes: from aws_cdk import aws_s3 as s3, aws_s3_deployment as s3deploy

    # Get the existing bucket
    layer_bucket = s3.Bucket.from_bucket_name(
        self,
        id = layer_bucket_name,
        bucket_name = layer_bucket_name
    )

    # Upload from a local directory
    s3deploy.BucketDeployment(
        self,
        id = layer_bucket_name + 'Deployment',
        source = s3deploy.Source.asset('./builds/dev/lambda_layers/'),
        destination_bucket = layer_bucket,
    )

Verbose Log

See error messages above.

Environment

  • CDK CLI Version: 1.8.0 (build 5244f97)
  • Module Version: 1.8.0
  • OS: Amazon Linux in a Docker container
  • Language: Python

Other information

n/a

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 2
  • Comments: 6 (4 by maintainers)

Top GitHub Comments

1 reaction
nmussy commented, Sep 23, 2019

The contents of the files are wholly loaded in memory and uploaded to S3. Instead of increasing the RAM to whatever the asset size will be (and then some for overhead), we could do a multi-part S3 upload, loading the file pieces in memory as needed.
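For illustration, a minimal boto3 sketch of that idea follows (this is not the deployment handler's actual code; the bucket, key, and 8 MiB part size are placeholder choices): the file is uploaded part by part, so only one chunk is ever held in memory.

    import boto3

    s3_client = boto3.client('s3')

    def multipart_upload(path, bucket, key, part_size=8 * 1024 * 1024):
        """Upload a file in fixed-size parts so only one part sits in memory at a time."""
        upload = s3_client.create_multipart_upload(Bucket=bucket, Key=key)
        parts = []
        try:
            with open(path, 'rb') as f:
                part_number = 1
                while True:
                    chunk = f.read(part_size)  # only part_size bytes in memory at once
                    if not chunk:
                        break
                    resp = s3_client.upload_part(
                        Bucket=bucket, Key=key,
                        PartNumber=part_number, UploadId=upload['UploadId'],
                        Body=chunk,
                    )
                    parts.append({'ETag': resp['ETag'], 'PartNumber': part_number})
                    part_number += 1
            s3_client.complete_multipart_upload(
                Bucket=bucket, Key=key, UploadId=upload['UploadId'],
                MultipartUpload={'Parts': parts},
            )
        except Exception:
            # Clean up the pending upload so incomplete parts don't accumulate
            s3_client.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload['UploadId'])
            raise

Note that boto3's managed transfers (upload_file / upload_fileobj) already do this chunking under the hood; the hand-rolled loop just makes the memory behaviour explicit.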

1 reaction
engineal commented, Sep 14, 2019

I just ran into this issue as well, in TypeScript. The s3 sync command tries to get the asset from the staging bucket, but if the asset is larger than the deployment Lambda’s memorySize of 128 MB, then the Lambda hits the memory limit and the s3 sync command pauses.

I used the following TypeScript code to increase the memory size: (staticBucketContents.node.findChild('CustomResourceHandler').lambdaFunction.node.defaultChild as CfnFunction).memorySize = 512; where staticBucketContents is my BucketDeployment. Even though TypeScript throws an error while building this, it still generates the template with the overridden memory size. The increased memory size fixed the issue.

It would be nice to at least be able to properly set the memorySize for the BucketDeployment’s CustomResourceHandler, or better yet, set the memorySize based off the size of the asset.
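For the Python module, a rough equivalent is sketched below. It assumes a CDK release newer than 1.8.0 that exposes the memory_limit prop on BucketDeployment (later releases also replaced the singular source prop with a sources list); layer_bucket and layer_bucket_name are the variables from the reproduction snippet above, and 512 MiB is just an example value.

    # Sketch only: requires a CDK version where BucketDeployment accepts memory_limit
    # (not available in 1.8.0). The handler's default memory is 128 MiB.
    from aws_cdk import aws_s3_deployment as s3deploy

    s3deploy.BucketDeployment(
        self,
        layer_bucket_name + 'Deployment',
        sources=[s3deploy.Source.asset('./builds/dev/lambda_layers/')],
        destination_bucket=layer_bucket,
        memory_limit=512,  # MiB given to the deployment handler Lambda
    )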


