[pipelines] Asset stage can't support more than 50 assets.
When attempting to deploy a pipeline with more than 50 assets, the deployment fails because the Assets stage ends up with more than 50 actions.
Reproduction Steps
Deploy a pipeline with > 50 assets.
// 32 iterations × 2 assets each = 64 assets, which exceeds the 50-action limit
for (let i = 0; i < 32; i++) {
  const websiteBucket = new s3.Bucket(this, `WebsiteBucket${i}`, {
    websiteIndexDocument: 'index.html',
  });
  new s3deploy.BucketDeployment(this, `DeployWebsite${i}`, {
    sources: [s3deploy.Source.asset('./test-file-asset', { assetHash: `file-asset-${i}` })],
    destinationBucket: websiteBucket,
  });
  new ecr_assets.DockerImageAsset(this, `DockerAsset${i}`, {
    directory: './test-docker-asset',
    extraHash: `docker-asset-${i}`,
  });
}
Error Log
On deploy, the following error is hit:
Pipeline stage 'Assets' has too many actions. There can only be up to 50 actions in a pipeline stage (Service: AWSCodePipeline; Status Code: 400; Error Code: InvalidStageDeclarationException; Request ID: xxx)
Other
The likely approach is to dynamically add new asset publishing stages to the pipeline when more than 50 assets are created. This will require API support in CodePipeline, as stages currently can't be inserted, only appended to the end of the pipeline.
This is a 🐛 Bug Report.
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 7
- Comments: 6 (3 by maintainers)
Top GitHub Comments
@rix0rrr Is there a plan yet for how this will be implemented? I'm coming pretty close to this limit (48 assets), so if there is any way I can help push this through, that would be great!
AWS CodePipeline has (at the time of this writing) a non-adjustable quota of 50 total stages per pipeline and 50 total actions per stage. A high number of artifacts can drive a pipeline past these limits, hitting the non-adjustable quota.
What about using AWS Step Functions to execute AWS CodeBuild projects using a Map state? CodePipeline has a native integration with Step Functions, so no custom code would be required for such invocation. Step Functions has a native integration with CodeBuild, so no custom code would be required here as well.
The flow would be: CDK Pipelines Assets stage => Step Functions action => Map state with artifact IDs (potentially with concurrency) => CodeBuild project => cdk-assets command. The Map state could hold the IDs of the artifacts (managed by the pipelines module), for example, and set them as environment variables for the CodeBuild project (using the environmentVariablesOverride parameter). The build would then use cdk-assets to upload these artifacts. The Map state supports concurrency, which can further speed up processing. Step Functions now supports payload sizes of up to 256 KB, which should allow processing a large number of artifacts in a single CodePipeline stage and action. The number of artifacts processed depends on the actual parameters passed to CodeBuild from the Map state array.
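A sketch of what that Map state might look like in Amazon States Language, using the native startBuild.sync CodeBuild integration (the project name, input path, and environment variable name here are placeholders, not anything the pipelines module actually emits):

```json
{
  "PublishAssets": {
    "Type": "Map",
    "ItemsPath": "$.artifactIds",
    "MaxConcurrency": 10,
    "Iterator": {
      "StartAt": "PublishOne",
      "States": {
        "PublishOne": {
          "Type": "Task",
          "Resource": "arn:aws:states:::codebuild:startBuild.sync",
          "Parameters": {
            "ProjectName": "cdk-assets-publish",
            "EnvironmentVariablesOverride": [
              { "Name": "ASSET_ID", "Type": "PLAINTEXT", "Value.$": "$" }
            ]
          },
          "End": true
        }
      }
    }
  }
}
```

Each Map iteration starts one build of the (hypothetical) cdk-assets-publish project with the current artifact ID injected as an environment variable, and MaxConcurrency caps how many builds run in parallel.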