Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

force:source:push -- ERROR: Maximum size of request reached. Maximum size of request is 52428800 bytes

See original GitHub issue

Summary

Trying to use force:source:push to deliver metadata and receiving an error that the request is too large.

Steps To Reproduce:

Add a significant amount of metadata, most easily static resource files. Static resources have an individual max of 5MB so you’ll need a few of them. The compressed total size should be > 40MB to be safe.

Try to deliver that metadata to an org using sfdx force:source:push.
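As a rough sketch of the setup above (paths, file count, and sizes are illustrative; the source-format layout for static resources is assumed, and the org-facing push is shown commented out):

```shell
# Generate ~44MB of incompressible static resources, enough to blow past
# the ~39MB raw-byte budget of a single deploy request.
RES=force-app/main/default/staticresources
mkdir -p "$RES"
for i in $(seq 1 11); do
  # 4MB each, safely under the 5MB per-resource cap
  dd if=/dev/urandom of="$RES/big$i.resource" bs=1048576 count=4 2>/dev/null
  cat > "$RES/big$i.resource-meta.xml" <<EOF
<?xml version="1.0" encoding="UTF-8"?>
<StaticResource xmlns="http://soap.sforce.com/2006/04/metadata">
    <contentType>application/octet-stream</contentType>
</StaticResource>
EOF
done
# sfdx force:source:push   # fails: EXCEEDED_MAX_SIZE_REQUEST: Maximum size
#                          # of request is 52428800 bytes
```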

Expected result

Push would be successful and deal with the various size limitations without crashing.

DX already pre-compresses the static resources into zip files and stores them in the temp directory. It might be appropriate to have a flag or other configuration that authorizes delivering the static resources in appropriately sized chunks BEFORE the main deployment. Static resources don’t have external dependencies that I know of, so they are an ideal candidate to pre-load before the remainder of the metadata.

This opens up the scratch org to partial success and a somewhat unpredictable state. I think that is a reasonable trade-off, and it is also why I recommend this be a behavioral flag instead of default behavior.
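The chunked pre-deploy idea could be sketched as below. This is hypothetical: sfdx has no such flag today, the stand-in resources are zero-filled, and the sizes are illustrative; only the bin-packing step actually runs.

```shell
# Pack static resource files into chunks whose raw size keeps each
# base64-encoded deploy request under the 52428800-byte cap.
LIMIT=$((52428800 * 3 / 4))       # 39321600 raw bytes per request
SRC=demo_resources; OUT=demo_chunks
mkdir -p "$SRC"
for i in 1 2 3; do                # three 20MB stand-in resources
  dd if=/dev/zero of="$SRC/res$i.resource" bs=1048576 count=20 2>/dev/null
done
chunk=1; size=0
mkdir -p "$OUT/chunk$chunk"
for f in "$SRC"/*.resource; do
  fsize=$(wc -c < "$f")
  # start a new chunk when adding this file would exceed the budget
  if [ "$size" -gt 0 ] && [ $((size + fsize)) -gt "$LIMIT" ]; then
    chunk=$((chunk + 1)); size=0
    mkdir -p "$OUT/chunk$chunk"
  fi
  cp "$f" "$OUT/chunk$chunk/"
  size=$((size + fsize))
done
echo "packed into $chunk chunk(s)"
# each chunk could then be deployed separately ahead of the main push,
# e.g. sfdx force:mdapi:deploy -d demo_chunks/chunkN (after mdapi conversion)
```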

Actual result

Push fails and offers no reasonable recourse to use that mechanism to deliver the configuration.

Additional information

This is a specific documented limit: https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_deploy.htm. The base64-encoded size of the deploy package cannot exceed 50MB, which works out to roughly 39MB on disk.
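The ~39MB figure falls out of base64 overhead: the encoding emits 4 output bytes for every 3 raw bytes, so a 52428800-byte request budget holds about three quarters of that in raw metadata.

```shell
# 3 raw bytes -> 4 base64 bytes, so raw budget = cap * 3/4
raw=$((52428800 * 3 / 4))
echo "$raw"                                            # 39321600 bytes
awk "BEGIN { printf \"%.1f MB\n\", $raw / 1000000 }"   # 39.3 MB
```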

In my case I have roughly 4MB of metadata configuration files (apex, aura, lwc, object, etc…) and 36MB of static resources.

SFDX CLI version (to find the version of the CLI engine, run sfdx --version): sfdx-cli/7.8.1-8f830784cc win32-x64 node-v10.15.3

SFDX plugin version (to find the version of the CLI plugin, run sfdx plugins --core):

@oclif/plugin-commands 1.2.2 (core)
@oclif/plugin-help 2.1.6 (core)
@oclif/plugin-not-found 1.2.2 (core)
@oclif/plugin-plugins 1.7.8 (core)
@oclif/plugin-update 1.3.9 (core)
@oclif/plugin-warn-if-update-available 1.7.0 (core)
@oclif/plugin-which 1.0.3 (core)
@salesforce/sfdx-trust 3.0.2 (core)
analytics 1.1.2 (core)
generator 1.1.0 (core)
salesforcedx 45.16.0 (core)
├─ force-language-services 45.12.0 (core)
└─ salesforce-alm 45.18.0 (core)

sfdx-cli 7.8.1 (core)
sfdx-typegen 0.6.0 (link) C:\Users\aheber\dev\sfdx-typegen

OS and version: Windows 10

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 7 (2 by maintainers)

Top GitHub Comments

2 reactions · aheber commented, Jun 25, 2019

For any future readers: I’ve built an SFDX plugin to help handle this. It deploys all static resources via the Tooling API, which lets us avoid pushing ALL content at once and bypass the error.

https://github.com/aheber/sfdx-heber#sfdx-heberstaticresourcesdeploy--c--r--v-string--u-string---apiversion-string---json---loglevel-tracedebuginfowarnerrorfataltracedebuginfowarnerrorfatal

From there we .forceignore all static resources temporarily during the initial push and that is getting us off the ground. As an added bonus we went from a 7+ minute static resource deployment to ~30 seconds.
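The ignore-then-restore dance might look like this. The glob pattern and the plugin command name are assumptions based on the repo linked above, and the org-facing commands are left commented out:

```shell
# 1. temporarily hide all static resources from source tracking
printf '%s\n' '**/staticresources/**' >> .forceignore
# sfdx force:source:push -u my-scratch-org        # now fits under the cap

# 2. deploy the static resources separately via the Tooling API plugin
#    (command name inferred from the sfdx-heber repo linked above)
# sfdx heber:staticresources:deploy -u my-scratch-org

# 3. restore .forceignore (drop only the line we appended)
grep -v 'staticresources' .forceignore > .forceignore.tmp || true
mv .forceignore.tmp .forceignore
```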

1 reaction · aheber commented, Jun 21, 2019

@clairebianchi I want to make a case for this. DX should do something other than fall over because I have too much code/static resources/etc. #110 still only helps as a workaround to the limit described here.

Is this a “the team doesn’t consider this a problem” or a “they are unable to take on the work to fix this as an unsupported edge case”? If the latter, could this be a backlog item somewhere instead of being closed?


