onRetry obj.attempt undefined when intelligent uploading is enabled
On 0.9.12, in my uploadConfig I have:
onRetry: (obj) ->
  console.debug "Retrying #{obj.location} (for #{obj.filename}, attempt #{obj.attempt} of 10)" +
    ", location #{obj.location}.", obj
However, once my account got intelligent uploading enabled, I noticed that obj.attempt was undefined and simply didn't exist in the obj above. I verified this hypothesis by setting intelligent: false, and the attempt number came back.
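Until the library populates the field consistently, a handler like the one above cannot rely on obj.attempt being present when intelligent uploads are enabled. A minimal defensive sketch (the field names location, filename, and attempt come from the issue; the helper name and fallback wording are this sketch's own):

```javascript
// Builds the retry log line while tolerating a missing `attempt`
// field, as reported when intelligent uploads are enabled. Field
// names follow the retry object described in the issue; the
// "attempt unknown" fallback text is hypothetical.
function formatRetryMessage(obj) {
  const attempt =
    obj.attempt !== undefined ? `attempt ${obj.attempt} of 10` : "attempt unknown";
  return `Retrying ${obj.location} (for ${obj.filename}, ${attempt}).`;
}
```

This keeps the log line useful in both modes instead of printing "attempt undefined of 10".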
Issue Analytics
- Created 6 years ago
- Comments: 7 (4 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@wuservices 0.10.0 is now live, and it includes a patch that lets intelligent: 'fallback' be used in the picker. Feel free to open another issue if you run into problems with this release.

Thanks for verifying those things. I'll deploy with intelligent on and raise the default chunk size for now, but we're looking forward to that new release!
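For reference, a configuration using the new option might look like the sketch below. Only intelligent and onRetry appear in this thread; the exact shape of the surrounding options object is an assumption, not confirmed filestack-js API.

```javascript
// Sketch of the uploadConfig discussed in this thread. `intelligent`
// and `onRetry` are named in the issue; everything else is assumed.
const uploadConfig = {
  // New in 0.10.0: intelligent mode that falls back to plain uploads.
  intelligent: "fallback",
  onRetry: (obj) =>
    console.debug("Retrying", obj.location, "attempt", obj.attempt),
};
```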
As a side note, it kind of seems like Google Cloud Storage supports chunks in multiples of 256 KB out of the box https://cloud.google.com/storage/docs/json_api/v1/how-tos/resumable-upload. I wonder if that’s a lot simpler and more efficient than you guys handling it on your own in conjunction with S3.
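The 256 KB granularity mentioned above is straightforward to satisfy client-side. A small helper (names are this sketch's own) rounds a requested chunk size up to the nearest valid multiple:

```javascript
// Google Cloud Storage resumable uploads require every chunk except
// the last to be a multiple of 256 KiB. Round a requested size up to
// the nearest valid multiple, never going below one multiple.
const GCS_CHUNK_MULTIPLE = 256 * 1024; // 262144 bytes

function alignChunkSize(requestedBytes) {
  const multiples = Math.max(1, Math.ceil(requestedBytes / GCS_CHUNK_MULTIPLE));
  return multiples * GCS_CHUNK_MULTIPLE;
}
```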