
Is DATA_UPLOAD_MAX_MEMORY_SIZE actually used?

See original GitHub issue

Hi!

I have been using SRegistry on my Synology NAS without issue for a couple of months now, and that’s really great. However, I am starting to work with significantly larger containers (about 3 GB, whereas so far I used containers of about 700 MB). My NAS has only 2 GB of RAM (yeah… I should definitely increase it, but under lockdown conditions that is impossible for now), and pushing my large containers fails with a 403 error (in addition to disrupting the other services running on the NAS). The client receives the 403 error, but the SRegistry logs show entries like this:

[error] 6#6: *3392 upstream timed out (110: Operation timed out) while reading response header from upstream, client: 172.17.0.1, server: localhost, request: "PUT /v2/push/imagefile/261/SHA_REDACTED HTTP/1.1", upstream: "uwsgi://172.17.0.4:3031", host: "REDACTED" 
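
If I read that line correctly, it is nginx giving up while waiting on the uwsgi backend, so the limits involved would sit in the reverse proxy rather than in Django. I would expect something along these lines to control it (the directive names are standard nginx, but the values and placement are only my guess, not SRegistry’s actual configuration):

    # Illustrative only: standard nginx directives, but the values and the
    # location block are guesses, not SRegistry's shipped configuration.
    location / {
        client_max_body_size 0;      # 0 disables nginx's request-body size limit
        uwsgi_read_timeout   600s;   # the default is 60s
        uwsgi_send_timeout   600s;
    }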

In config.py, I set DATA_UPLOAD_MAX_MEMORY_SIZE = 500, but it does not seem to help.

When I use the search bar on GitHub, DATA_UPLOAD_MAX_MEMORY_SIZE seems to appear only once in the repo (in config.py). So I am wondering whether this setting is actually used, and whether there is a way to improve the situation on my side.
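
For reference, my understanding of the Django documentation is that both upload settings are counted in bytes, so a minimal sketch of what I would expect to put in config.py looks like this (illustrative numbers, and assuming SRegistry simply hands these values through to Django):

    # Illustrative values only (in bytes), assuming these settings behave as
    # documented by Django and are not overridden elsewhere in SRegistry.

    # Maximum size of a request body, excluding file-upload data; Django's
    # default is 2621440 (2.5 MB), so a bare 500 would mean 500 bytes.
    DATA_UPLOAD_MAX_MEMORY_SIZE = 500 * 1024 * 1024   # 500 MB

    # Maximum size of an uploaded file that Django keeps in memory before
    # streaming it to a temporary file on disk (default is also 2.5 MB).
    FILE_UPLOAD_MAX_MEMORY_SIZE = 5 * 1024 * 1024     # 5 MB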

Note: copying the same containers onto the same NAS via NFS works without issue, so it is not a network capacity issue.

I would be very glad to hear any suggestions on this.

Thanks a lot in advance.

Best regards and stay safe!

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 31 (31 by maintainers)

Top GitHub Comments

1 reaction
Aneoshun commented, Apr 4, 2020

Hi @vsoch, don’t be sorry for poking me; I am very happy to see that you have an idea that could solve my problem. I have been very busy over the last few days, but I will try the PR ASAP!

1 reaction
Aneoshun commented, Mar 26, 2020

I am testing your PR, but it will take some time before I can report back, as each test takes dozens of minutes. More soon.

Read more comments on GitHub

Top Results From Across the Web

  • Django: FILE_UPLOAD_MAX_MEMORY vs ... – POST and is calculated against the total request size excluding any file upload data. FILE_UPLOAD_MAX_MEMORY_SIZE. The maximum size (in bytes) ...
  • Maximum Memory or File Size Exceeded – In the 32-bit version of Office, the maximum file size for a workbook containing a Data Model is 2 GB, and the maximum...
  • Can FILE_UPLOAD_MAX_MEMORY_SIZE be set to None? – The check is only being used for "how much memory should I consume before using ... Django doesn't have a limit on the...
  • Common Pitfalls - Manual – The MAX_FILE_SIZE item cannot specify a file size greater than the file size that has been set in upload_max_filesize in the php.ini...
  • Storage and upload limits for Google Workspace – Individual users can only upload 750 GB each day between My Drive and all shared drives. Users who reach the 750-GB limit or...
