Writing large file stuck in retry
See original GitHub issue

>>> import json
>>> from s3path import S3Path
>>> p = S3Path("/foo/bar")

This works:

>>> with p.open('w') as fp:
...     json.dump([None]*10, fp)

This gets stuck for a long time:

>>> with p.open('w') as fp:
...     json.dump([None]*5000, fp)

This also works well:

>>> p.write_text(json.dumps([None]*5000))
When it is stuck, I've seen a lot of log messages like this:
DEBUG:urllib3.util.retry:Converted retries value: False -> Retry(total=False, connect=None, read=None, redirect=0, status=None)
DEBUG:botocore.awsrequest:Waiting for 100 Continue response.
DEBUG:botocore.awsrequest:100 Continue response seen, now sending request body.
This looks like an infinite loop, but after a long time it completed successfully.
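For reference, the snippets above combine into a single runnable sketch. The bucket name my-bucket is a placeholder, and the logging call is only an assumption about how the debug messages above were captured; it is not part of the original report.

import json
import logging

from s3path import S3Path

# DEBUG-level logging surfaces the botocore/urllib3 messages quoted above.
logging.basicConfig(level=logging.DEBUG)

p = S3Path("/my-bucket/bar")  # placeholder bucket/key

# Small streaming write: finishes quickly.
with p.open('w') as fp:
    json.dump([None]*10, fp)

# Large streaming write: the call that appears to hang in retries.
with p.open('w') as fp:
    json.dump([None]*5000, fp)

# Writing the same payload in one shot completes normally.
p.write_text(json.dumps([None]*5000))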
Top GitHub Comments
For writing small files, s3path seems fine as is. For a single-step write, boto3 works. For a streaming write, smart_open is best. In all cases I rely on s3path to manipulate paths and generate URIs.

I think this is the same bug I ran into. I noticed it continuously uploading data and never finishing.
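To make the split described in the first comment concrete, here is a rough sketch. The bucket, key, and payload are made up; it assumes S3Path.as_uri() yields an s3:// URI (as the comment's "generate URIs" suggests) and that AWS credentials are already configured for boto3 and smart_open.

import json

import boto3
from s3path import S3Path
from smart_open import open as s3_open

payload = [None] * 5000

# s3path is used only for path manipulation and URI generation.
p = S3Path("/my-bucket/exports") / "dump.json"
uri = p.as_uri()                                  # e.g. "s3://my-bucket/exports/dump.json"
bucket, key = p.parts[1], "/".join(p.parts[2:])   # split into bucket and key strings

# Single-step write via boto3.
boto3.client("s3").put_object(
    Bucket=bucket, Key=key, Body=json.dumps(payload).encode("utf-8")
)

# Streaming write via smart_open.
with s3_open(uri, "w") as fp:
    json.dump(payload, fp)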
Using smart_open might be a good idea since it is much more widely used and tested. It does do magic things though, like decompression of certain file types based on extension, which would need to be disabled.
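As a rough illustration of turning that extension magic off: recent smart_open releases accept a compression argument for this (older releases used ignore_ext=True instead), so the exact parameter below should be checked against the installed version. The URI is a made-up placeholder.

import json
from smart_open import open as s3_open

# The .gz suffix would normally make smart_open gzip the stream;
# compression="disable" asks it to write the bytes as-is.
# (On older smart_open versions the equivalent flag was ignore_ext=True.)
with s3_open("s3://my-bucket/exports/dump.json.gz", "w", compression="disable") as fp:
    json.dump([None] * 5000, fp)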