
Writing large file stuck in retry

See original GitHub issue
>>> p = S3Path("/foo/bar")

This works:

>>> with p.open('w') as fp:
...   json.dump([None]*10,fp)


This gets stuck for a long time:

>>> with p.open('w') as fp:
...   json.dump([None]*5000,fp)


This also works well:

>>> p.write_text(json.dumps([None]*5000))
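A plausible explanation for the difference (my assumption, not confirmed in the issue): `json.dump` streams its output through one `write()` call per encoded chunk, whereas `write_text` hands over the whole serialized string at once. If the file object does network work per `write()`, the chunked path is amplified thousands of times. A quick local sketch with a hypothetical `CountingWriter` shows how many writes `json.dump` actually issues:

```python
import json

class CountingWriter:
    """Hypothetical file-like object that only counts write() calls."""
    def __init__(self):
        self.calls = 0
    def write(self, s):
        self.calls += 1
        return len(s)

w = CountingWriter()
json.dump([None] * 5000, w)
print(w.calls)  # roughly one write() per list item, ~5000 calls
```

`json.dumps` followed by a single write (as `write_text` does) collapses all of those into one call.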

While it is stuck, I see many log messages like this:

DEBUG:urllib3.util.retry:Converted retries value: False -> Retry(total=False, connect=None, read=None, redirect=0, status=None)
DEBUG:botocore.awsrequest:Waiting for 100 Continue response.
DEBUG:botocore.awsrequest:100 Continue response seen, now sending request body.

This looks like an infinite loop, but after a long time it completes successfully.
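For reference, the `botocore`/`urllib3` debug output shown above can be reproduced with the standard library's logging module (a minimal sketch; the logger names are the ones those libraries use):

```python
import logging

# Root config at INFO keeps unrelated libraries quiet; the two
# loggers of interest are raised to DEBUG explicitly.
logging.basicConfig(level=logging.INFO)
for name in ("botocore", "urllib3"):
    logging.getLogger(name).setLevel(logging.DEBUG)
```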

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 8 (5 by maintainers)

Top GitHub Comments

1 reaction
impredicative commented, Dec 10, 2020

For writing small files, s3path seems fine as is. For a single-step write, boto3 works. For a streaming write, smart_open is best.

In all cases I rely on s3path to manipulate paths and generate URIs.
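Until the streaming path is fixed, a workaround consistent with the report above is to serialize in memory and write with a single call. `dump_json_once` is a hypothetical helper (not from the issue), demonstrated here against a local `pathlib.Path`; an `S3Path` exposes the same `write_text` interface:

```python
import json
import pathlib
import tempfile

def dump_json_once(path, obj):
    """Serialize fully in memory, then write with one call,
    avoiding json.dump's per-chunk write() round trips."""
    path.write_text(json.dumps(obj))

# Shown with a local path; in the issue, `path` would be an S3Path.
target = pathlib.Path(tempfile.mkdtemp()) / "data.json"
dump_json_once(target, [None] * 5000)
```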

1 reaction
markopy commented, Dec 10, 2020

I think this is the same bug I ran into. I noticed it continuously uploading data and never finishing.

Using smart_open might be a good idea since it is much more widely used and tested. It does do some magic, though, such as decompressing certain file types based on their extension, which would need to be disabled.

