Resume file transfers after connection issues
See original GitHub issue

Based on the issue here (https://github.com/patrickjuchli/basic-ftp/issues/123#issuecomment-573421466), it might be helpful to offer automatic retries when file transfers fail due to connectivity issues.
This kind of functionality is not within the original scope of this library. basic-ftp takes care of the basics of FTP; things like retries have been left to the target audience of this library: developers. Reconsider this.
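Since retries are left to the developer, a thin wrapper around any transfer call is one way to handle them. The sketch below is a hypothetical helper, not part of basic-ftp: `task` stands in for any async transfer such as `() => client.uploadFrom(localPath, remotePath)`, and `reconnect` for re-opening the connection, e.g. `() => client.access(options)`.

```javascript
// Minimal retry sketch (assumption: the caller supplies the transfer
// task and a reconnect function; neither name comes from basic-ftp).
async function withRetry(task, maxRetries, reconnect) {
  let lastError;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await task(); // succeed: hand the result back
    } catch (err) {
      lastError = err;
      if (attempt < maxRetries) {
        // Re-establish the connection before the next attempt,
        // e.g. await client.access(options) with a basic-ftp Client.
        await reconnect();
      }
    }
  }
  throw lastError; // all attempts failed
}
```

This keeps the library's surface untouched while letting each application decide how many retries are acceptable and how reconnection is done.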
Issue Analytics
- State:
- Created 4 years ago
- Reactions: 2
- Comments: 7 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@patrickjuchli thank you very much for this clarification. Well, when I have some free capacity I will write a function with reconnect.
It should do basically the same as client.uploadFromDir("my/local/directory"),
just with a list of files, an index, and a maxReconnectCount.
@patrickjuchli: From my side it is totally OK if I handle the reconnects. The problem is that I cannot continue the job where it stopped. For example, if I am uploading 100 files via FTP and the server connection gets lost after 76, I want to continue the job only for the 24 that are left. Is it possible to handle this right now? I think at the moment we have to restart the whole process?
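The comment above describes exactly the list-plus-index approach: advance an index only after each successful upload, and on failure reconnect and continue from that index instead of restarting. A hypothetical sketch, where `uploadOne` stands in for a call like `client.uploadFrom(localPath, remotePath)` and `reconnect` for `client.access(options)` in basic-ftp:

```javascript
// Sketch of resuming a batch upload from the last successful file.
// `files`, `uploadOne`, `reconnect`, and `maxReconnectCount` are
// illustrative names, not part of the basic-ftp API.
async function uploadAllWithResume(files, uploadOne, reconnect, maxReconnectCount) {
  let index = 0;      // first file that has NOT been uploaded yet
  let reconnects = 0;
  while (index < files.length) {
    try {
      await uploadOne(files[index]);
      index++;        // advance only after a confirmed success
    } catch (err) {
      if (reconnects >= maxReconnectCount) {
        throw err;    // give up: too many broken connections
      }
      reconnects++;
      await reconnect(); // re-open the connection, then retry files[index]
    }
  }
  return index;       // number of files uploaded
}
```

With this shape, losing the connection after 76 of 100 files means the loop reconnects and resumes at file 77; only the current file is re-sent, not the whole batch.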