
"get" method doesn't read whole file and finish handling earlier then last bite of file will be read

See original GitHub issue

Env

  • Node version: v16.13.2
  • ssh2-sftp-client version: 7.2.1 / 7.2.2 (both produce the same bug)
  • Platform: macOS / Linux in Docker
  • SFTP server: https://www.couchdrop.io/ The account is a trial and doesn't contain any sensitive data (it will be closed after 7 days). Creds: host: 'm23djak@m23djak.couchdrop.io', username: 'm23djak', password: '@hBSg5zu3h8ViV5'

Setup and code example

Run npm init and npm install --save ssh2-sftp-client, create index.js and put the code below into it, point the remotePath variable at any file on the SFTP storage, and run node index.js.

'use strict';
// node v16.13.2

const Client = require('ssh2-sftp-client');
const { PassThrough } = require('stream');

const config = {
    host: 'm23djak@m23djak.couchdrop.io',
    username: 'm23djak',
    password: '@hBSg5zu3h8ViV5',
    port: 22,
    debug: (msg) => {
        console.log(`SFTPLOG ${msg}`);
    }
};

let remotePath = '/Archive 2.zip';

const sftp = new Client();
async function main() {
    try {
        await sftp.connect(config);
        console.log('connection established');

        let size = 0;
        const ptStream = new PassThrough();
        ptStream.on('data', (chunk) => {
            size += chunk.length;
            console.log(`PassThrough data recived ${chunk.length} ${size}`)
        });
        ptStream.on('error', (error) => {
            // Log the actual error object as well, not just a generic message.
            console.log('PassThrough error', error);
        });
        ptStream.on('finish', () => {
            console.log(`PassThrough finish ${size}`)
        });

        await sftp.get(remotePath, ptStream);
        console.log('File retrieved');
    } catch (err) {
        console.error(`Error: ${err.message}`);
    } finally {
        await sftp.end();
        console.log('conneciton closed');
    }
}

main()
    .then(() => {
        console.log('All done!');
    })
    .catch((err) => {
        console.error('Error in main', err);
    });

What is the problem

If the file is small, around 50 MB, everything is fine. But if the file is larger than about 200 MB, it is not: the file starts being read from SFTP, but the read stream finishes much earlier than the whole file has been read, and it stops in a different place every time. For example, Archive 2.zip, whose size is 222237810 bytes (217 MB), stopped at 32489744, 85920480, 125406729, 62412083, and 104448777 bytes.
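
(For reference, a truncated get like this can be confirmed against the size the server reports. The following is only a minimal sketch using ssh2-sftp-client's stat(); it reuses the config and remotePath from the script above, and checkTruncation is just an illustrative name, not anything from the library.)

'use strict';
// Sketch: confirm truncation by comparing the bytes actually received by get()
// with the size reported by stat(). Reuses the `config` and `remotePath`
// values defined in the script above; `checkTruncation` is an illustrative name.
const Client = require('ssh2-sftp-client');

async function checkTruncation(config, remotePath) {
    const sftp = new Client();
    try {
        await sftp.connect(config);
        const { size: expected } = await sftp.stat(remotePath);
        // With no destination argument, get() resolves to a Buffer of the file contents.
        const data = await sftp.get(remotePath);
        console.log(`expected ${expected} bytes, received ${data.length} bytes`);
        return data.length === expected;
    } finally {
        await sftp.end();
    }
}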

Example of the log

SFTPLOG Outbound: Sending CHANNEL_DATA (r:0, 28)
SFTPLOG SFTP: Outbound: Buffered READ
SFTPLOG Inbound: CHANNEL_DATA (r:0, 31965)
SFTPLOG SFTP: Inbound: Received DATA (id:4777, 31952)
SFTPLOG Outbound: Sending CHANNEL_DATA (r:0, 28)
SFTPLOG SFTP: Outbound: Buffered READ
SFTPLOG Inbound: CHANNEL_DATA (r:0, 31965)
SFTPLOG SFTP: Inbound: Received DATA (id:4778, 31952)
SFTPLOG Outbound: Sending CHANNEL_DATA (r:0, 28)
SFTPLOG SFTP: Outbound: Buffered READ
SFTPLOG Inbound: CHANNEL_DATA (r:0, 1645)
SFTPLOG SFTP: Inbound: Received DATA (id:4779, 1632)
PassThrough data recived 65536 104398848
SFTPLOG Outbound: Sending CHANNEL_DATA (r:0, 28)
SFTPLOG SFTP: Outbound: Buffered READ
SFTPLOG Inbound: CHANNEL_DATA (r:0, 31965)
SFTPLOG SFTP: Inbound: Received DATA (id:4780, 31952)
SFTPLOG Outbound: Sending CHANNEL_DATA (r:0, 28)
SFTPLOG SFTP: Outbound: Buffered READ
SFTPLOG Inbound: CHANNEL_DATA (r:0, 17990)
SFTPLOG SFTP: Inbound: Received DATA (id:4781, 17977)
PassThrough data recived 49929 104448777
SFTPLOG Outbound: Sending CHANNEL_DATA (r:0, 28)
SFTPLOG SFTP: Outbound: Buffered READ
SFTPLOG Inbound: CHANNEL_DATA (r:0, 32)
SFTPLOG SFTP: Inbound: Received STATUS (id:4782, 1, "End of file")
SFTPLOG Outbound: Sending CHANNEL_DATA (r:0, 16)
SFTPLOG SFTP: Outbound: Buffered CLOSE
PassThrough finish 104448777
SFTPLOG CLIENT[sftp]: get resolved on writer finish event
SFTPLOG CLIENT[sftp]: get: Removing temp event listeners
File retrieved
SFTPLOG CLIENT[sftp]: end: Adding temp event listeners
SFTPLOG CLIENT[sftp]: Adding listener to close event
SFTPLOG CLIENT[sftp]: end: Have connection - calling end()
SFTPLOG Outbound: Sending DISCONNECT (11)
SFTPLOG Inbound: CHANNEL_DATA (r:0, 28)
SFTPLOG SFTP: Inbound: Received STATUS (id:4783, 0, "Success")
SFTPLOG Socket ended
SFTPLOG CLIENT[sftp]: end: Ignoring expected end event
SFTPLOG CLIENT[sftp]: Global: Ignoring hanlded end event
SFTPLOG Socket closed
SFTPLOG CLIENT[sftp]: end: Connection closed
SFTPLOG CLIENT[sftp]: end: ignoring expected close event
SFTPLOG CLIENT[sftp]: Global: Ignoring handled close event
SFTPLOG CLIENT[sftp]: end: finally clause fired
SFTPLOG CLIENT[sftp]: end: Removing temp event listeners
SFTPLOG CLIENT[sftp]: Removing listener from close event
conneciton closed
All done!

The log is the same in the application I am working on.

In addition, I have read the closely related material below, but it didn't help: https://github.com/theophilusx/ssh2-sftp-client#logging-issues and https://github.com/theophilusx/ssh2-sftp-client/issues/355

I also contacted https://www.couchdrop.io/ support about limitations, but the service does not impose any limitation on the trial (such as a size limit or timeout) that would cause this behaviour. (Screenshot 2022-02-03 at 12 50 13) I am from Ukraine, and there is no problem with my internet connection or speed.

Also, at my company we have our own SFTP server, and it transfers an almost 2 GB file correctly.

Could you help with this? I can't find any reason why it shouldn't work. Thanks in advance.

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
theophilusx commented, Feb 3, 2022

As another data point, I just tried downloading your test file from another sftp server (this one is an openssh sftp server running on Ubuntu Mate 21.10) and the download worked and was fast. This would indicate the issue may be specific to the sftp server used by couchdrop.io. I used the same script I posted before for pure ssh2 download.

It might be worthwhile trying some of the rarely used connect options for ssh2. In particular, I would explicitly turn off compression and try forcing only IPv4 and then only IPv6; I have seen problems with some servers when compression is enabled.
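
(A minimal sketch of what such a config could look like. forceIPv4, forceIPv6 and algorithms.compress are standard ssh2 connect options and ssh2-sftp-client passes the config through to ssh2 unchanged; the credential values are placeholders for those above.)

// Sketch: the connect config from the original script, with compression
// explicitly disabled and IPv4 forced (swap in forceIPv6 to test IPv6).
// These are standard ssh2 connect options, passed through by ssh2-sftp-client.
const config = {
    host: 'm23djak.couchdrop.io',
    username: 'm23djak',
    password: '...',              // credentials as above
    port: 22,
    forceIPv4: true,              // or forceIPv6: true
    algorithms: {
        compress: ['none']        // turn compression off explicitly
    }
};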

1 reaction
theophilusx commented, Feb 3, 2022

OK, I ran your script and was able to reproduce the issue. I then modified your script and just did a get without the pass through stream and got the same truncation issue (different size, but still short).
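
(Presumably that simplified repro looked roughly like the sketch below: a plain get() straight to a local file, with no PassThrough stream. config and remotePath are the values from the original script, and './archive.zip' is just an example local path.)

// Sketch of the simplified repro: download directly to a local file,
// with no PassThrough stream involved. `config` and `remotePath` are the
// values from the original script; './archive.zip' is an example path.
const Client = require('ssh2-sftp-client');

(async () => {
    const sftp = new Client();
    try {
        await sftp.connect(config);
        await sftp.get(remotePath, './archive.zip');
        console.log('get finished');
    } finally {
        await sftp.end();
    }
})();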

I then used the openSSH sftp CLI and was able to download the file with no errors.

I then used the below script to get the file using only ssh2 (no ssh2-sftp-client). This also failed, which tells me the bug is in the ssh2 module. You will need to log an issue with the ssh2 project.

One thing which did surprise me was how slow both ssh2-sftp-client and ssh2 were. They are MUCH slower than openSSH’s sftp client. While I would expect a node application to be a little slower, the difference was much bigger than I would have expected and it seems a lot slower than the previous ssh2 implementation (ssh2 has recently undergone a complete re-write to address some design limitations with the previous implementation and to get it to work correctly with more recent node versions).

For the record, my testing was on a Ubuntu 21.04 system, running node version 16.13.2. ssh2-sftp-client 7.2.2 and ssh2 1.6.0.

The ssh2 script I used was as follows (btw you have the hostname wrong in the script you sent to me - the error was obvious, so I was able to fix it).

'use strict';
// Pure ssh2 download (no ssh2-sftp-client), used to isolate where the bug lives.

const { Client } = require('ssh2');
const fs = require('fs');

const config = {
    host: 'm23djak.couchdrop.io',
    username: 'm23djak',
    password: '@.***',   // password redacted in the original comment
    port: 22
};

const client = new Client();

const remotePath = '/Archive 2.zip';
const localPath = './archive3.zip';

client
    .on('ready', function () {
        console.log('Client ready event fired');
        client.sftp(function (err, sftp) {
            if (err) {
                console.log(`SFTP Error: ${err.message}`);
            } else {
                let sout = fs.createWriteStream(localPath);
                let sin = sftp.createReadStream(remotePath);
                sout.on('error', (err) => {
                    console.error(`Write Error: ${err.message}`);
                });
                sin.on('error', (err) => {
                    console.error(`Read Error: ${err.message}`);
                });
                sout.on('finish', () => {
                    console.log('File download complete');
                    console.log('Closing client connection');
                    client.end();
                });
                sin.pipe(sout);
            }
        });
    })
    .on('error', function (err) {
        console.error(`Client Error: ${err.message}`);
    })
    .on('end', () => {
        console.log('Client end event fired');
    })
    .on('close', () => {
        console.log('Client close event fired');
    })
    .connect(config);

When you log the issue with the ssh2 project, you probably need to include the above (or similar) script and a log dump.
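
(One way to capture such a log dump, as a rough sketch: ssh2's connect config accepts a debug callback, so the protocol trace can be written to a file. The file name below is just an example.)

// Sketch: write the ssh2 protocol trace to a file for the bug report.
// `debug` is a standard ssh2 connect option; './ssh2-debug.log' is an example path.
const fs = require('fs');
const logStream = fs.createWriteStream('./ssh2-debug.log');

const config = {
    host: 'm23djak.couchdrop.io',
    username: 'm23djak',
    password: '...',                          // credentials as above
    port: 22,
    debug: (msg) => logStream.write(`${msg}\n`)
};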

There isn’t much more I can do from the ssh2-sftp-client layer. I suspect this issue is related to the ssh2 re-write. I will try and test with your Archive 2.zip file on my own server and see if I can replicate it in my test environment as well.

If you do log an issue with the ssh2 project, it would be great if you can link it in this issue so we have a full record and to help others who might run into the same issue. Also, if you add the link here, I can add to the issue in the ssh2 project if I am able to reproduce the problem on my own test system.
