0 votes
by (170 points)

I have an issue when downloading files from one of our new client's SFTP servers.
When using the streaming option like this:

using (Stream remoteFileStream = <Sftp client>.GetStream(fullFilePath, FileMode.Open, FileAccess.Read))
{
   <some custom object>.SaveFileFromStream(remoteFileStream, <other params>, ...);
}

Passing the stream to save a file results in:

[System.IO.IOException] = {"Cannot close stream until all bytes are
written."}

This happens because accessing the Length property of the stream throws an exception:

Length = 'remoteFileStream.Length' threw an exception of type
'Rebex.Net.SftpException' base {Rebex.Net.NetworkSessionException} =
{"Failure; The message [] is not extractable!."}

I tried saving the file with sample code using the GetFiles() method without streaming, and the files are saved with no issues.
The only difference I could find between the prod server that has the issue and our test server is the ProtocolVersion property of the Sftp client, which is 3 in prod vs. 4 in test.
Other servers in prod never had issues like this, but I can't tell what their ProtocolVersion is because we don't log that info.
Saving file locally instead of streaming is not a viable option.
Any help is appreciated.

Applies to: Rebex SFTP

1 Answer

0 votes
by (58.9k points)
Best answer

Hello,

It seems that the SFTP server you use is unable to retrieve the length of an already opened file. This is not related to the version of the SFTP protocol, but is rather a feature/bug of the SFTP server. It might be worth reporting it to your SFTP server vendor.

More details: when Rebex SFTP asks for stream.Length (this is done via the SSH_FXP_FSTAT request), the server replies with "Failure; The message [] is not extractable!."

As a workaround, please try retrieving the remote file length before actually calling sftp.GetStream, and then pass the already retrieved length to your custom method like this:

long fileLength = sftp.GetFileLength(fullFilePath);

using (Stream remoteFileStream = sftp.GetStream(fullFilePath, FileMode.Open, FileAccess.Read))
{
    <some custom object>.SaveFileFromStream(remoteFileStream, fileLength, <other params>, ...);
}

Let me know whether it helped to solve your issue.

by (170 points)
Hi Tomas,

Unfortunately, I don't read the Length property explicitly, but it gets called internally as part of Stream.CopyTo(Stream destination) which throws:
[System.IO.IOException] = {"Cannot close stream until all bytes are
 written."}

And I need to make that call because I need to pass the stream to the web request. So you are saying it's an error on the SFTP server side, and there is nothing we can do on our side.

Thanks for the answer.
by (148k points)
Well, if you need to work around the server issue, you can implement a custom `Stream` that wraps the stream returned by `GetStream` and a custom implementation of `Length` that calls `Sftp` object's `GetFileLength` method. We can help with that tomorrow.
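For reference, such a wrapper could look roughly like this (a minimal sketch, not Rebex API; `KnownLengthStream` is a hypothetical name, and here the length is simply supplied up front, e.g. from `sftp.GetFileLength`, so that reading `Length` never triggers a server round-trip):

```csharp
using System;
using System.IO;

// Hypothetical read-only wrapper: serves Length from a value retrieved
// beforehand and delegates everything else to the inner stream.
public class KnownLengthStream : Stream
{
    private readonly Stream inner;
    private readonly long length;

    public KnownLengthStream(Stream inner, long length)
    {
        this.inner = inner;
        this.length = length;
    }

    // Length answered locally - no SSH_FXP_FSTAT request is sent.
    public override long Length { get { return length; } }

    public override bool CanRead { get { return inner.CanRead; } }
    public override bool CanSeek { get { return inner.CanSeek; } }
    public override bool CanWrite { get { return false; } }

    public override long Position
    {
        get { return inner.Position; }
        set { inner.Position = value; }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        return inner.Read(buffer, offset, count);
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        return inner.Seek(offset, origin);
    }

    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
    public override void Flush() { }

    protected override void Dispose(bool disposing)
    {
        if (disposing) inner.Dispose();
        base.Dispose(disposing);
    }
}
```

You would then wrap the result of `GetStream`, e.g. `new KnownLengthStream(sftp.GetStream(fullFilePath, FileMode.Open, FileAccess.Read), sftp.GetFileLength(fullFilePath))`, and pass that to your saving code.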
by (170 points)
Hi Lukas, thanks for the suggestion.
I was able to reach the support team of our client's SFTP server vendor, and we got to the bottom of this. The issue here is with files staged with an extractability count of 1.
When you get the stream with Rebex SFTP like this:

Stream remoteFileStream = sftp.GetStream(fullFilePath, FileMode.Open, FileAccess.Read)


and then pass it to the web request body using Stream.CopyTo(Stream destination):

HttpWebRequest request = ...
using (Stream requestStream = request.GetRequestStream())
{
    // Send the file as body request
    remoteFileStream.CopyTo(requestStream);
}

The exception gets thrown because Rebex is trying to read information from the SFTP server again about a file that has already been extracted:
[System.IO.IOException] = {"Cannot close stream until all bytes are written."}

But if you change the way you get the stream to:

Stream remoteFileStream = new MemoryStream();
sftp.GetFile(item.FullFilePath, remoteFileStream);

then there is no issue. So it looks like the GetStream method cannot be used for files that are available for one-time download.
Am I using GetStream the wrong way, or is it a bug/feature that not all properties of the stream are populated in one shot?
by (148k points)
Actually, "one-time download" is not a feature of the SFTP protocol - it's an addition implemented on top of it by your server vendor, and although it doesn't modify existing SFTP commands, it limits the actions that can be performed. It's not a bug or a feature; it's just the way this particular server works when accessing files like this one.

This said, GetStream can actually be used for "one-time-download" files as long as you don't try to determine their length or other information. And I'm not quite sure why Stream.CopyTo needs to determine the file length - perhaps using something like this instead would solve the problem:

public static void Copy(Stream input, Stream output)
{
    int bufferSize = 32 * 1024;
    byte[] buffer = new byte[bufferSize];
    while (true)
    {
        int n = input.Read(buffer, 0, bufferSize);
        if (n == 0)
            break;
        output.Write(buffer, 0, n);
    }
}

The GetFile and MemoryStream approach works because GetFile doesn't try to determine the file length (unless you have registered a TransferProgressChanged event handler). However, please be aware that if you need to work with very large files, storing them in a MemoryStream might become problematic (it needs to be able to allocate a contiguous block of memory, which might fail even when there is plenty of free memory available).
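If the files are too large for a MemoryStream, one alternative (an untested sketch, not part of the answer above) is to buffer the download in a temporary file. The helper below takes a download callback such as `s => sftp.GetFile(path, s)` so it stays independent of the Rebex API, then streams the temp file to the destination with a plain read/write loop:

```csharp
using System;
using System.IO;

public static class TempFileRelay
{
    // Hypothetical helper: run the download into a temp file, then copy the
    // temp file to the destination without ever querying the remote Length.
    public static void SendViaTempFile(Action<Stream> download, Stream destination)
    {
        string tempPath = Path.GetTempFileName();
        try
        {
            // First pass: let the caller write the remote file to disk,
            // e.g. download = s => sftp.GetFile(fullFilePath, s)
            using (Stream temp = File.Create(tempPath))
            {
                download(temp);
            }

            // Second pass: relay the local copy to the destination stream.
            using (Stream temp = File.OpenRead(tempPath))
            {
                byte[] buffer = new byte[32 * 1024];
                int n;
                while ((n = temp.Read(buffer, 0, buffer.Length)) > 0)
                    destination.Write(buffer, 0, n);
            }
        }
        finally
        {
            File.Delete(tempPath);
        }
    }
}
```

This trades memory for disk I/O, so it only makes sense when the extra pass over the data is acceptable.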
by (170 points)
We ended up agreeing with the client to temporarily change the settings on their SFTP server so that files do not expire after download.
The permanent solution is what you suggested in your last post: stop using System.IO.Stream.CopyTo and use a custom implementation for copying the stream that does not require opening the file on the SFTP server again.

Thanks so much