0 votes
by (150 points)
edited

I'm using the Rebex SFTP libraries (Build 4060 of the .NET 2.0-3.5 libraries) to download files from an SFTP host running GlobalSCAPE. Small file downloads are retrieved without issue. Larger files seem to stop downloading after ~1 GB of data. During all of my tests, I'm downloading within the same corporate network as the SFTP server and pulling files from 3 to 8 GB in size.

I've looked at the GlobalSCAPE settings for the user account and group account membership, and there is no limit on speed, quota, or bandwidth. I've also tried downloading the file with Filezilla, and it pulls the entire 3.5 GB file without issue. (Though I'm not sure whether this detail is relevant.)

I can also tell that the resume logic works, and subsequent MD5 checks (after 4 download attempts to piece together the file) show that the file is accurate. It's just this stalling/resuming that I am trying to troubleshoot.
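For reference, the MD5 check mentioned above needs nothing beyond the .NET framework itself. A minimal sketch (the class and method names here are mine, not from the code below):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

static class Md5Check
{
    // Computes the MD5 hash of a local file as a lowercase hex string,
    // suitable for comparing against the hash of the source file.
    public static string ComputeMd5(string path)
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
        {
            byte[] hash = md5.ComputeHash(stream);
            return BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
        }
    }
}
```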

Below is my code:

    public void DownloadFile(string remotePath, string localPath)
    {
        SftpItem item;
        Stream local = null;

        try
        {
            using (var client = this.GetClient())
            {
                // Ensure file exists on remote server
                item = client.GetInfo(remotePath, false);
                if (item == null || item.IsFile == false)
                    return;

                long remoteOffset = 0;

                if (System.IO.File.Exists(localPath))
                {
                    remoteOffset = client.GetFileLength(remotePath);

                    if (new FileInfo(localPath).Length < remoteOffset)
                    {
                        // Resuming the incomplete download --> seek to the end and read the position 
                        local = File.OpenWrite(localPath);
                        local.Seek(0, SeekOrigin.End);
                        remoteOffset = local.Position;
                    }
                    else
                    {
                        // Remote file is larger than local file --> Kill file and start download from beginning.
                        File.Delete(localPath);     // Important!
                        local = File.Create(localPath);
                    }
                }
                else
                {
                    // File doesn't exist locally --> Create.
                    local = File.Create(localPath);
                }

                // transfer data 
                client.GetFile(remotePath, local, remoteOffset, -1);
            }
        }
        catch (Exception ex)
        {
            // TODO: eat exception
        }
        finally
        {
            // Object cleanup
            if (local != null)
            {
                local.Flush();
                local.Close();
                local.Dispose();
                local = null;
            }

            item = null;
        }
    }

    private Sftp GetClient()
    {
        var client = new Sftp() { Options = SftpOptions.UseLargeBuffers };
        client.Connect(this._serverName, this._portNumber);
        client.Login(this._username, this._password);
        return client;
    }

Any thoughts on why the download would stop after ~1 GB of data? Oddly enough, the thread still seems to be working for another two minutes before a "server closed the connection" exception appears.

(One off-topic observation... If I remove the "File.Delete(localPath)" line, then the file contents are destroyed but the SftpClient will never write to the "local" Stream object. I haven't done much tracing into "why" yet; just wanted to pose that observation for thought.)

Thanks!

Applies to: Rebex SFTP
by (144k points)
We were able to reproduce this issue against GlobalSCAPE EFT Server (v. 6.0). We are looking into it now and hope to have a hotfix soon. Thanks for reporting this!
by (144k points)
(We are going to look into the off-topic observation as well.)

1 Answer

0 votes
by (144k points)
edited
 
Best answer

This is a bug in build 4060 of Rebex SFTP that appeared as a side effect of an unrelated bugfix. It is related to the SSH key re-exchange that is performed after a certain amount of data has been transferred (1 GB in the case of GlobalScape). We will be releasing a new build soon, as this is a severe issue. I sent a link to a hotfix to your e-mail; please give it a try and let us know whether it solves the problem. Many thanks for letting us know about this!

UPDATE The fix was released in build 4086 in March 2011.
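Until the fixed build is in place, the "piece the file together over several attempts" approach from the question can be automated with a retry loop. A rough sketch, assuming DownloadFile is changed to let exceptions propagate instead of swallowing them (the wrapper name and attempt budget are my own):

```csharp
using System;

public partial class SftpDownloader
{
    // Hypothetical retry wrapper around the resuming DownloadFile from the
    // question. Each new attempt resumes from the current length of the
    // partial local file, so a dropped connection only costs one round trip.
    public void DownloadWithRetries(string remotePath, string localPath, int maxAttempts)
    {
        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            try
            {
                this.DownloadFile(remotePath, localPath);
                return; // finished without the server dropping the connection
            }
            catch (Exception)
            {
                if (attempt == maxAttempts)
                    throw; // out of attempts; surface the last failure
                // otherwise loop and resume the partial download
            }
        }
    }
}
```

A final MD5 comparison against the source file, as described in the question, is still worthwhile after a multi-attempt download.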


Regarding the off-topic observation: when you create a new file, you have to set "remoteOffset = 0" as well; otherwise the remoteOffset value passed to the GetFile method will be larger than the remote file size, which means no data will be downloaded. Removing the File.Delete call doesn't seem to have any effect on this. Also, I guess the comment should read "Remote file is smaller than local file" instead of "Remote file is larger than local file".
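Applying both of those corrections to the branch from the question gives something like this (only the else branch changes):

```csharp
if (new FileInfo(localPath).Length < remoteOffset)
{
    // Resuming the incomplete download --> seek to the end and read the position
    local = File.OpenWrite(localPath);
    local.Seek(0, SeekOrigin.End);
    remoteOffset = local.Position;
}
else
{
    // Remote file is smaller than (or equal to) the local file -->
    // discard the local copy and download from the beginning.
    File.Delete(localPath);
    local = File.Create(localPath);
    remoteOffset = 0;   // without this, GetFile starts past the remote EOF
}
```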

by (150 points)
Lukas -- Thanks for the reply, the subsequent email, and also for catching that logic error in my if/else clause. I'll report back my testing soon (likely on Monday, February 28).
by (150 points)
Lukas -- I finished replacing the 4060 binaries with the ones you sent, and the download went through to completion and passed a subsequent MD5 check. Thanks again for the quick response and turnaround.