I'm using the Rebex SFTP libraries (build 4060 of the .NET 2.0-3.5 binaries) to download files from an SFTP host running GlobalSCAPE. Small files download without issue, but larger files seem to stop downloading after roughly 1 GB. In all of my tests I'm on the same corporate network as the SFTP server, pulling files between 3 and 8 GB in size.
I've checked the GlobalSCAPE settings for the user account and its group memberships, and there is no limit on speed, quota, or bandwidth. I've also tried downloading the file with FileZilla, and it pulls the entire 3.5 GB file without issue (though I'm not sure how relevant that detail is).
I can also tell that the resume logic works: after four download attempts to piece the file together, an MD5 check confirms the reassembled file is intact. It's this repeated stalling/resuming cycle that I'm trying to troubleshoot.
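For reference, the outer loop that pieces the file together looks roughly like this (a simplified sketch, not my exact code: expectedLength, expectedMd5, and maxAttempts are illustrative parameters, and DownloadFile is the method shown below):

public bool DownloadWithResume(string remotePath, string localPath,
                               long expectedLength, string expectedMd5, int maxAttempts)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        // Each call resumes from the current local file length.
        DownloadFile(remotePath, localPath);

        if (File.Exists(localPath) && new FileInfo(localPath).Length >= expectedLength)
            break; // all bytes present; verify the hash below
    }

    // Verify the reassembled file against the known MD5.
    using (var md5 = System.Security.Cryptography.MD5.Create())
    using (var stream = File.OpenRead(localPath))
    {
        string actual = BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "");
        return string.Equals(actual, expectedMd5, StringComparison.OrdinalIgnoreCase);
    }
}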
Below is my code:
public void DownloadFile(string remotePath, string localPath)
{
    SftpItem item;
    Stream local = null;
    try
    {
        using (var client = this.GetClient())
        {
            // Ensure the file exists on the remote server.
            item = client.GetInfo(remotePath, false);
            if (item == null || !item.IsFile)
                return;

            long remoteOffset = 0;
            if (File.Exists(localPath))
            {
                remoteOffset = client.GetFileLength(remotePath);
                if (new FileInfo(localPath).Length < remoteOffset)
                {
                    // Resume the incomplete download: seek to the end and read the position.
                    local = File.OpenWrite(localPath);
                    local.Seek(0, SeekOrigin.End);
                    remoteOffset = local.Position;
                }
                else
                {
                    // Local file is at least as large as the remote file:
                    // kill it and restart the download from the beginning.
                    File.Delete(localPath); // Important!
                    local = File.Create(localPath);
                    remoteOffset = 0; // Bug fix: without this reset, GetFile starts
                                      // at the end of the remote file and writes nothing.
                }
            }
            else
            {
                // File doesn't exist locally: create it.
                local = File.Create(localPath);
            }

            // Transfer the data, starting at remoteOffset (-1 = read to end of file).
            client.GetFile(remotePath, local, remoteOffset, -1);
        }
    }
    catch (Exception ex)
    {
        // TODO: eat exception
    }
    finally
    {
        // Object cleanup. Dispose flushes and closes the underlying stream.
        if (local != null)
        {
            local.Flush();
            local.Dispose();
            local = null;
        }
        item = null;
    }
}
private Sftp GetClient()
{
    var client = new Sftp { Options = SftpOptions.UseLargeBuffers };
    client.Connect(this._serverName, this._portNumber);
    client.Login(this._username, this._password);
    return client;
}
Any thoughts on why the download would stall after ~1 GB of data? Oddly, the thread appears to keep working for roughly two more minutes before a "server closed the connection" exception finally surfaces.
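To pinpoint exactly when the writes stop, my next step is to wrap the local stream in a small pass-through logger before handing it to GetFile (pure .NET; LoggingStream is a hypothetical helper I'd add for diagnostics, not part of Rebex):

class LoggingStream : Stream
{
    private readonly Stream _inner;
    private long _total;

    public LoggingStream(Stream inner) { _inner = inner; }

    public override void Write(byte[] buffer, int offset, int count)
    {
        _inner.Write(buffer, offset, count);
        _total += count;
        // Log roughly every 10 MB so the stall point stands out in the output.
        if (_total % (10L * 1024 * 1024) < count)
            Console.WriteLine("{0:HH:mm:ss}  {1:N0} bytes written", DateTime.Now, _total);
    }

    public override bool CanRead { get { return _inner.CanRead; } }
    public override bool CanSeek { get { return _inner.CanSeek; } }
    public override bool CanWrite { get { return _inner.CanWrite; } }
    public override long Length { get { return _inner.Length; } }
    public override long Position { get { return _inner.Position; } set { _inner.Position = value; } }
    public override void Flush() { _inner.Flush(); }
    public override int Read(byte[] buffer, int offset, int count) { return _inner.Read(buffer, offset, count); }
    public override long Seek(long offset, SeekOrigin origin) { return _inner.Seek(offset, origin); }
    public override void SetLength(long value) { _inner.SetLength(value); }
}

The idea is to swap the transfer call for client.GetFile(remotePath, new LoggingStream(local), remoteOffset, -1) and watch where the byte count freezes relative to the "server closed the connection" exception.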
(One off-topic observation: if I remove the "File.Delete(localPath)" line, the file contents are still destroyed, since File.Create truncates either way, yet the Sftp client never writes to the "local" stream. I haven't fully traced why, but I suspect the missing remoteOffset reset flagged in the listing above: with remoteOffset still equal to the remote file length, GetFile starts at the end of the remote file and transfers nothing. Just wanted to pose that observation for thought.)
Thanks!