by (180 points)

We are using the SFTP FileServer component to let users download large files from our servers. These files are commonly over 2 GB, often 6 GB or more; the largest is 12 GB. Customers are having problems downloading these files.

As a file is downloading, memory usage of the w3wp worker process keeps climbing until the process crashes, which takes down the FileServer and aborts the download.

If two users download files at the same time, memory usage grows even faster and the crash happens sooner.

What's the appropriate way to serve these large files to users?

Applies to: Rebex SFTP, File Server

1 Answer

by (147k points)

It looks like something strange is going on here. In the SFTP protocol (and therefore in Rebex File Server), files are transferred in chunks of a few tens of kilobytes each, so memory usage should be virtually the same whether you transfer a 1 MB or a 20 GB file, and it should stay roughly constant for the duration of the transfer.
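To illustrate the principle in general terms (this is not Rebex code, just a plain .NET sketch of chunked reading, with a hypothetical file path): a server that streams a file through a small fixed-size buffer never needs more memory than that buffer, no matter how large the file is.

```csharp
using System;
using System.IO;

class ChunkedTransferSketch
{
    static void Main()
    {
        // Hypothetical path used for illustration only.
        string path = @"C:\data\large-file.bin";

        // A 32 KB buffer - the same order of magnitude as a typical SFTP read request.
        byte[] buffer = new byte[32 * 1024];
        long total = 0;

        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            int bytesRead;
            // Only one buffer's worth of data is ever held in memory at a time,
            // whether the file is 1 MB or 20 GB.
            while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                total += bytesRead;
                // ...here the server would send 'bytesRead' bytes to the client...
            }
        }

        Console.WriteLine("Transferred {0:N0} bytes with a fixed 32 KB buffer.", total);
    }
}
```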

To make sure this is indeed the case with Rebex File Server, I launched our GUI file server sample (FileServerWinForm) and watched its memory usage in Windows Task Manager. As expected, memory usage stayed almost constant the whole time, with the working set around 34 MB and the private working set around 10 MB.
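If watching Task Manager is inconvenient, here is a small console sketch (standard .NET only, no Rebex API) that logs the working set and private bytes of the hosting process over time. The "w3wp" process name is taken from your description; replace it if your server runs in a different process.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class MemoryMonitor
{
    static void Main()
    {
        // Name of the process hosting the file server (w3wp = IIS worker process).
        const string processName = "w3wp";

        while (true)
        {
            foreach (var process in Process.GetProcessesByName(processName))
            {
                process.Refresh();
                Console.WriteLine(
                    "{0:HH:mm:ss} PID {1}: working set {2:N0} KB, private bytes {3:N0} KB",
                    DateTime.Now,
                    process.Id,
                    process.WorkingSet64 / 1024,
                    process.PrivateMemorySize64 / 1024);
            }
            Thread.Sleep(5000);
        }
    }
}
```

Comparing a log like this for the sample with one for your own application should show when the growth starts.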

Could you please give this sample a try and let us know whether your results differ in any way, or supply an application that reproduces the issue? Thanks!

...