Note: this is a reaction to this comment.
I am sorry, I got stuck on the question title "Improve performance of GetFiles ..."
It is possible to filter files on the fly using the ListItemReceived event or by overriding the
FileSet.IsMatch() method. However, neither of them can be used in the 'oldest 20 files' case, because this condition requires iterating through the whole list and filtering at the end of the process (not during it). This is currently not possible.
So, my suggestion for your task is:
- list the remote items using GetList(remoteSearch)
- select the oldest 20 files by LINQ
- create a FileSet for those files
- download the files based on the created FileSet
The code can look like this:
//download the oldest 20 files in the folder that match the criteria
var sftpItems = client.GetList(remoteSearch).Cast<SftpItem>()
    .Where(w => w.IsFile)
    .OrderBy(o => o.LastWriteTime)
    .Take(20);

//add each selected file to the set
var set = new FileSet(".");
foreach (SftpItem item in sftpItems)
    set.Include(item.Name, TraversalMode.MatchFilesShallow);

//download the whole set in one call
client.Download(set, tempDirectory, TransferMethod.Move, ActionOnExistingFiles.OverwriteAll);
This solution is not the most optimal. It has to iterate through the remote directory two times (first for
GetList(remoteSearch) and then for
Download(set)).
This overhead can be eliminated only by not using
Download(set). Instead, you can call
client.GetFile() for each item in the loop. However, with the GetFile() solution you cannot use
ActionOnExistingFiles.OverwriteAll, so you would need to handle existing files yourself.
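The single-pass alternative could be sketched like this. Note this is only a sketch: remoteDirectory is an assumed variable holding the remote folder path, and deleting an existing local file is used here as a stand-in for ActionOnExistingFiles.OverwriteAll; adjust both to your needs.

//sketch: select, download and delete each file individually instead of using a FileSet
//assumes 'client' is a connected Sftp instance and 'remoteDirectory' holds the remote folder path
var oldest = client.GetList(remoteSearch).Cast<SftpItem>()
    .Where(w => w.IsFile)
    .OrderBy(o => o.LastWriteTime)
    .Take(20);

foreach (SftpItem item in oldest)
{
    string remotePath = remoteDirectory + "/" + item.Name;
    string localPath = Path.Combine(tempDirectory, item.Name);

    //handle existing files yourself (no ActionOnExistingFiles with GetFile)
    if (File.Exists(localPath))
        File.Delete(localPath);

    client.GetFile(remotePath, localPath);

    //mimic TransferMethod.Move by removing the remote file after download
    client.DeleteFile(remotePath);
}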