Note: this is a reaction to this comment.
I am sorry, I got stuck on the question title "Improve performance of GetFiles ...".
It is possible to filter files on the fly using the ListItemReceived event or by overriding the FileSet.IsMatch() method. However, neither of them can be used for the 'oldest 20 files' condition, because that condition requires iterating through the whole list and filtering at the end of the process (not during it). This is currently not possible.
So, my suggestion for your task is:
- filter the items using GetList(remoteSearch)
- select the oldest 20 files using LINQ
- create a FileSet for those files
- download the files based on the created FileSet
The code can look like this:
// download the oldest 20 files in the folder that match the criteria
var sftpItems = client.GetList(remoteSearch).Cast<SftpItem>()
    .Where(item => item.IsFile)              // skip directories and links
    .OrderBy(item => item.LastWriteTime)     // oldest first
    .Take(20);

// build a FileSet containing just those files
var set = new FileSet(".");
foreach (SftpItem item in sftpItems)
{
    set.Include(item.Name, TraversalMode.NonRecursive);
}

// move the files to the temp directory, overwriting any existing ones
client.Download(set, tempDirectory, TransferMethod.Move, ActionOnExistingFiles.OverwriteAll);
This solution is not optimal: it has to iterate through the remote directory twice (first for GetList(remoteSearch) and again for Download(set)).
This overhead can only be eliminated by not using Download(set). Instead, you can call client.GetFile() for each item in a loop. However, with the GetFile() approach, you cannot use TransferMethod.Move or ActionOnExistingFiles.OverwriteAll, so you would have to handle that behavior yourself.