Upload large files with temporary name...

0 votes
asked Sep 20, 2010 by Dan (120 points)
edited Mar 23, 2011

Hi, I have a watch process on our SFTP servers that looks for certain file names and picks them up for processing. My problem is that some of the files are large, so if I SFTP them as-is, the watch process picks up partial files because the upload has not finished yet. Is there a way to give each uploaded file a temporary name until it is fully uploaded, and then rename it back to the actual name? I know it would be possible to manually upload the files under a temp name and run the rename command once each upload finishes, but I would like to avoid that if possible.
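(For a single file, the manual workaround I mean would look roughly like this sketch; the host, credentials, paths, and the ".part" suffix are just placeholders:)

    // upload under a temporary name, then rename once the transfer completes
    using (Sftp client = new Sftp())
    {
        client.Connect("server");       // placeholder host
        client.Login("user", "pass");   // placeholder credentials

        string finalName = "/upload/data.csv";
        string tempName = finalName + ".part";

        // the watch process does not match the ".part" name...
        client.PutFile(@"c:\temp\data.csv", tempName);

        // ...and only sees the file once it is complete
        client.Rename(tempName, finalName);
    }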

Thank you, Dan

Applies to: Rebex SFTP
commented Sep 21, 2010 by Lukas Matyska (38,400 points)
Unfortunately, the SFTP protocol doesn't have such functionality, and Rebex SFTP doesn't have it either (though it does have Move/Rename capabilities). You are essentially facing the problem of synchronizing two processes, one of which is a network process that can be interrupted by a connection loss. I think renaming the uploaded file is the simplest and easiest solution, but please be aware of the unfinished-uploads problem (again due to possible connection loss). Can you let us know why you would rather avoid manual renaming? We should be able to offer you other solutions.
commented Sep 21, 2010 by Dan (120 points)
Well, one of the reasons is that I'm not really sure what the best way is to implement the rename when multiple files/dirs are involved, especially when a wildcard mask is used. Get/Put of one file at a time is no problem and the implementation can be done easily. I was thinking of running GetList to get a list of dirs and files, parsing it, and running CreateDirectory and GetFile/PutFile manually for each entry; that way I could rename each file. I would really appreciate it if you guys have any better solution or examples for this. I need to support Put and Get of multiple files/dirs using wildcard masks. Thank you, Dan
commented Sep 21, 2010 by Dan (120 points)
Also, the processes on the SFTP servers are out of my control, as they are managed by other teams and third parties, so I don't know whether any particular file is being watched; I therefore have to ensure that no transferred file is ever partial. That's why using a temp name fixes two possible issues: another process picking up a partially transferred file, and, in case the connection is lost, the files remaining in the temp state so they are never picked up. Thanks again, Dan

1 Answer

0 votes
answered Sep 22, 2010 by Lukas Matyska (38,400 points)
edited Sep 22, 2010

I recommend the approach demonstrated by the code below. It uploads files to a temporary folder and moves each file to the final folder once it has been successfully transferred. It also cleans up the temporary folder structure afterwards.

// holds the paths of files which failed to transfer
List<string> _failed = new List<string>();

// holds the temporarily created folders
List<string> _createdFolders = new List<string>();

string sourcePath = @"c:\temp\extracted\*.txt";
string tempFolder = "/home/tester0/upload/temp";
string finalFolder = "/home/tester0/upload/final";

public void Run()
{
    using (Sftp client = new Sftp())
    {
        // connect and login
        client.Connect("server");
        client.Login("user", "pass");

        // register batch events
        client.BatchTransferProgress += new SftpBatchTransferProgressEventHandler(client_BatchTransferProgress);
        client.BatchTransferProblemDetected += new SftpBatchTransferProblemDetectedEventHandler(client_BatchTransferProblemDetected);

        // create temporary folder for files to upload
        if (!client.DirectoryExists(tempFolder))
        {
            client.CreateDirectory(tempFolder);
            _createdFolders.Add(tempFolder);
        }

        // create final folder for files to upload
        if (!client.DirectoryExists(finalFolder))
            client.CreateDirectory(finalFolder);

        // upload files in XCopy mode overwriting all existing files
        client.PutFiles(sourcePath, tempFolder, SftpBatchTransferOptions.XCopy, SftpActionOnExistingFiles.OverwriteAll);

        // remove temporary folder structure (from the end)
        for (int i = _createdFolders.Count-1; i >= 0; i--)
        {
            client.RemoveDirectory(_createdFolders[i]);
        }
    }

    // report failed files
    if (_failed.Count > 0)
    {
        Console.WriteLine("Following file(s) failed to transfer:");
        foreach (string path in _failed)
        {
            Console.WriteLine(path);
        }
    }
}

void client_BatchTransferProgress(object sender, SftpBatchTransferProgressEventArgs e)
{
    // get Sftp client object
    Sftp client = (Sftp)sender;

    // prepare final path
    string finalPath = finalFolder + e.RemotePath.Substring(tempFolder.Length);

    switch (e.Operation)
    {
        case SftpBatchTransferOperation.DirectoryCreated:
            // batch operation created folder, store it for the clean up
            _createdFolders.Add(e.RemotePath);
            break;

        case SftpBatchTransferOperation.DirectoryProcessingStarted:
            // create final folder
            if (!client.DirectoryExists(finalPath))
                client.CreateDirectory(finalPath);
            break;

        case SftpBatchTransferOperation.FileTransferred:
            // delete the file if it already exists in target destination
            if (client.FileExists(finalPath))
                client.DeleteFile(finalPath);

            // move/rename successfully transferred file
            client.Rename(e.RemotePath, finalPath);
            break;
    }
}

void client_BatchTransferProblemDetected(object sender, SftpBatchTransferProblemDetectedEventArgs e)
{
    // ignore FileExists problem (use default behaviour)
    if (e.ProblemType == SftpBatchTransferProblemType.FileExists)
        return;

    // store path of the failed transfer
    _failed.Add(e.LocalPath);
    e.Action = SftpBatchTransferAction.Skip;
}
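For completeness: the fields, Run(), and the two event handlers above are assumed to be members of one class. Wrapping them in a class (the name TempUploader here is mine, not part of the library) would make the whole transfer a two-liner:

    TempUploader uploader = new TempUploader();
    uploader.Run();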