Virtual File System for Azure Blob Storage Error (Similar to Oct 27, 2017 questions by Paula)

0 votes
asked Dec 7, 2020 by XM (120 points)

I am creating a virtual file system for Azure Blob Storage. I am using the Azure Blob Storage client library v12 for .NET. Neither of the following two methods seems to work properly:


I am testing with a FileZilla client and get the following:

Command: remote:/Somecontainer/testfileforsftp.txt => local:C:\temp\testfileforsftp3.txt
Error: error while reading: received failure with description 'Read fault.'
Error: File transfer failed

The relevant code is simple and listed below:

protected override NodeContent GetContent(...)
{
  BlobClient blobClient = getAZBlobClient(getNodeBlobName(node));
  BlobDownloadInfo download = blobClient.Download();
  return NodeContent.CreateReadOnlyContent(download.Content);
  //return NodeContent.CreateImmediateWriteContent(download.Content);
}


Applies to: Rebex SFTP, File Server

2 Answers

0 votes
answered Dec 7, 2020 by renestein (3,430 points)
edited Dec 7, 2020 by renestein

Hello XM,
thanks for the question and welcome to the Rebex forum.
From the description provided, I can only guess that an exception has been thrown from the Azure BlobClient (the Azure stream). We have encountered misbehaving streams in previous versions of the Azure SDK (an unsupported SetLength method, a problematic Seek method).
The server log should reveal where the problem lies.

To create a server log, use code similar to the following:

mySftpServer.LogWriter = new Rebex.FileLogWriter(@"c:\temp\log.txt", Rebex.LogLevel.Debug);

Alternatively, you can log file operations in your file system provider. Your provider should have a constructor that accepts an instance of the Rebex FileSystemProviderSettings.

 public MyAzureProvider(FileSystemProviderSettings fileSystemSettings = null) : base(fileSystemSettings)

Use the logger in the provider:

var settings = new FileSystemProviderSettings()
{
    LogWriter = new Rebex.FileLogWriter(@"c:\temp\log.txt", Rebex.LogLevel.Debug)
};
var myAzureProvider = new MyAzureProvider(settings);

0 votes
answered Dec 11, 2020 by renestein (3,430 points)
edited Dec 11, 2020 by renestein

I can confirm that Azure SDK v12 (package Azure.Storage.Blobs 12.7) streams exhibit strange behavior in some scenarios.
If you need a read-only stream, use the stream returned from the OpenRead method.

var blobClient = GetBlobClientForNode(node);
if (contentParameters.AccessType == NodeContentAccess.Read)
    return NodeContent.CreateReadOnlyContent(blobClient.OpenRead());

If you or another future visitor of the forum experiences problems with 'write' streams created by the BlockBlobClient.OpenWrite method (an upload failing because the blob ETag has changed, and so on), please drop an email to
I have written an experimental 'BlobWriteStream' that is able to upload blob blocks in parallel.

It can be used like this in the FileSystemProvider GetContent method:

  var blobClient = GetBlobClientForNode(node);

  if (contentParameters.AccessType == NodeContentAccess.Write)
  {
    var blockBlobClient = GetContainerClientForNode(node.Parent).GetBlockBlobClient(node.Name);
    var blobWriteStream = new BlobWriteStream(blobClient, blockBlobClient, _memoryStreamManager);
    return NodeContent.CreateImmediateWriteContent(blobWriteStream);
  }
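
The experimental BlobWriteStream itself is not shown here, but the general technique it describes (staging blob blocks concurrently and committing the block list afterwards) can be sketched with the public BlockBlobClient API. This is only an illustration under my own assumptions, not the actual BlobWriteStream; the helper name UploadInBlocksAsync and the block-ID scheme are made up for the example.

static async Task UploadInBlocksAsync(BlockBlobClient blockBlob, Stream source,
                                      int blockSize = 4 * 1024 * 1024)
{
    var blockIds = new List<string>();
    var staging = new List<Task>();

    var buffer = new byte[blockSize];
    int read, index = 0;
    while ((read = source.Read(buffer, 0, blockSize)) > 0)
    {
        // Block IDs must be Base64 strings of equal length within one blob.
        string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes($"block-{index++:D6}"));
        blockIds.Add(blockId);

        // Copy the bytes read so the next loop iteration can reuse the buffer,
        // then stage the block without awaiting - uploads run in parallel.
        var block = new byte[read];
        Array.Copy(buffer, block, read);
        staging.Add(blockBlob.StageBlockAsync(blockId, new MemoryStream(block)));
    }

    await Task.WhenAll(staging);                   // wait for all staged blocks
    await blockBlob.CommitBlockListAsync(blockIds); // commit them in order
}

Committing the ordered block list at the end is what makes parallel staging safe: the blocks may arrive at the service in any order, but the committed blob follows the list.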
commented Dec 31, 2020 by XM (120 points)
Thank you for the support provided above.  

I put some further work into this. The offending issue in this case is that the stream returned for Azure Blob Storage does not support seeking. To accommodate this, I wrapped the stream using code from the contributors on Stack Overflow at this link:

The replacement code is:
 return NodeContent.CreateReadOnlyContent(new SeekableBlobReadStream(blobClient));


This is now working.
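
For anyone landing here without the linked code: one way such a wrapper can be built is to satisfy Seek/Length locally and serve each Read with a ranged download. This is a hypothetical sketch of that idea (the class name matches the comment above, but the body is my own assumption, not the Stack Overflow code), using BlobClient.GetProperties and the ranged BlobClient.Download overload:

using System;
using System.IO;
using Azure;
using Azure.Storage.Blobs;

// Read-only seekable wrapper: tracks Position itself and issues a ranged
// download per Read instead of holding one forward-only stream.
class SeekableBlobReadStream : Stream
{
    readonly BlobClient _blob;
    readonly long _length;
    long _position;

    public SeekableBlobReadStream(BlobClient blob)
    {
        _blob = blob;
        _length = blob.GetProperties().Value.ContentLength;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        if (_position >= _length)
            return 0;

        // Download only the requested range of the blob.
        var range = new HttpRange(_position, Math.Min(count, _length - _position));
        using Stream content = _blob.Download(range).Value.Content;

        int read, total = 0;
        while (total < count && (read = content.Read(buffer, offset + total, count - total)) > 0)
            total += read;
        _position += total;
        return total;
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        _position = origin switch
        {
            SeekOrigin.Begin => offset,
            SeekOrigin.Current => _position + offset,
            _ => _length + offset,
        };
        return _position;
    }

    public override bool CanRead => true;
    public override bool CanSeek => true;
    public override bool CanWrite => false;
    public override long Length => _length;
    public override long Position { get => _position; set => _position = value; }
    public override void Flush() { }
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}

The trade-off is one HTTP request per Read call; a production version would typically add buffering so small sequential reads don't each hit the service.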
commented Dec 31, 2020 by renestein (3,430 points)
Thanks for letting us know. As I wrote in the answer above, we have an experimental 'BlobWriteStream' which, as the name suggests, supports 'write' (upload file) to Azure scenarios.