0 votes
by (120 points)

I am creating a virtual file system for Azure Blob Storage, using the Azure Blob Storage client library v12 for .NET. Neither of the following two methods seems to work properly:

CreateReadOnlyContent
CreateImmediateWriteContent

I am testing with a FileZilla client and get the following:

Command: remote:/Somecontainer/testfileforsftp.txt => local:C:\temp\testfileforsftp3.txt
Error: error while reading: received failure with description 'Read fault.'
Error: File transfer failed

The relevant code is simple and listed below:

protected override NodeContent GetContent(...)
{
    BlobClient blobClient = getAZBlobClient(getNodeBlobName(node));
    BlobDownloadInfo download = blobClient.Download();
    return NodeContent.CreateReadOnlyContent(download.Content);
    //return NodeContent.CreateImmediateWriteContent(download.Content);
}

Applies to: Rebex SFTP, File Server

2 Answers

0 votes
by (5.3k points)
edited by

Hello XM,
thanks for the question and welcome to the Rebex forum.
From the description provided, I can only guess that an exception was thrown by the Azure BlobClient (Azure stream). We have encountered misbehaving streams in previous versions of the Azure SDK (an unsupported SetLength method, a problematic Seek method).
The server log should reveal where the problem lies.

To create a server log, use code similar to the following line:

mySftpServer.LogWriter = new Rebex.FileLogWriter(@"c:\temp\log.txt", Rebex.LogLevel.Debug);

Alternatively, you can log file operations in your file system provider. Your provider should have a constructor that accepts an instance of the Rebex FileSystemProviderSettings.

public MyAzureProvider(FileSystemProviderSettings fileSystemSettings = null)
    : base(fileSystemSettings)
{
    ...
}

Use the logger in the provider:

var settings = new FileSystemProviderSettings()
{
    LogWriter = new Rebex.FileLogWriter(@"c:\temp\log.txt", Rebex.LogLevel.Debug)
};
var myAzureProvider = new MyAzureProvider(settings);
0 votes
by (5.3k points)
edited by

I can confirm that Azure SDK v12 (package Azure.Storage.Blobs 12.7) streams exhibit strange behavior in some scenarios.
If you need a read-only stream, use the stream returned from the OpenRead method.

var blobClient = GetBlobClientForNode(node);
if (contentParameters.AccessType == NodeContentAccess.Read)
{
    return NodeContent.CreateReadOnlyContent(blobClient.OpenRead());
}

If you or another future visitor of the forum experiences problems with 'write' streams created by the BlockBlobClient.OpenWrite method (for example, an upload failing because the blob's ETag has changed), please drop an email to support@rebex.net.
I have written an experimental 'BlobWriteStream' that is able to upload blob blocks in parallel.

It can be used like this in the file system provider's GetContent method:

var blobClient = GetBlobClientForNode(node);

if (contentParameters.AccessType == NodeContentAccess.Write)
{
    var blockBlobClient = GetContainerClientForNode(node.Parent).GetBlockBlobClient(node.Name);
    var blobWriteStream = new BlobWriteStream(blobClient, blockBlobClient, _memoryStreamManager);
    return NodeContent.CreateImmediateWriteContent(blobWriteStream);
}
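For readers curious how such a write stream can upload blocks in parallel: the general pattern with Azure block blobs (this is not Rebex's actual BlobWriteStream, whose source is not shown here) is to buffer incoming writes into fixed-size blocks, give each block an equal-length base64 block ID, stage the blocks with BlockBlobClient.StageBlock, and finally commit the ordered ID list with CommitBlockList. A minimal, self-contained sketch of the chunking and block-ID bookkeeping; BlockChunker is a hypothetical helper name, and the Azure calls are only indicated in comments:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Hypothetical helper illustrating the bookkeeping behind block blob
// uploads: split data into fixed-size blocks and generate the ordered
// base64 block IDs that StageBlock/CommitBlockList expect.
public static class BlockChunker
{
    // All block IDs of a blob must have the same length before base64
    // encoding, so the block index is formatted as a zero-padded string.
    public static string MakeBlockId(int index) =>
        Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));

    public static List<(string BlockId, byte[] Data)> Split(byte[] data, int blockSize)
    {
        var blocks = new List<(string, byte[])>();
        for (int offset = 0, index = 0; offset < data.Length; offset += blockSize, index++)
        {
            int length = Math.Min(blockSize, data.Length - offset);
            var block = new byte[length];
            Array.Copy(data, offset, block, 0, length);
            blocks.Add((MakeBlockId(index), block));
            // Real code would stage each block (possibly in parallel):
            //   blockBlobClient.StageBlock(blockId, new MemoryStream(block));
        }
        // Real code would finish the upload with:
        //   blockBlobClient.CommitBlockList(blocks.Select(b => b.BlockId));
        return blocks;
    }
}
```

Because the blocks carry independent IDs, they can be staged concurrently and committed once at the end, which is what makes parallel upload possible.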
by (120 points)
Thank you for the support provided above.  

I put some further work into this. The offending issue in this case is that the stream returned by Azure does not support seeking. To accommodate this, I wrapped the stream using code from the contributors to Stack Overflow at this link: https://stackoverflow.com/questions/16497241/how-to-enable-seeking-in-azure-blob-stream

The replacement code is:

return NodeContent.CreateReadOnlyContent(new SeekableBlobReadStream(blobClient));

Thanks.

This is now working.
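For future readers, the wrapping approach boils down to a read-only Stream that reports CanSeek = true, tracks its own Position, and serves each Read by fetching just the needed byte range from the backing store. A minimal, self-contained sketch; the class name SeekableRangeReadStream and the rangeReader delegate are illustrative (with the Azure SDK, the delegate would issue a ranged download via BlobClient, and the length would come from the blob's properties):

```csharp
using System;
using System.IO;

// Illustrative seekable, read-only stream over a range-based reader.
// With Azure Blob Storage, the delegate would perform a ranged download
// instead of copying from memory.
public class SeekableRangeReadStream : Stream
{
    private readonly Func<long, byte[], int, int, int> _rangeReader;
    private readonly long _length;
    private long _position;

    // rangeReader(position, buffer, offset, count) returns bytes read.
    public SeekableRangeReadStream(long length, Func<long, byte[], int, int, int> rangeReader)
    {
        _length = length;
        _rangeReader = rangeReader;
    }

    public override bool CanRead => true;
    public override bool CanSeek => true;   // this is what the raw Azure stream lacked
    public override bool CanWrite => false;
    public override long Length => _length;
    public override long Position { get => _position; set => _position = value; }

    public override int Read(byte[] buffer, int offset, int count)
    {
        if (_position >= _length) return 0;            // end of stream
        int toRead = (int)Math.Min(count, _length - _position);
        int read = _rangeReader(_position, buffer, offset, toRead);
        _position += read;
        return read;
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        switch (origin)
        {
            case SeekOrigin.Begin: _position = offset; break;
            case SeekOrigin.Current: _position += offset; break;
            default: _position = _length + offset; break;
        }
        return _position;
    }

    public override void Flush() { }
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}
```

Seeking then only moves the position marker; no data is transferred until the next Read, which is why this works well over ranged blob downloads.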
by (5.3k points)
Thanks for letting us know. As I wrote in the answer above, we have an experimental 'BlobWriteStream' which, as the name suggests, supports 'write' (file upload) scenarios with Azure.
...