Gold Award Winner

Recently I have been working on moving an e-commerce website to an Azure Web App.  Most of the e-commerce functionality worked without issues in Azure except for how the site had implemented integration, which was done using a collection of XML-file-based jobs.

There were many possible options, and I chose to explore Azure File Storage because, from what I had read, it seemed like a good fit.  This article illustrates the general approach.

Note: As Microsoft Azure is changing rapidly, it is likely that this article will become outdated quickly as the platform's current limitations are addressed.

Context


The e-commerce site is built upon MVC and has over 10 years of development behind it.  The area of the package that needed refactoring consisted of a couple dozen inbound and outbound data feeds loaded into the system.  The feeds are controlled by scheduled tasks termed "Jobs".  Fortunately, the package supports many ways of extending the solution, including the jobs.

The current jobs were built using System.IO classes like Directory and File.  The external on-premises solutions create and remove files on a collection of shared network drives.  This is a simple and common integration approach.

With the move to Azure, this approach was no longer suitable.  The main reason is that persistent local storage is not available to a web app, unlike an Azure VM or an on-premises server.  And, of course, the Azure web app did not have access to the on-premises network shares.

Design


Azure File Storage (AFS) is a relatively new offering in Azure Storage and initially seemed like a great fit.  An AFS share can be mapped as a network drive both within the enterprise and in Azure.  This was great news: if we created a shared directory, neither the existing on-premises systems nor the e-commerce site would have to change.

Unfortunately, things are rarely this simple.  Azure web apps do not support accessing AFS as a network share (msdn forum).  I suspect this will be supported in the future, but for now it makes System.IO unsuitable.  Ok, converting the existing Directory and File calls to the Azure File Storage client classes should not be too much of a change.

One of the great features of AFS is that mounting it as a network share is supported on Windows 8, Windows 10 and Windows Server 2012.  This is really simple to set up, and there is an excellent article explaining the steps.
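For illustration, the mount essentially comes down to a single net use command; the drive letter, account name and key below are placeholders, not values from this project:

```
net use Z: \\mystorageaccount.file.core.windows.net\integration /u:mystorageaccount <storage-account-key>
```

Port 445 must be open between the client and Azure for this to work, which is exactly where many corporate networks and ISPs get in the way.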

That is, of course, unless you are one of the poor sods who can't seem to get it to work on their network...  At least I am not alone in getting a System Error 53.  I am confident this will be resolved soon and/or someone will post a solution, but I have a deadline, so I needed a workaround.

Asking the teams responsible for the external systems to move from System.IO to the Azure File Storage client was not a reasonable option, so in the short term I needed a mechanism to push files from on-premises folders to an AFS share.

Implementation


The following is the basis of a console application that monitors a folder for new files and, when one is detected, moves the file to an AFS file share.  The main purpose is to illustrate how easy AFS is to use, and this should be viewed as a supplement to the Microsoft article.  Please refer to that article for setting up the initial AFS file share.

The first step was to create a simple console application and set up a watcher on the folder.  The following sets up a listener for when files are created in a specified folder:
using System;
using System.Configuration;
using System.IO;

static void Main(string[] args)
{
    string syncFolder = ConfigurationManager.AppSettings["SyncFolder"];

    var watcher = new FileSystemWatcher();
    watcher.Path = syncFolder;
    watcher.Created += Watcher_Created;
    watcher.EnableRaisingEvents = true;

    Console.WriteLine("Watching the {0} folder.", syncFolder);
    Console.WriteLine("Press any key to quit!");
    Console.ReadKey();
}
As I wanted to use the AFS client, I installed the WindowsAzure.Storage NuGet package.  In order to access a file share, I created a static property as follows:
// from the WindowsAzure.Storage NuGet package
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

private static CloudFileShare _share;
private static CloudFileShare IntegrationFileShare
{
    get
    {
        if (_share == null)
        {
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageAccount"]);
            CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
            _share = fileClient.GetShareReference("integration");
        }

        if (!_share.Exists())
            throw new ApplicationException(string.Format("Integration file share name {0} does not exist.", _share.Name));

        return _share;
    }
}
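For reference, the two appSettings values the console application reads could look like this in app.config; the folder path, account name and key are placeholders, not real values:

```
<appSettings>
  <add key="SyncFolder" value="C:\IntegrationOutbox" />
  <add key="StorageAccount" value="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=..." />
</appSettings>
```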

Note that my AFS share is called integration.  Now that I have access to the share, the next step is to push the file to the cloud.  In this example I delete the file after the copy, but I could always copy it to an archive location instead.
private static void Watcher_Created(object sender, FileSystemEventArgs e)
{
    try
    {
        Stopwatch sw = new Stopwatch();
        sw.Start();

        // copy the file out to AFS
        CloudFileDirectory rootDir = IntegrationFileShare.GetRootDirectoryReference();
        CloudFile file = rootDir.GetFileReference(e.Name);
        file.UploadFromFile(e.FullPath, System.IO.FileMode.Open);
        sw.Stop();

        // remove the local file now that it is in the share
        File.Delete(e.FullPath);

        Console.WriteLine("Uploaded file {0} in {1} milliseconds.", e.Name, sw.ElapsedMilliseconds);
    }
    catch (Exception ex)
    {
        // note: Created can fire while the producer still has the file
        // open, in which case the upload fails and lands here
        Console.WriteLine("Failed to upload file {0} with error {1}", e.Name, ex.Message);
    }
}
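One caveat worth noting: FileSystemWatcher raises Created as soon as the file appears, which may be before the producing process has finished writing it.  A small helper along these lines (a sketch of my own, not part of the original job code, and the name WaitForFile is mine) could be called before the upload to wait for the writer to release the file:

```
private static bool WaitForFile(string path, int attempts = 10, int delayMs = 500)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            // opening with FileShare.None succeeds only once no other
            // process holds the file open
            using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                return true;
        }
        catch (IOException)
        {
            // still locked by the writer; back off and try again
            System.Threading.Thread.Sleep(delayMs);
        }
    }
    return false;
}
```

Calling WaitForFile(e.FullPath) at the top of Watcher_Created and skipping the upload on false keeps the simple push mechanism from racing the on-premises systems.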

Conclusion


Though this simple push mechanism is not an enterprise-grade solution, it does unblock the team so they can concentrate on building the e-commerce site.  Appropriate tickets have been raised with the external teams and, hopefully, with either another release by Microsoft, some configuration by the corporation's ISP, or a clever TechNet wiki post, we can remove the component.

Often I find schedules slip due to issues like the above, where something should just work but doesn't.  Recognizing when to find an alternative is often tricky, but when the alternative is this easy, sometimes it just makes sense.