For example, we might have three functions that are called in sequence, or chained together: Function One gets called first, and when it's finished it triggers Function Two, which in turn triggers Function Three when it completes. Of course, it's possible to implement this with regular Azure Functions using queue messages: Function One puts a message on a queue, that message triggers Function Two, and so on. But what if we wanted to implement a fan-out and fan-in pattern?
So Function One might trigger multiple instances of Function Two, and we want to wait until all of those instances have finished before triggering Function Three. That's actually quite a lot harder to do with regular Azure Functions. Or, going back to the chaining example, what if we wanted to write an error handler that could deal with a failure wherever in the chain it occurred, whether in Function One, Two, or Three? Again, that's quite tricky to implement with regular Azure Functions.

What Durable Functions allows us to do is define our workflow in code. We write some very simple C# code that defines the workflow: what order to run the functions in, whether to trigger parallel execution, and how to handle errors at any point in the workflow. Not only does this make workflows easier to implement, it also makes them easier to understand, because the relationship between all the functions is defined in a single place, in what's called the orchestrator function.

Durable Functions also solves the problem of state for us. In a serverless environment, our functions need to be stateless, but a workflow inherently has state associated with it: we need to track how far through the workflow we are, so that we know what the next step in the process is. Typically, if we were doing this ourselves, we'd end up storing that state in a database. Durable Functions handles all state management transparently for us, which makes it really easy to implement some quite advanced workflow patterns.
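To make that a little more concrete, here's a rough sketch of what such an orchestrator function might look like with the 1.x Durable Functions API. The activity names (FunctionOne, FunctionTwo, FunctionThree, ErrorHandler) are just placeholders for illustration, not functions from this walkthrough:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class WorkflowOrchestrator
{
    // The orchestrator defines the workflow: chaining, fan-out/fan-in,
    // and error handling all live in this one function.
    [FunctionName("WorkflowOrchestrator")]
    public static async Task Run(
        [OrchestrationTrigger] DurableOrchestrationContext context)
    {
        try
        {
            // Chaining: each activity runs when the previous one has finished.
            var items = await context.CallActivityAsync<List<string>>("FunctionOne", null);

            // Fan-out: start an instance of FunctionTwo for every item...
            var tasks = new List<Task<string>>();
            foreach (var item in items)
            {
                tasks.Add(context.CallActivityAsync<string>("FunctionTwo", item));
            }

            // ...fan-in: wait until all of them have completed.
            var results = await Task.WhenAll(tasks);

            // Then carry on with the next step in the chain.
            await context.CallActivityAsync("FunctionThree", results);
        }
        catch (Exception)
        {
            // One error handler covers a failure anywhere in the workflow.
            await context.CallActivityAsync("ErrorHandler", context.InstanceId);
        }
    }
}
```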
Don't worry if we've already installed Visual Studio without this; we can always go back into the installer, select Modify, and add the Azure development workload later.
This is going to give us several really useful Azure development extensions, as well as the Azure SDK. Two important things that we'll get are the Azure Functions and Web Jobs Tools Visual Studio extension and the Azure Storage Emulator.
Here we can see it in the extensions list. Having this present allows us to create new Azure Functions apps from the File > New Project menu.
The Storage Emulator is necessary if we want to test Durable Functions locally. That's because, as we've explained, the Durable Functions extension makes use of Azure Storage tables and queues to implement its task hubs, which store orchestration state and control messages. So having the Storage Emulator available allows us to run and test our Durable Functions locally against an emulated storage account.
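As an aside, the task hub gets a default name, but it can also be named explicitly in host.json. A minimal sketch (the hub name here is just an example) might look like this:

```json
{
  "durableTask": {
    "hubName": "MyTaskHub"
  }
}
```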
Let's quickly see how we can create a new Azure Functions app in Visual Studio.
So here we are in Visual Studio 2017, with the Azure development workload already installed. If we go to File > New Project, we'll see that there's an Azure Functions template we can choose. We'll need to enter a name for our project and click OK, and that's going to create a basic function app for us.
Now, it's going to prompt for the version of the Azure Functions runtime that we want to use. The original version (v1) of Azure Functions was based on the .NET Framework, while version 2 runs on .NET Core. We can also choose whether to create an empty function app or one with a starter function. We're going to pick one with an HTTP-triggered function, just so that we've got something to test to prove that it's working. Over on the right, we can configure some settings for our function app, including the storage account that we'll be using; by default, that points to the Storage Emulator. Obviously, in production we'll be pointing at a real storage account, but for development purposes this is fine.
So, we'll click OK, and it will take a minute or so to create, but once it's done, we can see that we've got a fairly sparse project. There's a local.settings.json file, and if we open it, we can see the connection string pointing at the Storage Emulator, which we can use for local development and testing. This file doesn't get checked into source control, so when our function app is deployed to Azure, it will get these connection strings from the function app's application settings instead. And of course, there's our example function, Function1.cs, which just responds with a hello message whenever we call it.
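The generated Function1.cs looks roughly like this (slightly simplified here, but the behaviour is the same: it expects a name parameter and echoes back a greeting):

```csharp
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

public static class Function1
{
    [FunctionName("Function1")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequestMessage req,
        TraceWriter log)
    {
        log.Info("C# HTTP trigger function processed a request.");

        // Look for a "name" parameter in the query string.
        string name = req.GetQueryNameValuePairs()
            .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
            .Value;

        // Reply with a greeting, or ask for the missing parameter.
        return name == null
            ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string")
            : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
    }
}
```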
Now, we're going to enable Include prerelease and search for Durable Functions. Here we can see that the Microsoft.Azure.WebJobs.Extensions.DurableTask NuGet package is available, and that's what we need to add. We'll select version 1.6.2, although newer versions may have been released since.
Once we've installed this NuGet package into our project, we can see what it's done by right-clicking on the project and choosing to edit the csproj file.
We can see that all it's done is add a new package reference to the DurableTask extension.
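In the csproj, that reference looks something like this (the version number will match whatever version was installed):

```xml
<ItemGroup>
  <PackageReference Include="Microsoft.Azure.WebJobs.Extensions.DurableTask" Version="1.6.2" />
</ItemGroup>
```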
When we start debugging, it's going to launch the local Azure Functions host, which will tell us the URL of our function.
In this case, it's http://localhost:7071/api/Function1. So let's copy that into the browser and call the function,
and we can see that yes, it has called our function, but it's telling us that we need to pass a name parameter in the query string.
So, let's do that.
We're going to add name=Kamlesh to the query string and call the function again. Now we can see "Hello Kamlesh", so we're up and running.