
Creating a Basic Azure Web Job

In a previous article, I discussed the use of Azure Functions; however, Web Jobs perform a similar task. Azure Functions are effectively an abstraction on top of Web Jobs – meaning that, while you have more control when using Web Jobs, there’s a little more to do when writing them.

This article covers the basics of Web Jobs, and has a walk-through for creating a very simple task using one.

Create a new Web Job

Once you create this project, you’ll need to fill in the following values in the app.config:

<configuration>
  <connectionStrings>
    <!-- The format of the connection string is "DefaultEndpointsProtocol=https;AccountName=NAME;AccountKey=KEY" -->
    <!-- For local execution, the value can be set either in this config file or through environment variables -->
    <add name="AzureWebJobsDashboard" connectionString="" />
    <add name="AzureWebJobsStorage" connectionString="" />
  </connectionStrings>

These can both be the same value; they refer to the storage accounts where Azure stores its data.

AzureWebJobsDashboard

This is the storage account used to store logs.

AzureWebJobsStorage

This is the storage account used to store whatever the application needs to function (for example: queues or tables). In the example below, it’s where the file will go.

Storage accounts can be set up from the Azure Portal (more on this later):

A Basic Application

For this example, let’s take a file from blob storage and parse it, then write out the result to a log. Specifically, we’ll take an XML file, and write the number of nodes into a log; here’s the file:

<test>
    <myNode>
    </myNode>
    <myNode>
    </myNode>
</test>

I think we’ll probably be looking for a figure around 2.

Blob Storage

Before we can do anything with blob storage, we’ll need a new storage area; create a new storage account:

Set the account kind to “General purpose” (because we’re working with files); other than that, go with your gut.

Uploading

Once you’ve created the account, you’ll need to add a file – otherwise nothing will happen. You can do this in the web portal, or you can do it via a desktop utility that Microsoft provide: Storage Explorer.

I kind of expected this to take me to the web page mentioned… but it doesn’t! You have to navigate there manually:

http://storageexplorer.com

Install it… unless you want to upload your file using the web portal… in which case: don’t.

We can create a new container:

Now, we can see the storage account and any containers:

Now, you can upload a file from here (remember that you can do all this inside the Portal):

Once you’ve created this, go back and update the storage connection string (described above). You may also want to repeat the process for a dashboard storage area (or, as stated above, they can be the same).
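If you’d rather script the upload than click through Storage Explorer or the portal, something along these lines should also work. This is just a sketch using the same WindowsAzure.Storage SDK as the download code below (exact method names can vary slightly between SDK versions), with the container and file names from this walk-through:

        public static async Task UploadFile(string connectionString, string containerName, string filePath)
        {
            // Connect to the storage account and get a reference to the target container
            CloudStorageAccount storage = CloudStorageAccount.Parse(connectionString);
            CloudBlobClient client = storage.CreateCloudBlobClient();
            CloudBlobContainer container = client.GetContainerReference(containerName);

            // Create the container if it doesn't already exist
            await container.CreateIfNotExistsAsync();

            // Upload the local file as a block blob, named after the file itself
            CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(filePath));
            await blob.UploadFromFileAsync(filePath);
        }

Called with, for example, UploadFile(connectionString, "testblob", @"c:\temp\test.xml") – the local path being whatever you like.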

Programmatically Downloading

Now we have a file in the directory, it can be downloaded via the WebJob; here’s a function that will download a file:

        public static async Task<string> GetFileContents(string connectionString, string containerString, string fileName)
        {
            // Connect to the storage account and get a reference to the named container and blob
            CloudStorageAccount storage = CloudStorageAccount.Parse(connectionString);
            CloudBlobClient client = storage.CreateCloudBlobClient();
            CloudBlobContainer container = client.GetContainerReference(containerString);
            CloudBlob blob = container.GetBlobReference(fileName);

            // Download the blob into memory, then read it back out as a string
            MemoryStream ms = new MemoryStream();
            await blob.DownloadToStreamAsync(ms);
            ms.Position = 0;

            StreamReader sr = new StreamReader(ms);
            string contents = sr.ReadToEnd();
            return contents;
        }

The code to call this is here (note the commented-out commands from the default WebJob template):

        static void Main()
        {
            Console.WriteLine("Starting");

            var config = new JobHostConfiguration();

            if (config.IsDevelopment)
            {
                config.UseDevelopmentSettings();
            }

            //var host = new JobHost();

            string fileContents = AzureHelpers.GetFileContents(config.StorageConnectionString, "testblob", "test.xml").Result;
            Console.WriteLine(fileContents);

            // The following code ensures that the WebJob will be running continuously
            //host.RunAndBlock();

            Console.WriteLine("Done");
        }

Although this works (sort of – it doesn’t check for new files, and it would need to be run on a scheduled basis – “On Demand” in Azure terms), you don’t need it (at least not for jobs that react to files being uploaded to storage containers). WebJobs provide this functionality out of the box! There are a number of types that a blob-triggered function can bind its parameter to:

  • string
  • TextReader
  • Stream
  • ICloudBlob
  • CloudBlockBlob
  • CloudPageBlob
  • CloudBlobContainer
  • CloudBlobDirectory
  • IEnumerable<CloudBlockBlob>
  • IEnumerable<CloudPageBlob>

Here, we’ll use a BlobTrigger and accept a string. Moreover, doing it this way makes writing to the log much easier, as the TextWriter is injected for you (a form of parameter binding). Here’s what the complete solution looks like in the new paradigm:

        public static void ProcessFile([BlobTrigger("testblob/{name}")] string fileContents, TextWriter log)
        {
            // The trigger has already read the blob into a string; load it as XML and
            // write the number of nodes under the root element to the log
            XmlDocument xmlDoc = new XmlDocument();
            xmlDoc.LoadXml(fileContents);
            log.WriteLine($"Node count: {xmlDoc.FirstChild.ChildNodes.Count}");
        }

The key thing to notice here is that the function is static and public (the class it’s in needs to be public, too – even if that’s the Program class). The WebJob framework uses reflection to work out which functions it needs to run.
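Note that, for the trigger to actually fire, the host does need to be running; a minimal Main for this version (essentially just un-commenting the template code shown earlier) would look something like this:

        static void Main()
        {
            var config = new JobHostConfiguration();

            if (config.IsDevelopment)
            {
                config.UseDevelopmentSettings();
            }

            // The host finds ProcessFile via reflection, then blocks here and runs the
            // function each time a blob appears in the "testblob" container
            var host = new JobHost(config);
            host.RunAndBlock();
        }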

The other point to note is that I’m getting the parameter as a string – the list above details the other types you could bind to; for example, if you wanted to delete the blob afterwards, you’d probably want to use an ICloudBlob or something similar (a sketch of that follows).
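As a rough sketch of what that might look like (an assumption on my part, not something covered above), you could bind to an ICloudBlob, read it manually, and then delete it once you’re done:

        public static async Task ProcessAndDeleteFile([BlobTrigger("testblob/{name}")] ICloudBlob blob, TextWriter log)
        {
            // Read the blob contents into memory
            using (var ms = new MemoryStream())
            {
                await blob.DownloadToStreamAsync(ms);
                ms.Position = 0;

                // Count the nodes, as before
                XmlDocument xmlDoc = new XmlDocument();
                xmlDoc.Load(ms);
                log.WriteLine($"Node count: {xmlDoc.FirstChild.ChildNodes.Count}");
            }

            // Remove the blob now that it's been processed
            await blob.DeleteAsync();
        }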

Anyway, it works:

The log file

Remember the storage area that we specified for the dashboard earlier? You should now see some new containers created in that storage area:

This has created a number of directories, but the one that we’re interested in is “output-logs” in the “azure-webjobs-hosts” container:

And here’s the log itself:

References

https://docs.microsoft.com/en-us/azure/app-service-web/web-sites-create-web-jobs

https://stackoverflow.com/questions/36610952/azure-webjobs-vs-azure-functions-how-to-choose

https://stackoverflow.com/questions/27580264/where-do-i-get-the-azurewebjobsdashboard-connection-string-information

http://www.hanselman.com/blog/IntroducingWindowsAzureWebJobs.aspx

https://stackoverflow.com/questions/24286214/where-are-azure-webjobs-blobinput-and-bloboutput-classes

https://docs.microsoft.com/en-us/azure/app-service-web/websites-dotnet-webjobs-sdk-storage-blobs-how-to

Deploying an Azure Recommendation Service Using an ARM Template

Azure provides a number of AI services out of the box. The recommendation service is one of these, and it’s part of Azure Cognitive Services.

Why?

Deploying a new service to Azure is quite straightforward; for recommendations, you navigate to the Portal and select a new service:

Then you select the various options one by one, and finally, you create the resource.

But what if you want to create this in development, then in test, and then again in production? Or what if you want to deploy it multiple times? Although it’s straightforward, filling in this kind of form is prone to error, and it’s time-consuming.

ARM Templates

Azure allows you to create a template, and to create your resource based on that. There are a number of ways to do this; ultimately, it’s just a JSON document, so you could open up notepad and just type one.
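If you did fancy typing one, the basic shape is pretty small. The following is just an illustrative skeleton for a Cognitive Services account (the exported template described below gives you the real values; in particular, the apiVersion, location and sku here are assumptions on my part):

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "accounts_TestRecommendations_name": {
            "type": "string"
        }
    },
    "resources": [
        {
            "type": "Microsoft.CognitiveServices/accounts",
            "name": "[parameters('accounts_TestRecommendations_name')]",
            "apiVersion": "2016-02-01-preview",
            "location": "westus",
            "kind": "Recommendations",
            "sku": {
                "name": "F0"
            },
            "properties": {}
        }
    ]
}

In practice you don’t need to hand-write this (the export described below does it for you), but it’s useful to see that there’s no magic in the file.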

Here’s how I created it initially:

Create a new resource group:

However, this doesn’t seem to give you too much out of the box (there are templates, but recommendations isn’t one of them):

Fortunately, you can reverse engineer the deployment that you’ve already made:

Downloading this gives you everything you need to re-deploy:

Running the template

So, now you’ve got a JSON file, how do you tell Azure what to do? PowerShell seems to be Microsoft’s answer of choice (as it is for so many things these days).

You’ll need to change the execution policy first:

Set-ExecutionPolicy -Scope Process -ExecutionPolicy Unrestricted

Then run the script:
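I won’t reproduce the whole downloaded deploy.ps1 here, but the part that matters boils down to something like the following (the resource group and file names are placeholders; use whatever your export contains):

Login-AzureRmAccount

New-AzureRmResourceGroupDeployment `
    -ResourceGroupName "pcm-dev" `
    -TemplateFile ".\template.json" `
    -TemplateParameterFile ".\parameters.json"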

Success

When it works, you’ll get something like this:

And here’s the service:

Errors

It would be a gross exaggeration to say this worked straight away for me; here are the errors that I encountered, and how I resolved them.

Resource Group Name is null

New-AzureRmResourceGroupDeployment : 18:23:28 – Error: Code=InvalidDeploymentParameterValue; Message=The value of deployment parameter ‘accounts_TestRecommendations_name’ is null. Please specify the value or use the parameter reference. See https://aka.ms/arm-deploy/#parameter-file for details.
At C:\Users\Paul\Downloads\ExportedTemplate-pcm-dev\deploy.ps1:104 char:5
+ New-AzureRmResourceGroupDeployment -ResourceGroupName $resourceGr …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDeploymentCmdlet

Resolution

This is caused by the parameter being null by default. Change parameters.json:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "accounts_TestRecommendations_name": {
            "value": "testRecommendations1"
        }
    }
}

No connection

Login-AzureRmAccount : The browser based authentication dialog failed to complete. Reason: The server or proxy was not found.
At C:\Users\Paul\Downloads\ExportedTemplate-pcm-dev\deploy.ps1:71 char:1
+ Login-AzureRmAccount;
+ ~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Add-AzureRmAccount], AadAuthenticationFailedException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.Profile.AddAzureRMAccountCommand

Resolution

This is caused by not having a connection to Azure… so the resolution is to connect.

Invalid parameter value

C:\Users\Paul\Downloads\ExportedTemplate-pcm-dev\deploy.ps1 : Cannot retrieve the dynamic parameters for the cmdlet.
Error parsing boolean value. Path ‘parameters.accounts_TestRecommendations_name.value’, line 6, position 22.
At line:1 char:1
+ .\deploy.ps1
+ ~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [deploy.ps1], ParameterBindingException
+ FullyQualifiedErrorId : GetDynamicParametersException,deploy.ps1

Resolution

In my first attempt to resolve the first error, I specified a name without quotes; i.e.:

            "value": testRecommendations1

This seems to cause Azure to treat the value as a boolean; the fix is pretty straightforward once you’ve worked out what the error is actually saying:

            "value": "testRecommendations1"

Only one free account is allowed

New-AzureRmResourceGroupDeployment : 07:58:51 – Resource Microsoft.CognitiveServices/accounts ‘testRecommendations1’
failed with message ‘{
“error”: {
“code”: “CanNotCreateMultipleFreeAccounts”,
“message”: “Operation failed. Only one free account is allowed for account type ‘Recommendations’.”
}
}’
At C:\Users\Paul\Downloads\ExportedTemplate-pcm-dev\deploy.ps1:104 char:5
+ New-AzureRmResourceGroupDeployment -ResourceGroupName $resourceGr …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDeploymentCmdlet

Resolution

This was caused because my account only allows a single free recommendations service at any one time. So the fix is to delete the existing recommendations account:
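You can do that through the portal, or (if you’d rather stay in PowerShell) something along these lines should do it; the resource group name here is a placeholder, and the account name is the one used above:

Remove-AzureRmResource `
    -ResourceGroupName "pcm-dev" `
    -ResourceType "Microsoft.CognitiveServices/accounts" `
    -ResourceName "testRecommendations1" `
    -Force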

References

https://blogs.endjin.com/2015/07/using-azure-resource-manager-and-powershell-dsc-to-create-and-provision-a-vm/

https://blogs.endjin.com/2016/01/azure-resource-manager-authentication-from-a-powershell-script/