
Setting up an e-mail Notification System using Logic Apps

One of the newer features of Microsoft’s Azure offering is Logic Apps: these are basically a workflow system, not totally dissimilar to Windows Workflow (WF, so as not to get sued by panda bears). I’ve worked with a number of workflow systems in the past, from standard offerings to completely bespoke versions. The problem always seems to be that, once people start using them, they become the first thing you reach for to solve every problem. That’s not to say that you can’t solve every problem using a workflow (obviously, it depends which workflow and what you’re doing), but they are not always the best solution. In fact, they tend to be at their best when they are small and simple.

With that in mind, I thought I’d start with a very straightforward e-mail alert system. In fact, all this is going to do is read a service bus queue and send an e-mail. I discussed a similar idea here, but that was using a custom-written function.

Create a Logic App

The first step is to create a new Logic App project:

There are three options here: create a blank logic app, choose from a template (for example, process a service bus message), or define your own with a given trigger. We’ll start from a blank app:

Trigger

Obviously, for a workflow to make sense, it has to start on an event or a schedule. In our case, we are going to run from a service bus entry, so let’s pick that from the menu that appears:

In this case, we’ll choose Peek-Lock, so that we don’t lose our message if something fails. I can now provide the connection details, or simply pick the service bus from a list that it already knows about:

It’s not immediately obvious, but you have to provide a connection name here:

If you choose Peek-Lock, you’ll be presented with an explanation of what that means, and then a screen such as the following:

In addition to picking the queue name, you can also choose the queue type (as well as listening to the queue itself, you can run your workflow from the dead-letter queue – which is very useful in its own right, and may even be a better use case for this type of workflow). Finally, you can choose how frequently to poll the queue.

If you now pick “New step”, you should be faced with an option:

In our case, let’s provide a condition (so that only queue messages with “e-mail” in the message result in an e-mail):

Before progressing to the next stage – let’s have a look at the output of this (you can do this by running the workflow and viewing the “raw output”):

Clearly the content data here is not what was entered. A quick search revealed that the data is Base64 encoded, so we have to make a small tweak in advanced mode:
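For reference, the kind of expression you end up with in advanced mode looks something like the following (this is an assumption based on the standard workflow expression functions; the Service Bus trigger exposes the payload as ContentData, and here we’re checking for the text “e-mail”):

@contains(base64ToString(triggerBody()?['ContentData']), 'e-mail')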

Okay – finally, we can add the step that actually sends the e-mail. In this instance, I simply picked Outlook.com, and allowed Azure access to my account:

The last step is to complete the message: because we only took a “peek-lock”, we need to explicitly tell the service bus that we’re done with it. In the designer, we just need to add an action:

Then tell it that we want to use the service bus again. As you can see – that’s one of the options in the list:

Finally, it wants the name of the queue, and asks for the lock token – which it helpfully offers from dynamic content:

Testing

To test this, we can add a message to our test queue using the Service Bus Explorer:
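Service Bus Explorer is the easiest route, but if you’d rather push a test message from code, a rough sketch using the older Microsoft.ServiceBus SDK would look something like this (the connection string and queue name are placeholders; note also that a BrokeredMessage created from a plain string is serialised with the DataContractSerializer, so the raw content won’t be exactly the text you typed):

using Microsoft.ServiceBus.Messaging;

// Placeholder connection string and queue name - substitute your own
string connectionString = "Endpoint=sb://...";
QueueClient client = QueueClient.CreateFromConnectionString(connectionString, "testqueue");

// The workflow condition looks for the text "e-mail" in the message content
client.Send(new BrokeredMessage("Please send an e-mail"));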

I won’t bother with a screenshot of the e-mail, but I will show this:

Which provides a detailed overview of exactly what has happened in the run.

Summary

Having a workflow system built into Azure seems like a double-edged sword. On the one hand, you could potentially use it to easily augment functionality and quickly plug holes; on the other hand, you might find very complex workflows popping up all over the system, creating an indecipherable architecture.

Working with Multiple Cloud Providers – Part 3 – Linking Azure and GCP

This is the third and final post in a short series on linking up Azure with GCP (for Christmas). In the first post, I set up a basic Azure function that updated some data in table storage, and then in the second post, I configured the GCP link from PubSub into BigQuery.

In this post, we’ll square this off by adapting the Azure function to post a message directly to PubSub; then, we’ll call the Azure function with Santa’s data, and watch it appear in BigQuery. At least, that was my plan – but Microsoft had other ideas.

It turns out that Azure Functions have a dependency on Newtonsoft.Json 9.0.1, and the GCP client libraries require 10+. So instead of being a ten-minute job on Boxing Day to link the two, it turned into a mammoth task. Obviously, I spent the first few hours searching for a way around this – surely other people have faced this, and there’s a redirect, a setting, or some way of banging the keyboard that makes it work? It turns out not.

The next idea was to experiment with contacting the Google server directly, as is described here. Unfortunately, you still need the Auth libraries.

Finally, I swapped out the function for a WebJob. WebJobs give you a little more flexibility, and have no hard dependencies. So, on with the show (albeit a little more involved than expected).

WebJob

In this post I described how to create a basic WebJob. Here, we’re going to do something similar. In our case, we’re going to listen for an Azure Service Bus Message, and then update the Azure Storage table (as described in the previous post), and call out to GCP to publish a message to PubSub.

Handling a Service Bus Message

We weren’t originally going to take this approach, but I found that WebJobs play much nicer with a Service Bus message than with trying to get them to fire on a specific endpoint. In terms of scalability, adding a queue in the middle can only be a good thing. We’ll square off the contactable endpoint at the end with a function that simply converts the endpoint call into a message on the queue. Here’s what the WebJob Program looks like:

public static void ProcessQueueMessage(
    [ServiceBusTrigger("localsantaqueue")] string message,
    TextWriter log,
    [Table("Delivery")] ICollector<TableItem> outputTable)
{
    log.WriteLine(message);

    // Deserialise the incoming queue message into our shared model
    TableItem item = Newtonsoft.Json.JsonConvert.DeserializeObject<TableItem>(message);

    // Default the table keys from the child's name if they weren't supplied
    if (string.IsNullOrWhiteSpace(item.PartitionKey)) item.PartitionKey = item.childName.First().ToString();
    if (string.IsNullOrWhiteSpace(item.RowKey)) item.RowKey = item.childName;

    // Write the record to the Azure Storage table via the output binding
    outputTable.Add(item);

    // Publish the same record to GCP PubSub
    GCPHelper.AddMessageToPubSub(item).GetAwaiter().GetResult();

    log.WriteLine("DeliveryComplete Finished");
}

Effectively, this is the same logic as the function (obviously, we now have the GCPHelper, which we’ll come to in a minute). First, here’s the code for the TableItem model:


[JsonObject(MemberSerialization.OptIn)]
public class TableItem : TableEntity
{
    [JsonProperty]
    public string childName { get; set; }
 
    [JsonProperty]
    public string present { get; set; }
}

As you can see, we need to decorate the members with specific serialisation instructions, because this model is being used by both GCP (which only needs the two properties you see here) and Azure (which also needs the properties inherited from TableEntity).
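As a quick illustration of what the OptIn behaviour buys us, serialising a TableItem only emits the decorated members, so PubSub never sees the table-specific keys (a sketch, assuming the model above):

var item = new TableItem
{
    PartitionKey = "J",
    RowKey = "Joe",
    childName = "Joe",
    present = "Bike"
};

// Because of MemberSerialization.OptIn, only the [JsonProperty] members are serialised:
// {"childName":"Joe","present":"Bike"}
string json = Newtonsoft.Json.JsonConvert.SerializeObject(item);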

GCPHelper

As described here, you’ll need to install the GCP client package into the project (in this case the WebJob, rather than the Azure Function App that we created in post one of this series):

Install-Package Google.Cloud.PubSub.V1 -Pre

Here’s the helper code that I mentioned:

public static class GCPHelper
{
    public static async Task AddMessageToPubSub(TableItem toSend)
    {
        string jsonMsg = Newtonsoft.Json.JsonConvert.SerializeObject(toSend);

        // Point the GCP libraries at the service account credentials file
        // deployed alongside the WebJob
        Environment.SetEnvironmentVariable(
            "GOOGLE_APPLICATION_CREDENTIALS",
            Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Test-Project-8d8d83hs4hd.json"));
        GrpcEnvironment.SetLogger(new ConsoleLogger());

        string projectId = "test-project-123456";
        TopicName topicName = new TopicName(projectId, "test");

        // Publish the serialised item to the PubSub topic, then shut down cleanly
        SimplePublisher simplePublisher =
            await SimplePublisher.CreateAsync(topicName);
        string messageId =
            await simplePublisher.PublishAsync(jsonMsg);
        await simplePublisher.ShutdownAsync(TimeSpan.FromSeconds(15));
    }
}

I detailed in this post how to create a credentials file; you’ll need to do that to allow the WebJob to be authorised. The Json file referenced above was created using that process.

Azure Config

You’ll need to create an Azure Service Bus queue (I’ve called mine localsantaqueue):
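I created mine through the portal, but if you’d prefer to do it from code, the older Microsoft.ServiceBus SDK (which appears later in this series) can create the queue for you – a minimal sketch, assuming you have the namespace connection string to hand:

using Microsoft.ServiceBus;

// Placeholder connection string for the Service Bus namespace
string connectionString = "Endpoint=sb://...";

NamespaceManager namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
if (!namespaceManager.QueueExists("localsantaqueue"))
{
    namespaceManager.CreateQueue("localsantaqueue");
}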

I would also download the Service Bus Explorer (I’ll be using it later for testing).

GCP Config

We already have a DataFlow, a PubSub Topic and a BigQuery Database, so GCP should require no further configuration, except to ensure that the permissions are correct.

The Service Account user (which I give more details of here) needs to have PubSub permissions. For now, we’ll make them an editor, although in this instance they probably only need to publish:

Test

We can do a quick test using the Service Bus Explorer and publish a message to the queue:

The ultimate test is that we can then see this in the BigQuery Table:

Lastly, the Function

This won’t be a completely function free post. The last step is to create a function that adds a message to the queue:

[FunctionName("Function1")]
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")]HttpRequestMessage req,             
    TraceWriter log,
    [ServiceBus("localsantaqueue")] ICollector<string> queue)
{
    log.Info("C# HTTP trigger function processed a request.");
    var parameters = req.GetQueryNameValuePairs();
    string childName = parameters.First(a => a.Key == "childName").Value;
    string present = parameters.First(a => a.Key == "present").Value;
    string json = "{{ 'childName': '{childName}', 'present': '{present}' }} ";            
    queue.Add(json);
    

    return req.CreateResponse(HttpStatusCode.OK);
}

So now we have an endpoint for our imaginary Xamarin app to call into.
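For completeness, the call from a client would just be a POST with the two query string parameters; a hypothetical example using HttpClient (the URL, app name and function key here are made up):

using System.Net.Http;
using System.Threading.Tasks;

public static async Task NotifySantaAsync()
{
    using (var client = new HttpClient())
    {
        // Hypothetical function URL - substitute your own app name and function key
        string url = "https://mysantaapp.azurewebsites.net/api/Function1" +
                     "?code=<function-key>&childName=Joe&present=Bike";

        HttpResponseMessage response = await client.PostAsync(url, null);
        response.EnsureSuccessStatusCode();
    }
}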

Summary

Both GCP and Azure are relatively immature platforms for this kind of interaction. The GCP client libraries seem to be missing functionality (and GCP is still heavily weighted away from .Net). The Azure libraries (especially Functions) seem to be in a pickle, too – with strange dependencies that make it very difficult to communicate outside of Azure. As a result, this task (which should have taken an hour or so) took a great deal of time, and it all felt completely unnecessary.

Having said that, it is clearly possible to link the two systems, if a little long-winded.

References

https://blog.falafel.com/rest-google-cloud-pubsub-with-oauth/

https://github.com/Azure/azure-functions-vs-build-sdk/issues/107

https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus

https://stackoverflow.com/questions/48092003/adding-to-a-queue-using-an-azure-function-in-c-sharp/48092276#48092276

Implicitly Acknowledging a Message from Azure Service Bus

In this post I discussed receiving, processing and acknowledging a message using the Azure Service Bus. There are two ways to acknowledge a message received from the queue (and they are common to all the message broker systems that I’ve used so far). That is, you either take the message, process it, and then go back to the broker to tell it you’re done (explicit acknowledgement); or, you remove the message from the queue and then process it (implicit acknowledgement).

Explicit Acknowledgement / PeekLock

If the message is not processed within a period of time, then it will be unlocked and returned to the queue to be picked up by the next client request.

The code for this is as follows (it is also the default behaviour in Azure Service Bus):

QueueClient queueClient = QueueClient.CreateFromConnectionString(connectionString, queueName, ReceiveMode.PeekLock);

Remember that, with this code, if you don’t call:

message.Complete();

Then you will repeatedly read the same message over and over again.
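Putting that together, a typical peek-lock receive (a minimal sketch, using the same connection string and queue name as above) completes the message once it has been dealt with, and abandons it on failure so that it goes straight back onto the queue rather than waiting for the lock to expire:

QueueClient queueClient = QueueClient.CreateFromConnectionString(
    connectionString, queueName, ReceiveMode.PeekLock);

BrokeredMessage message = queueClient.Receive();
if (message != null)
{
    try
    {
        // Do whatever work the message requires
        Console.WriteLine($"Processing: {message.GetBody<string>()}");

        // Explicit acknowledgement - the message is now removed from the queue
        message.Complete();
    }
    catch (Exception)
    {
        // Release the lock so another receiver can pick the message up immediately
        message.Abandon();
    }
}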

Implicit Acknowledgement / ReadAndDelete

Here, if the message is not processed within a period of time, or fails to process, then it is likely lost. So why would you ever use this method of acknowledgement? Well, speed is the main reason: because you don’t need to go back to the server, you potentially increase the speed of the whole transaction; furthermore, there is clearly work involved for the broker in maintaining the state of a message on the queue, expiring the message lock, and so on.

The code for the implicit acknowledgement is:

QueueClient queueClient = QueueClient.CreateFromConnectionString(connectionString, queueName, ReceiveMode.ReceiveAndDelete);
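With this mode there is no Complete call to make: the receive itself removes the message from the queue, so if the process crashes immediately afterwards, the message is gone (again, a minimal sketch using the same connection details as above):

// The message is deleted from the queue as soon as Receive returns
BrokeredMessage message = queueClient.Receive();
if (message != null)
{
    Console.WriteLine(message.GetBody<string>());
}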

References

https://docs.microsoft.com/en-us/rest/api/servicebus/peek-lock-message-non-destructive-read

Reading a Message From an Azure Service Bus Queue

In this post, I documented how to create a new application using Azure Service Bus and add a message to the queue. In this post, I’ll cover how to read that message from the queue, and how to deal with acknowledging the receipt.

The Code

The code from this post can be found here.

The code uses a lot of hard-coded strings and static methods, and this is because it makes it easier to see exactly what is happening and when. This is not intended as an example of production code; it’s more of a cut-and-paste repository.

Reading a Message

Most of the code that we’ve written can simply be re-hashed for the receipt. First, initialise the queue as before:

Uri uri = ServiceManagementHelper.GetServiceUri();
TokenProvider tokenProvider = ServiceManagementHelper.GetTokenProvider(uri);

NamespaceManager nm = new NamespaceManager(uri, tokenProvider);
if (!nm.QueueExists("TestQueue")) return;

Obviously, if the queue doesn’t exist, there’s little point creating it just so that we can read from it. The next step is to set up a queue client:

string connectionString = GetConnectionString();

QueueClient queueClient = QueueClient.CreateFromConnectionString(connectionString, queueName);

return queueClient;

The connection string is found here:

Finally, ask for the next message:

BrokeredMessage message = queueClient.Receive();
string messageBody = message.GetBody<string>();
Console.WriteLine($"Message received: {messageBody}");

And we can see the contents of the queue:

If we run again:

We can see that, despite being read, the message still sits in the queue:

Acknowledging the Message

To explicitly acknowledge a message, simply call the Complete method on the message object:

BrokeredMessage message = queueClient.Receive();
string messageBody = message.GetBody<string>();

message.Complete();

Console.WriteLine($"Message received: {messageBody}");

And the message is gone:

Summary and Cost

We now have a basic, working message queue. But one thing that I always worry about with Azure is how much this costs. Let’s run a send and receive for 100 messages with the content “test”, as above.

The first thing is to change the code slightly so that it reads through all messages (not just the first):

while (true)
{
    string message = ReadMessage("TestQueue");

    if (string.IsNullOrWhiteSpace(message)) break;
    Console.WriteLine($"Message received: {message}");
}

private static string ReadMessage(string queueName)
{
    QueueClient client = QueueManagementHelper.GetQueueClient(queueName);

    BrokeredMessage message = client.Receive();
    if (message == null) return string.Empty;
    string messageBody = message.GetBody<string>();

    message.Complete();

    return messageBody;
}

Then run this to clear the queue. client.Receive has a default timeout, so it will pause for a few seconds before returning if there are no messages. This timeout is a very useful feature: most of this post was written on a train with a flaky internet connection, and this mechanism provided a resilient way to allow communications to continue when the connection was available.
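If you want to control that pause, Receive also has an overload that takes an explicit timeout (the five seconds here is just an illustration):

// Returns null if nothing arrives within five seconds
BrokeredMessage message = client.Receive(TimeSpan.FromSeconds(5));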

And change the send code:


string message = Console.ReadLine();

for (int i = 1; i <= 100; i++)
{
    AddNewMessage("1", message, "TestQueue");
}

Next, the current credit on my account:

Let’s run 100 messages:

That looks familiar. Let’s try 10,000:

I’ve added some times to this, too. It’s processing around 10 messages per second – which is not astoundingly quick. It’s worth mentioning again that this post was written largely on a train but, still, 10 messages per second means that 10K messages will take around 15 minutes. It is faster when you have a reliable (non-mobile) internet connection, but still. Anyway, back to cost: 10K messages still showed up as zero cost.

But Azure is a paid service, so this has to start costing money at some point. This time, I’m sending a 1,000-character string as the message, and sending it 100,000 times.

After this, the balance was the same; however, the following day, it dropped slightly to £36.94. So, as far as I can tell, the balance is updated based on some kind of job that runs each day (which means that the balance might not be updated in real-time).

I asked this question here.

The published pricing details are here, but it looks like you should be able to post around 500,000 messages before you start incurring any cost: each message involves at least a send and a receive, so 500,000 messages works out at roughly 1M operations.

References

https://insidethecpu.com/2015/11/06/levaraging-azure-service-bus-with-c/

https://www.simple-talk.com/cloud/cloud-data/an-introduction-to-windows-azure-service-bus-brokered-messaging/

https://msdn.microsoft.com/en-gb/library/hh868041.aspx?f=255&MSPPError=-2147217396

https://stackoverflow.com/questions/14831281/how-does-the-service-bus-queueclient-receive-work