
A C# Programmer’s Guide to Google Cloud Pub Sub Messaging

The Google Cloud Platform provides a publish/subscribe system called ‘Pub/Sub’. In the past, I’ve written basic guides on setting up RabbitMQ and ActiveMQ, and on using the Azure messaging system. Here, I’m going to give an introduction to using the GCP Pub/Sub system.


The systems that I’ve written about in the past are fully featured (yes, including Azure) message bus systems. While the GCP offering is a message bus system of sorts, it definitely lacks some of the features of the other platforms. I suppose this stems from the fact that it serves a specific purpose, and is heavily geared toward that purpose.

Like those systems, GCP offers the Pub/Sub model: you create a topic, and anyone that’s interested can subscribe to it. Once you’ve subscribed, you’re guaranteed* to get at least one delivery of each published message. You can also, kind of, simulate a message queue, because more than one subscriber can take messages from a single subscription.
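Those semantics can be sketched with a tiny in-memory model (illustrative JavaScript only – this is not the GCP client library): every subscription gets its own copy of each published message, while consumers sharing one subscription split the messages between them.

```javascript
// Illustrative in-memory model of topic/subscription semantics (not the GCP API).
class Topic {
  constructor() { this.subscriptions = []; }
  createSubscription() {
    const sub = { queue: [] };
    this.subscriptions.push(sub);
    return sub;
  }
  publish(message) {
    // Every subscription receives its own copy of the message.
    for (const sub of this.subscriptions) {
      sub.queue.push(message);
    }
  }
}

// A consumer pulls from a subscription; two consumers sharing one
// subscription behave like competing consumers on a queue.
function pull(sub) { return sub.queue.shift(); }

const topic = new Topic();
const subA = topic.createSubscription();
const subB = topic.createSubscription();

topic.publish('m1');
topic.publish('m2');

console.log(pull(subA)); // m1
console.log(pull(subB)); // m1 - each subscription got its own copy
console.log(pull(subA)); // m2 - competing consumers on subA would split these
```

The class and function names here are made up for illustration; the real system adds acknowledgement deadlines, retries and the seven-day retention mentioned below.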


If you want to follow along with the post, you’ll need to have a GCP subscription, and a GCP project configured.


In order to set up a new topic, we’re going to navigate to the Pub/Sub menu in the console (you may be prompted to Enable Pub/Sub when you arrive).

As you can see, you’re inundated with choice here. Let’s go for “Create a topic”:

Cloud Shell

You’ve now created a topic; however, that isn’t the only way to do this. Google are big on using the Cloud Shell, so you can also create a topic there; in order to do so, select the Cloud Shell icon:

Once you get the cloud shell, you can use the following command**:

gcloud beta pubsub topics create "test"

Subscriptions and Publishing

You can publish a message now if you like; either from the console:

Or from the Cloud Shell:

gcloud beta pubsub topics publish "test" "message"

Both will successfully publish a message that will get delivered to all subscribers. The problem is that you haven’t created any subscribers yet, so it just dissipates into the ether***.

You can see there are no subscriptions, because the console tells you****:

Let’s create one:

Again, you can create a subscription from the cloud shell:

gcloud beta pubsub subscriptions create "mysubscription" --topic "test"

So, we now have a subscription, and a message.

Consuming messages

In order to consume messages in this instance, let’s create a little cloud function. I’ve previously written about creating these here. Instead of creating an HTTP trigger, this time we’re going to create a function that reacts to something on a Cloud Pub/Sub topic:

Select the relevant topic; the default code just writes the message out to the console, so that’ll do:

/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 * @param {!Object} event The Cloud Functions event.
 * @param {!Function} callback The callback function.
 */
exports.subscribe = function subscribe(event, callback) {
  // The Cloud Pub/Sub Message object.
  const pubsubMessage = event.data;

  // We're just going to log the message to prove that
  // it worked.
  console.log(Buffer.from(pubsubMessage.data, 'base64').toString());

  // Don't forget to call the callback.
  callback();
};
So, now we have a subscription:

Let’s see what happens when we artificially push a message to it.

If we now have a look at the Cloud Function, we can see that something has happened:

And if we select “View Logs”, we can see what:

It worked! Next…

Create Console App

Now we have something that will react to a message, let’s try and generate one programmatically, in C# from a console app. Obviously the first thing to do is to install a NuGet package that isn’t past the beta stage yet:

Install-Package Google.Cloud.PubSub.V1 -Pre


In this post I described how you might create a credentials file. You’ll need to do that again here (and, I think, anywhere that you want to access GCP from outside of the cloud).

In APIs & Services, select “Create credentials”:

Again, select a JSON file:

The following code publishes a message to the topic:

static async Task Main(string[] args)
{
    Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS",
        Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "my-credentials-file.json"));
    GrpcEnvironment.SetLogger(new ConsoleLogger());

    // Instantiates a client
    PublisherClient publisher = PublisherClient.Create();

    string projectId = "test-project-123456";
    var topicName = new TopicName(projectId, "test");

    SimplePublisher simplePublisher = await SimplePublisher.CreateAsync(topicName);
    string messageId = await simplePublisher.PublishAsync("test message");
    await simplePublisher.ShutdownAsync(TimeSpan.FromSeconds(15));
}

And we can see that message in the logs of the cloud function:


Unless you choose otherwise, the service account will look something like this:

The Editor permission that it gets by default is a sort of God permission. This can be made more fine-grained by removing that and selecting specific permissions; in this case, Pub/Sub -> Publisher. It’s worth bearing in mind that as soon as you remove all permissions, the account is removed, so try to maintain at least a single permission (Project Browser seems suitably innocuous).


* Google keeps messages for up to 7 days, so the guarantee has a time limit.

** gcloud may need to be initialised. If it does then:

gcloud init
gcloud components install beta

*** This is a big limitation. While topic subscriptions in other systems work in the same way, those systems also give you the option of a queue – that is, a place for messages to live when no-one is listening for them.

**** If you create a subscription in the Cloud Shell, it will not show in the console until you refresh (F5) – there may be a timeout, but I didn’t wait that long. The problem here is that refreshing messes up the shell window.




Google Cloud Datastore – Setting up a new Datastore and accessing it from a console application

Datastore is a NoSQL offering from Google; it’s part of their Google Cloud Platform (GCP). The big mind shift, if you’re used to a relational database, is to remember that each row (although they aren’t really rows) in a table (they aren’t really tables) can be different. The best analogy I could think of was a text document: each line can have a different number of words, numbers and symbols.

However, just because it isn’t relational, doesn’t mean you don’t have to consider the structure; in fact, it actually seems to mean that there is more onus on the designer to consider what and where the data will be used.


In order to follow this post, you’ll need an account on GCP, and a Cloud Platform Project.

Set-up a New Cloud Datastore

The first thing to do is to set up a new Datastore:


The next step is to select a zone. The big thing to consider, in terms of both cost and speed, is to co-locate your data where possible. Specifically, with data, you’ll incur egress charges (that is, you’ll be charged as your data leaves its zone), so your zone should be nearby, and co-located with anything that accesses it. Obviously, in this example, you’re accessing the data from your own machine, so pick a zone close to where you live.

In Britain, we’re in europe-west2:

Entities and Properties

The next thing is to set up a new entity. As we said, an entity is loosely analogous to a table.

Now we have an entity, it needs some properties. A property, again, is loosely analogous to a field; if the fields were not required to be consistent throughout the table. I’m unsure how this works behind the scenes, but it appears to simply null out the columns that have no value; I suspect this may be a visual display thing.
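To make the “every row can be different” point concrete, here’s a sketch using plain JavaScript objects standing in for entities (the property names below mirror the ones used later in this post, but the objects themselves are purely illustrative):

```javascript
// Two entities of the same (hypothetical) kind with different property sets.
const entity1 = { test1: 'Hello, World', test2: 'Goodbye, World' };
const entity2 = { test1: 'Hello again', 'new field': 'test' };

// A property that one entity has and another lacks simply isn't there;
// the console displays it as a blank/null column.
console.log(entity1['new field']); // undefined
console.log(entity2['new field']); // test
```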

You can set the value (as above), and then query the data, either in a table format (as below):

Or, you can use a SQL-like syntax, called GQL (as below).
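For example, assuming a kind called MyTest with a property called test1 (the names used later in this post), a GQL query might look like this:

```sql
SELECT * FROM MyTest WHERE test1 = 'Hello, World'
```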


In order to access the Datastore from outside GCP, you’ll need a credentials file. You’ll need to start off in the Credentials screen:

In this instance, we’ll set-up a service account key:

This creates the key as a json file:

The file should look broadly like this:

  "type": "service_account",
  "project_id": "my-project-id",
  "private_key_id": "private_key_id",
  "private_key": "-----BEGIN PRIVATE KEY-----\nkeydata\n-----END PRIVATE KEY-----\n",
  "client_email": "my-project-id@appspot.gserviceaccount.com",
  "client_id": "clientid",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/my-project-id%40appspot.gserviceaccount.com"

Keep hold of this file, as you’ll need it later.

Client Library

There is a .NET client library provided for accessing this functionality from your website or desktop app. What we’ll do next is access that entity from a console application. The obvious first step is to create one:

Credentials again

Remember that credentials file I said to hang on to; well, now you need it. It needs to be accessible from your application; there are a number of ways to address this problem, and the one that I’m demonstrating here is probably not a sensible solution in real life, but for the purpose of testing, it works fine.

Copy the credentials file into your project directory and include it in the project, then, set the properties to:

Build Action: None
Copy to Output Directory: Copy if Newer

GCP Client Package

You’ll need to install the correct NuGet package:

Install-Package Google.Cloud.Datastore.V1

Your Project ID

As you use GCP more, you’ll come to appreciate that the project ID is very important. You’ll need to make a note of it (if you can’t find it, simply select Home from the hamburger menu):

The Code

All the pieces are now in place, so let’s write some code to access the datastore:

Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS",
    Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "my-credentials-file.json"));
GrpcEnvironment.SetLogger(new ConsoleLogger());

// Your Google Cloud Platform project ID
string projectId = "my-project-id";

DatastoreClient datastoreClient = DatastoreClient.Create();
DatastoreDb db = DatastoreDb.Create(projectId, "TestNamespace", datastoreClient);

string kind = "MyTest";
string name = "newentitytest3";
KeyFactory keyFactory = db.CreateKeyFactory(kind);
Key key = keyFactory.CreateKey(name);

var task = new Entity
{
    Key = key,
    ["test1"] = "Hello, World",
    ["test2"] = "Goodbye, World",
    ["new field"] = "test"
};

using (DatastoreTransaction transaction = db.BeginTransaction())
{
    // Upsert the entity and commit the transaction.
    transaction.Upsert(task);
    transaction.Commit();
}

If you now check, you should see that your Datastore has been updated:

There are a few things to note here; the first is that you will need to select the right Namespace and Kind. The Namespace defaults to [default], and so you won’t see your new records until you select the right one.

When things go wrong

The above instructions are deceptively simple; however, getting this example working was by no means straightforward. Fortunately, when you have a problem with GCP and ask on Stack Overflow, you get answered by Jon Skeet. The following is a summary of an error that I encountered.


System.InvalidOperationException: ‘The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.’

The error occurred on the BeginTransaction line.
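One way to clear this error is to set GOOGLE_APPLICATION_CREDENTIALS in the shell before running the application, rather than in code; the path below is obviously a placeholder for wherever your JSON key actually lives (on Windows, the equivalent is `set GOOGLE_APPLICATION_CREDENTIALS=C:\path\to\my-credentials-file.json`):

```shell
# Point the Google client libraries at the service account key file
# (placeholder path - substitute your own).
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/my-credentials-file.json"
echo "$GOOGLE_APPLICATION_CREDENTIALS"
```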

The ConsoleLogger above isn’t just there for show, and does give some additional information; in this case:

D1120 17:59:00.519509 Grpc.Core.Internal.UnmanagedLibrary Attempting to load native library “C:\Users\pmichaels\.nuget\packages\grpc.core\1.4.0\lib\netstandard1.5../..\runtimes/win/native\grpc_csharp_ext.x64.dll”
D1120 17:59:00.600298 Grpc.Core.Internal.NativeExtension gRPC native library loaded successfully.
E1120 17:59:02.176461 0 C:\jenkins\workspace\gRPC_build_artifacts\platform\windows\workspace_csharp_ext_windows_x64\src\core\lib\security\credentials\plugin\plugin_credentials.c:74: Getting metadata from plugin failed with error: Exception occured in metadata credentials plugin.

It turns out that the code was failing somewhere in here. Finally, with much help, I managed to track the error down to being a firewall restriction.









Google Cloud Platform – Using Cloud Functions

In this post and this post I wrote about how you might create a basic Azure function. In this post, I’ll do the same thing using the Google Cloud Platform (GCP).

Google Cloud Functions

This is what Google refer to as their serverless function offering. The feature is, at the time of writing, still in beta, and only allows JavaScript functions (although, after the amount of advertising they have been doing on .NET Rocks, I did look for a while for the C# switch).

As with many of the Google features, you need to enable functions for the project that you’re in:

Once the API is enabled, you’ll have the opportunity to create a new function; doing so should present you with this screen:

Since the code has to be in JavaScript, you could use the following (which is a cut-down version of the default code):

exports.helloWorld = function helloWorld(req, res) {
    res.status(200).send('Test function');
};


Once you create the function, you’ll see it spin for a while before it declares that you’re ready to go:


In order to test that this works, simply navigate to the URL given earlier on:
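Alternatively, the handler can be smoke-tested locally, without deploying at all, by handing it minimal mock req and res objects (the mock below is my own sketch, not part of any GCP tooling):

```javascript
// The handler from above.
const helloWorld = function helloWorld(req, res) {
  res.status(200).send('Test function');
};

// Minimal mock response object - just enough to capture what was sent.
const res = {
  statusCode: null,
  body: null,
  status(code) { this.statusCode = code; return this; },
  send(body) { this.body = body; return this; }
};

helloWorld({}, res);
console.log(res.statusCode, res.body); // 200 Test function
```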