Tag Archives: GCP

Working with Multiple Cloud Providers – Part 3 – Linking Azure and GCP

This is the third and final post in a short series on linking up Azure with GCP (for Christmas). In the first post, I set-up a basic Azure function that updated some data in table storage, and then in the second post, I configured the GCP link from PubSub into BigQuery.

In this post, we’ll square this off by adapting the Azure function to post a message directly to PubSub; then we’ll call the Azure function with Santa’s data, and watch that appear in BigQuery. At least, that was my plan – but Microsoft had other ideas.

It turns out that Azure functions have a dependency on Newtonsoft.Json 9.0.1, and the GCP client libraries require 10+. So instead of being a 10-minute job on Boxing Day to link the two, it turned into a mammoth task. Obviously, I spent the first few hours searching for a way around this – surely other people have faced this, and there’s a redirect, setting, or way of banging the keyboard that makes it work? Turns out not.

The next idea was to experiment with contacting the Google server directly, as is described here. Unfortunately, you still need the Auth libraries.

Finally, I swapped out the function for a WebJob. WebJobs give you a little more flexibility, and have no hard dependencies. So, on with the show (albeit a little more involved than expected).

WebJob

In this post I described how to create a basic WebJob. Here, we’re going to do something similar. In our case, we’re going to listen for an Azure Service Bus Message, and then update the Azure Storage table (as described in the previous post), and call out to GCP to publish a message to PubSub.

Handling a Service Bus Message

We weren’t originally going to take this approach, but I found that WebJobs play much nicer with a Service Bus message than with trying to get them to fire on a specific endpoint. In terms of scalability, adding a queue in the middle can only be a good thing. We’ll square off the contactable endpoint at the end with a function that simply converts the endpoint call into a message on the queue. Here’s what the WebJob Program looks like:

public static void ProcessQueueMessage(
    [ServiceBusTrigger("localsantaqueue")] string message,
    TextWriter log,
    [Table("Delivery")] ICollector<TableItem> outputTable)
{
    log.WriteLine(message);

    // Parse the incoming message and default the table keys if they weren't supplied
    TableItem item = Newtonsoft.Json.JsonConvert.DeserializeObject<TableItem>(message);
    if (string.IsNullOrWhiteSpace(item.PartitionKey)) item.PartitionKey = item.childName.First().ToString();
    if (string.IsNullOrWhiteSpace(item.RowKey)) item.RowKey = item.childName;

    // Update Azure Table Storage
    outputTable.Add(item);

    // Publish the same item to GCP PubSub
    GCPHelper.AddMessageToPubSub(item).GetAwaiter().GetResult();

    log.WriteLine("DeliveryComplete Finished");
}

Effectively, this is the same logic as the function (obviously, we now have the GCPHelper, and we’ll come to that in a minute). First, here’s the code for the TableItem model:


[JsonObject(MemberSerialization.OptIn)]
public class TableItem : TableEntity
{
    [JsonProperty]
    public string childName { get; set; }
 
    [JsonProperty]
    public string present { get; set; }
}

As you can see, we need to decorate the members with specific serialisation instructions. The reason being that this model is being used by both GCP (which only needs what you see on the screen) and Azure (which needs the inherited properties).

GCPHelper

As described here, you’ll need to install the GCP client package into the WebJob project:

Install-Package Google.Cloud.PubSub.V1 -Pre

Here’s the helper code that I mentioned:

public static class GCPHelper
{
    public static async Task AddMessageToPubSub(TableItem toSend)
    {
        string jsonMsg = Newtonsoft.Json.JsonConvert.SerializeObject(toSend);

        // Point the GCP libraries at the credentials file deployed alongside the WebJob
        Environment.SetEnvironmentVariable(
            "GOOGLE_APPLICATION_CREDENTIALS",
            Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Test-Project-8d8d83hs4hd.json"));
        GrpcEnvironment.SetLogger(new ConsoleLogger());

        string projectId = "test-project-123456";
        TopicName topicName = new TopicName(projectId, "test");

        // Publish the serialised TableItem to the PubSub topic
        SimplePublisher simplePublisher =
            await SimplePublisher.CreateAsync(topicName);
        string messageId =
            await simplePublisher.PublishAsync(jsonMsg);
        await simplePublisher.ShutdownAsync(TimeSpan.FromSeconds(15));
    }
}

I detailed in this post how to create a credentials file; you’ll need to do that to allow the WebJob to be authorised. The Json file referenced above was created using that process.

Azure Config

You’ll need to create an Azure Service Bus queue (I’ve called mine localsantaqueue):

I would also download the Service Bus Explorer (I’ll be using it later for testing).
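Alternatively, you can push a test message onto the queue from code. The following is only a rough sketch using the older WindowsAzure.ServiceBus package; the connection string is a placeholder, and the JSON is just made-up data matching the TableItem model above:

// Requires the WindowsAzure.ServiceBus NuGet package (Microsoft.ServiceBus.Messaging)
string connectionString = "<your Service Bus connection string>";
QueueClient client = QueueClient.CreateFromConnectionString(connectionString, "localsantaqueue");

// The WebJob deserialises the body into a TableItem, so send matching JSON
string json = "{ 'childName': 'TestChild', 'present': 'Test Present' }";
client.Send(new BrokeredMessage(json));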

GCP Config

We already have a Dataflow job, a PubSub topic and a BigQuery table, so GCP should require no further configuration, except to ensure the permissions are correct.

The Service Account user (which I give more details of here) needs to have PubSub permissions. For now, we’ll make it an editor, although in this instance, it probably only needs to publish:

Test

We can do a quick test using the Service Bus Explorer and publish a message to the queue:

The ultimate test is that we can then see this in the BigQuery Table:

Lastly, the Function

This won’t be a completely function-free post, though. The last step is to create a function that adds a message to the queue:

[FunctionName("Function1")]
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")]HttpRequestMessage req,             
    TraceWriter log,
    [ServiceBus("localsantaqueue")] ICollector<string> queue)
{
    log.Info("C# HTTP trigger function processed a request.");
    var parameters = req.GetQueryNameValuePairs();
    string childName = parameters.First(a => a.Key == "childName").Value;
    string present = parameters.First(a => a.Key == "present").Value;
    // Note the $ so that childName and present are interpolated into the JSON payload
    string json = $"{{ 'childName': '{childName}', 'present': '{present}' }}";
    queue.Add(json);

    return req.CreateResponse(HttpStatusCode.OK);
}

So now we have an endpoint for our imaginary Xamarin app to call into.

Summary

Both GCP and Azure are relatively immature platforms for this kind of interaction. The GCP client libraries seem to be missing functionality (and GCP is still heavily weighted away from .Net). The Azure libraries (especially functions) seem to be in a pickle, too – with strange dependencies that make it very difficult to communicate outside of Azure. As a result, this task (which should have taken an hour or so) took a great deal of time, and completely unnecessarily so.

Having said that, it is clearly possible to link the two systems, if a little long-winded.

References

https://blog.falafel.com/rest-google-cloud-pubsub-with-oauth/

https://github.com/Azure/azure-functions-vs-build-sdk/issues/107

https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus

https://stackoverflow.com/questions/48092003/adding-to-a-queue-using-an-azure-function-in-c-sharp/48092276#48092276

Short Walks – GCP Credit Alerts

One of the things that is quite unnerving when you start using GCP is the billing. Unlike Azure (with your MSDN monthly credits), GCP just has a single promotional credit and that’s it; consequently, there’s no real automatic shut-off before they start actually charging you (unlike the regular mails that I get from MS telling me they’ve suspended my account until next month).

When you start messing around with BigTable and BigQuery, you can eat up tens, or even hundreds of pounds very quickly, and you might not even realise you’ve done it.

GCP does have a warning, and you can set it to e-mail you at certain intervals within a spending limit:

However, this doesn’t include credit by default. That is, if Google give you a credit to start with (for example, because you’re trying out GCP or, I imagine, if you load your account up beforehand) then that doesn’t get included in your alerts.

Credit Checkbox

There is a checkbox that allows you to switch this behaviour, so that these credit totals are included:

And now you can see how much of your credit you’ve used:

And even receive e-mail warnings:

Disable Billing

One other thing you can do is to disable billing:

Unfortunately, this works differently from Azure, and effectively suspends your project:

Working with Multiple Cloud Providers – Part 2 – Getting Data Into BigQuery

In this post, I described how we might attempt to help Santa and his delivery drivers to deliver presents to every child in the world, using the combined power of Google and Microsoft.

In this, the second part of the series (there will be one more), I’m going to describe how we might set-up a GCP pipeline that feeds that data into BigQuery (Google’s big data warehouse offering). We’ll first set up BigQuery, then the PubSub topic, and finally, we’ll set-up the dataflow, ready for Part 3, which will be joining the two systems together.

BigQuery

Once you navigate to the BigQuery section of the GCP console, you’ll be able to create a Dataset:

You can now set-up a new table. As this is an illustration, we’ll keep it as simple as possible, but you can see that this might be much more complex:

One thing to bear in mind about BigQuery, and cloud data storage in general, is that it often makes sense to de-normalise your data – storage is often much cheaper than CPU time.
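If you’d rather script this set-up than click through the console, the Google.Cloud.BigQuery.V2 NuGet package should be able to create the same objects. This is a sketch only: the dataset and table names are ones I’ve made up for illustration, and it assumes GOOGLE_APPLICATION_CREDENTIALS is set as described elsewhere in this series:

// Project ID as used throughout this series; dataset / table names are illustrative
BigQueryClient client = BigQueryClient.Create("test-project-123456");

client.CreateDataset("santa");

// Schema mirrors the TableItem model used in the Azure function
var schema = new TableSchemaBuilder
{
    { "childName", BigQueryDbType.String },
    { "present", BigQueryDbType.String }
}.Build();

client.CreateTable("santa", "delivery", schema);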

PubSub

Now that we have somewhere to put the data, we could simply have the Azure function write straight into BigQuery. However, we might then run into problems if the data flow suddenly spiked. For this reason, Google recommends the use of PubSub as a shock absorber.

Let’s create a PubSub topic. I’ve written in more detail on this here:
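For reference, the topic can also be created from code with the same beta client library that we’ll use in part three; a rough sketch (project and topic names as used later in the series, and again assuming the credentials environment variable is set):

// Create the "test" topic programmatically rather than through the console
PublisherClient publisher = PublisherClient.Create();
TopicName topicName = new TopicName("test-project-123456", "test");
publisher.CreateTopic(topicName);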

DataFlow

The last piece of the jigsaw is Dataflow. Dataflow can be used for much more complex tasks than to simply take data from one place and put it in another, but in this case, that’s all we need. Before we can set-up a new dataflow job, we’ll need to create a storage bucket:

We’ll create the bucket as Regional for now:

Remember that the bucket name must be unique (so no-one can ever pick pcm-data-flow-bucket again!)

Now, we’ll move onto the DataFlow itself. We get a number of dataflow templates out of the box; and we’ll use one of those. Let’s launch dataflow from the console:

Here we create a new Dataflow job:

We’ll pick “PubSub to BigQuery”:

You’ll then get asked for the name of the topic (which was created earlier) and the storage bucket (again, created earlier); your form should look broadly like this when you’re done:

I strongly recommend specifying a maximum number of workers, at least while you’re testing.

Testing

Finally, we’ll test it. PubSub allows you to publish a message:

Next, visit the Dataflow to see what’s happening:

Looks interesting! Finally, in BigQuery, we can see the data:

Summary

We now have the two separate cloud systems functioning independently. Step three will be to join them together.

Working with Multiple Cloud Providers – Part 1 – Azure Function

Regular readers (if this blog has such a thing) may have noticed that I’ve recently been writing a lot about two main cloud providers. I won’t link to all the articles, but if you’re interested, a quick search for either Azure or Google Cloud Platform will yield several results.

Since it’s Christmas, I thought I’d do something a bit different and try to combine them. This isn’t completely frivolous; both have advantages and disadvantages: GCP is very geared towards big data, whereas the Azure Service Fabric provides a lot of functionality that might fit well with a much smaller LOB app.

So, what if we had the following scenario:

Santa has to deliver presents to every child in the world in one night. Santa is only one man* and Google tells me there are 1.9B children in the world, so he contracts out a series of delivery drivers. There need to be around 79M deliveries every hour; let’s assume that each delivery driver can work 24 hours**. If each driver can make, say, 100 deliveries per hour, that means we need around 790,000 drivers. Every delivery driver has an app that links to their depot, recording deliveries, schedules, etc.

That would be a good app to write in, say, Xamarin, and maybe have an Azure service running it; here’s the obligatory box diagram:

The service might talk to the service bus, might control stock, send e-mails, all kinds of LOB jobs. Now, I’m not saying for a second that Azure can’t cope with this, but what if we suddenly want all of these instances to feed metrics into a single data store? There are 190*** countries in the world; if each has a depot, then there are ~416K messages / hour going into each Azure service, but 79M / hour going into a single DB. Because it’s Christmas, let’s assume that Azure can’t cope with this, or that GCP is a little cheaper at this scale, or that we have some Hadoop jobs that we’d like to run on the data. In theory, we can link these systems, which might look something like this:

So, we have multiple instances of the Azure architecture, and they all feed into a single GCP service.
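For reference, the back-of-an-envelope numbers from the scenario above, as a quick sketch:

// Back-of-an-envelope arithmetic for the scenario above
double childrenInTheWorld = 1900000000d;
double deliveriesPerHour = childrenInTheWorld / 24;        // ~79M deliveries / hour
double driversNeeded = deliveriesPerHour / 100;            // ~790,000 drivers at 100 deliveries / hour each
double messagesPerDepotPerHour = deliveriesPerHour / 190;  // ~416K messages / hour per depot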

Disclaimer

At no point during this post will I attempt to publish 79M records / hour to GCP BigQuery. Neither will any Xamarin code be written or demonstrated – you have to use your imagination for that bit.

Proof of Concept

Given the disclaimer I’ve just made, calling this a proof of concept seems a little disingenuous; but let’s imagine that we know that the volumes aren’t a problem and concentrate on how to link these together.

Azure Service

Let’s start with the Azure Service. We’ll create an Azure function that accepts a HTTP message, updates a DB and then posts a message to Google PubSub.

Storage

For the purpose of this post, let’s store our individual instance data in Azure Table Storage. I might come back at a later date and work out how and whether it would make sense to use CosmosDB instead.

We’ll set-up a new table called Delivery:
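If you’d rather create the table from code than through the portal, a minimal sketch using the WindowsAzure.Storage package should do it (the connection string is a placeholder):

// Requires the WindowsAzure.Storage NuGet package (Microsoft.WindowsAzure.Storage.Table)
CloudStorageAccount account = CloudStorageAccount.Parse("<your storage connection string>");
CloudTableClient tableClient = account.CreateCloudTableClient();

// Creates the Delivery table if it doesn't already exist
CloudTable table = tableClient.GetTableReference("Delivery");
table.CreateIfNotExists();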

Azure Function

Now we have somewhere to store the data, let’s create an Azure Function App that updates it. In this example, we’ll create a new Function App from VS:

In order to test this locally, change local.settings.json to point to your storage location described above.
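A minimal local.settings.json might look broadly like this (connection strings redacted); the santa_azure_table_storage key is the Connection name used in the binding below:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage connection string>",
    "AzureWebJobsDashboard": "<storage connection string>",
    "santa_azure_table_storage": "<storage connection string>"
  }
}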

And here’s the code to update the table:


    public static class DeliveryComplete
    {
        [FunctionName("DeliveryComplete")]
        public static HttpResponseMessage Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)]HttpRequestMessage req, 
            TraceWriter log,            
            [Table("Delivery", Connection = "santa_azure_table_storage")] ICollector<TableItem> outputTable)
        {
            log.Info("C# HTTP trigger function processed a request.");
 
            // parse query parameter
            string childName = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "childName", true) == 0)
                .Value;
 
            string present = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "present", true) == 0)
                .Value;            
 
            var item = new TableItem()
            {
                childName = childName,
                present = present,                
                RowKey = childName,
                PartitionKey = childName.First().ToString()                
            };
 
            outputTable.Add(item);            
 
            return req.CreateResponse(HttpStatusCode.OK);
        }
 
        public class TableItem : TableEntity
        {
            public string childName { get; set; }
            public string present { get; set; }
        }
    }

Testing

There are two ways to test this: the first is to just press F5, which will launch the function as a local service that you can test with Postman or similar; the alternative is to deploy to the cloud. If you choose the latter, then your local.settings.json will not come with you, so you’ll need to add an app setting:

Remember to save this setting, otherwise, you’ll get an error saying that it can’t find your setting, and you won’t be able to work out why – ask me how I know!

Now, if you run a test …
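For example, a quick test call from code might look something like this (the URL is the local function host default; swap in the deployed address and function key once published):

using (var client = new HttpClient())
{
    // Local function host default address; add ?code=<function key> for the deployed version
    string url = "http://localhost:7071/api/DeliveryComplete?childName=Dave&present=Bike";

    HttpResponseMessage response = await client.PostAsync(url, null);
    Console.WriteLine(response.StatusCode);
}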

You should be able to see your table updated (shown here using Storage Explorer):

Summary

We now have a working Azure function that updates a storage table with some basic information. In the next post, we’ll create a GCP service that pipes all this information into BigQuery and then link the two systems.

Footnotes

* Remember, all the guys in Santa suits are just helpers.
** That brandy you leave out really hits the spot!
*** I just Googled this – it seems a bit low to me, too.

References

https://docs.microsoft.com/en-us/azure/azure-functions/functions-how-to-use-azure-function-app-settings#manage-app-service-settings

https://anthonychu.ca/post/azure-functions-update-delete-table-storage/

https://stackoverflow.com/questions/44961482/how-to-specify-output-bindings-of-azure-function-from-visual-studio-2017-preview

A C# Programmer’s Guide to Google Cloud Pub Sub Messaging

The Google Cloud Platform provides a Publish / Subscribe system called ‘PubSub’. In this post I wrote a basic guide on setting up RabbitMQ, and here I wrote about ActiveMQ. In this post I wrote about using the Azure messaging system. Here, I’m going to give an introduction to using the GCP PubSub system.

Introduction

The above systems that I’ve written about in the past are fully featured (yes, including Azure) message bus systems. While the GCP offering is a Message Bus system of sorts, it is definitely lacking some of the features of the other platforms. I suppose this stems from the fact that, in the GCP case, it serves a specific purpose, and is heavily geared toward that purpose.

The other messaging systems also offer the Pub / Sub model: the idea being that you create a topic, and anyone that’s interested can subscribe to it. Once you’ve subscribed, you’re guaranteed* to get at least one delivery of the published message. You can also, kind of, simulate a message queue, because more than one subscriber can take messages from a single subscription.

Pre-requisites

If you want to follow along with the post, you’ll need to have a GCP subscription, and a GCP project configured.

Topics

In order to set-up a new topic, we’re going to navigate to the PubSub menu in the console (you may be prompted to Enable PubSub when you arrive).

As you can see, you’re inundated with choice here. Let’s go for “Create a topic”:

Cloud Shell

You’ve now created a topic; however, that isn’t the only way to do this. Google are big on the Cloud Shell, so you can also create a topic from there; to do so, select the Cloud Shell icon:

Once you get the cloud shell, you can use the following command**:

gcloud beta pubsub topics create "test"

Subscriptions and Publishing

You can publish a message now if you like; either from the console:

Or from the Cloud Shell:

gcloud beta pubsub topics publish "test" "message"

Both will successfully publish a message that will get delivered to all subscribers. The problem is that you haven’t created any subscribers yet, so it just dissipates into the ether***.

You can see there are no subscriptions, because the console tells you****:

Let’s create one:

Again, you can create a subscription from the cloud shell:

gcloud beta pubsub subscriptions create "mysubscription" --topic "test"

So, we now have a subscription, and a message.

Consuming messages

In order to consume messages in this instance, let’s create a little cloud function. I’ve previously written about creating these here. Instead of creating a HTTP trigger, this time, we’re going to create a function that reacts to something on a cloud Pub/Sub topic:

Select the relevant topic; the default code just writes the text out to the console, so that’ll do:

/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 *
 * @param {!Object} event The Cloud Functions event.
 * @param {!Function} The callback function.
 */
exports.subscribe = function subscribe(event, callback) {
  // The Cloud Pub/Sub Message object.
  const pubsubMessage = event.data;

  // We're just going to log the message to prove that
  // it worked.
  console.log(Buffer.from(pubsubMessage.data, 'base64').toString());

  // Don't forget to call the callback.
  callback();
};

So, now we have a subscription:

Let’s see what happens when we artificially push a message to it.

If we now have a look at the Cloud Function, we can see that something has happened:

And if we select “View Logs”, we can see what:

It worked! Next…

Create Console App

Now we have something that will react to a message, let’s try and generate one programmatically, in C# from a console app. Obviously the first thing to do is to install a NuGet package that isn’t past the beta stage yet:

Install-Package Google.Cloud.PubSub.V1 -Pre

Credentials

In this post I described how you might create a credentials file. You’ll need to do that again here (and, I think, anywhere that you want to access GCP from outside of the cloud).

In APIs & Services, select “Create credentials”:

Again, select a JSON file:

The following code publishes a message to the topic:


static async Task Main(string[] args)
{
    Environment.SetEnvironmentVariable(
        "GOOGLE_APPLICATION_CREDENTIALS",
        Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "my-credentials-file.json"));
 
    GrpcEnvironment.SetLogger(new ConsoleLogger());
 
    // Instantiates a client
    PublisherClient publisher = PublisherClient.Create();
 
    string projectId = "test-project-123456";
    var topicName = new TopicName(projectId, "test");
 
    SimplePublisher simplePublisher = await SimplePublisher.CreateAsync(topicName);
    string messageId = await simplePublisher.PublishAsync("test message");
    await simplePublisher.ShutdownAsync(TimeSpan.FromSeconds(15));
}
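As an aside, the same beta package can also pull messages directly from C#, rather than via a cloud function. This is only a rough sketch, assuming the credentials environment variable is set as above and using the subscription created earlier:

// Pull from the subscription created earlier in this post
SubscriptionName subscriptionName = new SubscriptionName("test-project-123456", "mysubscription");

SimpleSubscriber simpleSubscriber = await SimpleSubscriber.CreateAsync(subscriptionName);
await simpleSubscriber.StartAsync((message, cancellationToken) =>
{
    // Just log the payload to prove it arrived
    Console.WriteLine(message.Data.ToStringUtf8());

    // Stop listening after the first message (non-blocking)
    simpleSubscriber.StopAsync(TimeSpan.FromSeconds(15));
    return Task.FromResult(SimpleSubscriber.Reply.Ack);
});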

And we can see that message in the logs of the cloud function:

Permissions

Unless you choose otherwise, the service account will look something like this:

The Editor permission that it gets by default is a sort of God permission. This can be fine-grained by removing that, and selecting specific permissions; in this case, Pub/Sub -> Publisher. It’s worth bearing in mind that as soon as you remove all permissions, the account is removed, so try to maintain a single permission (project browser seems to be suitably innocuous).

Footnotes

* Google keeps messages for up to 7 days, so the guarantee has a time limit.

** gcloud may need to be initialised. If it does then:

gcloud init
gcloud components install beta

*** This is a big limitation. Whilst all topic subscriptions in other systems do work like this, in those systems, you have the option of a queue – i.e. a place for messages to live that no-one is listening for.

**** If you create a subscription in the Cloud Shell, it will not show in the console until you F5 (there may be a timeout, but I didn’t wait that long). The problem here is that F5 messes up the shell window.

References

https://cloud.google.com/pubsub/docs/reference/libraries

https://cloud.google.com/iam/docs/understanding-roles

Google Cloud Datastore – Setting up a new Datastore and accessing it from a console application

Datastore is a NoSQL offering from Google. It’s part of their Google Cloud Platform (GCP). The big mind shift, if you’re used to a relational database, is to remember that each row (although they aren’t really rows) in a table (they aren’t really tables) can be different. The best way I could think about it was a text document; each line can have a different number of words, numbers and symbols.

However, just because it isn’t relational, doesn’t mean you don’t have to consider the structure; in fact, it actually seems to mean that there is more onus on the designer to consider what and where the data will be used.

Pre-requisites

In order to follow this post, you’ll need an account on GCP, and a Cloud Platform Project.

Set-up a New Cloud Datastore

The first thing to do is to set-up a new Datastore:

Zones

The next step is to select a Zone. The big thing to consider, in terms of cost and speed, is to co-locate your data where possible. Specifically with data, you’ll incur egress charges (that is, you’ll be charged as your data leaves its zone), so your zone should be nearby, and co-located with anything that accesses it. Obviously, in this example, you’re accessing the data from where your machine is located, so pick a zone that is close to where you live.

In Britain, we’re in Europe-West-2:

Entities and Properties

The next thing is to set up a new entity. As we said, an entity is loosely analogous to a row, and its kind to a table.

Now we have an entity, it needs some properties. A property, again, is loosely analogous to a field (if fields were not required to be consistent throughout the table). I’m unsure how this works behind the scenes, but it appears to simply null out the columns that have no value; I suspect this may be a visual display thing.

You can set the value (as above), and then query the data, either in a table format (as below):

Or, you can use a SQL like syntax (as below).

Credentials

In order to access the datastore from outside the GCP, you’ll need a credentials file. You’ll need to start off in the Credentials screen:

In this instance, we’ll set-up a service account key:

This creates the key as a json file:

The file should look broadly like this:

{
  "type": "service_account",
  "project_id": "my-project-id",
  "private_key_id": "private_key_id",
  "private_key": "-----BEGIN PRIVATE KEY-----\nkeydata\n-----END PRIVATE KEY-----\n",
  "client_email": "my-project-id@appspot.gserviceaccount.com",
  "client_id": "clientid",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/my-project-id%40appspot.gserviceaccount.com"
}

Keep hold of this file, as you’ll need it later.

Client Library

There is a .Net client library provided for accessing this functionality from your website or desktop app. What we’ll do next is access that entity from a console application. The obvious first step is to create one:

Credentials again

Remember that credentials file I said to hang on to? Well, now you need it. It needs to be accessible from your application; there are a number of ways to address this, and the one I’m demonstrating here is probably not a sensible solution in real life, but for the purposes of testing, it works fine.

Copy the credentials file into your project directory and include it in the project, then, set the properties to:

Build Action: None
Copy to Output Directory: Copy if Newer

GCP Client Package

You’ll need to install the correct NuGet package:

Install-Package Google.Cloud.Datastore.V1

Your Project ID

As you use GCP more, you’ll come to appreciate that the project ID is very important. You’ll need to make a note of it (if you can’t find it, simply select Home from the hamburger menu):

The Code

All the pieces are now in place, so let’s write some code to access the datastore:

Environment.SetEnvironmentVariable(
    "GOOGLE_APPLICATION_CREDENTIALS",
    Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "my-credentials-file.json"));
 
GrpcEnvironment.SetLogger(new ConsoleLogger());
 
// Your Google Cloud Platform project ID
string projectId = "my-project-id";
 
DatastoreClient datastoreClient = DatastoreClient.Create();
 
DatastoreDb db = DatastoreDb.Create(projectId, "TestNamespace", datastoreClient);

string kind = "MyTest";

string name = "newentitytest3";
KeyFactory keyFactory = db.CreateKeyFactory(kind);
Key key = keyFactory.CreateKey(name);
 
var task = new Entity
{
    Key = key,
    ["test1"] = "Hello, World",
    ["test2"] = "Goodbye, World",
    ["new field"] = "test"
};
 
using (DatastoreTransaction transaction = db.BeginTransaction())
{                
    transaction.Upsert(task);
    transaction.Commit();
}
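To confirm the upsert from code, you can also query the kind back out; a minimal sketch using the same DatastoreDb instance as above:

// Query everything of the "MyTest" kind back out of the TestNamespace
Query query = new Query(kind);
DatastoreQueryResults results = db.RunQuery(query);

foreach (Entity entity in results.Entities)
{
    Console.WriteLine((string)entity["test1"]);
}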

If you now check, you should see that your Datastore has been updated:

There are a few things to note here; the first is that you will need to select the right Namespace and Kind. The Namespace defaults to [default], so you won’t see your new records until you select the one used in the code (TestNamespace above).

When things go wrong

The above instructions are deceptively simple; however, getting this example working was, by no means, straightforward. Fortunately, when you have a problem with GCP and you ask on Stack Overflow, you get answered by Jon Skeet. The following is a summary of an error that I encountered.

System.InvalidOperationException

System.InvalidOperationException: ‘The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.’

The error occurred on the BeginTransaction line.

The ConsoleLogger above isn’t just there for show, and does give some additional information; in this case:

D1120 17:59:00.519509 Grpc.Core.Internal.UnmanagedLibrary Attempting to load native library "C:\Users\pmichaels\.nuget\packages\grpc.core\1.4.0\lib\netstandard1.5../..\runtimes/win/native\grpc_csharp_ext.x64.dll"
D1120 17:59:00.600298 Grpc.Core.Internal.NativeExtension gRPC native library loaded successfully.
E1120 17:59:02.176461 0 C:\jenkins\workspace\gRPC_build_artifacts\platform\windows\workspace_csharp_ext_windows_x64\src\core\lib\security\credentials\plugin\plugin_credentials.c:74: Getting metadata from plugin failed with error: Exception occured in metadata credentials plugin.

It turns out that the code was failing somewhere in here. Finally, with much help, I managed to track the error down to a firewall restriction.

References

https://cloud.google.com/datastore/docs/reference/libraries

https://cloud.google.com/datastore/docs/concepts/entities?hl=en_US&_ga=2.55488619.-171635733.1510158034

https://cloud.google.com/dotnet/

https://github.com/GoogleCloudPlatform/dotnet-docs-samples

https://developers.google.com/identity/protocols/application-default-credentials

https://cloud.google.com/datastore/docs/concepts/overview

https://cloud.google.com/datastore/docs/reference/libraries

Google Cloud Platform – Using Cloud Functions

In this post and this post I wrote about how you might create a basic Azure function. In this post, I’ll do the same thing using the Google Cloud Platform (GCP).

Google Cloud Functions

This is what Google refer to as their serverless function offering. The feature is, at the time of writing, still in Beta, and only allows JavaScript functions (although after the amount of advertising they have been doing on Dot Net Rocks, I did look for a while for the C# switch).

As with many of the Google features, you need to enable functions for the project that you’re in:

Once the API is enabled, you’ll have the opportunity to create a new function; doing so should present you with this screen:

Since the code has to be in JavaScript, you could use the following (which is a cut-down version of the default code):

exports.helloWorld = function helloWorld(req, res) {
    
    console.log(req.body.message);
    res.status(200).send('Test function');

};

Once you create the function, you’ll see it spin for a while before it declares that you’re ready to go:

Testing

In order to test that this works, simply navigate to the URL given earlier on:
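Since the function reads req.body.message, it’s worth posting a JSON body rather than just browsing to the URL; a quick sketch in C# (the URL here is a placeholder for the trigger URL shown in the console):

using (var client = new HttpClient())
{
    var content = new StringContent("{ \"message\": \"Hello from C#\" }", Encoding.UTF8, "application/json");

    // Placeholder - use the trigger URL from the function's Trigger tab
    var response = await client.PostAsync("https://<region>-<project-id>.cloudfunctions.net/helloWorld", content);
    Console.WriteLine(await response.Content.ReadAsStringAsync());
}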

References

https://cloud.google.com/functions/docs/writing/http