
Deploying with Cloud Functions on Google Cloud Platform

Part 5 of a 6 Part Series on Deploying on Google Cloud Platform

We recently did five mock deployments on Google Cloud Platform using different methods in an effort to understand the nuances associated with each method.

We have examined the five methods one per week over the last five weeks, and next we will wrap the series up with an article that compares our findings across all five approaches.

For our fifth article in the series, we will be deploying an application on Google Cloud Platform using Cloud Functions.

Cloud Functions Approach

Cloud Functions is a serverless execution environment for building and connecting cloud services. It lets you attach simple, single-purpose functions to events emitted from other parts of your cloud infrastructure and run meaningful processing in response to those events.

The code executes in a fully managed environment, removing the need to provision pieces of infrastructure or worry about managing servers.

Cloud Functions can be written in Node.js, Python, Go, Java, .NET, Ruby and PHP, each running in a language-specific runtime environment. They are not limited to event-driven processing: they can also be triggered by HTTP requests, with the benefit of Google managing the TLS certificates for the functions.

More Cloud Functions features can be found in the official Cloud Functions documentation.

When to use Cloud Functions

Cloud Functions is a very versatile environment. For example, when you need to run simple, single-purpose code in production without worrying about where that code runs, Cloud Functions can handle it.

No server configuration is required. It is a good option for small applications that don't necessarily need to connect to other applications, and a wise choice when the team doesn't have much sysadmin experience, since the platform is serverless.

Another interesting use-case for Cloud Functions is when the application needs to react to events from other GCP services, such as Cloud Storage or Cloud Pub/Sub.

And, finally, it’s a good choice for applications that do not need to run all day on a server.

How we implemented it

For this solution we used a Terraform project that declares the following resources:

  • A VPC with a subnet for connecting Redis and the Cloud Functions
  • A bucket to store the application sources
  • Two functions exposed as GetCounters and SetCounter

The project contains a simple Counter Application consuming a Redis Memory Store.
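The counter logic itself can be sketched independently of Redis. In the sketch below an in-memory map stands in for the Memorystore instance; this stand-in, and the method names mirroring the two deployed functions, are assumptions for illustration, since the real application would issue the equivalent INCR and GET commands through a Redis client.

```go
package main

import (
	"fmt"
	"sync"
)

// store stands in for the Redis Memorystore instance; the deployed
// functions would perform these operations through a Redis client.
type store struct {
	mu       sync.Mutex
	counters map[string]int
}

// SetCounter increments a named counter, mirroring the SetCounter function.
func (s *store) SetCounter(name string) int {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.counters[name]++
	return s.counters[name]
}

// GetCounters returns a copy of all counters, mirroring the GetCounters function.
func (s *store) GetCounters() map[string]int {
	s.mu.Lock()
	defer s.mu.Unlock()
	out := make(map[string]int, len(s.counters))
	for k, v := range s.counters {
		out[k] = v
	}
	return out
}

func main() {
	s := &store{counters: map[string]int{}}
	s.SetCounter("home")
	s.SetCounter("home")
	fmt.Println(s.GetCounters()["home"]) // prints 2
}
```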

Main Components

Redis

Here is the Terraform block declaring the Redis instance.

Plain Text
resource "google_redis_instance" "data" {
  name               = "${var.app_name}-redis"
  region             = var.region
  tier               = "BASIC"
  memory_size_gb     = var.redis_memory_size_gb
  # Reference to the VPC declared elsewhere in the project;
  # the resource name here is assumed.
  authorized_network = google_compute_network.vpc.id
  connect_mode       = "PRIVATE_SERVICE_ACCESS"
}

Connector

Here we declare the connector that will be used to allow the application to access the Redis instance.

Plain Text
resource "google_vpc_access_connector" "connector" {
  provider = google-beta
  name     = "${var.app_name}-conn"
  region   = var.region
  project  = var.project_id
  subnet {
    # Subnet declared with the VPC; the resource name here is assumed.
    name = google_compute_subnetwork.subnet.name
  }
}

Cloud Functions

In the declaration of a Cloud Function (in this case GetCounters) we set the runtime, the source bucket and archive, the environment variables and the entry point. For Go Cloud Functions the artifact is deployed as a package rather than a standalone binary, and the entry point is an exported function that handles the incoming requests.

Plain Text
resource "google_cloudfunctions_function" "get_counters" {
  name                  = format("%s-%s", var.app_name, "get-counters")
  runtime               = "go116"
  # The bucket and archive-object resource names here are assumed.
  source_archive_bucket = google_storage_bucket.sources.name
  source_archive_object = google_storage_bucket_object.archive.name
  trigger_http          = true
  entry_point           = "GetCounters"
  vpc_connector         = google_vpc_access_connector.connector.id
  environment_variables = {
    REDIS_ADDR = "${google_redis_instance.data.host}:${google_redis_instance.data.port}"
  }
  # The service-account resource name here is assumed.
  service_account_email = google_service_account.function.email
}

Other components

The complete solution, with the remaining components, can be found in our repository.

Pros

  • Deploy code to production quickly and easily
  • Google-managed TLS certificates for HTTPS requests
  • Predictable pricing, charged by resources used rather than instance types
  • No worries about server upgrades and OS security breaches
  • No need to worry about autoscaling, since it scales automatically
  • Simple pricing model: you pay for the time your code is running

Cons

  • Limited networking and provisioning configuration
  • Supports only a fixed set of languages, which limits the use-cases
  • Can add complexity to the application, making the architecture harder to understand
  • Vendor lock-in, since each cloud provider's serverless solution has its own rules
  • Can be difficult to keep track of all the Cloud Functions running
