8th September 2022
We recently did five mock deployments on Google Cloud Platform using different methods in an effort to understand the nuances associated with each method.
The five methods we investigated were:

- Google Compute Engine (GCE)
- Google Kubernetes Engine (GKE)
- Google App Engine (GAE)
- Cloud Run
- Cloud Functions

In the previous articles, we deployed the same application using each of these five approaches, went in-depth on the most appealing features of each platform, and identified its pros and cons. Below you will find a summary of our findings from those five articles.
Google Compute Engine (GCE)
The go-to IaaS (Infrastructure as a Service) solution for compute instances on Google Cloud Platform. It can scale almost without limit, as long as the operations team configures it to do so, and it lets users configure their instances however they want and install whatever software they need.
One of the key features of GCE is Managed Instance Groups, which let users manage sets of instances that share the same configuration. Combined with machine images, they make auto-scaling straightforward and keep instances consistent.
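As a rough sketch of how a Managed Instance Group with auto-scaling might be set up (the template name, image, zone, and thresholds below are illustrative, not from the original series):

```shell
# Create an instance template that every instance in the group will share.
gcloud compute instance-templates create web-template \
    --machine-type=e2-medium \
    --image-family=debian-12 \
    --image-project=debian-cloud

# Create a Managed Instance Group of two identically configured instances.
gcloud compute instance-groups managed create web-mig \
    --template=web-template \
    --size=2 \
    --zone=us-central1-a

# Let the group scale out to up to 10 instances when average CPU passes 60%.
gcloud compute instance-groups managed set-autoscaling web-mig \
    --zone=us-central1-a \
    --max-num-replicas=10 \
    --target-cpu-utilization=0.6
```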
GCE is a good option for applications, or sets of applications, that run 24/7 and need instant responses. It can also be configured for huge applications built on a microservice architecture, although it is not the best choice for that use case. It is also the most familiar service for people who already work with traditional IT infrastructure.
GCE is not the best option for applications that run in short spikes or for brief periods of time. It is also an unmanaged solution, which requires SysAdmin knowledge from the team that will operate it.
Google Kubernetes Engine (GKE)
GKE (Google Kubernetes Engine) is the CaaS (Container as a Service) solution provided by Google Cloud Platform. It allows users to easily run Docker containers in a fully-managed Kubernetes environment. It takes care of the automation, management, and deployment of containers, their networking, storage and resources.
One of GKE’s most important advantages is that it leverages Kubernetes features and exposes them in its control panel. It gets even better with Autopilot, an operation mode in which Google manages the entire underlying infrastructure, including nodes and node pools.
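As an illustration, creating an Autopilot cluster and pointing kubectl at it takes only a couple of commands (the cluster name, region, and sample image here are placeholders, not from the original series):

```shell
# Create an Autopilot cluster; GKE manages the nodes and node pools for you.
gcloud container clusters create-auto demo-cluster \
    --region=us-central1

# Fetch credentials so kubectl can talk to the new cluster.
gcloud container clusters get-credentials demo-cluster \
    --region=us-central1

# From here, standard Kubernetes workflows apply, e.g.:
kubectl create deployment hello \
    --image=us-docker.pkg.dev/cloudrun/container/hello
```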
It is one of the best options for applications with complex architectures, such as distributed microservices exchanging data with many different places. It is highly available by nature and gives the user a lot of control over the cluster’s resources through the control panel.
GKE is not the best option for teams without anyone who understands Kubernetes concepts, though that is not a huge blocker. It is also more expensive than running the same set of applications on GCE, but the cost difference can be worth it in some cases.
Google App Engine (GAE)
One of the fully managed, serverless solutions provided by Google Cloud Platform. It allows users to build monolithic, “server-side” rendered websites, and it supports popular development languages along with a range of developer tools. It takes care of server and deployment management so the user can focus on the code.
As detailed in the previous article, App Engine has two different environments, standard and flexible. We detailed some of their features that are crucial for users to consider when choosing which environment to use. To recap, the standard environment only allows specific versions of the supported languages to run, as opposed to the flexible environment, which uses Docker containers and therefore can run any version of the supported languages.
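To make the difference concrete, here is a minimal sketch of the two configurations (the runtime version is illustrative, not from the original series):

```shell
# Standard environment: app.yaml pins the app to a supported runtime version.
cat > app.yaml <<'EOF'
runtime: python39
EOF

# Flexible environment: the same file would instead opt into Docker-based
# instances, e.g.:
#   runtime: custom
#   env: flex

# Either way, deployment is a single command run next to app.yaml:
gcloud app deploy
```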
Google App Engine is a good option for applications that expect a lot of traffic and/or receive huge traffic spikes. It is also a good option for users that want to go serverless and have their applications written in one of the supported languages: Python, Java, Node.js, Go, Ruby, PHP, or .NET. It also has a free tier, which can be a very appealing feature for small applications.
App Engine is not the best option for applications that run 24/7, nor for huge distributed applications using the microservices architecture.
Cloud Run
Cloud Run is the “Container to Production in seconds” solution that Google Cloud Platform offers. It allows users to deploy and run request-serving containers, or even short jobs, on a fully managed, serverless platform.
It stands out for its tight integration with other GCP services, its use of Docker images as the smallest unit of deployment, and its scalability. The pricing model is simple and predictable, which makes it a good platform to run applications on.
Cloud Run is a good solution for serverless applications written in pretty much any language, as it uses Docker under the hood. It also supports the microservice architecture, given its scalability and Docker-based runtime environment. Its Cloud Run Jobs feature is also a good option for short-lived tasks with specific use cases.
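As a sketch, deploying a service and running a short-lived job look like this (the service and job names are placeholders; the images are Google’s public sample containers):

```shell
# Deploy a request-serving container as a fully managed Cloud Run service.
gcloud run deploy hello-service \
    --image=us-docker.pkg.dev/cloudrun/container/hello \
    --region=us-central1 \
    --allow-unauthenticated

# Cloud Run Jobs: run a container to completion instead of serving requests.
gcloud run jobs create nightly-task \
    --image=us-docker.pkg.dev/cloudrun/container/job \
    --region=us-central1 \
    --tasks=1

# Trigger an execution of the job on demand.
gcloud run jobs execute nightly-task --region=us-central1
```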
Cloud Run is not the best option for teams that need control over node allocation and networking features, since it’s fully managed. It is also not very useful for applications with background tasks.
Cloud Functions
Cloud Functions lets users run their code with no servers or containers to maintain, as scalable, pay-as-you-go Functions as a Service (FaaS) on Google Cloud Platform.
It is a great platform because it runs code in a fully managed environment, in addition to supporting various languages and even Docker container images. It lacks some networking and provisioning configuration options, but that is inherent to the serverless model itself.
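For example, deploying an HTTP-triggered function is a single command (the function name, entry point, runtime, and region below are illustrative, not from the original series):

```shell
# Deploy an HTTP-triggered function; Google provisions and scales it.
gcloud functions deploy hello-http \
    --runtime=python39 \
    --trigger-http \
    --entry-point=hello_http \
    --region=us-central1 \
    --allow-unauthenticated
```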
Cloud Functions is a good option for exposing snippets of code that integrate with third-party solutions, or for common functions used by various services that can be invoked through events.
Cloud Functions is not the best option for running services with a lot of business logic or for big monolithic systems, as each endpoint has to be exposed as a separate function.
With all this information on GCP and its products/platforms, we end our series of articles. We hope the knowledge shared in these articles helps you decide which platform to use when running your next application on GCP.