
What is serverless computing?

Serverless computing allows you to run your application without having to manage servers or worry about the underlying infrastructure. Serverless computing is on the rise. The possibilities were already there, but because the serverless landscape was fragmented across, among others, AWS Lambda, Google App Engine and Azure Functions, there was no single standard and many developers were wary of cloud lock-in. With the launch of Knative in mid-2018, a standard seems to be emerging and the adoption of serverless is gaining momentum. Knative makes it possible to run application code independently of the underlying infrastructure or platform, so you are no longer tied to one cloud provider.

Thanks to Knative, serverless comes within reach of a larger audience. Because Knative does not tie you to a single provider with its own methods and peculiarities, it has been embraced by many parties in a short time. Google, IBM, SAP, Red Hat and Pivotal are contributing to its development, and GitLab announced in December 2018 that it would provide support for serverless deployments with Knative.

Co-initiator Pivotal describes Knative as follows:

“A simpler way for developers to deploy and run serverless apps and functions atop Kubernetes and Istio.”

We expect that the adoption of serverless computing will rise in 2019 and that Knative will prove itself as a standard in serverless computing. There are many use cases where serverless computing leads to higher efficiency and better use of resources.

How is serverless different from containers?

Knative is middleware for Kubernetes: it extends Kubernetes' functionality and adds a layer between your application and the cluster. Given that you can build a container and roll it out on Kubernetes in no time, you might wonder what the added value of serverless computing is, especially since serverless applications are, under the hood, just containers as well. How serverless differs from container orchestration quickly becomes clear once you know the basic concepts of both.

With container orchestration on Kubernetes, you are responsible for (at least part of) the design of the infrastructure. For example, you need to make sure that the application is reachable from outside, scales up automatically, restarts in the event of problems and has its container upgrades performed correctly.

In the case of serverless computing, you no longer have to worry about that infrastructure. Knative takes care of it for you and introduces three new concepts that offer all the flexibility and control needed to run your application. These concepts in a nutshell:

  • Building
    • With built-in functionality it is possible to build Docker images directly from a GitHub repository. Using Cloud Foundry buildpacks, an image is created automatically and then rolled out.
  • Serving
    • A request-driven platform that starts, scales and routes traffic to the correct revision of your application. Knative provides enough replicas of your application to process the workload and scales it down to zero when it is inactive. Because the system is request-driven, your application starts automatically and quickly when needed (a minimal example of such a service is sketched after this list).
  • Eventing
    • Messaging brokers are indispensable for multi-cloud, IoT and many other workloads. Any desired event can trigger your application to start or perform a function.
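
To make the Serving concept concrete, below is a minimal sketch, in Go, of the kind of stateless HTTP container that Knative Serving can start on demand, scale out under load and scale back to zero when idle. Knative passes the port to listen on via the PORT environment variable; the fallback to 8080 is only there for local testing.

    // main.go - minimal HTTP service suitable for Knative Serving
    package main

    import (
        "fmt"
        "log"
        "net/http"
        "os"
    )

    func handler(w http.ResponseWriter, r *http.Request) {
        // Each incoming request wakes the service up if it was scaled to zero.
        fmt.Fprintln(w, "Hello from a Knative-managed container!")
    }

    func main() {
        // Knative Serving injects the port to listen on; default to 8080 for local runs.
        port := os.Getenv("PORT")
        if port == "" {
            port = "8080"
        }
        http.HandleFunc("/", handler)
        log.Printf("listening on :%s", port)
        log.Fatal(http.ListenAndServe(":"+port, nil))
    }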

With the above concepts, Knative offers an innovative solution that allows you to run scalable workloads in a cloud-independent manner. It is particularly suitable for running functions or microservices, i.e. small applications with a single task. For that reason, platforms such as Knative, AWS Lambda and Google App Engine are also called FaaS (Functions-as-a-Service). The big advantage over other FaaS solutions is that any Kubernetes cluster can be extended with Knative, meaning you are no longer tied to one supplier.

For whom is serverless / Knative interesting?

There are a number of clear situations in which Knative offers benefits. The most obvious situations in which the use of serverless computing is worthwhile:

  • If you run workloads that are executed periodically. After a period of inactivity, the application is scaled back to zero. If your application does not have to be permanently online, it is a waste of resources to keep server capacity idle at all times. As soon as your application's route is called, it starts up and, if necessary, scales up to multiple replicas. Because of this behavior, the first response after a period of inactivity is slower (a cold start). Serverless is therefore not useful for (web) applications that require a fast first response, but it is particularly interesting for applications or functions that are called periodically or infrequently.
  • Serverless computing is interesting for microservices that only need to become active after an HTTP request or an event from a broker. Knative supports a variety of event sources, including Google Cloud Pub/Sub, Apache Kafka, RabbitMQ, GitHub webhooks, and various databases (a sketch of such an event-driven service follows this list).
  • Knative can be installed on any Kubernetes cluster. That makes it possible to set up your serverless platform without additional investment.
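
As an illustration of the second point above, here is a minimal sketch, again in Go, of what such an event-driven microservice can look like. Knative Eventing delivers events from the configured source to your service as CloudEvents over plain HTTP POST; in the binary content mode the event metadata arrives in Ce-* headers and the payload in the request body. Which source triggers which service is configured in Knative itself and is not shown here.

    // main.go - minimal handler for events delivered by Knative Eventing
    package main

    import (
        "fmt"
        "io"
        "log"
        "net/http"
        "os"
    )

    func eventHandler(w http.ResponseWriter, r *http.Request) {
        // CloudEvents metadata is delivered in Ce-* headers, the payload in the body.
        eventType := r.Header.Get("Ce-Type")
        eventSource := r.Header.Get("Ce-Source")
        payload, _ := io.ReadAll(r.Body)

        log.Printf("received event type=%q source=%q payload=%s", eventType, eventSource, payload)
        fmt.Fprintln(w, "event processed")
    }

    func main() {
        port := os.Getenv("PORT")
        if port == "" {
            port = "8080"
        }
        http.HandleFunc("/", eventHandler)
        log.Fatal(http.ListenAndServe(":"+port, nil))
    }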

Getting started with serverless computing and Kubernetes?

With Knative, it becomes possible to set up your own serverless platform. Knative runs on any Kubernetes cluster, both on-premises and at any cloud provider. That makes it particularly interesting for use cases where serverless applications have to run in a private cloud, for example due to strict security requirements or legislation.

Curious about what serverless computing can mean for your situation? Leave your contact details in the form below for a free consultation!