NetworkWorld

Angel Diaz

Serverless: The next step in cloud computing’s evolution

From conferences around the world, to colleagues, customers, and partners, I’m seeing firsthand that the industry is abuzz over serverless computing.

Expectations are high and steadily growing for how this new architecture can revolutionize the way organizations approach development and innovation.

Defining serverless

First, know that “serverless” itself is a bit of a misnomer. There are servers involved behind the scenes, of course, but as you’ll see, they’re abstracted in such a way that developers are freed from operational concerns and can instead focus on the creativity of writing code.

One way to think about the concepts supporting a serverless architecture is to look at them as a set of three layers that sit atop your existing compute, network and storage resources: fabric, framework and functions.

  • The serverless fabric removes traditional operations functions and concerns from the developer’s plate and allows them to focus on what they do best: write beautiful code for amazing applications.
  • The event-driven programming model provides a framework for the creation of that code. This is ideal for adaptable applications such as IoT, which have a large number of inputs and outputs. The framework manages the cause and effect of the code being written.
  • Functions as a Service provides the packages, patterns and reference architectures needed to assemble the application. This is the code, logic and brains behind the effect that enables the appropriate response. (A minimal sketch of what such a function looks like follows this list.)
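
To make the model concrete, here is a minimal sketch of an event-driven function written in plain Python so it runs anywhere. The handler signature, event fields and threshold are illustrative assumptions rather than any particular platform’s API; a real Functions-as-a-Service platform would invoke the function with its own event format and deployment tooling.

```python
# Minimal sketch of the event-driven, Functions-as-a-Service model described
# above. The event shape and field names are hypothetical, chosen only to
# illustrate the idea; they are not a specific vendor's API.

from datetime import datetime, timezone


def handle_sensor_reading(event: dict) -> dict:
    """Runs once per incoming event; the platform, not the developer, decides
    when this executes and how many copies run in parallel."""
    device_id = event.get("device_id", "unknown")
    reading = float(event.get("value", 0.0))

    # The function keeps no state between invocations; everything it needs
    # arrives in the event or is fetched from backing services.
    return {
        "device_id": device_id,
        "alert": reading > 75.0,  # hypothetical threshold
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }


if __name__ == "__main__":
    # Simulate the serverless fabric delivering one event to the function.
    print(handle_sensor_reading({"device_id": "fridge-42", "value": 80.2}))
```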

The benefits of serverless computing

You’re probably already starting to imagine the benefits that serverless offers:

  • Scalability: In a serverless environment, the ability to scale an application to meet user demand is handled by the platform hosting the code. If an application has 10,000 or 10 million users, it doesn’t matter. That eliminates operational concerns about pre-provisioning or over-provisioning servers.
  • Cost benefits: Traditional runtime models have processes that constantly run, and the user pays for them even when they’re not being utilized. A serverless environment can be more cost effective because you’re not paying a fixed cost per instance deployed, but instead for the time those instances are actually doing work. (A rough comparison follows this list.)
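
To illustrate the cost point, here is a back-of-the-envelope sketch comparing an always-on instance with pay-per-use billing. Every rate, duration and request count below is a made-up placeholder, not real vendor pricing; the takeaway is only that sporadic workloads pay for work performed rather than for idle time.

```python
# Back-of-the-envelope comparison of the two pricing models described above.
# All numbers are hypothetical placeholders, not actual vendor rates.

HOURS_PER_MONTH = 730


def always_on_cost(instances: int, price_per_hour: float) -> float:
    """Traditional model: instances bill around the clock, busy or idle."""
    return instances * price_per_hour * HOURS_PER_MONTH


def pay_per_use_cost(invocations: int, avg_seconds: float,
                     price_per_second: float) -> float:
    """Serverless model: bill only for the seconds the code actually runs."""
    return invocations * avg_seconds * price_per_second


if __name__ == "__main__":
    # Hypothetical sporadic workload: 100,000 requests a month, 200 ms each.
    print(f"always-on:   ${always_on_cost(2, 0.10):,.2f} per month")
    print(f"pay-per-use: ${pay_per_use_cost(100_000, 0.2, 0.00002):,.2f} per month")
```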

Serverless use case: IoT

Even though we’re still in the early days of serverless, I’m starting to see uptake for workloads that involve data processing, the Internet of Things (IoT), cognitive bots, mobile back ends and REST APIs.

Take, for example, an IoT use case. IoT is not just about the ingestion of data; it’s about driving better outcomes. This means you have to be flexible in the way applications are deployed, ensuring that if a new feature is introduced, it won’t break the application. This is where serverless can help.

Specifically, let’s say you have a refrigerator that periodically updates the user about the state of its parts. In the event that the refrigerator’s water filter has depleted, say, to 20 percent of its lifespan, a message could be sent to the user giving them the option to purchase a new one from their smartphone.

This function requires bringing together IoT data with other sources, including identity (who is the customer?) and warranties (back-office systems). Serverless becomes especially useful by making it easier to stitch those pieces together and respond.

You have an event (depleted filter) that’s propagated to the back end. We can take that event and call a serverless function to do a simple search. Does this customer have a warranty? Are the filters covered in the warranty? Serverless allows your back end to respond to those types of IoT events quickly.
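
As a rough sketch of what that back-end function might look like, the Python below handles a hypothetical “filter depleted” event, looks up the customer and their warranty, and picks a response. The event fields, lookup helper and in-memory data are invented stand-ins for the identity and back-office systems described above, not a real integration.

```python
# Sketch of the warranty-check function described above. The data, field
# names and helper are hypothetical stand-ins for real identity and
# back-office services.

# Stand-in "back office" warranty records, keyed by customer ID.
WARRANTIES = {
    "cust-123": {"active": True, "covers": {"water_filter", "compressor"}},
}


def lookup_customer(device_id: str) -> str:
    """Identity lookup: which customer owns this refrigerator?"""
    return {"fridge-42": "cust-123"}.get(device_id, "unknown")


def handle_filter_depleted(event: dict) -> dict:
    """Invoked by the platform when a 'filter depleted' event arrives."""
    customer = lookup_customer(event["device_id"])
    warranty = WARRANTIES.get(customer, {"active": False, "covers": set()})

    if warranty["active"] and "water_filter" in warranty["covers"]:
        action = "ship_replacement_under_warranty"
    else:
        action = "offer_purchase_via_smartphone"

    return {
        "customer": customer,
        "action": action,
        "filter_life_pct": event.get("filter_life_pct"),
    }


if __name__ == "__main__":
    # Simulate the refrigerator reporting a filter at 20 percent of its lifespan.
    print(handle_filter_depleted({"device_id": "fridge-42", "filter_life_pct": 20}))
```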

By nature, events in the IoT space are going to be sporadic. But a serverless environment allows you to deal with those unpredictable loads and use them to drive value or connectivity to other lines of business that will deliver a better long-term customer experience.

Before you rush out and dive head first into the serverless pool, know that not all workloads are ideal for this architecture. Consider serverless as another tool in the arsenal for building cloud-native applications.

As with any new technology, some people will only read about what it does, while others will use it and reap the benefits. There is no denying that serverless computing is on the rise and will play a major role within the cloud ecosystem. I look forward to watching this technology evolve.

 

This article was written by Angel Diaz from NetworkWorld and was legally licensed through the NewsCred publisher network.