At almost any tech meetup or conference you go to these days, there is a very high chance of hearing two terms: microservices and serverless computing. In this post, I will touch upon serverless and discuss it as an evolution of server computing.
This is Wikipedia’s definition of serverless computing.
Serverless computing is a cloud computing execution model in which the cloud provider dynamically manages the allocation of machine resources. Pricing is based on the actual amount of resources consumed by an application, rather than on pre-purchased units of capacity. It is a form of utility computing.
In other words, serverless computing allows you to develop business applications without having to spend effort and money up-front on infrastructure.
For instance, Lambda is a serverless offering from AWS. You define independent, self-contained functions in a variety of languages and connect them to a variety of AWS resources; Lambda then runs your business application on demand.
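To make this concrete, here is a minimal sketch of what a Lambda function can look like in Python. The handler name, the event shape, and the greeting logic are illustrative assumptions, not a real deployment; only the `(event, context)` handler signature is Lambda's actual contract.

```python
import json

def lambda_handler(event, context):
    # Lambda invokes this function with the triggering event (e.g. an
    # API Gateway request payload) and a context object holding runtime
    # metadata. You never touch the server it runs on.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Locally, you can exercise the same function by calling it with a sample event, e.g. `lambda_handler({"name": "Ada"}, None)`.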
Serverless doesn’t mean there is no server. Rather, it means that you don’t have to actively manage the server. That is, you can ignore that a server exists and focus on your business logic. If your business is doing great and your functions are invoked many times, you pay the cloud provider accordingly. As simple as that!
If you think more about it, it is nothing but a natural evolution of server computing.
Up to 1990
Application code was tightly coupled with server infrastructure. Code was virtually inseparable from the hardware it ran on. This led to superior stability and reliability (think Apple devices), but made systems very inflexible to change. Mainframe systems, the AS/400 and the like were the market leaders during this era.
The internet era
This was the era of the internet. Web and mobile applications, combined with the proliferation of the server market, meant that code could no longer be coupled to server infrastructure. This led to specifications like J2EE (later Java EE). As long as the application code and the server implementation are aligned by a specification, they live happily together.
This model solved several problems. However, it created new ones: up-front provisioning of servers, a pricing model decoupled from actual usage, and inflexibility in scaling.
The cloud era
Cloud technologies make it possible to treat servers like any other utility, say water. They are controlled by a provider, available on demand and paid for by usage. It becomes a lot easier for organizations to plan server resources, as a lot of flexibility is built into the model.
Serverless
Taking it a step further, serverless abstracts away the provisioning and running of servers – these are handled by the cloud provider. All you do is write code, connect to the resources you need (potentially offered by the same cloud provider) and leave the rest to them.
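As a sketch of how little you declare in this model, here is a hypothetical AWS SAM template that deploys a Python function behind an HTTP endpoint. The resource name, handler path, and route are illustrative assumptions; note that nothing in it describes a server.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  HelloFunction:                      # hypothetical function name
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler     # module.function inside your code bundle
      Runtime: python3.12
      Events:
        HelloApi:
          Type: Api                   # exposes the function via API Gateway
          Properties:
            Path: /hello
            Method: get
```

The cloud provider reads this declaration and handles provisioning, scaling and routing on your behalf.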
As you can see, it is pure evolution in play.
What next? Possibly codeless applications, where all developers do is define application logic in a metadata-like language, and the cloud provider converts it to code, provisions the servers, deploys the application and makes it available to the world. Maybe…