Containers and Serverless as Alternate Cloud Deployment Options
With rapid advancements in cloud computing, many enterprises may no longer want to simply “lift and shift” applications to the cloud, because doing so can limit scalability, complicate deployment, and reduce flexibility. In such scenarios, serverless and containerized solutions can offer more cost-effective and flexible paths to cloud value than server-based cloud deployments.
Containers—easy portability for applications
Containerization involves encapsulating an application, together with its own runtime environment, in a “container” that shares the host’s operating system. A container is a standard unit of software that includes the application’s code and its dependencies, so a containerized application runs quickly and reliably from one computing environment to another.
Because containers share the host operating system rather than each carrying a separate, full-fledged operating system, they consume fewer resources. This also makes them highly portable across multi-cloud and on-premises environments. A monolithic, on-premises application can be refactored into a microservices-based architecture using containers, or simply lifted and shifted if it is already containerized.
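To make the portability idea concrete, the minimal sketch below uses the Docker SDK for Python to pull and run a container image programmatically; because the image carries the application’s code and dependencies, the same image runs unchanged on a laptop, an on-premises server, or a cloud VM. The image name and port mapping here are illustrative, and the sketch assumes Docker and the `docker` Python package are installed.

```python
# Illustrative sketch: running the same packaged container image on any Docker host.
# Assumes a local Docker daemon and the "docker" Python package (Docker SDK for Python).
import docker

def run_packaged_app(image: str = "nginx:1.25", host_port: int = 8080):
    """Pull a container image and start it; the app and its dependencies travel together."""
    client = docker.from_env()           # connect to the local Docker daemon
    client.images.pull(image)            # fetch the packaged application and its dependencies
    container = client.containers.run(
        image,
        detach=True,                     # run in the background
        ports={"80/tcp": host_port},     # map container port 80 to a host port
    )
    return container

if __name__ == "__main__":
    c = run_packaged_app()
    print(f"Started container {c.short_id} from a portable image.")
```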
Containerization is also cost-effective for applications that need to be modernized quickly on existing infrastructure, with low upfront investment. For example, a leading American test-preparation business with a global footprint switched its applications from traditional cloud compute to a microservices-based architecture using containerization. By doing so, the business saved 40% in costs per application and reduced the number of compute instances deployed by 70%. This further led the business to automate its container provisioning, reducing manual effort.1
Serverless—quick, efficient deployment
Serverless computing is a cloud-native model in which applications are written as functions that run only when triggered. The cloud provider manages the allocation of resources, allowing developers to build and run applications without managing servers. Serverless functions also scale automatically, helping businesses stay agile and serve varying traffic in real time with low latency.
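As a minimal illustration, the Python sketch below shows the shape of such a function on AWS Lambda: a handler that runs only when an event invokes it, with no server for the developer to provision or manage. The event fields assume an Amazon API Gateway trigger and are illustrative.

```python
# Minimal sketch of a serverless function (AWS Lambda, Python runtime).
# The function runs only when invoked by an event; the provider allocates
# and scales the underlying compute automatically.
import json

def lambda_handler(event, context):
    # 'event' carries the trigger payload; here we assume an API Gateway
    # request with an optional "name" query-string parameter (illustrative).
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")

    # Return a response in the shape API Gateway expects from a proxy integration.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```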
Since serverless applications run only when their underlying functions are invoked, businesses can reduce their operating expenses by up to 43%.2 Also, developers need not be concerned with capacity planning, configuration, management, and fault tolerance, which can improve operational efficiency across the firm.
Using serverless technology, applications can be built and deployed in a few hours instead of lead times spanning weeks or months. A leading global video-on-demand platform used serverless functions to create rule-based, self-managing systems that reduced errors and load times and improved its traffic-handling capabilities. For the workloads deployed on serverless computing, daily costs came down by about 90% compared with traditional cloud compute.3
Writing an application that can run on serverless technology, however, involves reengineering as opposed to a simple “lift and shift” operation. Hence, businesses must make an informed choice to use the technology efficiently.
Typical use cases
Containerization can be a valuable mode of cloud deployment for applications that need multi-cloud support or must be replicated across different environments. Since containers are self-contained, independent units, they can accelerate application modernization using a business’s existing infrastructure. This also extends to modernizing applications that will not need to be refactored in the long run.
Serverless deployment is often recommended for digital businesses that frequently release new products and services to their customers. Since applications deployed on serverless platforms scale automatically, it is an excellent option for businesses and startups with unpredictable workloads, especially workloads that run in short bursts.
While containers and serverless are compared here primarily as alternative compute deployment options, serverless options are now available for many other cloud services, such as databases, streaming solutions, and key management. For instance, Amazon Web Services (AWS) provides a variety of database options: users can create their own databases on compute platforms, use a managed database like Amazon DocumentDB (with MongoDB compatibility), or leverage serverless options like Amazon Aurora Serverless.
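For example, a serverless database can be queried over an API call rather than through connections to database servers that the application team manages. The hedged sketch below uses the boto3 RDS Data API client against an Aurora Serverless cluster; the cluster ARN, secret ARN, database name, and SQL are placeholders for illustration.

```python
# Illustrative sketch: querying an Aurora Serverless database through the
# RDS Data API (boto3), so the application manages no database servers or
# connection pools. ARNs, database name, and SQL below are placeholders.
import boto3

rds_data = boto3.client("rds-data", region_name="us-east-1")

def fetch_shipped_orders():
    response = rds_data.execute_statement(
        resourceArn="arn:aws:rds:us-east-1:123456789012:cluster:example-cluster",   # placeholder
        secretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:example",   # placeholder
        database="example_db",                                                      # placeholder
        sql="SELECT id, total FROM orders WHERE status = :status",
        parameters=[{"name": "status", "value": {"stringValue": "shipped"}}],
    )
    return response["records"]
```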
To compare the costs associated with deploying applications on serverless, containers, and traditional compute, Deloitte analyzed the expected costs for an internal application across these three deployment options, leveraging insights from its subject-matter specialists (SMSs) and previous client engagements.
This analysis compares costs for serverless, traditional compute, and containers using AWS services (including AWS Lambda, AWS Fargate, and Amazon EC2, as discussed below) in the US East (N. Virginia) Region.
This analysis combines qualitative and quantitative methods. For the quantitative analysis, the application’s run cost was calculated for four traffic scenarios across the different deployment models using the AWS pricing calculator. For the qualitative analysis, the application’s initial development and maintenance costs were estimated in close discussion with Deloitte’s SMSs and through secondary research, since these may vary based on business priorities. Costs associated with physical infrastructure, facilities, storage, databases, networks, and data transfer were not considered, on the assumption that they would be similar across all scenarios.
Using qualitative approximation, the relative costs of the application’s development and maintenance were compared and found to be highest for traditional compute and lowest for serverless. These observations are consistent with the fact that serverless applications do not require developers to manage servers. It is important to note that these comparisons are relative and not based on a thorough cost analysis. The findings are summarized in table 1.
Further, the following run costs were calculated for serverless, containerized, and traditional cloud deployments across the four scenarios with varying workloads (table 2).
Here, serverless deployment using AWS Lambda was observed to be the most cost-effective option for low traffic volumes (that is, a lower number of requests and/or a lower average request execution time), since it does not require provisioning or management of servers and users are billed only for the resources actually used. Hence, this option can be cost-effective for unpredictable workloads, and it can additionally help enterprises focus on activities that differentiate the business.
Containerization using AWS Fargate is also a good deployment option for low traffic volumes, with costs closely mirroring those of serverless deployment. However, with increasing traffic (a higher number of requests and/or a higher average request execution time), running applications on containers can become expensive. Containers remain an excellent choice for businesses looking to modernize their applications with minimal effort in very little time.
With increasing traffic, traditional compute using Amazon EC2 instances becomes the most cost-effective deployment option. For more predictable workloads and steady traffic, EC2 instances can even be reserved for longer periods of time to leverage long-term discounts. A comparative summary of the different costs associated with the application can be seen in table 3 and figure 3.
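The crossover described above can be sketched with a simple back-of-the-envelope model. The Python snippet below compares rough monthly costs for the three options as traffic grows; the request duration, memory and instance sizes, and unit prices are illustrative assumptions rather than the rates used in the analysis, so only the broad shape of the comparison (serverless cheapest at low volumes, provisioned compute cheapest at sustained high volumes) should be read from it.

```python
# Back-of-the-envelope monthly cost model for three deployment options.
# All unit prices, durations, and sizes are illustrative assumptions,
# not the figures from the analysis in the text.
ASSUMED_LAMBDA_PER_REQUEST = 0.20 / 1_000_000   # $ per request (assumed)
ASSUMED_LAMBDA_PER_GB_SECOND = 0.0000166667     # $ per GB-second (assumed)
ASSUMED_FARGATE_VCPU_HOUR = 0.04048             # $ per vCPU-hour (assumed)
ASSUMED_FARGATE_GB_HOUR = 0.004445              # $ per GB-hour (assumed)
ASSUMED_EC2_INSTANCE_HOUR = 0.0416              # $ per instance-hour (assumed)

HOURS_PER_MONTH = 730
AVG_REQUEST_SECONDS = 0.2       # assumed average execution time per request
LAMBDA_MEMORY_GB = 0.5          # assumed memory allocated per invocation

def lambda_cost(requests_per_month: float) -> float:
    # Pay per request plus per GB-second of execution; nothing when idle.
    compute = requests_per_month * AVG_REQUEST_SECONDS * LAMBDA_MEMORY_GB * ASSUMED_LAMBDA_PER_GB_SECOND
    return requests_per_month * ASSUMED_LAMBDA_PER_REQUEST + compute

def fargate_cost(tasks: int = 1, vcpu: float = 2.0, memory_gb: float = 4.0) -> float:
    # A long-running task sized comparably to the EC2 instance below;
    # cost scales with provisioned capacity, not with requests.
    return tasks * HOURS_PER_MONTH * (vcpu * ASSUMED_FARGATE_VCPU_HOUR
                                      + memory_gb * ASSUMED_FARGATE_GB_HOUR)

def ec2_cost(instances: int = 1) -> float:
    # An always-on instance billed per hour regardless of traffic.
    return instances * HOURS_PER_MONTH * ASSUMED_EC2_INSTANCE_HOUR

for requests in (100_000, 1_000_000, 50_000_000):
    print(f"{requests:>12,} req/mo | Lambda ${lambda_cost(requests):8.2f} "
          f"| Fargate ${fargate_cost():8.2f} | EC2 ${ec2_cost():8.2f}")
```

Under these assumed rates, the pay-per-use option costs a fraction of a dollar at low volumes while the provisioned options carry a fixed monthly charge; as request volumes climb, the always-on instance becomes the cheaper choice, which mirrors the direction of the findings above.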
Containers and serverless are different technologies, and both can help businesses derive more value than traditional modes of cloud deployment. These technologies utilize computing resources more efficiently, help applications scale quickly, and increase developer productivity.
Choosing between these options can lead to “analysis paralysis” within an enterprise, since the best deployment option depends on the business’s use cases, investments of time and money, developer skills, and related factors such as frequency of use and predictability of workloads.
While some workloads might still be well suited to traditional compute, businesses need to look beyond traditional modes of cloud deployment to remain competitive and increase cloud value, staying user-centric while keeping an eye on the bottom line. So, why not use a combination of containers and serverless to improve efficiency and optimize costs?
To leverage the best of both worlds, businesses should consider combining containers and serverless technologies to build powerful applications that are cloud neutral, remain agile, and incur costs only when invoked. For instance, a modern business seeking more business value can deploy a serverless application on containers to reduce time to market, increase resource efficiency, and reduce maintenance and run-the-business costs. The world of cloud is changing at a rapid pace and bringing with it new ways of doing business, and laggards in cloud adoption risk being left behind.
1 Amazon Web Services, “Kaplan containers web study,” accessed August 8, 2023.
2 Akash Tayal et al., Determining the total cost of ownership of serverless technologies when compared to traditional cloud, Deloitte, September 2019.
3 Chithrai Mani, “Is serverless architecture right for your organization?,” Forbes, November 14, 2018; Mark Stier and Igor Okulist, “Netflix images enhanced with AWS Lambda,” Netflix Technology Blog, March 23, 2020.
Amod is a principal with Deloitte Consulting LLP and leads go-to-market for cloud transformations across the Application Modernization & Innovation operating portfolio. With more than 25 years of IT industry experience, Amod specializes in renovating architecture and migrating complex enterprise applications to the cloud, helping clients create value by modernizing legacy systems. His ability to lead organizations through digital transformation journeys has made him a leader in application modernization.