Enterprises are increasingly complementing their cloud-based IoT solutions with edge computing to accelerate the pace of data analysis and make better decisions, faster.
Just a few years ago, many expected the entire Internet of Things (IoT) to move to the cloud, and much of the consumer-connected IoT indeed lives there. But a key principle of designing and building enterprise-scale IoT solutions is to make balanced use of edge and cloud computing.1 Most IoT solutions now require a mix of cloud and edge computing. Compared to cloud-only solutions, blended solutions that incorporate the edge can reduce latency, increase scalability, and improve access to information, so that better, faster decisions can be made and enterprises can become more agile as a result.
That said, the complexity introduced by edge computing should be justified by the objectives at hand, such as scale, speed, and resiliency. A choice that goes too far in either direction typically introduces substantial operational complexity and expense. Ultimately, the enterprise should consider the full range of factors that reflect its own objectives in designing and building an IoT solution in the first place.
In this article, we discuss when and how enterprises can best make use of both the edge and the cloud in their IoT solutions. We explain the roles edge and cloud computing play, why the edge may be needed, and how to approach selecting a solution. We also explain some of the complexities of edge computing and provide some use cases.
Edge computing is a distributed architecture in which functionality, such as processing and storage, is located closer to, or even on, the source of the data. Examples include cameras with on-device vision processing and wearable medical devices that send data to a mobile phone via Bluetooth. Given this proximity to the data, making balanced use of both edge and cloud computing is now often considered a key requirement of designing and building enterprise-scale IoT solutions.
Representative functionality often used at the edge is shown in figure 1.
We have experienced a veritable explosion of cloud adoption in the past decade—the IT functionality of many modern companies exists exclusively, or in large part, in the cloud.2 Among the many benefits of the cloud infrastructure are cost effectiveness, scale, self-service automation, interoperability with traditional back-office systems, and centralized functionality.3
At the same time, the amount of sensor-generated data has grown strongly, and this trend is expected to continue in the years ahead.4 Because data can lose its value quickly after it is generated, sometimes within milliseconds, the speed at which organizations can convert data into insight and then into action is generally considered mission critical. Minimizing the latency between data generation and the resulting decision or action can therefore be critical to preserving an organization's agility. However, because the speed of data transmission is bounded by the speed of light, the only way to mitigate or avoid the latency challenge is to reduce the distance the data must travel. In a cloud-only world, the data ends up traveling hundreds or even thousands of miles, so where latency is critical to a solution, edge computing can become key.
According to one estimate, as much as 55 percent of IoT data could soon be processed near the source, either on the device or through edge computing. Indeed, scale plays a big role in this likely shift—growing data demands will likely put the focus on latency, and decreased latency could dramatically improve the response time, thereby saving both time and money.5
Figure 2 shows the relative scale of typical latencies, ranging from on-device to the public cloud.
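As a rough back-of-the-envelope complement to figure 2, the sketch below estimates the round-trip propagation delay implied by distance alone. The distances and the two-thirds-of-light-speed fiber figure are illustrative assumptions, and real networks add routing, queuing, and processing time on top.

```python
# Back-of-the-envelope propagation latency: distance / signal speed.
# Distances are illustrative, not measurements; real round trips add
# routing, queuing, and processing delays on top of pure propagation.

SPEED_OF_LIGHT_KM_S = 299_792                     # in a vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 0.67     # light travels roughly 2/3 c in optical fiber

scenarios_km = {
    "on-premises edge node": 0.1,
    "metro edge data center": 50,
    "distant cloud region": 1_500,
}

for name, one_way_km in scenarios_km.items():
    round_trip_ms = (2 * one_way_km / FIBER_SPEED_KM_S) * 1_000
    print(f"{name:24s} ~{round_trip_ms:6.2f} ms round-trip propagation")
```

Even this optimistic floor suggests that a round trip to a distant cloud region can consume on the order of ten milliseconds or more before any processing has occurred, while on-premises edge processing keeps propagation delay negligible.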
Latency is just one of many reasons to add edge functionality to an IoT solution. A fuller list of the potential benefits of edge computing is given in figure 3.
In bringing edge and cloud computing to life, an understanding of real-world cases can go a long way. Constant technology evolution, such as the eventual availability of 5G, often shifts the cost/latency balance. As such, it is important to weigh current conditions when making decisions rather than simply defaulting to previous choices. Being mindful of all drivers while designing an IoT solution is important, as multiple drivers may apply in a given situation.
The IoT can have a dramatic effect on an organization's ability to be more agile. Below are some ways in which the edge and cloud help aggregate and transmit data for enterprises connected by the IoT.
Enterprises are quickly moving toward event-driven architectures and real-time automated digital processes.10 But when one considers that many manufacturers have multiple plants across geographies, each typically with unique characteristics and functional requirements, the challenge of maintaining an exclusively centralized data-analysis capability in the cloud or at a corporate data center becomes apparent.
To be sure, cloud computing offers a number of benefits, and it most certainly has a role in smart manufacturing. With data in the cloud, a centralized operations facility can monitor systems and processes across a large, possibly global, portfolio. It is also possible to undertake comparative analysis across the full portfolio to identify opportunities for optimization.
Still, we envision that an integrated edge–cloud architecture would provide the kind of speedy and nearly unimpeded connectivity that smart factories require.
Figure 4 illustrates how the edge and cloud typically work with sensors and devices on a manufacturing floor.
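In the same spirit, the sketch below shows one common division of labor (a generic illustration, not a rendering of figure 4): an edge node evaluates high-frequency vibration readings locally and acts on anomalies immediately, while forwarding only periodic aggregates to the cloud for portfolio-wide comparison. The sensor, threshold, and cloud interface are all hypothetical stubs.

```python
# Illustrative edge/cloud split on a factory floor: latency-critical
# decisions stay local; only summaries travel to the cloud.
import random
import statistics

VIBRATION_LIMIT_MM_S = 7.1  # illustrative alarm threshold

def read_vibration_sensor():
    return random.uniform(2.0, 9.0)     # stand-in for a real sensor read

def stop_machine():
    print("Local action: stopping machine immediately")

def send_aggregate_to_cloud(summary):
    print("Forwarding aggregate to cloud:", summary)

window = []
for _ in range(1_000):                   # high-frequency local loop
    reading = read_vibration_sensor()
    if reading > VIBRATION_LIMIT_MM_S:   # latency-critical decision stays at the edge
        stop_machine()
    window.append(reading)
    if len(window) == 100:               # only periodic summaries go to the cloud
        send_aggregate_to_cloud({
            "mean": statistics.mean(window),
            "max": max(window),
            "samples": len(window),
        })
        window.clear()
```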
The rise of smart, connected IoT devices and ubiquitous connectivity has created an opportunity to transform buildings—whether they are offices, retail stores, factories, or hospitals—into cost-efficient, responsive environments for delivering exceptional experiences to their occupants.11 Smart buildings are digitally connected structures that combine optimized building and operational automation with intelligent space management to enhance the user experience, increase productivity, reduce costs, and mitigate physical and cybersecurity risks. Smart, digital buildings span industries and uses, but all of them can provide the same basic capabilities: They connect humans; they provide better control of facilities and operations; they support ways to collaborate digitally; and they enable owners to conserve resources, including space, energy, water, and employees. Each of these four capabilities can form the basis for creating a smart building strategy that can deliver measurable benefits. Figure 5 illustrates some of the different types of sensors and applications that can be utilized in a smart building.
For example, 75–80 percent of a building's life cycle costs are related to building operations. Most existing commercial and large residential buildings have some form of building automation (or management) system that controls such things as HVAC. Introducing smart building features, such as smart lighting with embedded occupancy sensors, and having them interact with the main system requires, at a minimum, a gateway, plus some additional capability that would usually come with an edge server.
The edge capability would provide data (occupancy data, for instance) from strategically placed sensors to a cloud-hosted service that performs specialized analytics. The outcomes of these analytics can be sent back, via the gateway or edge server, to adjust the schedules of equipment connected to the main system and thereby optimize operations. This configuration is also needed to obtain a portfolio-wide view of building operations and conditions. Together, edge and cloud computing enable smarter management of resources.
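A schematic sketch of this gateway pattern appears below: sensor data flows up to a cloud analytics service, and schedule adjustments flow back down to the building management system (BMS). All interfaces here are hypothetical stubs; a real deployment would use the BMS vendor's actual API and an authenticated connection to the cloud service.

```python
import time

def read_occupancy_sensors():
    """Poll locally attached occupancy sensors (stubbed)."""
    return [{"zone": "floor-3-east", "occupied": True, "ts": time.time()}]

def query_cloud_analytics(readings):
    """Send readings to the cloud-hosted analytics service and return its
    recommended schedule adjustments (stubbed; would be an HTTPS call)."""
    return [{"equipment": "ahu-3", "action": "reduce_airflow", "until": "18:00"}]

def apply_schedule_update(update):
    """Push the recommended change to equipment behind the gateway (stubbed)."""
    print("Applying schedule update:", update)

def gateway_cycle():
    readings = read_occupancy_sensors()
    for update in query_cloud_analytics(readings):
        apply_schedule_update(update)

if __name__ == "__main__":
    for _ in range(3):        # a real gateway would loop indefinitely
        gateway_cycle()
        time.sleep(1)         # in practice, report every few minutes
```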
While edge computing offers solid benefits, it can also introduce operational and design complexities. Edge processing is highly distributed and often spans far-flung or difficult-to-access locations, including sensors/actuators and gateways in offices, plants, and campuses, on pipelines, and at various remote field sites. A given organization can have thousands of devices and hundreds of associated gateways. All these edge nodes carry firmware, operating systems, some form of virtualization and containers, and installed software, some provided by manufacturers and some by solution providers. The owner or manager of these edge nodes must properly manage and maintain all of this, which demands an enormous degree of automation (for backups, patching, updates, and monitoring, for example).
The number of potential problems is enormous, and troubleshooting can be very challenging in a highly distributed model. In many cases, field service technicians must be onsite regularly to address issues that arise from upgrades or even general maintenance. These pressures also favor a widespread "software-defined everything" approach, since software upgrades are more easily and conveniently rolled out than hardware upgrades.
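A minimal sketch of the kind of automation this implies appears below. The node inventory, update mechanism, and health check are hypothetical stubs; real fleets typically rely on dedicated device-management or orchestration tooling rather than hand-rolled scripts.

```python
# Staged, automated rollout across a fleet of edge nodes (schematic only).

def list_edge_nodes():
    return ["gateway-plant-a-01", "gateway-plant-a-02", "edge-server-hq-01"]

def push_update(node, version):
    print(f"Updating {node} to {version}")
    return True   # stub: pretend the update succeeded

def health_check(node):
    return True   # stub: pretend the node reports healthy

def staged_rollout(version, batch_size=1):
    nodes = list_edge_nodes()
    for i in range(0, len(nodes), batch_size):
        for node in nodes[i:i + batch_size]:
            if not (push_update(node, version) and health_check(node)):
                print(f"Halting rollout: {node} failed")
                return False   # stop before a problem spreads fleet-wide
    return True

staged_rollout("firmware-2.4.1")
```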
Cloud computing, despite its own challenges, removes many routine IT concerns by providing a degree of self-service and automation. Edge processing, by contrast, requires common data center operations (provisioning, updating, change management, and monitoring), along with higher-level functions (device management, updating machine-learning models, and so on), to be undertaken and replicated across all of the edge nodes and clusters. This is a heavy undertaking, and it requires the enterprise to shift some focus away from business needs.
Policies and practices used in traditional data centers are often not readily applicable to edge deployments, which are distributed across multiple locations and considerably more dynamic than traditional data centers. Undertaking the operational management of such a system is a complex challenge.
While the cloud offers on-demand scalability and is readily configurable, automated, and resilient, providing these capabilities at the edge can be costly and complex. Accommodating the expansion of an existing edge deployment to support more devices and edge nodes can involve significant investment in additional hardware and software, along with a good deal of complex work.
Extending the cloud and the data center to the edge with multiple nodes and devices greatly increases the attack surface for cyberattacks. Insecure endpoints, such as devices and edge nodes, can be used as entry points to valuable assets within an enterprise network and for other nefarious purposes, such as distributed denial-of-service attacks. Maintaining the physical security and cybersecurity posture of all assets at the edge is a complex and critical challenge.
Including unnecessary complexity in a solution can be costly, risky, and wasteful, so whether to add edge processing to an IoT solution is a decision best taken with care and based on a risk/reward assessment. To that end, figure 6 offers some guidelines that may help.
In many IoT use cases, the edge is simply a necessary part of the solution, given the already existing operational technology. Adding a cloud-hosted component to an IoT solution requires some degree of edge-computing presence, even if it is primarily a gateway. Similarly, the desire to add smart capabilities to existing building management system infrastructure, and to provide a cloud-based real estate portfolio view, necessitates some edge-processing capability.
Designing a large IoT solution that does essentially nothing at the edge and sends all data to the cloud for action often presents scaling challenges in terms of bandwidth usage, possibly necessitating upgrades to networking infrastructure. Additionally, as the solution scales, exclusive use of the cloud could require manual intervention to reconfigure ingestion engines and load balancing.
The complexities of an exclusively edge-based distributed architecture (especially one that may involve distributed processing) are no less, and they increase with scale. Systems and application management are highly complex, and tools needed for these have yet to mature. In many cases, edge deployments do not adequately consider expandability, complicating the support of more devices and more data.
The first step is to assess whether edge computing is needed at all; the best solution may well be a purely cloud one. The next step is to determine the capabilities needed at the edge, followed by the most appropriate deployment model, given that edge processing can run on devices, gateways, edge servers (possibly in multiple tiers), or micro data centers. These options vary widely in compute capability, responsiveness, and placement.
In some cases, preconfigured solutions packaged into a single, integrated product can offer simplicity, though at the possible expense of flexibility. The flexibility of assembling a solution from best-of-breed components is attractive, but it comes at the cost of software/product development and stabilization, which lengthens the time taken to deliver a solution and carries a number of inherent risks.
Paying attention to changes in the edge computing landscape is important, as is conducting a proof of concept with the relevant deployment design in order to settle on the best-fit choice for the use case at hand.
There is another variable worth noting: the edge-computing vendor landscape, which is undergoing rapid change. Most IoT infrastructure or platform vendors recognize that edge computing is an important part of many IoT solutions and deliver hardware, such as gateways or servers, with some data-processing, analytics, and local storage capabilities. These hardware vendors tend to rely on others for device management, protocol handling and conversion, and other capabilities. Significant consolidation is likely to occur in this space as vendors seek to offer end-to-end solutions.12
IoT devices and the data they provide are changing the world and how we interact with it. Much of the connected-consumer IoT resides primarily in the cloud, largely because of its copious benefits. In most cases, however, an IoT solution will involve some mix of the edge and the cloud. Bringing computing to the edge can reduce latency, boost scalability, and increase access to information so that better, faster decisions can be made, and organizations can become more agile as a result.
When deciding on the right balance of edge and cloud functionality in an IoT solution, it is helpful to keep in mind that edge computing comes in various configurations, each of which can bring its own benefits but also its own challenges. Substantial operational complexities and expenses can emerge in no time, so enterprises should take a full range of factors into consideration when designing and building any given IoT solution.
Even then, an IoT solution should be as simple as it can be, but no simpler; and only as complex as it needs to be, and no more. These seemingly straightforward, yet essential, points can make a difference in the success of a solution.
Clearly, there is no single correct answer in the cloud vs. edge assessment in an IoT context. Every situation is unique. What is clear, however, is that a balance between cloud and edge computing will likely make up tomorrow’s IoT architecture.