While cloud computing has grown at a phenomenal rate over the past decade, applications that rely solely on the cloud for data storage and processing are showing signs of strain – especially those that require guarantees of high availability and ultra-low latency. Edge computing addresses these issues by moving data and compute closer to the clients that use it, making applications faster and more resilient by eliminating dependencies on distant cloud data centers.
At our recent Connect ONLINE developer conference, Dave McCarthy, Research VP of Cloud and Edge Infrastructure Services at IDC, gave a fascinating keynote presentation on edge computing for ultra-low latency applications. In the discussion, Dave focused on the edge computing landscape as well as IDC research findings on edge adoption. He also touched on the impressive results from recent Couchbase latency tests with edge services from AWS and Verizon.
Dave started off by pointing out that IDC has been covering edge computing in depth for the last several years, gathering real data from enterprises on how they approach edge as a strategy for their business. He also made the point that many organizations are beyond planning to pursue edge computing; they are already doing it, for a variety of reasons. In an IDC survey on the topic, 73% of respondents said they view the edge as a strategic investment, and another 17% said that edge is required by business operations. The survey also found that two-thirds of respondents were already in production with edge computing in some capacity, and 40% of organizations plan to invest in new edge solutions within the next year.
Primary motivations for edge computing initiatives coalesce around themes of speed, availability, and governance. They include:
- Prohibitive costs of bandwidth with a centralized infrastructure
- Security and data protection
- Deterministic latency and distance limitations
- Compliance with sovereign entities and industry regulations
- Continuous operation if network access is interrupted
Cloud-to-Edge Locations
The discussion then turned to the spectrum of computing models, which correspond to data processing locations from the cloud to the edge. These locations include the core public cloud, “MEC” (Multi-access Edge Computing) on telco providers’ networks, regional edge data centers in major metro areas, on-premises data centers, and finally on-device data processing. Viewing this spectrum through a lens of latency, the core public cloud has the highest latency, which decreases through the edge location models down to on-device data processing, which guarantees the lowest possible latency.
On where compute and storage resources should physically reside, Dave made the point that it’s not about cloud OR edge; the architecture is actually a continuum that ideally distributes compute and storage ACROSS the entire ecosystem, processing data where it best serves the application. Further, IDC found that there is no single prevailing edge model today; organizations choose the model that best suits their use case and requirements.
On that point, and to help customers quickly adopt edge computing, cloud service providers like AWS, Azure, and Google are bringing new edge services to market that map directly to the cloud-to-edge spectrum to meet nearly any requirement. These new services are making edge computing more accessible and repeatable, and giving customers more options.
Couchbase on AWS Edge Services
The presentation next focused on AWS edge services, specifically AWS Wavelength, which provides low latency by embedding compute directly in a 5G network, and AWS Local Zones, which provides low latency by placing data centers in major metropolitan areas to serve users there. Dave pointed out that in the US, AWS Wavelength is offered primarily via partnership with Verizon, and as such IDC sees a lot of edge synergy between the two vendors. He also said that he focused on these two AWS services mainly because of compelling findings from Couchbase, who conducted latency tests that ultimately validated these services’ claims of low latency.
The tests compared response times when accessing a Couchbase-powered app deployed in a standard AWS Region versus in AWS Wavelength and AWS Local Zones, respectively. Specifically, the tests measured latency for a client running in Los Angeles accessing data in an LA-based data center, and then accessing data in a distant cloud data center.
The results were impressive, measuring low-double-digit to single-digit millisecond response times between the edge zones and the client. The tests showed up to an 82% reduction in latency on Wavelength and a 78% reduction on Local Zones over standard AWS Regions, making it clear that these services do deliver on their promises of ultra-low latency.
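The comparison described above boils down to timing repeated requests against two deployments and computing the percentage reduction. Here is a minimal, illustrative Python sketch of that measurement logic; the endpoints are simulated with `time.sleep` stand-ins rather than real AWS calls, and the helper names are our own, not part of the Couchbase test harness.

```python
import time
import statistics

def measure_latency_ms(request_fn, samples=20):
    """Median round-trip time in milliseconds across repeated calls to request_fn.

    In a real test, request_fn would issue an HTTP request against the
    Couchbase-powered app endpoint (edge zone or standard region).
    """
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        request_fn()
        timings.append((time.perf_counter() - start) * 1000.0)
    return statistics.median(timings)

def reduction_pct(baseline_ms, edge_ms):
    """Percentage latency reduction of the edge deployment vs the baseline region."""
    return (baseline_ms - edge_ms) / baseline_ms * 100.0

# Simulated comparison: a distant-region call vs an edge-zone call.
# The sleeps stand in for network round trips of roughly 50 ms and 10 ms.
baseline = measure_latency_ms(lambda: time.sleep(0.050), samples=5)
edge = measure_latency_ms(lambda: time.sleep(0.010), samples=5)
print(f"edge reduces latency by ~{reduction_pct(baseline, edge):.0f}%")
```

Using the median rather than the mean keeps a single slow outlier request from skewing the comparison, which matters when sample counts are small.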
Dave closed his presentation on three essential points to consider when pursuing edge computing initiatives:
- Think of edge as a complement to the cloud. Distributed applications can provide “best of both worlds” benefits and unlock new use cases.
- Embrace the fact that no one edge solution will meet all needs. In some cases, a multi-tiered approach is best.
- Understand that performance, latency, and cost are central to making decisions on edge architectures.
He wrapped up by saying how excited he was about the work that Couchbase is doing in edge computing with cloud service providers, which will unlock next-generation use cases and innovations.
IDC Market Perspective – Get The Report
To get more of IDC’s insights on the edge computing landscape, and to see the detailed Couchbase latency test results on AWS edge services, download the latest IDC Market Perspective report: Performance Accountability and Edge Decision Making with Couchbase.