Can it be safer on the edge?

Children are warned to stay away from the edge of cliffs, bridges, roads and rivers, but in computing, the edge is turning out to be one of the safer locations for hosting processing power and analysing data. Antony Savvas explores why this is the case, while also addressing the security concerns that do exist around edge computing and assessing how these are being resolved.

The edge computing market includes hardware (edge nodes or gateways, servers, sensors and routers), software (including databases and analytics), services and edge-managed platforms. The global edge computing market covering all these areas is anticipated to reach US$43.4bn by 2027, a CAGR of 37% over the forecast period, according to a March 2020 report by Grand View Research.

Covering just hardware, platforms and services, rival research house MarketsandMarkets projects the global edge computing market will grow from US$3.6bn in 2020 to US$15.7bn by 2025, at a CAGR of 34% during the forecast period.

Some key players in the edge computing market are ADLINK, Altran, Amazon Web Services (AWS), Axellio, Belden, Cisco Systems, Clearblade, Dell Technologies, Digi International, EdgeConneX, Edge Intelligence, Edgeworx, FogHorn Systems, GE Digital, Google, Hewlett Packard Enterprise, IBM, Intel, Juniper Networks, Litmus Automation, MachineShop, Microsoft, Moxa, Nokia, Sierra Wireless, SixSq, Vapor IO, VMware and VoltDB, among many others.

The main drivers

5G is expected to act as a big catalyst for market growth. Applications using 5G are expected to change traffic demand patterns, “enabling technology growth avenues” for the communications service providers (CSPs), said Grand View. The cloud market leaders see this as a threat and have started investing in the edge ecosystem themselves by engaging in partnerships with CSPs.

CSPs are expected to embrace new opportunities in the multi-access edge computing (MEC) marketplace, said Grand View. MEC allows providers to mitigate network congestion and ensure higher application performance by bringing processing tasks and running applications closer to the cellular customer. The implementation of MEC at mobile base stations or edge nodes is expected to facilitate the rapid and flexible deployment of new services and applications for customers, which “promises healthy market growth”, Grand View said.

Furthermore, there is an anticipated wave of micro edge data centre (EDC) capacity that differs from large centralised data centres. This new capacity is expected to range from small clusters of edge cloud resources located on street-lights to a few racks located in a shelter at the base of a cell tower or inside buildings.

In addition, 5G networks can use EDC facilities to provide efficient local data services, redirecting edge traffic away from the carrier networks to local public internet networks. Various start-ups, such as EdgeMicro, are in the process of deploying commercial mini data centres with IT computing stacks, redundant cooling, fire suppression and biometric security.

MarketsandMarkets warns, however, that the costs of moving to the edge can be significant. It says: “Edge computing might reduce data transmission and storage costs through localised processing, but investing in edge infrastructure still adds to the capex of companies, including heavy investment in edge nodes, other edge devices and edge data centres.”

Companies would also be required to spend more on making devices and the entire network secure. But, MarketsandMarkets added: “The edge infrastructure cost is a restraining factor, though, with advancement and continuous R&D, the cost of edge technology is expected to reduce soon.” So, despite some obstacles, there is plenty of interest in processing data at the edge. But why is the edge safer, overall, for receiving, processing and exchanging data, instead of relying on cloud data centres to do everything?

Edge safety and performance

Dheeraj Remella, the chief product officer at edge database and analytics provider VoltDB, says: “It is pure and simple. Getting closer to the event source decreases the time elapsed before the event data becomes stale. You can apply this to a variety of value extraction principles such as personalisation, operational automation, preventative maintenance and most importantly securing assets, processes and even employees and customers.”

“While the central cloud data centres offer larger infrastructure capacity, just the travel time for the data to get to the data centre robs the enterprise of the opportunity to respond to the event in a timely manner,” he adds. “You can observe this emphasis on getting close to the event source in efforts by CSPs partnering with cloud vendors for the deployment of edge data centres. And enterprise customers stand to benefit by bringing the intelligence even closer to their premises to support lower latency applications and deliver improved security.”
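
To make Remella’s point concrete, here is a minimal, vendor-neutral sketch – not VoltDB’s API; the thresholds, payload shape and function names are invented – of an edge node acting on an event inside a tight freshness budget instead of waiting on a cloud round trip:

```python
import time

STALENESS_BUDGET_MS = 10   # hypothetical: act within the first ten milliseconds
VIBRATION_LIMIT = 0.8      # hypothetical sensor threshold

def trip_local_actuator(asset_id):
    # Placeholder for a local control action; no cloud round trip needed
    print(f"shutting down {asset_id} before damage occurs")

def handle_event_at_edge(event):
    """Decide locally, then forward only a compact summary upstream."""
    age_ms = (time.time() - event["ts"]) * 1000
    if age_ms > STALENESS_BUDGET_MS:
        return "stale"                        # the event's value has already decayed
    if event["vibration"] > VIBRATION_LIMIT:
        trip_local_actuator(event["asset"])   # act immediately, at the edge
        return "acted"
    return "ok"

# A 50-150 ms round trip to a central cloud would blow the 10 ms budget
# before any decision could be made; locally the check takes microseconds.
print(handle_event_at_edge({"ts": time.time(), "vibration": 0.9, "asset": "pump-7"}))
```

The design point is that only the decision and a summary leave the site; the raw event stream never has to cross the carrier network at all.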

So is it just 5G and the Internet of Things (IoT) that have driven the edge data processing and analytics market? Or has this been an evolving migration driven by the needs of communication service providers and enterprises, and if so, what else do they want out of the edge?

“5G and IoT are accelerants to this awakening to the real-time needs of digital transformation,” Remella explains. “Customers already in the space of tapping into event-driven real-time decisions and automation have benefited from revenue increases and greater security. These organisations were pioneers because they saw the value of what is being ignored in the first ten milliseconds or less.”

“What 5G brings to the table, especially when combined with narrowband IoT (NB-IoT) and Cat-M, is the ability to do away with hops and interim aggregators to get to the network directly,” he says. “Gateway-less IoT is going to become mainstream and this will allow much richer intelligence near the edge, be it machine learning or event-driven real-time decisions.”
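
As a hypothetical illustration of that gateway-less pattern – the endpoint URL and payload here are invented, not a real service – a constrained device on NB-IoT or Cat-M posts its telemetry straight to an edge endpoint in a single hop, rather than via a local gateway or aggregator:

```python
import json
import urllib.request

def publish_direct(reading: dict) -> int:
    """One hop: device -> carrier network -> edge endpoint."""
    req = urllib.request.Request(
        "https://edge.example.net/telemetry",   # hypothetical edge endpoint
        data=json.dumps(reading).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

# Previously the same reading would hop device -> gateway -> broker -> cloud.
# publish_direct({"sensor": "meter-42", "kwh": 3.17})
```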

It’s time to get sassy

With more data moving to the edge, analyst house Gartner last December promoted the Secure Access Service Edge (SASE) framework. Pronounced “sassy”, SASE enables increasingly distributed and mobile workforces to remotely and securely access corporate networks and clouds. Interest in SASE has grown substantially due to Covid-19 as enterprises recognise its potential as a business continuity solution.

SASE combines network security functions – such as secure web gateway (SWG), cloud access security broker (CASB), firewall-as-a-service (FWaaS) and zero trust network access (ZTNA) – with software-defined wide area networking (SD-WAN) to support the dynamic secure access needs of organisations. These capabilities, says Gartner, are delivered primarily as a service, with the ability to identify sensitive data or malware and to decrypt content at line speed.
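
The zero trust element is easiest to see in miniature. The following schematic sketch – all names are hypothetical, not Gartner’s specification or any vendor’s API – shows the kind of per-request decision a ZTNA component makes, granting nothing by default:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_verified: bool    # identity checked by the ZTNA component
    device_trusted: bool   # posture check, e.g. patched and managed
    data_sensitive: bool   # CASB-style classification of the resource

def allow(req: AccessRequest) -> bool:
    """Zero trust: verify every request explicitly, assume nothing."""
    if not req.user_verified:
        return False   # no implicit trust from network location
    if req.data_sensitive and not req.device_trusted:
        return False   # untrusted devices are blocked from sensitive data
    return True

# Example: a verified user on an unmanaged laptop reaching sensitive data
print(allow(AccessRequest(user_verified=True, device_trusted=False,
                          data_sensitive=True)))   # False
```

In a real SASE stack this decision runs per request, close to the user, rather than at a central VPN concentrator.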

“Although SASE is relatively new, the ongoing pandemic has fostered the need for business continuity plans that include flexible, anywhere, anytime, secure remote access at scale, even from untrusted devices,” said Gartner analyst Joe Skorupa in August 2020. “Mobile workforce, contractor access and edge computing applications that are latency sensitive are three market opportunities. Over the last three months, SASE has been adopted by more than 40% of global remote workers.”

Many suppliers have already launched SASE-based products onto the market, with VMware one of the latest to do so in September 2020. The VMware SASE Platform converges cloud networking, cloud security and zero trust network access with web security, to deliver flexibility, agility and scalability for enterprises of all sizes, said the vendor. With the edge data processing/analytics market expanding as billions more things are added to the edge, we are inevitably going to see more edge data leakages and security mishaps hitting the headlines.

To mitigate matters, VoltDB’s Remella says edge intelligence is going to be a central theme in data-driven enterprises. He says real-time dashboards at operations centres will give way to automated processes that decrease the burden of manual intervention and resolution.

“Machine learning is going to slowly start moving elements from the central data centre to the edge data centre,” he explains. “What if we can do away with all the unnecessary raw data and instead digest it locally near the edge? And the central data centres are intelligence aggregation points where only the learnings from the edges need to be mixed together to create higher level models, which can then be sent back to individual edge centres to incorporate into local decisions?”
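
What Remella describes resembles the federated learning pattern: each site digests its own raw data into model parameters, and only those parameters travel to the centre to be mixed into a higher-level model. A toy sketch, with a deliberately trivial two-number “model” standing in for real machine learning:

```python
def train_locally(raw_events):
    """Each edge site digests its own raw data into a tiny model."""
    mean = sum(raw_events) / len(raw_events)
    spread = max(raw_events) - min(raw_events)
    return {"mean": mean, "spread": spread}   # only this leaves the site

def aggregate(edge_models):
    """The centre mixes learnings, not data (federated-style averaging)."""
    n = len(edge_models)
    return {k: sum(m[k] for m in edge_models) / n for k in edge_models[0]}

site_a = train_locally([0.61, 0.64, 0.59])   # raw data stays at site A
site_b = train_locally([0.72, 0.70, 0.75])   # raw data stays at site B
global_model = aggregate([site_a, site_b])   # pushed back to each edge
print(global_model)
```

The raw event streams never leave their sites; only a handful of parameters cross the network, which is exactly the reduction in stored and shipped data the next paragraph describes.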

This would reduce the amount of data being stored in cloud data centres, reducing energy consumption, with organisations only having to store human transactional data covering financial or medical or insurance matters, for instance. “I still encounter enterprises talking about storing several petabytes of data and the need for massive high-performance compute clusters to churn through that data,” says Remella. “People might call me utopian, but I strongly feel we can get there.”
