The rise of edge-enabled IoT

As the digital landscape grows ever more complex, companies across all industries must keep up with the constant shifts in how data is created and utilised. According to a study from the International Data Corporation (IDC), 45 percent of all data created by IoT devices will be stored, processed, analysed and acted upon close to or at the edge of a network by 2020.

By Alan Conboy, Office of the CTO, Scale Computing.


Traditional data centres are no longer the default destination for the data organisations generate. Instead, in an increasingly data-driven world, organisations are looking to edge computing. Edge devices place physical computing infrastructure at the edges of the network, where the data is being generated, and in many cases that is exactly where the data is most needed.


The ability of edge infrastructure to collect, process and reduce enormous quantities of data before sending it on to a centralised data centre or the cloud makes it ideally suited to the IoT landscape. With only a small hardware footprint, edge computing acts as a high-performance bridge from local compute to private and public clouds.
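To make that collect-process-reduce pattern concrete, the sketch below shows how an edge node might summarise a window of local sensor readings and forward only the summary to a central endpoint. It is a minimal illustration, not any vendor's implementation: the sensor source, the one-minute window and the CLOUD_ENDPOINT URL are hypothetical placeholders.

```python
import json
import random
import statistics
import time
import urllib.request

# Hypothetical cloud ingest endpoint and aggregation window (placeholders).
CLOUD_ENDPOINT = "https://example.com/ingest"
WINDOW_SECONDS = 60

def read_sensor() -> float:
    """Stand-in for a local IoT sensor reading, e.g. a temperature probe."""
    return 20.0 + random.random() * 5.0

def collect_window(seconds: int) -> list:
    """Collect raw readings locally for one aggregation window."""
    readings = []
    deadline = time.time() + seconds
    while time.time() < deadline:
        readings.append(read_sensor())
        time.sleep(1)
    return readings

def reduce_window(readings: list) -> dict:
    """Reduce many raw readings to a small summary before data leaves the site."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": statistics.mean(readings),
    }

def forward_to_cloud(summary: dict) -> None:
    """Send only the reduced summary on to the central data centre or cloud."""
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request, timeout=10)

if __name__ == "__main__":
    while True:
        forward_to_cloud(reduce_window(collect_window(WINDOW_SECONDS)))
```

The point of the reduction step is that only a handful of summary values leave the site for each window of raw readings, which keeps the bandwidth and latency demands on the link to the cloud manageable.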

Built to stand the test of time

It is commonly thought that IoT will need edge computing in order to succeed in the long term, as the inherent latency of the cloud no longer cuts it. Edge computing solves this problem, mitigating the latency associated with the cloud and ensuring that the latest IoT developments are available to businesses of any size, across every industry.

Organisations with remote sites in particular, such as those in the industrial, finance and retail sectors, or with remote office/branch office (ROBO) locations, will reap the benefits of edge computing. In retail, for example, businesses need reliable computing that can provide maximum uptime for point-of-sale, inventory management and security applications across the numerous store locations on the edges of their networks. Banks and other financial institutions with multiple branch offices also require reliable computing to support rapid, business-critical transactions.

The global network of IoT devices generates a vast amount of data, and edge computing is well-positioned as the key technology to process it efficiently. The speed and effectiveness that edge computing provides become even more important when the connection to the cloud may not be reliable or fast enough.

In ROBO deployments, the infrastructure on which small branch locations increasingly run core, mission-critical applications must evolve to match the critical nature of those workloads.

Edge computing sites usually have specific needs and require notably smaller deployments than the primary data centre. Many organisations may have dozens or hundreds of smaller edge computing sites, and they cannot afford to roll out complex, expensive IT infrastructure to each one.

Making the edge work 

Organisations are increasingly running their critical applications at the edge and, because of this, the requirements are similar to those of a data centre. Security, resiliency, scalability, high availability and skilled IT resources are all easily achieved in a data centre environment, but how can businesses address the growing gap between the importance of the applications at the edge and the infrastructure and IT support behind them?

The answer is that edge computing systems need to be at least as reliable, affordable, self-healing, high-performance, efficient, and easy to deploy and use, in order to support critical applications with little or no on-site IT staff. In many cases, keeping applications running without dedicated IT staff onsite requires automation that eliminates the mundane manual tasks where human error can cause problems.

What the edge should look like

With automation in place, systems stay up: the automation monitors for complex failure conditions and corrects them with real-time action. This mitigates the downtime that would otherwise take a system offline and require an IT staffer to come onsite to bring it back. Even when hardware components fail, automation can shift application workloads onto redundant hardware so operations continue.
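As a rough illustration of that kind of automation, the sketch below watches an active node and restarts a workload on a standby node when a health check fails. It is a minimal, assumed example: the node names, the ping-based health check and the fail_over placeholder are hypothetical, and a real platform would use its virtualisation or cluster APIs instead.

```python
import subprocess
import time

# Hypothetical two-node edge cluster; names and workloads are placeholders.
PRIMARY_NODE = "edge-node-1"
STANDBY_NODE = "edge-node-2"
CHECK_INTERVAL = 30  # seconds between health checks

def node_is_healthy(node: str) -> bool:
    """Very simple health check (one ping); real systems also check disks, services, etc."""
    result = subprocess.run(
        ["ping", "-c", "1", node],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def fail_over(workload: str, from_node: str, to_node: str) -> None:
    """Placeholder for restarting a workload on redundant hardware."""
    print(f"{from_node} unhealthy: restarting {workload} on {to_node}")
    # A real platform would call its virtualisation or cluster API here.

def monitor(workload: str) -> None:
    """Watch the active node and shift the workload if its health check fails."""
    active, standby = PRIMARY_NODE, STANDBY_NODE
    while True:
        if not node_is_healthy(active):
            fail_over(workload, active, standby)
            active, standby = standby, active
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    monitor("point-of-sale-vm")
```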

Organisations with many sites cannot afford to spend weeks deploying complex hardware at each one, so edge computing infrastructure needs to be easy to deploy and manage. IT teams need to be able to plug in the infrastructure, bring systems online and manage the sites remotely from then on. The more complex the infrastructure, the more time is spent deploying and managing it.

Edge computing systems need to be self-healing, with automated error detection, mitigation and correction, to provide high availability for applications without drawing on IT staff resources. They should be easy to deploy and simple to manage, and they should scale up and down as requirements change. Ultimately, edge computing systems should go a long way towards ensuring that a business is never burdened with excessive overhead for resources it does not need.

 
