The role of containment in mission-critical edge deployments

Today, edge data centers need to provide a highly efficient, resilient, dynamic, scalable and sustainable environment for critical IT applications. At Subzero Engineering, we believe containment has a vital role to play in addressing these requirements.

By Gordon Johnson, Senior CFD Engineer at Subzero Engineering

In recent years, edge computing has become one of the most prevalent topics of discussion within our industry. In many respects, the main purpose of edge data centers is to reduce latency in transmitting data and to host critical IT applications securely. In other words, edge data centers store and process data and services as close to the end user as possible.

Edge is a term that’s also become synonymous with some of the world’s most cutting-edge technologies. Autonomous vehicles are often cited as one of the truest examples of the edge in action, where anything less than near real-time data processing and ultra-low latency could have fatal consequences for the user. There are also many mission-critical scenarios, including in retail, logistics and healthcare, where a typically high-density computing environment, with a high kW/rack load packed into a relatively small footprint, is housed at the edge.

Drivers at the edge  

According to Gartner, the number of internet-capable devices worldwide passed 20 billion in 2020 and is expected to double by 2025. It is also estimated that approximately 463 exabytes of data will be generated each day by 2025. With 1 exabyte equivalent to 1 billion gigabytes, a single exabyte holds the same volume of data as roughly 212,765,957 DVDs.
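As a quick sanity check of the figures above, here is a short sketch. The 4.7 GB single-layer DVD capacity is our assumption, not stated in the article:

```python
# Sanity-checking the data-volume comparison above.
# Assumption: a single-layer DVD holds 4.7 GB (not stated in the article).
EXABYTE_IN_GB = 1_000_000_000   # 1 EB = 1 billion GB
DVD_CAPACITY_GB = 4.7

dvds_per_exabyte = EXABYTE_IN_GB / DVD_CAPACITY_GB
print(f"{dvds_per_exabyte:,.0f} DVDs per exabyte")   # 212,765,957 DVDs per exabyte
print(f"{463 * dvds_per_exabyte:,.0f} DVDs for 463 EB")
```

At 463 exabytes per day, the daily total works out to nearly 100 billion DVDs' worth of data.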

While the Internet of Things (IoT) was the initial driver of edge computing, especially for smart devices, these examples have since been joined by content delivery networks, video streaming and remote monitoring services, with augmented and virtual reality software expected to be another key use case. What’s more, transformational 5G connectivity has yet to have its predicted major impact on the edge.

Clearly, there are significant benefits in decentralizing computing power away from a traditional data center and moving it closer to the point where data is generated and/or consumed. Right now, edge computing is still evolving, but one thing we can say with certainty is that the demand for local, near real-time computing represents a major shift in the types of services edge data centers will need to provide.

Efficiency and optimization remain key

An optimized edge data center environment must meet a long list of criteria. The first is reliability, as edge facilities are often remote and have no on-site maintenance capabilities. Secondly, they require modularity and scalability: the ability to grow with demand. Thirdly, there’s the lack of a ‘true’ definition of the edge. Customers still need to define the edge in the context of their own business requirements, deploying infrastructure in line with business demands, which can of course affect the design of their environment. And finally, speed of installation. For many end-users, time to market is critical, so an edge data center often needs to be built and delivered on-site in a matter of weeks.

There is, however, one more important factor to consider. An edge data center should offer true flexibility, allowing the user to quickly adapt or capitalize on new business opportunities while offering sustainable and energy efficient performance.

Edge data centers are, in many respects, no different from traditional facilities when it comes to the twin imperatives of efficiency and sustainability. PUE (Power Usage Effectiveness) as a measure of energy efficiency applies to the edge as much as to large, centralized facilities.
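For readers less familiar with the metric, PUE is simply total facility energy divided by the energy delivered to the IT equipment. A minimal sketch follows; the function and variable names are illustrative, not an industry API:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy.

    A value of 1.0 would mean every watt reaches the IT equipment;
    cooling and power-distribution overhead push the ratio above 1.0.
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: an edge site drawing 130 kWh in total
# to deliver 100 kWh to its IT load.
print(pue(130.0, 100.0))  # 1.3
```

The lower the ratio, the less energy is spent on overhead such as cooling, which is exactly where containment helps.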

And sustainability, especially the drive towards Net Zero, is a major focus for the sector as a whole. However, what will change over time is the edge’s share of the industry’s energy consumption. By 2040, it’s predicted that 80% of total data center energy consumption will come from edge data centers, which begs an obvious question: what will make the edge energy efficient, environmentally responsible, reliable and sustainable all at the same time?

The role of containment

Containment is almost certainly the easiest way to increase efficiency in the data center. It also makes a data center environmentally conscious because, instead of consuming energy, containment saves it. This is especially true at the edge. 

Containment helps users get the most out of an edge deployment because it prevents cold supply air from mixing with hot exhaust air. This allows supply temperatures at the server inlets to be raised.

Since today’s servers are recommended to operate at inlet temperatures as high as 80.6 degrees Fahrenheit (27 degrees Celsius), containment allows for higher supply temperatures, less overall cooling, lower fan speeds, increased use of free cooling and reduced water consumption, all important factors when it comes to improving efficiency and reducing carbon footprint at the edge.
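The Fahrenheit/Celsius equivalence quoted above is easy to verify with a one-line conversion (a trivial sketch):

```python
def c_to_f(celsius: float) -> float:
    """Convert a temperature from Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

# The 27 C recommended server inlet ceiling, in Fahrenheit.
print(round(c_to_f(27), 1))  # 80.6
```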

Further, a contained solution consumes less power than an uncontained one, making for an environmentally friendly, cost-effective environment. It also improves reliability, delivering a longer Mean Time Between Failures (MTBF) for the IT equipment, as well as a lower PUE.

Uncertainty demands flexibility

At Subzero we believe an edge data center needs to be flexible and both quick and easy to install. It needs to be right-sized for the here and now, but capable of incremental, scalable growth. Further, it should allow the customer to specify the key components, such as the IT, storage, power and cooling solutions, without constraining them by size or vendor selection.

Thankfully, there are edge data center providers who now offer an enclosure that can be built on-site in a matter of days, with ground-supported or ceiling-hung infrastructure to support ladder racks, cable trays, racks and cooling equipment.

These architectures mean the customer can choose their own power and cooling systems; once the IT stack is on-site and the power is connected, the data center can be up and running in a matter of days.

Back in 2018, Gartner predicted that, by 2023, three-quarters of all enterprise-generated data would be created and processed outside a traditional, centralized data center. As more and more applications move from large, centralized data centers to small edge environments, we anticipate that only a flexible, containerized architecture will offer end-users the perfect balance of efficiency, sustainability and performance.  

The latest Subzero White Paper by Gordon Johnson – Making the Edge Efficient, Scalable, and Sustainable can be found here.
