Building the future, facing the neighbours: gigawatt data centres in 2026 

As AI leads to bigger data centres, power grids and public perceptions may shape what’s actually possible.   By Tate Cantrell, Verne CTO.

The industry in 2026 will need to prepare for hyper-dense, gigawatt-scale data centers, and that preparation will involve far more than infrastructure design. AI’s exploding computational demand is pushing designers to deliver ever-denser facilities that consume growing volumes of power and challenge conventional cooling. 

 

The growth of hyperscale campuses risks colliding with a public increasingly aware of power and water consumption. If that happens, a gap may open between what designers can achieve with the latest technology and what communities are willing to accept.  

  

A growing public awareness of data centers  

   

The sector has entered an era of scale that would have seemed implausible a few years ago. Internet giants are investing billions of dollars in facilities that redefine large-scale and are reshaping the market. Gigawatt-class sites are being built to train and deploy AI models for the next generation of online services. But their impact extends beyond the data center industry: the communities hosting these “AI factories” are being transformed, too.   

   

This is leading to engineered landscapes: industrial campuses spanning hundreds of acres, integrating data halls with power distribution systems and cooling infrastructure. As these sites become more visible, public awareness of the resources they consume is growing. The data center has become a local landmark - and it’s under scrutiny.  

  

Power versus perception

   

Power is one area receiving attention. Data center growth is coinciding with the perception that hyperscale operators are competing for grid capacity or diverting renewable power that might otherwise support local decarbonisation. There is no shortage of coverage suggesting data centers are pushing up energy prices, too. These perceptions have already had consequences. In the UK, a proposed 90-megawatt facility near London was challenged in 2025 by campaigners warning that residents and businesses would be forced to compete for electricity with what one campaign group leader called a “power-guzzling behemoth”. In Belgium, grid operator Elia may limit the power allocated to operators to protect other industrial users.  

   

It would not be surprising to see this reaction continue in 2026, despite the steps operators across the industry are taking to maximise power efficiency and sustainability.  

   

Cool misunderstandings  

   

Water has become another focal point.

Training and inference workloads rely on concentrated clusters of GPUs, with rack densities now exceeding 100 kW. The heat produced in such a dense space exceeds the capabilities of air-based cooling, driving the move to more efficient liquid systems. Yet “liquid cooling” is often read by the public as “water cooling”, feeding a perception that data centers are draining natural water sources to cool servers.   
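A back-of-envelope heat-transfer calculation illustrates why air cooling runs out of road at these densities. The figures below (temperature rise, air properties) are illustrative assumptions for a sketch, not vendor specifications:

```python
# Why a 100 kW rack overwhelms air cooling: Q = rho * V * cp * dT.
# All constants are illustrative assumptions, not vendor specs.
AIR_DENSITY = 1.2   # kg/m^3, air at roughly 20 C
AIR_CP = 1005.0     # J/(kg*K), specific heat capacity of air
DELTA_T = 15.0      # K, assumed inlet-to-outlet temperature rise

def airflow_m3_per_s(heat_w: float, delta_t: float = DELTA_T) -> float:
    """Volumetric airflow needed to carry away heat_w watts of heat."""
    return heat_w / (AIR_DENSITY * AIR_CP * delta_t)

rack_kw = 100
flow = airflow_m3_per_s(rack_kw * 1000)
cfm = flow * 2118.88  # convert m^3/s to cubic feet per minute
print(f"{rack_kw} kW rack needs ~{flow:.1f} m^3/s (~{cfm:,.0f} CFM) of air")
```

On these assumptions a single 100 kW rack needs on the order of 5 to 6 cubic metres of air per second, an airflow that is impractical to deliver through one rack footprint, which is why liquid, with a volumetric heat capacity thousands of times higher than air, takes over.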

  

In practice this is rarely the case. Data centers of the past relied heavily on evaporative cooling towers to deliver a lower Power Usage Effectiveness (PUE), but today there is a strong and consistent trend towards lower Water Usage Effectiveness (WUE) through smarter cooling and sustainable design. Advances in technology are making water-free cooling possible, too, with half of England’s data centers using waterless cooling. Many operators use non-water coolants and closed-loop systems that conserve resources.  
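The two metrics mentioned above are simple ratios: PUE divides total facility energy by IT energy, and WUE divides annual site water use by IT energy. A minimal sketch, using made-up annual figures for two hypothetical sites (not real facilities), shows the trade-off the paragraph describes, evaporative designs buy electrical efficiency with water, while closed-loop designs use none:

```python
# PUE and WUE as simple annual ratios (simplified; real reporting
# follows The Green Grid's definitions in more detail).
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy."""
    return total_facility_kwh / it_kwh

def wue(water_liters: float, it_kwh: float) -> float:
    """Water Usage Effectiveness: site water use (L) / IT energy (kWh)."""
    return water_liters / it_kwh

# Hypothetical annual figures for two illustrative designs.
evaporative = {"total_kwh": 12_000_000, "it_kwh": 10_000_000, "water_l": 18_000_000}
closed_loop = {"total_kwh": 13_000_000, "it_kwh": 10_000_000, "water_l": 0}

for name, site in [("evaporative", evaporative), ("closed-loop", closed_loop)]:
    print(f"{name}: PUE={pue(site['total_kwh'], site['it_kwh']):.2f}, "
          f"WUE={wue(site['water_l'], site['it_kwh']):.2f} L/kWh")
```

The point of the sketch is that a site can accept a slightly higher PUE in exchange for a WUE of zero, which is exactly the direction of travel the industry trend describes.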

  

Data centers as part of the community  

   

Addressing public concerns will require a change in how operators think about their place in communities. Once built, a data center becomes part of the local fabric, and the company behind it becomes a neighbour. Developers need to view that relationship as more than transactional. They must demonstrate that growth is supported by resilient grids capable of meeting new demand without destabilising supply or driving up costs.  

 

Water and power are essential resources, so public concern is understandable. It’s therefore important that operators show that density and efficiency can be achieved without disproportionate environmental impact. The continued rollout of AI-ready data centers will depend as much on social alignment as on advances in chip performance.  

   

That alignment will be tested in 2026 and beyond as another wave of high-density deployments arrives. Based on NVIDIA’s product roadmap, we already have a sense of what’s coming: each generation of hardware delivers more power and heat, requiring more advanced infrastructure. NVIDIA’s chief executive Jensen Huang introduced the DSX data center architecture at GTC 2025 in Washington DC, a framework designed to make it easier for developers with limited experience to deploy large-scale, AI-ready facilities. In effect, it offers a global blueprint for gigawatt-scale “AI factories.”

   

A positive outcome of this will be a stronger push towards supply chain standardisation. Companies such as Vertiv, Schneider Electric and Eaton are aligning around modular power and cooling systems that are easily integrated into these architectures. NVIDIA, AMD and Qualcomm, meanwhile, have every incentive to encourage that standardisation: the faster infrastructure can be deployed, the faster their chips can deliver the required compute capacity. Standardisation, then, becomes a commercial and operational imperative, but it also reinforces the need for transparency and shared responsibility.  

  

Efficiency and expansion  

   

Behind all of this lies the computational driver: the transformer model. These AI architectures process and generate language, code and other complex data at scale, and they are the foundation of today’s generative AI. They are, however, enormously power-hungry, and even though it’s reasonable to expect a few DeepSeek-type breakthroughs in 2026 (discoveries that achieve similar performance with far less energy thanks to advances in algorithms, hardware and networking), we shouldn’t expect demand for power to drop.

   

The technical roadmap for 2026 is clear: greater density, wider uptake of liquid cooling and further standardisation. Even with data centers running as efficiently and sustainably as possible, developers and operators will need to earn the trust of local stakeholders for the resources required to build and power the AI factories that will drive a new era of industrial innovation.