Exascale computing could supercharge crisis response capability

A team of UK scientists are hoping to show how an advanced form of computer modelling could help supercharge how governments deal with major issues like pandemics – if it can be used on the world’s latest and most powerful so-called “exascale” computers.


Agent-based modelling can simulate how societies would react to events such as pandemics, wars and energy supply problems. But it requires huge computational power to run and can take months to produce results, which could be too late to influence decisions to avert or mitigate crises.
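To give a sense of what such a simulation involves, the sketch below is a minimal, purely illustrative agent-based epidemic model in Python; the rules, parameter values and population size are assumptions chosen for demonstration, not the researchers' actual models. Each simulated individual mixes with a few randomly chosen others per day, infection spreads through those contacts, and infected agents recover after a fixed period. Policy-scale models involve far more agents with far richer behaviour, which is where the computational cost described here comes from.

```python
# Illustrative sketch only, not the Hutton project's models: a minimal
# agent-based epidemic simulation. Each agent mixes with a handful of
# randomly chosen others per day, and infection spreads through contacts.
import random

SUSCEPTIBLE, INFECTED, RECOVERED = range(3)


class Agent:
    def __init__(self):
        self.state = SUSCEPTIBLE
        self.days_infected = 0


def step(population, contacts_per_day=8, p_transmit=0.05, recovery_days=10):
    """Advance the simulation by one day."""
    currently_infected = [a for a in population if a.state == INFECTED]
    for agent in currently_infected:
        # Each infected agent meets a few others drawn at random.
        for other in random.sample(population, contacts_per_day):
            if other.state == SUSCEPTIBLE and random.random() < p_transmit:
                other.state = INFECTED
        agent.days_infected += 1
        if agent.days_infected >= recovery_days:
            agent.state = RECOVERED


if __name__ == "__main__":
    random.seed(1)
    population = [Agent() for _ in range(10_000)]
    for patient_zero in population[:5]:  # seed a few initial cases
        patient_zero.state = INFECTED
    for day in range(1, 121):
        step(population)
        infected = sum(a.state == INFECTED for a in population)
        print(f"day {day:3d}: {infected} infected")
```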

Now, researchers at The James Hutton Institute in Aberdeen, together with the University of Glasgow (Urban Big Data Centre) and University College London (The Bartlett Centre for Advanced Spatial Analysis), are assessing whether these models can be run on so-called exascale computers, which could potentially do the number crunching in just minutes.

The work is being conducted through a project funded by the Engineering & Physical Sciences Research Council.

Exascale computers can perform a billion billion (10^18) calculations per second. They have only been available since 2022 and, to date, have mostly been used for hugely complex problems such as modelling black holes, developing new cancer medicines and forecasting the weather.

Gary Polhill, the lead researcher at the Hutton, an independent research organisation, says, “Some of the social science challenges we’re dealing with computationally in agent-based models can be as complex as those faced by scientists working on astrophysical, medical, or meteorological problems.

“So access to this type and speed of computing for us, where we’re dealing with questions around human behaviour and societal responses to things like regulations around pandemics or supply chain issues, could be transformative.”

Agent-based modelling has grown in use since the 1990s, when it was first applied to problems such as pollution and disease. It reached a watershed during the Covid-19 pandemic, when countries such as Australia used these simulations to model how different policy decisions would play out. However, those simulations ran on conventional high-performance computing platforms and still took weeks or months to produce results.

“Exascale computing could give us answers in minutes,” adds Polhill, who has 25 years of experience working on agent-based models. “It would be transformative for the social sciences, giving governments much faster ability to respond to live, complex events, such as supply chain issues in the wake of droughts or heat waves, or even war or energy supply disruption, in a relevant timeframe.

“But it also raises interesting questions around the use of computing in decision making. The project will explore how the potential of exascale computing can be unlocked for the agent-based modelling community.”
