When we look back in thirty, forty, fifty years, what will be the abiding image of this global pandemic? Undoubtedly, the face mask will serve as a symbol of its human impact on each and every one of us. But I would like to proffer another likely mental keepsake from this bewildering period: the statistical graph.
In the early stages of the pandemic, as government scientists provided information on the scale of the UK epidemic at their daily briefings, those graphs and the words “next slide, please” became a sombre drumbeat of our everyday lives. Scientific concepts like the R number, logarithmic scales and asymptomatic transmission – concepts that would have bamboozled most Brits at the Christmas dinner table last year – have now entered the daily lexicon.
The sudden intrusion of scientific data into our lives is significant, I believe, precisely because it marks an inflection point in our societies. Economic forecasters at the Treasury, the OBR and the IFS have long produced data modelling that feeds into the way our government operates and the decisions taken at the top. However, the extent to which data modelling has become a central part of our government’s decision-making and the national conversation during this pandemic is something new – and it’s here to stay.
Arguably the most consequential piece of data modelling in the history of this country was published in March. The Imperial College London study, authored by Prof. Neil Ferguson and many other leading epidemiologists, warned the UK Government of half a million deaths unless full lockdown measures were implemented. That study has been widely cited as the trigger point for the government’s sudden shift to a strategy – previously considered unthinkable in the Western world – of actively suppressing the spread of a virus by means of a nationwide lockdown, entered into on 23 March and lasting three months.
The impact of the Imperial study merits a wider look at the role that data modelling has played in government responses to the pandemic and what role it will play in future.
Data modelling is an imperfect science, by definition. From my time working closely with data and analytical models in the world of business, I know that they are curious beasts: they produce estimates of likely outcomes that are only as good as the data and assumptions fed into them.
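To make that concrete, here is a minimal sketch of the kind of compartmental model epidemiologists work with – a toy SIR model written in Python purely for illustration. The structure is the textbook one, but every parameter value below is an assumption chosen for the example, not a figure from any real Covid-19 model.

```python
# A toy SIR (Susceptible-Infected-Recovered) model, sketched purely for
# illustration. Parameter values are assumptions, not real Covid-19 figures.

def sir_projection(population, initial_infected, beta, gamma, days):
    """Project daily concurrent infections using simple one-day time steps.

    beta  -- infectious contacts per person per day (assumed input)
    gamma -- daily recovery rate, i.e. 1 / infectious period (assumed input)
    """
    s = population - initial_infected  # susceptible
    i = float(initial_infected)        # currently infected
    r = 0.0                            # recovered
    curve = []
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        curve.append(i)
    return curve

# Illustrative run: the output is only as trustworthy as these inputs.
curve = sir_projection(population=66_000_000, initial_infected=100,
                       beta=0.3, gamma=0.1, days=300)
print(f"Projected peak of concurrent infections: {max(curve):,.0f}")
```

The point is not the specific number this prints, but that every figure a model produces is downstream of the inputs and assumptions chosen for it.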
In the era of Covid, particularly in the early weeks of the pandemic, when data on the spread of the virus was limited and unreliable, modelling became more of a speculative art. Developers had to build models practically from scratch to track the novel virus, with deep uncertainties surrounding how, and how quickly, it was transmitted.
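That speculative quality is easy to demonstrate. Re-running the toy model above across a range of plausible early estimates of the reproduction number – and these values are illustrative assumptions, not the estimates any modelling group actually used – produces wildly different epidemic curves from the same starting point:

```python
# Sensitivity sketch: the same toy model, re-run across a range of
# plausible reproduction numbers. All values are illustrative assumptions.

GAMMA = 0.1  # assumed recovery rate (a 10-day infectious period)

for r0 in (1.5, 2.0, 2.5, 3.0):
    curve = sir_projection(population=66_000_000, initial_infected=100,
                           beta=r0 * GAMMA, gamma=GAMMA, days=300)
    print(f"R0 = {r0}: projected peak of {max(curve):,.0f} concurrent infections")
```

A modest shift in a single uncertain parameter reshapes the entire projection several-fold, which is why early outputs had to be read as scenarios rather than forecasts.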
Interpreting those models to make policy decisions is not clear-cut. Government officials, aided by the scientific experts sitting on the SAGE committee, have all been on a learning curve, weighing innumerable data points to make frankly impossible decisions – they should be applauded for their tireless work. The £459m the government has spent on consultancy fees – outsourcing aspects of the pandemic response, including data modelling, and upgrading its technological capabilities – is a sign of the overwhelming administrative challenge this crisis represents.
But one positive we can take from the turmoil of recent months is that we will emerge from it with more data-literate government structures than ever before. Data modelling has the potential to greatly improve policy outcomes. As we enter a 5G-enabled world, public policy makers will have unprecedented volumes of data at their fingertips. Knowing how to utilise that data will help decision-makers identify which issues should be public policy priorities and which courses of action will be most effective.
Also important is wider public awareness and understanding of data modelling and its role in policymaking. We have already seen push-back, in some quarters, against speculative models being used as the basis of policy decisions, particularly in the case of draconian lockdown measures with enormous social and economic consequences. This exemplifies the need for greater democratic oversight of government use of data analytics and modelling. As algorithms become more complex and hold greater sway over our governance, it is vital that they are scrutinised, explained and held accountable.
Data modelling will become a major resource for tackling crises of all kinds in the years to come, and may well form the basis of future paradigm-shifting policy decisions like the one we saw in March. The use of data modelling to predict the catastrophic impacts of climate change is one likely example. To make effective decisions and gain the requisite political buy-in from all stakeholders, the lessons of Covid-19 and the new-found prominence of data in our public life will be invaluable. While the data must be handled with care, its manifold benefits should also be celebrated.