Blog by Sarah Cumbers, Chief Executive, Royal Statistical Society
One of the clearest lessons from the Covid-19 pandemic is that the UK’s statistics system was too often forced to improvise – like most of government, it wasn’t ready to deal with the shock.
When I gave evidence to the UK Covid-19 Inquiry, I spoke about the scale of the effort made by statisticians during the crisis. New analysis was produced at speed, methods were adapted, data-sharing agreements were struck, and urgent questions from policymakers were answered under immense pressure. That work was vital. But it also exposed weaknesses in the system. Too much depended on informal arrangements, goodwill and rapid fixes, rather than on a system designed to adapt and operate under strain.
Preparedness does not mean predicting the next crisis in detail. It means thinking ahead to the types of questions governments and others will need to ask, the decisions they will need to make, what evidence will be required, and whether data collection systems are in place and sufficiently resilient to support those needs. During Covid, those foundations were often missing. As a result, time and energy were spent overcoming barriers that should not have existed.
The pandemic also highlighted the risks of an overly narrow approach to statistics. In normal times, it can appear efficient to focus on a small number of headline economic measures. During Covid, policy priorities shifted quickly. Questions about jobs, household finances, prices and living standards evolved from week to week. A system with limited breadth struggles to keep up. Breadth is what allows statistics to remain useful when circumstances change.
This was particularly clear in discussions about inflation and the cost of living. Headline inflation measures play an important role, but they do not capture how price changes vary across different households. Evidence to the Inquiry underlined the importance of developing the Household Costs Indices (HCIs) to sit alongside existing measures, so that policy decisions are informed by a more granular understanding of how rising costs affect different households. Equally important, this enables people to see their own experience reflected in the data used to explain policy decisions, fostering understanding and trust.
Transparency was another consistent theme. Many decisions during the pandemic had to be taken with incomplete information, and that was unavoidable. What matters is that the evidence behind those decisions is set out clearly and published wherever possible. To earn public trust, government must demonstrate trustworthiness: openness cultivates trust and helps people understand the choices made, even when the choices themselves are difficult.
The same is true of uncertainty. Being clear about what is known, what is uncertain and where judgments have been made does not weaken confidence. It supports better decision-making and more informed public debate, particularly in a crisis.
Taken together, the evidence heard by the Inquiry points to a straightforward conclusion. A resilient statistical system is one that is prepared in advance, broad enough to respond to changing needs, transparent in its use of evidence and open and clear about uncertainty. These are practical requirements, not abstract ideals.
Covid will not be the last national emergency the UK faces. The decisions taken now about data sharing, statistical capacity and engagement will shape how well we respond when the next one comes.