This event, which took place on 17 November 2020, was developed and delivered by the Newton Gateway to Mathematics working with Professor
Jane Hutton (University of Warwick) and in collaboration with the Royal Statistical Society. It took place virtually and was attended by just over 60 delegates.
The reliability and accuracy of reporting are key to ensuring confidence in the financial sector, yet there is limited regulation of data quality and of the software used, a lack of consistency, and no agreed documentation of modelling. This is in contrast to medicines and medical devices, where auditable processes are in place, with the requirement to demonstrate the source of the data used and the need to document the workings of the software. The workshop provided an overview of the different expectations of regulatory bodies, and an opportunity to discuss best practice and useful resources.
Deborah Ashby, President of the Royal Statistical Society and professor at Imperial College London, began the day with a brief history of the regulation of medicines. Thirty years ago, there was limited statistical and mathematical input to UK medicines regulation; now many expert mathematical scientists collaborate in regulation. Excellence requires different disciplines to work cooperatively. There is extensive information on what constitutes good planning, data collection and analysis to assess the safety and efficacy of medicines. This helps to ensure quality, though it comes with increased costs and administration. New entrants can easily find what is required, but should not replace common sense with rule-following. Experienced statisticians are still essential, to provide the rationale for regulations, an understanding of underlying principles and critical discussion when new situations arise. The MHRA is consulting on how real-world evidence can be used in trials to support regulatory decisions. Collaboration to improve the quality and accessibility of data continues through non-profit consortia, including the Equator Network. Deborah also referred to the Clinical Data Interchange Standards Consortium (CDISC). She finished by highlighting some of the risks and benefits of regulation.
Sheila Bird (MRC Biostatistics Unit and University of Edinburgh) outlined the role of the National Institute for Health and Care Excellence (NICE) in evaluating the cost effectiveness of different medical procedures. The members of the NICE Appraisal Committees are committed to providing the reasons for their decisions, and the leadership of NICE's first chairman, Professor Sir Michael Rawlins, ensured adherence to sound scientific principles. NICE reports are used internationally and the institution is respected for its transparency. In contrast to some areas of regulation, submissions must include an executable, fully documented version of any software used to model costs and benefits, or to generate scenarios. This allows NICE personnel to check assumptions and assess how sensitive results are to changes in parameter values. In turn, NICE prefers to have the permission of its academic assessment teams to provide an executable version of their health economics modelling when it is requested. Under the title 'Good law risks bad science', Sheila explained why this open and rational approach was not endorsed in a Royal Courts of Justice judgement. She finished her talk by speaking about the antigen mass screening of asymptomatic adults in Liverpool for COVID-19, for which there is, as yet, no published protocol, contrary to established good practice.
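To illustrate the kind of check that an executable, documented model makes possible, the sketch below varies the inputs of a toy cost-effectiveness calculation and reports how the incremental cost-effectiveness ratio (ICER) responds. The model structure, the parameter names and all the figures are hypothetical, chosen only to show a one-way sensitivity analysis; they do not come from any NICE appraisal.

```python
# Illustrative only: a toy cost-effectiveness calculation with hypothetical
# parameters, sketching the kind of sensitivity check that an executable,
# documented model allows a reviewer to run.

def icer(incremental_cost, incremental_qalys):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return incremental_cost / incremental_qalys

# Hypothetical baseline inputs.
base = {"incremental_cost": 8_000.0, "incremental_qalys": 0.40}
print(f"baseline ICER: £{icer(**base):,.0f} per QALY")

# One-way sensitivity analysis: vary each input by +/-20% and recompute the ICER.
for name, value in base.items():
    for factor in (0.8, 1.2):
        varied = {**base, name: value * factor}
        print(f"{name} x{factor}: £{icer(**varied):,.0f} per QALY")
```

A probabilistic sensitivity analysis, in which the inputs are drawn from distributions rather than varied one at a time, follows the same pattern.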
Saul Jacka (University of Warwick) considered the regulation of life annuities and pensions. One seventeenth-century approach to regulation was to lynch the first person who showed the correct actuarial valuation of a life annuity. In contrast, after the major company Equitable Life failed to value risks correctly and collapsed, many reports were written but annuitants were not compensated. The Pensions Regulator (TPR) does not seem to be required to promote good-value pensions, merely to protect the pension benefits offered by employers. In Saul's view, TPR is struggling to reconcile protecting the Pension Protection Fund with minimising adverse impact on employers, with the majority of companies required to divert funds into pensions to the detriment of investment in people and technology. TPR's emphasis on gilts as the right investment makes a mockery of the stated purpose of Quantitative Easing: to depress yields and encourage other investments. He felt it was worrying that TPR promotes, in 'Pensions of the future', the most inefficient schemes, Defined Contribution, and has a graphic depicting pension income which is strongly at variance with ONS data. Experience in the 12 years since the publication of Ann Abraham's 'A decade of regulatory failure' suggests that modern UK actuaries do not need to worry about public dismay or government regulators.
Sam Blundy and
Alex Cunningham (The Pensions Regulator) provided an Overview of Data, Standards & Approaches to Risk-based Regulation of Workplace Pensions. They were not able to give permission to record or report on their talk.
Nicholas Davidson QC (4 New Square) discussed the mismatch between client and professional, with particular reference to financial matters. The majority of people buying insurance or making savings do not understand the issues or potential risks involved and are not able to challenge the professionals. Nicholas commented on the change in culture among financial professionals, from providing disinterested advice to clients to focussing on strategies to find clients who can be used to make money for the professional. He highlighted the growth of machine learning (ML) and artificial intelligence, which pose new ethical and regulatory questions but have great potential for innovation. ML could be used to identify benefits for the consumers of financial products. However, it could also be used to benefit the providers of these products, for example by identifying the customers who generate the most profit. Most financial professionals are very able, and regulators might not have sufficient resources to address the difficulties arising from a culture of greed. Customers do not necessarily have the knowledge that a financial organisation has, or the time to look for better options. The FCA's market study, reported in 2018, found that 6 million insurance policy holders were paying very high premiums; had they been paying an average premium rate, they would have paid £1.6 billion less. The FCA estimated that companies paid £2 billion per year to attract customers who they hoped would then remain profitable for them. Nicholas closed by highlighting that the role of mathematicians remains fundamental for the regulator and for the financial services industry.
Niamh Nic Daeid (University of Dundee) presented the relative success story of regulation in forensic science, as well as the challenges. Evidence provided by new science, DNA analysis, and some serious mistakes provided the motivation for change. Approaches developed to use and communicate DNA tests effectively have informed other areas of forensic science. The cultural transformation that forensic science practice has undergone in recognising the importance of quality standards has resulted in clearer processes, greater accountability, more rigorous science and higher standards of reporting. Although the Forensic Science Regulator has only about three full-time equivalent staff, work by fellows of the Royal Statistical Society over the last 25 years has had a substantial impact on the quality of statistical and evaluative evidence provided to police, lawyers and courts. There remain plenty of opportunities for further improvement.
Data Analysts from the Civil Service,
Jonathan Tecwyn (Department for Education) and
Katherine Byrne (Home Office Analysis and Insight) presented their excellent tools for Assurance and Handling Analytical Uncertainty. These implement recommendations in The Aqua Book: guidance on producing quality analysis for government, published by HM Treasury. The material, developed to support decision-makers throughout all government departments, is publicly available. Despite unreliable internet connections, the participants were guided through the resource, which was illustrated by a number of case studies. Some participants accessed the Toolkit in real time and expressed their appreciation of this valuable resource.
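As a rough illustration of the kind of analytical-uncertainty handling that the Aqua Book encourages (and not a reproduction of the Toolkit itself), the sketch below propagates uncertainty in two hypothetical inputs through a simple cost estimate by Monte Carlo simulation and reports a central estimate with a range. All distributions and figures are invented for the example.

```python
# A minimal sketch, assuming invented inputs: propagate input uncertainty through
# a simple cost estimate by Monte Carlo simulation and report a range, in the
# spirit of the Aqua Book's advice to quantify and communicate uncertainty.

import random
import statistics

random.seed(1)

def annual_cost():
    # Hypothetical uncertain inputs: number of users and cost per user.
    users = random.triangular(90_000, 130_000, 110_000)   # low, high, mode
    cost_per_user = random.normalvariate(250.0, 25.0)     # mean, sd in £
    return users * cost_per_user

draws = sorted(annual_cost() for _ in range(10_000))
central = statistics.median(draws)
low, high = draws[int(0.05 * len(draws))], draws[int(0.95 * len(draws))]
print(f"central estimate: £{central:,.0f}")
print(f"90% range: £{low:,.0f} to £{high:,.0f}")
```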
Andreas Tsanakas (Cass Business School) reported on a qualitative study of Models and Uncertainty in Insurance Regulation and Decision Making. Insurers use simulation-based models of assets and liabilities. These models start with random vectors of risk factors, which are aggregated under a set of assumptions before some performance functional is applied to the output. He highlighted examples of how insurers use their models: in regulation, the link between the decision and the model is rigid, whereas in decision-making the link can be more flexible. He explored stakeholders' practices when dealing with models and uncertainty, and gave examples of negotiations between actuaries and underwriters on which figure to use. Andreas closed by reflecting that insurance risk models are subject to much uncertainty, but that practitioners have found ways of adapting to this.
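A minimal sketch of the model structure described above is given below: a random vector of risk factors is simulated, aggregated into a loss under assumed dependence and exposure assumptions, and a performance functional (here the 99.5% Value-at-Risk) is applied to the output. The choice of two risk factors, their correlation and the exposure figures are all assumptions made purely for illustration, not taken from any insurer's internal model.

```python
# A minimal sketch, assuming a toy two-factor model: simulate a random vector of
# risk factors, aggregate them under simple (assumed) dependence and exposure
# assumptions, then apply a performance functional to the aggregate loss.

import numpy as np

rng = np.random.default_rng(0)
n_sims = 100_000

# Random vector of risk factors: correlated market shock and claims shock.
corr = np.array([[1.0, 0.3],
                 [0.3, 1.0]])
factors = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=n_sims)

# Aggregation under assumptions: map factors to asset and liability movements.
asset_loss = 1_000.0 * 0.1 * factors[:, 0]                      # assets of 1,000 with 10% volatility
liability_loss = 800.0 * np.exp(0.05 * factors[:, 1]) - 800.0   # lognormal claims shock on liabilities of 800
aggregate_loss = asset_loss + liability_loss

# Performance functional: 99.5% Value-at-Risk of the aggregate loss.
var_995 = np.quantile(aggregate_loss, 0.995)
print(f"99.5% VaR of aggregate loss: {var_995:.1f}")
```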
A panel discussion then took place, in which a number of different issues were raised:
- Can risk-based regulation be efficient?
- The issue of uncertainty in relation to flood risk was raised. With climate change, we no longer feel we have a real sense of the risk of extreme events that we face.
- The average citizen should be the ultimate beneficiary of any regulation. How is an "average customer/client" defined?
- Are there any data standards that are regularly implemented in sectors such as insurance and e-commerce?
- Lessons learnt in forensic science were shared; are there other such examples from government?
The meeting closed with a discussion of some possible next steps:
- Develop an overview by pulling together the different approaches to regulation and methods for improving quality of data and analyses in different sectors.
e.g. 1. Compare the reporting requirements of different regulators;
e.g. 2. Compare the Equator Guidelines, CDISC, The Aqua Book and the Technical Actuarial Standards.
- Assess the reputations of various regulators for transparency and for the rationale given for decisions. NICE is a starting point.
- Apply Aqua Book standards to the outputs of regulators.
- NICE2: a National Institute for Criminal Justice Excellence
- Reconsider the more than two decades of regulatory failure in annuities and pensions: a working group to progress implementation of all the reviews and to assess the impact of the Technical Actuarial Standards on practice.
- It was agreed that it would be helpful to have a harmonised approach to public engagement across a number of organisations.