
Keeping numerical order
10th August 2020

One feature of life in lockdown was the daily digest of statistics about the battle with COVID-19. The government briefings presented a raft of data in the form of tables, graphs, pie charts and bar charts. Many of the figures may have been grim but, we were urged, look more deeply and there’s some scope for optimism: for instance, an underlying trend offering hope comparable, at the very least, to Chancellor Norman Lamont’s ‘green shoots of recovery’ during the 1991 UK recession.
One telling point to emerge has been that the hugely important R-number is calculated by means of ten different statistical models. The published range is then apparently based on what those models indicate collectively. That seems right, but the variation between the results of the different models highlights how statistical information can be abused: an unscrupulous user of statistics could choose the model that best supports the point they wish to make.
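The briefings did not set out exactly how the ten models are combined, so the following is purely an illustrative sketch in Python, with made-up model names and intervals. It simply contrasts a ‘collective’ range drawn from all the models with the cherry-picked figure an unscrupulous user might prefer to quote.

```python
# Hypothetical R-number intervals (low, high) from several models.
# The real method behind the published range is more sophisticated;
# this only illustrates pooling versus cherry-picking.
model_estimates = {
    "model_A": (0.7, 0.9),
    "model_B": (0.8, 1.1),
    "model_C": (0.9, 1.2),
    # ... up to ten models in practice
}

# A naive 'collective' range: span the central estimates of every model.
midpoints = [(low + high) / 2 for low, high in model_estimates.values()]
published_range = (min(midpoints), max(midpoints))
print(f"Collective range: {published_range[0]:.2f} to {published_range[1]:.2f}")

# The cherry-picker's version: quote only the most flattering model.
name, (low, high) = min(model_estimates.items(), key=lambda item: item[1][1])
print(f"Cherry-picked: {name} puts R no higher than {high:.1f}")
```

Even in this toy example, the single most optimistic model tells a rosier story than the range taken across all of them.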
During the briefings there were often references to the Office for National Statistics, which succinctly describes its role as ‘collecting, analysing and disseminating statistics about the UK’s economy, society and population’. The ONS collects the raw numbers from a wide range of public and private sector sources, processing them ahead of publication in 600+ releases annually. Close attention is paid to its unemployment count, GDP data and Consumer Prices Index as well as COVID-19 figures.
Regulating official statistics
Some briefings also mentioned the UK Statistics Authority, whose stated statutory objective is ‘promoting and safeguarding the production and publication of official statistics that serve the public good’. UKSA adds that its duties include ‘regulating quality and publicly challenging the misuse of statistics’; these duties are discharged by an offshoot, the Office for Statistics Regulation. UKSA also oversees the Government Statistical Service and its own executive arm – none other than the ONS.
So, there’s a closely interwoven structure to the compilation and interpretation of ‘official statistics’ in the UK, but of course statistical work is conducted by many other bodies, including trade associations. UK Finance compiles analyses based, for example, on its members’ mortgage lending figures; the Association of British Insurers provides data on claims under various types of policy; and the Investment Association publishes regular statistics on how UK savers are choosing to invest.
Statistics and marketing
Relevant statistics can be a useful aid to marketing and, in the case of charities, fundraising. Charities involved in research into a particular illness, or in caring for its sufferers, often quote its death rate to highlight its prevalence. Death and illness rates may also be used by insurers to draw attention to risks that some of the population wrongly believe could never affect them. The percentage of claims paid on each type of insurance cover is also useful information.
Some firms conduct or commission surveys on attitudes and behaviour in specific areas of people’s lives, including their finances. Such surveys may range from how much pocket money the children receive to whether a strategy exists for meeting any future care home fees. Results extrapolated from small or non-representative samples could produce a misleading indication of the national picture, so care is needed to ensure that a sample’s size and basis of selection are adequate.
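To make the point about sample size concrete, here is a minimal sketch assuming a simple random sample and a hypothetical finding that 40% of respondents have a strategy for care home fees. It applies the standard 95% margin-of-error formula for a proportion; the survey figure and sample sizes are invented for the example.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

p = 0.40  # hypothetical: 40% of respondents have a care-fees strategy
for n in (50, 250, 1000, 4000):
    moe = margin_of_error(p, n) * 100
    print(f"n = {n:>4}: 40% +/- {moe:.1f} percentage points")
```

With 50 respondents the headline figure carries a margin of roughly ±14 percentage points; it takes a sample in the low thousands to bring it under ±2. And the formula assumes random selection, so a large but self-selecting sample can still be badly misleading.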
Dispel the ‘damned lies’ slur
Where statistics and survey data are used in advertising or promotion, it’s vital not to mislead; if material does, the Advertising Standards Authority or a relevant regulator such as the Financial Conduct Authority or the Solicitors Regulation Authority may take action. Professional statisticians abhor misleading use of their work of the kind that prompted the historic reference to three types of untruth: ‘lies, damned lies and statistics’. The Office for Statistics Regulation and its fellow bodies are intent on keeping numerical order.