UKHSA statistics production hub

Ensuring good quality assurance

Main messages

This page is based upon a range of sources from across government and beyond. For a reading list, please see the sources section towards the bottom of this page.

What is quality?

The quality of an analytical output may be thought of in simple terms as its “fitness for purpose”. When data is fit for purpose, it is less likely to be misleading and we can make effective decisions using the data. The UK Statistics Authority Code of Practice uses the 5 Dimensions of Quality of the European Statistical System (ESS) Code of Practice as its criteria for assessing fitness for purpose of statistical outputs. These 5 dimensions are:

  1. Relevance
  2. Accuracy and Reliability
  3. Timeliness and Punctuality
  4. Comparability and Coherence
  5. Accessibility and Clarity

All quality assessments of official statistics should be conducted in line with these dimensions, and both our QA review conversation tool and our QA log template set out these dimensions for you to consider.

See inside the expandable sections below for example questions you might consider as part of each dimension.

Quality trade-off

Figure: a pentagon showing the 5 Dimensions of Quality of the European Statistical System (ESS) Code of Practice: Relevance, Accuracy and Reliability, Timeliness and Punctuality, Comparability and Coherence, and Accessibility and Clarity.



The 5 dimensions of statistical output quality are not mutually exclusive. There are relationships between them, and improvements in one dimension can lead to deterioration in another.

Statistics should always be produced to a level of quality that meets users’ needs, and quality assurance should be proportionate to the nature of the quality issues and the importance of the statistics in serving the public good. Understanding user needs is therefore essential when assessing the quality of your data. Perfect data quality may not always be achievable, so focus on making the data as fit for purpose as it can be.

There may need to be trade-offs between different dimensions of data quality, depending on the needs and priorities of your users. You should prioritise the data quality dimensions that align with your user and business needs. For example, if the timeliness of a data set is the most important dimension for the user, this may come at the expense of the data set’s accuracy, and vice versa. It is important to communicate these trade-offs to the users of your data to avoid ambiguity and misuse of the data.

In practice, the nature and extent of the quality assurance activities carried out for each project should depend on what is considered appropriate and proportionate for that piece of analysis.

What do we mean by quality assurance?

Quality assurance (QA) plays an essential part in any analytical project to ensure high quality analysis; it is so much more than just ‘getting the numbers right’. Effective QA ensures that decisions are made with an appropriate understanding of evidence and risks, and helps analysts ensure the integrity of the analytical output. (It therefore forms a crucial part of the UKHSA’s strategic priority to “improve action on public health through data and insight”.)

In high quality analysis, you can transfer knowledge about the data, your calculations are correct, your methods are appropriate, others can test and replicate your analysis, and your data and assumptions are fit for purpose. These can be summarised as 5 key areas: documentation and planning, verification, validation, project reproducibility, and communication and outputs.

The ‘QA mindset’

In high quality analysis, we should know how data got from the initial input through to the final output. We refer to this as the ‘data journey’.

QA should operate throughout the entire data journey and is not something that can be added at the end of production. This means that we should be thinking about QA at each stage of our analytical project, from project governance and planning to producing outputs and communicating findings. We need to be thinking about the data sources and their appropriateness, whether our methods are sound and if the interpretation of results is robust, communicating any caveats that need sharing with users. This is known as having a ‘QA mindset’.

It is important for analysts to be curious about data and not take it at face value. If any values look inaccurate or unusual, they should be investigated and verified. Analysts should understand the full data journey and be able to identify steps that are vulnerable and could introduce errors.

There are 4 key areas of a QA mindset to consider when embarking on an analytical project:

Governance and planning

Good governance and planning is an essential part of quality assurance. This involves understanding who will be involved in the QA process and who will be benefiting from the analysis. Find out who your users are and consider who could use your outputs. Check that the work will meet these user needs appropriately.

Verification and validation

Verification and validation are two important processes aimed at ensuring that outputs will meet requirements and satisfy user needs. Validation asks whether we are doing the right thing: are the data and methods chosen appropriate for the need? Verification asks whether we are doing it right: have those methods been implemented correctly? To ensure quality, we need to be confident of both. It is important to be curious about data and not take it at face value: if any values look inaccurate, they should be investigated and verified.
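
As an illustration only, a verification step like the one sketched below can flag values that fall outside a plausible range before they reach an output. The column names, bounds and figures are hypothetical and would need to be replaced with checks appropriate to your own data.

```python
import pandas as pd

# Hypothetical weekly case counts, including two implausible values.
data = pd.DataFrame({
    "week": ["2024-01", "2024-02", "2024-03", "2024-04"],
    "cases": [120, 134, -5, 12000],
})

def flag_suspect_values(df, lower=0, upper=5000):
    """Return rows whose 'cases' value falls outside the expected range."""
    return df[(df["cases"] < lower) | (df["cases"] > upper)]

suspect_rows = flag_suspect_values(data)
if not suspect_rows.empty:
    # These values should be investigated and verified, not passed through silently.
    print("Values needing investigation:")
    print(suspect_rows)
```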

Reproducibility

Reproducible analysis relies on a transparent production process, so that anyone can follow our steps and understand our results. This transparency eases reuse of methods and results. Easy reproducibility helps our colleagues test and validate what we have done and increases trust in the statistics. Reproducible pipelines are also easier to quality assure than manual processes, leading to higher quality.
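
As a minimal sketch of what this can look like in practice (the file paths, parameter names and structure below are illustrative, not a prescribed UKHSA pipeline), a reproducible analysis script declares its inputs, outputs and settings up front so that a colleague can rerun it and obtain the same result:

```python
import json
import random
from pathlib import Path

# Everything the run depends on is declared in one place (hypothetical values).
CONFIG = {
    "input_file": "data/cases_week_12.csv",
    "output_file": "outputs/summary_week_12.json",
    "random_seed": 2024,  # fixed so any sampling or simulation repeats exactly
}

def run_pipeline(config):
    """Run the analysis end to end from a single declared configuration."""
    random.seed(config["random_seed"])
    # ... read the input, apply the documented methods, build the summary ...
    summary = {"source": config["input_file"], "status": "complete"}
    out_path = Path(config["output_file"])
    out_path.parent.mkdir(parents=True, exist_ok=True)
    out_path.write_text(json.dumps(summary, indent=2))

if __name__ == "__main__":
    run_pipeline(CONFIG)
```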

Communication of outputs

It is our role as analysts to explain how any limitations in our outputs might impact on the decisions that users take. Being clear about these issues is absolutely vital. It protects the integrity of the findings and supports the users of our outputs in drawing the correct conclusions to inform the decisions they make.

These key stages along the data journey, alongside the 5 Dimensions of Quality of the European Statistical System (ESS), form the basis of our QA review conversation tool and QA log template.

Roles and responsibilities

The Aqua Book defines 3 key roles for the QA of analysis: the commissioner, the analyst, and the assurer. These roles are used by many teams to delegate responsibility during QA. In addition, the Aqua Book and the Macpherson Review state that business-critical analysis should have a single senior responsible officer (SRO).

In practice, the exact nature of roles required may differ based on project scope and needs, but these roles provide a useful framework for considering the different ways QA should be built into the life cycle of a project.

See a summary of each of these roles inside the expandable sections below.

How can I adopt a QA mindset?

We have created 2 tools to help teams discuss, plan, record and sign off on quality:

  1. QA review conversation tool
  2. QA log template

What is the QA review conversation tool?

Our QA review conversation tool provides a useful starting point for thinking about the QA of statistical outputs in line with the 5 European Statistical System (ESS) quality dimensions. It is designed to facilitate team discussions about quality, either at the start of a new project or at regular intervals for cyclical releases, helping teams plan QA and identify gaps in current procedures. We have compiled discussion questions that teams can use to reflect on the quality of their analysis, identify areas for improvement, and consider how to communicate quality to users. The outcomes of these discussions should inform the QA plans and QA logs that teams use during QA activities.

What is the QA log template?

To help teams record and sign off on quality, we have also created a QA log template. This approach is standard practice across government statistics and should ensure that a QA mindset is present when conducting analytical projects. A QA log should be used every time you are producing a publication.

A QA log will:

An audit trail can be particularly helpful if something goes wrong: it allows you to go back and review your processes to see how any errors might have slipped through, and whether you need to make any changes to the checks you do next time.
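
One simple way to build such an audit trail (a hypothetical sketch rather than part of the official template) is to record each check, its outcome and a timestamp as the analysis runs:

```python
import csv
from datetime import datetime, timezone

AUDIT_LOG = "qa_audit_log.csv"  # hypothetical file name

def record_check(check_name, passed, notes=""):
    """Append one QA check and its outcome to a simple audit trail."""
    with open(AUDIT_LOG, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            check_name,
            "pass" if passed else "fail",
            notes,
        ])

# Example usage during an analysis run:
record_check("row counts match source extract", True)
record_check("no negative case counts", False, "3 rows flagged for investigation")
```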

Utilising these tools

In both templates, there are sheets relating to the different stages of the QA process as outlined in the ‘QA mindset’ section that relate to the ESS quality dimensions.

In the review conversation tool, each sheet contains space for the analyst to comment on how well requirements are being met and to note any outstanding issues or actions to improve quality, or the way quality is communicated to users. There is also space for the assurer to provide their sign-off confirming that they are satisfied with the QA process at each stage.

In the QA log template, suggested criteria are given for attaining a high level of quality. Each criterion is given one row, with space for an assurer to record whether the criterion has been met, the steps they have taken to assess the project against it, and any outstanding work that may need to be completed. The template is intended to be adapted and customised by teams, depending on their needs, to create a comprehensive overview. While it is not a complete list of QA concerns, it provides a starting point for thinking about the assurance process.
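
By way of illustration only (the criteria, field names and entries below are hypothetical, not the template’s own), each row of a QA log can be thought of as a criterion together with the assessment recorded against it:

```python
from dataclasses import dataclass

@dataclass
class QALogEntry:
    """One row of a QA log: a criterion and the assessment recorded against it."""
    criterion: str
    met: bool
    steps_taken: str
    outstanding_work: str = ""
    assurer: str = ""

qa_log = [
    QALogEntry(
        criterion="Input data checked against the source system",
        met=True,
        steps_taken="Compared row counts and totals with the source extract",
        assurer="A. Analyst",
    ),
    QALogEntry(
        criterion="Methods documented and peer reviewed",
        met=False,
        steps_taken="Draft methodology note written",
        outstanding_work="Peer review to be completed before publication",
        assurer="A. Assurer",
    ),
]

# Anything not yet met is outstanding work before sign-off.
unresolved = [entry.criterion for entry in qa_log if not entry.met]
print("Criteria still to be signed off:", unresolved)
```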

The expandable sections below cover in more detail the 5 main stages of a traditional publication cycle, giving guiding questions to consider and links to further resources, to help you customise your QA logs based on your project and user needs.

Common questions

The expandable sections below are designed to address the most common queries we receive about quality assurance. Please get in touch with us if there’s anything missing that you would like to see.

Sources

  1. Department for Education: Quality Assurance Framework
  2. Government Analysis Function: Government Statistical Service (GSS) Quality Strategy
  3. Government Analysis Function: Quality assurance of code for analysis and research [“The Duck Book”]
  4. Government Analysis Function: Quality statistics in government
  5. Government Analysis Function: Communicating quality, uncertainty and change
  6. Office for Statistics Regulation: Regulatory guidance - thinking about quality when producing statistics
  7. The Aqua Book: guidance on producing quality analysis for government
  8. UK Statistics Authority: Code of Practice for Statistics