Guide to evaluating interventions (differential attainment)

Planning an evaluation

We asked Work Psychology Group to develop practical resources to help you plan your evaluation. These are designed for anyone running, or planning to run, an intervention who wants to understand whether it has had, or will have, a meaningful impact. An evaluation can also highlight unintended consequences.

It’s important to consider how these resources can be used most appropriately in the context of your own organisation, region or intervention.

How to approach an evaluation

When planning an evaluation, you need to consider the approach that best suits your needs, how to measure success, and which tools will be most effective in collecting data on your chosen measures. The approach could be:

  • a summative approach – focusing on assessing whether the intervention was successful at addressing the issues it was intended to address
  • a formative approach – assessing different aspects of the intervention while it’s underway and making adjustments during its implementation

Defining success

There are challenges when evaluating interventions aimed at improving educational outcomes. There is often a long delay before the effect, such as an exam outcome, can be observed, and this delay makes it more difficult to know whether the intervention itself directly contributed to the outcome.

Potential effects can be measured at different depths and timescales. For example, a coaching session to develop consultation skills might be measured by:

  • individuals’ reaction, such as positive feedback following the coaching
  • individuals’ learning, which could be measured through an assessed role play at the end of the coaching session
  • individuals’ change in behaviour, which may be measured by observing a real consultation a few months after the initial coaching
  • the overall outcome for the individual or the organisation as a result of the individual’s change. This may be measured by a sustained improvement in exam outcomes that test consultation skills, or a reduction in training time extensions for the programme

You can use a range of techniques to build a qualitative assessment of the impact of an intervention and its likelihood of achieving its ultimate aims. These include case studies and before-and-after studies. This approach will be especially useful if it may take a number of years before a full evaluation can be made. More information is available on approaches to evaluation design.
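
To make the before-and-after idea concrete, below is a minimal sketch of a paired comparison in Python. The scores, sample size and choice of test (a Wilcoxon signed-rank test) are illustrative assumptions, not part of the guidance.

```python
# A minimal sketch of a before-and-after comparison, using made-up
# scores for the same eight doctors before and after an intervention.
from scipy import stats

before = [52, 61, 58, 49, 66, 57, 60, 55]
after = [58, 63, 62, 54, 70, 59, 66, 58]

# A paired non-parametric test avoids assuming the scores are normally
# distributed, which matters with small samples
result = stats.wilcoxon(after, before)
print(f"statistic={result.statistic}, p-value={result.pvalue:.3f}")

# With samples this small, treat the result as indicative rather than
# definitive, and report the raw scores and counts alongside it
```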

If you’d like further advice to help you decide on your approach to evaluation, please see Work Psychology Group’s research.

Working with small sample sizes

Small sample sizes may be challenging when carrying out your evaluation. Interventions aimed at a minority group may sometimes, by their very nature, have an impact on a small number of doctors. Although smaller sample sizes can mean that your conclusions are indicative rather than definitive, these results will still be valuable as you’re growing your evidence base. Work Psychology Group’s research identified some tips for working with a small sample size:

  • Consider how to increase the amount of data available by looking at information available through other organisations. For example, we publish a range of data on the progression of doctors in training across the UK, and we can also provide you with specific anonymised or aggregated data if you contact Quality@gmc-uk.org. Research databases such as UKMED can allow your data to be linked to other datasets through a secure portal
  • Consider whether you could aggregate information over time, across several cohorts, across institutions or by grouping smaller groups into broader categories, to increase your sample size (see the sketch after this list). It’s important to remember that there are challenges associated with combining different individuals’ experiences, and this may mask differences that exist within the aggregated groups.
  • The design of your evaluation may benefit from involving some element of qualitative data and/or measures that will still show a meaningful difference, even with a small sample
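
As a minimal sketch of the aggregation tip above, the Python example below combines three hypothetical cohorts and reports group sizes alongside averages. The column names, group labels and values are made up for illustration.

```python
# A minimal sketch of aggregating small cohorts, using made-up data.
import pandas as pd

# Three small annual cohorts, each too small to analyse on its own
cohorts = [
    pd.DataFrame({"group": ["a", "b", "c"], "exam_score": [61, 58, 66]}),
    pd.DataFrame({"group": ["a", "a", "b"], "exam_score": [57, 63, 55]}),
    pd.DataFrame({"group": ["b", "c", "c"], "exam_score": [60, 52, 59]}),
]
data = pd.concat(cohorts, ignore_index=True)

# Collapse small groups into broader categories to increase cell sizes;
# note this can mask differences within the aggregated groups
data["broad_group"] = data["group"].map(
    {"a": "combined", "b": "combined", "c": "other"}
)

# Report counts alongside averages so readers can judge how indicative
# the results are
print(data.groupby("broad_group")["exam_score"].agg(["count", "mean"]))
```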

Which measures and tools should you use?

This section will help you decide on the most appropriate way to measure the impact of your intervention and which tools to use to collect the information.

Some important points to consider are:

  • Multiple factors: Differential attainment has a number of potential influencing factors, including the trainee, the trainer, the institution and the wider environment. Depending on what your intervention is designed to target, consider using multiple measures in your evaluation rather than focusing solely on one measure.
  • Availability: You may be able to rely on readily available measures for some of the data informing your evaluation, while other data may need to be collected by you – this could be tailored data you collect specifically for the evaluation, or data that you already collect as part of an existing process.
  • Survey data: Surveys can be a good mechanism to collect data to evaluate an intervention. However, there are a number of considerations when interpreting this data. Sometimes results are based on a small number of people, the data is self-reported and may contain biases, completion rates may be low and patient responses may reflect aspects other than the doctors’ performance. The sketch after this list shows some simple checks you can run before interpreting survey results.
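
As an illustration of these caveats, below is a minimal Python sketch that checks a survey’s response rate and flags subgroups too small to report on reliably. The data, column names and the threshold of 10 are illustrative assumptions.

```python
# A minimal sketch of simple sanity checks on survey data, using
# made-up responses.
import pandas as pd

responses = pd.DataFrame({
    "site": ["A", "A", "B", "B", "B", "C"],
    "rating": [4, 5, 3, 4, 2, 5],
})
invited = 40  # illustrative number of people the survey was sent to

# A low completion rate limits what the results can tell you
print(f"Response rate: {len(responses) / invited:.0%}")

# Flag subgroups too small to report on reliably (threshold is arbitrary)
counts = responses.groupby("site")["rating"].agg(["count", "mean"])
print("Small subgroups:\n", counts[counts["count"] < 10])
```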

In their research, Work Psychology Group identified and categorised a wide range of measures relating to differential attainment and experience for you to consider when assessing your intervention. They evaluated measures against criteria such as usefulness and appropriateness, as well as how easy they are to access. In this table, you can view all 69 measures to help you think about their suitability for your particular needs.


Checklists

Our evaluation checklist takes you through a number of important areas to consider when you’re planning your evaluation.

Data collection

Implementing an evaluation generally consists of two stages: data collection and data analysis. This section focuses on data collection. Data analysis will be bespoke to your evaluation design, the measures used and the overarching purpose of the evaluation, so it is not covered here.

If you have planned to collect new data to evaluate your intervention, you may need to do this before, during or after the intervention is implemented, depending on the evaluation design you have chosen to follow.

The methods you could use include:

  • surveys
  • interviews and focus groups
  • bespoke simulations or assessments, such as those using actors in clinical settings

Work Psychology Group’s research identified some key considerations to help you decide which method of data collection is most suitable for your needs.

Key considerations on deciding which method of data collection to use

Surveys/questionnaires

  • Quick and easy to administer
  • Anonymity is possible
  • Data analysis may require expertise
  • Potential for low response rates or delayed responses
  • Potential for dishonest reporting
  • The variable ability of individuals to accurately assess their own knowledge and behaviour

Interviews/focus groups

  • A trained interviewer should ideally be used
  • More in-depth than surveys
  • Opportunity for clarification
  • Opportunity for shared dialogue (focus groups)
  • More personal approach
  • Individuals may feel uncomfortable being interviewed face to face, including when being asked personal questions
  • Thematic analysis can be resource intensive
  • The voluntary nature and small sample sizes could introduce biases

Simulation/assessment

  • Close relationship to performance/behaviour
  • Reliable and objective
  • Resource intensive to develop and administer

Collecting data is important, but it’s not without its challenges. Here are some examples of how educators have addressed some of these issues.

Further good practice guidance on collecting new data, including tips for interviewers, and examples of questionnaires can be found in our Useful Materials section.