Guide to evaluating interventions (differential attainment)


What is an intervention?

By interventions, we mean any changes to the working or learning experience aimed at improving the outcomes for medical students and trainees. Research identifies five main types of interventions:

  • Training or support for trainers, such as having the time to develop strong relationships with trainees or techniques for tackling difficult conversations.
  • Training or support for trainees, such as familiarisation with exam structures to reduce anxiety.
  • Engaging medical education leadership, such as raising awareness amongst educational leaders of the challenges for specific groups.
  • Transparency around the issues and causes of differential attainment, and engagement with stakeholders. For example, making outcome data publicly available to show differences in attainment across groups.
  • Designing processes such as recruitment and assessment to minimise bias. For example, removing names from interview forms.

Interventions can be targeted at different levels:

  • System or policy level, such as the random allocation of training posts.
  • Institutional or regional level, such as structured induction programmes for overseas trainees and their trainers.
  • Individual level, such as one-to-one coaching or mentoring.

Considerations when designing an intervention

The Work Psychology Group has identified some helpful tips for developing an intervention in response to observed differences in educational experiences or outcomes:

  • Consider what your data means within your local context. The issues giving rise to the variation may differ across the UK, so how you respond may be different.
  • Clearly define and document the intended purpose of the intervention. For example, is it a targeted intervention to support a specific group, or is it a broader intervention that provides everyone with the same opportunity to access it?
  • Define the intended outcomes of the intervention - what does success look like? Having a realistic outcome in mind can help to design a good quality evaluation and also help to make sure the intervention remains focused on the core purpose.
  • Consider who the intervention is aimed at. Targeted attendance lists, or invitations to these types of interventions, should be carefully considered and communicated to make sure individuals do not feel singled out, stigmatised or perceived in a negative way.
  • Consider any other interventions already in place; interventions work best when they’re designed to complement one another rather than operating in isolation. For example, an intervention targeted at trainees who require additional support could be paired with trainer-focused interventions. This can help with the challenge of evaluating a multi-factorial issue like differential attainment.

It’s worth remembering that variation in attainment is complex and happens across a number of protected characteristics, including age, gender and ethnicity. It is likely that things we cannot easily measure, such as self-belief and social capital, are also contributing to the differences.

There is no single agreed cause of these variations, and so you are unlikely to identify one single factor or specific area that should be changed to resolve the problem. It is for this reason that it will be useful to evaluate any interventions you introduce.


The benefits of evaluating interventions

Evaluating your interventions will help you continually improve their impact. We asked the Work Psychology Group to develop practical resources to help you plan your evaluation. By following these, you can:

  • assess the stronger or weaker elements of interventions and work towards improving them
  • identify potential unintended consequences (both positive and negative) of the intervention and correct for them if necessary
  • increase the evidence available to other organisations on the effectiveness of interventions, to help them improve the fairness of training

You should try to plan your evaluation alongside the design of your intervention. If this isn’t possible, an evaluation can still be carried out effectively, and the principles of good evaluation design still apply.