We check and monitor the education and training that doctors receive by carrying out reviews on particular aspects of training. These include:
- small specialty reviews
- risk-based spot checks
- specific issues such as bullying.
We gather feedback and consider national concerns raised in the media, by government or through other channels to help us identify important themes to report on.
Carrying out specialty reviews
Small specialties have fewer than 250 doctors in training, so the results of our annual national training survey are statistically less reliable for them. Specialty reviews give us more insight into the quality of training in these areas.
After analysing our evidence and collecting information from the relevant royal college or faculty and postgraduate training providers, we carry out targeted site visits.
At the end of a review we publish a report on the delivery of training. It identifies trends, examples of good practice, and areas for improvement nationally.
Risk-based spot checks
Risk-based spot checks are one way that we monitor the quality of medical education and training delivered by medical schools, postgraduate deaneries, local education providers or specialties.
These checks are outside the normal cycle of regional visits we carry out. They are either targeted short visits where we have identified a risk or a gap in our evidence base, or random short visits to test the accuracy of the evidence we hold.
We carry out checks to:
- explore issues we’re aware of, taking into account the standards and outcomes we set
- monitor a specific area of interest when a regional visit is not imminent
- test the accuracy of the evidence that we receive from medical schools, deaneries and local education and training boards (LETBs), and the results of our national training surveys
- monitor progress against actions from previous visits.
Checks we have conducted
Undergraduate assessment review
In 2013 we began a review of undergraduate assessment, which aimed to identify good assessment practice and check that each medical school's overall assessment system met our standards. We also wanted an overview of undergraduate assessment practice across all medical schools.
The information reviewed included:
- schools' assessment strategies
- delivery standards
- quality management
- governance processes and outcomes.
Undermining checks
Between September and December 2014 we visited 12 local education providers in England, Scotland and Northern Ireland to investigate concerns about the undermining of doctors in training.
The sites were chosen after detailed exploration of our evidence which included:
- bi-annual dean’s reports
- data from the 2013 and 2014 national training surveys
- evidence from the Joint Committee on Surgical Training and Royal College of Obstetricians and Gynaecologists
- local intelligence from the deaneries and LETBs.
Emergency medicine checks
Between December 2012 and February 2013 we carried out checks at seven local education providers in England and Jersey to review the delivery of training in emergency medicine.
Our checks were prompted by the increasing number of concerns reported to us about the education and training in emergency medicine, particularly about very junior doctors in training working unsupervised at night.
As part of the review, we wrote individual site reports which detail local findings and good practice at the local education providers we visited. They also explain the requirements and recommendations we set for each site.
Clinical academic training
In 2013 and 2014 we reviewed clinical academic training.
The review looked at doctors in training who are undertaking both academic and clinical training, either through:
- integrated pathways, which incorporate both clinical and academic training, or
- out-of-programme research, which doctors in training carry out while not also undertaking clinical training.
We are not responsible for regulating academic training, but we do regulate clinical training, so we have an interest where the two overlap.