Examples of how your answers improve training

The national training surveys help us check that doctors in training receive high quality training in a safe and effective clinical environment, and that trainers are well supported in their roles. They are powerful quality assurance tools, allowing us to recognise good practice and identify areas for improvement.

Here you can read some examples of how the surveys have led to improvements in local training environments.

Providing evidence to support new initiatives

Association of Anaesthetists fatigue working group - more detail available in the Training environments 2018 report

Following the tragic death of an anaesthetic trainee driving home after working night shifts, the Association of Anaesthetists established a working group to help improve the culture around fatigue. In 2016, they surveyed anaesthetic trainees to investigate how fatigue and shift work were affecting them. The findings highlighted a serious problem among UK trainees.

To address the issues raised, a fatigue working group was established, bringing together the Association of Anaesthetists, the Royal College of Anaesthetists, the Faculty of Intensive Care Medicine and experts in the field. The group developed standards for rest facilities and culture. They produced educational resources on fatigue, sleep, and managing shift work, and ran a #fightfatigue campaign. Anecdotal feedback suggested some organisations were adopting their recommendations but for others, change was slower.

In 2018, all anaesthetic trainees were asked four new questions about fatigue and shift work as part of the specialty-specific questions in the national training survey. The results highlighted areas for concern, including that one in three trainees regularly felt too tired to get home safely. The results also suggested that teaching on fatigue and sleep was helpful, but that it was not available to most trainees.

The fatigue working group see the inclusion of questions about fatigue in the national training survey as crucial: ongoing data collection and feedback are needed to support change by individuals and departments, and the questions signal how seriously national organisations take the issue and its potential impact on training. The group hope to influence behaviour by changing conversations at local, regional and national levels, and see the survey questions as an important step in allowing those conversations to begin.

Using trainer results to understand issues and good practice

Deanery quality management and improvement in Scotland - from the 2017 key findings report.

In Scotland, deanery quality management and improvement is managed by eight specialty Quality Management Groups (sQMGs).

Each sQMG is accountable for the quality management of training in all posts and programmes within its scope across Scotland.

At the start of each annual quality cycle, typically in August or September, each sQMG convenes Quality Review Panels (QRPs) with college, trainee and lay members to review all of the new data, information and intelligence available for all posts and programmes. Each QRP reviews data from a number of sources, including: national training survey results from trainees and trainers; patient safety and bullying and undermining comments; Deans’ reports; enhanced monitoring details; Scottish Training Survey data; and requirements from recent deanery visits.

Reviewing all of this information allows the QRPs to identify how training environments are performing. Supported by a ‘decision aid’, the panels decide which posts or programmes need 'triggered visits' to gather more information.

The trainers' survey data is integral to the data, information and intelligence reviewed and discussed for each post at the QRPs. When the trainers' survey shows several red or pink flags, despite other indications that the training environment is satisfactory, a Director of Medical Education enquiry is triggered and the response is considered by the sQMG. NHS Education for Scotland has also investigated sites with green flags in the trainers' survey to understand what good practice may be taking place.

Valuing education through an education contract

Health Education and Improvement Wales Education Contract - from the 2017 key findings report.

Health Education and Improvement Wales (HEIW) has recently introduced an Education Contract between the doctor in training, the local education provider (LEP) and itself. It documents specialty-specific expectations, including the sessions trainees must attend and key education and training opportunities.

HEIW mapped these criteria and metrics against our approved curricula and royal college training standards. Our Promoting excellence standards form the basis against which we and HEIW will monitor the delivery of training across Wales, and a number of those standards and requirements are detailed within the education contract.

Accountability for meeting the responsibilities set out in the contracts lies with the trainee, the LEP and HEIW. To make sure issues are identified quickly, HEIW reviews progress against the contract at agreed intervals through existing processes, such as the trainees' Annual Review of Competence Progression, trainee end-of-placement evaluation feedback, and real-time monitoring systems and self-reporting for LEPs.

The national training survey results are used in three ways to support this accountability. Firstly, results from the existing processes highlighted above are triangulated with the national training survey results to identify trends and whether there is alignment over particular concerns. Secondly, the survey results inform HEIW's decisions on whether any sites need to be added to the risk register. Thirdly, the results are used to measure the effectiveness of interventions put in place to address issues.

By signing the contract, all parties demonstrate their commitment to developing a culture across NHS Wales that supports learning, education and training.

Working together to improve training and patient safety

North Middlesex University Hospital NHS Trust

In March 2016, we carried out an inspection of the emergency medicine department at North Middlesex University Hospital with Health Education England (HEE). Previous national training surveys had shown very poor results for doctors in the second year of the Foundation Programme, and for general practice and specialty doctors in training, in this department.

We worked closely with the trust, HEE, NHS England, NHS Improvement and the Care Quality Commission to monitor these concerns and to put measures in place to help improve the standard of training and patient safety in the department. Follow-up visits in June and September 2016 revealed some improvements in the level of support and supervision that doctors in training were receiving, and we saw echoes of this in the national training survey results for the department in 2017.

However, when some of the measures were withdrawn in 2017, the situation at North Middlesex began to deteriorate again. We set additional, more prescriptive conditions, with specific requirements around how supervision of doctors in training should be structured. We did this to make sure the trust was clear about our expectations, and to help them plan and implement a structure that met our standards.

A series of visits throughout late 2017 and early 2018 confirmed that the conditions had been enacted, resulting in a safer and more supportive environment. We continue to work closely with all the organisations involved to keep the situation under close review to make sure that trainees are being properly supported.

Promoting new ways of working

Musgrave Park Hospital, Belfast Trust

In the 2015 national training surveys, handover in the rheumatology department at Musgrave Park Hospital showed negative (red) outliers. Doctors in training were located across multiple sites, making face-to-face handover difficult.

The local education provider introduced a number of measures, including a working group with doctors in training to review handover arrangements. The hospital made changes including a formal daily virtual handover via a shared database, which each doctor updates every day with patient and outcome data. Each week, a report of the outcome data is produced for consultants to review.

As a result of these changes, handover became a positive (green) outlier in the 2016 national training survey results.