Practical changes

Changes to GMC Connect 

Organisations will complete the self-assessment questionnaire within GMC Connect. The declaration form will be sent to each organisation to sign and upload there. Organisations will also have access to organisation-specific dashboards within GMC Connect.

Dashboards and reporting

For PTOs, we have already launched our new education data reporting tool. This includes a summary dashboard that shows NTS priorities, NTS results, trainer survey results, and enhanced monitoring summaries side by side for the first time.

As well as the summary dashboard, we have brought progression report data and enhanced monitoring data into the same reporting tool to create a single site for all our education QA data.

Over the next few years, we will continue to develop this reporting tool, adding new information and new ways of viewing it. We welcome feedback and requests, so please do get in touch. Our annual QA summary for each organisation will be published in the dashboard, but will also be available in letter form if needed.

Improvements to monitoring and enhanced monitoring

From November, medical schools will respond to existing requirements and recommendations raised through the old regional and national reviews, using our online monitoring system in GMC Connect. They will also use this system to raise any in-year concerns about undergraduate education in their area.

For PTOs, monitoring arrangements will not change significantly. Enhanced monitoring will continue in its current form to deal with local education provider (LEP) or programme-specific issues. We will continue to work closely with PTOs to ensure that training programmes are delivered according to our standards and that any risks are appropriately managed.

Routine monitoring through the online dean's report (DR) will also continue; this is how we track the higher-risk issues within training programmes and prepare for potential regulatory involvement through enhanced monitoring. We've been working closely with your quality management teams to improve the process. We have now mapped our reporting thresholds directly to local risk thresholds in the four countries, so that it is clear what should and should not be included in the DR, and how issues are opened and closed. We have also made some technical improvements to make the system easier to use. We welcome feedback on these improvements.

Medical school annual return (MSAR)

We no longer require medical schools to complete the MSAR. In 2019, we will instead ask them to respond to their existing requirements and recommendations using the new online monitoring system (see above). We will also ask for some information about how they are implementing the revised Outcomes for graduates and the practical skills and procedures list.

Our discussions with medical schools about clinical placement data are ongoing, and we have not yet reached a consensus on the format and content of this collection. It is important for several reasons that we know where schools place their students. First, we need to understand the demand that clinical placements put on service providers from a safety perspective, particularly in relation to applications for new medical schools. Second, we need to be able to check that students receive appropriate support if they are placed in locations where postgraduate training is poor. As an interim measure until these discussions have concluded, we may therefore ask schools to tell us which locations they use for clinical placements. We will aim to keep this list as simple as possible.

The Medical Licensing Assessment (MLA)

Over the next few years, the MLA team will continue to work with medical schools to discuss how they can meet the Clinical and professional skills assessment (CPSA) requirements. When the MLA is launched, the CPSA requirements will be monitored as part of the QA process, and schools will be asked to report on them in their self-assessment.

Using data

We are developing a series of dashboards to help organisations self-assess against our standards. The dashboards will be displayed in GMC Connect so that each organisation can see only the information that is relevant to it. They will be available all year round and refreshed when new data is available.

If the data highlights something we're concerned about, we will include a question about it in that organisation's self-assessment questionnaire.

Meeting trainers, doctors in training and students

We are committed to continuing to meet face-to-face with trainers, doctors in training and students from your organisation, or those you work with. However, we feel that in most cases we can do this outside the context of a formal visit.

We are still developing our plans in this area, but it is likely that we will want to ensure each of these groups has an opportunity to meet us at least once in each four-year cycle, for each organisation we are quality assuring.

Health Education England, regions and local offices

We will continue to quality assure postgraduate training in England as it is organised around the programmes we have approved, with a postgraduate dean remaining accountable for those programmes. This is also a practical decision, as it allows us to spread our resources more efficiently.

HEE now operates quality teams both England-wide and across its four regions (North, Midlands, South and London), with a seven-region structure coming into place. To avoid unnecessary duplication, we will adapt our self-assessments so that if we've already heard about a centralised process from one HEE local office, we won't ask another office to report on it.

We will also check some of these centralised processes. If we have checked a centralised process and are assured that it works in the same way across several offices, we will record and report it accordingly.

We remain flexible and will be able to adapt to any structural changes in the future where appropriate.