RCGP statement on the use of artificial intelligence (AI) in postgraduate training, examinations and registration

The RCGP training, examinations and registration team has been asked by several of our stakeholders about its position on the use of AI in exams and training, and particularly our view on its use by GPs in training in the completion of learning logs and other elements of the Trainee Portfolio.

This is obviously a rapidly developing area, where a consensus view on the use of AI is still emerging. Nevertheless, many organisations have already given considerable thought to the potential issues and published useful guidance. One excellent example is that produced by the Russell Group (Russell Group principles on the use of generative AI tools in education (PDF file, 126 KB)), which provides a starting point for all organisations engaged in higher-level or professional qualifications.

In the specific context of GP training, there are a number of further points we would stress, both in terms of the use of AI in training and registration, and the potential use within our own test development.

The use of AI by GPs in training

On the key central issue of the use of AI by GPs in training, given the proliferation of AI tools, the time pressures on the wider assessment community, and particularly the future expansion of this technology, the RCGP does not believe it is possible to mandate that GPs in training should never use AI when completing their learning logs or other elements of their trainee portfolio. On the contrary, the effective, critical use of AI technology will be a skill all GPs need.

A possible approach to regulating the use of AI easily and remotely would be to require GPs in training to indicate that they had not used AI and to 'sign' a statement to that effect. However, given the increasing use of IT tools, such a statement would put GPs in training at risk of accidental probity issues: they might sign it in the mistaken belief that they had not used AI when in fact they had used common software, such as a search engine, auto-fill or spellcheck, that makes use of this technology.

We believe the key issue is reflective learning. A significant element of training in health education is teaching and encouraging appropriate reflection. Given the importance of reflective learning in GP training and practice, it is important that the GP in training's reflective learning is their own, and that reflections on interactions with patients are based on real patients (just as MSFs and PSQs must be based on the feedback of real colleagues and real patients). AI tools can clearly help with the drafting process, but using AI to create 'artificial patient encounters', or taking a purely mechanistic, cut-and-paste approach to producing learning logs, risks raising questions of probity. The same questions would arise in other areas of training, examinations and registration where reflections are a key component of what is being assessed or reviewed.

As well as promoting reflection, learning logs are used to demonstrate the trainee's coverage of the capabilities across the breadth of the GP curriculum by way of linking to the Clinical experience groups. It is therefore essential that they are based on real cases, both to ensure that the training has been sufficiently extensive and to avoid probity issues.

To ensure that the GP in training has used real patients and real cases, and has actively engaged with them, Educational Supervisors and ARCP panels should explore individual Clinical Case Review (CCR) learning log entries with the GP in training, particularly when they have concerns about the authenticity of the underlying case or the quality of the learning that has resulted. GPs in training should be prepared to have some of their learning logs interrogated by their Educators (including ES/CS) and the ARCP panel. Likewise, applicants in other areas and to other processes within training, examinations and registration should also expect to have their work interrogated if there are concerns about authenticity.

The importance of reflective learning is clearly an area that needs to be reiterated and one on which the RCGP will look to improve our own guidance.

It should be stressed that while the effective use of appropriate medical AI products is a developing skillset, GPs in training should not currently be using generic, commercially available AI products such as ChatGPT to generate diagnoses, interpret clinical information or advise on clinical management. This is potentially dangerous, with possible significant impacts on patient care and outcomes.

The use of AI in test development

The RCGP believes there may be benefits to our examiners, staff and GPs in training in using AI within our development processes. We will have to explore this issue further, but we will be mindful of the need to ensure:

  1. Security - development should clearly take place in a controlled and closed environment, to prevent both the leaking of assessments and the creation of harmful and incorrect shadow question banks.
  2. Experienced Clinical Judgement - human interaction within the assessment process, by trained, standardised and qualified GPs, is clearly a core and vital part of the medical assessment process. The important role that experienced GPs play as assessors needs to be safeguarded, not least to ensure our patients retain confidence in our exams, and in the doctors who qualify by passing them.

The future use of AI - Areas to consider

This is clearly a rapidly developing area and one the RCGP will continue to keep under review. We will particularly need to consider and review:

  1. The potential impact on Differential Attainment. This could arise from the conscious and unconscious biases of the original AI developers, from the quality and scope of the data used to train the AI tools, and from the fact that the time-saving benefits of AI may depend on the user's primary language. GPs in training whose first language is not English might be slower to identify poor AI-generated content, and as a result may be unable to access the time-saving benefits available to GPs in training whose first language is English. However, we also recognise the potential benefits of using AI in a trainee's first language and as a translation tool.
  2. The implications for the curriculum and the skills needed to be a GP, in particular the need to ensure future GPs can evaluate new and developing AI tools to confirm that they are fit for purpose, offer value for money and do not compromise patient safety.