
Professional
Volume 50, Issue 10, October 2021

Random case analysis in general practice clinical team meetings

Gerard Ingham, Mary Beth MacIsaac, Rebecca Kippen
doi: 10.31128/AJGP-02-21-5850

Background

The Medical Board of Australia intends to mandate that at least 25% of continuing professional development (CPD) is focused on performance review.

Objective

The aim of this article is to describe how random case analysis (RCA) can be used for performance review in general practice clinical team meetings, and outline its benefits and challenges.

Discussion
RCA is a powerful learning method for CPD. Involving peers in case review allows practice quality improvement and safety issues to be explored. Planning is required to overcome logistic and legal barriers and to ensure accreditation of the activity by The Royal Australian College of General Practitioners. Vital to the success of RCA is a supportive educational environment and the provision of learner-centred and specific feedback.
 

‘Good medical practice requires doctors to reflect regularly on their practice and its effectiveness’.1 The Medical Board of Australia has signalled its intent to soon require at least 25% of continuing professional development (CPD) undertaken annually to be focused on reviewing performance.2 This will bring Australia in line with comparable countries,3–5 with some additionally requiring review of performance with a peer to maintain medical registration. In keeping with the Medical Board of Australia’s published intentions, random case analysis (RCA), a performance review activity that can be undertaken with peers, was added in 2020 to The Royal Australian College of General Practitioners’ (RACGP’s) list of self-directed accredited CPD activities.6

The aim of this article is to describe how RCA can be used in practice clinical team meetings and to outline its benefits and challenges.

Individual performance review and continuing professional development

RCA is a specific method of case note review that has been used extensively in the education of general practice registrars.7 RCA can review the full scope of general practice and all RACGP curriculum domains of practice,8 but it is best suited to review clinical reasoning and medical record keeping.

RCA in clinical team meetings involves a general practitioner (GP) having records of their recent clinical encounters reviewed with a group of peers. A core feature of RCA is that the selection of each clinical record is random and not directed by the GP having their records reviewed. This selection method is one reason why RCA is useful in identifying blind spots or ‘unknown unknowns’ in a clinician’s knowledge and skills, making it a powerful educational tool. Following selection of the record, the group facilitator leads a discussion with the GP to understand what happened during the consultation, so the full case is being reviewed and not just the clinical record. Next, analysis of the case is undertaken by asking why diagnostic, investigative or management decisions were made. The ‘what’ and ‘why’ questions distinguish RCA from a ‘chart audit’, which is only a review of the medical record.
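As a purely illustrative aside, the random, non-directed nature of record selection can be mimicked in a few lines of code. The sketch below is not part of the published RCA method; it assumes a hypothetical de-identified list of recent consultation identifiers held by the facilitator and simply shows one way a record could be drawn at random from a recent window so that the choice is not steered by the GP whose records are being reviewed.

import random
from datetime import date, timedelta

# Hypothetical sketch only: the RCA method does not prescribe any software.
# Assume the facilitator holds a de-identified list of the GP's recent
# consultation identifiers exported from the practice clinical system.
recent_consultations = [
    {"record_id": "C1041", "date": date.today() - timedelta(days=2)},
    {"record_id": "C1037", "date": date.today() - timedelta(days=5)},
    {"record_id": "C1029", "date": date.today() - timedelta(days=9)},
    {"record_id": "C0988", "date": date.today() - timedelta(days=21)},
]

# Keep only recent encounters (eg the past fortnight) and draw one at random,
# so the selection is not directed by the GP having their records reviewed.
recent_window = [c for c in recent_consultations
                 if (date.today() - c["date"]).days <= 14]
selected = random.choice(recent_window)
print(f"Record selected for review: {selected['record_id']}")

Any such selection tool used in practice would, of course, need to respect the privacy and confidentiality considerations discussed under ‘Legal’ below.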

The GP can request feedback about the case from their peers. Peer review positively contributes to motivation to change and affords ideas for change that may not be generated in individual review.9 Finally, consideration is given to whether the clinical notes were adequate to enable other clinicians to continue care, and any safety concerns identified by the review are managed.

For the activity to fulfil the RACGP’s requirements for self-accredited CPD, it must include at least four hours of attendance for each of the 4–12 participants as well as a planning meeting and a review meeting. The planning meeting can be held at the start of the first meeting and the review meeting at the conclusion of the final meeting. The timetable and content of meetings are outlined in Table 1, with further detail available in the Random case analysis in practice clinical meetings manual.10

Table 1. Timetable and content of meetings for continuing professional development (CPD) accreditation of random case analysis meetings
Type of meeting | Content of meeting
Planning (initial meeting)
  • Decide if reviewing all records or if focusing on a particular problem (eg opioid prescribing, wound management, women’s health)
  • Appoint initial presenter of records (usually senior clinician), facilitator, and scribe
  • Set time and dates to maximise inclusivity
  • Arrange at least four hours of meetings
  • Determine if face-to-face or videoconference
Random case analysis
  • Set ground rules – confidentiality, respect, quality improvement, how feedback will be provided
  • The facilitator selects a recent record for review
  • For each record:
    • Clarify: what happened?
    • Explore: why were decisions made?
    • Provide feedback: learner-centred, balanced, safe, specific
    • Identify safety and quality improvement actions
  • Further records are selected as time permits
  • Record meeting outcomes
Review (final meeting)
  • Was there agreement about the decisions made in the cases reviewed? Are any differences a matter of clinical opinion or are there guidelines or other resources that might resolve them?
  • Were the reviewed medical records sufficient to enable other practitioners to continue care? How might they be improved?
  • Did this activity motivate a change in practice systems and safety?
  • What monitoring of change is required?
  • How might the random case analysis activity be improved next time?

Review of practice systems and quality improvement

Reviewing cases with a group of peers allows RCA to look beyond the review of individual performance and consider the performance of the group. The clinician’s peers can reflect on whether they have a consistent diagnostic and management approach and, if not, whether there is an evidence or guidelines basis for one practice or another. In Europe, peer review meetings, called ‘quality circles’, have been noted to progress from CPD to quality improvement.11 After individual performance reviews, debate naturally follows about how practice systems can be altered to improve the quality of clinical care.

Quality circle meetings are a peer-driven approach to quality improvement that rely on a climate of trust among equals and permit the participants to determine the focus.12 The alternative to quality circles is top-down, outcomes-driven incentive schemes. While a measurement approach can improve patient outcomes, some of this improvement may simply reflect better measurement and data recording.13 Furthermore, there are circumstances in which these schemes paradoxically reduce the quality of care.14,15 Not all markers of quality care are measurable, and GP ‘soft skills’ that should be recognised and encouraged may instead be ignored.16 Incentives can also drive clinicians towards measured behaviours even when these do not represent good practice; for example, the Diabetes Service Incentive Payment encourages a clinician to continue measuring lipid levels in an elderly patient with limited life expectancy, when doing so is no longer appropriate.17

Patient safety

In a study on the use of RCA with general practice registrars, 30% of supervisors found a patient safety concern in the records of registrars, and 16% needed to contact a patient to remedy the problem.18 It is not clear if this rate will be replicated in reviews of the records of experienced clinicians, but it would be naive to believe no threats to safety will be found. RCA provides a ‘prospective’ approach to patient safety that complements the usual retrospective approach of analysing records where an error or ‘near miss’ has already been identified.

Legal

This article cannot cover in detail the legal requirements for examining records using RCA. A useful resource produced by the RACGP is Privacy and managing health information in general practice.19 In general, accessing records for quality improvement activities is covered by the same provisions that allow their access in practice accreditation visits.

Logistic

The COVID-19 pandemic has resulted in greater familiarity with the use of videoconferencing. Even without an infection-control imperative, holding meetings via videoconference may be preferable to face-to-face meetings if it enables more practice team members to attend and is therefore more inclusive. Countering these benefits are the potential for technical difficulties and the risk of missing non-verbal cues to a participant’s emotional response. Whichever meeting mode is selected, consideration should be given to finding a suitable time, confirming the medical records can be seen by all participants and ensuring confidentiality is maintained.

Educational environment

The educational environment is not just the physical surroundings; it also encompasses the psychological state of the participants, the interactions between them and the organisational culture.20 The educational environment significantly affects learning.21 Performance review with peers is likely to be a novel experience for most Australian GPs. For many, previous clinical group learning experiences as medical students may have had a negative impact that was detrimental to learning.22,23 Humiliation during learning can leave an indelible mark, creating cycles of abuse with the subtext being a false belief that painful exposure is the best learning motivator.24

RCA session facilitators have an important role in creating a best-practice educational environment; one that is welcoming and safe, values learning and seeks continuous quality improvement.25 The sessions should not descend into an interrogation of the presenting GP. Molloy and Bearman,26 noting the tension that exists for health professionals between vulnerability and credibility, recommend that senior clinicians go first and share their inner doubts and knowledge gaps before expecting junior clinicians to do so. This ‘intellectual candour’ or humility promotes a culture that acknowledges fallibility, allowing the focus to instead turn to what can be learnt and improved.

Providing feedback

Although the medical literature contains many publications about providing effective feedback, there is little high-quality evidence to support one method over another.27 In addition to following the ‘senior clinician goes first’ rule, the consulting skills of a good GP practising patient-centred medicine can be used as a model for providing effective feedback (Table 2).

Table 2. Using a general practice consultation model as a template for the provision of feedback
General practice consulting model | Feedback model
Patient centred | Learner centred
Patient’s agenda first | Learner’s issues first
Any immediate major health problems are addressed | Any significant patient safety issues are addressed
Other items on the doctor’s agenda are managed subject to time and patient readiness for change | The teacher’s issues are managed subject to time and the educational environment
Not everything has to be managed in one consultation | Not everything has to be learnt in one education session
Patient education is specific and clear to enable instructions to be followed | Feedback on performance is specific and clear to enable behaviour change

When consulting in the patient-centred medicine model, a GP gives priority to understanding and addressing the patient’s agenda and assesses their readiness for change before offering any other health messages they wish to impart. Unless there is a risk of missing a time-critical diagnosis, items on the doctor’s agenda are sometimes left for another day, or a brief intervention is used to plant a seed for more detailed consideration in the future. Not all of the patient’s health problems must be solved in one day. Any patient instructions provided are clear and specific, so the patient knows what to do.

Similarly, in RCA ‘consultations’, it is important to uncover the issues that matter to the clinician having their records reviewed. Addressing these issues must be given priority. The issues may be unexpected and differ from those identified by other clinicians in the room. In the absence of any clear safety concerns in the record reviewed, whether to progress to other concerns held by the peers requires nuanced judgement. The decision will depend on the group members weighing up the psychological, social and cultural factors in the learning environment. Although it is important that feedback does not avoid challenging conversations, each feedback interaction is part of an improvement process that operates over time and progresses as the relationship develops.28 It may be appropriate to leave some issues to another day. Any feedback that is provided should be specific about what might be done differently, so the clinician understands what behaviours they could consider changing.

Limitations

GPs working in practices with fewer than four GPs are not able to meet the RACGP CPD requirements for this activity. GPs with a specific interest – such as acupuncture, psychological medicine, addiction medicine, human immunodeficiency virus care or integrative medicine – might consider discussion with colleagues not practising in their fields less valuable. Consequently, scope of practice may need to be considered when forming a group to undertake RCA and when selecting records for review. However, GPs with specific interests must still ensure their practice is acceptable to their colleagues. Medicare requires that notes be sufficient to enable another GP, whether or not they have a specific interest, to continue care.

Alternatives

Even with the best facilitation, open review with colleagues is a confronting challenge, and individual clinical audit or multisource feedback may be a preferable method of performance review for some clinicians. Practice improvement can be achieved by other quality improvement activities such as Plan-Do-Study-Act cycles.

Conclusion

RCA in practice clinical team meetings is a novel method of peer review of performance that also promotes quality improvement in practice systems and can identify patient safety concerns. Successful implementation requires planning to overcome logistic hurdles and the creation of a learning environment that is safe, values learning and focuses on quality improvement.

Key points

  • Performance review will soon be a significant component of CPD.
  • RCA can be used in clinician meetings for peer review of performance.
  • Review of cases can springboard quality improvement discussions.
  • A supportive educational environment is required when learning with peers.
  • Feedback on cases should be learner-centred and specific.
Competing interests: None.
Provenance and peer review: Not commissioned, externally peer reviewed.
Funding: The authors declare that the development of the Random case analysis in practice clinical meetings manual was funded by a research grant provided by the Avant Foundation.
Correspondence to:
drgingham@gmail.com
References
  1. Medical Board of Australia. Good medical practice: A code of conduct for doctors in Australia. Melbourne, Vic: Australian Health Practitioner Regulation Agency, 2020. Available at www.medicalboard.gov.au/Codes-Guidelines-Policies/Code-of-conduct.aspx [Accessed 13 February 2021].
  2. Medical Board of Australia. Building a professional performance framework. Melbourne, Vic: Australian Health Practitioner Regulation Agency, 2018. Available at www.medicalboard.gov.au/registration/professional-performance-framework.aspx [Accessed 13 February 2021].
  3. The Royal College of General Practitioners. Review of practice. London, UK: RCGP, 2021. Available at www.rcgp.org.uk/training-exams/practice/revalidation/guide-to-supporting-information-for-appraisal-and-revalidation/review-of-practice.aspx [Accessed 13 February 2021].
  4. Medical Council of New Zealand. Policy on regular practice review. Wellington, NZ: MCNZ, 2016. Available at www.mcnz.org.nz/assets/Policies/23319bbd6c/Policy-on-regular-practice-review.pdf [Accessed 13 February 2021].
  5. Irish College of General Practitioners. Professional competence scheme. Dublin: ICGP, 2021. Available at www.icgp.ie/go/pcs/about_icgp_pcs [Accessed 27 January 2021].
  6. The Royal Australian College of General Practitioners. Continuing Professional Development (CPD) program: Handbook for general practitioners. East Melbourne, Vic: RACGP, 2020. Available at www.racgp.org.au/education/professional-development/qi-cpd/handbook-for-general-practitioners/cpd-program-requirements-for-the-2020-22-triennium [Accessed 27 January 2021].
  7. Morgan S, Ingham G. Random case analysis – A new framework for Australian general practice training. Aust Fam Physician 2013;42(1–2):69–73.
  8. The Royal Australian College of General Practitioners. Curriculum for Australian general practice 2016 – CS16 core skills unit. East Melbourne, Vic: RACGP, 2016.
  9. van Braak M, Visser M, Holtrop M, Statius Muller I, Bont J, van Dijk N. What motivates general practitioners to change practice behaviour? A qualitative study of audit and feedback group sessions in Dutch general practice. BMJ Open 2019;9(5):e025286. doi: 10.1136/bmjopen-2018-025286.
  10. Ingham G, MacIsaac M, Kippen R. Random case analysis in practice clinical meetings manual. Melbourne, Vic: Monash University, 2021. doi: 10.26180/14050742.
  11. Rohrbasser A, Kirk UB, Arvidsson E. Use of quality circles for primary care providers in 24 European countries: An online survey of European Society for Quality and Safety in family practice delegates. Scand J Prim Health Care 2019;37(3):302–11. doi: 10.1080/02813432.2019.1639902.
  12. Rohrbasser A, Harris J, Mickan S, Tal K, Wong G. Quality circles for quality improvement in primary health care: Their origins, spread, effectiveness and lacunae – A scoping review. PLoS One 2018;13(12):e0202616. doi: 10.1371/journal.pone.0202616.
  13. Knight AW, Caesar C, Ford D, Coughlin A, Frick C. Improving primary care in Australia through the Australian Primary Care Collaboratives Program: A quality improvement report. BMJ Qual Saf 2012;21(11):948–55. doi: 10.1136/bmjqs-2011-000165.
  14. Chew-Graham CA, Hunter C, Langer S, et al. How QOF is shaping primary care review consultations: A longitudinal qualitative study. BMC Fam Pract 2013;14:103. doi: 10.1186/1471-2296-14-103.
  15. Heath I, Hippisley-Cox J, Smeeth L. Measuring performance and missing the point? BMJ 2007;335(7629):1075–76. doi: 10.1136/bmj.39377.387373.AD.
  16. Dawda P. Revalidation – A personal reflection. Aust Fam Physician 2013;42(11):826–28.
  17. Australian Government Department of Health. Medicare Benefits Schedule – Note AN.0.54. Canberra, ACT: MBS Online, 2020. Available at www9.health.gov.au/mbs/fullDisplay.cfm?type=note&qt=NoteID&q=AN.0.54 [Accessed 13 February 2021].
  18. Morgan S, Ingham G, Kinsman L, Fry J. Clinical supervision using random case analysis in general practice training. Educ Prim Care 2015;26(1):40–46.
  19. The Royal Australian College of General Practitioners. Privacy and managing health information in general practice. East Melbourne, Vic: RACGP, 2017.
  20. Shochet RB, Colbert-Getz JM, Levine RB, Wright SM. Gauging events that influence students’ perceptions of the medical school learning environment: Findings from one institution. Acad Med 2013;88(2):246–52. doi: 10.1097/ACM.0b013e31827bfa14.
  21. Hutchinson L. Educational environment. BMJ 2003;326(7393):810–12. doi: 10.1136/bmj.326.7393.810.
  22. Dunham L, Dekhtyar M, Gruener G, et al. Medical student perceptions of the learning environment in medical school change as students transition to clinical training in undergraduate medical school. Teach Learn Med 2017;29(4):383–91. doi: 10.1080/10401334.2017.1297712.
  23. Hojat M, Vergare MJ, Maxwell K, et al. The devil is in the third year: A longitudinal study of erosion of empathy in medical school. Acad Med 2009;84(9):1182–91. doi: 10.1097/ACM.0b013e3181b17e55.
  24. Wilkinson TJ, Gill DJ, Fitzjohn J, Palmer CL, Mulder RT. The impact on students of adverse experiences during medical school. Med Teach 2006;28(2):129–35. doi: 10.1080/01421590600607195.
  25. Darcy Associates. The best practice clinical learning environment framework: Delivering quality clinical education for learners. Melbourne, Vic: Department of Health and Human Services Victoria, 2016.
  26. Molloy E, Bearman M. Embracing the tension between vulnerability and credibility: ‘Intellectual candour’ in health professions education. Med Educ 2019;53(1):32–41. doi: 10.1111/medu.13649.
  27. Bing-You R, Hayes V, Varaklis K, Trowbridge R, Kemp H, McKelvy D. Feedback for learners in medical education: What is known? A scoping review. Acad Med 2017;92(9):1346–54. doi: 10.1097/ACM.0000000000001578.
  28. Molloy E, Ajjawi R, Bearman M, Noble C, Rudland J, Ryan A. Challenging feedback myths: Values, learner involvement and promoting effects beyond the immediate task. Med Educ 2020;54(1):33–39. doi: 10.1111/medu.13802.

General practice | Ongoing vocational education | Quality care
