
Professional
Volume 47, Issue 8, August 2018

Producing a general practice workforce: Let’s count what counts

Tarun Sen Gupta, Carole Reeve, Sarah Larkins, Richard Hays
doi: 10.31128/AJGP-02-18-4488

Background
Medical workforce problems still dominate headlines despite considerable investment in education, training and other initiatives. There is little consensus about what Australia’s general practice workforce should look like or what training outcomes should be reported.
Objective
The aim of this paper was to explore a number of issues relevant to the outcomes of workforce programs and to offer suggestions for identifying and addressing them.
Discussion
Social accountability literature highlights the importance of outcomes focusing on community needs. We suggest that evaluations should ‘count what counts’ and be careful what is counted. Numbers are only part of the story; not everything that counts is counted, and synergies and cooperation are key. Australia has many general practice workforce programs that are generally heading in the right direction. We believe that closer attention to appropriate outcome measures is important if we are to maximise return on investment and get the best outcomes for the community.

Everyone talks about the weather but no one does anything about it.

The weather may well be a metaphor for the medical workforce in Australia in 2018. We all talk about it, but many things – for example, where medical practitioners choose to work – are largely beyond public control. There is little consensus about what the ideal workforce looks like, so monitoring progress is challenging. Australian and international reports still highlight geographical and vocational workforce maldistribution,1,2 made more acute by rising healthcare costs. Fewer articles relate these workforce issues to the resultant failure to meet community needs. Many possible solutions have been proposed, addressing funding, selection, training and the short-term locum workforce. These suggestions have resulted in considerable growth in government investment, particularly for training.3 However, despite these workforce initiatives, there is limited evidence of their efficacy.4 This article aims to provoke debate about which outcomes to measure in order to best assess how well we are producing a fit-for-purpose generalist workforce.

What is the workforce issue?

Despite dramatic growth in supply, Murray and Wilson observe, ‘The statement “we have plenty of doctors in Australia” would probably not pass the pub test’.5 A 2011 study noted that ‘compared with metropolitan areas, rural Australia is characterised by poorer health outcomes, which are linked to poorer access to health services and undersupply of general practice workforce’.6 The number of employed medical practitioners is growing, with a recent report suggesting 392 full-time equivalent (FTE) practitioners per 100,000 population in 2015, up from 374 per 100,000 in 2012. However, much of the growth was in non-general practitioner (GP) specialists: ‘the supply of … GPs … changed little between 2005 and 2015, ranging from 109 per 100,000 people in 2008 to 114 in 2015 (24,655 to 28,329 GPs)’.7

Walters et al note conflicting scenarios of national GP distribution. One study using self-reported work hours found that rural supply is equal to or above that in major cities; other data, based on Medicare Benefits Schedule billing, suggested poorer supply with increasing remoteness. They conclude:

These analyses are hampered by considerable limitations, including poor differentiation of consulting room general practice from on-call hours, procedural activity and hospital work. They poorly account for factors which increase with remoteness, including salaried activity, high workforce turnover and use of locums, poorer population health status, and reliance on international medical graduates. This compromises accurate perspectives of national GP supply and distribution.1

Despite these conflicting reports, there is little doubt that many communities still experience shortfalls; there is a risk that aggregates and averages can obscure the local picture. Reporting of workforce participation varies, with some studies using ‘head count’ (total numbers) instead of FTE at a time when working hours are generally reducing. We suggest that the real workforce measure is how well the distribution of medical professionals matches community needs. This demands a radical rethinking of the outcome measures that medical education institutions use to evaluate their success or otherwise at producing graduates and a thorough understanding of the needs of the communities they serve. Further qualitative research will help to increase understanding of how broader societal trends affect medical workforce participation.
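To make the head count versus FTE distinction concrete, the following minimal Python sketch shows how a community’s head-count rate can stay flat while its FTE supply falls as average working hours decline. The hours, population and 40-hour standard week are hypothetical illustrations, not figures from the studies cited above.

# Illustrative sketch only: hypothetical weekly hours and an assumed
# 40-hour standard week, showing how head count and FTE can diverge.

def fte(hours_per_week, standard_week=40.0):
    # Convert a list of self-reported weekly hours into full-time equivalents.
    return sum(h / standard_week for h in hours_per_week)

def per_100k(count, population):
    # Express a workforce count as a rate per 100,000 population.
    return 100_000 * count / population

# Hypothetical community of 50,000 people served by 60 GPs.
population = 50_000
hours_2012 = [45] * 60   # earlier cohort averaging longer working weeks
hours_2018 = [36] * 60   # same head count, shorter average working weeks

print(per_100k(60, population))               # head-count rate: 120.0 in both years
print(per_100k(fte(hours_2012), population))  # 135.0 FTE per 100,000
print(per_100k(fte(hours_2018), population))  # 108.0 FTE per 100,000

On these invented numbers the head-count rate is identical in both years, yet the FTE rate falls by a fifth – exactly the gap that ‘counting what counts’ is meant to expose.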

So, how are we doing?

A review of rural clinical schools (RCSs) over a decade found ‘extensive positive impacts on rural and regional communities, curriculum innovation in medical education programs and community engagement activities’.8 Many of these measures – such as investment in infrastructure, engagement of local clinicians and recruitment of new ones, and positive student experiences – were worthwhile, but the evidence was weaker for the ultimate outcome: recruitment of workforce. The authors note ‘… well-established programs are finding graduates who are returning to rural practice’, reminding us that workforce programs take time to produce results.

Other authors note limited reporting of outcomes from postgraduate GP training, suggesting, ‘Success of the regionalisation of the general practice training program in Australia will ultimately be measured by the ability of the program to deliver a sufficient rural general practice workforce to meet the health needs of rural communities’.6

Several points are worth considering when looking at reports of workforce outcomes. Intent is widely reported and is a helpful process indicator, particularly in new programs. For example, one study from the University of Sydney showed that rural career interest dropped from 20.7% (79/382) to 12.5% (54/433) between entry and exit. Ultimately, 8.1% (35/434) accepted a rural internship, although 14.5% (60/415) had indicated a first preference for a rural post. The authors felt this showed that rural placements had a stronger association with ultimate rural interest and internship than rural background, but this differs from other literature and merits further exploration.9

All such studies involve a number of complex, interrelated factors. Programs do not exist in isolation; qualitative studies may be needed to understand how these factors interact in a complex environment and influence graduates’ choices of career and location.

Ongoing evidence supports the rural pipeline. The MABEL (Medicine in Australia: Balancing Employment and Life) study, a large prospective Australian cohort study, found clear evidence of an association between school-age rural background and subsequent rural practice, particularly for GPs.10

A 2014 evaluation of the practice location of 536 James Cook University (JCU) medical graduates in the first seven cohorts showed that 65% undertook non-metropolitan internships. Of those in specialty training, the most frequent specialty was general practice (48%; 97/203), including 13% (27/203) in the ‘rural medicine’ subspecialty.11 Despite these promising outcomes, there is more to do: local workforce shortages in general practice and rural medicine persist, and ‘rural’ outcomes vary across the region, with less impact in remote communities. Importantly, rural interest increases over the course of the JCU program, with positive experiences, role-modelling and an attractive pathway creating excitement and interest – a kind of Venturi effect.11

Queensland Health’s Rural Generalist Pathway is an incentivised pathway encouraging junior doctors to train in rural and remote medicine.12 The pathway supports more than 400 medical officers across Queensland, with the largest group (43%, or 172 trainees and fellows) from JCU, illustrating the benefit of joined-up programs.13 A 2013 Ernst and Young report evaluated the extent to which the pathway met the needs and expectations of rural communities. A comparative analysis of administrative costs and recognition costs concluded that the pathway represented value for money, with a return on investment ratio conservatively estimated to be approximately 1.2.14

Kitchener suggests some suitable key performance outcome indicators to bridge the evidence gap in vocational training. These include the rural retention rate (RRR), the proportion of registrars remaining in rural practice one or more years after completing training, and the advanced rural skills proportion (ARSP), the proportion of all completing registrars attaining Fellowship of the Australian College of Rural and Remote Medicine or the Fellowship in Advanced Rural General Practice. He cites an RRR of 75% (38/51 registrars) and an ARSP of 49% (25/51 registrars) in one dedicated rural medical vocational training program.15
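As a rough illustration of how such indicators might be computed from routine training records, the minimal Python sketch below calculates an RRR and ARSP for a made-up cohort. The record structure and field names are invented for illustration and are not drawn from Kitchener’s data; only the denominators echo the figures quoted above.

# Illustrative sketch only: hypothetical completing-registrar records with
# invented field names, showing one way RRR and ARSP could be computed.

from dataclasses import dataclass

@dataclass
class Registrar:
    rural_one_year_post_training: bool   # in rural practice >= 1 year after completion
    advanced_rural_fellowship: bool      # FACRRM or FARGP attained

def rural_retention_rate(cohort):
    # Proportion of completing registrars in rural practice one or more years later.
    return sum(r.rural_one_year_post_training for r in cohort) / len(cohort)

def advanced_rural_skills_proportion(cohort):
    # Proportion of completing registrars holding an advanced rural qualification.
    return sum(r.advanced_rural_fellowship for r in cohort) / len(cohort)

# Hypothetical cohort of 51 completing registrars, echoing the denominators above.
cohort = ([Registrar(True, True)] * 25
          + [Registrar(True, False)] * 13
          + [Registrar(False, False)] * 13)

print(f"RRR:  {rural_retention_rate(cohort):.0%}")               # 75%
print(f"ARSP: {advanced_rural_skills_proportion(cohort):.0%}")   # 49%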

Many rural programs report higher proportions of their graduates entering rural practice than comparison groups. For example, a large Western Australian study tracked the actual practice location of over 90% of eligible graduates across eight cohorts, showing a strong association between RCS participation and the likelihood of working rurally. Graduates from urban backgrounds who participated in the RCS were much more likely to work in rural areas than others. The authors concluded that ‘these data substantiate the [Rural Clinical School of Western Australia] as an effective rural workforce strategy’. The proportion of RCS students working rurally (42/258, or 16.3%) was markedly higher than that of non-RCS students (36/759, or 4.7%), with impressive odds ratios quoted.16

However, a total of 8% of eight cohorts is less than the proportion of the state’s population that is ‘rural’, and 78 of 1017 graduates are unlikely to meet rural workforce needs. A recent WA workforce report projected a shortfall of 1450 medical practitioners across all medical specialties by 2025, including 974 GPs, ‘with the resulting risk of not being able to meet health service needs and compromising safety and quality of care’.17
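As a reminder that headline effect sizes can be sanity-checked against the raw counts, the short sketch below computes the crude (unadjusted) odds ratio implied by the proportions quoted above. It is purely illustrative and will not necessarily match the adjusted estimates reported in the study itself.

# Illustrative only: crude odds ratio from the counts quoted above (42/258 vs 36/759).
def odds_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    # Crude odds ratio of an outcome (eg rural work) between two groups.
    odds_exposed = exposed_cases / (exposed_total - exposed_cases)
    odds_unexposed = unexposed_cases / (unexposed_total - unexposed_cases)
    return odds_exposed / odds_unexposed

print(round(odds_ratio(42, 258, 36, 759), 1))   # about 3.9, before any adjustment

Even this crude figure of roughly 3.9 illustrates the point made above: an impressive ratio between two small proportions can coexist with an absolute contribution (78 of 1017 graduates) that falls well short of community need.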

Many solutions focus on education programs – training the right people in the right things in the right places at the right time. Medical schools have modified selection processes and curricula, developing a variety of short-term and long-term rural placements. GP training places have also expanded nationally, growing from 450 to 1500 in Australia between 2003 and 2015.18

While rurally focused education strategies are broadly accepted as a response to rural workforce shortages, there is little in the way of objective measures or evidence of whether (and why) these programs work.4,6,15 There is limited work exploring their utility in improving workforce coverage for other under-served populations. A growing body of evidence suggests that investing in socially accountable medical education could be a mechanism for addressing global health workforce maldistribution.19 The World Health Organization defines social accountability as:

The obligation of medical schools to direct education, research and service activities towards addressing the priority health concerns of the community, region or nation that they are mandated to serve. The priority health concerns are to be identified jointly by governments, healthcare organizations, health professionals and the public.20

We contend that training organisations have a responsibility to the communities they serve, whether they recognise it or not. This means they need to do more than train people to a set of standards – that is necessary but not sufficient. As part of their social accountability, educational institutions should be obliged to consider the populations they serve, examine how their selection, curricula and training programs are meeting these needs, and report against these measures. Given that training is publicly funded, there is an expectation – perhaps an obligation – that education providers and their trainees will meet community needs, within the constraints of their other social contracts. Training organisations should therefore work with key stakeholders and jointly hold themselves accountable for progress towards meeting these priorities through outcomes such as workforce, how well local needs are addressed, the alignment of research and service with these needs, or other targets such as the Sustainable Development Goals.

We support the idea of key outcome indicators, such as rural/remote retention rates, but believe current strategies (with a few notable exceptions) are not resulting in recruitment sufficient to meet community needs. Campbell et al cite the importance of the rural pipeline and of developing ‘shared programs and activities with RCSs, rural medical schools and local hospitals’.6 Such approaches need vertically integrated structures that pool resources and talent to find local solutions to local problems in both metropolitan and rural settings. The recent introduction of integrated rural clinical training hubs should support this approach.21

Restricting provider numbers for overseas-trained doctors has improved workforce supply in under-served areas and is one of the few ways of controlling the distribution of doctors nationally. In Queensland, 49.4% of rural doctors are overseas-trained,22 yet continued reliance on importing our workforce raises ethical issues about the impact on the developing world. The strategy of directing doctors to areas of need is also being pursued through rural scholarships and bonding programs, and effective evaluation of these programs will be important for informing future planning.

In addition to geographic redistribution, reasserting medical generalism and the role of general practice and primary care is essential to meet health needs and contain burgeoning health costs. One recent report noted, ‘General practice is increasingly seen as an important area where effective treatment, advice and intervention can prevent costly inpatient hospital stays’.17

We need to think hard about our multiple education and training programs. They are mostly world-class, but considerable activity and energy are diverted towards reporting of activity and governance. While this is necessary, particularly initially, routine monitoring through accreditation processes should then suffice so that the focus can move to measuring the most important outcomes. We argue that educational institutions should define the outcomes of interest in consultation with stakeholders, particularly the under-served, develop targets and devise ways to measure their progress. This should be public, transparent and accountable, so that funders, educators, managers and students work together towards a common goal. We suggest the following ways forward:

  • Count what counts: Consult widely, particularly with under-served communities, for the outcomes of interest. Community needs – reflecting our social accountability – are paramount.

  • Be careful what is counted: Head count (total numbers) may differ from full-time equivalent contribution, and workforce contribution is the key issue. Do not confuse association with causation.

  • Numbers are only part of the story: Qualitative approaches may help to understand the statistics.

  • Not everything that counts is counted: More sophisticated metrics may be needed; explore new ways to measure our ability to meet community needs.

  • Synergies and cooperation are key: Programs should act together; unintended consequences (eg from misguided targets) should be avoided; relationships should be prioritised and measured.

Definitions and good data sources are key. Helpfully, rural classifications are becoming more nuanced. We need to ask how many populations are benefiting, how many have their needs met and whether the most under-served communities are benefiting the most. Comparative data are useful, enabling meaningful comparisons of programs, but these should be moderated by an understanding of the ‘baseline’ and what is reasonable and expected.

Some common reporting outcomes can then be developed, including sensible frameworks enabling pooling of data, national comparisons and robust monitoring of progress. For Australian general practice, this means understanding the type of workforce we need and identifying current gaps, then tuning our training programs accordingly. Developing a self-sustaining workforce may need educational reform, shifting mindsets for students and staff, and advocacy for successful programs. But if a wealthy country with a strong education system like Australia cannot solve its general practice workforce problems, then who can?

We think the future is bright, with many positive initiatives that will bear fruit. Many strategies are now in place, including the recently introduced college-based selection and the National Rural Generalist Pathway. Rural doctors have welcomed the Stronger Rural Health Strategy announced in the recent budget.23 But let us get the reporting right to achieve the best outcomes for funders and the communities we serve. If assessment drives learning, perhaps evaluation metrics drive educational outcomes.

Competing interests: None.
Provenance and peer review: Commissioned, externally peer reviewed.
References
  1. Walters LK, McGrail MR, Carson DB, et al. Where to next for rural general practice policy and research in Australia? Med J Aust 2017;207(2):56–58. doi: 10.5694/mja17.000216.
  2. Laurence CO, Karnon J. Improving the planning of the GP workforce in Australia: A simulation model incorporating work transitions, health need and service usage. Hum Resour Health 2016;14:13. doi: 10.1186/s12960-016-0110-2.
  3. Hays R. Rural initiatives at the James Cook University School of Medicine: A vertically integrated regional/rural/remote medical education provider. Aust J Rural Health 2001;9(Suppl 1):S2–5.
  4. Strasser R, Couper I, Wynn-Jones J, Rourke J, Chater A, Reid S. Education for rural practice in rural practice. Educ Prim Care 2016;27(1):10–14. doi: 10.1080/14739879.2015.1128684.
  5. Murray R, Wilson A. How can Australia have too many doctors, but still not meet patient needs? The Conversation. 5 June 2017. Available at https://theconversation.com/how-can-australia-have-too-many-doctors-but-still-not-meet-patient-needs-78535 [Accessed 29 January 2018].
  6. Campbell DG, Greacen JH, Giddings PH, Skinner LP. Regionalisation of general practice training – Are we meeting the needs of rural Australia? Med J Aust 2011;194(11):S71–74.
  7. Australian Institute of Health and Welfare. Medical practitioners workforce 2015. Canberra: AIHW, updated Aug 2016. Available at www.aihw.gov.au/reports/workforce/medical-practitioners-workforce-2015/contents/how-many-medical-practitioners-are-there [Accessed 29 January 2018].
  8. Greenhill J, Walker J, Playford D. Outcomes of Australian rural clinical schools: A decade of success building the rural medical workforce through the education and training continuum. Rural Remote Health 2015;15(3):2991.
  9. Clark TR, Freedman SB, Croft AJ, et al. Medical graduates becoming rural doctors: Rural background versus extended rural placement. Med J Aust 2013;199(11):779–82. doi: 10.5694/mja13.10036.
  10. McGrail M, Humphreys J, Joyce C. Nature of association between rural background and practice location: A comparison of general practitioners and specialists. BMC Health Serv Res 2011;11:63. doi: 10.1186/1472-6963-11-63.
  11. Sen Gupta T, Woolley T, Murray R, Hays R, McCloskey T. Positive impacts on rural and regional workforce from the first seven cohorts of James Cook University medical graduates. Rural Remote Health 2014;14:2657.
  12. Sen Gupta T, Manahan D, Lennox D, Taylor N. The Queensland Health Rural Generalist Pathway: Providing a medical workforce for the bush. Rural Remote Health 2013;13:2319.
  13. Queensland Health. The Generalist Pathway: 2017 data summary. Queensland: Queensland Health, 2017. Available at http://ruralgeneralist.qld.gov.au/wp-content/uploads/2017/06/data-summry-as-per-april.pdf [Accessed 29 January 2018].
  14. Ernst & Young. Evaluation and investigative study of the Queensland Rural Generalist Program. Queensland: Queensland Health, February 2013. Available at http://ruralgeneralist.qld.gov.au/wp-content/uploads/2017/07/qrgpeval_rpt_feb13..pdf [Accessed 29 January 2018].
  15. Kitchener S. Reporting rural workforce outcomes of rural-based postgraduate vocational training. Med J Aust 2015;202(1):18–19.
  16. Playford DE, Evans SF, Atkinson DN, Auret KA, Riley GJ. Impact of the Rural Clinical School of Western Australia on work location of medical graduates. Med J Aust 2014;200(2):104–07.
  17. Western Australia Department of Health. Medical workforce report 2015–16. East Perth, WA: Medical Workforce Branch, Office of the Chief Medical Officer, Department of Health, 2017. Available at http://ww2.health.wa.gov.au/~/media/Files/Corporate/Reports%20and%20publications/Medical%20Workforce/Medical-Workforce-Report-2015-16.pdf [Accessed 29 January 2018].
  18. Australian General Practice Training. Annual report to 30 June 2014: Chair’s report. Canberra: GPET, 2014. Available at www.agpt.com.au/About-Us/Annual-Report [Accessed 1 February 2018].
  19. Larkins S, Preston R, Matte M, et al. Measuring social accountability in health professional education: Development and international pilot testing of an evaluation framework. Med Teach 2013;35(1):32–45. doi: 10.3109/0142159X.2012.731106.
  20. Boelen C, Heck JE. Defining and measuring the social accountability of medical schools. Geneva: World Health Organization, 1995. Available at https://medicine.usask.ca/documents/social-accountability/WHO%20Original%20Article%20on%20SA%201995.pdf [Accessed 29 January 2018].
  21. Department of Health. Regional training hubs. Canberra: DoH, 2017. Available at www.health.gov.au/internet/main/publishing.nsf/content/regional-training-hubs [Accessed 29 January 2018].
  22. Health Workforce Queensland. Minimum data set summary report 2016. Brisbane: Health Workforce Queensland, 2016. Available at www.healthworkforce.com.au/media/Healthworkforce/client/20170628_MDS_Web.pdf [Accessed 29 January 2018].
  23. Rural Doctors Association of Australia. Budget delivers for rural health. Manuka, ACT: Rural Doctors Association of Australia, [date unknown]. Available at www.rdaa.com.au/documents/item/398 [Accessed 1 June 2018].
