AI is coming to healthcare – and it’s here to help

Doug Hendrie

16/08/2019 11:59:46 AM

GP19 keynote speaker Dr Martin Seneviratne talks to newsGP about the hype – and real promise – of artificial intelligence in healthcare.

‘GPs will have a really important translator role to play between machine learning and the broader clinical world,’ according to Dr Martin Seneviratne.

The potential of artificial intelligence (AI) in healthcare has generated a considerable amount of excitement in recent years.
Not all of the hype is necessarily warranted, of course, but there is a real opportunity to improve healthcare if AI is used correctly.
That’s the view of Dr Martin Seneviratne, an Australian doctor turned clinical informatician.
‘There’s a lot of hype around how image-based medical specialties like dermatology and radiology will be overtaken by AI. But we’re far away from that, to be honest,’ Dr Seneviratne told newsGP.
‘The mantra coined by radiology professor Curtis Langlotz is that doctors who use AI will replace those who don’t.
‘So rather than doctors being replaced, there will be a redistribution of the kinds of things doctors work on.’
Dr Seneviratne is one of the keynote speakers at the RACGP’s annual conference for general practice – GP19 – to be held in Adelaide this October. He is a clinician scientist at DeepMind Health in London, with a previous role as the Digital Health Fellow at healthcare technology think-tank Stanford Medicine X.
‘If we do this right, we can hopefully give clinicians more time,’ he said.
‘There are a lot of routine tasks in medicine which are suited to AI automation. Contouring in many of the image-based specialties, screening for contraindicated medications, searching through a dense and disorganised tome of past medical notes.
‘The really nuanced part of making a rare diagnosis is very hard for AI, as well as being hard for a human.’
For GPs, that could mean being freed from some of the burden of documentation and reliance on computers.
‘Documentation is a constant issue, and so is having a computer separating you and the patient,’ Dr Seneviratne said.

‘The dream of this AI revolution is that it helps with the parts of medicine doctors and patients don’t like, creates a safety net for ensuring quality across the board, and gives clinicians more time to be with their patients.’

Dr Martin Seneviratne believes AI could liberate GPs from the ‘burden of documentation’.

Dr Seneviratne said GPs can expect to see AI first embedded in operational support software, tackling issues like patient flow and capacity management.
‘We have started to see this appear already, predicting how many patients you’re likely to see tomorrow, how many nurses you’ll need, and all of that resource planning,’ he said.
A later use of AI in general practice, Dr Seneviratne said, would be patient-level tools that could help GPs personalise therapy for each patient or predict likely prognoses.
‘That’s a very high bar of AI, requiring nuanced clinical prediction,’ he said. ‘But there’s certainly a lot of research in that space.’
Population-level tools are coming too.
‘You could look across your cohort of patients to see if there were outliers, people who might have fallen through the cracks who you could call in for review,’ Dr Seneviratne said.
‘You could see what the trajectory is for this population, what the burden of disease is in the cohort you’re looking after, and project it forward to see how much resourcing you’d need to deal with that.’
But AI should not be seen as a panacea.
One of Dr Seneviratne’s key interests is the ways in which health AI can fail. He lists three major failure modes that must be overcome for AI to achieve take-off in healthcare: bias, interpretability, and actionability.
‘Machine learning can have biases, particularly when you’re training models on historical data embedded with all sorts of biases, such as rural and urban populations or Indigenous and non-Indigenous, who historically have very different health outcomes,’ he said.
‘If you don’t account for biases, the model might start to predict poorer outcomes for certain groups. You risk repeating old errors, basically – it’s the garbage in, garbage out problem.’
These types of failure are of particular interest to Dr Seneviratne because of the light they shed on how AI actually works.
‘Another issue is the theme of interpretability,’ he said.
‘You might have this big deep-learning model that spits out a diagnosis if you give it the patient records, but it’s unclear how the model arrives at that answer. It’s the black box issue. In medicine, that isn’t palatable.
‘Will we educate clinicians not to necessarily accept but to understand these kinds of models? Will it be like drugs, where we don’t fully understand the clinical mechanism but we use them because they’re safe and effective in clinical trials? Will it be the same for algorithms?’
Actionability is another challenge. Many machine-learning algorithms excel in one specific area: pattern recognition. But finding the needle in a haystack is not the same as knowing what to do about it.
‘Though this can be really impressive, it’s a whole different ballgame to then become clinically actionable,’ Dr Seneviratne said.
‘The AI might make a wonderful prediction, but what should the clinician do with that? Should the patient have a scan? A new medication?
‘That’s something really special about health AI; it’s not just the mathematics of good prediction, but the bigger social intervention around it that goes into saying to a clinician or patient, “What should you do?”
‘It’s a shortcoming of a lot of machine-learning efforts to date.’
In his keynote address at GP19, Dr Seneviratne will stress that GPs will have a vital role to play in bringing machine learning into clinical practice.
‘GPs – who see the full spectrum of healthcare – have an opportunity to feed back to the machine-learning researchers to say, this is not useful, or this is dangerous, or this is how it could be improved,’ he said.
‘GPs will have a really important translator role to play between machine learning and the broader clinical world.’
GP19 will be held in Adelaide from 24–26 October. More information is available on the conference website.
These are Dr Seneviratne’s personal views and do not represent the views of his employer.


A.Prof Christopher David Hogan   17/08/2019 7:45:28 PM

As suggested by Dr Seneviratne, the issue with medical records is that they need to be accurate, complete and relevant. A difficulty is that in machine learning any information might be relevant.
You only know if it is relevant after analysis!