
AI in Medical Diagnosis

Introduction

Newswise — While the use of artificial intelligence (AI) for medical diagnosis is growing, new research by the University of Adelaide has found there are still major hurdles to overcome before AI can match a clinician.

The AI Chasm

In a paper published in The Lancet Digital Health, Australian Institute for Machine Learning PhD student Lana Tikhomirov, Professor Carolyn Semmler and their team from the University of Adelaide have drawn on external research to investigate what's known as the 'AI chasm'.

The AI chasm has occurred because the development and commercialisation of AI decision-making systems have outpaced our understanding of their value for clinicians and how they affect human decision-making.

Consequences of the AI Chasm

“This can have consequences such as automation bias (being blind to AI errors) or misapplication,” said Ms Tikhomirov.

“Misconceptions about AI also restrict our ability to maximise this new technology and augment the human properly.”

Comparison with Other High-Risk Settings

“Although technology implementation in other high-risk settings, such as increased automation in aeroplane cockpits, has been previously investigated to understand and improve how it is used, evaluating AI implementation for clinicians remains a neglected area,” Ms Tikhomirov added.

AI as a Clinical Tool

“We should be using AI more like a clinical drug rather than a device.”

Clinicians vs AI Models

The research found that clinicians are contextually motivated, mentally resourceful decision makers, whereas AI models make decisions without context, relying on correlations in data rather than an understanding of patients.

“The clinical environment is rich with sensory cues used to carry out diagnoses, even if they are unnoticeable to the novice observer,” said Ms Tikhomirov.

Importance of Experience

“With experience, clinicians learn which cues guide their attention towards the most clinically relevant information in their environment.”

“This ability to use domain-relevant information is known as cue utilisation and it is a hallmark of expertise which enables clinicians to rapidly extract the essential features from the clinical scene while remaining highly accurate, guiding subsequent processing and analysis of specific clinical features.”

Limitations of AI Models

“An AI model cannot question its dataset in the same way clinicians are encouraged to question the validity of what they have been taught: a practice in the clinical setting called epistemic humility.”

Conclusion

As AI continues to evolve, understanding its limitations and the unique capabilities of human clinicians is essential for effective integration into medical practice.