Out with AI, in with the psychiatrist: a preference for human-derived clinical decision support in depression care

Psychology

M. M. Maslej, S. Kloiber, et al.

This study by Marta M. Maslej and colleagues explores psychiatrists' views on AI-based clinical support tools for major depressive disorder. Despite advances in AI, the findings reveal a preference for human-generated summaries and recommendations, one that persisted even when the information provided was correct. Dive into this research to understand the perceptions shaping the future of psychiatric care.

Abstract
Advancements in artificial intelligence (AI) are enabling the development of clinical support tools (CSTs) in psychiatry to facilitate the review of patient data and inform clinical care. To promote their successful integration and prevent over-reliance, it is important to understand how psychiatrists will respond to information provided by AI-based CSTs, particularly if it is incorrect. We conducted an experiment to examine psychiatrists’ perceptions of AI-based CSTs for treating major depressive disorder (MDD) and to determine whether perceptions interacted with the quality of CST information. Eighty-three psychiatrists read clinical notes about a hypothetical patient with MDD and reviewed two CSTs embedded within a single dashboard: the note’s summary and a treatment recommendation. Psychiatrists were randomised to believe the source of CSTs was either AI or another psychiatrist, and across four notes, CSTs provided either correct or incorrect information. Psychiatrists rated the CSTs on various attributes. Ratings for note summaries were less favourable when psychiatrists believed the notes were generated with AI as compared to another psychiatrist, regardless of whether the notes provided correct or incorrect information. A smaller preference for psychiatrist-generated information emerged in ratings of attributes that reflected the summary’s accuracy or its inclusion of important information from the full clinical note. Ratings for treatment recommendations were also less favourable when their perceived source was AI, but only when recommendations were correct. There was little evidence that clinical expertise or familiarity with AI impacted results. These findings suggest that psychiatrists prefer human-derived CSTs. This preference was less pronounced for ratings that may have prompted a deeper review of CST information (i.e. comparing the summary against the full clinical note to evaluate its accuracy or inclusion of important information, or assessing the appropriateness of a treatment recommendation). Future work should explore other contextual factors and avenues for integrating AI into psychiatric care.
Publisher
Translational Psychiatry
Published On
Jun 16, 2023
Authors
Marta M. Maslej, Stefan Kloiber, Marzyeh Ghassemi, Joanna Yu, Sean L. Hill
Tags
psychiatry
AI clinical support tools
major depressive disorder
human preference
treatment recommendations
clinical expertise
user perceptions