Abstract
Objective
This study investigates how AI-provided explanations affect efficiency, diagnostic accuracy, user perceptions, and workflow integration in ophthalmologists’ clinical diagnostic and treatment workflows, and explores the challenges of human-AI interaction with transparency features in time-sensitive environments.
Background
While explainable AI (XAI) aims to foster trust and understanding, its introduction into complex work domains can unintentionally increase cognitive load and disrupt workflows, especially in high-stakes medical settings, potentially impairing system performance.
Method
This multi-phase, mixed-methods study comprised two parts. Study 1 (N = 32) was a between-subjects experiment in which ophthalmologists diagnosed diabetic retinopathy with AI support, either with or without visual explanations (e.g., highlighted lesions). Measures included diagnostic accuracy, diagnostic time, trust, and perceived usefulness. Study 2 (N = 11) employed qualitative methods, including think-aloud protocols and interviews, to explore clinicians’ experiences with AI in daily treatment workflows.
Results
In Study 1, explanations did not improve accuracy but increased decision time, reducing efficiency. Trends also suggested lower perceived usefulness and trust in the explanation condition. Qualitative data from Study 2 supported these findings: clinicians found explanations time-consuming and disruptive and questioned their practical value, especially for routine cases.
Conclusion
A critical trade-off exists between pursuing AI transparency and the operational demand for efficiency. Explanations, while well-intentioned, can function as efficiency pitfalls in time-pressured clinical practice, highlighting the boundary conditions and challenges in designing effective human-AI systems.
Application
These insights inform future AI system design, favoring adaptable, on-demand explanations tailored to user needs. Such a user-centric approach supports complex cases without impeding routine task efficiency.