Leveraging Historical Medical Records as a Proxy via Multimodal Modeling and Visualization to Enrich Medical Diagnostic Learning

Yang Ouyang, Yuchen Wu, He Wang, Chenyang Zhang, Furui Cheng, Chang Jiang, Lixia Jin, Yuanwu Cao, Quan Li

Room: 105

Session time: 2023-10-26, 05:45 UTC
Exemplar figure: The system interface of DiagnosisAssistant contains (A) the User panel, (B) the Embedding Transition View, (C) the Modality Exploration View, and (D) the Comparison View.
Keywords

Multimodal Medical Dataset, Visual Analytics, Explainable Machine Learning

Abstract

Simulation-based Medical Education (SBME) has been developed as a cost-effective means of enhancing the diagnostic skills of novice physicians and interns, thereby mitigating the need for resource-intensive mentor-apprentice training. However, the feedback provided in most SBME is directed toward improving learners' operational proficiency rather than toward the summative medical diagnoses that come with experience and time. Additionally, the multimodal nature of medical data during diagnosis poses significant challenges for interns and novice physicians, including the tendency to overlook or over-rely on data from certain modalities and difficulty in comprehending potential associations between modalities. To address these challenges, we present DiagnosisAssistant, a visual analytics system that leverages historical medical records as a proxy for multimodal modeling and visualization to enhance the learning experience of interns and novice physicians. The system employs carefully designed visualizations to explore data from different modalities, offer diagnostic interpretive hints based on the constructed model, and enable comparative analyses of specific patients. Our approach is validated through two case studies and expert interviews, demonstrating its effectiveness in enhancing medical training.