Interweaving Multimodal Interaction with Flexible Unit Visualizations for Data Exploration

Arjun Srinivasan, Bongshin Lee, John Stasko

View presentation: 2020-10-29T16:00:00Z
Exemplar figure
DataBreeze running on an 84” Microsoft Surface Hub with an external microphone placed on top of the display to record speech input.
Fast forward

Direct link to video on YouTube: https://youtu.be/dRA1FENai9I

Keywords

Multimodal interaction, Natural language interfaces, Speech interaction, Pen and touch interaction, Unit visualizations

Abstract

Multimodal interfaces that combine direct manipulation and natural language have shown great promise for data visualization. Such multimodal interfaces allow people to stay in the flow of their visual exploration by leveraging the strengths of one modality to complement the weaknesses of others. In this work, we introduce an approach that interweaves multimodal interaction (combining direct manipulation and natural language) with flexible unit visualizations. We employ the proposed approach in a proof-of-concept system, DataBreeze. Coupling pen, touch, and speech-based multimodal interaction with flexible unit visualizations, DataBreeze allows people to create and interact with both systematically bound (e.g., scatterplots, unit column charts) and manually customized views, enabling a novel visual data exploration experience. We describe our design process along with DataBreeze's interface and interactions, delineating specific aspects of the design that empower the synergistic use of multiple modalities. We also present a preliminary user study with DataBreeze, highlighting the data exploration patterns that participants employed. Finally, reflecting on our design process and preliminary user study, we discuss future research directions.