Let's Get Vis-ical: Perceptual Accuracy in Visual & Tactile Encodings

Zhongzheng Xu, Emily Wall, Kristin Williams

Room: 104

2023-10-24T22:18:00Z
Keywords

Human-centered computing—Visualization—Visualization techniques; Human-centered computing—Visualization—Visualization design and evaluation methods

Abstract

In this paper, we explore the effectiveness of tactile data encodings rendered on swell paper, compared to visual encodings displayed as SVGs, for data perception tasks. By replicating and adapting Cleveland and McGill's graphical perception study for the tactile modality, we establish a novel tactile encoding hierarchy. In a study with 12 university students, we found that participants perceived visual encodings more accurately than tactile encodings when comparing values and judging their ratios, with lower cognitive load and better self-evaluated performance. Moreover, tactile encodings differed from their visual counterparts in how accurately values could be decoded from them. This suggests that data physicalizations will require design guidance different from that developed for visual encodings. By providing empirical evidence for the perceptual accuracy of tactile encodings, our work contributes to foundational research on forms of data representation that prioritize tactile perception, such as tactile graphics.