RoboHapalytics: A Robot Assisted Haptic Controller for Immersive Analytics

Shaozhang Dai, Tim Dwyer, Barrett Ens, Jim Smiley, Lonni Besançon

View presentation: 2022-10-19T19:48:00Z

Prerecorded Talk

The live footage of the talk, including the Q&A, can be viewed on the session page for Immersive Analytics and Situated Visualization.

Abstract

Immersive environments offer new possibilities for exploring three-dimensional volumetric or abstract data. However, typical mid-air interaction offers little guidance to the user in interacting with the resulting visuals. Previous work has explored the use of haptic controls to give users tangible affordances for interacting with the data, but these controls have either been limited in their range and resolution, been spatially fixed, or required users to manually align them with the data space. We explore the use of a robot arm with hand tracking to align tangible controls under the user’s fingers as they reach out to interact with data affordances. We begin with a study evaluating the effectiveness of a robot-extended slider control compared to a large fixed physical slider and a purely virtual mid-air slider. We find that the robot slider has similar accuracy to the physical slider but is significantly more accurate than mid-air interaction. Further, the robot slider can be arbitrarily reoriented, opening up many new possibilities for tangible haptic interaction with immersive visualisations. We demonstrate these possibilities through three use cases: selection in a time-series chart; interactive slicing of CT scans; and finally exploration of a scatter plot depicting time-varying socio-economic data.