Real-Time Gaze Tracking with Event-Driven Eye Segmentation

Yu Feng, Nathan Goulding-Hotta, Asif Khan, Hans Reyserhove, Yuhao Zhu

View presentation: 2022-10-20T19:24:00Z (UTC)
Exemplar figure: Event-driven ROI-based eye segmentation.

Prerecorded Talk

The live footage of the talk, including the Q&A, can be viewed on the session page, VR Invited Talks.

Keywords

Gaze, eye tracking, event camera, segmentation

Abstract

Gaze tracking is increasingly becoming an essential component in Augmented and Virtual Reality. Modern gaze tracking algorithms are heavyweight: they operate at no more than 5 Hz on mobile processors, even though near-eye cameras comfortably operate at real-time rates (> 30 Hz). This paper presents a real-time eye tracking algorithm that operates, on average, at 30 Hz on a mobile processor and achieves 0.1°–0.5° gaze accuracy, all while requiring only 30K parameters, one to two orders of magnitude fewer than state-of-the-art eye tracking algorithms. The crux of our algorithm is an Auto ROI mode, which continuously predicts the Regions of Interest (ROIs) of near-eye images and judiciously processes only those ROIs for gaze estimation. To that end, we introduce a novel, lightweight ROI prediction algorithm that emulates an event camera. We discuss how a software emulation of events enables accurate ROI prediction without requiring specialized hardware. The code of our paper is available at https://github.com/horizon-research/edgaze.
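
To make the event-emulation idea concrete, the Python sketch below shows one way the core loop could look. It is a minimal illustration, assuming grayscale near-eye frames and a simple per-pixel log-intensity threshold in the spirit of real event cameras; the names emulate_events and predict_roi are hypothetical and do not correspond to the repository's actual API.

    import numpy as np

    def emulate_events(prev_frame, curr_frame, threshold=0.1):
        # Software event-camera emulation (illustrative): a pixel "fires" an
        # event when its log-intensity change between consecutive frames
        # exceeds a threshold, mimicking event-camera behavior in software.
        eps = 1e-6  # avoid log(0)
        log_prev = np.log(prev_frame.astype(np.float32) + eps)
        log_curr = np.log(curr_frame.astype(np.float32) + eps)
        return np.abs(log_curr - log_prev) > threshold  # boolean event map

    def predict_roi(events, margin=8):
        # Predict the ROI as the padded bounding box of the emulated events;
        # fall back to the full frame when no motion is detected.
        h, w = events.shape
        ys, xs = np.nonzero(events)
        if ys.size == 0:
            return 0, 0, w, h
        x0 = max(int(xs.min()) - margin, 0)
        y0 = max(int(ys.min()) - margin, 0)
        x1 = min(int(xs.max()) + 1 + margin, w)
        y1 = min(int(ys.max()) + 1 + margin, h)
        return x0, y0, x1, y1

    # Usage: crop each frame to the predicted ROI before running the
    # downstream segmentation / gaze-estimation network, so only the
    # moving eye region is processed.
    prev = np.full((400, 640), 100, dtype=np.uint8)  # dummy near-eye frames
    curr = prev.copy()
    curr[180:220, 300:360] = 160                     # simulated eye motion
    x0, y0, x1, y1 = predict_roi(emulate_events(prev, curr))
    roi = curr[y0:y1, x0:x1]
    print("ROI:", (x0, y0, x1, y1), "size:", roi.shape)

Because emulated events appear only where pixel intensities change, the expensive network would run on a small crop in typical frames, which is where the computational savings of an ROI-based pipeline come from.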