Prediction of Gaze Estimation Error for Error-Aware Gaze-Based Interfaces

Michael Barz, Florian Daiber, Andreas Bulling

Proc. International ACM Symposium on Eye Tracking Research and Applications (ETRA), pp. 275-278, 2016.


Abstract

Gaze estimation error is inherent in head-mounted eye trackers and seriously impacts the performance, usability, and user experience of gaze-based interfaces. Particularly in mobile settings, this error varies constantly as users move in front of and look at different parts of a display. We envision a new class of gaze-based interfaces that are aware of the gaze estimation error and adapt to it in real time. As a first step towards this vision we introduce an error model that is able to predict the gaze estimation error. Our method covers major building blocks of mobile gaze estimation, specifically mapping of pupil positions to scene camera coordinates, marker-based display detection, and mapping of gaze from scene camera to on-screen coordinates. We develop our model through a series of principled measurements of a state-of-the-art head-mounted eye tracker.
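The scene-camera-to-screen mapping mentioned in the abstract is commonly realised as a planar homography estimated from markers detected at the display corners. The sketch below illustrates that one step only; it is not the authors' implementation, and the marker and screen coordinates are hypothetical placeholder values.

```python
# Illustrative sketch (not the paper's code): map a gaze point from scene-camera
# coordinates to on-screen coordinates via a homography estimated from four
# detected display markers. Marker detection itself is assumed to have already
# produced `marker_px`; all coordinate values below are made up for illustration.
import numpy as np
import cv2

# Hypothetical input: display-corner markers as detected in the scene camera
# image (pixels) and their known positions on the display (pixels).
marker_px = np.float32([[212, 140], [1034, 152], [1020, 610], [226, 598]])
screen_px = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])

# Homography from scene-camera coordinates to on-screen coordinates.
H, _ = cv2.findHomography(marker_px, screen_px)

def map_gaze_to_screen(gaze_scene_xy):
    """Project a 2D gaze estimate from the scene camera onto the display."""
    pt = np.float32([[gaze_scene_xy]])      # shape (1, 1, 2) as OpenCV expects
    return cv2.perspectiveTransform(pt, H)[0, 0]

print(map_gaze_to_screen((640, 360)))       # gaze near the scene image centre
```

Because this mapping is re-estimated as the user moves relative to the display, errors in marker detection and in the pupil-to-scene mapping propagate into the on-screen gaze estimate, which is the error the paper's model aims to predict.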

Links

doi: https://doi.org/10.1145/2857491.2857493

BibTeX

@inproceedings{barz16_etra,
  author    = {Barz, Michael and Daiber, Florian and Bulling, Andreas},
  title     = {Prediction of Gaze Estimation Error for Error-Aware Gaze-Based Interfaces},
  booktitle = {Proc. International ACM Symposium on Eye Tracking Research and Applications (ETRA)},
  year      = {2016},
  pages     = {275--278},
  doi       = {10.1145/2857491.2857493}
}