Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency

Yusuke Sugano, Andreas Bulling

Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 363-372, 2015.


Abstract

Head-mounted eye tracking has significant potential for gaze-based applications such as life logging, mental health monitoring, or the quantified self. However, a neglected challenge for such applications is that drift in the initial person-specific eye tracker calibration, for example caused by physical activity, can severely impact gaze estimation accuracy and, thus, system performance and user experience. We first analyse calibration drift on a new dataset of natural gaze data recorded using synchronised video-based and Electrooculography-based eye trackers from 20 users performing everyday activities in a mobile setting. Based on this analysis, we present a method to automatically self-calibrate head-mounted eye trackers using a computational model of bottom-up visual saliency. Through evaluations on the dataset we show that our method 1) is effective in reducing calibration drift in calibrated eye trackers and 2) given sufficient data, achieves gaze estimation accuracy competitive with that of a calibrated eye tracker, without any manual calibration.
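
The following is a minimal sketch of the core idea: pair raw eye tracker outputs with saliency maxima in the egocentric scene video and fit a gaze mapping from these weak labels. It assumes OpenCV's spectral-residual saliency (opencv-contrib) as a stand-in for the bottom-up saliency model, and a simple affine least-squares fit; the helper names (saliency_peak, fit_mapping) are illustrative, and the paper's actual saliency model and optimisation procedure may differ.

import numpy as np
import cv2  # requires opencv-contrib-python for the saliency module

# Spectral-residual saliency as a stand-in bottom-up saliency model.
saliency = cv2.saliency.StaticSaliencySpectralResidual_create()

def saliency_peak(scene_frame):
    """Return the (x, y) location of the saliency maximum in a scene frame."""
    ok, sal_map = saliency.computeSaliency(scene_frame)
    if not ok:
        return None
    y, x = np.unravel_index(np.argmax(sal_map), sal_map.shape)
    return float(x), float(y)

def fit_mapping(eye_features, scene_frames):
    """Fit an affine mapping from raw eye features (N x 2) to scene-camera
    coordinates, using each frame's saliency peak as a weak gaze label."""
    feats, targets = [], []
    for f, frame in zip(eye_features, scene_frames):
        peak = saliency_peak(frame)
        if peak is not None:
            feats.append(f)
            targets.append(peak)
    # Append a constant column so the fit includes a translation term.
    X = np.hstack([np.asarray(feats), np.ones((len(feats), 1))])
    W, *_ = np.linalg.lstsq(X, np.asarray(targets), rcond=None)
    return W  # apply as [ex, ey, 1] @ W -> (gx, gy)

In practice such a fit would only use frames with reliable, unambiguous saliency peaks and aggregate over many samples, since any single peak is a noisy proxy for the true gaze target.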

BibTeX

@inproceedings{sugano15_uist,
  title     = {Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency},
  author    = {Sugano, Yusuke and Bulling, Andreas},
  year      = {2015},
  booktitle = {Proc. ACM Symposium on User Interface Software and Technology (UIST)},
  doi       = {10.1145/2807442.2807445},
  pages     = {363--372},
  video     = {https://www.youtube.com/watch?v=CvsZ3YCWFPk}
}