Discrimination of Gaze Directions Using Low-Level Eye Image Features

Yanxia Zhang, Andreas Bulling, Hans Gellersen

Proc. International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI), pp. 9-13, 2011.


Abstract

In mobile daily life settings, video-based gaze tracking faces challenges associated with changes in lighting conditions and artefacts in the video images caused by head and body movements. These challenges call for the development of new methods that are robust to such influences. In this paper we investigate the problem of gaze estimation, more specifically how to discriminate different gaze directions from eye images. In a 17-participant user study we record eye images for 13 different gaze directions from a standard webcam. We extract a total of 50 features from these images that encode information on colour, intensity and orientation. Using mRMR feature selection and a k-nearest neighbour (kNN) classifier we show that we can estimate these gaze directions with a mean recognition performance of 86%.
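The sketch below is a rough, hypothetical illustration of the pipeline the abstract describes, not the authors' implementation: low-level colour, intensity and orientation features are computed from an eye image, a mutual-information-based filter from scikit-learn stands in for mRMR feature selection, and a k-nearest-neighbour classifier predicts one of 13 gaze directions. All feature definitions, parameters and the synthetic data are illustrative assumptions.

# Minimal sketch, assuming numpy and scikit-learn; not the method from the paper.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline


def eye_image_features(img_rgb):
    """Extract a small vector of colour, intensity and orientation features."""
    img = img_rgb.astype(np.float32) / 255.0
    gray = img.mean(axis=2)

    # Colour: per-channel means and standard deviations (6 values).
    colour = np.concatenate([img.mean(axis=(0, 1)), img.std(axis=(0, 1))])

    # Intensity: coarse 8-bin histogram of grey values.
    intensity, _ = np.histogram(gray, bins=8, range=(0.0, 1.0), density=True)

    # Orientation: 8-bin histogram of gradient directions, weighted by magnitude.
    gy, gx = np.gradient(gray)
    mag = np.hypot(gx, gy).ravel()
    ang = np.arctan2(gy, gx).ravel()
    orientation, _ = np.histogram(ang, bins=8, range=(-np.pi, np.pi), weights=mag)
    orientation /= orientation.sum() + 1e-8

    return np.concatenate([colour, intensity, orientation])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in data: 13 gaze-direction classes, 20 eye images each.
    X = np.array([eye_image_features(rng.integers(0, 256, (36, 60, 3)))
                  for _ in range(13 * 20)])
    y = np.repeat(np.arange(13), 20)

    # Filter-style feature selection followed by kNN classification.
    clf = make_pipeline(SelectKBest(mutual_info_classif, k=10),
                        KNeighborsClassifier(n_neighbors=5))
    clf.fit(X, y)
    print("Training accuracy:", clf.score(X, y))

In the paper itself, the 50 features and the mRMR criterion differ from this simplified filter; the sketch only mirrors the overall feature-extraction, selection and classification structure.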

BibTeX

@inproceedings{zhang11_petmei,
  author    = {Zhang, Yanxia and Bulling, Andreas and Gellersen, Hans},
  title     = {Discrimination of Gaze Directions Using Low-Level Eye Image Features},
  booktitle = {Proc. International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI)},
  year      = {2011},
  pages     = {9-13},
  doi       = {10.1145/2029956.2029961}
}