Analysing EOG Signal Features for the Discrimination of Eye Movements with Wearable Devices

Mélodie Vidal, Andreas Bulling, Hans Gellersen

Proc. International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI), pp. 15-20, 2011.


Abstract

Eye tracking research in human-computer interaction and experimental psychology traditionally focuses on stationary devices and a small number of common eye movements. The advent of pervasive eye tracking promises new applications, such as eye-based mental health monitoring or eye-based activity and context recognition. These applications might require further research on additional eye movement types, such as smooth pursuits and the vestibulo-ocular reflex, as these movements have not been studied as extensively as saccades, fixations and blinks. In this paper, we report our first step towards an effective discrimination of these movements. In a user study, we collect naturalistic eye movements from 19 people using the two most common measurement techniques (EOG and IR-based). We develop a set of basic signal features that we extract from the collected eye movement data and show that a feature-based approach has the potential to discriminate between saccades, smooth pursuits, and vestibulo-ocular reflex movements.
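
To make the feature-based approach concrete, the sketch below computes a few basic window-level features from a raw EOG trace (velocity statistics and signal variance). The specific features, window length, and sampling rate are illustrative assumptions for this sketch and are not the feature set reported in the paper.

import numpy as np

def eog_window_features(eog, fs=128):
    """Compute a few basic signal features over one EOG window.

    `eog` is a 1-D array of EOG samples from one channel; `fs` is the
    sampling rate in Hz. This feature set is an illustrative assumption,
    not the one used in the paper.
    """
    velocity = np.diff(eog) * fs  # first derivative approximates eye velocity
    return {
        "mean_abs_velocity": float(np.mean(np.abs(velocity))),
        "peak_velocity": float(np.max(np.abs(velocity))),
        "signal_variance": float(np.var(eog)),
        # Saccades show short, high-velocity bursts, whereas smooth pursuit
        # and vestibulo-ocular reflex movements produce sustained,
        # lower-velocity signal changes.
        "velocity_skewness": float(
            ((velocity - velocity.mean()) ** 3).mean()
            / (velocity.std() ** 3 + 1e-9)
        ),
    }

# Example: slide non-overlapping 1-second windows over a recording
# and collect features for each window (placeholder data only).
if __name__ == "__main__":
    fs = 128
    recording = np.random.randn(10 * fs)  # placeholder EOG trace
    window = fs
    features = [
        eog_window_features(recording[i:i + window], fs)
        for i in range(0, len(recording) - window + 1, window)
    ]
    print(features[0])

Such per-window feature vectors could then be fed to any standard classifier to separate the three movement types; the choice of classifier is left open here.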


BibTeX

@inproceedings{vidal11_petmei,
  author    = {Vidal, M{\'{e}}lodie and Bulling, Andreas and Gellersen, Hans},
  title     = {Analysing EOG Signal Features for the Discrimination of Eye Movements with Wearable Devices},
  booktitle = {Proc. International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI)},
  year      = {2011},
  pages     = {15-20},
  doi       = {10.1145/2029956.2029962}
}