EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded From Wearable Electrooculography

Andreas Bulling, Daniel Roggen, Gerhard Tröster

Proc. ACM International Conference on Fun and Games (FnG), pp. 33-45, 2008.


Abstract

Physical activity has emerged as a novel input modality for so-called active video games. Input devices such as music instruments, dance mats or the Wii accessories allow for novel ways of interaction and a more immersive gaming experience. In this work we describe how eye movements recognised from electrooculographic (EOG) signals can be used for gaming purposes in three different scenarios. In contrast to common video-based systems, EOG can be implemented as a wearable and light-weight system which allows for long-term use with unconstrained simultaneous physical activity. In a stationary computer game we show that eye gestures of varying complexity can be recognised online with equal performance to a state-of-the-art video-based system. For pervasive gaming scenarios, we show how eye movements can be recognised in the presence of signal artefacts caused by physical activity such as walking. Finally, we describe possible future context-aware games which exploit unconscious eye movements and show which possibilities this new input modality may open up.
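The paper's recognition pipeline is not reproduced here, but the core idea of detecting eye movements in an EOG signal can be illustrated with the common velocity-threshold approach to saccade detection. This is a minimal sketch under assumed values: the sampling rate, threshold, and synthetic step-like signal are illustrative, not taken from the paper.

```python
# Hedged sketch: saccade detection on a horizontal EOG channel via a
# derivative (velocity) threshold. Parameters are illustrative assumptions,
# not the authors' actual values.
import numpy as np

def detect_saccades(eog, fs=128, threshold_uv_per_s=2000.0):
    """Return sample indices where the EOG slope exceeds a velocity threshold."""
    velocity = np.gradient(eog) * fs               # approximate derivative in uV/s
    above = np.abs(velocity) > threshold_uv_per_s  # samples moving "fast enough"
    # collapse runs of consecutive fast samples into single saccade onsets
    onsets = np.flatnonzero(above & ~np.roll(above, 1))
    return onsets

# synthetic 1 s signal (microvolts): steady gaze with two step-like saccades
fs = 128
eog = np.zeros(fs)
eog[40:] += 200.0   # rightward saccade around sample 40
eog[90:] -= 200.0   # leftward saccade around sample 90
print(detect_saccades(eog, fs))  # → [39 89]
```

In practice, EOG signals also contain blinks, drift, and the motion artefacts the paper addresses for walking scenarios, so a real system would filter those before thresholding.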

BibTeX

@inproceedings{bulling08_fng,
  author    = {Bulling, Andreas and Roggen, Daniel and Tr{\"{o}}ster, Gerhard},
  title     = {EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded From Wearable Electrooculography},
  booktitle = {Proc. ACM International Conference on Fun and Games (FnG)},
  year      = {2008},
  pages     = {33-45},
  doi       = {10.1007/978-3-540-88322-7_4},
  keywords  = {Active Video Games, Context-awareness, Electrooculography (EOG), Eye Tracking, Human-Computer Interaction (HCI), Location-Based Gaming, Pervasive Gaming}
}