Classifying Attention Types with Thermal Imaging and Eye Tracking

Yomna Abdelrahman, Anam Ahmad Khan, Joshua Newn, Eduardo Velloso, Sherine Ashraf Safwat, James Bailey, Andreas Bulling, Frank Vetere, Albrecht Schmidt

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 3(3), pp. 69:1-69:27, 2019.


Abstract

Despite the importance of attention for user performance, current methods for attention classification cannot discriminate between different attention types. We propose a novel method that combines thermal imaging and eye tracking to unobtrusively classify four types of attention: sustained, alternating, selective, and divided. We collected a dataset in a user study (N=22) in which we stimulated these four attention types using combinations of audio and visual stimuli while measuring users' facial temperature and eye movements. Using logistic regression on features extracted from both sensing technologies, we classify the four attention types with high AUC scores: up to 75.7% for user-independent condition-independent, 87% for user-independent condition-dependent, and 77.4% for user-dependent prediction. Our findings not only demonstrate the potential of thermal imaging and eye tracking for unobtrusive classification of different attention types but also pave the way for novel applications in attentive user interfaces and attention-aware computing.
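The classification setup described in the abstract — a logistic regression over features combined from both sensing modalities, evaluated with per-class AUC — can be sketched as follows. This is not the authors' code; the feature dimensions and the synthetic data are illustrative assumptions, and scikit-learn stands in for whatever toolkit the authors used.

```python
# Minimal sketch (assumption, not the paper's implementation):
# multinomial logistic regression over combined thermal + eye-tracking
# features, scored with one-vs-rest macro AUC over four attention types.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in: 200 samples with 6 thermal + 6 gaze features
# (dimensions are hypothetical), 4 attention-type labels:
# 0=sustained, 1=alternating, 2=selective, 3=divided.
X = rng.normal(size=(200, 12))
y = rng.integers(0, 4, size=200)
X[:, 0] += y  # make the classes weakly separable so AUC is meaningful

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_tr, y_tr)

# One-vs-rest macro AUC over the four attention types.
auc = roc_auc_score(y_te, clf.predict_proba(X_te), multi_class="ovr")
print(f"macro AUC: {auc:.3f}")
```

The user-independent vs. user-dependent results in the abstract correspond to different train/test splits (holding out whole participants vs. splitting within participants), not to a different classifier.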

BibTeX

@article{abdelrahman19_imwut,
  author  = {Abdelrahman, Yomna and Khan, Anam Ahmad and Newn, Joshua and Velloso, Eduardo and Safwat, Sherine Ashraf and Bailey, James and Bulling, Andreas and Vetere, Frank and Schmidt, Albrecht},
  title   = {Classifying Attention Types with Thermal Imaging and Eye Tracking},
  journal = {Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)},
  year    = {2019},
  volume  = {3},
  number  = {3},
  pages   = {69:1--69:27},
  doi     = {10.1145/3351227}
}