How far are we from quantifying visual attention in mobile HCI?

Mihai Bâce, Sander Staal, Andreas Bulling

IEEE Pervasive Computing, 19(2), pp. 46-55, 2020.


With an ever-increasing number of mobile devices competing for attention, quantifying when, how often, or for how long users look at their devices has emerged as a key challenge in mobile human-computer interaction. Encouraged by recent advances in automatic eye contact detection using machine learning and device-integrated cameras, we provide a fundamental investigation into the feasibility of quantifying overt visual attention during everyday mobile interactions. We discuss the main challenges and sources of error associated with sensing visual attention on mobile devices in the wild, including the impact of face and eye visibility, the importance of robust head pose estimation, and the need for accurate gaze estimation. Our analysis informs future research on this emerging topic and underlines the potential of eye contact detection for exciting new applications towards next-generation pervasive attentive user interfaces.



@article{bace20_pcm,
  title   = {How far are we from quantifying visual attention in mobile HCI?},
  author  = {Bâce, Mihai and Staal, Sander and Bulling, Andreas},
  journal = {IEEE Pervasive Computing},
  year    = {2020},
  volume  = {19},
  number  = {2},
  pages   = {46-55},
  doi     = {10.1109/MPRV.2020.2967736}
}