Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors

Julian Steil, Philipp Müller, Yusuke Sugano, Andreas Bulling

Proc. ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), pp. 1:1–1:13, 2018.

Best paper award


Abstract

Visual attention is highly fragmented during mobile interactions, but the erratic nature of attention shifts currently limits attentive user interfaces to adapting after the fact, i.e. after shifts have already happened. We instead study attention forecasting – the challenging task of predicting users' gaze behavior (overt visual attention) in the near future. We present a novel long-term dataset of everyday mobile phone interactions, continuously recorded from 20 participants engaged in common activities on a university campus over 4.5 hours each (more than 90 hours in total). We propose a proof-of-concept method that uses device-integrated sensors and body-worn cameras to encode rich information on device usage and users' visual scene. We demonstrate that our method can forecast bidirectional attention shifts and whether the primary attentional focus is on the handheld mobile device. We study the impact of different feature sets on forecasting performance and discuss the significant potential, but also the remaining challenges, of forecasting user attention during mobile interactions.
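As a rough illustration of how such a task can be framed, the sketch below casts attention forecasting as binary classification over windowed sensor features (e.g. label 1 if an attention shift away from the device occurs within the next few seconds). The feature dimensions, window labeling, and classifier choice are assumptions for illustration only and are not taken from the paper; the data here is a synthetic stand-in.

# Hypothetical sketch: attention-shift forecasting as binary
# classification over per-window sensor features. All names and
# parameters are illustrative assumptions, not the paper's method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per 1-second window of device and
# wearable sensor features (e.g. IMU statistics, touch events, scene
# descriptors); y = 1 if an attention shift follows the window.
n_windows, n_features = 2000, 32
X = rng.normal(size=(n_windows, n_features))
y = rng.integers(0, 2, size=n_windows)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("F1:", f1_score(y_test, clf.predict(X_test)))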

BibTeX

@inproceedings{steil18_mobilehci,
  author    = {Steil, Julian and M{\"{u}}ller, Philipp and Sugano, Yusuke and Bulling, Andreas},
  title     = {Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors},
  booktitle = {Proc. ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI)},
  year      = {2018},
  doi       = {10.1145/3229434.3229439},
  pages     = {1:1--1:13}
}