Discovery of Everyday Human Activities From Long-term Visual Behaviour Using Topic Models

Julian Steil, Andreas Bulling

Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 75-85, 2015.


Abstract

Human visual behaviour has significant potential for activity recognition and computational behaviour analysis, but previous work has focused on supervised methods and the recognition of predefined activity classes from short-term eye movement recordings. We propose a fully unsupervised method to discover users' everyday activities from their long-term visual behaviour. Our method combines a bag-of-words representation of visual behaviour that encodes saccades, fixations, and blinks with a latent Dirichlet allocation (LDA) topic model. We further propose different methods to encode saccades for use in the topic model. We evaluate our method on a novel long-term gaze dataset that contains full-day recordings of the natural visual behaviour of 10 participants (more than 80 hours in total). We also provide annotations for eight sample activity classes (outdoor, social interaction, focused work, travel, reading, computer work, watching media, eating) and for periods with no specific activity. We show that our method can discover these activities with performance competitive with previously published supervised methods.
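
To make the pipeline concrete, below is a minimal sketch of a bag-of-words plus LDA workflow of the kind the abstract describes, using scikit-learn. The event encoding (direction and amplitude bins for saccades, dictionary keys such as "direction_deg") is a hypothetical stand-in chosen for illustration, not one of the encodings evaluated in the paper.

# Minimal sketch of a bag-of-words + LDA pipeline over gaze events.
# The saccade encoding below is a plausible stand-in, not the authors' exact scheme.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def encode_event(event):
    """Map a gaze event to a discrete 'word'.

    `event` is assumed (hypothetically) to be a dict like
    {"type": "saccade", "direction_deg": 87.0, "amplitude_deg": 4.2}
    or {"type": "fixation"} / {"type": "blink"}.
    """
    if event["type"] == "saccade":
        # Quantise direction into 8 sectors and amplitude into small/large.
        sector = int(event["direction_deg"] // 45) % 8
        size = "L" if event["amplitude_deg"] > 5.0 else "S"
        return f"sac_{sector}_{size}"
    return event["type"]  # "fixation" and "blink" become words of their own

def windows_to_documents(event_windows):
    """Turn each time window (a list of events) into one space-separated document."""
    return [" ".join(encode_event(e) for e in w) for w in event_windows]

# Toy data: two synthetic event windows (real input would be sliding
# windows over a full-day gaze recording).
event_windows = [
    [{"type": "fixation"},
     {"type": "saccade", "direction_deg": 10, "amplitude_deg": 2}] * 20,
    [{"type": "blink"},
     {"type": "saccade", "dire_deg" if False else "direction_deg": 200, "amplitude_deg": 9}] * 20,
]

docs = windows_to_documents(event_windows)
vectorizer = CountVectorizer(token_pattern=r"\S+")  # keep tokens like 'sac_0_S'
counts = vectorizer.fit_transform(docs)

# Fit LDA; n_components would be chosen to match the expected number
# of latent activities.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)  # per-window topic mixtures, rows sum to ~1
print(theta)

The resulting per-window topic mixtures can then be compared against annotated activity labels, e.g. by assigning each window its dominant topic and measuring agreement with the ground-truth classes.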

BibTeX

@inproceedings{steil15_ubicomp,
  author    = {Steil, Julian and Bulling, Andreas},
  title     = {Discovery of Everyday Human Activities From Long-term Visual Behaviour Using Topic Models},
  booktitle = {Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp)},
  year      = {2015},
  pages     = {75--85},
  doi       = {10.1145/2750858.2807520}
}