EyeWear Computers for Human-Computer Interaction

Andreas Bulling, Kai Kunze

ACM Interactions, 23(3), pp. 70-73, 2016.


Abstract

Head-worn displays and eye trackers, augmented and virtual reality glasses, egocentric cameras, and other "smart eyewear" have recently emerged as research platforms in fields such as ubiquitous computing, computer vision, and cognitive and social science. While earlier generations of devices were too bulky to be worn regularly, recent technological advances have made eyewear unobtrusive and lightweight, and therefore more suitable for daily use. Given that many human senses are located on the head, smart eyewear provides opportunities for types of interaction that were previously impossible. In this article, we highlight the potential of eyewear computing for HCI, discuss available input and output modalities, and suggest the most promising future directions for eyewear computing research, namely multimodal user modeling, lifelong learning, and large-scale (collective) human-behavior sensing and analysis.

BibTeX

@article{bulling16_interactions,
  title   = {EyeWear Computers for Human-Computer Interaction},
  author  = {Bulling, Andreas and Kunze, Kai},
  year    = {2016},
  journal = {ACM Interactions},
  volume  = {23},
  number  = {3},
  pages   = {70--73},
  doi     = {10.1145/2912886}
}