GazeConduits: Calibration-Free Cross-Device Collaboration through Gaze and Touch
Paper at ACM CHI '20
by Simon Voelker, Sebastian Hueber, Christian Holz, Christian Remy, and Nicolai Marquardt
Abstract
We present GazeConduits, a calibration-free ad-hoc mobile device setup that enables users to collaboratively interact with tablets, other users, and content in a cross-device setting using gaze and touch input. GazeConduits leverages recently introduced smartphone capabilities to detect facial features and estimate users’ gaze directions. To join a collaborative setting, users place one or more tablets onto a shared table and position their phone in the center, which then tracks the users present as well as their gaze directions to determine which tablets they are looking at.
Using GazeConduits, we demonstrate a series of techniques for collaborative interaction across mobile devices, covering content selection and manipulation. Our evaluation with 20 simultaneous tablets on a table shows that GazeConduits can reliably identify which tablet or which collaborator a user is looking at, enabling a rich set of interaction techniques.
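While this page does not include code, a minimal sketch of the core idea, estimating a gaze ray from the central phone's face tracking and mapping it to the nearest tablet, might look as follows in Swift with ARKit. The type names, the callback, and the assumption that tablet positions are known in the phone's coordinate space are hypothetical illustrations, not the authors' implementation.

```swift
import ARKit
import simd

// Hypothetical sketch only: the paper's own implementation is not shown here.
// Assumes the central phone runs ARKit face tracking and that each tablet's
// position is known in the phone's world coordinate space (e.g. from setup).
struct TrackedTablet {
    let id: Int
    let worldPosition: simd_float3
}

final class GazeTargetEstimator: NSObject, ARSessionDelegate {
    private let tablets: [TrackedTablet]
    var onGazeTarget: ((Int) -> Void)?   // reports the id of the tablet being looked at

    init(tablets: [TrackedTablet]) {
        self.tablets = tablets
        super.init()
    }

    func startTracking(in session: ARSession) {
        // Face tracking uses the front-facing TrueDepth camera.
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is expressed in face-local coordinates; lift it into world space.
            let lookAtWorld = simd_mul(face.transform, simd_float4(face.lookAtPoint, 1))
            let headOrigin  = simd_make_float3(face.transform.columns.3)
            let gazeDir     = simd_normalize(simd_make_float3(lookAtWorld) - headOrigin)

            // Choose the tablet whose direction from the head best matches the gaze ray.
            let target = tablets.min { a, b in
                angle(gazeDir, simd_normalize(a.worldPosition - headOrigin)) <
                angle(gazeDir, simd_normalize(b.worldPosition - headOrigin))
            }
            if let target = target {
                onGazeTarget?(target.id)
            }
        }
    }

    private func angle(_ a: simd_float3, _ b: simd_float3) -> Float {
        acos(max(-1, min(1, simd_dot(a, b))))
    }
}
```

A nearest-by-angle choice like this is only a stand-in; the paper's actual calibration-free mapping is described in the publication below.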
Publications
- Simon Voelker, Sebastian Hueber, Christian Holz, Christian Remy, and Nicolai Marquardt. GazeConduits: Calibration-Free Cross-Device Collaboration through Gaze and Touch. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, CHI '20, pages 451:1–451:10, ACM, New York, NY, USA, April 2020.