Abstract
Understanding human attention in mobile interaction is a relevant part of human-computer interaction, indicating focus of task, emotion, and communication. The lack of large-scale studies that would enable statistically significant results is due to the high cost of manual annotation in eye-tracking analysis. With high-quality wearable eye-tracking cameras and Google Glass, video-based visual attention analysis will become ubiquitous, enabling automated large-scale annotation. We describe for the first time precise, marker-free gaze estimation on mobile displays and their surroundings, and report its performance. We demonstrate accurate recovery of the POR (point of regard) on the mobile device and enable heat mapping of visual tasks. In a benchmark test we achieve a mean accuracy of ≈1.5 mm in POR localization on the display, and the method is very robust to illumination changes. We conclude from these results that this system may open new avenues in eye-tracking research for behavior analysis in mobile applications.
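To illustrate the kind of analysis that recovered POR data enables, the following minimal sketch accumulates gaze samples (given in display millimetres) into a heat map over the mobile display. This is not the authors' implementation; the display size, bin resolution, and smoothing width are assumed values chosen for illustration, with the Gaussian sigma set to the reported ≈1.5 mm mean POR accuracy.

```python
# Illustrative sketch (assumed values, not the paper's implementation):
# accumulate point-of-regard (POR) samples into a display-space heat map.
import numpy as np
from scipy.ndimage import gaussian_filter

DISPLAY_W_MM, DISPLAY_H_MM = 62.0, 110.0   # assumed display size in mm
BIN_MM = 0.5                               # assumed heat-map bin size in mm

def por_heatmap(por_samples_mm, sigma_mm=1.5):
    """Accumulate (x, y) POR samples into a smoothed fixation density map.

    sigma_mm defaults to the reported ~1.5 mm mean POR accuracy, so the
    smoothing roughly reflects the localization uncertainty.
    """
    w_bins = int(DISPLAY_W_MM / BIN_MM)
    h_bins = int(DISPLAY_H_MM / BIN_MM)
    counts, _, _ = np.histogram2d(
        [y for _, y in por_samples_mm],            # rows: vertical position
        [x for x, _ in por_samples_mm],            # cols: horizontal position
        bins=(h_bins, w_bins),
        range=((0, DISPLAY_H_MM), (0, DISPLAY_W_MM)),
    )
    heat = gaussian_filter(counts, sigma=sigma_mm / BIN_MM)
    return heat / heat.max() if heat.max() > 0 else heat

# Usage with synthetic gaze samples clustered around one UI element.
rng = np.random.default_rng(0)
samples = rng.normal(loc=(31.0, 40.0), scale=2.0, size=(500, 2))
heatmap = por_heatmap([tuple(p) for p in samples])
print(heatmap.shape, heatmap.max())
```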
Original language | English
---|---
Title | Proceedings of the extended abstracts of the 32nd annual ACM conference on Human factors in computing systems
Pages | 1717-1722
Number of pages | 6
DOIs |
Publication status | Published - 2014
Event | CHI 2014 - One of a CHInd - Duration: 26 Apr 2014 → 1 May 2014
Conference
Conference | CHI 2014 - One of a CHInd
---|---
Period | 26/04/14 → 1/05/14
Research Field
- Former Research Field - Technology Experience
Keywords
- Human attention; gaze recovery; mobile interaction heat maps