L Yasmeen Abdrabou (Lancaster University), Mariam Hassib (Fortiss Research Institute of the Free State of Bavaria), Shuqin Hu (LMU Munich), Ken Pfeuffer (Aarhus University), Mohamed Khamis (University of Glasgow), Andreas Bulling (University of Stuttgart), Florian Alt (University of the Bundeswehr Munich)

Existing gaze-based methods for user identification either require special-purpose visual stimuli or artificial gaze behaviour. Here, we explore how users can be differentiated by analysing natural gaze behaviour while freely looking at images. Our approach is based on the observation that looking at different images, for example, a picture from your last holiday, induces a stronger emotional response that is reflected in gaze behaviour and, hence, is unique to the person who experienced that situation. We collected gaze data in a remote study (N = 39) where participants looked at three image categories: personal images, other people’s images, and random images from the Internet. We demonstrate the potential of identifying different people using machine learning with an accuracy of 85%. The results pave the way towards a new class of authentication methods based solely on natural human gaze behaviour.
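To illustrate the general idea of identifying users from natural gaze behaviour, the sketch below shows a minimal, hypothetical pipeline: it extracts coarse fixation and velocity statistics from raw gaze samples and trains an off-the-shelf classifier. This is not the authors' actual feature set or model; the data format, features, and RandomForest choice are assumptions made purely for illustration.

```python
# A minimal, hypothetical sketch of gaze-based user identification,
# NOT the paper's pipeline: it assumes gaze recordings are available
# as (x, y, timestamp) samples per trial and uses simple statistics
# with a RandomForest classifier from scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def extract_features(samples: np.ndarray) -> np.ndarray:
    """Compute coarse gaze features from an (N, 3) array of x, y, t samples."""
    x, y, t = samples[:, 0], samples[:, 1], samples[:, 2]
    dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
    velocity = np.hypot(dx, dy) / np.maximum(dt, 1e-6)  # gaze speed per sample
    return np.array([
        x.mean(), y.mean(),               # average gaze position
        x.std(), y.std(),                 # spatial dispersion
        velocity.mean(), velocity.max(),  # saccade-like dynamics
    ])

# Toy synthetic data standing in for real recordings: 10 trials
# for each of 3 hypothetical users.
rng = np.random.default_rng(0)
trials, labels = [], []
for user in range(3):
    for _ in range(10):
        n = 200
        t = np.cumsum(rng.uniform(0.01, 0.03, n))
        xy = rng.normal(loc=user * 50, scale=20 + 5 * user, size=(n, 2))
        trials.append(np.column_stack([xy, t]))
        labels.append(user)

X = np.vstack([extract_features(s) for s in trials])
y = np.array(labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

In practice, richer features (fixation durations, scanpath shape, pupil dynamics) and per-image-category analysis would be needed to approach the accuracy reported in the abstract.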
