Instant interaction driven adaptive gaze control interface

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Gaze estimation has long been recognised as having potential as the basis for human-computer interaction (HCI) systems, but usability and robustness of performance remain challenging. This work focuses on systems in which there is a live video stream showing enough of the subject's face to track eye movements, and some means to infer gaze location from detected eye features. Currently, systems generally require some form of calibration or set-up procedure at the start of each user session. Here we explore some simple strategies for enabling gaze-based HCI to operate immediately and robustly without any explicit set-up tasks. We explore different choices of coordinate origin for combining extracted features from multiple subjects, and the replacement of subject-specific calibration by system initiation based on prior models. Results show that referencing all extracted features to local coordinate origins determined by subject start position enables robust immediate operation. Combining this approach with an adaptive gaze estimation model using an interactive user interface enables continuous operation with 75th percentile gaze errors of 0.7°, and maximum gaze errors of 1.7° during prospective testing. These constitute state-of-the-art results and have the potential to enable a new generation of reliable gaze-based HCI systems.
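The core idea of referencing extracted features to a local coordinate origin determined by subject start position can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the use of the mean of the first few frames as the "start position", and the frame count are all illustrative assumptions.

```python
import numpy as np

def to_local_origin(features, n_init=10):
    """Re-reference per-frame eye features to a local origin.

    Illustrative sketch: the origin is taken as the mean of the first
    `n_init` frames (a stand-in for the subject's start position), so
    features from different subjects share a common frame without any
    explicit per-session calibration.

    features: (N, D) array of extracted eye-feature coordinates.
    """
    features = np.asarray(features, dtype=float)
    origin = features[:n_init].mean(axis=0)  # subject start position
    return features - origin

# Two "subjects" performing the same eye movement, but with different
# absolute offsets (e.g. different head positions in the camera frame).
a = np.array([[10.0, 5.0], [10.2, 5.1], [10.1, 4.9], [11.0, 5.5]])
b = a + np.array([100.0, -40.0])

# After re-referencing, the constant offset cancels and the two
# feature trajectories coincide.
print(np.allclose(to_local_origin(a, n_init=3), to_local_origin(b, n_init=3)))
```

Because the offset between subjects is constant, subtracting each subject's own start-position origin makes the trajectories directly comparable, which is the property that lets features from multiple subjects be combined into one prior model.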

Original language: English
Article number: 11661
Journal: Scientific Reports
Volume: 14
Issue number: 1
Publication status: Published - 22 May 2024

Keywords

  • Humans
  • Fixation, Ocular/physiology
  • Eye Movements/physiology
  • User-Computer Interface
  • Male
  • Eye-Tracking Technology
  • Female
  • Adult
