Data driven group comparisons of eye fixations to dynamic stimuli

Tochukwu Onwuegbusi*, Frouke Hermens, Todd Hogue

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Recent advances in software and hardware have allowed eye tracking to move away from static images to more ecologically relevant video streams. The analysis of eye tracking data for such dynamic stimuli, however, is not without challenges. The frame-by-frame coding of regions of interest (ROIs) is labour-intensive and computer vision techniques to automatically code such ROIs are not yet mainstream, restricting the use of such stimuli. Combined with the more general problem of defining relevant ROIs for video frames, methods are needed that facilitate data analysis. Here, we present a first evaluation of an easy-to-implement data-driven method with the potential to address these issues. To test the new method, we examined the differences in eye movements of self-reported politically left- or right-wing leaning participants to video clips of left- and right-wing politicians. The results show that our method can accurately predict group membership on the basis of eye movement patterns, isolate video clips that best distinguish people on the political left–right spectrum, and reveal the section of each video clip with the largest group differences. Our methodology thereby aids the understanding of group differences in gaze behaviour, and the identification of critical stimuli for follow-up studies or for use in saccade diagnosis.
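The abstract does not spell out the classification procedure, but the core idea of predicting group membership from gaze patterns can be illustrated with a minimal sketch. The code below is purely hypothetical: it simulates frame-wise gaze coordinates for two groups and runs a leave-one-out nearest-centroid classification, which is one simple way such a data-driven group comparison could be implemented. The data, group separation, and classifier choice are all assumptions, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 20 participants x 100 video frames x (x, y) gaze coordinates.
# For illustration, group 0 tends to look left of centre and group 1 to the right.
n_per_group, n_frames = 10, 100
group0 = rng.normal(loc=[-50.0, 0.0], scale=30.0, size=(n_per_group, n_frames, 2))
group1 = rng.normal(loc=[50.0, 0.0], scale=30.0, size=(n_per_group, n_frames, 2))
gaze = np.concatenate([group0, group1])                  # shape (20, 100, 2)
labels = np.array([0] * n_per_group + [1] * n_per_group)

def loo_nearest_centroid(gaze, labels):
    """Leave-one-out classification: assign each held-out participant to the
    group whose mean frame-wise gaze trace is closest in Euclidean distance."""
    n = len(labels)
    correct = 0
    for i in range(n):
        keep = np.arange(n) != i
        dists = []
        for g in (0, 1):
            centroid = gaze[keep & (labels == g)].mean(axis=0)  # (frames, 2)
            dists.append(np.linalg.norm(gaze[i] - centroid))
        correct += int(np.argmin(dists) == labels[i])
    return correct / n

accuracy = loo_nearest_centroid(gaze, labels)
print(f"leave-one-out accuracy: {accuracy:.2f}")
```

Because the classifier compares whole frame-by-frame traces, the same scheme could be run on sliding windows of frames to locate the section of a clip where the groups differ most, in the spirit of the analysis the abstract describes.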

Original language: English
Article number: 17470218211048060
Pages (from-to): 989-1003
Number of pages: 15
Journal: Quarterly Journal of Experimental Psychology
Issue number: 6
Early online date: 29 Sept 2021
Publication status: Published - Jun 2022


Keywords

  • Eye movements
  • Dynamic stimuli
  • Eye tracking
  • Group comparisons
  • Saccade diagnosis


