Space-variant spatio-temporal filtering of video for gaze visualization and perceptual learning

Michael Dorr, Halszka Jarodzka, Erhardt Barth

    Research output: Chapter in Book/Report/Conference proceeding › Conference Abstract/Poster in proceeding › Academic › peer-review



    We introduce an algorithm for space-variant filtering of video based on a spatio-temporal Laplacian pyramid and use this algorithm to render videos in order to visualize prerecorded eye movements. Spatio-temporal contrast and colour saturation are reduced as a function of distance to the nearest gaze point of regard, i.e. non-fixated, distracting regions are filtered out, whereas fixated image regions remain unchanged. In an experiment, the eye movements of an expert watching instructional videos were visualized with this algorithm so that the gaze of novices was guided to relevant image locations. Results show that this visualization technique facilitates the novices' perceptual learning.
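    The core idea — attenuating detail as a function of distance to the gaze point while leaving the fixated region untouched — can be sketched in a few lines. The paper's actual method uses a spatio-temporal Laplacian pyramid and also reduces colour saturation; the sketch below is a deliberately simplified single-scale, spatial-only, grayscale illustration, and the function names (`gaussian_blur`, `gaze_contingent_filter`) and the Gaussian falloff `sigma_px` are assumptions for this example, not taken from the paper.

```python
import numpy as np

def gaussian_blur(img, sigma=1.0):
    """Separable Gaussian blur with edge padding (stand-in for one
    pyramid-level low-pass; a hypothetical helper, not the paper's code)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    # blur along rows, then along columns
    out = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, radius, mode='edge'), k, mode='valid'), 1, img)
    out = np.apply_along_axis(
        lambda c: np.convolve(np.pad(c, radius, mode='edge'), k, mode='valid'), 0, out)
    return out

def gaze_contingent_filter(frame, gaze, sigma_px=40.0):
    """Blend each pixel between the sharp frame and a blurred version
    as a function of distance to the gaze point: the fixated region
    stays unchanged, peripheral regions lose contrast and detail."""
    h, w = frame.shape
    blurred = gaussian_blur(frame, sigma=3.0)
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(ys - gaze[0], xs - gaze[1])
    weight = np.exp(-dist**2 / (2 * sigma_px**2))  # 1 at gaze, -> 0 in periphery
    return weight * frame + (1 - weight) * blurred
```

    A multi-scale version would blend across Laplacian-pyramid levels (and, per the paper, also in time and in colour saturation) rather than between just two images, but the space-variant weighting principle is the same.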
    Original language: English
    Title of host publication: ETRA '10: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
    Place of publication: New York
    Publisher: Association for Computing Machinery (ACM)
    Number of pages: 8
    ISBN (Print): 978-1-60558-994-7
    Publication status: Published - 2010
    Event: Eye Tracking Research & Applications Symposium 2010 - Austin, United States
    Duration: 22 Mar 2010 - 24 Mar 2010
    Conference number: 6th


    Symposium: Eye Tracking Research & Applications Symposium 2010
    Abbreviated title: ETRA 2010
    Country/Territory: United States


    • eye tracking
    • image processing


