Autonomous social gaze model for an interactive virtual character in real-life settings

Zerrin Yumak, Bram van den Brink, Arjan Egges

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

This paper presents a gaze behavior model for an interactive virtual character situated in the real world. We are interested in estimating which user has an intention to interact, in other words, which user is engaged with the virtual character. The model takes into account behavioral cues such as proximity, velocity, posture and sound, estimates an engagement score and drives the gaze behavior of the virtual character. Initially, we assign equal weights to these features. Using data collected in a real setting, we analyze which features have higher importance. We found that the model with weighted features correlates better with the ground-truth data.
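
The abstract describes the engagement estimate as a combination of behavioral cues (proximity, velocity, posture, sound), first with equal weights and later with weights whose importance is analyzed from data collected in a real setting. The Python sketch below only illustrates that idea: the cue names, the assumed [0, 1] normalization, the example values, and the functions engagement_score and gaze_target are illustrative assumptions, not the authors' implementation.

# Illustrative sketch of a weighted engagement score (not the paper's code).
# Assumes each behavioral cue has already been normalized to the range [0, 1].

CUES = ("proximity", "velocity", "posture", "sound")

def engagement_score(features, weights=None):
    """Combine behavioral cues into a single engagement score in [0, 1]."""
    if weights is None:
        # Equal weights, as in the initial version of the model described above.
        weights = {cue: 1.0 for cue in CUES}
    total = sum(weights[cue] for cue in CUES)
    return sum(weights[cue] * features.get(cue, 0.0) for cue in CUES) / total

def gaze_target(users, weights=None):
    """Return the id of the user with the highest engagement score, or None."""
    if not users:
        return None
    return max(users, key=lambda uid: engagement_score(users[uid], weights))

# Example: two tracked users with hypothetical, pre-normalized cue values.
users = {
    "user_a": {"proximity": 0.9, "velocity": 0.2, "posture": 0.8, "sound": 0.3},
    "user_b": {"proximity": 0.4, "velocity": 0.7, "posture": 0.3, "sound": 0.6},
}
print(gaze_target(users))  # -> "user_a" with equal weights

A weighted variant would simply pass data-derived feature importances in place of the equal weights.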
Original language: English
Article number: 1757
Number of pages: 9
Journal: Computer Animation and Virtual Worlds
Volume: 28
Issue number: 3-4
DOIs: 10.1002/cav.1757
Publication status: Published - 1 Apr 2017
Externally published: Yes

Keywords

  • engagement
  • gaze model
  • multiparty interactions
  • interactive virtual humans
  • ATTENTION

Cite this

Yumak, Zerrin ; van den Brink, Bram ; Egges, Arjan. / Autonomous social gaze model for an interactive virtual character in real-life settings. In: Computer Animation and Virtual Worlds. 2017 ; Vol. 28, No. 3-4.
@article{324cf4a01e0c447cacad9278ed797ed6,
  title = "Autonomous social gaze model for an interactive virtual character in real-life settings",
  abstract = "This paper presents a gaze behavior model for an interactive virtual character situated in the real world. We are interested in estimating which user has an intention to interact, in other words which user is engaged with the virtual character. The model takes into account behavioral cues such as proximity, velocity, posture and sound, estimates an engagement score and drives the gaze behavior of the virtual character. Initially, we assign equal weights to these features. Using data collected in a real setting, we analyze which features have higher importance. We found that the model with weighted features correlates better with the ground-truth data.",
  keywords = "engagement, gaze model, multiparty interactions, interactive virtual humans, ATTENTION",
  author = "Zerrin Yumak and {van den Brink}, Bram and Arjan Egges",
  year = "2017",
  month = "4",
  day = "1",
  doi = "10.1002/cav.1757",
  language = "English",
  volume = "28",
  journal = "Computer Animation and Virtual Worlds",
  issn = "1546-4261",
  publisher = "John Wiley and Sons Ltd",
  number = "3-4",
}

Autonomous social gaze model for an interactive virtual character in real-life settings. / Yumak, Zerrin; van den Brink, Bram; Egges, Arjan.

In: Computer Animation and Virtual Worlds, Vol. 28, No. 3-4, 1757, 01.04.2017.

Research output: Contribution to journal › Article › Academic › peer-review

TY - JOUR
T1 - Autonomous social gaze model for an interactive virtual character in real-life settings
AU - Yumak, Zerrin
AU - van den Brink, Bram
AU - Egges, Arjan
PY - 2017/4/1
Y1 - 2017/4/1
N2 - This paper presents a gaze behavior model for an interactive virtual character situated in the real world. We are interested in estimating which user has an intention to interact, in other words which user is engaged with the virtual character. The model takes into account behavioral cues such as proximity, velocity, posture and sound, estimates an engagement score and drives the gaze behavior of the virtual character. Initially, we assign equal weights to these features. Using data collected in a real setting, we analyze which features have higher importance. We found that the model with weighted features correlates better with the ground-truth data.
AB - This paper presents a gaze behavior model for an interactive virtual character situated in the real world. We are interested in estimating which user has an intention to interact, in other words which user is engaged with the virtual character. The model takes into account behavioral cues such as proximity, velocity, posture and sound, estimates an engagement score and drives the gaze behavior of the virtual character. Initially, we assign equal weights to these features. Using data collected in a real setting, we analyze which features have higher importance. We found that the model with weighted features correlates better with the ground-truth data.
KW - engagement
KW - gaze model
KW - multiparty interactions
KW - interactive virtual humans
KW - ATTENTION
U2 - 10.1002/cav.1757
DO - 10.1002/cav.1757
M3 - Article
VL - 28
JO - Computer Animation and Virtual Worlds
JF - Computer Animation and Virtual Worlds
SN - 1546-4261
IS - 3-4
M1 - 1757
ER -