This paper presents a gaze behavior model for an interactive virtual character situated in the real world. We are interested in estimating which user intends to interact, that is, which user is engaged with the virtual character. The model takes into account behavioral cues such as proximity, velocity, posture, and sound; estimates an engagement score; and drives the gaze behavior of the virtual character. Initially, we assign equal weights to these features. Using data collected in a real-world setting, we analyze which features have higher importance. We find that the model with weighted features correlates better with the ground-truth data.
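The engagement estimate described above can be sketched as a weighted combination of normalized behavioral cues. This is a minimal illustrative sketch, not the authors' implementation: the function name, the assumption that cues are normalized to [0, 1], and the example values are all hypothetical.

```python
def engagement_score(features, weights):
    """Weighted sum of behavioral cues, normalized by the total weight.

    features: dict mapping cue name -> value in [0, 1] (assumed normalized)
    weights:  dict mapping cue name -> non-negative importance weight
    """
    total = sum(weights.values())
    return sum(weights[cue] * features[cue] for cue in weights) / total

# Equal weights, as in the initial model; cue values are made up for illustration.
equal_weights = {"proximity": 1.0, "velocity": 1.0, "posture": 1.0, "sound": 1.0}
cues = {"proximity": 0.8, "velocity": 0.2, "posture": 0.6, "sound": 0.4}
print(engagement_score(cues, equal_weights))  # 0.5
```

Replacing `equal_weights` with weights fitted to observed data gives the weighted-feature variant that the paper reports as correlating better with ground truth.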
- gaze model
- multiparty interactions
- interactive virtual humans
Yumak, Z., van den Brink, B., & Egges, A. (2017). Autonomous social gaze model for an interactive virtual character in real-life settings. Computer Animation and Virtual Worlds, 28(3-4). https://doi.org/10.1002/cav.1757