Abstract
Collaboration is an important 21st-century skill. It can take place in an online (remote) setting or in a co-located (face-to-face) setting. With the large-scale adoption of sensors, studies on co-located collaboration (CC) have gained momentum. CC takes place in physical spaces where group members share each other's social and epistemic space. It involves subtle multimodal interactions, such as gaze, gestures, speech, and discourse, which are complex in nature. The aim of this PhD is to detect these interactions and then use the resulting insights to build an automated real-time feedback system that facilitates co-located collaboration.
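To make the intended pipeline concrete, the sketch below is a minimal, hypothetical illustration, not the system proposed in this PhD. It assumes a per-second voice-activity stream for each group member and flags conversational dominance, one simple indicator used in multimodal learning analytics, over a sliding window. All names, thresholds, and the simulated data are assumptions.

```python
import random
from collections import deque

# Hypothetical sketch of a real-time collaboration-feedback loop.
# Assumed inputs: one voice-activity sample per second per participant.

WINDOW_SECONDS = 30          # assumed sliding-window length
IMBALANCE_THRESHOLD = 0.6    # assumed talk-time share that counts as dominance

def speaking_shares(window):
    """Fraction of voiced samples per participant within the window."""
    totals = {}
    for sample in window:                     # sample: {participant_id: is_speaking}
        for pid, speaking in sample.items():
            totals[pid] = totals.get(pid, 0) + int(speaking)
    voiced = sum(totals.values()) or 1        # avoid division by zero in silence
    return {pid: count / voiced for pid, count in totals.items()}

def feedback(shares):
    """Return a feedback message if one member dominates the discussion."""
    for pid, share in shares.items():
        if share > IMBALANCE_THRESHOLD:
            return f"Participant {pid} holds {share:.0%} of talk time; invite others in."
    return None

if __name__ == "__main__":
    window = deque(maxlen=WINDOW_SECONDS)     # keeps only the most recent samples
    random.seed(0)
    for _ in range(60):
        # Simulated voice-activity detection for a three-person group;
        # participant "A" is biased to speak more often than "B" and "C".
        window.append({
            "A": random.random() < 0.7,
            "B": random.random() < 0.2,
            "C": random.random() < 0.2,
        })
        msg = feedback(speaking_shares(window))
        if msg:
            print(msg)
            break
```

A real system would replace the simulated samples with fused sensor streams (audio, gaze, gesture) and combine several such indicators before surfacing feedback to the group.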
Original language | English
---|---
Title of host publication | ICMI '19
Subtitle of host publication | 2019 International Conference on Multimodal Interaction
Editors | Wen Gao, Helen Mei Ling Meng, Matthew Turk, Susan R. Fussell, Björn Schuller, Yale Song, Kai Yu
Place of Publication | New York, NY, USA
Publisher | Association for Computing Machinery (ACM)
Pages | 473-476
Number of pages | 4
ISBN (Electronic) | 9781450368605
ISBN (Print) | 9781450368605
DOIs |
Publication status | Published - 14 Oct 2019
Keywords
- Co-located collaboration
- Multimodal interactions
- Multimodal learning analytics
- Realtime feedback