From signals to knowledge: A conceptual model for multimodal learning analytics

D. Di Mitri, J. Schneider, M.M. Specht, H.J. Drachsler

Research output: Contribution to journal › Article › Academic › peer-review


Multimodality in learning analytics and learning science is under the spotlight. The landscape of sensors and wearable trackers that can be used for learning support is evolving rapidly, as are data collection and analysis methods. Multimodal data can now be collected and processed in real time at an unprecedented scale. With sensors, it is possible to capture observable events of the learning process, such as the learner's behaviour and the learning context. The learning process, however, also consists of latent attributes, such as the learner's cognitions or emotions. These attributes are unobservable to sensors and need to be elicited through human-driven interpretation. We conducted a literature survey of experiments using multimodal data to frame the young research field of multimodal learning analytics. The survey explored the multimodal data used in related studies (the input space) and the learning theories selected (the hypothesis space). The survey led to the formulation of the Multimodal Learning Analytics Model, whose main objectives are (O1) to map the use of multimodal data to enhance feedback in a learning context; (O2) to show how to combine machine learning with multimodal data; and (O3) to align the terminology used in the fields of machine learning and learning science.
Original language: English
Pages (from-to): 338-349
Number of pages: 12
Journal: Journal of Computer Assisted Learning
Issue number: 4
Publication status: Published - Aug 2018


  • learning analytics
  • machine learning
  • multimodal data
  • multimodality
  • sensors
  • social signal processing
