On The Robustness of Stealth Assessment

Abstract

Stealth assessment is a methodology that uses machine learning to process data collected unobtrusively from serious games and to produce inferences about learners' mastery levels. Although stealth assessment can produce valid and reliable assessments, its robustness across a wide range of conditions has not yet been examined. The main reason is that its practical application is complex, laborious, and time-consuming. As a result, its exposure to different conditions has been limited, as has its wider uptake by the serious games community. Nevertheless, a framework for developing a generic tool has been proposed to lower these barriers. In this study, a generic SA software tool was developed based on this framework to examine the robustness of the stealth assessment methodology under various conditions. Specifically, the conditions relate to (a) processing datasets of different distribution types and sizes (960 datasets containing a total of 72,336,000 data points are used for this purpose), (b) utilizing two different machine learning algorithms (Gaussian Naive Bayes Network and C4.5), and (c) using statistical models relating to two different competency constructs. Results show that stealth assessment is a robust methodology, whilst the generic SA tool is a highly accurate tool capable of efficiently handling a wide range of conditions.
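To illustrate the kind of inference the abstract describes, the sketch below implements a minimal Gaussian Naive Bayes classifier that maps unobtrusively collected in-game evidence to a mastery level. All feature names, data values, and labels are invented for demonstration; this is not the study's actual model or data.

```python
import math
from collections import defaultdict

def fit_gnb(X, y):
    """Estimate class priors and per-class feature means/variances."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    n = len(X)
    stats = {}
    for c, rows in by_class.items():
        cols = list(zip(*rows))
        means = [sum(col) / len(col) for col in cols]
        variances = [
            sum((v - m) ** 2 for v in col) / len(col) + 1e-9  # variance smoothing
            for col, m in zip(cols, means)
        ]
        stats[c] = (len(rows) / n, means, variances)
    return stats

def log_gauss(x, mean, var):
    """Log density of a univariate Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def predict(stats, x):
    """Return the class with the highest posterior log-probability."""
    best, best_lp = None, float("-inf")
    for c, (prior, means, variances) in stats.items():
        lp = math.log(prior) + sum(
            log_gauss(xi, m, v) for xi, m, v in zip(x, means, variances)
        )
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Toy evidence per observation: (task completion time, error count),
# labelled with a hypothetical mastery level.
X = [(40.0, 6.0), (38.0, 5.0), (45.0, 7.0), (12.0, 1.0), (15.0, 0.0), (10.0, 2.0)]
y = ["low", "low", "low", "high", "high", "high"]

model = fit_gnb(X, y)
print(predict(model, (13.0, 1.0)))  # a fast, low-error learner
```

The same fitted model can score any new evidence tuple, which is what allows an assessment engine to update mastery estimates continuously as gameplay data arrives.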
Original language: English
Pages (from-to): 180-192
Number of pages: 13
Journal: IEEE Transactions on Games
Volume: 13
Issue number: 2
DOIs
Publication status: Published - Jun 2021

Keywords

  • generic tool
  • machine learning
  • robustness
  • serious games
  • simulation
  • stealth assessment
  • Tools
  • Task analysis
  • Games
  • Software tools
