A Systematic Literature Review of Automated Feedback Generation for Programming Exercises

Hieke Keuning, J.T. Jeuring, B.J. Heeren

Research output: Contribution to journal › Review article › peer-review

24 Citations (Web of Science)
1074 Downloads (Pure)

Abstract

Formative feedback, aimed at helping students to improve their work, is an important factor in learning. Many tools that offer programming exercises provide automated feedback on student solutions. We have performed a systematic literature review to find out what kind of feedback is provided, which techniques are used to generate the feedback, how adaptable the feedback is, and how these tools are evaluated. We have designed a labelling scheme to classify the tools, and use Narciss' feedback content categories to classify feedback messages. We report on the results of coding a total of 101 tools. We have found that feedback mostly focuses on identifying mistakes and less on fixing problems and taking a next step. Furthermore, teachers cannot easily adapt tools to their own needs. However, the diversity of feedback types has increased over the past decades, and new techniques are being applied to generate feedback that is increasingly helpful for students.

Original language: English
Article number: 3
Number of pages: 43
Journal: ACM Transactions on Computing Education
Volume: 19
Issue number: 1
DOIs
Publication status: Published - Jan 2019

Keywords

  • ASSIGNMENTS
  • COMPLETION
  • COMPUTER
  • DIAGNOSIS
  • ENVIRONMENT
  • INTELLIGENT TUTORING SYSTEMS
  • KNOWLEDGE
  • SOFTWARE
  • STUDENT PROGRAMS
  • Systematic literature review
  • automated feedback
  • learning programming
  • programming tools
