Towards a systematic review of automated feedback generation for programming exercises

Hieke Keuning, J.T. Jeuring, B.J. Heeren

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › Peer-reviewed

Abstract

Formative feedback, aimed at helping students to improve their work, is an important factor in learning. Many tools that offer programming exercises provide automated feedback on student solutions. We are performing a systematic literature review to find out what kind of feedback is provided, which techniques are used to generate the feedback, how adaptable the feedback is, and how these tools are evaluated. We have designed a labelling to classify the tools, and use Narciss' feedback content categories to classify feedback messages. We report on the results of the first iteration of our search in which we coded 69 tools. We have found that tools do not often give feedback on fixing problems and taking a next step, and that teachers cannot easily adapt tools to their own needs.
Original language: English
Title of host publication: ITiCSE '16
Subtitle of host publication: Proceedings of the 2016 ACM Conference on Innovation and Technology in Computer Science Education
Place of publication: New York
Publisher: ACM Digital Library
Pages: 41-46
Number of pages: 6
ISBN (Print): 978-1-4503-4231-5
Publication status: Published - Jul 2016
Event: 2016 ACM Conference on Innovation and Technology in Computer Science Education - Arequipa, Peru
Duration: 11 Jul 2016 - 13 Jul 2016

Conference

Conference: 2016 ACM Conference on Innovation and Technology in Computer Science Education
Abbreviated title: ITiCSE '16
Country/Territory: Peru
City: Arequipa
Period: 11/07/16 - 13/07/16
