Abstract
Formative feedback, aimed at helping students to improve their work, is an important factor in learning. Many tools that offer programming exercises provide automated feedback on student solutions. We are performing a systematic literature review to find out what kind of feedback is provided, which techniques are used to generate the feedback, how adaptable the feedback is, and how these tools are evaluated. We have designed a labelling scheme to classify the tools, and use Narciss' feedback content categories to classify feedback messages. We report on the results of the first iteration of our search, in which we coded 69 tools. We have found that tools rarely give feedback on fixing problems and taking the next step, and that teachers cannot easily adapt tools to their own needs.
| Original language | English |
| --- | --- |
| Title of host publication | ITiCSE '16 |
| Subtitle of host publication | Proceedings of the 2016 ACM Conference on Innovation and Technology in Computer Science Education |
| Place of publication | New York |
| Publisher | ACM Digital Library |
| Pages | 41-46 |
| Number of pages | 6 |
| ISBN (Print) | 978-1-4503-4231-5 |
| DOIs | |
| Publication status | Published - Jul 2016 |
| Event | 2016 ACM Conference on Innovation and Technology in Computer Science Education, Arequipa, Peru. Duration: 11 Jul 2016 → 13 Jul 2016 |
Conference
| Conference | 2016 ACM Conference on Innovation and Technology in Computer Science Education |
| --- | --- |
| Abbreviated title | ITiCSE '16 |
| Country/Territory | Peru |
| City | Arequipa |
| Period | 11/07/16 → 13/07/16 |