The Impact of Adding Textual Explanations to Next-step Hints in a Novice Programming Environment

Abstract

Automated hints, a powerful feature of many programming environments, have been shown to improve students' performance and learning. New methods for generating these hints use historical data, allowing them to scale easily to new classrooms and contexts. These scalable methods often generate next-step code hints that suggest a single edit for the student to make to their code. However, while these code hints tell the student what to do, they do not explain why, which can make them hard to interpret and decrease students' trust in their helpfulness. In this work, we augmented code hints by adding adaptive textual explanations in a block-based, novice programming environment. We evaluated their impact in two controlled studies with novice learners to investigate how our results generalize to different populations. We measured the impact of textual explanations on novices' programming performance. We also used quantitative analysis of log data, self-explanation prompts, and frequent feedback surveys to evaluate novices' understanding and perception of the hints throughout the learning process. Our results showed that novices perceived hints with explanations as significantly more relevant and interpretable than those without explanations, and were also better able to connect these hints to their code and the assignment. However, we found little difference in novices' performance. These results suggest that explanations have the potential to make code hints more useful, but it is unclear whether this translates into better overall performance and learning.

Publication
Proceedings of the Annual Conference on Innovation and Technology in Computer Science Education (28% acceptance rate; 67/243 full papers)