Welcome to the iSnap demo! iSnap is a data-driven, intelligent tutoring system for block-based programming, offering support such as hints, feedback and curated examples. iSnap extends Snap, an online, block-based programming environment designed to make programming more powerful and accessible to novices.
The iSnap project has also produced various papers and public datasets, which can be found below.
iSnap combines a number of systems and support features, which you can read about below:
Try the SourceCheck Hints demo
Using data collected from real students working on programming assignments, we are able to generate on-demand, next-step hints for students who get stuck on these assignments. The SourceCheck algorithm matches students' code to previously observed code from students who successfully completed the assignment and recommends an edit based on how those students progressed.
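The matching idea can be sketched roughly as follows. This is a hypothetical, heavily simplified illustration, not SourceCheck's actual implementation: it treats programs as trivial label trees and uses a toy distance, where the real algorithm performs a much richer structural alignment. All names here (`tree_distance`, `suggest_hint`, the block labels) are made up for illustration.

```python
# Illustrative sketch of the SourceCheck idea: match the student's code
# (as a tree) to the closest prior correct solution, then point out one
# edit that moves the student toward it. Entirely hypothetical code.

def flatten(node):
    """Yield every block label in a (label, [children]) tree."""
    label, children = node
    yield label
    for child in children:
        yield from flatten(child)

def tree_distance(a, b):
    """Toy distance: count labels present in one tree but not the other."""
    return len(set(flatten(a)) ^ set(flatten(b)))

def suggest_hint(student, solutions):
    """Pick the closest prior solution and name one missing block."""
    best = min(solutions, key=lambda s: tree_distance(student, s))
    missing = set(flatten(best)) - set(flatten(student))
    return sorted(missing)[0] if missing else None

# Trees are (label, [children]) pairs.
student = ("script", [("pick_random", []), ("say", [])])
solutions = [("script", [("pick_random", []),
                         ("repeat_until", [("ask", [])])])]
print(suggest_hint(student, solutions))  # suggests adding "ask"
```

The real system additionally localizes the hint to a position in the student's code; this sketch only names a missing block.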
See an explanation of iSnap's help features below, or try them out yourself with this demo. Select any assignment and test out the hints.
When a student needs help, they can ask iSnap to check their work. To start off, it shows two colors:
If a student requests a next-step hint, iSnap also adds:
The above demo shows off iSnap's newest hint interface, but much of the earlier research with iSnap used a simpler hint interface, based on the Contextual Tree Decomposition (CTD) algorithm. See a demo of these hints in action by completing the Guessing Game Part 1. The assignment asks you to create a guessing game, in which the computer stores a random number and then repeatedly asks the player to guess it, telling them if they are too high, too low or correct.
For more information on the data-driven algorithm that powers iSnap, see:
For more information on iSnap and its initial pilot evaluation, see:
Price, T. W., Y. Dong and D. Lipovac. "iSnap: Towards Intelligent Tutoring in Novice Programming Environments." ACM Special Interest Group on Computer Science Education (SIGCSE). 2017. [Paper | Slides]
Price, T. W., Z. Liu, V. Cateté and T. Barnes. "Factors Influencing Students’ Help-Seeking Behavior while Programming with Human and Computer Tutors." International Computing Education Research (ICER) Conference. 2017. [Paper | Slides]
Price, T. W., R. Zhi and T. Barnes. "Hint Generation Under Uncertainty: The Effect of Hint Quality on Help-Seeking Behavior." International Conference on Artificial Intelligence in Education. 2017. [Paper | Slides]
Price, T. W., R. Zhi, Y. Dong, N. Lytle and T. Barnes. "The Impact of Data Quantity and Source on the Quality of Data-driven Hints for Programming." International Conference on Artificial Intelligence in Education. 2018, forthcoming.
Try the AIF demo
Programming can be quite challenging, and for novices it can be filled with negative self-assessments (e.g. "I'm not cut out for this."). What students often fail to realize is that they are making progress; they just can't see it yet.
The AIF system addresses this in part by breaking the problem down into subgoals that are more manageable than an entire problem.
AIF also provides students with real-time feedback on their progress on programming assignments, using a hybrid data-driven progress assessment algorithm. As students work, AIF monitors their progress, displaying it as a progress bar. It can even detect progress made out of order, e.g. code that is not yet in the correct procedure.
When students make a mistake, they can see that immediately, as the progress bar decreases.
AIF also pops up encouraging messages, both when students progress, and also when they fail to make progress for a while.
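The behavior described above can be sketched with a toy subgoal checker. This is an illustrative assumption about how such feedback could work, not AIF's actual algorithm (which is data-driven): here each subgoal is a simple predicate over the student's code, so the progress bar rises as subgoals are met and falls when a change breaks one.

```python
# Hypothetical sketch of subgoal-based progress feedback in the spirit
# of AIF. Subgoal names and checks are invented for illustration.

def progress(code_blocks, subgoals):
    """Return the fraction of subgoals currently satisfied, and which."""
    met = [name for name, check in subgoals if check(code_blocks)]
    return len(met) / len(subgoals), met

subgoals = [
    ("store a random number", lambda code: "pick_random" in code),
    ("ask for a guess",       lambda code: "ask" in code),
    ("repeat until correct",  lambda code: "repeat_until" in code),
]

pct, met = progress(["pick_random", "ask"], subgoals)
print(f"{pct:.0%} complete: {met}")

# Deleting the 'ask' block makes the bar visibly decrease:
pct, _ = progress(["pick_random"], subgoals)
print(f"{pct:.0%} complete")
```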
Try the Example Helper demo
Example Helper is a code example gallery to support open-ended programming, where students design and implement projects connected to their own interests. Because students design these programs and specify their own goals, it is difficult to support them with hints and feedback. Example Helper addresses this by offering a curated gallery of code examples, derived from prior students' projects.
Using Example Helper, a student can find an example by browsing through the gallery, by clicking on a tag to filter the examples, or by querying in a search box.
After finding a needed example, the student can preview it to run, edit and tinker with the code. They are encouraged to read the code, relate it to its output, and write a self-explanation.
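The tag-and-search filtering described above can be sketched as below. The example data and field names are made up for illustration; this is not Example Helper's actual data model.

```python
# Minimal sketch of gallery filtering: examples carry tags, and a
# student narrows the gallery by a selected tag and/or a text query.
# Titles, tags, and fields are hypothetical.

examples = [
    {"title": "Bouncing ball", "tags": {"motion", "loops"}},
    {"title": "Score counter", "tags": {"variables"}},
    {"title": "Maze game",     "tags": {"motion", "events"}},
]

def filter_gallery(examples, tag=None, query=""):
    """Keep examples matching both the selected tag and the search text."""
    return [e for e in examples
            if (tag is None or tag in e["tags"])
            and query.lower() in e["title"].lower()]

print([e["title"] for e in filter_gallery(examples, tag="motion")])
# ['Bouncing ball', 'Maze game']
print([e["title"] for e in filter_gallery(examples, query="maze")])
# ['Maze game']
```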
View a demo of the debugging visualization
One of the most challenging tasks for novices is understanding why their code isn't working the way it's expected. iSnap offers a debugging visualization (currently in development) that helps students understand the execution of visual programs.
After running a script, students can hover over any block in their code to see what it did during execution.
When the student hovers over the outer loop in this code, they see all the movement the sprite made as a part of each iteration of the loop.
Hovering over the inner loop shows the same movement, but broken down into the smaller steps of the inner loop.
Hovering over a variable-related block shows how the variable's value changed as the script executed and the sprite moved.
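A toy sketch of the idea behind this visualization: while a script runs, each block records what it did (here, per loop iteration), so the steps for any block can be looked up afterwards when the student hovers over it. This is purely illustrative; it is not how iSnap instruments Snap's interpreter.

```python
# Hypothetical execution-trace recorder: every block appends its steps
# to a shared trace, keyed by a block id, as the script runs.

from collections import defaultdict

trace = defaultdict(list)  # block id -> list of recorded steps

def run_loop(block_id, times, body):
    for i in range(times):
        trace[block_id].append(f"iteration {i + 1}")
        body(i)

def move(block_id, steps):
    trace[block_id].append(f"moved {steps} steps")

# Outer loop of 2, inner loop of 3, one move block inside.
run_loop("outer", 2,
         lambda i: run_loop("inner", 3,
                            lambda j: move("move", 10)))

# Hovering over a block would show its recorded steps:
print(len(trace["outer"]), len(trace["inner"]), len(trace["move"]))
# prints: 2 6 6
```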
View a demo of the logging output
iSnap logs all actions taken by students in the environment, as well as snapshots of students' code as they work. Logs can be saved to a database for future analysis, review or grading. To get a quick overview of the programs that have been created on this demo site, check out the viewer page.
Note: this is a demo site and does not include actual student data.
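A simplified sketch of what such logging looks like: timestamped action events interleaved with periodic code snapshots. iSnap's real logs go to a database, and the event and field names below are invented for illustration.

```python
# Hypothetical action log with code snapshots, stored in memory here
# rather than a database. All event and field names are made up.

import json
import time

log = []

def record(event, **data):
    """Append one timestamped log entry, shaped like a database row."""
    log.append({"time": time.time(), "event": event, **data})

record("IDE.opened", assignment="guess1")
record("Block.grabbed", block="pick_random")
record("Block.snapped", block="pick_random", parent="script")
record("Snapshot", code=json.dumps({"script": ["pick_random"]}))

print(len(log), log[-1]["event"])  # prints: 4 Snapshot
```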
The iSnap project has also produced public datasets, consisting of log data from students working in an introductory computing class, which can be found on the PSLC Datashop.