The problem: The stats lab I had inherited relied heavily on a copy/paste method of teaching. Students A) didn’t like that very much and B) weren’t learning to troubleshoot mistakes in the code. When they ran the code and it didn’t work, instead of trying to use the error messages provided by the statistics program, they would automatically raise their hand and wait for me to come rescue them.
The innovation: I created a series of exercises that contained intentionally broken code. Students received these at the beginning of class, covering the material from the previous session. Their task was to fix what was broken, get the code to run, and then report the results appropriately. In Fall 2014, students received these exercises for half the topics in the course; in Fall 2015, they received one at the start of every class.
The assessment: During both the first and last class of the semester, students rated their confidence in conducting, troubleshooting, reporting, and creating a variety of statistical analyses covered in the course on a scale from 1 (no confidence or experience) to 5 (full confidence/mastery). During the last class they also ranked seven instructional methods, including the Make it Work! exercises, from most to least helpful, and were given a chance to provide open-ended feedback.
The results: In Fall 2014, students gained more confidence when they had a Make it Work! exercise for a topic than when they didn’t. They also ranked the exercises among the top three most useful aspects of the course and provided positive feedback. Full research results can be seen here.
“They were helpful; a few more would have been great.”
(This project was completed as part of the requirements for the CIRTL Scholar certificate. More details on CIRTL@UAB can be found here.)