Several theories, including Experiential Learning Theory, describe the importance of reflection for learning, and a host of articles have called for additional research on reflection in engineering education. Ambrose has called for engineering curricula with “opportunities for reflection to connect thinking and doing,” since students learn by doing, but only when they also reflect on the doing. Regular reflection plays a critical role in the construction of metacognitive knowledge and self-regulatory skills, which include monitoring and evaluating one’s own learning, knowledge, and skills. Unfortunately, the development of metacognitive skills is often not formally included within curricula. However, simple in-class active learning exercises, such as think-pair-share or minute papers, as well as post-exam analysis by students, can promote reflection and metacognition. In a microelectronics course, we recently incorporated post-exam reflective exercises that use SPICE simulation tools to guide students’ reflections on the errors they made and on strategies to improve future performance. The instructor was inspired to adopt this approach after learning of its use in an introductory circuits course, where the instructors had applied a reflective approach known as Exam Analysis and Reflection (EAR), originally developed for mechanical engineering courses.
In the microelectronics course, we incorporated preliminary reflective exercises after two exams and applied the EAR approach with the second. After the first exam, students used the simulator to correct any errors, which introduced them to using simulation for reflection. The second assessment, a short quiz, followed a similar procedure: students used the simulator to reconstruct the amplifier circuit from the quiz. In effect, students “re-did” the quiz in the simulator to obtain the simulated values, with the goal of having them recognize and question any differences, which could stem from calculation errors, inherent differences between simulation and hand analysis, or other causes. Students were then asked to reflect on the following questions from the EAR approach: “How is my exam result different from the simulated result?”, “What went wrong with my solution?”, and “How can I use this to improve my performance in the future?”
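To make the comparison step concrete, the following is a minimal sketch, not taken from the course materials, of the kind of hand-analysis-versus-simulation check a student might perform; the common-source amplifier stage, component values, and simulated gain are all hypothetical.

```python
# Hypothetical illustration (not the actual course exercise): comparing a
# hand-calculated small-signal gain with a value read from a SPICE simulation,
# assuming a square-law MOSFET model for a common-source amplifier stage.

def hand_gain(k_n, v_ov, r_d, r_o=None):
    """Small-signal gain A_v = -g_m * (R_D || r_o) for a common-source stage.

    g_m = k_n * V_OV under the square-law model; pass r_o=None to neglect
    channel-length modulation. All parameter values below are invented.
    """
    g_m = k_n * v_ov                                           # transconductance (A/V)
    r_out = r_d if r_o is None else (r_d * r_o) / (r_d + r_o)  # R_D || r_o
    return -g_m * r_out

a_hand = hand_gain(k_n=2e-3, v_ov=0.5, r_d=10e3)  # student's hand result
a_sim = -9.3                                      # gain read from the SPICE output

# The discrepancy drives the EAR prompts: "How is my exam result different
# from the simulated result?" and "What went wrong with my solution?"
pct_diff = abs(a_sim - a_hand) / abs(a_sim) * 100
print(f"Hand: {a_hand:.2f}  Simulated: {a_sim:.2f}  Difference: {pct_diff:.1f}%")
```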
To assess the impact of using simulation for exam reflection, we interviewed students and directly assessed their performance. Students were given a final exam problem very similar to the quiz problem to which they had applied the EAR approach. We compared results on this problem with those from the prior year’s offering, which used the same final exam but included no reflective exercises. We also determined the correlation between the quality of students’ reflections and their performance on this final exam problem, assessing the quality and depth of the reflections with a four-category rubric from the published literature. The preliminary results have been promising: the interviews show evidence that students appreciated the reflective approach, and their EAR responses show depth. The interview data also highlighted lessons for improving our initial implementation of simulation for this type of reflection and comparison.
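As an illustration of the correlation step, a minimal sketch follows. The section above does not specify which statistic was used, so a rank correlation (appropriate for ordinal rubric categories) is assumed here, and all data are invented.

```python
# Hypothetical sketch (not the authors' analysis): correlating rubric-scored
# reflection depth with scores on the matched final exam problem. A Spearman
# rank correlation is assumed because a four-category rubric yields ordinal
# data; all numbers below are invented for illustration.
from scipy.stats import spearmanr

reflection_depth = [0, 1, 1, 2, 2, 3, 3, 3]      # rubric category (0-3) per student
exam_scores = [55, 60, 72, 70, 85, 80, 90, 95]   # final exam problem score (%)

rho, p_value = spearmanr(reflection_depth, exam_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```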