One of the major challenges in running a successful flipped classroom is ensuring that all students complete the pre-class preparation. In most cases, this preparation consists of a few short videos, textbook readings, and a quiz given online or in class. However, this one-size-fits-all approach does not account for students' differing needs and may not be sufficiently motivating for all of them.
To improve the flipped classroom in a Numerical Methods course, and with funding from an exploratory NSF IUSE development grant, we prepared 17 adaptive lessons covering half of the topics in the course. The lessons were developed on Smart Sparrow, a popular adaptive learning platform (ALP), which allowed us to combine videos, textbook content, simulations, and quizzes. The quizzes, consisting of multiple-choice, fill-in-the-blank, and algorithmic questions, gave students immediate feedback on how they were doing and directed them along personalized paths based on their responses.
In this paper, we discuss how the adaptive lessons were developed for the course, the metrics collected through the ALP, and the usefulness and interpretation of those metrics. The student-level ALP metrics included the number of attempts to complete a lesson, the raw score (based on all attempts made), the lesson score (based on the maximum score across attempts), the time spent on a lesson, and the number of hours before the deadline that a lesson was completed. Lesson-level metrics included the percentage of students who completed the lesson and the percentage of adaptive feedback in use. The latter was computed as the fraction of custom states (states with adaptive feedback) that were triggered and seen by at least one student.
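To make the distinction between the two score metrics concrete, the following is a minimal sketch of how they could be derived from per-attempt data. The record layout and field names are hypothetical; the paper does not describe Smart Sparrow's actual export format.

```python
# Hypothetical sketch: deriving the raw score and lesson score metrics
# from per-attempt records. Field names and layout are illustrative only.
from collections import defaultdict

# (student_id, lesson_id, attempt_number, score_on_that_attempt)
attempts = [
    ("s01", "lesson01", 1, 60.0),
    ("s01", "lesson01", 2, 85.0),
    ("s01", "lesson01", 3, 100.0),
    ("s02", "lesson01", 1, 95.0),
]

scores = defaultdict(list)
for student, lesson, _, score in attempts:
    scores[(student, lesson)].append(score)

# Raw score averages over every attempt, so early wrong answers pull it
# down; lesson score keeps only the best attempt, so a student who
# eventually answers correctly still earns near-full credit.
raw_score = {k: sum(v) / len(v) for k, v in scores.items()}
lesson_score = {k: max(v) for k, v in scores.items()}

print(raw_score[("s01", "lesson01")])     # ~81.7: all attempts count
print(lesson_score[("s01", "lesson01")])  # 100.0: best attempt counts
```

This difference explains why lesson scores clustered near the maximum while raw scores retained the variation needed to distinguish students.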
In assessing the relationship between exam performance and ALP use, the correlations between the final examination results and most of the lesson metrics were neither sizable nor statistically significant. For example, there was almost no relationship between the final examination score and the total hours spent on the lessons (r = -0.003). Most students also received high lesson scores, since they could make multiple attempts. We therefore sought to identify what differentiates the lower-performing from the higher-performing students. This differentiation was evident in the relationship between the final examination score and the raw score, where we measured a correlation of r = 0.35 with p < 0.0005. This indicates that students who answered the questions correctly on their initial attempts performed better on the final examination, suggesting that stronger preparation or due diligence leads to better exam performance. Based on these results, we believe the study should be extended to gain additional insights by developing adaptive lessons for the whole course and implementing and assessing them at multiple universities.
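The analysis above is a standard Pearson correlation between paired per-student values. The sketch below shows how such a statistic could be computed; the arrays are placeholder numbers, not the study's data, and the reported r = 0.35 and p < 0.0005 come from the study itself.

```python
# Minimal sketch of a Pearson correlation between final-exam scores and
# raw lesson scores, assuming one paired value per student. The data
# below are illustrative placeholders, not values from the study.
import numpy as np
from scipy import stats

final_exam = np.array([62.0, 78.5, 91.0, 55.0, 84.0])
raw_scores = np.array([55.0, 70.0, 90.0, 40.0, 80.0])

r, p = stats.pearsonr(final_exam, raw_scores)
print(f"Pearson r = {r:.2f}, p = {p:.4g}")
```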