Many engineering programs have begun adopting interactive online homework systems, often as a way to stretch the precious resources of faculty time and energy. While an increasing number of online homework offerings are available from textbook publishers, many have proven less than ideal. Issues the authors have experienced with these systems include errors in the embedded solutions, inflexibility to correct or expand exercise problems, and sparse or incomplete coverage of the material in the accompanying texts, not to mention the additional cost students incur for access.
As an alternative, the authors have developed a robust set of integrated Statics and Mechanics of Materials exercise problems for use within a free, open-source, online homework delivery tool called WeBWorK. This tool has seen wide adoption in mathematics courses worldwide (over 1,000 institutions), and the authors’ institution has considerable experience using it in that context. The problem set studied in this paper was developed as part of an NSF-funded project to expand the use of WeBWorK into three sophomore-level engineering courses.
The effect of online homework on student learning was studied by comparing the performance of cohorts of students using the online homework system with cohorts completing the same problems without access to the online system. For each homework set studied, “performance” was assessed using a quiz covering the same material as the set. Each quiz was given shortly after submission of the corresponding homework set, and each student’s quiz score accounted for a small percentage (~1%) of the overall course grade. The cohorts using the online system were required to turn in their homework solutions on paper and submit their answers in the online system, while the cohorts without access to the system were only required to turn in their paper solutions.
The study was conducted over two terms, fall 2014 (F14) and winter 2015 (W15). In F14, four sections of the course were offered: two sections from one instructor (I1) and one each from two other instructors (I2 and I3). In W15, three sections of the course were offered: two sections from I1 and one section from I2. To maintain consistency of experience within each section, cohorts were defined by course section for each homework set studied. To minimize variability due to potential instructor differences, cohorts were selected such that, for each homework set studied, a cohort of each type (i.e., WeBWorK vs. paper-only) was in a section taught by I1. To minimize variability due to the overall aptitude of one section versus another, the cohorts alternated between having access to WeBWorK and completing paper-only homework across the homework sets studied. Four homework sets were studied in each term.
This paper presents the results of this study, including analyses to determine whether any statistically significant differences in performance exist between the cohorts.
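The abstract does not specify which statistical test the analyses use; as a minimal illustrative sketch only, a comparison of quiz scores between a WeBWorK cohort and a paper-only cohort could be made with Welch's two-sample t-test, as in the Python snippet below. The score lists, variable names, and significance level are placeholders assumed for illustration, not data or methods taken from the study.

# Illustrative sketch only: compares quiz scores for a hypothetical WeBWorK
# cohort against a hypothetical paper-only cohort using Welch's t-test.
# The score lists below are placeholders, not data from this study.
from scipy import stats

webwork_scores = [8.5, 9.0, 7.5, 8.0, 9.5, 6.5, 8.0]      # placeholder quiz scores
paper_only_scores = [7.0, 8.5, 7.5, 6.0, 8.0, 7.5, 6.5]   # placeholder quiz scores

# Welch's variant does not assume equal variances or equal cohort sizes.
t_stat, p_value = stats.ttest_ind(webwork_scores, paper_only_scores, equal_var=False)

alpha = 0.05  # conventional significance level; an assumption, not from the paper
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value < alpha:
    print("Difference in mean quiz scores is statistically significant.")
else:
    print("No statistically significant difference detected.")

Welch's test is shown here only because it tolerates unequal variances and section sizes between cohorts; the paper's actual analysis may differ.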