Marino Nader1, and Ronald F. DeMara2
1Department of Mechanical and Aerospace Engineering, and
2Department of Electrical and Computer Engineering
University of Central Florida, Orlando, FL 32816-2362
Abstract
Instructor-level assessment methodologies specific to engineering core curricula are synergized with institutional-level testing infrastructures to improve outcomes spanning academic integrity, grade accuracy, and student success. The combination of multiple-attempt testing within a properly proctored testing environment is explored herein. Namely, we compare students’ success rates in two different engineering core courses: Dynamics (a required course for engineering majors) and Thermodynamics (required by several engineering degree programs at the institution). During Summer 2022 at a large state university, Dynamics enrolled 155 students and Thermodynamics enrolled 282; each course was analyzed via control and intervention cohorts.
Both classes were delivered as hybrid online/live courses, and the tests were administered in an Evaluation Proficiency Center (EPC) where students were permitted three attempts per test. Students were also afforded the opportunity to convene with Graduate Teaching Assistants (GTAs) after machine scoring of each attempt, engaging metacognition and learning from their mistakes. The tests draw on a foundation of large question pools, so later attempts rarely repeat identical problems and values. Nevertheless, the possibility that one or two questions from a previous attempt may be re-asked gives students an added incentive to seek clarification in case of repetition.
The EPC provides a uniform testing environment with 140 seats, personal materials stored in lockers, reduced distractions, and ceiling-mounted cameras as a deterrent to integrity violations. CANVAS was the Learning Management System (LMS) used for these courses, providing the Computer-Based Assessment (CBA) capability that facilitated the three-attempt testing. In this testing environment, the smallest class-average improvement from the first to the third (last) attempt in Thermodynamics was 16%, observed on the third test. A similar comparison for Dynamics showed a 41% improvement on the first assessment in the course. A student survey conducted for each course confirmed that a very large percentage of students agreed this formative assessment method is effective and worthwhile in motivating their learning.