This paper presents the results of surveys of students, educators, and advisors who used a custom online queuing system in diverse educational settings. Prior work identified that technology such as a mobile-friendly, web-based queue offers benefits for scaling student/educator interactions. The current study was developed to collect student, instructor, and advisor feedback to understand best practices, challenges, and perceptions from using the online queuing system for office hours, active learning, and advising.
There is an increasing need to facilitate quality instruction in large enrollment courses. Toward addressing this need, we previously described the development and early use of an online queue system for education (BLINDED). The Queue is an open-source application that allows students to add their name and a question or topic to an online queue that is monitored by course staff or advisors. Students can access the Queue web page with a cell phone, tablet, laptop, or any other computing device. Both students and course staff can view which students are in the queue and what questions they have. While the Queue software was originally developed for use in office hours of large enrollment courses, it has since been adopted for other educational purposes, including drop-in advising, peer learning, and active learning (BLINDED). Since its implementation in Fall 2017, the Queue has been adopted by 20 courses and 3 advising offices and has facilitated over 50,000 questions from over 6,000 distinct students.
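As a rough sketch, the first-come interaction described above (students join with a name and question; everyone can view the waiting list; staff help the next student in line) can be modeled as follows. All names and fields here are illustrative assumptions, not the actual BLINDED codebase:

```python
from collections import deque
from dataclasses import dataclass, field

# Hypothetical model of the Queue's core behavior: students append a
# name and question; staff remove entries in first-come order.

@dataclass
class Entry:
    student: str
    question: str

@dataclass
class OfficeHoursQueue:
    entries: deque = field(default_factory=deque)

    def join(self, student: str, question: str) -> int:
        """Add a student to the end of the line; return their 1-based position."""
        self.entries.append(Entry(student, question))
        return len(self.entries)

    def view(self):
        """Both students and staff can see who is waiting and why."""
        return [(e.student, e.question) for e in self.entries]

    def help_next(self):
        """Staff take the next student in first-come order (None if empty)."""
        return self.entries.popleft() if self.entries else None
```

For example, two students joining and one being helped leaves a single entry visible to everyone still waiting.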
In the early use cases of the Queue, we identified several benefits for students and instructors, including but not limited to saved time, improved accessibility, and improved use of space, since office hours are no longer tied to a fixed location that may or may not accommodate demand. Student surveys validate those benefits and add new personal insights into how the Queue enhances students' interactions and success in courses. The surveys collect data on student preferences when using the Queue to inform feature development (e.g., "I would prefer to be anonymous on the Queue") as well as assess students' perceptions about learning material (e.g., "The Queue helped me toward mastering material in the course"). Further, student surveys assess whether the Queue facilitates student-instructor interactions (e.g., "I am more likely to approach course or office staff using a digital queue"). Student feedback on additional software features is also solicited. Queue adopter surveys (administered to faculty, advisors, and staff who use the system) assess ease of implementation (e.g., "The Queue was easy to implement in my course/office") and solicit general feedback on features and data collection.