After-school maker programs provide opportunities to introduce youth to engineering concepts and skills before college and engage them in hands-on projects that require creative problem solving, teamwork, and persistence. An important part of evaluating the quality of these programs is conducting assessments that capture their impact on the skills and attitudes of youth participants. As interest in after-school programs grows and programs expand to include hundreds of youth participants, it is important to identify qualitative and quantitative assessment tools that can be deployed at scale and provide educators with useful insights into programs’ impact. In this context, a key factor in the successful deployment of assessment tools is youth’s attitudes towards them.
In this study, we investigated youth’s attitudes towards both quantitative assessment tools, including the Short Grit Scale (Grit-S) and the Alternative Uses Test (AUT), and qualitative assessment tools, including open portfolios and showcase presentations, and how these attitudes affected the tools’ deployment. We analyzed survey data from 159 youth who participated in learning programs offered at a local after-school learning center over three years. We also used participant observations and a focus group with eight youth in a professional training program offered at the center. Additionally, we conducted interviews with three adult program staff who administered the different assessments and collected their observations and reflections about youth’s attitudes towards the assessments.
We found that the youth in the learning programs exhibited negative attitudes towards the quantitative survey tools, which resulted in low completion rates and resistance to participating in assessments. This result was more pronounced for the Grit-S than for the AUT. The youth exhibited more positive attitudes towards the open portfolio and showcase assessments, although the performative aspects of these activities failed to engage all of them. The youth in the professional training program preferred having multiple assessment tools and described how each tool could provide different insights. The staff identified a need to better match the content and format of assessments with the maker culture as practiced in the center. Based on these results, we present a set of recommendations for developing and deploying relevant and engaging future assessments in these contexts.