2019 ASEE Annual Conference & Exposition

Human vs. Automated Coding Style Grading in Computing Education

Presented at Technical Session 11: Topics related to Computer Science

Computer programming courses often evaluate student coding style manually. Static analysis tools provide an opportunity to automate this process. In this paper, we explore the tradeoffs between human style graders and general-purpose static analysis tools for evaluating student code. We investigate the following research questions:
- Are human coding style evaluation scores consistent with the results of static analysis tools?
- Which style grading criteria are best evaluated with existing static analysis tools and which are more effectively evaluated by human graders?

We analyze data from a second-semester programming course at a large research institution with 943 students enrolled. Hired student graders evaluated student code against rubric criteria such as “Lines are not too long” or “Code is not too deeply nested.” We also ran several static analysis tools on the same student code to evaluate the same criteria. We then analyzed the correlation between the number of static analysis warnings and the human style grading score for each criterion.
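
As an illustration of this kind of correlation analysis (a minimal sketch, not the authors' actual pipeline; the data values and variable names are invented), a Python script might pair each student's warning count with their rubric score for one criterion and compute the Pearson correlation:

```python
# Hypothetical sketch of the correlation analysis described above.
# The data values are invented; real inputs would come from tool
# output and grading records for a single rubric criterion.
from scipy.stats import pearsonr

# One (warning_count, human_score) pair per student submission.
submissions = [
    (0, 2.0), (3, 1.5), (7, 0.5), (1, 2.0),
    (5, 1.0), (0, 2.0), (9, 0.0), (2, 1.5),
]

warnings = [w for w, _ in submissions]
scores = [s for _, s in submissions]

# A negative r means more warnings tend to accompany lower scores.
r, p_value = pearsonr(warnings, scores)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```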

In our preliminary results, we see that static analysis tools tend to be more effective at evaluating objective code style criteria. We found a weak negative or no correlation between the human style grading score and the number of static analysis warnings, whereas we would expect student code with more static analysis warnings to receive fewer human style grading points. When comparing the “Lines are not too long” human style grading criterion to a related line-length static analysis inspection, we see a Pearson correlation coefficient of r = -0.21. We also see trends in the distributions of human style grading scores that suggest human graders perform inconsistently. For example, 50% of students who received full human style grading points for the line-length criterion had 3 or more static analysis warnings from a related line-length inspection. Additionally, 23% of students who received no points on the same criterion had no static analysis warnings for the line-length inspection.
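
For context, a line-length inspection of the kind referenced here is simple to state. A hypothetical version might look like the following (the paper evaluates existing general-purpose tools, not this code, and the 79-character limit is an assumed threshold):

```python
# Hypothetical line-length inspection, similar in spirit to the
# "Lines are not too long" checks in general-purpose linters.
# The 79-character limit is an assumed threshold, not the paper's.
MAX_LINE_LENGTH = 79

def count_long_lines(source: str, limit: int = MAX_LINE_LENGTH) -> int:
    """Return the number of lines in `source` exceeding `limit` characters."""
    return sum(1 for line in source.splitlines() if len(line) > limit)

sample = "x = 1\n" + "y = " + "1 + " * 30 + "1\n"
print(count_long_lines(sample))  # -> 1
```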

We also found that some code style criteria are not well suited to the general-purpose static analysis tools we investigated. For example, none of the static analysis tools we investigated provide a robust way of evaluating the quality of variable and function names in a program. Some tools provide an inspection for detecting variable names that are shorter than a user-specified length threshold; however, this inspection fails to identify low-quality variable names that happen to be longer than the minimum allowed length. Furthermore, there are some common scenarios where a short variable name is acceptable by convention.
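
To make this limitation concrete, here is a hypothetical minimum-length name inspection of the kind described (the threshold and example names are invented, not drawn from the tools the paper studied). It flags a conventional loop index while letting an uninformative long name pass:

```python
# Hypothetical minimum-length name inspection, illustrating the
# limitation described above; the threshold and examples are invented.
import ast

MIN_NAME_LENGTH = 3  # assumed threshold

def short_names(source: str) -> set[str]:
    """Flag variable names shorter than MIN_NAME_LENGTH characters."""
    tree = ast.parse(source)
    return {
        node.id
        for node in ast.walk(tree)
        if isinstance(node, ast.Name) and len(node.id) < MIN_NAME_LENGTH
    }

# The conventional loop index 'i' is flagged, while the uninformative
# but sufficiently long name 'data_thing_2' passes unnoticed.
print(short_names("for i in range(10):\n    data_thing_2 = i"))  # -> {'i'}
```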

Static analysis tools have the benefit of integration with an automated grading system, facilitating faster and more frequent feedback than human grading. The literature suggests that frequent feedback encourages students to actively improve their work (Spacco et al. 2006). There is also evidence that increased engagement is most beneficial to students with less experience (Carini et al. 2006). Our results suggest that automated code quality evaluation could be one tool that benefits student learning in introductory CS courses, especially for students with the least access to CS training before college.

References
- Carini, R. M., Kuh, G. D., & Klein, S. P. Student engagement and student learning: Testing the linkages. Research in Higher Education, 47(1), 2006.
- Spacco, J., & Pugh, W. Helping students appreciate test-driven development (TDD). Proceedings of OOPSLA, pages 907–913, 2006.

Authors
  1. James Perretta, University of Michigan

    James Perretta is currently pursuing a master's degree in Computer Science at the University of Michigan, where he also develops automated grading systems. His research interests and prior work focus on using automated grading systems and feedback policies to enhance student learning.

  2. Dr. Westley Weimer, University of Michigan
  3. Dr. Andrew DeOrio, University of Michigan (ORCID: http://orcid.org/0000-0001-5653-5109)

    Andrew DeOrio is a teaching faculty member at the University of Michigan and a consultant for web and machine learning projects. His research interests are in ensuring the correctness of computer systems, including medical and IoT devices and digital hardware, as well as engineering education. In addition to teaching software and hardware courses, he teaches Creative Process and works with students on technology-driven creative projects. His teaching has been recognized with the Provost's Teaching Innovation Prize, and he has twice been named Professor of the Year by the students in his department.

Download paper (242 KB)

Are you a researcher? Would you like to cite this paper? Visit the ASEE document repository at peer.asee.org for more tools and easy citations.
