As many colleges and universities continue to expand and refine global learning opportunities for engineering students, they face a complex array of challenges. Among these, administrators and other stakeholders increasingly expect that costly investments in international programming will generate substantial benefits and impacts, including but not limited to growing the number of students served. Further, amid rising accountability pressures across the higher education landscape, many programs are being asked to produce hard evidence of how research, work, study, and service abroad enhance student learning and growth.
Fortunately, dozens of assessment instruments are available to measure many different facets of inter/cross-cultural competence, global competence, and related constructs. Some tools, such as the Intercultural Development Inventory (IDI), also pair with training programs designed to promote student learning and development. However, debates persist about which tools are most valid and useful for assessing different kinds of interventions. Further, there remains a shortage of high-quality, validated assessment tools specifically focused on global engineering practice. As a consequence, administrators and researchers alike often find it difficult to collect robust, convincing evidence of how programs support student growth and learning. There also remain unanswered questions about which program formats and training interventions have the largest measurable impacts on participants.
This paper responds to these challenges and pressures by introducing two tools that can be used for assessment and instructional purposes. We particularly emphasize the Global Engineering Competency – Situational Judgment Test (GEC-SJT) as a behavioral measure of competency. Specifically, we present and discuss one sample assessment question drawn from a larger collection of scenarios focused on engineering in the Chinese national/cultural context. First, we review relevant literature and background information about our larger research project. Second, we explain how the sample scenario was initially developed, as well as how we created a scoring key and collected validity evidence for the scenario through multiple rounds of data collection with subject matter experts (SMEs, i.e., practicing engineers). Third, we discuss how this scenario can be used for assessment purposes in the context of global engineering programs. Fourth, we present an instructional guide for those who might wish to use this type of scenario for training. Fifth and finally, we provide information about a complementary assessment tool we developed, the self-report Global Engineering Competency Scale (GECS).
This paper is expected to be of interest to faculty, staff, and administrators who want to assess global engineering programs, as well as researchers who wish to measure global learning among engineering students, practicing professionals, and other populations. We also expect that instructors will benefit from this paper's discussion of scenario-based instruction as an accessible and impactful way to promote global competency and other professional learning outcomes among students in engineering and other professional fields. This work may especially resonate with those who are eager to help current and future engineers appreciate – and more effectively navigate – the kinds of cross-cultural dynamics often faced in global technical work.