Customized Versus Standardized Exams For Learning Outcomes Assessment In An Undergraduate Business Program

Amy L. Phelps
William E. Spangler

Keywords

Program-Level Outcomes Assessment, Standardized Exam, Major Field Test

Abstract

An exam used for program-level assessment can take the form of 1) a customized exam developed in-house by faculty and linked explicitly to program-level learning goals, or 2) a standardized exam developed externally by assessment experts and linked to a set of broader, more generalizable learning goals. This article discusses the design, development and implementation of a customized exam within an undergraduate business program, and the subsequent transition to a commercial exam (i.e., the ETS Major Field Test). We discuss the lessons learned from our experience with the customized exam, our analysis of both the assessment process and the results gathered (primarily curriculum-related), and the rationale underlying the eventual migration to the commercial exam. Of particular emphasis are the situation-dependent and potentially complementary roles of the customized and commercial exams. In this regard we provide a comparison of the two approaches through a framework based on a set of administrative and assessment considerations, including: relevance to learning goals, the exam design and development process, delivery of the exam, impact on learning, impact on courses and curriculum, and impact on student monitoring and management. We note that although the customized exam no longer exists as a standalone assessment instrument, it continues to play a role in assessment by complementing other methods. This outcome, as well as the process leading to it, is potentially applicable to other institutions pursuing an evolutionary approach to learning outcomes assessment.
