This paper uses binary logistic regression
to show how examination policies affect students' learning outcomes. The examinations
employed by instructors are divided broadly into three types: traditional, nontraditional,
and project. Using data from an undergraduate business program, the study develops
a binary logistic regression model predicting the effects of the three types of
examinations on students' learning outcomes. The results showed that the traditional
(in-class) examination had the greatest predictive power for students' learning outcomes.
Nontraditional examinations and projects had significantly less predictive power
than traditional examinations, with projects having the least. The findings
suggest, first, that some of instructors' examination policies may be less
impactful than others, or may even affect learning outcomes negatively; second, that a
particular combination of traditional, nontraditional, and project examinations
may most effectively boost students' learning outcomes; third, that students who
participate in an academic program with higher correctly classified estimates would
be expected to achieve higher learning outcomes than students who participate in
an academic program with significantly lower correctly classified estimates; fourth,
that examination policies can be deployed as a critical tool for improving students'
learning outcomes; and, fifth, that a periodic evaluation of the examination policies
in an academic program may be useful.
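As a concrete illustration of the modeling approach, the following sketch fits a binary logistic regression by Newton-Raphson on simulated data. Everything here is hypothetical: the predictors stand in for the weight an instructor gives each examination type, the "true" coefficients are chosen only to mirror the ordering reported above (traditional > nontraditional > project), and the correctly classified rate corresponds to the classification accuracy that a binary logistic model reports.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical predictors: the emphasis an instructor places on each
# examination type (traditional, nontraditional, project), in [0, 1].
X = rng.uniform(0.0, 1.0, size=(n, 3))

# Hypothetical "true" effects mirroring the paper's reported ordering:
# traditional > nontraditional > project.
beta_true = np.array([2.0, 0.8, -0.5])
p_true = 1.0 / (1.0 + np.exp(-(X @ beta_true - 0.5)))
y = rng.binomial(1, p_true)  # 1 = positive learning outcome (simulated)

def fit_logistic(X, y, iters=25):
    """Fit a binary logistic regression (with intercept) by Newton-Raphson."""
    Xd = np.column_stack([np.ones(len(X)), X])  # add intercept column
    b = np.zeros(Xd.shape[1])
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-Xd @ b))      # fitted probabilities
        w = mu * (1.0 - mu)                      # IRLS weights
        grad = Xd.T @ (y - mu)                   # score vector
        hess = Xd.T @ (Xd * w[:, None])          # observed information
        b += np.linalg.solve(hess, grad)         # Newton step
    return b

b = fit_logistic(X, y)

# "Correctly classified" rate: share of observations whose predicted
# class (at the 0.5 cutoff) matches the observed outcome.
mu_hat = 1.0 / (1.0 + np.exp(-(np.column_stack([np.ones(n), X]) @ b)))
accuracy = ((mu_hat >= 0.5) == y).mean()

print("coefficients (intercept, trad, nontrad, project):", np.round(b, 2))
print("correctly classified:", round(accuracy, 3))
```

With this setup, the estimated coefficients recover the simulated ordering, and the correctly classified rate plays the role of the program-level estimates discussed in the third finding.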