For many education providers, student
engagement can be a major issue. Given the positive correlation between engagement
and good performance, providers are continually looking for ways to engage
students in the learning process. The growth of student digital literacy, the
wide proliferation of online tools and the understanding of why online gaming
can be addictive have combined to create a set of tools that providers can
leverage to enhance engagement. One such tool is Peerwise, https://peerwise.cs.auckland.ac.nz/,
an online, multiple choice question (MCQ) and answer tool in which students
create questions that are answered by other students. Why use MCQs? MCQs test
knowledge, provide reassurance of learning, identify gaps and make this data
available to both student and provider. Students use this information to
focus their time on areas requiring additional work, benefiting from the early
feedback provided. Formative assessments using MCQs are beneficial in preparing
students for summative testing and are appreciated by students. Providers can
use this information to determine how the material
is being received and react accordingly. Students use Peerwise to create MCQs
that are answered, rated and commented on by their peers. Students’ engagement
in Peerwise earns trophies for contributing, for regular use and for providing
feedback, all of which stimulate further engagement, using the
principles of gamification. Bournemouth University, a public university in the
UK with over 18,000 students, has been embedding Peerwise in undergraduate and
postgraduate units since 2014. The results experienced by Bournemouth
University have been beneficial and correlate with other studies of using
Peerwise. A statistically significant improvement was seen in one cohort of
students compared to the previous year, in which Peerwise was not used. However,
no correlation was found between Peerwise participation and a student’s unit
mark. The processes followed by Bournemouth University and the advantages and
disadvantages, backed by qualitative and quantitative data, will be presented
so that other institutions can gain an informed view of the merits of Peerwise
for their own teaching and learning environments.
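The correlation analysis mentioned above can be illustrated with a short sketch. This is not the authors' code, and the participation counts and unit marks below are hypothetical; it simply shows how a Pearson correlation between Peerwise activity and unit marks might be computed.

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical cohort: Peerwise activity counts and final unit marks (%)
participation = [12, 45, 3, 30, 22, 8, 50, 17]
unit_marks = [58, 72, 55, 60, 68, 62, 65, 59]

r = pearson_r(participation, unit_marks)
print(f"Pearson r = {r:.2f}")
```

A value of r near zero, as the study reports for participation versus unit mark, indicates no linear relationship; in practice one would also compute a p-value (e.g. with scipy.stats.pearsonr) before drawing conclusions.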
Cite this paper
Biggins, D., Crowley, E., Bolat, E., Dupac, M. and Dogan, H. (2015) Enhancing University Student Engagement Using Online Multiple Choice Questions and Answers. Open Journal of Social Sciences, 71-76. doi: 10.4236/jss.2015.39011