Received 13 May 2016; accepted 19 August 2016; published 22 August 2016
1.1. Polling Students: A Form of Interactivity
Large classes are a feature of modern higher education institutions. Most university programs involve large groups of students, especially at the undergraduate level, and maintaining interactivity is a continuous challenge. Interactivity with students is vital in order to build academic potential and support knowledge transfer. Student involvement, engagement and participation in higher education have been a key area of focus for educators, because learning occurs when students are actively engaged in class. One strategy for active learning is the use of polls at various junctures during class.
With the increased accessibility and availability of technology in society, technology has become a vital component of polling students in education. The modern classroom utilizes polling tools to enhance learning. These innovative tools are easy to use, lead to better interactivity in the classroom, both among students and with lecturers, and can be used at any class level (Bojinova & Oigara, 2013).
1.2. Evolution of Polling
Early methods of polling included raising hands. Another popular approach, especially for multiple-choice questions, was using colored cards: students raised cards of different colors to indicate their choice among the projected answer options, and the teacher could see the responses at a glance. However, this method was limited by the inability of teachers to easily collate and display responses, as well as the inconvenience of providing differently colored cards to each student in a large class.
Manual polling evolved over time with the introduction of technology-linked polling devices known as Student Response Systems (SRS), also known as Classroom Response Systems. These are tools with buttons closely mirroring regular phone keypads, with numbers and letters that students can push to enter their answers to a question. These tools are often called clickers (Cora & Liyan, 2013).
SRS devices that allow for responses to questions have been in use since the 1960s and were initially used to record audience responses to television programs and movies (Mahon, n.d.). Early SRS technology was homemade; it was soon followed by commercial versions that were hard-wired into classrooms in the 1960s and 70s (Ward, Reeves, & Heath, 2003). The technology evolved from wired hardware to portable, wireless devices that work together with software, making integration of the process easier. This gave rise to the first generation of wireless versions, which utilized infrared technology. Second-generation versions were radio-frequency based, with the ability to support 1000 - 2000 transmitters; the keypads (clickers) can be used 150 - 300 feet from the receiver (Carnaghan et al., 2011).
Today, years after the first low-cost radio-frequency student response system was introduced for classrooms, a variety of user-friendly, multi-featured polling systems are on the market, driven by competition and improving technology (Barber & Njus, 2007). The newer devices are typically handheld and compact, with a telephone-like keypad and an LCD display screen that can be used for more complex responses.
The newest forms of SRS use internet technology, allowing students to use regular smartphones to transmit responses to questions. Examples such as Poll Everywhere© and Top Hat© typically use a webpage. As such, these systems can be used anywhere and are not limited by the distance between the receiver and the transmitter (Carnaghan et al., 2011). However, they require an internet connection and therefore cannot be used in isolation from the internet. SRS were originally designed for traditional classroom courses, but they are now adaptable to online management systems such as Blackboard (Ward, Reeves, & Heath, 2003).
With the advent of easier, portable systems, the use of clickers has spread to various areas and disciplines in education (Mahon, n.d.). The technology has rapidly advanced, with cell phones and tablets not only able to transmit a clicker response but also to download and save the results of each interactive session.
1.3. How Clickers Work
A clicker is an electronic polling system that mainly supports multiple-choice questions. Students are presented with questions in the form of a quiz, to which they respond with a small handheld device, usually after engaging in peer discussions (Nielsen, Hansen, & Stav, 2013). Many instructors now use clickers for student interaction during lectures, to poll understanding of a concept or topic area (Caldwell, 2007).
A typical application of clickers is during a lecture, where the instructor synchronizes all clickers to a particular radio frequency and may register each clicker to a student so as to generate a unique, identifiable signal. The instructor either projects or poses a question, and each student with a clicker selects the preferred answer option. Clickers use radio-frequency technology to transmit student responses, which are received by the polling software on the instructor's computer via a small, portable receiving station in the classroom (Stav et al., 2010) (Figure 1).
The software application installed on the instructor's computer facilitates the collection, processing, display and storage of response data. The instructor uses the software to operate the clicker technology and create interactive presentations. The software displays the result of the poll as a histogram, pie chart, or other appropriate format, utilizing state-of-the-art SRS decision-process solutions. Using a projector connected to the computer, the instructor displays this information to the students (Gok, 2011). The system also allows the instructor to monitor student responses in real time and provide immediate feedback to students about any confusion or misunderstanding (Simelane & Dimpe, 2011). The software also records the data so that results can be analyzed later.
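The collect-tally-display cycle described above can be illustrated with a minimal sketch. The function names and the text-based histogram rendering are our own illustrative choices, not part of any vendor's SRS software:

```python
from collections import Counter

def tally_responses(responses):
    """Aggregate raw clicker responses (clicker_id -> chosen option) into counts per option."""
    return Counter(responses.values())

def histogram(counts, options="ABCDE", width=30):
    """Render a simple text histogram, analogous to the chart an instructor projects."""
    total = sum(counts.values()) or 1   # avoid division by zero on an empty poll
    lines = []
    for opt in options:
        n = counts.get(opt, 0)
        bar = "#" * round(width * n / total)
        lines.append(f"{opt}: {bar} {n} ({100 * n / total:.0f}%)")
    return "\n".join(lines)

# Example: five clickers respond; each clicker ID maps to its latest answer,
# so a student who changes their answer simply overwrites the earlier one.
responses = {"c01": "A", "c02": "B", "c03": "B", "c04": "B", "c05": "D"}
print(histogram(tally_responses(responses)))
```

Keying the dictionary by clicker ID is what lets the instructor both anonymize the projected chart and still attribute responses for later analysis.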
1.4. Use of Clickers in Higher Education
Clickers represent some of the novel educational technologies available today. They are tools that can be used in various ways and for different purposes, and their use is becoming more prevalent in higher education. As a result of their portability and ease of use, instructors are increasingly using clickers as a teaching tool. The use of clickers is limited only by the creativity and imagination of the instructor, the questioning format, and the way questions are presented to students (Morales, 2011).
1.4.1. Active Learning Strategies
Many have focused on using clickers in education as a means of promoting involvement, based on the underlying belief that if individuals become and remain involved in their education, they thrive better in their studies. As such, many educators use clickers to promote active learning and to counter poor concentration and interaction (Morales, 2011). Instructors use clickers as a means of increasing student participation. Though little is known about how and why students become more participatory when using clickers, students appear motivated and more engaged in class, as well as more attentive and involved in class discussions.
Of paramount importance also is the ability of lecturers to have more interactive discussions with students in a less threatening manner (Heaslip, Donovan, & Cullen, 2014). Because instructors find it difficult to effectively manage interactions in large classrooms, and some students tend to shy away from speaking up publicly, instructors use clickers as questioning aids in class to enable active, anonymous interaction. Clickers positively activate the learning experience and provide a more relaxed atmosphere that prompts interactions among students and with instructors (Morales, 2011). In addition, since students use their clickers to submit answers, the system preserves student confidentiality (Stav et al., 2010).
Interactive engagement utilizing clickers is particularly promising (Draper & Brown, 2004). Hoekstra & Mollborn (2012) have shown that clickers support good teaching and critical thinking across different disciplines in higher education, and enhance diversity in discussion. Similarly, clickers can be a tool for engaging students at remote sites, as seen in Premkumar et al. (2011), where volunteer students and staff from the College of Medicine used clickers to participate in remote and face-to-face seminars at the same time. In Singapore, Nanyang Technological University (NTU) implemented an initiative called "Learning that Clicks" based on the interactive technology of clickers. A survey administered to undergraduate students in varying disciplines on learning in clicker-supported instructional environments showed that clickers are effective instructional devices (Laxman, 2011). Thus clickers have increasingly become a required item for students, along with textbooks and calculators.
1.4.2. Review of Lecture and Questions
Instructors often use clickers to gather information about students' level of knowledge in the courses taught, share knowledge and experiences, poll opinions, and review questions. The ability to tabulate and display collective poll data to the entire class is an important feature that instructors use to promptly and easily assess students' understanding of material covered in class, and it gives the instructor a way to pace the lecture based on student responses (Cummings & Hsu, 2007). It increases students' awareness of their understanding of subjects: what is important to learn, what to add to what is already known, and what to focus on for further learning (Egelandsdal & Krumsvik, 2015). Clickers are generally used at transitions during a lecture or presentation to generate discussion, poll student opinion, or review specific information. Systems such as Poll Everywhere offer the additional assets of instant responses and easy export of data for analysis, and also help students acknowledge their feelings and attitudes related to the learning process, which is essential to their growth as lifelong learners (Rimland, 2013).
1.4.3. Formative Assessment
Early use of SRS focused on the conditioning effect of providing immediate feedback to students, with remedial instruction to correct misunderstandings where needed. Instructors provide instantaneous feedback via clickers during teaching to promote interactive engagement in class. Clickers are thus well suited to measuring affective learning, since they address factors such as anonymous participation and frequent measurement that are relevant to assessing affective constructs. Mayer et al. (2009) indicate that students who use clickers are more cognitively engaged during learning. Their study showed that students who used clickers scored significantly higher on course exams than students who answered in-class questions without clickers and those with no in-class questions. DeBourgh (2008) described the use of clicker technology in a baccalaureate nursing program for formative assessment to promote acquisition and application of advanced reasoning skills. Instructors used clickers to provide feedback to students on analysis and decision-making in case studies, and 72% of students found this feedback useful.
Clickers are also used as formative assessment tools via quizzes at the beginning of each class, and some studies associate their use with improved student performance. Instructors quiz students on assigned readings, test recall of a lecture point or concept, gauge opinions, and have students apply concepts (Groen, 2010). Feedback and assessment are vital in the learning of oral presentation skills (Grez, 2010), and using clickers is an effective way to produce feedback for presenters, assessors and educators.
Hence, clickers are growing in popularity among faculty in colleges and universities because they provide a method to insert assessment into the learning process rather than depending on the traditional form of summative assessment for assigning grades (DeBourgh, 2008).
1.4.4. Clicker Use in Examinations
SRS can be used for both formative and summative assessment. In formative feedback, the instructor provides feedback to students on their comprehension of the material. The instructor can also obtain information on teaching effectiveness and modify techniques accordingly. As an aid to summative assessment, SRS allow for easy administration of questions, instant grading and provision of immediate feedback (Premkumar & Coupal, 2008).
However, clickers have not been extensively used in high-stakes examinations. This may be due to limitations in previous SRS units, wherein questions were projected in class and students had to wait for the last person to answer before proceeding to the next question. This can disadvantage students who are able to answer questions more quickly, because they feel restricted by the pace of the test. Similarly, slower students may feel deprived of the ability to ponder questions as long as they want. Also, the way instructors used clickers for summative tests prevented students from answering questions in any order or from reviewing and correcting previously selected answers, as a traditional test allows (Hancock, 2010).
With the progress in clicker technology, these limitations have been addressed and rectified.
Cubric & Jefferies (2015) examined students' perceptions of a large-scale deployment of SRS across different subject areas and different levels of summative and formative use. 590 students across eleven academic schools in the UK participated in the online survey, and the data showed significant differences in learning benefits and challenges across subject groups, along with more divided views on SRS use and students' experience with it. Content analysis showed that summative use and staff competencies were key issues related to SRS use by students. Irrespective of perceptions of ease of use, it was found that ease of use could be a challenge for students when SRS was used for summative assessment. The study observed that despite increased investment in staff training and classroom equipment, students continued to report that SRS did not work, and disliked its use for summative assessment.
The clinical skills laboratories of the nursing program at the University of Limerick piloted clickers to gauge students' thoughts on their appropriateness and to determine whether clickers aided nursing students in studying for in-class assessments. The researchers believed that clickers would enhance and actively engage student understanding of the information through immediate feedback from the lab facilitator. The study aimed to evaluate whether actively engaging the students in the laboratory session via clickers enhanced retention of knowledge. An online quiz, which counted as part of the module's assessment, was developed by the lab facilitator and answered via clickers (Johnson & Lillis, 2010).
In addition, Kulesza, Clawson, & Ridgway (2014) explored the usefulness of implementing clickers as a tool for in-class quizzes and its impact on exam performance. The study was carried out at Ohio State University; because the university supports Turning Technologies clickers, students in the course used RF Response Card clickers. In 2011 the instructor implemented clicker quizzes as a method of formative feedback. At the start of each lecture, students completed multiple-choice clicker quizzes based on their readings and previous lecture material, which contributed 8% of their final grades. During normal lecture time, students completed two midterms and a non-comprehensive final exam in the form of objective MCQ and short-answer pen-and-paper assessments. When students used clickers in midterm exams, it produced significantly higher results, indicating better performance from students with prior practice using clicker questions. Kulesza, Clawson, & Ridgway (2014), however, focused their study on using clickers for formative rather than summative assessment.
Since instructors can use clickers as tools for various educational activities, it will be interesting to study the utilization of clickers in high-stakes summative examinations and their impact from the perspective of students and lecturers.
2. Purpose of Study
This study examined the feasibility of using clickers in high-stakes, paper-based examinations and sought to identify the benefits, limitations and barriers to their use.
The research was carried out with 26 students taking a computer science course (CMPT 352) between January and April 2013 at the University of Saskatchewan. The students were undergraduates registered in the three- or four-year bachelor's degree program. The clickers used were Response Cards from Turning Technologies© with an LCD screen (Figure 2).
3.2. Implementation Process of Clicker Usage in the Classroom
The students were oriented to clickers in the first week of class by the instructor, followed by practice quizzes to ensure the students understood the functionality of the device. Subsequently, students used the clickers to answer the midterm and final examinations.
The students' perceptions of the use of clickers in high-stakes examinations were obtained using an online survey and interviews of volunteer students and the instructor after completion of the course. This study was approved by the University Ethics Committee.
Prior to the administration of every examination, the instructor prepared exam questions in various versions in order to reduce cheating, made paper copies, and uploaded the answers to the application (Turning Technologies). The instructor also checked all clicker batteries, cleared prior exam responses from the clickers, and provided the clickers to students prior to the examination.

Figure 2. Clickers used by students.
On the day of the examination, the instructor set up a computer with a receiver in the exam hall and opened the poll when students were ready to commence. The instructor monitored the progress of every student as they answered the questions.
To commence the exam, students entered their ID and the name of the exam. The question number appeared and students entered their answers accordingly. The examination could be taken at the student's own pace, and students had the option to review and change answers on the clicker. On completion, students submitted their answers, which were collated on the instructor's computer. Following submission of the exam answers, students were given the option, via their clicker, to see their examination results.
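The exam flow just described — ID entry, self-paced answering with the option to revise, then a single final submission — can be sketched as a simple session model. The class and method names are illustrative only, not the Turning Technologies API:

```python
class ExamSession:
    """Illustrative model of the per-student clicker exam flow:
    ID entry checked against a roster, self-paced answering with revision,
    and one final submission after which answers are locked."""

    def __init__(self, roster, num_questions):
        self.roster = set(roster)          # valid student IDs loaded by the instructor
        self.num_questions = num_questions
        self.answers = {}                  # question number -> latest chosen option
        self.student_id = None
        self.submitted = False

    def login(self, student_id):
        # An ID not found in the student list is flagged immediately.
        if student_id not in self.roster:
            raise ValueError(f"Unknown student ID: {student_id}")
        self.student_id = student_id

    def answer(self, question, choice):
        if self.submitted:
            raise RuntimeError("Exam already submitted")
        # Students may answer in any order and overwrite earlier choices.
        self.answers[question] = choice

    def submit(self):
        self.submitted = True
        return {"id": self.student_id, "answers": dict(self.answers)}

# Example session: answer question 1, change it on review, then submit.
session = ExamSession(roster={"s100", "s101"}, num_questions=3)
session.login("s100")
session.answer(1, "A")
session.answer(1, "C")   # review and change an earlier answer
session.answer(2, "B")
result = session.submit()
```

Storing only the latest choice per question is what allows review-and-change before submission, while the `submitted` flag models the point after which the responses are locked and collated.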
Following the examination, the instructor collected the clickers as the students left the examination hall.
The instructor was interviewed at the end of the course. Transcriptions of the interview were provided to the instructor by e-mail as a member check for validity and confirmation of the transcript.
3.3. Methods of Data Analysis
The students' responses to the survey questions were imported into statistical software (SPSS 11.5, SPSS Science, Chicago) for analysis. The interview transcriptions were reviewed by one of the researchers (KP) for common themes.
4. Results and Discussion
The survey results show that 70% of students had not previously used clickers in exams that counted. 66% of students were not worried that the clickers would transmit a wrong response or miss a question. 50% of students preferred clickers compared to OpScan (16%) or computer-based online testing (16%). Clickers removed the need to mark answers on a bubble sheet and to write names on the question paper. Students also found clickers comfortable and easy to use, which aligns with studies where students mentioned that the ease of clicker use was an advantage as well as useful for their active learning (Heaslip, Donovan, & Cullen, 2014).
Students perceived that it took more time to answer questions using clickers and were of the opinion that regular practice with clickers is essential before they are used in examinations.
“I enjoy using the clickers and find them efficient and useful but I would need more time to get more familiar with the clickers to feel confident in using them in an exam.”
Also, students regarded the prompt feedback on exams as a major strength in the use of clickers.
“I enjoyed the immediate grading response that you received at the end of the exam when using clickers.”
On the other hand, students were skeptical of the reliability of the technology and concerned that responses might not be delivered and properly saved in the system. Students indicated that it would be daunting to answer more than 100 questions with clickers, or to enter phrase or sentence responses. This aligns with Hancock (2010), who noted that many students prefer paper tests when there are many questions to answer. Thus, in this study, only 33% preferred clickers for exams with more than 100 questions (33% computer-based online; 16% OpScan).
The purpose of the interview was to explore the instructor's experience of using clickers for examinations and to identify the challenges.
One of the key benefits, as expressed by the instructor, was the ability to reduce or eliminate cheating. The small size of the device and screen, together with the assignment of multiple versions of the paper-based exam to students, was beneficial and eased invigilation of examinations.
In addition, clicker use increased the ability of the instructor to monitor the exam while students answered questions and to make corrections to questions where necessary. The instructor indicated that another key benefit of clicker use in examinations is that responses get stored in the clickers (as backup) in addition to being received by the instructor's computer. Another benefit is the quick turnaround time resulting from automatic grading of examinations.
Clickers save trips to the scan center for examination response analysis. Clickers allow entry of the student ID using the keypad; if the ID is entered incorrectly, it will not be located in the student list, and this becomes known to the instructor immediately. Unlike with OpScan sheets, with clickers it is not necessary to review the student answer sheet to ensure the ID has been entered or that pencil marks are dark enough. One advantage of OpScan for instructors and students is that it shows all answers on one sheet and can be reviewed quickly. Notably, clickers are also green and save trees.
Another benefit of the clicker model used, is that the examinations can be conducted anywhere, with no dependence on internet connections. The only requirement is a computer with receiver and the clickers in working condition.
4.3. Challenges of Clicker Use in Examinations
One of the challenges expressed by the instructor was that it is time-consuming to delete previous tests from the clickers before the examination. Also, with multiple versions of an MCQ examination, the instructor needs to identify each version before beginning the examination and must have multiple answer keys (one per version). Although the software supports various question types, clickers are not ideal for short and long answers, as the keypad and LCD screen are small. This limits practical use to multiple-choice questions. Similarly, it is not easy to enter mathematical symbols on the device.
With regard to security, a student could walk off with a clicker and gain access to the stored answers, or download them with the right know-how. The clicker does not, however, store any question text; it stores only the question type, the correct answer(s) and the correct/incorrect value weights.
In preparing for the examinations, the instructor needs to set up the computer, load the answer key and open the poll. He also needs to ensure that all clickers are in working condition prior to the examination and to have spare batteries available.
5. Limitations and Future Research
This study was limited to students enrolled in a computer science course. Studies of the use of SRS in summative assessments in other disciplines may help establish its feasibility across content areas. Given the rapid changes in technology, SRS devices and their evolving features need to be researched and examined on a continuous basis. In addition, there is a trend in higher education towards the use of students' own devices for teaching and learning. Further research is required to explore how participants' own devices can be utilized for summative examinations.
This study aimed to determine the feasibility of using clickers in high-stakes, paper-based exams and to identify the benefits, limitations and barriers to their use, a topic currently under-represented in the literature.
To check the students' perception that inputting responses using clickers takes longer, we carried out a mini experiment in which the time taken to respond to 20 questions using OpScan versus using clickers was measured. Two participants were timed as they responded to the same questions with pre-determined answers, first using clickers and then using OpScan sheets. There was no significant difference between the times taken, indicating that the extra time was a student perception that did not hold up when objectively measured.
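A comparison of this kind can be sketched as a paired t-test on per-question entry times. The timing values below are hypothetical stand-ins for illustration, not the study's data, and the helper name `paired_t` is our own:

```python
import statistics
from math import sqrt

def paired_t(xs, ys):
    """Paired t statistic for two matched samples (same questions, two entry methods)."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)   # sample standard deviation of the differences
    return mean_d / (sd_d / sqrt(n))

# Hypothetical per-question times (seconds) for the same 20 questions,
# answered once with a clicker and once on an OpScan bubble sheet.
clicker = [12, 9, 11, 10, 13, 8, 12, 11, 9, 10, 12, 11, 10, 9, 13, 10, 11, 12, 9, 10]
opscan  = [11, 10, 12, 9, 12, 9, 11, 12, 10, 9, 11, 12, 9, 10, 12, 11, 10, 11, 10, 9]

t = paired_t(clicker, opscan)
# With |t| well below the two-tailed critical value (about 2.09 for 19 degrees
# of freedom at p = 0.05), the difference in entry time is not significant.
```

Pairing by question removes between-question difficulty variation, so only the entry-method difference is tested — the same logic as timing the identical 20 questions under both methods.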
In conclusion, clickers may be considered an efficient strategy for administering short, high-stakes MCQ summative examinations. They save time and provide instant feedback in examinations. However, they are not very useful for examinations that require written answers. More studies are required to explore clicker use in weighted and summative examinations across various disciplines.
1) Is this the first time you have used clickers?
2) Have you used clickers in exams that counted?
3) Were you worried that the clicker would transmit the wrong response?
4) Were you worried that you might miss a question using the clicker?
5) Which would you prefer for MCQ exams that counted towards your grades:
・ Clickers
・ Bubble sheets
・ Computer-based online
Give reasons for your answer.
6) Which would you prefer for MCQ exams that counted towards your grades, containing more than 100 questions:
・ Clickers
・ Bubble sheets
・ Computer-based online
Give reasons for your answer.
7) Do you find it difficult to enter answers through a clicker?
8) How would you compare doing the midterm of 50 questions to a final of 100 questions using clickers?
9) Does it make a difference using clickers if the number of questions is significantly increased?
10) How do you normally tackle exams that count (do questions sequentially, jump around, do easy ones first, …)?
11) Is this keyboard similar to your iPhone? If you had to enter a short phrase or sentence, would you be okay with using a clicker?
12) Would you prefer OpScan sheets or clickers for a large number (100+) of MC questions? Why?
13) What suggestions would you give the instructor in order to improve the usage of clickers for weighted exams?
14) Did using the clickers slow down the pace of you answering the exam?
15) Did using the clickers slow down the implementation of the exam on the day of the exam?
http://dx.doi.org/10.1187/cbe.06-12-0206
Bojinova, E., & Oigara, J. (2013). Teaching and Learning in Higher Education. International Journal of Teaching and Learning in Higher Education, 25, 154-165.
Carnaghan, C., Edmonds, T. P., Lechner, T. A., & Olds, P. R. (2011). Using Student Response Systems in the Accounting Classroom: Strengths, Strategies and Limitations. Journal of Accounting Education, 29, 265-283.
Cubric, M., & Jefferies, A. (2015). The Benefits and Challenges of Large-Scale Deployment of Electronic Voting Systems: University Student Views from across Different Subject Groups. Computers & Education, 87, 98-111.
Cummings, R. G., & Hsu, M. (2007). An Investigation in a Tax Accounting Class. Journal of College Teaching & Learning, 4, 21-26.
DeBourgh, G. A. (2008). Use of Classroom "Clickers" to Promote Acquisition of Advanced Reasoning Skills. Nurse Education in Practice, 8, 76-87. http://dx.doi.org/10.1016/j.nepr.2007.02.002
Draper, S. W., & Brown, M. I. (2004). Increasing Interactivity in Lectures Using an Electronic Voting System. Journal of Computer Assisted Learning, 20, 81-94. http://dx.doi.org/10.1111/j.1365-2729.2004.00074.x
Gok, T. (2011). An Evaluation of Student Response Systems from the Viewpoint of Instructors and Students. Turkish Online Journal of Educational Technology, 10, 67-83.
Grez, D. (2010). Student Response System and Learning Oral Presentation Skills. Procedia-Social and Behavioral Sciences, 2, 1786-1789. http://dx.doi.org/10.1016/j.sbspro.2010.03.985
Groen, J. (2010). Clickers in Higher Education: A Tool to Increase Student Feedback and Participation. Centre for University Teaching, 3.
Hancock, T. M. (2010). Use of Audience Response Systems for Summative Assessment in Large Classes. Australasian Journal of Educational Technology, 26, 226-237. http://dx.doi.org/10.14742/ajet.1092
Heaslip, G., Donovan, P., & Cullen, J. G. (2014). Student Response Systems and Learner Engagement in Large Classes. Active Learning in Higher Education, 15, 11-24. http://dx.doi.org/10.1177/1469787413514648
Hoekstra, A., & Mollborn, S. (2012). How Clicker Use Facilitates Existing Pedagogical Practices in Higher Education: Data from Interdisciplinary Research on Student Response Systems. Learning, Media and Technology, 37, 303-320. http://dx.doi.org/10.1080/17439884.2011.568493
Johnson, K., & Lillis, C. (2010). Clickers in the Laboratory: Student Thoughts and Views. Interdisciplinary Journal of Information, Knowledge, and Management, 5, 139-151.
Kulesza, A. E., Clawson, M. E., & Ridgway, J. S. (2014). Student Success Indicators Associated with Clicker-Administered Quizzes in an Honors Introductory Biology Course. Journal of College Science Teaching, 43, 73-79. http://search.ebscohost.com/login.aspx?direct=true&AuthType=cookie,ip,uid&db=afh&AN=94637294&site=ehost-live
Laxman, K. (2011). A Study on the Adoption of Clickers in Higher Education. Australasian Journal of Educational Technology, 27, 1291-1303. http://dx.doi.org/10.14742/ajet.894
Mahon, K. L. (n.d.). Using Student Response Systems to Improve Student Outcomes.
Mayer, R. E., Stull, A., DeLeeuw, K., Almeroth, K., Bimber, B., Chun, D., Bulger, M., Campbell, J., Knight, A., & Zhang, H. J. (2009). Clickers in College Classrooms: Fostering Learning with Questioning Methods in Large Lecture Classes. Contemporary Educational Psychology, 34, 51-57.
Morales, L. (2011). Can the Use of Clickers or Continuous Assessment Motivate Critical Thinking? A Case Study Based on Corporate Finance Students. Higher Learning Research Communications, 1, 33-42. http://dx.doi.org/10.18870/hlrc.v1i1.31
Nielsen, K., Hansen, G., & Stav, J. B. (2013). Teaching with Student Response Systems (SRS): Teacher-Centric Aspects that Can Negatively Affect Students' Experience of Using SRS. Research in Learning Technology, 21, 18989. http://dx.doi.org/10.3402/rlt.v21i0.18989
Premkumar, K., & Coupal, C. (2008). Rules of Engagement-12 Tips for Successful Use of "Clickers" in the Classroom. Medical Teacher, 30, 146-149. http://dx.doi.org/10.1080/01421590801965111
Premkumar, K., Coupal, C., Trinder, K., & Majd, S. S. (2011). Engaging Students with Clickers in a Distributed Environment—Lessons Learned. Medical Science Educator, 21, 336-346.
Simelane, S., & Dimpe, D. M. (2011). Clicker Technology: The Tool to Promote Active Learning in the Classroom. In A. Méndez-Vilas (Ed.), Education in a Technological World: Communicating Current and Emerging Research and Technological Efforts (pp. 83-89). http://www.formatex.info/ict/book/83-98.pdf
Stav, J., Nielsen, K., Hansen-Nygard, G., & Thorseth, T. (2010). Experiences Obtained with Integration of Student Response Systems for iPod Touch and iPhone into E-Learning Environments. Electronic Journal of E-Learning, 8, 179-190.
Ward, C. R., Reeves, J. H., & Heath, B. P. (2003). Encouraging Active Student Participation in Chemistry Classes with a Web-Based, Instant Feedback, Student Response System.