Today’s medical professionals are accustomed to an environment rich in media and connections, and consequently want more engaging, interactive educational experiences. New media can provide a solution, offering additional avenues for medical education beyond the traditional venue of the large lecture halls common to both undergraduate medical education and continuing professional medical education. The use of standardized patient actors or elaborate simulation centers is a “gold standard” of undergraduate medical education, but both can be prohibitively expensive and logistically challenging (Maloney & Haines, 2016) even for medical schools, and are seldom used for professional education. Serious games can include case-based training and the use of simulated patients, and are often-used learning tools in medical education (Karagiorgas & Niemann, 2017; Olszewski & Wolbrink, 2017; Wang et al., 2017). Virtual case simulations have been used successfully to train medical students via environments that duplicate the real-world challenges of patient care. Hands-on learning in these environments helps to increase retention of knowledge and application of skills (Gorbanev et al., 2018; Olszewski & Wolbrink, 2017; Wang et al., 2017; Westera, 2017), as well as intrinsic motivation (Diehl et al., 2013) and self-confidence (Nickerson et al., 2011; Shoemaker et al., 2011), in comparison to traditional lectures or textbook learning.
We describe the development and assessment of a new media-informed patient case simulation experience. The simulation, Clinical Encounters, draws on the principles of multiple theoretical models, including social cognitive theory (REF) and the experiential learning models of Dewey (Dewey, 1938) and Kolb (Kolb, 1983), as well as significant research on the value of simulations for increasing enthusiasm and motivation. Previous research suggests that our Clinical Encounters experience should improve knowledge, retention, and confidence in clinical skills (Buttussi et al., 2013; Diehl et al., 2013; Graafland et al., 2012; Yang et al., 2012), and we assessed this in a small trial.
2. Simulation Design: Clinical Encounters
2.1. Reasons to Choose Simulation Experiences for Medical Students
Medical students in the pre-clinical years of medical school are familiar with simulations and comfortable with the experience. A computer-based simulation can be available whenever the medical student has time, rather than only when the entire class can be scheduled or when actors and space are available. Further, computer-based simulations do not require dedicated space as simulation centers with mannequins or live actors do. Simulations of all types expose students to the challenge of clinical care in a realistic and safe environment where strategies can be tested and outcomes assessed. Of the 80% of medical students who positively described their real-life clinical experiences, most credited supportive learning environments (35%) and hands-on experiences (32%) (Ofei-Dodoo et al., 2018). Thus, increasing the use of simulation experiences should enhance learning and improve quality of life for medical students.
The individual focus of simulations supports attention, exploration, and confidence in asking questions. In contrast, learning in teams may not always be optimal: team size has been shown to negatively affect students’ learning experiences (Kandiah, 2017). Students in larger groups were less able or willing to ask questions or share opinions on cases, potentially due to time constraints and the traditional hierarchy issues seen in groups (Ofei-Dodoo et al., 2018). The latter issue may be especially problematic for women and minorities.
A meta-analysis found that simulations increased self-efficacy in a variety of clinical skills by 20%, declarative knowledge by 11%, procedural knowledge by 14%, and retention by 9% more than control approaches (Sitzmann, 2011). Increased self-efficacy is a key component of decreasing the stress of real-world clerkship experiences and preparedness (Bosch et al., 2017). In sum, a simulation can provide an individualized, scalable, reproducible, comprehensive, and standardized experience as part of the medical school curriculum.
In order to provide an engaging, interactive clinical education experience utilizing new media for medical students and younger providers, we created a framework for presenting simulated patient encounters using a simplified, easy-to-navigate electronic health record (EHR). The framework expands on our previous experiences with EHR-based simulations (Metcalf et al., 2010a; Metcalf et al., 2010b; Tanner et al., 2012). The Clinical Encounters experience captures the value of entertainment-oriented games, demands problem-solving, provides opportunities for reflection (Bauman, 2012), and can deliver targeted feedback to enhance empowerment and confidence (De Noble et al., 1999; Maddux, 2009; Drnovšek et al., 2010). While older providers may not have the exposure to games that younger providers have, most providers have used an EHR for the last 10 to 20 years.
2.2. Simulation Setting
Electronic health records (EHRs) are becoming the dominant form of documentation in today’s health care world. When coupled with a simulated patient encounter, the combination becomes a potential tool for facilitating learning during patient care (Gibbs et al., 2018; McLeod et al., 2017). We reviewed past efforts in designing and implementing these types of games as we developed our Clinical Encounters interactive case framework. Working from the typical components of a patient encounter and a framework of common EHR structures, we created a simple, tabbed interface through which learners navigate the steps of a typical patient encounter. Learners using our framework review a case by reviewing patient information, interacting with the simulated patient through dialogue exchanges, answering quizzes to make decisions on assessment and evaluation, ranking potential diagnoses, and determining treatment as they proceed through the encounter.
3. Initial Creation and User Response
During initial development of Clinical Encounters, we surveyed faculty members and students to guide adaptations of medical cases to a computer simulation format in order to create a beta version for wider review. The first step was to assess the content of the proposed simulation. Even if a simulation is engaging and realistic, it will not be a useful pedagogical experience for the students if the content itself is not appropriate. We asked 24 medical school faculty to review the proposed curriculum content to assure it was appropriate for the target audience.
With the content overview validated by the faculty members, we next assessed the format and pacing of the simulation. Since the simulation is intended to mimic a typical patient encounter, we surveyed 14 practicing physicians to help develop the sequence, flow, and pacing of the simulated patient encounter. Physicians were asked specific questions related to experiences within the simulation. Based on their recommendations, we determined that each encounter with a patient would ideally include five phases: History Taking, Physical Exam, Evaluation, Diagnosis, and Treatment Planning.
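As an illustration of the five-phase encounter structure described above, the flow can be sketched as a simple ordered state machine. This is our own sketch, not the platform's actual implementation; the phase names follow the paper, while the class and method names are hypothetical:

```python
from enum import Enum


class Phase(Enum):
    """The five phases of a simulated patient encounter, in order."""
    HISTORY_TAKING = 1
    PHYSICAL_EXAM = 2
    EVALUATION = 3
    DIAGNOSIS = 4
    TREATMENT_PLANNING = 5


class Encounter:
    """Tracks a learner's progress through the phases in sequence."""

    def __init__(self):
        self.phases = list(Phase)  # Enum members iterate in definition order
        self.index = 0

    @property
    def current(self):
        return self.phases[self.index]

    def advance(self):
        """Move to the next phase; stay on the last phase once reached."""
        if self.index < len(self.phases) - 1:
            self.index += 1
        return self.current
```

Modeling the phases as an ordered sequence makes it straightforward to enforce the "ideal" structured flow discussed later in the paper, such as not reaching treatment planning until the evaluation and diagnosis phases are complete.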
Practicing physicians provided input based on their clinical experience, such as the best time in the flow of a medical workup to ask for a differential diagnosis of the patient’s complaint: after the initial review of symptoms, after the history taking, or after the physical exam. Physician responses generally indicated that all of these options were appropriate, because physicians constantly revise the diagnoses they consider as they progress through the patient encounter. Additionally, we asked for feedback on how to present the “treatment plan options”. Based on the feedback, we broke treatment planning into several types, including Behavioral (Example: Recommend bed rest), Pharmacological (Example: Prescribe medication), and Surgical.
Starting with Harrison’s widely used text on the practice of medicine (Kasper et al., 2015) and patient encounter skills required for medical licensing exams (USMLE, 2020), we worked with medical school faculty, and practicing physicians to develop a list of “steps” for the simulation experience. The final steps are based on a sequence of events commonly used in medicine that would cover a wide range of medical conditions.
One of our goals for these case simulations was to model an “ideal”, structured patient encounter flow so that students can create a mental template of the typical sequence of events; for example, not making a plan for treatment until the patient evaluation is completed. Some of the steps we include in the simulation may be completed more efficiently in real-life practice. However, for the sake of a learning vehicle, we chose to make them an explicit, full experience, such as using a formal substance use screening survey instead of just asking the patient a few quick questions about their substance use.
Our initial release of a Clinical Encounters case was integrated into an existing online clinical training activity. This case, Patient Chad Wright, involved a fictional patient who was looking for a new provider to prescribe pain medication for an old knee injury. The interaction included common steps in a patient encounter, including an introduction to the patient, history, evaluation, medical tests, diagnosis, treatment planning, treatment, and a patient note summarizing the visit. In addition to the information presented, learners engaged with interactive elements, such as patient dialogue exchanges, quizzes about clinical choices, ranking of potential diagnoses to form a differential diagnosis, and a final post-test to apply what they learned.
3.1. Usability Testing with Medical Students
We conducted usability testing of the interface with a group of 10 medical students. Using an iterative approach appropriate to usability testing, we conducted a round of testing, made revisions to the interface, and conducted a second round of testing in order to get well-rounded feedback on the prototype design. Each round included five students, based on Nielsen’s guidance (Nielsen, 2012) that 5 to 8 individuals can detect 80% of usability problems and that testing with larger numbers is inefficient.
In both rounds, users indicated that using the simulation was more engaging than traditional forms of education about the topic (10/10) and was logically oriented (9/10). Almost all users found the organization of the simulation enhanced their experience. There was satisfaction with the usability of the simulation, particularly in terms of navigation. User commands were also highly rated, with all users agreeing they could determine how to make a clinical choice in the simulation.
Results from the first round of usability testing indicated that navigation was the most problematic area. To address these concerns, we made revisions to the prototype with an emphasis on improving the lowest scoring item, clarity on how to navigate. We improved the instruction delivery, and we added contrasting colors to group sections more clearly.
3.2. Prototype Testing (March to December 2018)
After the usability testing and revisions, we published the case as part of a pre-existing online learning activity on a pain topic. We asked users of the activity a set of questions specifically addressing the Patient Chad Wright case after they completed the clinical learning activity. Questions used a 5-point scale of Strongly Disagree to Strongly Agree. Medical professionals (N = 122) gave feedback from March to December 2018.
Over three-fourths of users enjoyed working through the case and found it easy to navigate. A large number of users also found the format effective for training in the clinical skills used within the patient case.
Learners did not want the entire learning activity to consist of case simulations; instead, they wanted the case supplemented by didactic material in the common module format. Combined results are shown in the table. Navigation through the case experience continued to be the lowest-rated area.
Although we had aimed for an intuitive interface not requiring instructions, we realized that some users preferred explicit directions. We added instructions at the beginning that tell the user how to navigate through the game, emphasizing the pathway of a typical clinical visit (History, Evaluation, Diagnosis, Treatment, and Summary).
Abbreviations for common medical terms were not familiar to some students, so we replaced them with full words; for example, “Tx” became “Treatment”.
User feedback also indicated a preference for a more narrative, story-like approach to the case history. To address this, we added additional information about the “patient” at the beginning of each case.
We also increased the amount of interactivity, adding quizzes that challenge the user to make a simulated clinical decision at regular intervals throughout the experience, at the middle and end of each section. This produced a pattern of challenge followed by integration. Feedback was provided immediately on the choices that users made.
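The challenge-and-immediate-feedback pattern described above can be sketched as a minimal quiz item. This is illustrative only; the class, its interface, and the sample question content are our own assumptions, not drawn from the actual Chad Wright case:

```python
class ClinicalQuiz:
    """A single-choice quiz item that returns feedback immediately after a choice."""

    def __init__(self, prompt, options, correct_index, feedback):
        self.prompt = prompt
        self.options = options            # answer strings shown to the learner
        self.correct_index = correct_index
        self.feedback = feedback          # one feedback string per option

    def answer(self, choice_index):
        """Return (is_correct, feedback_text) for the learner's choice."""
        return choice_index == self.correct_index, self.feedback[choice_index]


# Hypothetical quiz item (illustrative content only).
knee_quiz = ClinicalQuiz(
    prompt="What is the most appropriate next step for chronic knee pain?",
    options=["Order an X-ray", "Prescribe opioids immediately", "Refer to surgery"],
    correct_index=0,
    feedback=[
        "Correct: imaging helps confirm the suspected diagnosis first.",
        "Not ideal: opioids are not first-line for this presentation.",
        "Premature: surgical referral requires a confirmed diagnosis.",
    ],
)
```

Because every option carries its own feedback string, the learner receives an explanation immediately, whether or not the choice was correct, which matches the feedback pattern described above.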
3.3. Prototype Evaluation (December 2018 to March 2019)
We evaluated user responses to questions about usability of the updated version of the case during late 2018 and early 2019. This version maintained the same academic and interactive content, but included more explicit directions in the interface. We analyzed the feedback from users during the first 4 months of use.
Similar to results for the first round of testing, a large majority of users found the format effective for training in the clinical skills used within the patient case (81%). There continued to be room for some improvement in enjoyment and ease of navigation, with around three-fourths of users enjoying working through the case and finding it easy to navigate. Users continued not to want the entire learning activity to consist of cases, but instead wanted cases supplemented by the more common module format of online pages of text. Combined results are shown in the table.
In response to both the survey and free-form comments, we adapted the interface and the user path experience. Interactive elements based on typical clinical choices were added to the experience at appropriate points in the Clinical Encounter process. This allowed the question/answer process to more closely mirror points in the experience where a real-life clinical decision would need to be made. For example, a question about interpreting a medical test result was placed after the screen presenting the results. Additionally, we provided more lengthy feedback describing why the choice was or was not the best one and how it impacted clinical care or patient outcomes immediately after the user submitted their question responses.
We added a “drag-and-drop” functionality to the Differential Diagnosis step. Previously, for Differential Diagnosis, we had asked users to give a numerical rank to each potential diagnosis. In the new version with drag and drop functionality, users are given a list of diagnoses and asked to drag them into the order that seems most likely, from most likely at the top to least likely at the bottom. In a demo of the new drag-and-drop functionality at a conference, the functionality was well-liked by medical faculty and considered “fun”.
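One way such a drag-and-drop differential diagnosis ranking could be scored is by normalized pairwise agreement with an expert ordering (a Kendall-style measure). This is a sketch under our own assumptions, not the platform's documented scoring method; the function name and diagnoses are hypothetical:

```python
def ranking_score(learner_order, expert_order):
    """Score a learner's ranked list of diagnoses against an expert ordering.

    Returns a value in [0, 1]: 1.0 means the orders are identical,
    0.0 means the learner's order is fully reversed.
    """
    position = {dx: i for i, dx in enumerate(expert_order)}
    n = len(learner_order)
    pairs = n * (n - 1) // 2
    if pairs == 0:
        return 1.0  # zero or one diagnosis: nothing to disagree about
    agree = 0
    # Count pairs the learner placed in the same relative order as the expert.
    for i in range(n):
        for j in range(i + 1, n):
            if position[learner_order[i]] < position[learner_order[j]]:
                agree += 1
    return agree / pairs
```

A pairwise measure like this rewards partial credit: swapping two adjacent diagnoses costs only one disagreeing pair, while placing the least likely diagnosis first costs several.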
A final change was purely visual—we added small images of the patient in front of each line of patient dialogue, replacing a generic icon we had used in earlier testing.
In order to improve ratings for navigation, we allowed for more detailed tab headers by adding expand/collapse functionality to the tabs.
Although we initially designed a more “free-form” experience, user feedback indicated that learners preferred a more defined path. Navigation and usability ratings improved over time as we added arrows to indicate that more tabs are available when the tabs overflowed a section, included prominent continue buttons at the bottom of each tab so the learner had two ways of navigating (tabs across the top vs. continue buttons at the bottom right), and added text labels (such as “close” and “go back”) rather than relying only on icons or graphic elements for these functions.
3.4. Summative Testing (April 2019)
With these improvements integrated, we released the improved simulation format with the Chad Wright case. Upon completion of the learning activity, learners were again required to complete the case and a post-survey. We asked learners to rate their opinions of the case on a scale of Strongly Disagree to Strongly Agree. Thirty-five healthcare providers (N = 35) gave feedback on their experience.
Results improved significantly over previous versions of the case. As previously, almost all users enjoyed working through the case and found it both a valuable learning experience and an effective method for training in the clinical skills covered within the patient case. This last round of changes proved successful in improving the user navigation experience.
Interestingly, with the improved interface, the proportion of users who expressed a preference that the majority of the learning experience should be similar to the case rather than online text to read increased over previous versions.
4. Discussion
Following the trend toward increasing use of simulations in medical education, we designed a way to provide clinical training for medical students and professionals using patient encounter simulations to present cases. The framework design was loosely based on a format familiar to all medical professionals, the electronic medical record, simplified for ease of use. Through iterative development and testing of the simulation, we learned that we had to design the simulation carefully to mimic the relevant elements of a patient encounter with easy-to-use interactivity that supports learning relevant clinical skills.
We learned several lessons that can be applied broadly to the use of case-based simulation in online learning, informing both our work and that of others who might use new media-based case simulations. The most significant is that medical learners place an extremely high value on ease of use and navigation. While the academic content remained essentially constant throughout the rounds of testing, user enthusiasm varied with perceived ease of navigation. Thus, any online simulation navigation needs to be extremely intuitive and easy to use. This may be more challenging than developers or authors initially assume, and may require planning beyond typical “good” user interface design. We had initially assumed that our learner audience would all be familiar with the conventions of a medical clinical encounter. However, this assumption did not lead to the most usable design for all users. Even users experienced with medical clinical encounters preferred directive, obvious software navigation, despite the high level of complexity in “real-world” electronic medical records. Students appear to appreciate being able to focus more intently on the content of the simulation rather than the simulation process. Thus, case simulations for students should differentiate between teaching the process of using a simulation and teaching the medical content.
Further, interface design should accommodate an audience with a wide range of familiarity and comfort with game-like navigation conventions, and target the least experienced users. Experienced users did not indicate a dislike of explicit direction, while inexperienced users had a clear preference for it. This challenged our initial expectation that users would reject an interface that seemed to “talk down” to them. In fact, navigation ratings were higher from medical students, who probably have relatively more experience with games than practicing providers. Options need to be offered for those who may find one approach confusing or unfamiliar.
We also learned that users prefer a higher level of background content detail than we had initially anticipated. While our previous work indicated that an audience of medical students and professionals preferred focused content in didactic presentations, this was not true in a simulation learning experience. The simulation storyline needed details about the patient to make the patient seem real. Adding images of the patient throughout the dialogue transcripts added further realism, which appeared to contribute to user enjoyment of the simulation experience.
Users appreciated the realistic pacing and sequence of the experiences. This allowed us to create a focused experience that drew on users’ pre-existing knowledge and increased their perceptions of effectiveness. User interactions with the simulation should be evenly distributed throughout the case and can provide sufficient challenge to help the learner integrate skills learned in the didactic component of the training. Integrating those interactions into the clinical activity, such as asking for an interpretation immediately after test results are presented instead of putting interpretations at the end, contributed to enhanced enjoyment of the simulation experience.
The more improvements we made to the interactive case simulations, the more users wanted the training delivered via such cases instead of in the traditional format. However, even though a large majority of users of the final case enjoyed the experience and would like the training delivered via interactive cases, a minority of users did not agree. To respond to these results, our current design for clinical training is to feature the interactive cases but supplement them with summaries of the key skills learned using the more traditional format of an easily skimmed online document.
In order to control the user experience in this evaluation, we concentrated on one patient case that utilized most of the features available in Clinical Encounters. This is a limitation of the evaluation—a simpler patient clinical case might have yielded different results. Additionally, our study is an effectiveness trial, using a real-world situation where users sought out online continuing medical education. Thus, the audience was possibly predisposed positively toward online and new media experiences. Finally, the sample consisted of self-selected users and was not analyzed by age, gender, race, or profession; these variables might impact results. However, since users showed significant agreement on all issues, we do not believe these variables are impactful. Future work should explore this assumption.
Our experience indicates that case-based education is well-received and effective for this medical audience, but that presentation elements are essential. Based on our experiences, development of case-based materials using new media would be well served by comprehensive user testing of interface design and usability. Similarly, users had a positive response to the parts of the experience that most closely mimicked “real life” in terms of pacing and case presentation.
5. Looking Forward
The Clinical Encounters product offers several advantages for training medical students in patient interaction and the use of EHR frameworks. Our interactions closely resemble actual clinical experience through hands-on learning, which supports superior memory, transfer, and motivation (Chapple, 2014; Cook et al., 2011; Tai & Yuen, 2007; Virtusphere, 2013); a greater connection to the materials being presented (Oblinger, 2004); emphasis on experiential learning; employment of clear instructional design, learner control, and constructive learning; achievement of deeper learning by being able to change parameters and see the effect; and the use of a need- and outcome-oriented approach to education.
A case-focused simulation such as that created by the Clinical Encounters platform has the potential to yield an engaging, focused, and effective approach for medical students and professionals. The opportunity to work through real-world-style cases allows a focus on patient interaction skills in a reproducible, always-available approach. Additionally, online or mobile training can support an unlimited number of users and resources, unlike the traditional live-actor or mannequin-based simulations used in medical schools today. Improvements and adaptations are always possible, based on user feedback, in order to provide a more targeted learning experience that supports learner needs. This approach to training is beneficial and has the potential to strengthen learning for the next generation of health professionals. Additional work will expand the variety of cases and assess whether user results are consistent with what we have presented here. We also plan to assess a larger sample, focusing on demographic variables and online experience.
Acknowledgments
Clinical Encounters was developed by Clinical Tools, Inc (CTI) with funding from the following organizations:
• National Institute on Drug Abuse (Grants #N44DA65530, #1R44DA035042, #R44DA12066, and #1R44DA027245-01; Contracts #HHSN271200800038C, #HHSN271200655304C, and #HHSN271200900003C);
• National Institute on Alcohol Abuse and Alcoholism (Grants #1R43AA020456-01A1 and #2R44AA020456-02);
• National Institute of Diabetes and Digestive and Kidney Diseases (Grant #2R44DK091144-01A1).
We gratefully acknowledge this support, which was the sole funding source for this project’s development.
References
Bauman, E. B. R. (2012). Game-Based Teaching and Simulation in Nursing and Health Care. Berlin: Springer Publishing Company.
 Bosch, J., Maaz, A., Hitzblech, T., Holzhausen, Y., & Peters, H. (2017). Medical Students’ Preparedness for Professional Activities in Early Clerkships. BMC Medical Education, 17, Article No. 140.
 Buttussi, F., Pellis, T., Cabas Vidani, A., Pausler, D., Carchietti, E., & Chittaro, L. (2013). Evaluation of a 3D Serious Game for Advanced Life Support Retraining. International Journal of Medical Informatics, 82, 798-809.
 Chapple, C. (2014). New VR Treadmill the Virtualizer Hits Ground Running on Kickstarter.
 Cook, D. A., Hatala, R., Brydges, R., Zendejas, B., Szostek, J. H., Wang, A. T., Erwin, P. J., & Hamstra, S. J. (2011). Technology-Enhanced Simulation for Health Professions Education: A Systematic Review and Meta-Analysis. JAMA, 306, 978-988.
 De Noble, A. F., Jung, D., & Ehrlich, S. B. (1999). Entrepreneurial Self-Efficacy: The Development of a Measure and Its Relationship to Entrepreneurial Action. In Frontiers for Entrepreneurship Research (pp. 73-78). Waltham, MA: P&R Publication Inc.
 Diehl, L. A., Souza, R. M., Alves, J. B., Gordan, P. A., Esteves, R. Z., Jorge, M. L. S. G., & Coelho, I. C. M. (2013). InsuOnline, a Serious Game to Teach Insulin Therapy to Primary Care Physicians: Design of the Game and a Randomized Controlled Trial for Educational Validation. JMIR Research Protocols, 2, e5.
 Gibbs, D., Hewitt, B., & McLeod, A. (2018). The Gamification of Electronic Health Records: A Systematic Literature Review.
 Gorbanev, I., Agudelo-Londono, S., González, R. A., Cortes, A., Pomares, A., Delgadillo, V., Yepes, F. J., & Munoz, ó. (2018). A Systematic Review of Serious Games in Medical Education: Quality of Evidence and Pedagogical Strategy. Medical Education Online, 23, Article ID: 1438718.
 Graafland, M., Schraagen, J. M., & Schijven, M. P. (2012). Systematic Review of Serious Games for Medical Education and Surgical Skills Training. The British Journal of Surgery, 99, 1322-1330.
 Kasper, D. L., Fauci, A. S., Hauser, S. L., Longo, D. L., Jameson, J. L., & Loscalzo, J. (2015). Harrison’s Principles of Internal Medicine 19/E (Vol. 1 & Vol. 2) (ebook). New York: McGraw Hill Professional.
 Kolb, D. A. (1983). Experiential Learning: Experience as the Source of Learning and Development. Upper Saddle River, NJ: Prentice Hall.
 Maddux, J. E. (2009). Self-Efficacy: The Power of Believing You Can. In C. R. Snyder, & S. J. Lopez (Eds.), Handbook of Positive Psychology (pp. 277-287). Oxford: Oxford University Press.
 Maloney, S., & Haines, T. (2016). Issues of Cost-Benefit and Cost-Effectiveness for Simulation in Health Professions Education. Advances in Simulation, 1, 13.
 Drnovsek, M., Wincent, J., & Cardon, M. S. (2010). Entrepreneurial Self-Efficacy and Business Start-Up: Developing a Multi-Dimensional Definition. International Journal of Entrepreneurial Behavior & Research, 16, 329-348.
 McLeod, A., Hewitt, B., Gibbs, D., & Kristof, C. (2017). Evaluating Motivation for the Use of an Electronic Medical Record Simulation Game. Perspectives in Health Information Management, 14, 1-19.
 Metcalf, M. P., Tanner, T. B., & Buchanan, A. (2010a). Effectiveness of an Online Curriculum for Medical Students on Genetics, Genetic Testing and Counseling. Medical Education Online, 15, 4856.
 Metcalf, M. P., Tanner, T., & Wilhelm, S. (2010b). Assessing the Impact on Medical Students of an Online Curriculum on Opioid Dependence and Treatment. Substance Abuse, 32, 60-61.
 Nickerson, M., Morrison, B., & Pollard, M. (2011). Simulation in Nursing Staff Development: A Concept Analysis. Journal for Nurses in Staff Development: JNSD: Official Journal of the National Nursing Staff Development Organization, 27, 81-89.
 Ofei-Dodoo, S., Goerl, K., & Moser, S. (2018). Exploring the Impact of Group Size on Medical Students’ Perception of Learning and Professional Development during Clinical Rotations. Kansas Journal of Medicine, 11, 70-75.
 Olszewski, A. E., & Wolbrink, T. A. M. (2017). Serious Gaming in Medical Education: A Proposed Structured Framework for Game Development. Journal of the Society for Simulation in Healthcare, 12, 240-253.
 Shoemaker, M. J., Beasley, J., Cooper, M., Perkins, R., Smith, J., & Swank, C. (2011). A Method for Providing High-Volume Interprofessional Simulation Encounters in Physical and Occupational Therapy Education Programs. Journal of Allied Health, 40, e15-e21.
 Sitzmann, T. (2011). A Meta-Analytic Examination of the Instructional Effectiveness of Computer-Based Simulation Games. Personnel Psychology, 64, 489-528.
 Tanner, T. B., Wilhelm, S. E., Rossie, K. M., & Metcalf, M. P. (2012). Web-Based SBIRT Skills Training for Health Professional Students and Primary Care Providers. Substance Abuse, 33, 316-320.
 Wang, R., DeMaria, S., Goldberg, A., & Katz, D. (2017). A Systematic Review of Serious Games in Training Health...: Simulation in Healthcare. Journal of the Society for Simulation in Healthcare, 11, 41-51.
 Westera, W. (2017). Why and How Serious Games Can Become Far More Effective: Accommodating Productive Learning Experiences, Learner Motivation and the Monitoring of Learning Gains. Educational Technology & Society, 22, 59-69.
 Yang, B., Zhao, X., Ou, Y., Zhang, J., Li, Q., & Liu, Z. (2012). Design and Implementation of Virtual Reality Software with Psychological Treatment for Drug-Dependent Patients. Journal of Biomedical Engineering, 29, 1174-1177.