Competency-based medical education (CBME) implementation has increasingly been construed as organizational change (Albanese et al., 2010; Carraccio et al., 2016; Englander et al., 2017; Hall et al., 2020; Klamen, Williams, Roberts, & Ciancolo, 2016; Kogan, Conforti, Yamazaki, Iobst, & Holmboe, 2017; Van Melle et al., 2019). One of the difficulties commonly associated with organizational change, and one that can lead individuals to disengage from the process, is the lack of a way to report on progress towards change. We deemed it important to find a solution to this problem.
Over the past two decades, much effort has been devoted to reforming medical education curricula along four orientations: 1) ensuring certifiable educational outcomes; 2) a deliberate focus on developing multiple abilities; 3) a de-emphasis on training as a function of time; and 4) a move towards learner-centred learning environments (Frank et al., 2010). Furthermore, medical schools are increasingly held to account for the social relevance of the skills graduates bring to health care (Carraccio et al., 2016). These orientations for medical training entail important organizational changes to schools, hinging on paradigmatic shifts (Carraccio, Wolfsthal, Englander, Ferentz, & Martin, 2002) that ultimately reflect how instructors on the ground adapt. Entrustable professional activities (EPAs), introduced by the Royal College of Physicians and Surgeons of Canada, are seen as the operational manifestation of CBME (Englander et al., 2017; ten Cate & Hoff, 2017). EPAs constitute the link between CanMEDS competencies (Frank, Snell, & Sherbino, 2015) and the professional activities that a budding physician must master to graduate.
Some organizational change frameworks focus on power relationships and structures within the organization (French, Bell, & Zawacki, 2006; Krupat, Pololi, Schnell, & Kern, 2013; Giddens, 1984), others on the organization’s identity and interactions with the surrounding environment (Ashforth & Mael, 1989; Stensaker, 2015), and still others on communities of practice (Billett, 2000; Wenger, 1998) and the development of new knowledge (Baumard, 1999). In contrast to the former, the latter perspective construes the individuals in an organization as knowledge creators. The process of knowledge creation, in which “knowledge is created and expanded through social interaction” (Nonaka & Takeuchi, 1995: p. 61), cannot be understood as a linear process, but rather as a recursive spiral process that expands organizational knowledge and distributes it from the individual, to the group, to the organizational and interorganizational levels (Nonaka, Toyama, & Konno, 2000). The model was developed by studying management practices in major Japanese firms in the 1970s and 1980s, with the aim of understanding the process of innovation in manufacturing. The spiral nature of the process, and its implications for the nature of learning, is conceptually very close to Kolb’s work on experiential learning (Kolb, 1984, 2015).
2. The Knowledge Creation Process
Knowledge creation is not an individual activity, and it occurs within a given space and time, which Nonaka calls ba (Nonaka et al., 2000). Ba is a shared mental space and time where information is shared and interpreted collectively. The authors posit that ba describes the condition required for individuals to willingly share their knowledge. The knowledge creation process comprises four stages that unfold within the ba: Socialization, Externalization, Combination and Internalization (SECI).
The socialization stage emerges through day-to-day experiences in teaching medical students in a clinical setting. For example, medical educators become aware of contradictions (perhaps by participating in Faculty Development activities) in the way teaching is carried out and talked about. Talking to peers brings these contradictions to the fore. The socialization stage ends when individuals embrace actions to “resolve these contradictions” (Nonaka & Toyama, 2003: p. 4).
During the externalization stage, individuals use their discursive consciousness to rationalize and articulate the contradictions they have encountered. Nonaka and Toyama insist that at this stage, individuals “seek to detach themselves from routines” (p. 4) to eventually put new ones in their place. Practical knowledge is actively shared with peers in an effort to form new concepts. The externalization stage ends when these new concepts become clear enough to be applied successfully in practice.
In the combination stage concepts are tested and disseminated to other members of the organization. This stage is where leaders “break down” concepts, such as CBME, so that peers can develop their understanding of them in their discursive consciousness and make sense of them. The combination stage ends when peers take ownership of the new concepts and introduce them in their practice.
In the internalization stage newly created concepts generate new practical knowledge. The change in the organization becomes visible as a greater number of members adopt change and new routines take hold. This stage ends when individuals begin to critically question these routines and a new cycle of knowledge creation begins. Table 1 summarizes the four stages of the knowledge creation model.
To summarize, an organization creates knowledge through the knowledge conversion process (SECI), which occurs in a specific time and space (ba) in the interactions between individuals and groups. The knowledge creation process is not linear: it describes an expansive, iterative movement visualized as a spiral.
Table 1. The Four stages of the organizational knowledge creation model—SECI (read clockwise).
Aim of the Present Paper
In an era of intense change in medical schools, a theoretical framework that can inform stakeholders about progress, as well as encourage the individuals leading the change, can be useful to minimize Faculty disengagement. We successfully used the Organizational Knowledge Creation Model (OKCM) to communicate and understand progress within our medical school. Hence this paper’s aim is to illustrate how to recognize and communicate about change within a medical school using the OKCM.
We relied on the Consolidated Criteria for Reporting Qualitative Research (COREQ) guidelines to report the findings of our study (Tong, Sainsbury, & Craig, 2007). The strategy chosen by our school to enact change rested on Academic Leaders whose mandate was to support peers, develop and deliver training activities, and advise Program Directors. The Université de Montréal medical school Faculty comprises approximately 3000 instructors (roughly 20% tenured, 80% sessional or non-tenured), working in over 100 different teaching sites (hospitals, family medicine and community clinics, etc., serving a population pool of about 2 million people in the Province of Quebec, Canada) and offering 73 programs: the undergraduate MD program and 72 graduate (residency and specialty) programs. The Faculty comprises 16 academic departments as well as two health sciences schools. Educators who volunteered to be Academic Leaders were trained in CBME by education experts and were remunerated for their work.
A case study approach with embedded levels (Gerring, 2007; Yin, 2014) was selected. The unit of enquiry is thus the academic department to which leaders belong. This allowed observation of the interactions between the leaders and the social structures, which is a crucial dimension in OKCM.
Selection of departments was based on a purposive sampling method (Gerring, 2007: p. 88), using the number of Faculty Development activities related to CBME in 2012 to establish differences among them. Faculty Development is seen as a core element in enacting curricular change (Steinert, 2012; Steinert, Naismith, & Mann, 2012), so the number of activities devoted to CBME reflected varying levels of progress towards its implementation. The sample comprised the department with the highest number of activities (MED), one with little activity (SURG), and one with no activities at all (PSY) in 2012.
3.1. Study Participants
A total of 26 medical educators participated in the study, in individual interviews or in focus groups. The three Department Chairs participated in the study and were interviewed twice, two years apart, to collect data on what had changed in the intervening years. The nine educators interviewed individually were mid-career medical educators who demonstrated an interest in pedagogy. Two of them had a graduate degree in medical education; the rest had gone through all the relevant CBME Faculty Development sessions offered by our institution. One of them had been a program director for many years and regularly facilitated training sessions on education for first-year residents. The final list of educators interviewed, either individually or in groups, was determined by practical considerations of availability (Table 2 & Table 3).
Table 2. Individual interview participants.
Table 3. Surgery department focus group participants.
3.2. Data Collection and Analysis
To observe change over time, two waves of data collection were carried out, in 2012 and 2014. All three Department Chairs were interviewed in both waves. For practical availability reasons, the three educators of the internal medicine department were interviewed individually in 2012 and in a group interview in 2014. Two educators in the surgery department were interviewed individually in 2014, and four focus groups were held in the same year. Two educators of the psychiatry department were interviewed individually in 2014. In total, 13 semi-structured individual and 5 group interviews yielded the qualitative data, which was recorded and transcribed. Interviews were conducted in French; only the quotes included in the manuscript were translated. Questions and prompts focused on participants’ experience and the facilitators and barriers they faced as they worked to implement CBME. All interviews were held on the main campus of the university (Table 4).
A linear analytic approach was used with a descriptive purpose. Through thematic analysis, common themes embedded in the transcripts were identified and coded (Boyatzis, 1998; Creswell, 2003). Coding was carried out using the OKCM framework, and themes described by participants were associated with the relevant phases of the process. For example, when participants described their work in program committees, this fell clearly within the combination phase, where concepts are shared with a broader group of individuals, and the theme was dissemination to a wider audience. The NVivo 12 software package was used by two team members (NF and DN) to code the data according to OKCM stages. Concordance of codes was verified using the software (89% concordance), and differences were discussed and resolved. Ethical approval was sought and granted from the University Research Ethics Board (12-079-CERES-D).
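For readers unfamiliar with inter-coder concordance figures like the 89% reported above, the underlying arithmetic is simply the share of coded excerpts to which both coders assigned the same OKCM stage. The sketch below illustrates that calculation; the stage labels and example data are invented for illustration, and the study itself used NVivo’s built-in comparison rather than custom code.

```python
# Hypothetical illustration of percent agreement between two coders
# assigning OKCM stages to interview excerpts. Labels and data are
# invented; the study relied on NVivo's coding comparison feature.

STAGES = ("socialization", "externalization", "combination", "internalization")

def percent_agreement(coder_a, coder_b):
    """Return the proportion of excerpts both coders labelled identically."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must rate the same set of excerpts")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Example: two coders rate nine excerpts and disagree on one of them.
coder_1 = ["socialization"] * 3 + ["externalization"] * 3 + ["combination"] * 3
coder_2 = list(coder_1)
coder_2[4] = "combination"  # the single disagreement
print(round(percent_agreement(coder_1, coder_2), 2))  # 0.89
```

Note that simple percent agreement does not correct for chance agreement; measures such as Cohen’s kappa are often reported alongside it for that reason.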
In the following section we summarize the results of our analysis of the 2012 and 2014 interview data, structured according to themes found in the data. Each theme was ascribed by the research team to one of the four OKCM stages.
Table 4. Data collection strategy.
4.1. CBME Implementation in 2012
A grass-roots movement—the socialization stage
In 2012, Department Chairs were not quite certain whether CBME implementation was necessary at all. They repeatedly shared the opinion that CBME would be here today and gone tomorrow:
“It’s not worth the investment to improve teaching. When we finish introducing the changes, we’ll have to start all over again because something new will come out. For those who are passionate about it, teaching is always evolving, constantly challenging assumptions. This was so even before we heard about Competency-based medical education.” (DCW/2012).
This rather pragmatic outlook contrasted sharply with the dynamic view held by educators. Internal medicine educators were active and convinced of the relevance of CBME; their engagement reflected a desire to improve teaching practice.
Exhorting teachers to change
In 2012, those responsible for teaching in hospitals asked educators to give lectures about CBME. This, educators admitted, meant that they had to “bluntly exhort teachers to change their teaching habits” (IM-03/2012). At first they simply showed up “with [their] powerpoint slides, gave their talk, and left.” (IM-01/2012). At this point, CBME was “interesting, but hardly concrete for what Faculty was supposed to be doing.” (PS-DC/2012). This generated much resistance, especially amongst physicians from other specialties: “Doctors don’t like to be told what to do, especially by someone who’s not from their specialty.” (IM-02/2012). In OKCM terms, this is the mark of the socialization stage, where contradictions between current practice and what was understood to be CBME were being voiced and discussed among peers.
4.2. CBME Implementation Two Years Later—2014
In 2014, Academic Leaders were active in all three departments. In the following section, we present the results of the second set of interviews, conducted between March and April of 2014.
Unlikely experts talking about pedagogy
In 2012, three educators of the psychiatry department got involved because, as one of them stated, pedagogy is “a stimulating and interesting field that touches everybody.” (PS-01/2014). As another put it, CBME implementation triggered discussions about pedagogy among colleagues in her teaching hospital: “it was like venting the ground that hadn’t been turned in 20 years” (PS-02/2014).
One of the Leaders brought up issues of legitimacy associated with the sudden and unforeseen role of “CBME expert”:
“we have to find a way to define this institutionally […] we are quickly placed in the position of the one who knows. This is the way things happen at the university, you go and do a presentation on anxiety disorders, even if your purpose is to do prevention and …well, suddenly you’re the expert!” (PS-02/2014).
Reaching out and building partnerships
By 2014, educators realized they needed to work together in association with the Department Chair who offered some protected time for this work. Educators reached out for assistance and were provided tools that had a considerable impact on their work.
The group acted as a catalyst. It provided the structure and legitimacy required to push for changes, and its members became recognizable to colleagues as people who could help with CBME. It was clear to educators that coordinating team activities was crucial “to figure out what we had to do” (PS-02/2014). Educators clearly felt that their decision to work together was an important facilitating factor.
“what helps, is not to be alone. The group’s impulse and the fact that we are a few of us to carry this mandate…because with such a big department, so widespread, at one point we don’t know where to start.” (PS-01/2014).
By 2014, Educators in the Psychiatry Department had set up working committees in teaching sites. But because of the number of teaching sites, embedding an Educator in each site was impossible. Educators understood the necessity of partnering with teachers on the ground.
“So, this is when we finally, [realized] it’s just us three. We sometimes got time off and worked some files a little more, then we brought it back to a wider group because we thought that it was important that the work be done at each teaching site. […] We worked more in detail […] and we submitted our work to the committee and the committee made comments, and we came back. […] and so it was that last autumn we organized a whole day for the implementation of EPAs […] we managed to get about 50 psychiatrists who answered the call to whom we presented the EPAs and started working on the generic EPAs, chose the ones that were most relevant for [psychiatry] and got all our groups working. We have working groups in psychiatry who are responsible for the mandatory rotations […] and all rotations were represented by the sub-committee and in those sub-committees, they appropriated themselves the EPAs.” (PS-01/2014, pp. 183-194).
As CBME implementation progressed, Educators realized that what worked best was to bring concrete, worked-out solutions and make colleagues feel that what was being proposed would formalize what they were already doing. The above themes were associated with the externalization phase: Educators were revisiting their roles and activities as experts in CBME, a new language associated with pedagogy was being introduced, and an overall consciousness was arising that CBME implementation must be a group effort.
EPAs as opportunity
The introduction of Entrustable Professional Activities (EPAs) (Peters, Holzhausen, Boscardin, ten Cate, & Chen, 2017; ten Cate & Scheele, 2007) in medical education provided an opportunity for Educators to push CBME implementation further. Educators adapted their approach to introduce and explain EPAs as a core component of CBME. Consequently, in the internal medicine specialty programs Educators became known as “people willing to help” (IM-FG/2014). This led them to change the way to reach out to colleagues. They learned to articulate the contradictions in teaching practice and show how EPAs provided the practical tools to resolve them. Educators had by then crossed the threshold from the externalization stage to the combination stage, where concepts are disseminated to other members of the organization who gain a practical understanding of them.
In 2014, Educators spoke of their keen interest in pedagogy and their willingness to improve teaching in the surgery department. The transition from the socialization to the externalization stage was signalled by a change of tack in bringing change to the teaching sites: Educators realized that they needn’t so much explain the CBME concepts as make them workable.
“Well, people … sincerely … when we talk to people about EPAs, competencies, milestones, they have no idea what we’re talking about. And even if they attended presentations on the subject … I mean even myself who attended many of these presentations, it wasn’t at all clear at the beginning. So, I think people were wary at the beginning and I think they are still. And I understand, but I think that by doing it in practice it should be easier.” (SU-FG/2014, pp. 356-361).
The externalization stage in surgery was marked by the resolution of issues of legitimacy. The need to introduce EPAs for the assessment of competencies into surgery programs became clear to all instructors in the department, and the Department Chair clearly expressed the need for it. This focused Educators’ work on key people, such as Program Directors, who wielded some authority to implement CBME.
By 2014, the Surgery Department was transitioning from the externalisation to the combination stage. A mark of the combination stage, where change starts to reach a wider audience and becomes systematized, was the introduction of a calendar of Surgery Department meetings devoted to CBME topics:
“you know a sort of structure in which, we say there is such a number of meetings per year in such and such a context. You know, that’s it, there needed to be a sort of skeleton of a structure in which you can easily find yourself. This instead of constantly chasing each other to meet-up and work.” (SU-FG/2014).
The formalization of CBME in this way helped to consolidate the gains and further stimulated Educators and colleagues to persevere in its implementation. It must be said that, by 2014, it was known that the ENT specialty would be the first to implement EPAs (i.e., CBME) the following year, as required by the national accreditation agency, providing further incentive for change.
By 2014, Educators and Department Chairs in all three departments recognized that few people questioned the need to implement CBME anymore. Indeed, discourse on CBME was less often cast in a negative light. Pedagogy became a recurring topic in most Department Continuing Medical Education meetings, and yearly Pedagogy Days explicitly focused on writing competency assessment tools were offered by the departments. However, EPA implementation was just beginning and was far from becoming a widespread tool for teaching and assessment at the medical school.
In summary, educators enacted change by collaborating with each other to strengthen their mastery of abstract concepts, securing recognition as experts from peers, transitioning from a role of “teacher” to one of “advisor”, and actively participating in the writing of tools and teaching materials within departmental committees.
During the second wave of data collection in 2014, as we listened to Department Chairs and educators relate their experiences of CBME implementation, we were able to share some insights based on the Organizational Knowledge Creation Model that we had been using to reflect on what we were witnessing. The model, with its four stages, uses a simple language—or discourse—to describe a very complex process. For example, when educators described how hard it was at the beginning to convince colleagues to come to their training sessions, we offered possible explanations using the socialization phase: they had noticed the “contradictions” in the way they were teaching and were aware, thanks to the CBME training they had received, that implementation was coming soon. When they realized that they were not reaching the educators who most needed to know about this, they started seeking different ways to move forward. This kind of discussion between educators and the research team led them to become aware of the necessity, and the means, to reach a wider audience.
What also became very clear at the second data collection wave is that new roles and new approaches to working together were being developed. Educators were surprised to find themselves the “experts” invited to take part in pedagogical committees; this forced them to assume new roles and to value their pedagogical knowledge. They further realized that they needed to work with Program Directors to get colleagues to collaborate in the writing of EPAs. These examples illustrate how new organizational knowledge was being created, which meant that the medical school was progressing towards its goal: change was happening! There are two valuable lessons here: 1) there was a shift in roles and responsibilities that educators recognized and undertook; and 2) new ways of working together to achieve the common goal of implementing CBME were being developed. Our selection of the Organizational Knowledge Creation Model, with its simple descriptions of the stages, provided the language and ideas that enabled educators to recognize the progress they had achieved in two years’ time.
Hence the Organizational Knowledge Creation Model afforded a unique perspective on CBME implementation that was useful to the educators carrying it out. On one level, by making them aware of their achievement, the model showed them how valuable and appreciated their work was. On another level, the simple and logical descriptions of the stages provided the words to recognize change and communicate about it to Faculty in the multiple teaching sites as well as to university authorities and social stakeholders. When the time comes to enact a change of similar amplitude, the model can provide useful guidance through the process and ensure that progress is duly acknowledged.
In order to observe change in our medical school, we selected a case study approach with embedded levels. This meant that differences amongst individual educators were not taken into account: factors such as personality, medical specialty and hospital setting that could play a role were overlooked. However, the focus on the relationships between educators and the social structures, in coherence with the model, is of sufficient value to offset this cost. The impact of these relationships on organizational change should not be underestimated, and their examination can provide valuable insight into progress achieved.
Finally, since data collection was completed in 2014, the medical school has continued to implement change, in conjunction with Canada’s medical school accreditation agencies, the Royal College of Physicians and Surgeons of Canada and the College of Family Physicians of Canada. The process is largely complete: more than half the specialty programs have implemented the change. The writing of Entrustable Professional Activities (EPAs) is now done by the accrediting agencies, so the focus within our medical school has been on Faculty development to prepare educators to guide and assess medical students according to the new standard.
Implementing CBME is considered an important organizational change for medical schools, and getting Faculty to envision such change is a challenge. The Organizational Knowledge Creation Model provided useful indications of progress over two years that allowed educators, university authorities and stakeholders to arrive at a common appraisal of the progress being attained. This provided well-deserved justification to celebrate successes—a powerful motivating factor. The model may also prove useful for planning and conducting future changes in medical schools.
1In the internal medicine department, the same three persons were interviewed individually in 2012 and as a focus group in 2014.
 Albanese, M. A., Mejicano, G., Anderson, W. M., & Gruppen, L. (2010). Building a Competency-Based Curriculum: The Agony and the Ecstasy. Advances in Health Sciences Education, 15, 439-454.
 Ashforth, B. E., & Mael, F. (1989). Social Identity and the Organization. The Academy of Management Review, 14, 20-39.
 Carraccio, C., Englander, R., Van Melle, E., Ten Cate, O., Lockyer, J., Chan, M. K., & International Competency-Based Medical Education (2016). Advancing Competency-Based Medical Education: A Charter for Clinician-Educators. Academic Medicine, 91, 645-649.
 Carraccio, C., Wolfsthal, S. D., Englander, R., Ferentz, K., & Martin, C. (2002). Shifting Paradigms: From Flexner to Competencies. Academic Medicine, 77, 361-367.
 Englander, R., Frank, J. R., Carraccio, C., Sherbino, J., Ross, S., & Snell, L. (2017). Toward a Shared Language for Competency-Based Medical Education. Medical Teacher, 39, 582-587. https://doi.org/10.1080/0142159X.2017.1315066
 Frank, J. R., Mungroo, R., Ahmad, Y., Wang, M., De Rossi, S., & Horsley, T. (2010). Toward a Definition of Competency-Based Education in Medicine: A Systematic Review of Published Definitions. Medical Teacher, 32, 631-637.
 Hall, A., Rich, J., Dagnone, D. J., Weersink, K., Caudle, J., Sherbino, J., & Van Melle, E. (2020). It’s a Marathon, Not a Sprint: Rapid Evaluation of CBME Program Implementation. Academic Medicine, 95, 786-793.
 Klamen, D. L., Williams, R. G., Roberts, N., & Ciancolo, A. T. (2016). Competencies, Milestones, and EPAs: Are Those Who Ignore the Past Condemned to Repeat It? Medical Teacher, 38, 1-7. https://doi.org/10.3109/0142159X.2015.1132831
 Kogan, J. R., Conforti, L. N., Yamazaki, K., Iobst, W., & Holmboe, E. S. (2017). Commitment to Change and Challenges to Implementing Changes after Workplace Based Assessment Rater Training. Academic Medicine, 92, 394-402.
 Krupat, E., Pololi, L., Schnell, E. R., & Kern, D. E. (2013). Changing the Culture of Academic Medicine: The C-Change Learning Action Network and Its Impact at Participating Medical Schools. Academic Medicine, 88, 1252-1258.
 Nonaka, I., & Toyama, R. (2003). The Knowledge-Creating Theory Revisited: Knowledge Creation as a Synthesizing Process. Knowledge Management Research & Practice, 1, 2-10. https://doi.org/10.1057/palgrave.kmrp.8500001
 Peters, H., Holzhausen, Y., Boscardin, C., ten Cate, O., & Chen, H. C. (2017). Twelve Tips for the Implementation of EPAs for Assessment and Entrustment Decisions. Medical Teacher, 39, 802-807. https://doi.org/10.1080/0142159X.2017.1331031
 Steinert, Y., Naismith, L., & Mann, K. (2012). Faculty Development Initiatives Designed to Promote Leadership in Medical Education. A BEME Systematic Review: BEME Guide No. 19. Medical Teacher, 34, 483-503.
 ten Cate, O., & Scheele, F. (2007). Competency-Based Postgraduate Training: Can We Bridge the GAP between Theory and Clinical Practice? Academic Medicine, 82, 542-547.
 Tong, A., Sainsbury, P., & Craig, J. (2007). Consolidated Criteria for Reporting Qualitative Research (COREQ): A 32-Item Checklist for Interviews and Focus Groups. International Journal for Quality in Health Care, 19, 349-357.
 Van Melle, E., Frank, J. R., Holmboe, E. S., Dagnone, D., Stockley, D., & Sherbino, J. (2019). A Core Components Framework for Evaluating Implementation of Competency-Based Medical Education Programs. Academic Medicine, 94, 1002-1009.