JSS Vol. 8 No. 11, November 2020
The Influence of Instructional Design and Instructional Material on Learners’ Motivation and Completion Rates of a MOOC Course
Abstract: Despite the formal, non-formal and informal training opportunities offered by MOOCs, they show low completion rates due both to personal factors of the participants and to the instructional design of the courses. Empirical research has shown that good instructional design can help increase learners’ motivation, helping them achieve high performance by completing the program. To determine the extent to which quality instructional material, which is the result of good instructional design, contributes to learners’ motivation and to high course completion rates, we created an eight-week MOOC on “Violence and bullying in schools”, following the “Systems Approach Model” of instructional design. Learners’ motivation from the instructional material was measured with the Instructional Material Motivation Survey (IMMS) instrument, which showed that the learners were highly motivated, both overall and in every factor of the ARCS model (Attention, Relevance, Confidence, Satisfaction), contributing to a very high completion rate: of the 1309 learners who started the program, 1050 (80.2%) completed it. Considering the results of our research, we finally suggest some points that can be taken into account when designing similar programs.

1. Introduction

Massive Open Online Courses (MOOCs) first appeared in 2008; their forerunner can be considered MIT’s OpenCourseWare program, launched in 2002, which sparked the Open Educational Resources (OER) movement (Liyanagunawardena, Adams, & Williams, 2013). Today they are a tool for access to higher education for millions of people who want to improve their lives (UNESCO, 2016).

Participants in MOOCs do not pay tuition fees, nor do they have to meet specific criteria to enroll, although course creators may recommend prior knowledge and skills needed to understand the content. The learning material is offered through short videos, slides or other digital files (Hoy, 2014) and is hosted on online platforms such as Coursera and edX. Learners are assessed through assignments graded by graduates, teachers or other learners, as well as through short closed-ended quizzes graded automatically by computer. Upon successful completion of the program, either a free, non-formal electronic certificate of completion is provided, or a formal certificate is issued upon payment and participation in formal examinations (Karnouskos & Holmlund, 2014).

Despite the ease of access and the training opportunities they offer, many participants are disappointed by their participation due to the form of the courses, their instructional design, the lack of frequent contact with instructors and the vague instructions given (Yuan & Powell, 2013; Hew & Cheung, 2014). Eventually, only a very small percentage manages to complete them. In general, completion rates range between 5% and 15% (Jordan, 2013), or below 10%, as Alraimi, Zo, & Ciganek (2015) concluded after considering the completion rates reported in several other surveys.

Learners’ motivation plays an important role in participation in and completion of online programs. Research indicates that learners’ motivation can be fostered through good instructional material, which is the result of equally good instructional design, contributing significantly to increased completion rates of online programs. In the present study, we examine the degree to which learners were motivated by the instructional design and instructional material of the first MOOC of the University of the Aegean (Greece), an eight-week course on “Violence and bullying in schools”, and whether this degree of motivation contributed to increased program completion rates.

1.1. Instructional Design

In the modern concept of teaching, all of its parts (instructor, students, learning material, learning environment) play a critical role, and a change in any one of them can affect the rest, as well as the final learning outcome. In other words, they function as a system, and one way to improve the learning outcome is through instructional design (Dick, Carey, & Carey, 2015).

In online learning environments, where lessons are conducted via the Internet, instructional design is considered necessary, as it systematizes the development process of these programs and contributes to achieving the learning goals that have been set (Sofos, Kostas, & Paraschou, 2015), ensuring that the educational material created is effective and suitable for the educational needs of the learners.

One of the instructional design models we relied on to develop our program is Dick, Carey, & Carey’s “Systems Approach Model”. This model is one of the best known (Gagne, Briggs, & Wager, 1992; Sofos et al., 2015), most popular and most influential, and it serves as a point of comparison for all other instructional design models (Gustafson & Branch, 2002).

The model is completed in ten different steps that can be followed linearly, cyclically, or in parallel (Dick et al., 2015). These steps are:

1) Identification of instructional goals. Instructional goals are articulated more generally than performance goals; therefore, an instructional goal may correspond to a set of performance goals (Oosterhof, 2010) and is achieved through the achievement of the performance goals associated with it (Sofos et al., 2015).

2) Conducting an instructional analysis, during which the educational goals of the previous step are analyzed and the steps for their achievement are determined, as well as the skills, knowledge and attitudes that the learners must possess to achieve them to the maximum extent.

3) Analysis of the learners and the context, during which the learning characteristics of the learners and the educational context in which they will learn and apply their new knowledge/skills are clarified.

4) Setting performance objectives, that is, what learners will be able to do, as well as how it will be demonstrated that they can do it.

5) Development of assessment instruments, which will examine the degree of achievement of the performance objectives of the previous stage.

6) Development of the instructional strategy that will lead to the achievement of the performance objectives. The instructional strategy may include pre-learning activities to motivate learners and increase their interest, activities of presenting new learning material, activities of active participation in the learning process, practice and reflection and activities of evaluating new knowledge and applying it in real conditions.

7) Development and/or selection of the instructional material based on which the instructional strategy of the previous stage will be implemented.

8) Development and construction of formative evaluation that will identify potential problems in instructional planning and possibilities for further improvement.

9) Review of the instructional intervention, based on the results of the formative evaluation, which will allow its improvement.

10) Development and conduct of a summative evaluation which, as a step, does not belong to the design process; however, it is necessary in order to determine whether or not the teaching was successful.

1.2. Literature Review

Motivation is the driving force behind participation in a training program. Motivations are the reasons why one decides to act in a certain way and the reasons that determine the intensity of the effort one will make (Keller, 2010).

Motivations can be categorized as intrinsic and extrinsic. Intrinsic motivation stems from within the learner and relates, for example, to the need for learning, interest, curiosity, and inner satisfaction. Extrinsic motivations, on the other hand, stem from the learner’s external environment and relate to rewards and praise (Davidson, Sternberg, & Sternberg, 2003; Dembo & Seli, 2020). More highly motivated learners are more actively involved in their learning and are more likely to complete a program (Zimmerman, 1990; Sungur, 2007). Indeed, findings from various surveys highlight the positive role of (intrinsic) motivation in participation and program completion (Littlejohn, Hood, Milligan, & Mustain, 2016; Khalil & Ebner, 2017; Watted & Barak, 2018; Shukor & Sulaiman, 2019). In contrast, those driven by extrinsic motivations face more difficulties (Rabin, Henderikx, Yoram, & Kalz, 2020). Nevertheless, extrinsic motivations can also motivate learners, especially obtaining a certificate of successful completion (Kizilcec & Schneider, 2015; Semenova, 2020).

Motivation becomes even more important in autonomous learning environments, such as MOOCs, than in traditional learning environments, due to the instructor’s lack of control over learners’ activity and the lack of communication between learners (Semenova, 2020).

One way to increase learners’ motivation to continue and complete the program in which they participate is its instructional design, as various studies have reported that poor instructional design is an obstacle that leads to dropping out of courses (Gütl, Rizzardini, Chang, & Morales, 2014; Nawrot & Doucet, 2014; Loizzo, Ertmer, Watson, & Watson, 2017), while, on the contrary, good instructional design can promote learning (Yousef, Chatti, Schroeder, & Wosnitza, 2014; Jung, Kim, Yoon, Park, & Oakley, 2019). Similarly, the instructional material of a MOOC, created or selected during instructional design, is an important factor influencing learner participation. In the research of Wang & Baker (2015), the participants who completed the program “Big Data in Education” considered learners’ interest in the instructional material more important for completion than their interest in MOOCs as such, while Hone & El Said (2016) found that the learning material significantly affects participation in a program. More recently, Hew (2018), examining ten highly rated MOOCs on the basis of the comments posted on the CourseTalk website by the learners who attended them, found that one of the factors contributing to learners’ active participation was instructional material that meets their needs and preferences.

Despite the importance of instructional design for the quality of the programs provided and its contribution to increasing learners’ motivation, very little research examines it. Margaryan, Bianco, & Littlejohn (2015), examining the instructional design of 76 randomly selected MOOCs, concluded that there is great room for improvement in most instructional design principles, except for the organization and presentation of instructional material, where the application of instructional design principles was high. In contrast to this research, Watson, Watson, & Janakiraman (2017) found greater application of instructional design principles in the nine MOOCs they examined, although their research had several limitations. Similarly to Margaryan et al. (2015), Oh, Chang, & Park (2019), examining 40 computer science MOOCs, concluded that the application of instructional design principles is very limited.

Research on the contribution of instructional material to increasing learners’ motivation is equally limited (Huang & Hew, 2017). Using the Instructional Material Motivation Survey (IMMS) instrument, Huang & Hew (2016) examined the responses of 27 learners who had participated in MOOCs hosted on various platforms and concluded that their motivation from the instructional material was, on the whole, average to high. The same researchers reached a similar conclusion in a later study examining the responses of 47 people, in which motivation from the instructional material contributed to higher program completion rates (Huang & Hew, 2017).

1.3. Current Study

The purpose of this study is to evaluate the instructional design and instructional material of the MOOC program that we created, in terms of the degree of motivation it generated in those who attended it, and to investigate the degree to which their motivation from the instructional material contributed to increased completion rates.

Therefore, the following research questions were posed:

· Did the instructional material influence the learners’ motivation, both overall and in terms of the four factors of the ARCS model (Attention, Relevance, Confidence, Satisfaction)?

· Did the instructional design influence the course completion rates?

2. Method

2.1. Research Model and Procedure

For the present study, which is part of the first researcher’s doctoral research, we adopted the quantitative research method. The IMMS instrument was integrated into the program hosting platform and was answered once at the end of the program by the learners who completed it.

2.2. Research Context

The program, which was the first attempt of the University of the Aegean in the field of MOOCs, was conducted from 3 February to 29 March 2020 and was hosted on an Open edX platform that we installed on a University server.

It consisted of eight weekly modules that were activated every Monday, and each included:

1) Instructional goals for what the learners were expected to achieve by attending each module.

2) Short introductory video (up to 2 minutes) that summarized the highlights of the previous week and informed about the topic and goals of the week that was starting.

3) Motivational activities that prompted the learners to share their prior views, knowledge, attitudes and experiences, and to develop a dialogue among themselves.

4) The main instructional material: short videos of up to 6 minutes with embedded slides that highlighted the main points being narrated or presented other explanatory elements (graphs, sketches, etc.). Videos with facts, testimonies, simulations, and analogies were also used as examples to explain the concepts presented in the main instructional material.

5) A multiple-choice quiz of 5 - 10 questions after each video, targeting knowledge, comprehension, application, analysis, synthesis, and evaluation. Each answer provided feedback justifying why it was correct or incorrect.

6) One or more voluntary activities that prompted recall of the knowledge presented and its application in addressing incidents of violence and bullying in schools (case studies).

7) A final assignment of 300 - 500 words at the end of each weekly module, consisting of open-ended questions aimed at analyzing, synthesizing, and applying knowledge to resolve incidents of violence and bullying in schools. The assignments were evaluated by the other learners (peer review).

8) Additional educational material to deepen the knowledge presented.

2.3. Sample

Initial interest in attending the program was expressed by 1952 people, including in-service teachers, pedagogy students and other individuals, the majority being in-service teachers. Some participants did not activate their account or never showed up once the program started. In total, 1309 people participated in at least one of the program’s activities. Of these, 259 dropped out of the program at some point, most during the first week of the course. Finally, 1050 people completed the program and were asked to answer the IMMS instrument, stating the degree to which they agreed with each of its statements. Two people did not respond, so the final sample consists of the 1048 learners whose answers were recorded.

2.4. Instrument

The IMMS instrument was used to measure learners’ motivation arising from the instructional material.

The IMMS instrument is designed based on the principles of the ARCS model, which provides strategies for developing instructional material that will create and maintain motivation for learning. It can be used either with printed educational material or in electronic environments that use educational material that supports self-directed/self-regulated learning. The tool aims to measure the degree of didactic reinforcement that the instructional material provides to the student through the quantified measurement of the four variables of the ARCS model (Keller, 2010).

The questionnaire contains 36 statements (some of which are inverse), each of which corresponds to one of the four factors of the ARCS model (Keller, 2010):

· Attention, which refers to the curiosity, activation, and interest aroused in learners by the course material.

· Relevance, which learners recognize in the course material with respect to their needs, motivations, experiences, and interests.

· Confidence, which the course material inspires, i.e. the expectation of positive learning outcomes.

· Satisfaction, which is provided to students during or after the course.

The instrument was translated from English to Greek following the forward-backward translation methodology, which is completed in four distinct stages (Van de Vijver & Leung, 1997; Lee, Chinna, Lim Abdullah, & Zainal Abidin, 2018). The internal consistency index (Cronbach’s alpha) of the scale was found to be above the 0.7 threshold in all factors (Attention: 0.796, Relevance: 0.839, Confidence: 0.705, Satisfaction: 0.849), as well as overall (0.935).
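The alpha computation itself is straightforward. The sketch below is a minimal illustration with invented Likert-type data (not the study’s 1048 responses), applying the standard formula alpha = k/(k - 1) * (1 - sum of item variances / variance of total scores):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented responses: 6 respondents x 4 items on a 1-5 Likert scale
items = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
], dtype=float)

print(round(cronbach_alpha(items), 3))  # → 0.956
```

Items that correlate strongly across respondents push alpha toward 1; inverse (reverse-scored) items, such as those in the IMMS, must be recoded before the calculation.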

2.5. Data Analysis

For the analysis of the IMMS questionnaire responses, the parametric one-sample t-test was used, as the sample distribution was close to normal. This test checks whether the mean of the population from which the sample was drawn differs from a test value. The test value was set to 3, which corresponds to the midpoint of the IMMS Likert scale, as proposed by its developer (Keller, 2010).
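As an illustration of this procedure (with simulated data, not the study’s actual responses; the mean and spread merely echo the magnitudes reported in the Results), the one-sample t-test against the scale midpoint of 3 can be run with SciPy:

```python
import numpy as np
from scipy import stats

# Simulated per-learner scores on a 1-5 Likert scale; invented for illustration.
rng = np.random.default_rng(42)
scores = np.clip(rng.normal(loc=4.26, scale=0.45, size=1048), 1.0, 5.0)

# H0: the population mean equals the scale midpoint (test value = 3)
t_stat, p_value = stats.ttest_1samp(scores, popmean=3.0)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")
```

A positive t-statistic with p below 0.05 rejects the null hypothesis that the population mean equals the scale midpoint, which is the criterion applied in the Results section.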

No inferential statistics were used to answer the second research question; only descriptive statistics were used.

3. Results

3.1. Research Question 1

In all factors, as well as overall, the learners report that they were greatly motivated by the instructional material: the means are above the scale midpoint and approach its upper limit, except for the Confidence factor, which is marginally below 4 (M = 3.98, SD = 0.505). The means of the other factors are M = 4.41 (SD = 0.578) for Satisfaction, M = 4.35 (SD = 0.550) for Attention and M = 4.30 (SD = 0.452) for Relevance. The overall mean of the responses is also well above the scale midpoint (M = 4.26, SD = 0.449).

More specifically, starting from the factor with the highest mean (Satisfaction), the learners express a high degree of motivation in every statement. The statement with the highest mean is “It felt good to successfully complete this lesson” (M = 4.72, SD = 0.562), while the lowest (M = 3.98, SD = 1.049) belongs to “The wording of feedback after the exercises, or of other comments in this lesson, helped me feel rewarded for my effort”, which also had the largest standard deviation of all the questions, indicating that learners’ views diverge on it. Among the remaining statements, the second-highest mean belongs to “It was a pleasure to work on such a well-designed lesson” (M = 4.56, SD = 0.691), followed by “Completing the exercises in this lesson gave me a satisfying feeling of accomplishment” (M = 4.48, SD = 0.698), “I really enjoyed studying this lesson” (M = 4.38, SD = 0.778) and, finally, “I enjoyed this lesson so much that I would like to know more about this topic” (M = 4.31, SD = 0.804).

In the Attention factor, the lowest means belong to the inverse statements “The pages of this lesson look dry and unappealing” (M = 1.36, SD = 0.864), “This lesson is so abstract that it was hard to keep my attention on it” (M = 1.47, SD = 0.941), “The style of writing is boring” (M = 1.54, SD = 0.910), “There are so many words on each page that it is irritating” (M = 1.70, SD = 1.01) and “The amount of repetition in this lesson caused me to get bored sometimes” (M = 2.03, SD = 1.110), indicating that the learners do not agree with these statements. Conversely, the highest means appear in the statements “There was something interesting at the beginning of this lesson that got my attention” (M = 4.45, SD = 0.663), “This lesson has things that stimulated my curiosity” (M = 4.42, SD = 0.683), “These materials are eye-catching” (M = 4.40, SD = 0.745), “The variety of reading passages, exercises, illustrations, etc., helped keep my attention on the lesson” (M = 4.35, SD = 0.782), “The way the information is arranged on the pages helped keep my attention” (M = 4.34, SD = 0.761), “The quality of the writing helped to keep my attention” (M = 4.30, SD = 0.826) and “I learned some things that were surprising or unexpected” (M = 4.05, SD = 0.912).

In the Relevance factor, the highest mean belongs to the statement “The content of this lesson will be useful to me” (M = 4.71, SD = 0.559), while the inverse statement “This lesson was not relevant to my needs because I already knew most of it” has M = 4.28 (SD = 0.972). Among the remaining statements, the highest means belong, in order, to “Completing this lesson successfully was important to me” (M = 4.64, SD = 0.604), “There were stories, pictures, or examples that showed me how this material could be important to some people” (M = 4.48, SD = 0.710), “The content of this material is relevant to my interests” (M = 4.44, SD = 0.725), “The content and style of writing in this lesson convey the impression that its content is worth knowing” (M = 4.39, SD = 0.733), “I could relate the content of this lesson to things I have seen, done, or thought about in my own life” (M = 4.24, SD = 0.795), “There are explanations or examples of how people use the knowledge in this lesson” (M = 4.16, SD = 0.834) and, finally, “It is clear to me how the content of this material is related to things I already know” (M = 3.39, SD = 1.020), which has the largest standard deviation among the factor’s questions.

In the factor with the lowest overall mean (Confidence), the lowest means belong to the inverse statements, all well below the scale midpoint (3). Specifically, the lowest means belong to the statements “I could not really understand quite a bit of the material in this lesson” (M = 1.45, SD = 0.897), “This material was more difficult to understand than I would like for it to be” (M = 2.10, SD = 1.105), “The exercises in this lesson were too difficult” (M = 2.13, SD = 1.030) and “Many of the pages had so much information that it was hard to pick out and remember the important points” (M = 2.71, SD = 1.186). Conversely, the highest means belong to statements with which the learners largely agree, with means well above the scale midpoint. Specifically, they agree with the statements “The good organization of the content helped me be confident that I would learn this material” (M = 4.43, SD = 0.725), “As I worked on this lesson, I was confident that I could learn the content” (M = 4.24, SD = 0.731), “After working on this lesson for a while, I was confident that I would be able to pass a test on it” (M = 4.03, SD = 0.908), “After reading the introductory information, I felt confident that I knew what I was supposed to learn from this lesson” (M = 3.97, SD = 0.884) and “When I first looked at this lesson, I had the impression that it would be easy for me” (M = 3.58, SD = 0.957).

Subsequently, the one-sample t-test described in Section 2.5 was performed on each factor of the ARCS model, as well as on the overall score, with the test value again set to the scale midpoint of 3 (Keller, 2010).

The test showed statistically significant differences in all factors, as well as overall (p < 0.05). Moreover, for all variables the 95% confidence interval of the difference fell outside the acceptance range. Therefore, we conclude that the instructional material had a statistically significant effect on the learners’ motivation, both overall and in all factors of the ARCS model (Table 1).

3.2. Research Question 2

Completion rates are calculated in different ways by different researchers (Grainger, 2013). One approach is to use the initial number of people enrolled in the program, which yields low completion rates. Another, which we adopted, is to relate the number of people who completed the program to those who participated in at least one of its activities. By this method of calculation, 1050 of the 1309 people who participated in at least one program activity completed the program (f = 80.2%).
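The gap between the two calculation methods can be seen directly from the participation figures reported in this study (treating the 1952 people who expressed initial interest as the enrolled base):

```python
# Completion rate under the two calculation methods described above,
# using the participation figures reported in this study.
expressed_interest = 1952  # people who initially expressed interest
active = 1309              # participated in at least one activity
completed = 1050           # completed the program

rate_vs_interest = completed / expressed_interest
rate_vs_active = completed / active  # method adopted in this study

print(f"vs. expressed interest: {rate_vs_interest:.1%}")  # → 53.8%
print(f"vs. active learners:    {rate_vs_active:.1%}")    # → 80.2%
```

The denominator choice alone shifts the reported rate by more than 25 percentage points, which is why the calculation method should always be stated alongside the rate.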

The dropout rate is very low compared to the literature: only 19.8%. Most learners (18.1%) left the program by the 4th week of the course (the middle of the program), while a very small percentage (1.7%) left between then and the end of the program (8th week). Overall, 81.5% of those who left the program did so during the 1st week (59.5%) or the 2nd week (22.0%) of the course.

Table 1. One-sample t-test of the IMMS instrument.

4. Discussion

4.1. Research Question 1

Overall, the learners disagree almost completely that the program was boring, unattractive, difficult to learn or so abstract that it failed to arouse their interest and attention. On the contrary, they almost completely agree that they felt good about completing the program because they learned something useful and successfully finished something important to them. The positive emotions of those who successfully complete a program were also highlighted in the research of Milligan & Littlejohn (2014) and Kleiman, Wolf, & Frye (2015), attributed to the acquisition of knowledge, experience and professional development opportunities (Zutshi, O’Hare, & Rodafinos, 2013; Milligan & Littlejohn, 2014; Van Hentenryck & Coffrin, 2014; Kizilcec & Halawa, 2015; Park, Jung, & Reeves, 2015; Huang & Hew, 2016; Koutsodimou & Tzimogiannis, 2016) or simply to the achievement of their goals (Beaven, Hauck, Comas-Quinn, Lewis, & de los Arcos, 2014; Wilkowski, Deutsch, & Russell, 2014; Kleiman et al., 2015; Li, 2015). In fact, the better the course design and the quality of the instructors and learning material (Oakley, Poole, & Nestor, 2016), the more satisfaction learners feel, and the more willing they declare themselves to attend other courses in the future (Belanger & Thornton, 2013; Tomkin & Charlevoix, 2014; Koutsodimou & Tzimogiannis, 2016), which appears to be the case in our research as well.

The instructional design of the program helped increase the learners’ Satisfaction with the program, as well as their Attention, Relevance and Confidence, with means well above the scale midpoint (3) (Min = 3.98, Max = 4.40, M = 4.25). These means are higher than those of the research examining the extent to which instructional material motivated learners in courses hosted on three different platforms, Coursera, Open2Study and Khan Academy (Huang & Hew, 2016) (Min = 3.58, Max = 3.77, M = 3.69), and also higher than those of Li (2015) and Gnostopoulou (2018). In the first case, Li (2015), as part of his doctoral research, designed two Chemistry courses following Keller’s (2010) instructional design process, achieving lower means than our research (1st course: Min = 3.99, Max = 4.29, M = 4.15; 2nd course: Min = 3.95, Max = 4.17, M = 4.08). In the second, Gnostopoulou (2018) designed a MOOC course entitled “Introduction to Virtual Reality” in the context of her master’s research, following the strategies and techniques of the ARCS model, achieving means of Min = 4.04, Max = 4.35, M = 4.18.

Keller (2010: p. 46) states that if the factors of Attention, Relevance and Confidence are improved, learners will be motivated to learn; they should then also be satisfied in order to continue wanting to learn. We therefore believe that the learners were motivated by the instructional material and the wider instructional design that was followed, and remained satisfied until the end of the program.

Among the four factors, Satisfaction had the highest mean. This finding confirms Gagné and Driscoll (1988), who characterize Satisfaction as the easiest factor to achieve, usually through constructive and timely feedback, as was the case in our program. On the other hand, while it is easy to capture attention, it is difficult to maintain it (Keller, 2010). The second-highest mean, achieved by the Attention factor, suggests that in the program we implemented attention was both captured and maintained. The Relevance factor refers to how relevant the material is to the goals, needs, experiences and interests of the learners. This factor had the third-highest mean because of the statement “It is clear to me how the content of this material is related to things I already know”, which received the lowest mean among the factor’s statements. Given that the content of the program was entirely related to issues of violence and bullying in schools that most teachers face daily, there may have been a misunderstanding of the wording, specifically of the phrase “related to things I already know”, which learners may have taken to refer to knowledge they already possessed rather than to issues of their daily lives. The same question, with the same wording, also received the lowest mean among the factor’s questions in Gnostopoulou’s (2018) research.

The Confidence factor shows the lowest mean, although it is still well above the scale midpoint (M > 3). At the beginning, the program seemed demanding and difficult, reducing learners’ belief that they could acquire the knowledge they wanted and complete the program successfully; later, the tight schedule and the numerous assessments (quizzes, final assignments) also played a role.

4.2. Research Question 2

The program was completed by 80.2% of the learners who started it. This percentage is very high relative to the completion rates reported in the literature, which range from 5% to 15% (Jordan, 2013).

The instructional material of the program and its overall instructional design contributed to the very low dropout rate (19.8%), as findings from various studies have shown that dropout decreases when learners are satisfied with the program and its instructional material (Khalil & Ebner, 2013; Gütl, Rizzardini, Chang, & Morales, 2014; Nawrot & Doucet, 2014; Whitmer, Schiorring, & James, 2014; Yousef et al., 2014; Alraimi, Zo, & Ciganek, 2015; Castaño, Maiz, & Garay, 2015; Hew, 2016; Hone & El Said, 2016; Loizzo, Ertmer, Watson, & Watson, 2017; Jung et al., 2019) or when they are motivated by it (Littlejohn, Hood, Milligan, & Mustain, 2016; Khalil & Ebner, 2017; Hew, 2018; Watted & Barak, 2018; Shukor & Sulaiman, 2019), which was the case in our program as well.

5. Conclusion, Limitations and Future Research

The present study investigated the extent to which the instructional material, the result of good instructional design, of the first MOOC program we created helped increase the motivation of those who attended it and led to higher program completion rates. The results showed that the learners who participated in the program were largely motivated by the instructional material, both overall and in each factor of the ARCS model. Their degree of motivation is well above the scale midpoint, consistent with the findings of Huang & Hew (2016, 2017), and surpasses the results of other research designed following Keller’s instructional design model (Li, 2015; Huang & Hew, 2016; Gnostopoulou, 2018). This suggests that it does not matter which instructional design model is used, provided it is applied correctly, taking into account the requirements and needs of the learners. Of course, this is difficult in MOOCs, which involve a heterogeneous set of learners. However, designers can reasonably assume participation out of personal or professional interest, as this is the most common motivation for enrolling in MOOCs.

The learners’ high degree of motivation contributed to a very low dropout rate of only 19.8%, confirming empirical findings that instructional material produced by sound instructional design affects participation in and completion of such programs.

In order to design future programs that are successful in terms of student performance and low dropout rates, we suggest:

· careful instructional design according to a specific model, such as that of Keller (2010) or Dick et al. (2015).

· use of illustrative examples, case studies, and facts or events that challenge students’ existing perceptions and knowledge; use of analogies and allegories; variation in the methods and media of the instructional material; and frequent shifts in control of the interaction between teacher and learner.

· a clear schedule and explicit goal setting in each weekly module.

· introductory videos that guide and inform learners and spark their interest in each section.

· short videos of up to six minutes, with integrated slides or other multimedia material, that highlight for students the most important points of what they hear and see.

· closed-ended quizzes with varied question types, not only cognitive ones, providing explanatory feedback for each answer.

· engaging opening activities and optional tasks that encourage dialogue.

· small final weekly projects that apply the new knowledge, with exemplar answers provided either in the forum or automatically after a task is graded.

· even distribution of material across the weekly modules.

· provision of additional instructional material and bibliography.

Despite the large sample of our research, certain limitations prevent the results from being generalized. In particular, our research examined only one course, attended by a relatively homogeneous sample: mainly teachers holding at least a higher education degree and possessing knowledge of and experience in the subject of the program. Nevertheless, the findings show that sound overall instructional design and quality instructional material that meets learners’ needs and requirements, enabling them to apply their knowledge to real problems through varied activities, can secure learners’ motivation and active participation in a program and, ultimately, high completion rates.

Our research extends the existing literature on the pedagogical quality of MOOCs. These findings, combined with other researchers’ observations that the instructional design of MOOCs has room for improvement, can inform future designers in creating quality programs that help learners meet the goals for which they enroll.

Cite this paper: Giasiranis, S. and Sofos, L. (2020) The Influence of Instructional Design and Instructional Material on Learners’ Motivation and Completion Rates of a MOOC Course. Open Journal of Social Sciences, 8, 190-206. doi: 10.4236/jss.2020.811018.
References

[1]   Alraimi, K. M., Zo, H., & Ciganek, A. P. (2015). Understanding the MOOCs Continuance: The Role of Openness and Reputation. Computers & Education, 80, 28-38.
https://doi.org/10.1016/j.compedu.2014.08.006

[2]   Beaven, T., Hauck, M., Comas-Quinn, A., Lewis, T., & de los Arcos, B. (2014). MOOCs: Striking the Right Balance between Facilitation and Self-Determination. MERLOT Journal of Online Learning and Teaching, 10, 31-43.

[3]   Belanger, Y., & Thornton, J. (2013). Bioelectricity: A Quantitative Approach Duke University’s First MOOC.

[4]   Castaño, C., Maiz, I., & Garay, U. (2015). Design, Motivation and Performance in a Cooperative MOOC Course. Comunicar, 22, 19.
https://doi.org/10.3916/C44-2015-02

[5]   Davidson, J. E., & Sternberg, R. J. (2003). The Psychology of Problem Solving. Cambridge: Cambridge University Press.
https://doi.org/10.1017/CBO9780511615771

[6]   Dembo, M. H., & Seli, H. (2020). Motivation and Learning Strategies for College Success: A Focus on Self-Regulated Learning (6th ed.). Abingdon-on-Thames: Routledge.

[7]   Dick, W., Carey, L., & Carey, J. O. (2015). The Systematic Design of Instruction (8th ed.). Upper Saddle River, NJ: Pearson.

[8]   Gagné, R. M., & Driscoll, M. P. (1988). Essentials of Learning for Instruction (2nd ed.). Upper Saddle River, NJ: Prentice Hall.

[9]   Gagne, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of Instructional Design (4th ed.). Orlando, FL: Harcourt Brace Jovanovich.

[10]   Gnostopoulou, M. (2018). MOOC Design and Motivation: The Course “Introduction to Virtual Reality”. Master Dissertation, Thessaloniki: Aristotle University of Thessaloniki, School of Early Childhood Education, School of Electrical and Computer Engineering, University of Ioannina, Department of Primary Education.

[11]   Grainger, B. (2013). Massive Open Online Course (MOOC) Report. London: University of London International Programmes.
http://www.londoninternational.ac.uk

[12]   Gustafson, K., & Branch, R. (2002). Survey of Instructional Development Models (4th ed.). Syracuse, NY: ERIC Clearinghouse on Information & Technology.

[13]   Gütl, C., Rizzardini, R. H., Chang, V., & Morales, M. (2014). Attrition in MOOC: Lessons Learned from Drop-Out Students. In International Workshop on Learning Technology for Education in Cloud (pp. 37-48). Cham: Springer.
https://doi.org/10.1007/978-3-319-10671-7_4

[14]   Hew, K. F. (2016). Promoting Engagement in Online Courses: What Strategies Can We Learn from Three Highly Rated MOOCS. British Journal of Educational Technology, 47, 320-341.
https://doi.org/10.1111/bjet.12235

[15]   Hew, K. F. (2018). Unpacking the Strategies of Ten Highly Rated MOOCs: Implications for Engaging Students in Large Online Courses. Teachers College Record, 120, Article ID: 010308.

[16]   Hew, K. F., & Cheung, W. S. (2014). Students’ and Instructors’ Use of Massive Open Online Courses (MOOCs): Motivations and Challenges. Educational Research Review, 12, 45-58.
https://doi.org/10.1016/j.edurev.2014.05.001

[17]   Hone, K. S., & El Said, G. R. (2016). Exploring the Factors Affecting MOOC Retention: A Survey Study. Computers & Education, 98, 157-168.
https://doi.org/10.1016/j.compedu.2016.03.016

[18]   Hoy, M. B. (2014). MOOCs 101: An Introduction to Massive Open Online Courses. Medical Reference Services Quarterly, 33, 85-91.
https://doi.org/10.1080/02763869.2014.866490

[19]   Huang, B., & Hew, K. F. (2017). Factors Influencing Learning and Factors Influencing Persistence: A Mixed-Method Study of MOOC Learners’ Motivation. Proceedings of the 2017 International Conference on Information System and Data Mining, Charleston SC USA, April 2017, 103-110.
https://doi.org/10.1145/3077584.3077610

[20]   Huang, B., & Hew, K. F. T. (2016). Measuring Learners’ Motivation Level in Massive Open Online Courses. International Journal of Information and Education Technology, 6, 759-764.
https://doi.org/10.7763/IJIET.2016.V6.788

[21]   Jordan, K. (2013). MOOC Completion Rates: The Data.
http://www.katyjordan.com/MOOCproject.html

[22]   Jung, E., Kim, D., Yoon, M., Park, S., & Oakley, B. (2019). The Influence of Instructional Design on Learner Control, Sense of Achievement, and Perceived Effectiveness in a Supersize MOOC Course. Computers & Education, 128, 377-388.
https://doi.org/10.1016/j.compedu.2018.10.001

[23]   Karnouskos, S., & Holmlund, M. (2014). Impact of Massive Open Online Courses (MOOCs) on Employee Competencies and Innovation. Blekinge: Institute of Technology.

[24]   Keller, J. M. (2010). Motivational Design for Learning and Performance: The ARCS Model Approach. New York: Springer.
https://doi.org/10.1007/978-1-4419-1250-3

[25]   Khalil, H., & Ebner, M. (2013). “How Satisfied Are You with Your MOOC?” A Research Study on Interaction in Huge Online Courses. In EdMedia+ Innovate Learning (pp. 830-839). Victoria: Association for the Advancement of Computing in Education (AACE).

[26]   Khalil, M., & Ebner, M. (2017). Driving Student Motivation in MOOCs through a Conceptual Activity-Motivation Framework. Zeitschrift für Hochschulentwicklung, 12, 101-122.
https://doi.org/10.3217/zfhe-12-01/06

[27]   Kizilcec, R. F., & Halawa, S. (2015). Attrition and Achievement Gaps in Online Learning. Proceedings of the Second ACM Conference on Learning@ Scale, Vancouver, 14-18 March 2015, 57-66.
https://doi.org/10.1145/2724660.2724680

[28]   Kizilcec, R. F., & Schneider, E. (2015). Motivation as a Lens to Understand Online Learners: Toward Data-Driven Design with the OLEI Scale. ACM Transactions on Computer-Human Interaction (TOCHI), 22, 1-24.
https://doi.org/10.1145/2699735

[29]   Kleiman, G., Wolf, M. A., & Frye, D. (2015). Educating Educators: Designing MOOCs for Professional Learning. In The MOOC Revolution: Massive Open Online Courses and the Future of Education (pp. 117-146). New York: Routledge.

[30]   Koutsodimou, K., & Tzimogiannis, A. (2016). Mass Open Courses and Teacher Professional Development: Design Issues and Study of Participants’ Views. Proceedings of the 10th Pan-Hellenic and International Conference “ICT in Education”, Ioannina, 23-25 September 2016, 53-61.

[31]   Lee, W. L., Chinna, K., Lim Abdullah, K., & Zainal Abidin, I. (2018). The Forward-Backward and Dual-Panel Translation Methods Are Comparable in Producing Semantic Equivalent Versions of a Heart Quality of Life Questionnaire. International Journal of Nursing Practice, 25, e12715.
https://doi.org/10.1111/ijn.12715

[32]   Li, K. (2015). Motivating Learners in Massive Open Online Courses: A Design-Based Research Approach. Doctoral Dissertation, Athens, OH: Ohio University.

[33]   Littlejohn, A., Hood, N., Milligan, C., & Mustain, P. (2016). Learning in MOOCs: Motivations and Self-Regulated Learning in MOOCs. The Internet and Higher Education, 29, 40-48.
https://doi.org/10.1016/j.iheduc.2015.12.003

[34]   Liyanagunawardena, T. R., Adams, A. A., & Williams, S. A. (2013). MOOCs: A Systematic Study of the Published Literature 2008-2012. International Review of Research in Open and Distributed Learning, 14, 202-227.
https://doi.org/10.19173/irrodl.v14i3.1455

[35]   Loizzo, J., Ertmer, P. A., Watson, W. R., & Watson, S. L. (2017). Adult MOOC Learners as Self-Directed: Perceptions of Motivation, Success, and Completion. Online Learning, 21.
https://doi.org/10.24059/olj.v21i2.889

[36]   Margaryan, A., Bianco, M., & Littlejohn, A. (2015). Instructional Quality of Massive Open Online Courses (MOOCs). Computers & Education, 80, 77-83.
https://doi.org/10.1016/j.compedu.2014.08.005

[37]   Milligan, C., & Littlejohn, A. (2014). Supporting Professional Learning in a Massive Open Online Course. International Review of Research in Open and Distributed Learning, 15, 197-213.
https://doi.org/10.19173/irrodl.v15i5.1855

[38]   Nawrot, I., & Doucet, A. (2014). Building Engagement for MOOC Students: Introducing Support for Time Management on Online Learning Platforms. Proceedings of the 23rd International Conference on World Wide Web, Seoul Korea, April 2014, 1077-1082.
https://doi.org/10.1145/2567948.2580054

[39]   Oakley, B., Poole, D., & Nestor, M. (2016). Creating a Sticky MOOC. Online Learning, 20, 13-24.
https://doi.org/10.24059/olj.v20i1.731

[40]   Oh, E. G., Chang, Y., & Park, S. W. (2019). Design Review of MOOCs: Application of E-Learning Design Principles. Journal of Computing in Higher Education, 29, 28-46.
https://doi.org/10.1007/s12528-019-09243-w

[41]   Oosterhof, A. (2010). Educational Assessment: From Theory to Practice. Athens: Ellin.

[42]   Park, Y., Jung, I., & Reeves, T. C. (2015). Learning from MOOCs: A Qualitative Case Study from the Learners’ Perspectives. Educational Media International, 52, 72-87.
https://doi.org/10.1080/09523987.2015.1053286

[43]   Rabin, E., Henderikx, M., Yoram, M. K., & Kalz, M. (2020). What Are the Barriers to Learners’ Satisfaction in MOOCs and What Predicts Them? The Role of Age, Intention, Self-Regulation, Self-Efficacy and Motivation. Australasian Journal of Educational Technology, 36, 119-131.
https://doi.org/10.14742/ajet.5919

[44]   Semenova, T. (2020). The Role of Learners’ Motivation in MOOC Completion. Open Learning: The Journal of Open, Distance and e-Learning, 1-15.
https://doi.org/10.1080/02680513.2020.1766434

[45]   Shukor, N. A., & Sulaiman, S. (2019). Self-Regulated Learning Strategies and Learning Retention in MOOC.

[46]   Sofos, A., Kostas, A., & Paraschou, B. (2015). Online Distance Learning from Theory to Practice. Greek Academic Electronic Textbooks and Aids.
http://hdl.handle.net/11419/182

[47]   Sungur, S. (2007). Modeling the Relationships among Students’ Motivational Beliefs, Metacognitive Strategy Use, and Effort Regulation. Scandinavian Journal of Educational Research, 51, 315-326.
https://doi.org/10.1080/00313830701356166

[48]   Tomkin, J. H., & Charlevoix, D. (2014). Do Professors Matter? Using an a/b Test to Evaluate the Impact of Instructor Involvement on MOOC Student Outcomes. In Proceedings of the First ACM Conference on Learning@ Scale Conference (pp. 71-78). New York: ACM.
https://doi.org/10.1145/2556325.2566245

[49]   UNESCO (2016). Making Sense of MOOCs: A Guide for Policy-Makers in Developing Countries. United Nations Educational, Scientific and Cultural Organization (UNESCO), France, and Commonwealth of Learning (COL), Canada, June 2016.

[50]   Van de Vijver, F. J. R., & Leung, K. (1997). Methods and Data Analysis for Cross-Cultural Research (Vol. 1). London: Sage.

[51]   Van Hentenryck, P., & Coffrin, C. (2014). Teaching Creative Problem Solving in a MOOC. In Proceedings of the 45th ACM Technical Symposium on Computer Science Education (pp. 677-682). New York: ACM.
https://doi.org/10.1145/2538862.2538913

[52]   Wang, Y., & Baker, R. (2015). Content or Platform: Why Do Students Complete MOOCs. MERLOT Journal of Online Learning and Teaching, 11, 17-30.

[53]   Watson, W. R., Watson, S. L., & Janakiraman, S. (2017). Instructional Quality of Massive Open Online Courses: A Review of Attitudinal Change MOOCs. International Journal of Learning Technology, 12, 219-240.
https://doi.org/10.1504/IJLT.2017.088406

[54]   Watted, A., & Barak, M. (2018). Motivating Factors of MOOC Completers: Comparing between University-Affiliated Students and General Participants. The Internet and Higher Education, 37, 11-20.
https://doi.org/10.1016/j.iheduc.2017.12.001

[55]   Whitmer, J., Schiorring, E., & James, P. (2014). Patterns of Persistence: What Engages Students in a Remedial English Writing MOOC? In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 279-280). New York: ACM.
https://doi.org/10.1145/2567574.2567601

[56]   Wilkowski, J., Deutsch, A., & Russell, D. M. (2014). Student Skill and Goal Achievement in the Mapping with Google MOOC. In Proceedings of the First ACM Conference on Learning@ Scale Conference (pp. 3-10). New York: ACM.
https://doi.org/10.1145/2556325.2566240

[57]   Yousef, A. M. F., Chatti, M. A., Schroeder, U., & Wosnitza, M. (2014). What Drives a Successful MOOC? An Empirical Examination of Criteria to Assure Design Quality of MOOCs. 2014 IEEE 14th International Conference on Advanced Learning Technologies, Athens, 7-10 July 2014, 44-48.
https://doi.org/10.1109/ICALT.2014.23

[58]   Yuan, L., & Powell, S. (2013). MOOCs and Open Education: Implications for Higher Education. Glasgow: JISC CETIS.

[59]   Zimmerman, B. (1990). Self-Regulated Learning and Academic Achievement: An Overview. Educational Psychologist, 25, 3-17.
https://doi.org/10.1207/s15326985ep2501_2

[60]   Zutshi, S., O’Hare, S., & Rodafinos, A. (2013). Experiences in MOOCs: The Perspective of Students. American Journal of Distance Education, 27, 218-227.
https://doi.org/10.1080/08923647.2013.838067
