Received 27 April 2016; accepted 13 June 2016; published 16 June 2016
As a native speaker of English, I often think about my own writing process, and how I learned to express myself with words to the level of proficiency required of a graduate student in Canada. I think back to my early days as a university student in my late teens, and how I loathed the “red pen” and the feelings of rejection it represented. An English Literature major then, I would become emotionally attached to my pieces of writing, and when they were returned as if vandalized by the instructor, I was so distraught and afraid that I couldn’t even look at, let alone read, the comments. When I read my essay before it had been scrawled upon in red, I could only see my own hard work and long hours. After focusing on the same piece of writing for several weeks or months, I had formed an indescribable attachment to it. The essay had become my friend: a work of art, and when a teacher defaced it, I became defensive, agitated, and embarrassed.
It was only after a certain amount of personal and emotional maturity that I was able to read the feedback and suddenly realize that it had significance. I finally understood that my professors knew what they were talking about, and if I read the suggestions and incorporated the changes into my writing, I might actually improve. I cannot pinpoint the exact moment that I was able to do this, but something inside my brain just clicked.
The professors who took the time to explain their comments really helped me to understand where I needed improvement, and since most of this feedback was content-based, it is important to note that my experience involved writing in my own native language. Because I was a native speaker who was not taught grammar directly in school, I didn’t fully understand grammar corrections until I enrolled in linguistics courses later on. Also, when teachers explained their comments to me, it helped me understand points that I could not decode due to messy handwriting.
2. A Review of the Related Literature
In the Journal of Computer Assisted Learning in 2012, A. F. AbuSeileek from Al al-Bayt University, Mafraq, Jordan, published an article entitled “Using track changes and word processor to provide corrective feedback to learners in writing.” His research involved 64 EFL learners who were randomly assigned to three groups. Two of the groups received writing feedback with MS Track Changes, one of which also used the word processor. The other group received no feedback. Results showed a decrease in 11 major error types in immediate and delayed post-tests for the groups that received feedback, with the most improvement found in the group receiving both MS Track Changes and word-processed feedback, while the control group saw the least improvement.
Shiou-Wen Yeh and Jia-Jiunn Lo (2009) examine how errors can be tackled using a computerized online error correction and corrective feedback system that they developed themselves, called the Online Annotator for EFL Writing System. The system consists of five modules: Document Maker, Annotation Editor, Composer, Error Analyzer, and Viewer. They found that students who used the system were able to find and identify more of their errors than students who did not, which helped them learn from, and correct, the common grammatical errors of EFL writers. The study does not address student or teacher attitudes toward computer-mediated error correction, nor does it examine the long-term effects of the program on the overall improvement of writing.
Ellis (2009) provides instructors with valuable descriptions of six types of linguistic error feedback for writing. He identifies them as: Direct CF (where the teacher provides the student with the correct form), Indirect CF (where the teacher indicates that there is an error without telling the student what it is), Metalinguistic CF (where the instructor provides a clue), Unfocused and Focused CF (where the instructor corrects all of the errors or just specific ones), Electronic Feedback (where he examines only one error correction program, called “Mark My Words”), and Reformulation (where a native speaker rewrites the entire text using the student’s ideas). This article is useful in that it identifies the key types of feedback and the major researchers associated with them, and it serves as a good introduction to the topic of linguistic error correction in writing.
In their article published in the International Journal of English Language Studies (2010), Evans, Hartshorn, and Tuioti examine the attitudes of over 1000 teachers in 69 different countries toward written corrective feedback. Data were collected using surveys. Evans asks two central questions: “a) To what extent do current L2 writing teachers provide WCF? And b) What determines whether or not practitioners choose to provide WCF?” (p. 53).
The researchers concluded that the majority of teachers seem to genuinely want their students’ writing to improve, and that they use “principled pragmatism” (p. 65) when providing WCF.
In 1996, John Truscott made an extraordinary claim that shook up the world of L2 instruction. His assertion that grammar error correction of writing was not only ineffective in helping ESL students learn English, but might in fact be harmful, was groundbreaking in the field. However, it is important to keep in mind that this claim followed an era of anti-grammar instruction in schools, exemplified by the Plowden Committee’s work on creating more “progressive”, inquiry-based learning in Britain in the late 1960s, which influenced Canadian education.
The Economist published an article in June of 1998 regarding the Plowden Committee’s rationalization of loosening the British school system’s emphasis on rote learning in favour of inquiry-based learning. The article analyzes the developments that took place during this educational reform and criticizes it for the eventual “lowering of standards” that followed, evidenced by a series of standardized tests given to students post-reform showing that the students’ academic performance notably dwindled.
In this article, Wilson, Olinghouse, and Andrada  examine the effects of an AES (automated essay scoring system) on the performance of students in grades 4 - 8 who participated in a “statewide classroom benchmark writing assessment” in 2012. Their main questions were: 1) Would they observe improvement in writing over successive prompts due to the automated feedback? 2) How might grade level, gender, socio-economic status, and prior writing achievement affect their scores? 3) Would the automated feedback result in improved first draft results, or improved post-draft results?
This article was also useful because it outlines the different kinds of writing feedback that are given by both humans and AES systems: task-focused, procedural, strategic, and metacognitive.
Rami Mustafa notes that facets of sociocultural theory need to be applied when it comes to success with Saudi learners, who, for cultural reasons, prefer social and aural modes of feedback from their instructors. This challenges preconceptions we might have about writing feedback being mostly print-based, and opens further possibilities for providing feedback both in person and through recorded oral feedback using a technological device.
Bitchener studied 144 low-intermediate students over a two-month period, examining the efficacy of WCF (written corrective feedback) over time, whether certain methods used by L2 teachers were effective at tackling specific grammatical errors, and the overall effectiveness of WCF for international and migrant students specifically. The research questions focused on the usage of two functions of the English article system (a, an, the), and there were four groups (three treatment, and one control). The findings indicate that WCF had a profound impact on the students’ comprehension and correct usage of articles, and the three groups who received the different kinds of WCF did better than the control group that received none.
This research is interesting because it is focused on one specific error; however, this research fails to obtain information regarding student and teacher attitudes toward written corrective feedback.
In her research, completed at the University of Texas at San Antonio, Shannon Sauro looks at two kinds of computer-mediated feedback on L2 learner development: feedback that reformulates the error as a recast, and feedback that provides the learner with metalinguistic information about the error. The students were high-intermediate and advanced adults from a Swedish university. There were three groups, two receiving feedback and one control group. Students were randomly assigned native speakers who worked with them in an online text-chat format. Her results showed an improvement in the target article error for both feedback groups, and neither seemed to benefit more in the long term. However, the metalinguistic group fared better than the control group in terms of short-term gains, which she feels shows that a grammatical explanation helps students in terms of their short-term memory and subsequent understanding of grammatical L2 errors. She recommends further research to ascertain which is more effective: oral, written, face-to-face, or computer-mediated feedback.
In her research, Ferris  began her study based on two leading questions:
1) What characteristics of teacher commentary appear to influence revision?
2) Do revisions influenced by teacher feedback lead to substantive and effective changes in students’ papers?
She examined 220 papers and the revisions made in a 15-week advanced university ESL course. There were 47 students in 3 sections, and she studied 1600 marginal and end comments that were handwritten by the teachers of each section in the course. She used the constant comparative method of analysis (Glaser & Strauss, 1967) to analyze the comments. Her findings were that requests for information, general requests, and comments on form were the most successful in terms of subsequent revisions. She also found that comments on grammar were very effective. On the other hand, questions or statements that provided information to students, and positive comments, did not seem to have any effect on improvements in writing. In summary, longer, text-specific comments that focused on both content and form were the most successful in improving student writing. This study helps me in my research in that it focuses on writing feedback in an upper-level university ESL class, which is the same level that I am studying here in the ELC at VIU. It is also interesting because it focuses on handwritten feedback, which is the type of feedback that most instructors of AP5 employ. This study does not, however, address student or teacher attitudes toward feedback styles, which is the focus of my research.
In their study, Seker and Dincer (2014)  analyze student attitudes toward writing feedback in second language classes at a university in Turkey.
The study shows that students prefer feedback focused on content, form, and organizational aspects of their writing. It was also shown that timely feedback is important. When the students felt positive about the feedback, they were more likely to respond immediately by making the necessary changes, whereas when they felt negative, they were less likely to respond to the feedback. The researchers strongly suggest that teachers follow the opinions and feelings of the students in order to individually tailor their assessment and feedback styles to their students’ individual needs and preferences.
In his article, Ron Martinez, a PhD, explains his strategy of screencasting his writing feedback to students. His article is not research, but rather a strategy that he found to be successful with his ESL writing students and which he felt compelled to share with his colleagues. Using Screencast-O-Matic software, he had his students electronically send him their typed essays, and then recorded himself talking whilst marking up the essays on screen. It is unclear whether he used MS Track Changes or simply the MS Word highlighting tool, but he goes through the entire essay with the students in a downloadable YouTube video format. Although not less time consuming, he found that his ESL students considered this process advantageous because they were able to pause or replay the video as many times as they needed in case they could not understand the English (since English is not their first language). He also found it comparable to personal interviews, but less constraining in that he did not need to be physically present during office hours (even though he was spending as much time at it, he could do so from the comfort of his own home). Students gave him very positive feedback about the process. This aligns with my research interests because it deals with writing feedback and promotes alternate modalities. It is also in line with Seker and Dincer’s assertion that writing feedback needs to be tailored toward individual students’ needs.
This small study took place in the English Learning Centre (ELC) at Vancouver Island University (VIU). VIU is a degree-granting university located in Nanaimo, on Vancouver Island. The English Learning Centre is part of the Faculty of International Education, and ESL students come here to study pre-academic English language skills in order to become ready for their university courses. Sixty-one AP5 students are currently enrolled in our ELC. I also surveyed the seven AP5 instructors.
My research focuses on the attitudes of the instructors and students of the course. Once students complete and pass this 13-week course, they are granted access into academic studies. The course is also worth 3 credits toward their university degrees. The reason I chose to study this particular level is that the students’ English language skills should be advanced enough to understand and complete the survey questions. Also, the students complete a major research paper in this course, and therefore receive more feedback on their writing.
The Associate Dean, Norma MacSween, gave her permission for the research to take place here, and the Research Ethics Board (REB) granted ethics approval for this project. I am an ESL instructor in this program, and have been a regular faculty member since 2009. A link to the FluidSurveys survey questions was given to instructors and students in the program.
4. Statement of the Problem
5. Statement of Hypothesis
The purpose of this study is to find out what student and teacher attitudes are toward handwritten corrective feedback compared to typed corrective feedback using MS Track Changes. I also survey whether oral or recorded oral feedback is given, and whether students prefer handwritten, typed, or spoken feedback.
I would like to determine whether age is a correlative factor in whether instructors prefer typed or handwritten feedback. Similarly, I wonder if age affects a student’s ability to appreciate feedback and to ascertain its effectiveness. Are there any correlations between gender and preferred feedback styles? I also hope to identify some of the shortcomings and benefits of handwritten feedback compared to typed corrective feedback using MS Track Changes (or other computer programs), as well as oral feedback, so that as a faculty we can better serve our students.
Sixty-one students from four sections of AP5 classes in the ELC at Vancouver Island University were surveyed. AP5 is the highest level of the program; it is worth university credit and is required for entrance into academic studies at VIU. The students are from all over the world. Seven instructors were surveyed as well. Most of the instructors are over 40, with one instructor below 30. The requirement to work in the ELC is a Bachelor’s degree plus a TESL diploma, but a Master’s is preferred, and most instructors have Master’s degrees.
FluidSurveys survey-making software was used, with anonymity of student and teacher information preserved.
Survey questions (Appendix A) were used. Once ethics approval was received, the teachers and students were sent the survey link and were urged not to provide personal information in their responses. A qualitative and quantitative analysis of the data was then performed.
I surveyed sixty-one AP5 students, and seven AP5 instructors. The numbers constitute a small study, which can be considered a preliminary research project that will hopefully lead to further, larger studies.
Out of the 61 students, there were 38 students aged 18 - 22, 17 students aged 23 - 26, 3 students aged 27 - 30, and 3 students who were over age 30 (Figure 1). There were 30 female and 31 male students. Because the majority of the students were in the same age category, there did not appear to be a correlation between age and the length of comments provided by the students in response to the survey questions. Some students took more time and were more thoughtful in their comments, so this seems to be an individual and personal factor.
Of the students who gave thoughtful and lengthy commentary on their survey questions, some of the interesting points that came up were that MS Word and Track Changes made the students lazy and prevented them from learning how to spell. Similarly, MS Track Changes took time to learn how to use as a tool, and they did not have time to learn it. One student found “Google Drive” very useful for sending papers to group members for group editing. The students who preferred typed feedback, and who took the time to supply a reason why, explained that they could not understand their teacher’s handwriting. The students who liked handwritten feedback remarked that they could see the mistakes clearly because they were underlined or crossed out. One student wrote: “I prefer handwritten feedback because typed may sometimes [sic] seem unclear and just like any normal printout in black and white, it doesn’t show you exactly where you made your mistakes.”

Figure 1. The average ages of students in AP5.
In total, three female instructors preferred to give handwritten feedback, while three female instructors and one male instructor preferred to give both handwritten and typed feedback, and four female instructors and one male instructor liked to include oral feedback, in the form of one-on-one, face-to-face interviews, alongside their other preferred mode(s) of feedback (Figure 4). Three instructors explained that the one-on-one interviews enabled them to explain things in more depth to the students, such as grammar points and content/organizational factors, which helped clarify things for the students while enabling them to ask important questions about their writing. Time and workload were the major constraints that instructors felt impeded their ability to provide more feedback and additional feedback modes.

Figure 3. Ages of instructors. Seven instructors were surveyed; six were female and one was male, so gender could not be treated as a consequential factor given the imbalance.
Studies focusing on the acquisition of articles in writing (Sauro, 2009) have shown that focused corrective feedback is much more useful than unfocused feedback, which counters Truscott’s (1996) claim that such feedback is useless.
Direct corrective feedback (CF) entails crossing out the error and placing the correct structure over it. Then, using metalinguistic feedback, the teacher explicitly explains the grammar rule associated with the error to the student, which can be done orally or in writing. Indirect feedback points to the error, either by underlining it or providing a symbol, and the student then has to figure it out; this is better than no feedback at all, yet not as effective as direct CF, according to Bitchener and Knoch (2008) and Ellis, Sheen, Murakami, and Takashima (2008).
MS Track Changes allows the instructor to apply typed corrective feedback to a piece of writing. It does not, however, have error correction symbols that teachers can readily insert over a mistake. Does the lack of this feature in MS Track Changes detract from its efficacy? Although editing programs such as the Online Annotator for EFL Writing System, developed by Yeh and Lo (2009), are available, these programs require intensive training for instructors and students in order to be used effectively, which is problematic when instructors and students have accelerated schedules and are constrained both in time and economically.
Although it is interesting to discover which systems help students tackle certain grammatical errors the best, few studies actually examine the attitudes of instructors or students to these different modes of feedback systems. For example, few studies have examined student and teacher attitudes toward handwritten feedback compared to typed, or oral feedback.
AbuSeileek (2012) found that MS Track Changes helped students correct eleven major errors through WCF provided by their instructors; however, he did not examine instructor and student attitudes toward it. Similarly, Sauro (2009) found that computer-mediated feedback through text chat helped adult Swedish students tackle article usage through corrective feedback in the form of recasts and typed metalinguistic explanation; however, she did not survey the students or instructors to find out their attitudes toward the different types of feedback. Sauro’s aim was to discover whether typed, oral, or face-to-face feedback was more effective.
Similarly, Ferris (1997) examined whether teacher commentary improved the writing of ESL students, but did not analyze attitudes; and although Evans, Hartshorn, and Tuioti (2010) examined practitioners’ perspectives, they did not analyze students’ perspectives.
Furthermore, Mustafa (2011), who examined Saudi students’ attitudes toward feedback, pointed out that systems of sociocultural theory need to be applied when giving writing feedback, but he did not analyze teachers’ perspectives.
Therefore, it is essential that practitioners look at student and teacher attitudes toward these different feedback modes in order to ascertain their efficacy, not only from the perspective of whether they improve grammatical accuracy, but also from the perspective of whether these forms of feedback are preferred by the teachers and ESL students themselves.
9. Limitations of Study
The study is limited in size. I would like to have been able to survey more students in general, as well as more male instructors, but due to economic factors and enrollment rates, I could not. Similarly, it would be interesting to subdivide the students into sociocultural groupings to see if there is a correlation between nationality and preferred feedback styles. It would also be useful to conduct interviews so that some of the questions could be clarified for students who did not fully understand what the survey questions were eliciting.
Questionnaire Regarding Writing Feedback
1) Which level are you a student of in the ELC?
2) Which age range do you fall within?
18 - 22 23 - 26 27 - 30 Over 30
3) Please indicate your gender:
4) Do you receive writing feedback from your teacher?
5) Do you use MS Track Changes, or another editing program, for your writing? If so, how is it used in your class?
6) In your personal opinion, what are some of the advantages and disadvantages of MS track changes/or the computer program used?
7) Do you usually receive handwritten or typed feedback about your writing from your instructors?
8) Do you prefer handwritten or typed feedback about your writing? Why?
9) Do you think your teacher’s feedback helps you with your writing?
10) Do you receive oral, or spoken feedback from your teacher about your writing?
11) Which type of feedback about your writing do you prefer? (please circle)
12) In your personal opinion, what is the best kind of writing feedback that you would like to receive from your teacher to help improve your writing?
Questionnaire Regarding Writing Feedback
This questionnaire was designed to elicit information regarding writing feedback, in order to ascertain whether teachers prefer providing feedback by hand, or through MS Track Changes/another computer program. It was also designed to gather information about the types of feedback that writing teachers provide the most, and their opinions about it. Students will be given a similar survey.
No names will be permitted on the survey, and responses will be typed so that students and teachers cannot be identified by their handwriting.
1) Which level do you teach in the ELC? (this survey is only for AP5 Teachers)
2) Which age range do you fall within?
20 - 30 30 - 40 40 - 50 50 - 60
3) Please indicate your gender:
4) Do you provide writing feedback to your students?
5) Do you use MS track changes/or another computer program to provide feedback?
6) If so, what do you use MS track changes/another computer program for in your writing section?
7) In your professional opinion, what are some of the advantages and disadvantages of MS track changes/ computer program?
8) Do you provide handwritten/hand printed feedback to your students?
9) What kind of handwritten/hand printed feedback do you provide to your students? (For example: written corrective, error correction symbols, etc.)
10) In your professional opinion, what are some of the advantages and disadvantages of handwritten feedback?
11) Do you feel that your feedback benefits your students? Please explain.
12) Do you provide oral feedback to your students regarding their writing?
13) What kind of oral feedback do you provide for your students regarding their writing? (For example: face to face, voice recorded, etc.)
14) In your professional opinion, which kinds of feedback do you think help your students with their writing the most?
15) Can you identify anything specifically that you find impedes your ability to provide writing feedback?