research and teaching
Supporting Improved Writing Skills in Environmental Chemistry
By Dulani Samarasekara, Todd Mlsna, and Deb Mlsna
Problem solving, critical thinking, data interpretation, and oral and written communication are among the most essential skills that undergraduates need to practice. Of these, writing is often neglected in a typical STEM curriculum. Students who continue on to graduate school or begin careers as scientists will be expected to write scientific reports (Gragson & Hagen, 2009). STEM undergraduates often have relatively few opportunities to write scientific reports; consequently, the writing of recently graduated students is often poor, with a general unawareness of the requirements for clear scientific text (Guilford, 2001; Walker & Sampson, 2013). Moreover, undergraduates may have limited opportunities to review and critique scientific papers, which leads to lower confidence in their writing abilities (Walker & Sampson, 2013). Here we introduce a modified peer assessment approach (Glaser, 2014; Guilford, 2001; Ricker & Whelan, 2016) designed to strengthen undergraduate students' scientific writing and encourage critical reflection on their own work. Our study shows that the modified peer assessment process provides a valid mechanism for students to improve their writing and practice critical analysis of their work.
The use of peer review, both online and on campus, has proven effective in supporting enhanced writing skills for undergraduate students. Many peer review formats use online essays to train students to edit critically and understand assignment goals (Boase-Jelinek et al., 2013; Dominguez et al., 2015; Gunersel et al., 2012; Miyazoe & Anderson, 2010; Novakovich, 2016; Zwicky & Hands, 2015). When used well, peer review of essays serves several purposes: student reviewers gain experience editing and providing constructive feedback on a piece of writing, student authors receive comments from diverse perspectives as multiple editors give feedback, and instructors can reduce the grading burden associated with editing in large-enrollment classes (Boase-Jelinek et al., 2013; Guilford, 2001; Huisman et al., 2018).
A challenge with the peer review process, however, is supporting students in providing quality feedback (Kulkarni et al., 2015a; Kulkarni et al., 2015b; Yuan et al., 2016). Poor student edits short-change the process: authors are less likely to improve their work and may develop a false sense of confidence based on ineffective feedback (Russell, 2004). Several attempts have been made in the literature to improve the quality of student feedback, including providing common feedback phrases for quick use by the editing student (Kulkarni et al., 2015b), offering interactive hints to help students stay on track (Krause et al., 2017), and designing grading rubrics with care (Hicks et al., 2016).
This research study focused on the incorporation of a Response to Reviewer Comments document, which allowed students to critically review the peer edits received on their writing and determine whether changes were warranted to improve their reports. Students' editing decisions have been shown to depend on the perceived competence of the peer feedback they receive (Berndt et al., 2018; Strijbos et al., 2010). The process of critically evaluating suggested edits can potentially improve student ownership of their learning and help them develop critical-reflection skills (Thomas et al., 2011). We sought to answer the following research questions with this study: 1. Are peer edits and feedback sufficient to improve student writing? 2. Does editing peer reports help students improve their own writing? 3. Does the Response to Reviewer Comments document encourage students to critically evaluate their own writing?
Peer review writing assignments were incorporated into three sections of Environmental Chemistry at our institution in Spring 2015, 2017, and 2018. Most of the students enrolled in the class were Chemistry or Chemical Engineering majors and were typically junior- or senior-level students (see Table 1). The model used was a modification of the published Calibrated Peer Review protocol, with the edits and reviews occurring in class instead of in an online format (Chapman, 1999). In addition, the Response to Reviewer Comments document was added to encourage each student to think critically about their writing and the suggested peer edits.
Table 1. Student demographics including sample size, gender, major, and academic year by class.
This peer review assignment was implemented in conjunction with a university-wide initiative known as the Maroon and Write Quality Enhancement Plan. Maroon and Write is a comprehensive university model instituted in 2014 designed to improve undergraduate student writing through an implementation of writing across the curriculum, the use of write-to-learn strategies, and formal writing instruction. This peer review writing assignment supported upper-division writing needs in the chemical sciences.
The peer review assignment was designed as follows:
1. Students had the opportunity to "train" on essays before writing their own literature review, with examples of high-quality and low-quality essays provided on the class website for review. In addition, students were given detailed instructions and examples of peer-editing-style feedback, with discussion of the types of edits and approaches that could be taken.
2. Each student wrote an original literature review of a current topic in environmental science. Sample essays illustrating literature review structure and content were available. The literature review was approximately 1,500 words and cited at least four original scientific papers. This original assignment was turned in to the instructor for review and grading. This initial grade was not seen by the student and was used only for this research study.
3. Each student reviewed and edited three papers in class, spending approximately 20 minutes per peer essay. Papers had names and identifying information removed. Edits and grading rubrics were handwritten on papers and returned to the instructor for redistribution to the original authors. Students were instructed to keep the average grade they awarded across their peer reviews at or below 85. This requirement was designed to discourage students from simply giving everyone high grades and thereby eliminating critical review.
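The study distributed essays and reviewers through an anonymous numeric system, with each student reviewing three papers. As a minimal sketch (not the authors' actual system), a cyclic rotation over anonymous paper IDs guarantees that each paper receives exactly three reviews and no student ever reviews their own paper; the function name and scheme below are illustrative assumptions.

```python
# Hypothetical sketch of anonymous peer-review assignment: a cyclic
# rotation over numeric paper IDs. This is an assumed scheme for
# illustration, not the authors' actual distribution method.

def assign_reviews(num_students: int, reviews_each: int = 3):
    """Map each paper (anonymous ID 0..n-1) to its reviewers.

    Paper i is reviewed by students i+1, i+2, ..., i+k (mod n),
    so every student both gives and receives exactly k reviews,
    and no one reviews their own paper.
    """
    if reviews_each >= num_students:
        raise ValueError("need more students than reviews per paper")
    return {
        paper: [(paper + offset) % num_students
                for offset in range(1, reviews_each + 1)]
        for paper in range(num_students)
    }

assignments = assign_reviews(5)
# No student appears in the reviewer list of their own paper.
assert all(paper not in reviewers for paper, reviewers in assignments.items())
```

The rotation is deterministic; in practice the instructor could shuffle the ID assignment each term so pairings are not predictable.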
Anonymous peer edits were returned to the original author. After student edits were addressed and deficiencies improved, a final literature review was turned in to the instructor for grading.
Each student also turned in a Response to Reviewer Comments document, which detailed the important peer edits and explained how each suggestion was addressed. This document was intended to have each author think critically about the peer edits received and determine whether the suggestions were warranted, or to explain why they were not. It was included to promote critical thinking by each author about their own writing and to enhance ownership of their writing decisions.
Grades for each student were awarded as follows: 65% from the final literature review turned in after peer edits; 15% from the quality of the edits they made on other students' papers; and 20% from their Response to Reviewer Comments document. The Response to Reviewer Comments document was graded on how thoroughly students addressed reviewer suggestions and determined edit suitability. Essays and peer editors were assigned anonymously through a numeric system. All identifying information was removed from essays before papers were given to student editors to preserve the anonymity of review. Student editors were also kept anonymous from each author. All student papers and edit comments were scanned and kept for instructor assessment of the process.
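The weighting scheme above can be made concrete with a short worked example; the component scores below are invented for illustration.

```python
# Worked example of the grade weighting described in the study:
# 65% final literature review, 15% quality of edits given to peers,
# 20% Response to Reviewer Comments document.

def course_grade(final_review: float, edits_given: float, response_doc: float) -> float:
    """Combine the three assignment components (each on a 0-100 scale)
    using the study's weights of 65% / 15% / 20%."""
    return 0.65 * final_review + 0.15 * edits_given + 0.20 * response_doc

# Invented example: 80 on the final review, 90 on edits given, 70 on the
# response document: 0.65*80 + 0.15*90 + 0.20*70 = 52 + 13.5 + 14 = 79.5
print(round(course_grade(80, 90, 70), 1))
```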
Each paper was graded using a rubric developed from the American Chemical Society (ACS) Style Guide (Coghill & Garson, 2006). Students were tasked with reviewing papers in four categories: appropriate citation of references; correct use of citations in the essay; grammar, spelling, and neatness of work; and the overall content of the essay topic. Grades were based on a 100-point total, and student marks were recorded for each section. An example Peer Review Grading Rubric is included in the appendix.
Our initial research question asked whether peer edits and feedback are sufficient to improve student writing. In this study, report grades were assigned based on a grading rubric with four evaluation criteria: works cited; using cited works; grammar, spelling, and neatness; and content. Students' pre- and post-revision total essay grades and the four rubric category grades from the instructor were analyzed using a paired-sample t-test at the 95% confidence level. Results showed that students' report grades significantly improved after the peer-editing process, with total essay grade: t(91) = -16.3, p < .001, d = 1.6; works cited: t(91) = -11.9, p < .001, d = 1.2; using cited works: t(91) = -9.2, p < .001, d = 1.0; grammar, spelling, and neatness: t(91) = -9.0, p < .001, d = 0.9; content: t(91) = -12.3, p < .001, d = 1.3. Students' pre- and post-revision report grade percentages are displayed in Figure 1. Students were able to improve their report quality after incorporating peer edits, with average essay grades improving from 58% to 70% after student revisions.
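For readers unfamiliar with the statistics reported above, the paired-sample t statistic and Cohen's d effect size can be computed directly from the pre/post grade differences. The sketch below uses only the Python standard library; it is not the authors' analysis script, and the grade lists are invented for illustration.

```python
# Minimal sketch of a paired-sample t-test and Cohen's d for paired data.
# Not the study's actual analysis; the pre/post grades below are invented.
import math
from statistics import mean, stdev

def paired_t_and_d(pre, post):
    """Return (t statistic, Cohen's d) for paired samples.

    With diffs = post - pre:
      t = mean(diffs) / (sd(diffs) / sqrt(n))
      d = mean(diffs) / sd(diffs)
    (The study reports negative t values because it takes pre - post.)
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    m, s = mean(diffs), stdev(diffs)  # sample standard deviation
    return m / (s / math.sqrt(n)), m / s

# Invented pre- and post-revision essay grades for eight students.
pre = [55, 60, 58, 62, 50, 57, 65, 48]
post = [68, 59, 70, 75, 52, 71, 70, 60]
t, d = paired_t_and_d(pre, post)
print(f"t = {t:.2f}, d = {d:.2f}")
```

With a real sample of n = 92 (degrees of freedom 91, as in the study), p-values would be read from the t distribution, e.g. via `scipy.stats.ttest_rel`.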
To further study how successfully students edited and graded their assigned essays, a paired-sample t-test was conducted between the instructor's pre-revision report grades and the averaged peer report grades. Results showed significant differences between the groups in all areas, with total essay grade: t(91) = 13.1, p < .001, d = 1.4; works cited: t(91) = 7.0, p < .001, d = 0.7; using cited works: t(91) = 8.1, p < .001, d = 0.8; grammar, spelling, and neatness: t(91) = 4.5, p < .001, d = 0.5; content: t(91) = 13.0, p < .001, d = 1.4. These significant differences indicate that the peer grades did not correlate well with the instructor's pre-revision grades; we observed that peer grades were consistently higher than the instructor's. However, student edits were still sufficient to improve the overall quality of reports. Paired-sample correlations are given in Table 2. Significant positive correlations for the variables "works cited" and "grammar, spelling, and neatness" indicate that students who received higher marks from their peers also received higher grades from the instructor. A scatter plot of the initial total report grades from the peers and the instructor is given in Figure 2.
Table 2. Results of the paired-sample correlation of the peer average and instructor initial report grades.
In addition, to understand student perceptions of essay improvement due to peer edits, responses to two survey questions, "I found the reviewer comments I received helpful" and "I felt my paper improved as a result of the feedback I received," were analyzed. Survey response percentages are given in Figure 3. For this analysis, the Likert responses "agree" and "strongly agree" were combined, as were "disagree" and "strongly disagree."
Among all students, 55% said that the comments they received from their peers were helpful; only 16% said they were not. Most importantly, 72% of students thought their papers improved as a result of the peer feedback. To determine whether report grades were enhanced for students who found reviewer comments helpful, average grade differences (final instructor report grade minus initial instructor report grade) were plotted against the three survey response categories of agree, neutral, and disagree. Results are given in Figure 4. Grade improvements did not differ significantly among the groups, as all groups improved.
Our second research question asked whether editing peer reports helped students improve their own writing. This was addressed by evaluating student perceptions on two survey questions: "Reading other papers helped me understand what the assignment should look like" and "Reading other papers gave me ideas for things I could change in my own paper." In the analysis, the Likert responses "strongly disagree" and "disagree" were merged, as were "strongly agree" and "agree." Survey response percentages are shown in Figure 5. Approximately 60% of students thought that reading other students' papers helped them understand the assignment and gave them ideas to improve their own reports.
Average grade differences (final instructor report grade minus initial instructor report grade) were plotted against the three survey response categories of agree, neutral, and disagree to evaluate whether student writing improved as a result of reading other papers (Figure 6). The average grade improvements among the three categories were not significantly different. Students nonetheless responded strongly to the survey questions, indicating they believed the process helped their writing.
Our third research question asked whether the Response to Reviewer Comments document helped students critically evaluate their own work. Overall, the quality of the Response to Reviewer Comments documents was poor, as students accepted all peer-edit suggestions as changes to be made. In our view, therefore, many students did not actually use the Response to Reviewer Comments document to critically evaluate their own work. Instructor grades for the quality of the Response to Reviewer Comments document correlated with students' final report grades (Pearson correlation .467, p < .001): students who critically considered the peer edits and addressed comments in detail improved their essay grades further. Results are shown in Figure 7. Improved training with the Response to Reviewer Comments document is needed to support students in critically evaluating peer edits and appropriately defending their writing choices. Limited writing confidence may currently be influencing these choices.
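The Pearson correlation reported above can be computed directly from its definition. The stdlib-only sketch below is illustrative; the two grade lists are invented and do not reproduce the study's r = .467.

```python
# Minimal sketch of the Pearson correlation coefficient, computed from
# its definition. The grade lists below are invented for illustration.
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson r: covariance of x and y divided by the product of
    their standard deviations (scaling factors cancel, so sums suffice)."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented response-document grades vs. final report grades for six students.
resp = [70, 85, 60, 90, 75, 80]
final = [72, 88, 70, 85, 71, 84]
print(round(pearson_r(resp, final), 3))
```

Python 3.10+ also provides `statistics.correlation` for the same quantity; the explicit version is shown here to make the formula visible.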
Table 3. Student feedback on survey questions and additional comments.
There are some limitations to this study. We focused on a single essay written within the course, which limits our conclusions about writing improvement. Multiple assignments would allow us to determine whether student writing skills improved over the course of the semester with feedback. In addition, we did not directly compare the student edits made on each assignment with instructor edits; doing so could support further exploration of the impact and efficacy of student edits. However, other researchers have found that edits from multiple peers contributed more to essay improvement than a single expert opinion (Cho & MacArthur, 2010).
This peer editing exercise did improve student writing on the technical writing assignment and encouraged students to evaluate their own writing through peer feedback. Analysis of student edits showed that peer editing was sufficient to improve essay quality, and the feedback was appreciated by students undergoing review. Student edits correlated with instructor grades most closely on works cited and grammar, with student edits focusing primarily on small, discrete suggestions rather than large, conceptual improvements. Students also valued the ability to read peer essays and maintained that reading other essays improved their own work. The majority of students appreciated the editing exercise and indicated that they would seek peer edits on their own before their next writing assignment. The Response to Reviewer Comments document was included in the editing exercise to encourage students to think critically about their own work and analyze whether editing suggestions were worthwhile. Overall, most students did not critically defend their writing but instead incorporated all peer edits into their work. Students in this upper-division science course need more support in thinking critically about and defending their own writing, as many were not confident enough in their essays to defend their writing choices.
Overall, our results support the inclusion of peer edits as part of a writing assignment for students learning technical writing. Instructors can implement peer editing as a review cycle within assignments to improve student performance. The peer-editing process reduces instructor grading load, and our results indicate that students gained as much benefit from reading peer essays as from receiving student edits. Future work on this approach includes incorporating several peer-editing cycles to allow students the opportunity to improve their overall writing skills. Students also need further focus on critical evaluation of their own work to fully use the Response to Reviewer Comments document.
Berndt M., Strijbos J.-W., & Fischer F. (2018). Effects of written peer-feedback content and sender’s competence on perceptions, performance, and mindful cognitive processing. European Journal of Psychology of Education, 33(1), 31–49.
Boase-Jelinek D., Parker J., & Herrington J. (2013). Student reflection and learning through peer reviews. Issues in Educational Research, 23(2), 119–131.
Chapman O. (1999). Calibrated Peer Review™: An overview.
Cho K., & MacArthur C. (2010). Student revision with peer and expert reviewing. Learning and Instruction, 20(4), 328–338.
Coghill A. M., & Garson L. (2006). The ACS style guide (Vol. 3). Oxford University Press and the American Chemical Society.
Dominguez C., Nascimento M. M., Payan-Carreira R., Cruz G., Silva H., Lopes J., Morais M., & Morais E. (2015). Adding value to the learning process by online peer review activities: Towards the elaboration of a methodology to promote critical thinking in future engineers. European Journal of Engineering Education, 40(5), 573–591.
Glaser R. E. (2014). Design and assessment of an assignment-based curriculum to teach scientific writing and scientific peer review. Journal of Learning Design, 7(2), 85–104.
Gragson D. E., & Hagen J. P. (2009). Developing technical writing skills in the physical chemistry laboratory: A progressive approach employing peer review. Journal of Chemical Education, 87(1), 62–65.
Guilford W. H. (2001). Teaching peer review and the process of scientific writing. Advances in Physiology Education, 25(3), 167–175.
Gunersel A. B., Simpson N. J., Aufderheide K. J., & Wang L. (2012). Effectiveness of Calibrated Peer Review™ for improving writing and critical thinking skills in biology undergraduate students. Journal of the Scholarship of Teaching and Learning, 8(2), 25–37.
Hicks C. M., Pandey V., Fraser C. A., & Klemmer S. (2016). Framing feedback: Choosing review environment features that support high quality peer assessment. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 458–469.
Huisman B., Saab N., van Driel J., & van den Broek P. (2018). Peer feedback on academic writing: Undergraduate students’ peer feedback role, peer feedback perceptions and essay performance. Assessment & Evaluation in Higher Education, 1–14.
Krause M., Garncarz T., Song J., Gerber E. M., Bailey B. P., & Dow S. P. (2017). Critique style guide: Improving crowdsourced design feedback with a natural language model. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 4627–4639). Association for Computing Machinery.
Kulkarni C. E., Bernstein M. S., & Klemmer S. R. (2015a). PeerStudio: Rapid peer feedback emphasizes revision and improves performance. Proceedings of the Second (2015) ACM Conference on Learning@Scale (pp. 75–84). Association for Computing Machinery.
Kulkarni C., Wei K. P., Le H., Chia D., Papadopoulos K., Cheng J., Koller D., & Klemmer S. R. (2015b). Peer and self assessment in massive online classes. ACM Transactions on Computer-Human Interaction (TOCHI), 20(6), 1–33.
Miyazoe T., & Anderson T. (2010). Learning outcomes and students’ perceptions of online writing: Simultaneous implementation of a forum, blog, and wiki in an EFL blended learning setting. System, 38(2), 185–199.
Novakovich J. (2016). Fostering critical thinking and reflection through blog-mediated peer feedback. Journal of Computer Assisted Learning, 32(1), 16–30.
Ricker A. S., & Whelan R. J. (2016). Reading, writing, and peer review: Engaging with chemical literature in a 200-level analytical chemistry course. In Integrating Information Literacy into the Chemistry Curriculum (pp. 157–168). ACS Publications.
Russell A. A. (2004). Calibrated peer review—a writing and critical-thinking instructional tool. Teaching Tips: Innovations in Undergraduate Science Instruction, 54.
Strijbos J.-W., Narciss S., & Dünnebier K. (2010). Peer feedback content and sender’s competence level in academic writing revision tasks: Are they critical for feedback perceptions and efficiency? Learning and Instruction, 20(4), 291–303.
Thomas G., Martin D., & Pleasants K. (2011). Using self- and peer-assessment to enhance students' future-learning in higher education. Journal of University Teaching & Learning Practice, 8(1), 5.
Walker J. P., & Sampson V. (2013). Argument-driven inquiry: Using the laboratory to improve undergraduates’ science writing skills through meaningful science writing, peer-review, and revision. Journal of Chemical Education, 90(10), 1269–1274.
Yuan A., Luther K., Krause M., Vennix S. I., Dow S. P., & Hartmann B. (2016). Almost an expert: The effects of rubrics and expertise on perceived value of crowdsourced design critiques. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work and Social Computing (pp. 1005–1017). Association for Computing Machinery.
Zwicky D. A., & Hands M. D. (2015). The effect of peer review on information literacy outcomes in a chemical literature course. Journal of Chemical Education, 93(3), 477–481.