Research and Teaching
Supporting Improved Writing Skills in Environmental Chemistry
Journal of College Science Teaching—November/December 2020 (Volume 50, Issue 2)
By Dulani Samarasekara, Todd Mlsna, and Deb Mlsna
Problem solving, critical thinking, data interpretation, and oral and written communication are among the most essential skills that undergraduates need to practice. Among these skills, writing is often neglected in a typical STEM curriculum. If a student continues on to graduate school or begins a career as a scientist, they will be expected to write scientific reports (Gragson & Hagen, 2009). STEM undergraduates often have relatively few opportunities to write scientific reports; consequently, the writing of recently graduated students is often poor, with a general unawareness of the requirements for clear scientific text (Guilford, 2001; Walker & Sampson, 2013). Moreover, undergraduates might have limited opportunities to review and critique scientific papers, which leads to lower confidence in their writing abilities (Walker & Sampson, 2013). Here we introduce a modified peer assessment approach (Glaser, 2014; Guilford, 2001; Ricker & Whelan, 2016) designed to encourage scientific writing and critical analysis of undergraduate students' own work. Our study shows that the modified peer assessment process provides a valid mechanism for students to improve their writing and practice critical analysis of their work.
The use of peer review, both online and on campus, has proven effective for supporting enhanced writing skills in undergraduate students. Many peer review formats use online essays to train students to edit critically and to understand assignment goals (Boase-Jelinek et al., 2013; Dominguez et al., 2015; Gunersel et al., 2012; Miyazoe & Anderson, 2010; Novakovich, 2016; Zwicky & Hands, 2015). When used well, peer review of essays serves several purposes: student reviewers gain experience editing and providing constructive feedback on a piece of writing, student authors receive comments from diverse perspectives as multiple editors give feedback, and instructors can reduce the grading burden associated with editing in large-enrollment classes (Boase-Jelinek et al., 2013; Guilford, 2001; Huisman et al., 2018).
A challenge with the peer review process, however, is supporting students to give quality feedback when providing comments (Kulkarni et al., 2015a; Kulkarni et al., 2015b; Yuan et al., 2016). Poor student edits shortchange the process: authors are less likely to improve their work and may develop a false sense of confidence based on ineffective feedback (Russell, 2004). Several approaches to improving the quality of student feedback have been reported in the literature, including providing common feedback phrases for quick use by the editing student (Kulkarni et al., 2015b), including interactive hints to help students stay on track (Krause et al., 2017), and designing grading rubrics with care (Hicks et al., 2016).
This research study focused on the incorporation of a Response to Reviewer Comments document, which allowed students to critically review the peer edits received on their writing and determine whether changes were warranted to improve their reports. Studies have shown that the perceived competence of peer feedback affects students' editing decisions (Berndt et al., 2018; Strijbos et al., 2010). The process of critically evaluating suggested edits can potentially improve student ownership of their learning and help them develop critical-reflection skills in the process (Thomas et al., 2011). We sought to answer the following research questions with this study:
1. Are peer edits and feedback sufficient to improve student writing?
2. Does editing peer reports help students improve their own writing?
3. Does the Response to Reviewer Comments document encourage students to critically evaluate their own writing?
Peer review writing assignments were incorporated into three sections of Environmental Chemistry at our institution in Spring 2015, 2017, and 2018. Most of the students enrolled in the class were Chemistry or Chemical Engineering majors, typically at the junior or senior level (see Table 1). The model was a modification of the published Calibrated Peer Review protocol, with the edits and reviews occurring in class instead of in an online format (Chapman, 1999). In addition, the Response to Reviewer Comments document was added to encourage each student to think critically about their own writing and about the suggested peer edits.
Table 1. Student demographics including sample size, gender, major, and academic year by class.
This peer review assignment was implemented in conjunction with a university-wide initiative known as the Maroon and Write Quality Enhancement Plan. Maroon and Write is a comprehensive university model instituted in 2014 and designed to improve undergraduate student writing through implementation of writing across the curriculum, the use of write-to-learn strategies, and formal writing instruction. This peer review writing assignment supported upper-division writing needs in the chemical sciences.
The peer review assignment was designed as follows:
1. Students had the opportunity to "train" on essays before writing their own literature review, with examples provided of high-quality and low-quality essays. Essays were provided on the class website for review. In addition, students were given detailed instructions and examples of peer-editing style feedback, with discussion of the types of edits and approaches that could be taken.
2. Each student wrote an original literature review of a current topic in environmental science. Sample essays that showed literature review structure and content were available. The literature review was approximately 1,500 words and cited at least four original scientific papers. This original assignment was turned in to the instructor for review and grading. This initial grade was not seen by the student and was used only for this research study.
3. Each student reviewed and edited three papers in class, spending approximately 20 minutes per peer essay. Papers had names and identifying information removed (a sketch of one possible assignment scheme follows this list). Edits and grading rubrics were handwritten on the papers and returned to the instructor for redistribution back to the original author. Students were asked to keep the average grade they assigned across their peer reviews at or below 85; this requirement was designed to prevent students from simply giving everyone high grades and thereby avoiding critical review.
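As an illustration of the bookkeeping this step requires, the sketch below shows one way anonymized essays could be distributed so that every student reviews three peer essays and never their own. The function name, roster, and rotation scheme are our own assumptions; the article states only that a numeric system was used.

```python
import random

def assign_reviews(student_ids, reviews_per_essay=3, seed=0):
    """Hypothetical scheme for distributing anonymized essays to peer reviewers.

    Each essay receives `reviews_per_essay` reviewers, each student reviews the
    same number of essays, and no one reviews their own paper (assumes more
    students than reviews per essay). The course's actual numeric system is
    not described in detail in the article.
    """
    rng = random.Random(seed)
    roster = list(student_ids)
    rng.shuffle(roster)                      # randomize the rotation order
    n = len(roster)
    assignments = {author: [] for author in roster}
    # Offsets 1..reviews_per_essay guarantee distinct reviewers and no self-review
    for offset in range(1, reviews_per_essay + 1):
        for i, author in enumerate(roster):
            assignments[author].append(roster[(i + offset) % n])
    return assignments

# Example with five hypothetical anonymized student codes
for author, reviewers in assign_reviews(["S01", "S02", "S03", "S04", "S05"]).items():
    print(author, "->", reviewers)
```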
Anonymous peer edits were returned to the original author. After student edits were addressed and deficiencies improved, a final literature review was turned in to the instructor for grading.
Each student also turned in a Response to Reviewer Comments document, which detailed the important student edits and explained how suggestions were addressed. This document was intended to have each author think critically about the peer edits they received and determine whether the suggestions were warranted, or to provide an explanation if they were not. It was included to foster each author's critical thinking about their own writing and to enhance ownership of their writing decisions.
Grades for each student were awarded as follows: 65% from the final literature review turned in after peer edits, 15% from the quality of the edits they made on other students' papers, and 20% from their Response to Reviewer Comments document. The Response to Reviewer Comments document was graded on how thoroughly students addressed reviewer suggestions and determined edit suitability. Essays and peer editors were assigned anonymously through a numeric system. All identifying information was removed from essays before papers were given to student editors to ensure anonymity of review. Student editors were also kept anonymous from each author. All student papers and edit comments were scanned and kept for instructor assessment of the process.
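To make the weighting concrete, here is a minimal sketch of how the three graded components could be combined into an assignment grade. The function name and example scores are illustrative only.

```python
def assignment_grade(final_review, peer_edit_quality, response_doc):
    """Combine the three graded components using the weights described above.

    Inputs are assumed to be on a 0-100 scale: 65% final literature review,
    15% quality of the edits given to peers, and 20% Response to Reviewer
    Comments document.
    """
    return 0.65 * final_review + 0.15 * peer_edit_quality + 0.20 * response_doc

# Hypothetical example: strong final review, average peer edits, solid response document
print(assignment_grade(final_review=88, peer_edit_quality=80, response_doc=85))  # 86.2
```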
Each paper was graded using a rubric developed from the American Chemical Society (ACS) Style Guide (Coghill & Garson, 2006). Students were tasked with reviewing papers in four categories: appropriate citation of references; correct use of citations in the essay; grammar, spelling, and neatness of work; and the overall content of the essay topic. Grades were based on a 100-point total, and student marks were recorded for each section. An example Peer Review Grading Rubric is included in the appendix.
Our initial research question was to determine whether peer edits and feedback are sufficient to improve student writing. In this study, report grades were assigned based on a grading rubric with four evaluation criteria: works cited; using cited works; grammar, spelling, and neatness; and content. Students' pre- and post-revision total essay grades and the four category rubric grades from the instructor were analyzed using paired-sample t-tests at the 95% confidence level. Results showed that students' report grades significantly improved after the peer-editing process: total essay grade, t(91) = -16.3, p < .001, d = 1.6; works cited, t(91) = -11.9, p < .001, d = 1.2; using cited works, t(91) = -9.2, p < .001, d = 1.0; grammar, spelling, and neatness, t(91) = -9.0, p < .001, d = 0.9; and content, t(91) = -12.3, p < .001, d = 1.3. Students' pre- and post-revision report grade percentages are displayed in Figure 1. Students were able to improve their report quality after incorporating peer edits, with average essay grades improving from 58% to 70% after student revisions.
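An analysis of this kind could be reproduced with standard statistical software. The sketch below, using Python's scipy, runs a paired-sample t-test and computes one common paired-samples effect-size convention (mean difference divided by the standard deviation of the differences). The grade values are invented for illustration, and the article does not state which effect-size formula was used.

```python
import numpy as np
from scipy import stats

def paired_comparison(pre, post):
    """Paired-sample t-test and a paired-samples Cohen's d for pre/post grades."""
    pre, post = np.asarray(pre, dtype=float), np.asarray(post, dtype=float)
    result = stats.ttest_rel(pre, post)          # paired t-test, df = n - 1
    diff = post - pre
    d = diff.mean() / diff.std(ddof=1)           # effect size from the paired differences
    return result.statistic, result.pvalue, d

# Hypothetical pre- and post-revision total essay grades (percent) for a few students
pre_grades = [55, 62, 48, 60, 57]
post_grades = [68, 75, 61, 70, 72]
t, p, d = paired_comparison(pre_grades, post_grades)
print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```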
Figure 1. Students' pre- and post-revision total essay grades and category rubric grades in the different evaluation areas: works cited (10 points); using cited works (15 points); grammar, spelling, and neatness (15 points); and content (60 points). The graph is displayed as percentages. The results of the paired-sample t-tests show significant improvement in all of these categories. Error bars represent the standard errors.
To further study how successfully students edited and graded their assigned essays, a paired-sample t-test was conducted between the instructor's initial report grades and the averaged peer report grades. Results showed significant differences between the groups in all areas: total essay grade, t(91) = 13.1, p < .001, d = 1.4; works cited, t(91) = 7.0, p < .001, d = 0.7; using cited works, t(91) = 8.1, p < .001, d = 0.8; grammar, spelling, and neatness, t(91) = 4.5, p < .001, d = 0.5; and content, t(91) = 13.0, p < .001, d = 1.4. The significant differences in these areas indicate that the peer grades did not match the instructor's initial grades, and the peer grades were consistently higher than the instructor's. However, student edits were still sufficient to improve the overall quality of the reports. Paired sample correlations are given in Table 2. The significant positive correlations for the variables "works cited" and "grammar, spelling, and neatness" indicate that students who received higher points from their peers also received higher grades from the instructor. The scatter plot of the initial total report grades from the peers and the instructor is given in Figure 2.
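The paired-sample correlations reported in Table 2 are ordinary Pearson correlations between the averaged peer grade and the instructor's initial grade for the same essay. A minimal sketch of that computation, with invented grade lists, is shown below.

```python
from scipy import stats

# Hypothetical averaged peer grades and instructor initial grades for the same six essays
peer_average = [78, 82, 74, 85, 80, 76]
instructor_initial = [55, 63, 50, 70, 60, 52]

# Pearson correlation coefficient and its p-value
r, p = stats.pearsonr(peer_average, instructor_initial)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")
```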
Figure 2. Scatter plot for the initial total report grades from student reviewers and the instructor.
Table 2. Results of the paired sample correlation of the peer average and instructor initial report grade.
In addition, to understand student perceptions of essay improvement due to peer edits, responses to two survey questions, "I found the reviewer comments I received helpful" and "I felt my paper improved as a result of the feedback I received," were analyzed. Survey response percentages are given in Figure 3. For this analysis, the Likert responses "agree" and "strongly agree" were combined, as were "disagree" and "strongly disagree."
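Collapsing the five-point Likert scale into the three categories used in the analysis is straightforward; the sketch below, with an invented response column and column name, shows one way to do it with pandas.

```python
import pandas as pd

# Hypothetical 5-point responses (1 = Strongly Disagree ... 5 = Strongly Agree)
responses = pd.Series([5, 4, 4, 3, 2, 5, 4, 1, 3, 4], name="comments_helpful")

# Merge the two "agree" levels and the two "disagree" levels, keep "neutral" as-is
collapse = {1: "disagree", 2: "disagree", 3: "neutral", 4: "agree", 5: "agree"}
percentages = responses.map(collapse).value_counts(normalize=True).mul(100).round(1)
print(percentages)
```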
Figure 3. Percentage of survey responses in three Likert-scale categories for the survey questions. Error bars represent the standard errors. In general, students thought the review process improved their papers.
Among all students, 55% said that the comments they received from their peers were helpful; only 16% said they were not. Most importantly, 72% of the students thought their papers improved as a result of the student feedback. To determine whether report grades were enhanced for students who thought reviewer comments were helpful, average grade differences (final instructor report grade minus initial instructor report grade) were plotted against the three survey response categories of agree, neutral, and disagree. Results are given in Figure 4. Grade improvements did not show a significant difference among the groups, as all groups improved.
Figure 4. Averaged instructor grade improvements for students who rated their response as agree, neutral, or disagree on the survey questions, "I found the reviewer comments I received helpful" and "I felt my paper improved as a result of the feedback I received." Error bars represent the standard errors.
Our second research question was to determine whether editing peer reports helped students improve their own writing. This was addressed by evaluating student perceptions of two survey questions, "Reading other papers helped me understand what the assignment should look like" and "Reading other papers gave me ideas for things I could change in my own paper." In the analysis, the Likert-scale responses strongly disagree and disagree, and strongly agree and agree, were merged and considered as disagree and agree, respectively. Survey response percentages are shown in Figure 5. Approximately 60% of students thought that reading other students' papers helped them understand the assignment and supplied ideas to improve their own reports.
Figure 5. Percentage of survey responses in three Likert-scale categories for the survey questions. In general, students thought the reviewing process improved their understanding of the writing assignment. Error bars represent the standard errors.
Average grade differences (final instructor report grade minus initial instructor report grade) were plotted against the three survey response categories of agree, neutral, and disagree to evaluate whether student writing improved as a result of reading other papers (Figure 6). The average grade improvements among the three categories were not significantly different. Students responded strongly to the survey question, however, and thought the process helped their writing.
Figure 6. Averaged instructor grade improvements for students who rated their response as agree, neutral, or disagree on the survey questions. Error bars represent the standard errors.
Our third research question was to determine whether the Response to Reviewer Comments document helped students critically evaluate their own work. Overall, the quality of the Response to Reviewer Comments documents was poor, as students took all peer edit suggestions as changes to be made. Therefore, in our opinion, many students did not really use the Response to Reviewer Comments document to critically evaluate their own work. Instructor grades for the quality of the Response to Reviewer Comments document correlated with students' final report grades (Pearson correlation .467, p < .001). Students who critically considered the peer edits and addressed comments in detail further improved their essay grades. Results are shown in Figure 7. Improved training with the Response to Reviewer Comments document is needed to support students in critically evaluating peer edits and appropriately defending their writing choices. The level of student writing confidence may currently be affecting these choices.
Figure 7. Representation of students' report grade improvement related to the quality of their Response to Reviewer Comments document. Quality was characterized as "high" if students showed strong engagement with the comments and defended their literature review.
Student responses to additional survey questions. A Likert scale was used, ranging from 1 = Strongly Disagree to 5 = Strongly Agree. Error bars represent standard errors.
Student feedback on survey questions and additional comments.
There are some limitations to this study. We focused on one essay written by students within the course, which limits our conclusions concerning writing improvement. Multiple assignments would allow us to determine if student writing skills improved over the course of the semester with feedback. In addition, we did not directly compare the student edits made for each assignment with instructor edits. This could lead to a further exploration of the impact and efficacy of student edits. However, other researchers have found that student edits from multiple peers carried more weight than one expert opinion in contributing to essay improvement (Cho & MacArthur, 2010).
This peer-editing exercise did improve student writing on the technical writing assignment and encouraged students to evaluate their own writing through peer-edit feedback. Analysis of student edits showed that peer editing was sufficient to improve essay quality, and the feedback was appreciated by students undergoing review. Student edits correlated with instructor grades most closely on works cited and grammar, with student edits focusing primarily on small, discrete suggestions rather than large, conceptual improvements. Students also valued the ability to read peer essays and maintained that reading other essays improved their own work. The majority of students appreciated the editing exercise and concluded that they would seek peer edits on their own before their next writing assignment. The Response to Reviewer Comments document was included with this editing exercise to encourage students to think critically about their own work and analyze whether editing suggestions were worthwhile. Overall, most students did not critically defend their writing but instead incorporated all peer edits into their work. Students in this upper-division science course need more support in thinking critically about and defending their own writing, as they were not confident enough in their own essays to defend their writing choices.
Overall, our results support the inclusion of peer edits as part of a writing assignment for students learning technical writing. Instructors can implement peer editing as a review cycle within assignments to improve student performance. The peer-edit process reduces the instructor's grading load, and our results indicate that students gained as much benefit from reading peer essays as they did from receiving student edits. Further work on this approach includes incorporating several peer-editing cycles to allow students the opportunity to improve their overall writing skills. Further emphasis on critical evaluation of their own work is needed for students to fully use the Response to Reviewer Comments document.
Keywords: Chemistry, Literacy, Teaching Strategies, Postsecondary