
Research and Teaching

Peer Review and Response

Supporting Improved Writing Skills in Environmental Chemistry

Journal of College Science Teaching—November/December 2020 (Volume 50, Issue 2)

By Dulani Samarasekara, Todd Mlsna, and Deb Mlsna

Students in an upper-division Environmental Chemistry course used peer review and response to reviewer comments to improve their writing skills. The process employed an anonymous, timed, in-class peer review format. In addition to editing peer papers, students were tasked with creating a Response to Reviewer Comments document, which the authors used to mimic the peer-review process required for scientific publication. The Response to Reviewer Comments document was designed to have students think critically about their writing and defend their choices with respect to peer edits. Results of essay quality and student surveys are presented here. Student writing assignments improved with this process; however, more support is needed to encourage students to think critically about their own writing.

 

Problem solving, critical thinking, data interpretation, and oral and written communication are among the most essential skills that undergraduates need to practice. Among these skills, writing is often neglected in a typical STEM curriculum. A student who continues on to graduate school or begins a career as a scientist will be expected to write scientific reports (Gragson & Hagen, 2009). STEM undergraduates often have relatively few opportunities to write scientific reports; consequently, the writing of recent graduates is often poor, with a general unawareness of the requirements for clear scientific text (Guilford, 2001; Walker & Sampson, 2013). Moreover, undergraduates may have limited opportunities to review and critique scientific papers, which leads to lower confidence in their writing abilities (Walker & Sampson, 2013). Here we introduce a modified peer assessment approach (Glaser, 2014; Guilford, 2001; Ricker & Whelan, 2016) designed to strengthen undergraduate students’ scientific writing and their critical thinking about their own work. Our study shows that this modified peer assessment process provides a valid mechanism for students to improve their writing and practice critical analysis of their work.

The use of peer review, both online and on campus, has proven effective for supporting improved writing skills in undergraduate students. Many peer review formats use online essays to train students to edit critically and understand assignment goals (Boase-Jelinek et al., 2013; Dominguez et al., 2015; Gunersel et al., 2012; Miyazoe & Anderson, 2010; Novakovich, 2016; Zwicky & Hands, 2015). When used well, peer review of essays serves several purposes at once: student reviewers gain experience editing and providing constructive feedback on a piece of writing, student authors receive comments from diverse perspectives as multiple editors give feedback, and instructors can reduce the grading burden related to editing in large-enrollment classes (Boase-Jelinek et al., 2013; Guilford, 2001; Huisman et al., 2018).

A challenge with the peer review process, however, is supporting students to give quality feedback (Kulkarni et al., 2015a; Kulkarni et al., 2015b; Yuan et al., 2016). Poor student edits shortchange the process: authors are less likely to improve their work and may develop a false sense of confidence from ineffective feedback (Russell, 2004). A number of attempts have been made in the literature to improve the quality of student feedback, including providing common feedback phrases for quick use by the editing student (Kulkarni et al., 2015b), adding interactive hints to help students stay on track (Krause et al., 2017), and designing grading rubrics with care (Hicks et al., 2016).

This research study focused on the incorporation of a Response to Reviewer Comments document, which required students to critically review the edits received on their writing and determine whether a change was warranted to improve their reports. Students have been shown to weigh the perceived competence of peer feedback in their editing decisions (Berndt et al., 2018; Strijbos et al., 2010). The process of critically evaluating suggested edits can potentially improve students’ ownership of their learning and help them develop critical-reflection skills (Thomas et al., 2011). We sought to answer the following research questions with this study: (1) Are peer edits and feedback sufficient to improve student writing? (2) Does editing peer reports help students improve their own writing? (3) Does the Response to Reviewer Comments document encourage students to critically evaluate their own writing?

Methods

Students and demographics

Peer review writing assignments were incorporated into three sections of Environmental Chemistry at our institution in Spring 2015, 2017, and 2018. Most of the students enrolled in the class were Chemistry or Chemical Engineering majors, typically at the junior or senior level (see Table 1). The model was a modification of the published Calibrated Peer Review protocol, with the edits and reviews occurring in class instead of in an online format (Chapman, 1999). In addition, the Response to Reviewer Comments document was added to encourage each student to think critically about their writing and the suggested peer edits.

Table 1. Student demographics including sample size, gender, major, and academic year by class.

Criteria                   Spring 2015   Spring 2017   Spring 2018
Sample size, N                      31            28            67
Gender
  Male                              15            11            48
  Female                            16            17            19
Major
  Chemistry                         14            22             9
  Chemical engineering              17             1            54
  Other                              -             5             4
Academic year
  Junior                             -             7             3
  Senior                            31            21            64

Data collection

This peer review assignment was implemented in conjunction with a university-wide initiative known as the Maroon and Write Quality Enhancement Plan. Maroon and Write is a comprehensive university model, instituted in 2014, designed to improve undergraduate student writing through writing across the curriculum, write-to-learn strategies, and formal writing instruction. This peer review writing assignment supported upper-division writing needs in the chemical sciences.

The peer review assignment was designed as follows:

1. Students had the opportunity to “train” on essays before writing their own literature review, with examples of high-quality and low-quality essays provided on the class website for review. In addition, students were given detailed instructions and examples of peer-editing style feedback, with discussion of the types of edits and approaches that could be taken.

2. Each student wrote an original literature review of a current topic in environmental science. Sample essays showing literature review structure and content were available. The literature review was approximately 1,500 words and cited at least four original scientific papers. This original assignment was turned in to the instructor for review and grading. This initial grade was not seen by the student and was used only for this research study.

3. Each student reviewed and edited three papers in class, spending approximately 20 minutes per peer essay. Papers had names and identifying information removed. Edits and grading rubrics were handwritten on papers and returned to the instructor for redistribution back to the original author. Students were tasked to keep the average of their peer review grades at or below 85, which was designed to prevent students from simply giving everyone high grades and forgoing critical review.

Anonymous peer edits were returned to the original author. After student edits were addressed and deficiencies improved, a final literature review was turned in to the instructor for grading.

Each student also turned in a Response to Reviewer Comments document, which detailed the important student edits and explained how suggestions were addressed. This Review document was intended to have each author critically think about their peer edits and determine if the suggestions were warranted or to provide an explanation if they were not. It was included to improve the critical thinking of each author toward their own writing and to enhance ownership of their writing decisions.

Grades for each student were awarded as follows: 65% from the final literature review turned in after peer edits, 15% from the quality of the edits they made on other students’ papers, and 20% from their Response to Reviewer Comments document. The Response to Reviewer Comments document was graded on how thoroughly students addressed reviewer suggestions and determined edit suitability. Essays and peer editors were assigned anonymously through a numeric system. All identifying information was removed from essays before papers were given to student editors, and student editors were likewise kept anonymous from each author. All student papers and edit comments were scanned and kept for instructor assessment of the process.
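To make the weighting concrete, the following minimal sketch shows how a final grade would combine under this scheme; the function name and example scores are hypothetical illustrations, not taken from the study.

```python
# Illustrative sketch of the grading scheme described above: 65% final
# literature review, 15% quality of peer edits given, and 20% Response to
# Reviewer Comments document. Function name and scores are hypothetical.

def course_grade(final_review: float, edit_quality: float, response_doc: float) -> float:
    """Combine the three graded components (each on a 0-100 scale)."""
    return 0.65 * final_review + 0.15 * edit_quality + 0.20 * response_doc

# Example: 80 on the final review, 90 on edit quality, 70 on the response doc.
print(course_grade(80, 90, 70))  # 52.0 + 13.5 + 14.0 = 79.5
```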

Each paper was graded using a rubric developed from the American Chemical Society (ACS) Style Guide (Coghill & Garson, 2006). Students were tasked to review papers in four categories: appropriate citation of references; correct use of citations in the essay; grammar, spelling, and neatness of the work; and the overall content of the essay topic. Grades were based on a 100-point total, and student marks were recorded for each section. An example Peer Review Grading Rubric is included in the appendix.

Results and discussion

Our initial research question was to determine whether peer edits and feedback are sufficient to improve student writing. In this study, report grades were assigned based on a grading rubric with four evaluation criteria: works cited; using cited works; grammar, spelling, and neatness; and content. Students’ pre- and posttotal essay grades and the four category rubric grades from the instructor were analyzed using a paired sample t-test at the 95% confidence level. Results showed that students’ report grades significantly improved after the peer-editing process: total essay grade, t(91) = -16.3, p < .001, d = 1.6; works cited, t(91) = -11.9, p < .001, d = 1.2; using cited works, t(91) = -9.2, p < .001, d = 1.0; grammar, spelling, and neatness, t(91) = -9.0, p < .001, d = 0.9; content, t(91) = -12.3, p < .001, d = 1.3. Students’ pre- and postreport grade percentages are displayed in Figure 1. Students were able to improve their report quality after incorporating peer edits, with average essay grades improving from 58% to 70% after revision.
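For readers who wish to reproduce this kind of analysis, the sketch below runs a paired sample t-test and computes a paired-samples Cohen’s d in Python with SciPy; the grade arrays are hypothetical stand-ins, not the study’s data.

```python
# Sketch of the paired sample t-test and effect size described above.
# The pre/post grade arrays are hypothetical, not the study's data.
import numpy as np
from scipy import stats

pre = np.array([55, 60, 62, 48, 71, 58])   # instructor grades before peer edits (%)
post = np.array([68, 72, 70, 61, 80, 69])  # instructor grades after revision (%)

# Two-tailed paired sample t-test at the 95% confidence level.
t_stat, p_value = stats.ttest_rel(pre, post)

# Cohen's d for paired samples: mean of the differences over their SD.
diff = post - pre
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"t({len(pre) - 1}) = {t_stat:.1f}, p = {p_value:.3f}, d = {cohens_d:.1f}")
```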

FIGURE 1
Students’ pre- and posttotal essay grades and category rubric grades in the different evaluation areas: works cited (10 points); using cited works (15 points); grammar, spelling, and neatness (15 points); and content (60 points). The graph is displayed as percentages. The results of a paired sample t-test show significant improvement in all categories. Error bars represent the standard errors.

To further study how successfully students edited and graded their assigned essays, a paired sample t-test was conducted between the instructor prereport grades and the averaged peer report grades. Results showed significant differences between the groups in all areas: total essay grade, t(91) = 13.1, p < .001, d = 1.4; works cited, t(91) = 7.0, p < .001, d = 0.7; using cited works, t(91) = 8.1, p < .001, d = 0.8; grammar, spelling, and neatness, t(91) = 4.5, p < .001, d = 0.5; content, t(91) = 13.0, p < .001, d = 1.4. The significance in these areas indicates that the peer edits did not correlate well with the instructor pregrade, and the peer grades were consistently higher than the instructor’s. However, student edits were still sufficient to improve the overall quality of reports. Paired sample correlations are given in Table 2. The significant positive correlations for “works cited” and “grammar, spelling, and neatness” indicate that students who received higher points from their peers in these areas also received a higher grade from the instructor. The scatter plot of the initial total report grades from the peers and the instructor is given in Figure 2.

FIGURE 2
Scatter plot for the initial total report grades from student reviewers and the instructor.

Table 2. Results of the paired sample correlation of the peer average and instructor initial report grade.

Evaluation area                    Paired sample correlation
Works cited                        .234**
Using cited works                  .094
Grammar, spelling, and neatness    .272*
Content                            .188
Total                              .196
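The paired sample correlations in Table 2 are Pearson correlations between the averaged peer grade and the instructor’s initial grade; a minimal sketch of that computation, again with hypothetical grade values, follows.

```python
# Sketch of the paired sample (Pearson) correlation reported in Table 2.
# The peer-average and instructor grade values are hypothetical.
import numpy as np
from scipy import stats

peer_avg = np.array([82, 88, 79, 91, 85, 76])    # averaged peer grades (%)
instructor = np.array([58, 66, 52, 70, 61, 49])  # instructor initial grades (%)

r, p_value = stats.pearsonr(peer_avg, instructor)
print(f"Pearson r = {r:.3f}, p = {p_value:.3f}")
```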

In addition, to understand student perceptions of essay improvement due to the peer edits, responses to two survey questions, “I found the reviewer comments I received helpful” and “I felt my paper improved as a result of the feedback I received,” were analyzed. Survey response percentages are given in Figure 3. For this analysis, the Likert responses “agree” and “strongly agree” were combined, as were “disagree” and “strongly disagree.”

FIGURE 3
Percentage survey responses in three Likert-scale categories for the survey questions. Error bars represent the standard errors. In general, students thought the review process improved their papers.

Among all students, 55% said that the comments they received from their peers were helpful; only 16% said they were not. Most importantly, 72% of the students thought their papers improved as a result of the student feedback. To determine whether report grades were enhanced for students who thought reviewer comments were helpful, average grade differences (final instructor report grade minus initial instructor report grade) were plotted against the three survey response categories of agree, neutral, and disagree. Results are given in Figure 4. Grade improvements do not show a significant difference among the groups, as all groups improved.

FIGURE 4
Averaged instructor grade improvements for students who rated their response as agree, neutral, and disagree in the survey questions, “I found the reviewer comments I received helpful” and “I felt my paper improved as a result of the feedback I received.” Error bars represent the standard errors.

Our second research question was to determine whether editing peer reports helped students improve their own writing. This was addressed by evaluating student perceptions on two survey questions: “Reading other papers helped me understand what the assignment should look like” and “Reading other papers gave me ideas for things I could change in my own paper.” In the analysis, the Likert-scale responses “strongly disagree” and “disagree” were merged as disagree, and “strongly agree” and “agree” as agree. Survey response percentages are shown in Figure 5. Approximately 60% of students thought that reading other students’ papers helped them understand the assignment and supplied ideas to improve their own reports.

FIGURE 5
Percentage survey responses in three Likert-scale categories for the survey questions. In general, students thought the reviewing process improved their understanding of the writing assignment. Error bars represent the standard errors.

Average grade differences (final instructor report grade minus initial instructor report grade) were plotted against the three survey response categories of agree, neutral, and disagree to evaluate whether student writing improved as a result of reading other papers (Figure 6). The average grade improvements among the three categories were not significantly different. Nevertheless, students responded strongly to the survey questions and thought the process helped their writing.

FIGURE 6
Averaged instructor grade improvements for students who rated their response as agree, neutral, and disagree in the survey questions. Error bars represent the standard errors.

Our third research question was to determine whether the Response to Reviewer Comments document helped students critically evaluate their own work. Overall, the quality of the Response to Reviewer Comments documents was poor, as students took all peer edit suggestions as changes to be made. In our opinion, therefore, many students did not use the Response to Reviewer Comments document to critically evaluate their own work. Instructor grades for the quality of the Response to Reviewer Comments document correlated with students’ final report grades (Pearson correlation .467, p < .001): students who critically considered the peer edits and addressed comments in detail further improved their essay grades. Results are shown in Figure 7. Improved training with the Response to Reviewer Comments document is needed to support students in critically evaluating peer edits and appropriately defending their writing choices. The level of student writing confidence may currently be impacting these choices.

FIGURE 7
Representation of students’ report grade improvement related to the quality of their Response to Reviewer Comments document. Student quality was characterized as “high” if students showed strong engagement with the comments and defended their literature review.

Peer feedback and response to peer review assignment perceptions

Student survey responses pertaining to the peer editing process are shown in Figure 8. Average student responses were found to be supportive of the peer editing approach. In addition, student comments on the peer editing approach are shown in Table 3.

FIGURE 8
Student responses to additional survey questions. A Likert scale was used, ranging from 1 = Strongly Disagree to 5 = Strongly Agree. Error bars represent standard errors.

Table 3. Student feedback on survey questions and additional comments.

Survey question: Please give us additional comments on the in-class peer review process. Was there enough time? Enough instruction on what to do? Enough work to accomplish? What would you do differently?

Student comments:
- The review paper is a good idea and should continue.
- Instructions were clear and enough work to accomplish.
- I thought there was plenty of time—20 min./paper was sufficient. Overall, I thought it was a good assignment.
- I liked and appreciated the peer review process.
- I think we were given the right amount of time. I liked going through someone else’s paper and helping them grow.
- There was plenty of instruction on what to do, especially with the rubric we were given. I wouldn’t do anything differently. I think it worked very well.

Survey question: Please give us additional feedback on the revision process for your own paper. Were comments helpful? What was most helpful? Was there enough time to revise? What would you do differently?

Student comments:
- Time was enough and peer comments were helpful.
- Great idea and interesting to see other new topics.
- Comments were very helpful and helped to improve the paper.
- I thought the revision process was very fair. I liked it!
- For the most part, comments were helpful. They were a little contradicting at times, but mostly they helped me catch my mistakes.

Limitations

There are some limitations to this study. We focused on a single essay written within the course, which limits our conclusions concerning writing improvement; multiple assignments would allow us to determine whether student writing skills improved over the course of the semester with feedback. In addition, we did not directly compare the student edits made on each assignment with instructor edits, a comparison that could further illuminate the impact and efficacy of student edits. However, other researchers have found that edits from multiple peers carried more weight than one expert opinion in contributing to essay improvement (Cho & MacArthur, 2010).

Conclusions

This peer editing exercise improved student writing on the technical writing assignment and encouraged students to evaluate their own writing through peer edit feedback. Analysis of student edits showed that peer editing was sufficient to improve essay quality, and the feedback was appreciated by students undergoing review. Student edits correlated with instructor grades most closely on works cited and grammar edits, with student edits focusing primarily on small, discrete suggestions instead of large, conceptual improvements. Students also valued the ability to read peer essays and maintained that reading other essays improved their own work. The majority of students appreciated the editing exercise and indicated that they would seek peer edits on their own before their next writing assignment. The Response to Reviewer Comments document was included with this editing exercise to encourage students to think critically about their own work and analyze whether editing suggestions were worthwhile. Overall, most students did not critically defend their writing but instead incorporated all peer edits into their work. Students in this upper-division science course need more support in thinking critically about and defending their own writing, as many were not confident enough in their own essays to defend their writing choices.

Overall, our results support the inclusion of peer edits as part of a writing assignment for students learning technical writing. Instructors can implement peer editing with assignments as a review cycle to improve student performance. The peer-edit process reduces the instructor grading load, and our results indicate that students gained as much benefit from reading peer essays as they did from receiving student edits. Future work on this approach includes incorporating several peer-editing cycles to allow students the opportunity to improve their overall writing skills, along with a further focus on critical evaluation of their own work so that students fully use the Response to Reviewer Comments document.

References

Berndt M., Strijbos J.-W., & Fischer F. (2018). Effects of written peer-feedback content and sender’s competence on perceptions, performance, and mindful cognitive processing. European Journal of Psychology of Education, 33(1), 31–49.

Boase-Jelinek D., Parker J., & Herrington J. (2013). Student reflection and learning through peer reviews. Issues in Educational Research, 23(2), 119–131.

Chapman O. (1999). Calibrated Peer Review™: An overview.

Cho K., & MacArthur C. (2010). Student revision with peer and expert reviewing. Learning and Instruction, 20(4), 328–338.

Coghill A. M., & Garson L. (2006). The ACS style guide (Vol. 3). Oxford University Press and the American Chemical Society.

Dominguez C., Nascimento M. M., Payan-Carreira R., Cruz G., Silva H., Lopes J., Morais M., & Morais E. (2015). Adding value to the learning process by online peer review activities: Towards the elaboration of a methodology to promote critical thinking in future engineers. European Journal of Engineering Education, 40(5), 573–591.

Glaser R. E. (2014). Design and assessment of an assignment-based curriculum to teach scientific writing and scientific peer review. Journal of Learning Design, 7(2), 85–104.

Gragson D. E., & Hagen J. P. (2009). Developing technical writing skills in the physical chemistry laboratory: A progressive approach employing peer review. Journal of Chemical Education, 87(1), 62–65.

Guilford W. H. (2001). Teaching peer review and the process of scientific writing. Advances in Physiology Education, 25(3), 167–175.

Gunersel A. B., Simpson N. J., Aufderheide K. J., & Wang L. (2012). Effectiveness of Calibrated Peer Review™ for improving writing and critical thinking skills in biology undergraduate students. Journal of the Scholarship of Teaching and Learning, 8(2), 25–37.

Hicks C. M., Pandey V., Fraser C. A., & Klemmer S. (2016). Framing feedback: Choosing review environment features that support high quality peer assessment. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 458–469). Association for Computing Machinery.

Huisman B., Saab N., van Driel J., & van den Broek P. (2018). Peer feedback on academic writing: Undergraduate students’ peer feedback role, peer feedback perceptions and essay performance. Assessment & Evaluation in Higher Education, 1–14.

Krause M., Garncarz T., Song J., Gerber E. M., Bailey B. P., & Dow S. P. (2017). Critique style guide: Improving crowdsourced design feedback with a natural language model. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 4627–4639). Association for Computing Machinery.

Kulkarni C. E., Bernstein M. S., & Klemmer S. R. (2015a). PeerStudio: Rapid peer feedback emphasizes revision and improves performance. Proceedings of the Second (2015) ACM Conference on Learning@Scale (pp. 75–84). Association for Computing Machinery.

Kulkarni C., Wei K. P., Le H., Chia D., Papadopoulos K., Cheng J., Koller D., & Klemmer S. R. (2015b). Peer and self assessment in massive online classes. ACM Transactions on Computer-Human Interaction (TOCHI), 20(6), 1–33.

Miyazoe T., & Anderson T. (2010). Learning outcomes and students’ perceptions of online writing: Simultaneous implementation of a forum, blog, and wiki in an EFL blended learning setting. System, 38(2), 185–199.

Novakovich J. (2016). Fostering critical thinking and reflection through blog-mediated peer feedback. Journal of Computer Assisted Learning, 32(1), 16–30.

Ricker A. S., & Whelan R. J. (2016). Reading, writing, and peer review: Engaging with chemical literature in a 200-level analytical chemistry course. In Integrating Information Literacy into the Chemistry Curriculum (pp. 157–168). ACS Publications.

Russell A. A. (2004). Calibrated peer review—a writing and critical-thinking instructional tool. Teaching Tips: Innovations in Undergraduate Science Instruction, 54.

Strijbos J.-W., Narciss S., & Dünnebier K. (2010). Peer feedback content and sender’s competence level in academic writing revision tasks: Are they critical for feedback perceptions and efficiency? Learning and Instruction, 20(4), 291–303.

Thomas G., Martin D., & Pleasants K. (2011). Using self-and peer-assessment to enhance students’ future-learning in higher education. Journal of University Teaching & Learning Practice, 8(1), 5.

Walker J. P., & Sampson V. (2013). Argument-driven inquiry: Using the laboratory to improve undergraduates’ science writing skills through meaningful science writing, peer-review, and revision. Journal of Chemical Education, 90(10), 1269–1274.

Yuan A., Luther K., Krause M., Vennix S. I., Dow S. P., & Hartmann B. (2016). Almost an expert: The effects of rubrics and expertise on perceived value of crowdsourced design critiques. In Proceedings of the 19th ACM Conference on Computer-Supported Work and Social Computing (pp. 1005–1017). Association for Computing Machinery.

Zwicky D. A., & Hands M. D. (2015). The effect of peer review on information literacy outcomes in a chemical literature course. Journal of Chemical Education, 93(3), 477–481.
