This article describes a series of assignments that models the process of writing a manuscript for publication. While completing the assignments, students worked to improve their writing as they graphed, interpreted, and explained patterns in data from a local river. They reviewed published articles and each other’s papers to become more critical readers.
The nature of the scientific process is one of active engagement in gathering and analyzing data, making interpretations, and communicating results. Although science students at our small college complete a year-long senior research project, many have difficulty interpreting and presenting their research results, despite completing lab experiments with written reports in many previous courses.
To help students overcome this difficulty, I developed a project that models the complete process of constructing a scientific paper: Students construct graphs, interpret and explain data, and prepare a written presentation of the results after examining an actual set of water quality data. Students simultaneously review each other’s work and critique published journal articles. Previous works describe individual facets of the writing process (Janick-Buckner 1997; Liu, Pysarchik, and Taylor 2002); this project is an attempt to put all these aspects together in one course that students take before or concurrent with their senior research project.
The project was completed by seven environmental science majors enrolled in a 300-level limnology course during the spring semester in 2003. When possible, I emphasized topics relevant to the local Mid-Atlantic region in lecture and applied them throughout this project and field observations. All aspects of the project accounted for more than 40% (250 of 600 points) of students’ final course grades.
One drawback of this project, from the biological point of view, is that it neither proposes nor tests hypotheses. Rather, it is a monitoring project in which students look at data on local water chemistry and attempt to understand the functioning of a natural system. Although it lacks a design component, this analysis mimics the type of environmental monitoring that is commonly practiced in the field, and it provides valuable insight into the mechanics of data analysis and interpretation common to all scientific research.
Similar projects could be assigned in any field of study in which data are readily available. Potential applications include analyzing data on climate or soils from local research stations or the GLOBE Program data set (online at viz.globe.gov/viz-bin/access.cgi?l=en&b=g&rg=n) in soil science, meteorology, and Earth science courses; kinetics of reactions or synthetic pathways in chemistry or biochemistry lectures; or any aspect of bioinformatics in biology courses. One could add a section on experimental design and data collection to generate data before using the lessons synthesized here to analyze and present the material.
The Data Set
Large sets of water quality data are available through the National Estuarine Research Reserve program. I used data from the St. Jones River, a tidal tributary to the Delaware Bay that runs through Dover, Delaware. These data are compiled and posted on the Internet at cdmo.baruch.sc.edu/del.html. I used the data set from calendar year 2000 because it was the most complete recent set available when I was planning the project. I extracted small portions of the data set into Excel files to provide a different exercise for each student, and I included normal conditions throughout the year as well as several meteorological events (a nor’easter winter storm and a summer thunderstorm).
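A minimal sketch of this extraction step may help readers adapting the project. The 30-minute sampling interval and simple timestamp-value layout below are illustrative assumptions, not details taken from the actual files:

```python
# Illustrative sketch of carving per-student excerpts out of a year-long
# monitoring record. The 30-minute interval and (timestamp, value) layout
# are assumptions for demonstration, not the published data format.
from datetime import datetime, timedelta

def make_year(start, n, step_minutes=30):
    """Generate (timestamp, value) rows at a fixed sampling interval."""
    return [(start + timedelta(minutes=step_minutes * i), float(i))
            for i in range(n)]

def extract_window(rows, t0, t1):
    """Return the rows whose timestamps fall in [t0, t1) -- one student's excerpt."""
    return [r for r in rows if t0 <= r[0] < t1]

rows = make_year(datetime(2000, 1, 1), 48 * 366)   # 2000 was a leap year
storm = extract_window(rows, datetime(2000, 1, 24), datetime(2000, 1, 27))
print(len(storm))   # 3 days x 48 readings/day = 144
```

Each student’s excerpt is then simply one such window, chosen to span either routine conditions or a storm event.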
I discussed the project on the first day of class when each student selected a computer diskette containing his or her data. Each disk also contained local daily weather data for the year 2000, several files containing guidelines and checklists for all assignments, and helpful hints on interpreting each individual data set. I walked students through all aspects of the entire writing project, timelines, and expectations. After discussion, the class agreed to follow the format of the journal Limnology and Oceanography for all writing. Students were instructed to use the online instructions for authors and instructions for reviewers as a style guide throughout the project.
Students completed individual sections sequentially throughout the semester. Students first graphed their data in a suitable presentation format. The rationale was to promote graph construction skills within the context and “social practice” of the scientific domain (Bowen, Roth, and McGinn 1999), as well as to present data in a visual format that made data interpretation more straightforward (Friel, Curcio, and Bright 2001). I instructed students on the use of two computer-graphing packages available in our computer lab, and I held a “hands-on” session in which students constructed graphs from practice data. Students were given two weeks to construct a graph set and an accompanying page of figure legends.
Each student gave his or her graph packet to another class member who served as an anonymous reviewer. I included peer review because it is an integral part of the scientific process and because I hoped that it would be a valuable learning tool for the reviewer and the original author. The reviewer was forced to critically evaluate a peer’s work, and this feedback could then be used by the writer to improve his or her product (Koprowski 1997). Reviewers also had the chance to improve their own work after seeing a peer’s paper.
Students were given a set of reviewer guidelines (modified from Liu, Pysarchik, and Taylor 2002) covering review criteria and document handling, and we discussed the goals and process on the first day of class. The process was reviewed each time the packets were exchanged, and I served as “editor” to assign reviewers and facilitate the transfer of papers. Single reviewers were used because of the small number of students. Authors’ names were included on the original manuscripts. Students asked whether they should sign their reviews; this was left as an option.
Reviewers had one week to write comments before returning the graphs to their original authors. Authors then had an additional week to make revisions before turning in the original graphs, reviewer comments, and corrected final versions for grading. I graded the packets on accuracy and visual quality of graphs and legends before returning them to the original authors. Fifteen percent of students’ grades for this section were based on the strength of the comments they gave as reviewers.
The second part of the project required students to describe their data as if they were writing the results section for the manuscript. For this assignment, I consciously separated the analytical functions (graphing data and determining quantitative relationships) from the interpretative functions (drawing conclusions and providing supporting evidence), as do Basey, Mendelow, and Ramos (2000). In this analytical section, I expected students to identify key features of their data and begin identifying patterns and correlations that suggest outside factors relevant to aquatic systems and water chemistry. Comprehending these patterns involves identifying trends from the lines of the graphs and relating these to some known factual content (Shah and Hoeffner 2002).
Students who could not describe patterns within their data had a difficult time relating those patterns to underlying forcing factors. At this time, I emphasized graphical data description and interpretation as I covered background content on water chemistry and physics in lecture.
Students were given two weeks to write the results section. The review (one week), revision, and grading process were as described previously, except that a different student served as reviewer. I graded this section for clear and accurate data descriptions.
Students wrote the discussion section of their manuscript for the third portion of the project, paying particular attention to interpreting data and explaining observations. Students were instructed to draw conclusions from their data and to provide evidence for their explanation (Basey, Mendelow, and Ramos 2000). I told students to make connections among the various data measures and the chemical, physical, and biological factors that influence these values. I continually reminded students to make these connections, including those among the various parts of this project, because many of my students are adept at compartmentalizing their knowledge. Because no hypothesis testing was possible, students needed to correlate their observations with known phenomena. Students had to include articles from the primary literature as support for their explanations, as is typical for research and term paper projects.
Students had nearly three weeks for writing, one week for completing their peer review, and another to make corrections before turning in the entire project for final grading near the end of the semester. I graded on thoughtful and logical explanations of the data as well as continuity of the entire package.
As a companion to the St. Jones River paper, students read and critically evaluated three articles from the primary literature. The goal was for students to find relevant references from the literature while seeing what a “finished product” really looks like. I also hoped that these papers would serve as models for the students’ writing. I assigned the first paper as a common reading for the whole class, and students individually chose the two subsequent articles. I gave a detailed set of instructions and questions (as in Janick-Buckner 1997) to students at the beginning of the semester and reinforced these with each paper; I also covered types of areas that needed improvement when handing back each set of graded journal articles.
Students were required to briefly explain the hypothesis and/or rationale for the study, summarize important findings of the paper, and evaluate and critique the conclusions and evidence presented by the authors. I asked students to pay particular attention to identifying key findings and determining whether the explanations of the data were supported by the data and the accompanying text.
The original data set contained eight separate water quality measurements that appeared to be independent, so students constructed eight separate graphs in the order the data appeared in the spreadsheet. Students failed to comprehend that several measurements (such as salinity and conductivity) are derived duplicates of each other. Only after an in-class discussion did students understand that each of these pairs must either be presented as a two-panel figure or be reduced to a single parameter.
Although the majority of the final graphs were professional looking, students had difficulty deciding how best to express the x-axis values for date and time. The majority eventually settled on a calendar-date format with tick marks to designate time of day. Some student reviewers were helpful in resolving this point, but most provided little critical feedback at this stage and were overly generous (Robinson 2002), possibly because they were not entirely clear about a correct format.
Figure legends seemed to present the most difficulty for students, probably because this was a new concept and because students failed to use the published papers they were reading as models. All students wrote legends that were incomplete, repetitive, or not very informative. Proper writing of figure legends is a concept that must be given more emphasis in my instructions to students and in all our other courses.
The writing style for the results section was generally straightforward but short on pertinent specifics. Most students described generalized patterns within their data and began to realize that several of the parameters exhibited 12-hour periodicities. This new information, particularly time-course analysis and tidally influenced features, proved difficult for many students to evaluate because they had a limited set of experiences upon which to draw (Bowen, Roth, and McGinn 1999) and because these new features did not match any known pattern (Shah and Hoeffner 2002).
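The semidiurnal signal students noticed can be confirmed with a basic autocorrelation. The sketch below uses a synthetic water-level series with an assumed 12.5-hour tidal period and 30-minute sampling, not the St. Jones data themselves:

```python
# Illustrative confirmation of a roughly semidiurnal (tidal) periodicity:
# find the lag at which a de-meaned series best correlates with itself.
# The 12.5-hour period and 30-minute sampling below are assumptions.
import math

def autocorr_peak_lag(x, min_lag, max_lag):
    """Return the lag (in samples) with the highest autocorrelation."""
    mean = sum(x) / len(x)
    d = [v - mean for v in x]
    def ac(lag):
        return sum(d[i] * d[i + lag] for i in range(len(d) - lag))
    return max(range(min_lag, max_lag + 1), key=ac)

period_samples = 25   # 12.5 h of tide / 0.5 h per sample
series = [math.sin(2 * math.pi * i / period_samples) for i in range(48 * 14)]
lag = autocorr_peak_lag(series, 10, 40)   # search lags of 5 to 20 hours
print(lag * 0.5, "hours")                 # prints: 12.5 hours
```

Seeing the dominant lag land near half a day gives students an independent, quantitative anchor for the pattern they first spotted by eye on their graphs.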
There was a lively discussion in class and in the reviews about whether to state in the results section that certain parameters were influenced by tide. Some students reasoned that the tide was an explanation and not a description, although I thought it appropriate to include that comment at this point in the writing. As occasionally occurs during the scientific review process, some authors charged that the reviewer was wrong and “didn’t get it”; as editor, I tactfully mediated these disputes.
Several students wrote their discussion as a stand-alone section that did not follow the results section that immediately preceded it in the final paper, probably because the assignments were completed in pieces. As expected, some students had a better interpretation of their data than did others, but all tried to offer some explanation. Students with data related to weather events had a particularly difficult time making sense of their results. In the future, I will include more instruction on these situations in the lecture portion of the course and will more closely monitor individual students regarding their interpretations.
Although several students provided plausible interpretations, several had explanations (such as oxygen uptake due to daytime respiration) that made no sense from the data they had or the underlying biological and chemical concepts covered in class. These incorrect explanations were all inserted after reviewers requested additional information and explanation, indicating that the reviewers were doing a fairly good job in requesting information.
Other students had interesting and potentially plausible explanations, but did not have data from this system or references from the literature to support their ideas. The lack of articles from the primary literature detracted from the evidence required to explain phenomena and provided no help in guiding students to greater understanding. Not coincidentally, graduating seniors also use surprisingly few references and have trouble writing explanations on their senior projects.
The majority of students did a credible job summarizing the results of published research articles but did not critically evaluate the conclusions drawn from the evidence. I often got the impression that students simply copied concepts from the paper without really understanding the relevance or implications. Despite being told to ask and answer questions about research design, hypotheses, logic of conclusions, and strength of evidence presented, students tended to take everything at face value (Switzer and Shriner 2000) instead of analyzing whether the authors were being accurate when presenting and interpreting experimental results. Nevertheless, it was evident from their writing that many students had become more critical evaluators by the third critique.
Student and Faculty Reactions
As part of the course final exam, I asked students to evaluate their learning experiences from this project. Although probably tempered by not being anonymous, the comments were revealing. All students felt that breaking up the project into three sections helped them get through it, because they had deadlines to meet. Several said it would have been a horrible experience to do the entire project as one big assignment with no intermediate feedback. Nearly all said that the peer reviews helped them make their own paper better by seeing the quality of other students’ writing as well as having a model with which to work. Interestingly, the student who did the least work on this project noted that reading his peers’ work showed him how far he had to progress to get to their level. Nearly all the students claimed that the journal article critiques helped with the writing project, but few included specific examples of improvements made.
A major improvement for the next offering will be to allow more class time for discussing sections of the project as it progresses. I will also give more details on what it means to do a critical evaluation, how to be a critical reviewer, and so forth by providing trial models of actual papers and reviews. We have since expanded this aspect of our research methods/senior project course and of the journal article evaluations in other upper-level courses.
Overall, students had a good learning experience that improved their understanding of the process of writing a scientific paper. My biggest disappointment was that many of these upper-level students made only limited connections between the reading and review of research articles and the process of writing one. They viewed each assignment as an individual entity, without applying it to other aspects of their education. Perhaps this is what we as college faculty members really need to concentrate on in the future.
William Kroen is a professor of biology at Wesley College, 120 N. State Street, Dover, DE 19901; e-mail: email@example.com.
I thank co-instructor K. Curran and the seven students who made this class and project an enjoyable experience. Curran and two anonymous reviewers also made improvements to this manuscript. I thank D. Carter (PI), J. Hewes, and L. Dye (data editing and archiving) at the Delaware National Estuarine Research Reserve, St. Jones River Component, for making these data accessible.
Basey, J.M., T.N. Mendelow, and C.N. Ramos. 2000. Current trends of community college lab curricula in biology: an analysis of inquiry, technology, and content. Journal of Biological Education 34(2):80–86.
Bowen, G.M., W.-M. Roth, and M.K. McGinn. 1999. Interpretations of graphs by university biology students and practicing scientists: Toward a social practice view of scientific representation practices. Journal of Research in Science Teaching 36(9):1020–1043.
Friel, S.N., F.R. Curcio, and G.W. Bright. 2001. Making sense of graphs: Critical factors influencing comprehension and instructional implications. Journal of Research in Mathematics Education 32(2):124–158.
Janick-Buckner, D. 1997. Getting undergraduates to critically read and discuss primary literature. Journal of College Science Teaching 27(1):29–32.
Koprowski, J.L. 1997. Sharpening the craft of scientific writing. Journal of College Science Teaching 27(2):133–135.
Liu, J., D.T. Pysarchik, and W.W. Taylor. 2002. Peer review in the classroom. BioScience 52(9):824–829.
Robinson, J.M. 2002. In search of fairness: An application of multi-reviewer anonymous peer review in a large class. Journal of Further and Higher Education 26(2):183–192.
Shah, P., and J. Hoeffner. 2002. Review of graph comprehension research: Implications for instruction. Educational Psychology Review 14(1):47–69.
Switzer, P.V., and W.M. Shriner. 2000. Mimicking the scientific process in the upper-division laboratory. BioScience 50(2):157–162.