Jennifer Blake


About Me
Jennifer Blake teaches biology, genetics, and AP Biology at the Lovett School. She has been teaching at Lovett since 2005. She has a B.S. in Applied Biology from Georgia Tech and an M.Ed. in Science Education from Georgia State. Through her teaching, she strives to inspire curiosity in her students about the living world around them. She also hopes to improve students' scientific reasoning and analytical skills. She can be reached at jblake@lovett.org.

Takeaways / Big Ideas
As a result of my work this year in the CFT EE Ford Assessment Cohort, I have implemented a number of changes in my classroom, many of which have played a role in my action research. For example, I have used formative assessment more frequently and deliberately in my classes. I have a much greater understanding of what formative assessment is--assessment for the purpose of making instructional decisions. I wanted to incorporate formative assessment into the process of writing formal lab reports, and this is one reason I decided to do my action research on formative self- and peer assessments while writing lab reports.

I have also internalized the importance of clarifying learning goals--for myself, as well as for my students. I have begun using rubrics more effectively to communicate my expectations, and I incorporated this strategy into my action research on formal lab reports as well. I have learned the importance of providing rubrics to students at the beginning of a project or lesson, as well as the components of a high-quality rubric. In addition to clearly spelling out expectations at the beginning of an assignment, I have grown in my efforts to provide quality feedback on student work. Along with suggestions for improvement, I try to be sure to point out the positive aspects of student work, and to make my feedback as specific and actionable as possible. Our study of the importance of feedback in the cohort is another reason I decided to study self- and peer assessment on lab reports. Indeed, one benefit of peer assessment cited in the educational literature is that it allows students to receive large quantities of timely feedback (Topping, 1998).

In preparation for my action research on self- and peer assessment, I consulted a number of resources (see below for a list). Studies have shown rubric-referenced self-assessment to be an effective method of improving student achievement--more effective than the use of rubrics alone (Andrade & Valtcheva, 2009). Andrade spells out three steps for conducting an effective rubric-referenced self-assessment. First, the teacher must clearly communicate expectations to students, preferably walking them through the process of using the rubric to critique samples of student work. Next, students create a first draft of their assignment and compare it to the rubric using the modeled method. Finally, students must be given the opportunity to revise their work in order to gain the largest benefit from the self-assessment process. This method has been found to be most effective when used for formative, as opposed to summative, purposes (Andrade, 2007/2008; Andrade & Valtcheva, 2009). In addition to improving their performance, students have reported that rubric-referenced self-assessment has helped them feel more confident, less anxious, and more motivated, and has helped them learn to identify strengths and weaknesses in their own work (Andrade & Valtcheva, 2009).

While rubric-guided self-assessment has been shown to positively impact student performance, studies suggest that peer assessment may be a more reliable assessment strategy (Topping, 1998). Additionally, a 2009 study of a peer-review program for undergraduate science lab reports found peer assessment to be significantly more effective than self-assessment alone at promoting revisions, as measured by the number of revisions students made to their reports following review (Trautmann, 2009). In addition to gains in performance, peer assessment has been shown to improve student confidence, motivation, and critical thinking (Trautmann, 2009). Formulating and responding to intelligent questions is believed to lead to deeper student understanding. A number of teacher actions can increase the effectiveness of peer assessment: the criteria must be clearly communicated to students, and training must be provided both on the procedure students will use for peer assessment and on the interactions expected between the assessor and the assessed. The teacher must also monitor the peer assessment process closely, correcting any problems that arise, especially among students inexperienced with the procedure (Topping, 1998).

**My Action Research**

AR Overview
I have found over the years that students frequently struggle with scientific communication in formal lab reports. In the past, students have struggled to convey all pertinent details in a clear and concise manner and to display their data clearly. They also frequently have difficulty supporting their conclusions with direct evidence gathered in the experiment, as well as with research conducted using outside sources. I would like to see improvement in these areas, since students will be asked to use these skills throughout their science studies.

An additional challenge I have noted is that, while I write feedback on the students’ reports and return them, not all students incorporate this feedback consistently into their future reports. I would like to identify strategies for helping students internalize scientific communication skills.

As a result of my work this year through the cohort, I decided to implement a two-part action research plan to address these concerns. Students completed two formal lab reports in groups of two or three. Following the first report, student groups were given the opportunity to engage in rubric-guided self-assessment and turn in a revised version of the report. Following the second report, student groups were given the opportunity to engage in rubric-guided peer assessment and turn in a revised version of the report. As the rubric-guided assessments were performed during class, each student group was required to engage in assessment. However, the revision of the report following the assessment was voluntary.

**AR Question** How do rubric-guided self-assessment and peer-assessment, followed by rewrites, impact students' performance on formal lab reports?

AR Process
My action research was performed in my three sections of ninth grade biology during spring semester 2013. Students completed two formal lab reports in groups of two or three during the semester. The first lab was completed in January and the second in April. These two formal lab reports served as the foundation for my action research.

Prior to assigning the first report, I improved my rubric and guidelines sheet by modifying previous versions to be as clear and specific as possible. When assigning the first report, I gave each student a copy of these documents, and discussed my expectations with them.





Upon completion of the first lab report, students were given the unannounced opportunity to participate in a group self-assessment of their report. Student groups self-assessed by completing the assessment document below as a group, and were instructed to list at least one strength and one weakness for each section. I provided students with a sample lab report, which we discussed and assessed together as a class to model effective use of the assessment document. Time was given for groups to discuss and assess their work during class. Students were then given several days to make any revisions and turn in a second, final draft, with their completed assessment document stapled to the back. I kept a copy of both the original report and the revised report, and compared the class averages on the two.



When the second report was assigned, students were reminded of the criteria and emailed the rubric and guidelines again. They were encouraged to review the feedback their first lab group had received. They completed the second lab and lab report in different student groups, though the group size (two or three) remained unchanged. When student groups turned in their second report, they were given the unannounced assignment of reviewing another group's report. All group assignments and group pairings were determined by the teacher in advance. Students used the same assessment document as before and were coached in its appropriate use when reviewing peers' work. Student groups were then given several days to make voluntary changes based on the peer feedback they received. Again, I kept a copy of both the original and revised reports and compared the class averages on the two.

AR Data Samples
Below is a graph of class averages on the first lab report of the year across several school years. Please note that the data from the 2010-2011 and 2011-2012 school years are averages for honors biology, which I am not currently teaching. I have not taught regular-level biology in a number of years, so these data were my best point of comparison. Note that the average on the unrevised lab for this year's class is on par with that of the honors classes from previous years.

After students self-assessed and resubmitted this report, the overall average increased by 2 points.

The average on the first draft of the second report was within one tenth of a point of the average on the first draft of the first report. After peer assessment, however, the average increased by 3.8 points.

AR Data Analysis
Looking over the student self- and peer assessment forms, I noticed that the self-assessments tended to offer somewhat vague suggestions for improvement, and that the suggestions were often grammatical in nature or otherwise of fairly minor significance. Here are some typical quotes from the self-assessment forms.

“Hypothesis needs work.” “Work on verb tense.” “Check spelling.” “Work on formatting.” “More about measurements.”

On the other hand, I noticed the peer assessment forms included more specific suggestions, as well as suggestions that would lead to larger-scale improvements in communication. A number of grammatical errors were pointed out on these forms as well. While some were general, many were very specific. Here are some typical quotes from the peer assessment forms.

“Be more specific on how and when measurements were taken.” “Include a results paragraph and bibliography section.” “Be more specific with the directions.” “The graphs are confusing. Try combining them into one graph.”

As you can see below, more suggestions were made, on average, on the peer assessments than on the self-assessments.



Below are results from an anonymous feedback survey I administered at the end of my research. The first question asks students which of the listed factors was most helpful while they were completing and editing their reports. Students were allowed to pick more than one answer. The second asks students to compare the helpfulness of self- and peer assessment directly.



As you can see, students found the guidelines sheet I handed out with the assignment to be most helpful, followed by the rubric. They found peer assessment to be more helpful than the feedback I provided, and self-assessment to be the least helpful. When asked to compare peer assessment and self-assessment directly, students again picked peer assessment as more helpful, though many thought the two were about equally helpful.

The written student feedback was the most helpful to me. Students had very positive things to say about having access to the rubric prior to the due date. They were generally positive about both self- and peer assessments as well, although many who compared the two in their comments found the peer assessment more helpful. Here are some examples of what students said in the survey.

“I believe that having the rubric to work off of made the chances of receiving a good score for the lab report more obtainable.”

“The rubric gave me specific guidelines in what I was doing, and this helped by allowing me to put in what was required and then even add additional information.”

“It was very helpful to self-assess our lab report because we found many errors that we did not find before.”

“Our group just went back and looked over our lab and noticed little things…. The self-assessing helped, but not as much as having feedback from other people.”

“I loved having a peer assessment. When giving feedback to another group, I saw simple but very important stuff that my group had left out of our lab report.”

“The peer assessment was extremely helpful in that it gave a different perspective on how others understand and interpret the conclusions that were made in the lab.”

“We had lazy peer reviewers so they did not help as much as I would have liked.”

AR Conclusions
I concluded that, in this study, peer assessment was more effective than self-assessment at improving student performance on lab reports. Student scores improved by 2 points, on average, following self-assessment, and by 3.8 points following peer assessment--nearly double. I attribute this difference to the increased quality and quantity of feedback provided through peer assessment: not only were there more than double the number of suggestions, on average, following peer review, but the suggestions tended to be more specific and of greater significance.

As a result of this study, I plan to continue to hand out a rubric in advance of the due date. I also plan to use peer assessment on lab reports in the future, as well as find other opportunities to utilize this tool. In response to a few student concerns about the effort their peer assessors put into the process, I plan to consider ways to motivate all students to provide the best quality of feedback possible. I plan to keep the rewrite optional, and to keep the peer assessments formative.

Lit Review & Resources
Andrade, Heidi. "Self-Assessment Through Rubrics." //Educational Leadership// 65.4 (2007/2008): 60-63.

Andrade, Heidi, and Anna Valtcheva. "Promoting Learning and Achievement Through Self-Assessment." //Theory into Practice// 48.1 (2009): 12-19.

Hansen, Jette G., and Jun Liu. "Guiding Principles for Effective Peer Response." //ELT Journal// 59.1 (2005): 31-38.

Ross, John A. "The Reliability, Validity, and Utility of Self-Assessment." //Practical Assessment, Research, and Evaluation// 11.10 (2006).

Topping, Keith. "Peer Assessment Between Students in Colleges and Universities." //Review of Educational Research// 68.3 (1998): 249-76.

Trautmann, Nancy M. "Interactive Learning Through Web-Mediated Peer Review of Student Science Reports." //Educational Technology Research and Development// 57 (2009): 685-704.

Reflections
I have found my participation in the EE Ford Assessment Cohort to be a very valuable part of my professional growth. The deliberateness with which I think about assessments in my classroom has carried over into all aspects of my teaching, and my students are reaping the benefits. In addition to learning a number of useful assessment strategies, I also feel empowered to do actual research with my classes through the action research process. I would like to thank all who made this growth possible, especially our facilitators who worked tirelessly to support us.