
Peer feedback for active student involvement

This article will show an example of peer feedback for active student involvement.

Paired Groups divide work for assignments

Course: YRM20306 Research Methodology in Environmental Science

Period: Period 1 and 2

Short Introduction and background

Introduction and the story behind the emergence of this learning activity. What was the need, and what issue or problem did the teacher face and want to resolve?

One of the main learning activities in this course is a group assignment where students have to come up with a conceptual framework for a research project. Although each group works on a case of their own choice, all assignments are structurally identical and build up through multiple stages.  

Figure 1: Peer feedback scheme for active student involvement.


Unfortunately, in the past students often tried to lower their individual workload by simply dividing the work instead of seeking meaningful collaboration. Students would fail to understand the relation between different aspects of research design because each student would just focus on one or two of those aspects. When we asked groups to provide feedback, the task was given to one person or similarly segmented. This made us decide to implement a peer feedback system in which every individual student assesses the entire assignment of another group at three stages in its development. Each group also has to synthesize the individual feedback it receives to support its revisions. 

It turns out that using peer feedback as the motor for the group assignment has an additional benefit. Students no longer look to their teachers to ‘tell them what to do’ before having critically reflected on the work themselves. Now, when they do seek our help, it leads to more interesting conversations about the trade-offs between various methodological choices. They show far more initiative and take greater responsibility for both the group process and the group product. Despite the extra workload, the individual peer feedback procedure was evaluated positively by students.

Relevant tools / apps (software) or hardware used

Brightspace and MS Teams (see Lessons learned / Tips below).

Learning outcome(s)

What has been learned after this lesson/activity has been executed?

The group assignment links to several learning outcomes of the course, but most importantly to: 

After successful completion of this course, students are expected to be able to contribute to interdisciplinary research designs for the environmental sciences 

A more implicit objective was to make students value each other as sources of knowledge and not just defer to authority. 


Having students grade peers’ work using guiding questions provided by the teacher, and then debate the merits of the comments made, allowed us to call attention to important learning objectives. For example, the fit between different proposal components was often overlooked before, since it is very difficult to teach in the abstract or to test by conventional means.

The peer feedback system can be applied to different kinds of learning outcomes but is especially suitable for higher cognitive outcomes.

Lesson idea / Learning activity

Specific description and demonstration of the lesson idea/learning activity.


The peer feedback workflow

Our students work in groups of four, and two groups are paired together; let’s call them group A and group B. Every Wednesday, groups A and B hand in a part of their group assignment, and the feedback process is as follows:

  1. Each member of group A anonymously reviews the assignment of group B, and vice versa. Each student does this in a separate document.
  2. On Thursday each student has to read 2 of the 4 reviews their group has received. To streamline this process and make sure everyone's review is read at least once while maintaining anonymity, we assign member numbers to individuals within groups and indicate, for each round, whose review each member should read (one possible rotation scheme is sketched below the list).
  3. On Friday both groups work in the same room/online space and first read and integrate the feedback within their group. After an hour, the groups exchange one or two members to answer questions as each group continues to interpret and synthesize feedback received. 
  4. Afterwards, each student rates each of the reviews they read according to how useful, clear and substantiated it was. Each review is thus assessed by at least two members of the partner group.

This process is repeated four times during the six-week course.
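For illustration only, here is a minimal sketch, in Python, of one rotation scheme that could implement steps 2 and 4: which of the partner group’s reviews each member reads and rates in a given round. It assumes four-person groups and exactly two readers per review; it is our own example, not the tooling actually used in the course.

    # Sketch only: schedule which anonymous reviews each student reads and
    # rates in a given feedback round. Assumes groups of four whose members
    # are known to the partner group only by member number (1-4).

    def reading_schedule(round_nr, group_size=4):
        """Map each reader (member number) to the two reviewer numbers whose
        reviews they read this round. Rotating with the round number varies
        the pairings over the course while giving every review exactly two readers."""
        schedule = {}
        for reader in range(1, group_size + 1):
            first = (reader - 1 + round_nr) % group_size
            second = (first + 1) % group_size
            schedule[reader] = (first + 1, second + 1)
        return schedule

    # Example: print the schedule for the four feedback rounds.
    for round_nr in range(1, 5):
        print(f"Round {round_nr}: {reading_schedule(round_nr)}")

A spreadsheet works just as well; the point is that a fixed rotation keeps the reading load balanced and every review assessed twice without revealing identities.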

The teacher’s role

To facilitate the peer feedback process we provide students with a list of guiding questions. We explain and exemplify these questions in one of the first sessions. These are exactly the same questions we as teachers use for the summative assessment of the final version of the group assignment, which makes the assessment transparent. 

 

We are present during the on-campus (or online) collaborative integration sessions on Friday. It is astonishing how good the discussions in these sessions are. Because students give and read feedback before class, they are really well prepared. They only come to us if they cannot solve an issue or reach an agreement within the group, which has greatly reduced our workload and has made the conversations we do have far more interesting.

 

Last but not least, we validate the feedback evaluations that students give each other (step 4 in the workflow). We randomly look at 1 of the 4 feedback documents each student has written. The peer evaluations and the teacher evaluation together are used as a modifier for the final grade of the group project. This double check seems necessary because students’ assessments of their peers tend to be very generous.

Creating extra learning opportunities

Our peer feedback workflow creates at least three extra learning opportunities compared to the old situation. 

  1. Students give feedback based on guiding questions and are encouraged to link their feedback to the teaching material. They thus actively apply what they learned. 
  2. In the collaborative integration session, students really have to assess and debate the received feedback. They have to actively link what they read to what others read and come to a conclusion. 
  3. By evaluating the quality of other students’ feedback they again have to think about what they learned from each other. 

 

These extra learning opportunities lead to a significantly higher quality of the final reports. Even though we intensified the course, fewer groups failed the assignment and the number of middling and confusing assignments that take a lot of time to grade has been sharply reduced.  

Lessons learned / Tips

Tips the lecturer has for colleagues, based on their experience.

What we will improve

  • We learned through trial and error that the evaluation of received feedback should be kept rather simple. Because students are still in the learning process, they don’t always see whether the feedback they received was complete. At the meta-level, a teacher’s perspective is still valuable. For this reason, we will also maintain the instructor feedback moment on the first part of the group assignment.
  • The closely structured individual peer feedback cycles eliminated opportunities for students to avoid work, but we worry that closing these loopholes has increased the workload too much. Now that we better understand what it takes to do what we expect, we are thinking about eliminating a feedback round, letting go of the textbook and/or pre-selecting the cases for the group assignment. Students currently spend a lot of time finding data for their specific case, which is not a learning objective of this course.

What we will keep

  • We are definitely planning to keep the flipped approach after Covid-19. Having students study and prepare online and using the in-class time for high-quality discussions turns out to be a substantial improvement. High expectations, combined with a highly structured learning environment, high instructor involvement, and smart use of incentives, really seem to work. 
  • We hired a Teaching Assistant to keep an eye on the group process and identify group dysfunction at an early stage. She was present at the Friday meetings and monitored participation via Progress in Brightspace and the channel activities in MS Teams. In addition to being very useful in more typical TA functions, her presence saved at least three groups from serious problems.
  • Stable pairings of groups whose members got to know each other encouraged individuals to do the work, take it seriously and submit on time. 
  • Randomly grading one of the four student reviews kept the workload manageable for the instructors and provided an effective incentive for students. 

What to watch out for

  • The grades students gave for the reviews did not correlate that well with the grades we gave those same reviews. This did not trouble us, as the act of grading the reviews had other purposes: evaluation (from Bloom’s taxonomy) and salience (if it is graded, Pavlovian conditioning means students pay attention).
  • It was very difficult to maintain anonymity since, for example, Teams by default makes visible who submits a file (so remember to turn that off!). 
  • No system we could find supports individual review and assessment of the work done by a paired group. Managing this was labour intensive and prone to errors. 
  • Unless compelled to do otherwise, students will use the list of guiding questions as a questionnaire and simply state ‘yes’ and ‘no’ where the items are supposed to prompt deliberation informed by course material. 

Contacts

Teacher(s): Peter Tamas, Viktor Emonds
TLC contact (on MS Teams):
Jolanda Soeting
Author (interviewer): Sanne Mirck 

Attachments

  • Thought sharing with Peter and Viktor about peer feedback for active student involvement (video below)

