Peer Review of Teaching Activities

Cutting Edge began a comprehensive review of all of its teaching collections in 2011, with the goal of developing a coherent set of teaching activities that will serve the geoscience education community for years to come. The review process was first implemented in conjunction with the Teaching Mineralogy, Petrology, and Geochemistry in the 21st Century workshop and has since grown to cover both the original On the Cutting Edge collection and other Earth education materials hosted by SERC and found in the Teach the Earth portal. In the process, workshop participants review a set of teaching activities, using a rubric to help calibrate their scoring. Five elements of each activity are reviewed:

  • scientific veracity
  • alignment of goals, activity, and assessment
  • pedagogical effectiveness
  • robustness (usability and dependability of all lesson components)
  • completeness of the ActivitySheet web page for the activity

All reviews are conducted online via a web-based management tool, with each participant assigned 5-10 activities to review. As participants complete their reviews, they enter scores for each element and provide constructive comments to help the author address any noted deficiencies. Each activity receives two independent reviews. The review process is managed by the Cutting Edge PI Team and a number of associate editors selected from the workshop community. The associate editors maintain communication with the reviewers to ensure that the process moves ahead smoothly. The PI Team has ultimate responsibility for assigning the final ratings and communicating with the activity authors. Based on the results of the reviews, each activity receives one of four possible ratings:

  • Exemplary: Activities in the Exemplary Collection have received Exemplary or Very Good scores in all five categories and must have been rated Exemplary in at least 3 of the 5. It is expected that no more than 10-20% of the activities in the collection would be awarded an Exemplary rating in this process. Aggregate scores of 18-20 are required.
  • Reviewed: Activities in the Reviewed Collections have received positive reviews in all five categories, consisting mostly of Very Good scores with possible Exemplary or Adequate scores in one or more areas. Authors with activities in this collection will receive recommendations from the reviewers and associate editors for ways of improving their activity so that it can be brought into the Exemplary Collection. Aggregate scores of 12-17 are required.
  • Activity Idea: An activity with this rating contains the nucleus of a good teaching activity, but in its current form does not include sufficient information to be widely used in geoscience classes. Authors will be encouraged to invest energy in further developing the activity so that it becomes part of the Reviewed Collection.
  • Deaccession: These activities contain serious deficiencies which would be difficult or impossible to remedy. They will be removed from the teaching activities collection entirely.
View Exemplary Activity Collection »
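As a rough illustration of how the rating rules above combine, here is a minimal sketch in Python. The aggregate-score cutoffs (18-20 and 12-17) and the "at least 3 of 5 Exemplary" condition come from the descriptions above; the numeric scale (Exemplary = 4, Very Good = 3, Adequate = 2, below Adequate = 1) and the handling of the two lowest ratings are assumptions, since the source gives no numeric cutoff separating Activity Idea from Deaccession.

```python
def assign_rating(scores):
    """Map five per-category review scores to a collection rating.

    Assumes a hypothetical 1-4 rubric scale per category:
    Exemplary = 4, Very Good = 3, Adequate = 2, below Adequate = 1.
    """
    if len(scores) != 5:
        raise ValueError("expected one score per review category (5 total)")
    total = sum(scores)
    # Exemplary: Exemplary or Very Good in all five categories, Exemplary
    # in at least 3 of the 5, and an aggregate score of 18-20.
    if min(scores) >= 3 and scores.count(4) >= 3 and 18 <= total <= 20:
        return "Exemplary"
    # Reviewed: positive scores in all five categories, aggregate 12-17.
    if min(scores) >= 2 and 12 <= total <= 17:
        return "Reviewed"
    # No numeric cutoff is given for the remaining ratings; distinguishing
    # Activity Idea from Deaccession is left to the PI Team and editors.
    return "Activity Idea or Deaccession (editorial judgment)"
```

For example, an activity scored [4, 4, 4, 3, 3] would land in the Exemplary Collection, while one scored [3, 3, 3, 3, 3] would be Reviewed.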

Activity Review During Workshops

The process by which activities are reviewed at On the Cutting Edge activity development workshops is similar regardless of whether it occurs at face-to-face or virtual workshops.
  • Either before or during the workshop, participants submit an activity they have developed or adapted for use in their classroom.
  • Participants review each other's activities and comment on them using a rubric. Information on the review criteria and rubrics used is available to participants on pages such as this one from the 2010 Teaching Service Learning in the Geosciences workshop.
  • The reviewer and author discuss the reviewer's comments on the activity.
  • Authors are encouraged to revise their activity, both during and after the workshop, based on the feedback they receive.

Beyond necessary technical differences in the process, there are also functional differences between review at face-to-face and virtual workshops. Virtual workshops typically consist of multiple synchronous sessions spread over an extended period. The first several sessions are usually devoted to exploring the workshop topic in depth to give participants a grounding in the state of knowledge in the field. Between these synchronous sessions, participants are typically tasked with developing, reviewing, and/or revising teaching activities and other workshop products for the website. The last synchronous session(s) allow participants to showcase their work to the rest of the group and explore next steps.