
How to Use ConcepTests

The effective use of ConcepTests follows a simple protocol:

  1. The instructor presents a short lecture, typically lasting from 5-15 minutes.
  2. A ConcepTest is then posted on the chalkboard or screen. (See example ConcepTests.) Students consider the question individually for a short time (30 seconds to 1 minute) and choose an answer.
  3. Students may indicate their answers using a variety of methods. They may
    • raise their hands as choices are presented by the instructor (Kovac, 1999; Landis et al., 2001).
    • be given large colored or lettered answer cards to display (see at right; Uno, 1996; Mazur, 1997; Jones et al., 2001; Meltzer and Manivannan, 2002).
    • write answers on answer sheets (Piepmeier, 1998; Rao and DiCarlo, 2000).
    • use an electronic classroom response system ("clickers"; Wenk et al., 1997; Crouch and Mazur, 2001; Cox and Junkin, 2002; McConnell et al., 2003; Greer and Heaney, 2004).
  4. The instructor then evaluates the student responses. The optimal range of correct responses is 35-70% (Crouch and Mazur, 2001). When responses fall within this range, students are instructed to discuss the reasons for their choices with their neighbors (peer instruction) in pairs or small groups for 1-2 minutes (Mazur, 1997).
    Dr. McConnell offers some advice for educators who want to use this technique. (2:40 min) Part of the MERLOT Elixr Case Story on ConcepTests.
    • If fewer than 35% of the responses are correct, students typically do not understand the topic well enough to discuss the subject or the question is unclear or too difficult. In such cases, instructors may have students offer interpretations of what the question is asking and what information they would need to provide an answer. Such discussions may reveal inconsistencies with the question or gaps in student understanding.
    • If more than 70% of the class answers correctly on the first poll, the question may have been too easy. Additional discussion yields little improvement; most students who initially answered incorrectly simply switch to the clearly most popular choice without understanding why.
  5. Following peer instruction, the class is polled again. At this point, the instructor may either select a spokesperson to provide a brief explanation of the correct answer or the instructor may summarize the response for the class.
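The decision rule in steps 4-5 can be sketched in a few lines of code. This is an illustrative sketch only, not part of the original protocol description; the function name and return labels are hypothetical, but the thresholds follow the 35-70% range cited above (Crouch and Mazur, 2001).

```python
# Hypothetical sketch of the instructor's decision rule after the first poll.
# Thresholds (35% and 70%) are the ones given in the text; everything else
# (names, labels) is invented for illustration.

def next_step(percent_correct):
    """Suggest the next classroom action from first-poll results."""
    if percent_correct < 35:
        # Students likely don't understand the topic, or the question
        # is unclear: discuss what the question is asking instead.
        return "revisit question"
    elif percent_correct <= 70:
        # Optimal range: 1-2 minutes of peer instruction, then re-poll.
        return "peer instruction"
    else:
        # Question was probably too easy; explain briefly and move on.
        return "move on"

print(next_step(50))  # → peer instruction
```

In practice the boundaries are guidelines rather than hard cutoffs, as the text notes below, so an instructor would treat borderline results with judgment rather than mechanically.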

Instructors may adapt this protocol to their specific needs. If class time is at a premium, it may be appropriate to ask students to explain answers after step 4 when a majority have chosen the correct answer and no other response was selected by more than 10-15% of the students. Alternatively, the instructor may choose to skip from step 2 to step 5 and have students discuss answers without time for individual reflection (Mestre et al., 1997).

Each class is different. A question that elicits just 40% correct responses in a non-majors Earth Science class may be answered correctly by 75% of students in a Physical Geology course for majors. A class that emphasizes reading assignments or other daily homework may better prepare students for these questions than one where students are less likely to arrive having reflected on the day's lecture materials. Instructors are advised to avoid questions that consistently elicit correct response rates of 90% or higher; such questions consume class time without helping students learn.

Dr. McConnell explores why he doesn't grade student responses to ConcepTests. (2:05 min) Part of the MERLOT Elixr Case Story on ConcepTests.

ConcepTests and Grading

The percentages stated here should be considered guidelines, not requirements. The important point is to challenge students to confront the critical concepts that they will need to understand to be successful learners. This assessment process works best in a non-threatening, low stakes learning environment where the students are not penalized for getting the questions wrong. Consequently, it is recommended that ConcepTests are not used as the equivalent of quizzes or tests. Here are three ways that you might consider using ConcepTests.

  • ConcepTests are not graded and are not counted for participation. Students are aware that similar questions, sometimes the same questions, could be present on the exams. Students are given access to all ConcepTests used in class prior to exams.
  • ConcepTests are used to assign a participation grade. Students can earn up to 10% of the class grade on the basis of the number of days they answered ConcepTests (maximum points for 90+% participation), regardless of the accuracy of their answers. Multiple ConcepTests are presented to the students each day. An electronic response system can be used to track student participation. Students are given access to all ConcepTests used in class prior to exams.
  • ConcepTests are graded each day and account for a small proportion of the class grade (5-10%). Students score 1 point for each correct response and can earn up to 50 points for the semester. Multiple ConcepTests are presented to the students each day. More than 100 ConcepTests are used during the class. Students are given access to all ConcepTests used in class prior to exams.
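The second option above, participation credit, can be made concrete with a short calculation. This is a hypothetical sketch, not the author's actual grading formula: the text specifies only that up to 10% of the course grade is earned by participation, with full credit at 90%+ of class days regardless of accuracy; the linear scaling below full credit is an assumption.

```python
# Hypothetical participation-grade calculation for the second option above.
# From the text: up to 10% of the course grade, full credit for answering
# on 90% or more of class days, accuracy not counted. The linear ramp
# below the 90% mark is an assumption for illustration.

def participation_points(days_answered, total_days, max_percent=10.0):
    """Return the share of the course grade (0 to max_percent) earned."""
    rate = days_answered / total_days
    credit = min(rate / 0.90, 1.0)  # full credit at >= 90% participation
    return round(max_percent * credit, 2)

print(participation_points(36, 40))  # 90% of days answered → full 10.0
```

An electronic response system makes `days_answered` easy to track automatically, which is why the text pairs this option with clickers.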

Graph of data from four Earth Science classes taught by the module author in Fall 2007 and Spring 2008 with similar general education student populations, in similar classrooms, using the same teaching strategies.

The module author has used ConcepTests in his courses over a period of years. Anonymous student comments have been almost uniformly positive about the experience with this teaching strategy in Earth Science and Physical Geology classes. The average student success rate (the daily proportion of correct student answers averaged over the semester; see graph) on ConcepTests in the Earth Science class was between 50% and 70%. In other words, students correctly answered roughly half to 70% of the ConcepTests they attempted. Out of more than 400 Earth Science students, none averaged above 90% for the semester; the highest recorded student average was 86% and the lowest was 28%.