Reviewing and Assessing Student Responses

Initial Publication Date: January 7, 2011

You will need time to read and process student responses to Just-in-Time Teaching questions before the upcoming class period. How much time this takes depends on the complexity of the questions, the number of students in the class, and how quickly you can process the information. Most instructors require students to submit responses a few hours prior to class, or perhaps the evening before, especially for an early-morning class.

Looking for Patterns to Inform In-Class Exercise Development

After students submit their JiTT responses, review them promptly, looking for misconceptions, incorrect reasoning processes, or other learning gaps that can inform the activities you will include in your next class. As you go through the responses, you will most likely find clusters of responses that highlight similar learning challenges - some you were not even aware of! Select representative responses from each of these clusters to show (anonymously) at the start of the next class and to use in developing related hands-on in-class learning activities. It is useful to start the class period with a discussion of the representative responses as a warm-up for these in-class activities, which are aimed at improving students' understanding of the concepts included in the JiTT exercise.

The short time span between student-response submission and in-class activities developed from those responses is what gives Just-in-Time Teaching its sense of immediacy.

In general, by the time you begin reviewing students' JiTT responses you will already have done most of the planning for your upcoming class period. Students' JiTT responses will inform your decisions about which activities to include, how much time to spend on them, and how to involve all students in the learning process. For example, if you see a pattern of misconceptions or a lack of understanding of a key concept as you read students' answers, you should plan active-learning activities that directly address those topics. This can be done in a variety of ways - showing sample student responses representing conflicting ideas at the start of class and using them to lead whole-class or small-group discussions, asking student groups to expand on incomplete answers, or designing activities that directly confront students with the learning gaps highlighted in their responses.

Evaluating Students' Responses - Grading for Effort

JiTT works best when the exercises are explicitly included as a percentage of students' course grades, typically at least 10%. Instructors use a variety of methods to assign credit for JiTT responses. However, given that JiTT exercises refer to material not yet covered in class, responses to JiTT questions are generally graded more on the level of effort than on the accuracy of the answer.

A JiTT Rubric

Kathy Marrs (IUPUI) has developed a 4-point rubric for scoring JiTT responses (adapted from de Caprariis et al., 2001, and illustrated below) that blends both correctness and effort in its criteria. Using a rubric that is shared with students ahead of time greatly speeds the grading process and makes JiTT assessment clear and transparent to students. You can grade the responses while reviewing them or wait until after the upcoming class, but the grades should be posted as soon as possible to provide prompt feedback to students on their performance.

JiTT Scoring Rubric

Ideas for Managing JiTT Grading in Large Classes

If you have very large classes, it may be difficult for you to grade all of your students' responses for every JiTT exercise. There are several ways to decrease the amount of grading you do, while maintaining the number of JiTT exercises you use:

  • You can grade only a random selection of the JiTT exercises you assign during the course.
  • You can grade responses from a random selection of students for each exercise.
  • You can award grades for completion, rather than accuracy of answers.