SAGE Musings: Using Exams to Promote Deep Learning

Christopher DiLeonardo, De Anza College
published Sep 27, 2018 10:26am

Over my years of teaching, I have learned two lessons:

  1. We don't teach; students learn. All real learning is active, a biological process that occurs in the brain. What we do is create educational environments and experiences that foster learning.
  2. Though we should never teach to the test, (most) students learn to the test.

In my Introductory Geology class, I leverage these lessons to engage my students in deep learning via case-based, partially collaborative, multiple-choice exams. I know that using multiple-choice exams to promote deep learning might sound impossible. Bear with me; let me describe how this model works for me and show you some examples, and you can judge for yourself.

The course: Introductory Geology

My Introductory Geology course has an integrated lecture and laboratory: it meets 4 hours a week in lecture and 3 hours a week in laboratory, and collaboration on lab assignments and other work is the norm. The curriculum mirrors the modern physical geology course offered at most colleges, emphasizing the Earth processes that act to change the crust and surface of our planet. The course is divided into two parts: the first covers active processes, and the second looks at how those processes are reflected in the rock record. A midterm exam emphasizes the first part of the course and the final exam focuses on the second. These exams share a common approach, with an emphasis on conceptual understanding, application, and the ability to build on what students have learned.

Case-based Exams and the Game of Stones

I've always facilitated higher-level thinking on exams by using a case-based approach. This past year I have used the geology of the world of the popular HBO series Game of Thrones as the physical setting. It was a bit of a daunting task to create the geology of a world in which the tectonic framework, seismic activity, volcanic distribution, topography, climate, structural geology, rock distribution, etc. are all mutually consistent. But base maps were available, and a doctoral student at Stanford had mused about the geology online. I exchanged emails with that doctoral candidate and then ran with it.

One week prior to each of the exams, my students download seven pages of high-resolution, full-color, publication-quality .pdf files. I tell them to download the files, print them out, and build a packet that they are free to write on and use during the exam. For the midterm, this packet includes seismograms, seismicity maps and the locations of tectonic plates, topographic maps, and a base map of the world. For the final exam, the packet includes base maps, outcrop photos with close-up inset pictures of rock samples, cross-sections, columnar sections, and geologic maps. The high-resolution .pdf files are online so that students can zoom in and pick out details on the computer and make notes directly on their packets. I tell the students that the questions on the exams will be directly related or ancillary to the geology and data presented in these reference materials. I encourage them to collaborate in their preparation.

Partially Collaborative

Numerous studies in undergraduate math and science education have shown the tremendous value of collaborative learning (e.g., Treisman, 1992, and many more). I've embraced that in these exams. At first, I encouraged collaboration only in preparing for the exams. What I found was that overall student success increased, but my students from "targeted populations" -- demographic populations identified by the college as having lower overall success rates -- were not benefiting as much. They seemed more marginalized in the class and appeared to be less engaged in collaboration. Now I include a collaborative portion of the exam, "Part A," which I distribute at the same time as the exam packet.

What I noticed, anecdotally, is that ALL of my students became more engaged. I have never seen anyone work as hard as these students, who felt they were "getting something for nothing." Over a three-year period, I saw success rates in my targeted populations climb by 3 to 4% each year.

Part A of the exam is worth half of the points, and the students complete it collaboratively, working in small groups. Part B of the exam is administered in class and taken independently, like a traditional exam. Students use their own packets, annotated with their interpretations, for Part B.

My colleagues sometimes suggest to me that the collaborative portion of the exam should be worth fewer points than the in-class portion. The truth is that we already give points for homework, labs, and in-class activities and discussions that add up to as many points as an exam. Get rid of the points for "homework" and replace them with something you call a "collaborative exam." The investment I see in collaboration with other students has been nothing short of amazing. And because many students learn to the test, you are focusing their learning where you want it.

When a student simply copies answers from other students for Part A of the exam, it is easy to recognize and is self-correcting: like students who copy other students' homework, they fail the second half of the exam (Part B). Additionally, because of the similarity in style and approach between the two sections, you can rule out the idea that the student simply "doesn't test well." A plot of Part A vs. Part B exam scores lets you see whether each student's performance improves, stays consistent, or suggests copying between the collaborative and non-collaborative parts of the exam.
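
As an illustration (not my actual workflow), such a plot takes only a few lines of Python; the scores below are invented placeholders rather than real gradebook data:

```python
# A minimal sketch of the Part A vs. Part B diagnostic plot.
# The scores are made-up placeholders, not real student data.
import matplotlib.pyplot as plt

part_a = [88, 92, 75, 95, 90]  # collaborative (Part A) scores, percent
part_b = [82, 88, 70, 52, 85]  # independent (Part B) scores, percent

plt.scatter(part_a, part_b)
plt.plot([0, 100], [0, 100], linestyle="--")  # parity line: A == B
plt.xlabel("Part A (collaborative) score")
plt.ylabel("Part B (independent) score")
plt.title("Part A vs. Part B exam scores")
plt.show()

# Points near the dashed line reflect consistent performance; a point
# far below it (e.g., the 95/52 student) flags someone who may simply
# have copied answers on the collaborative portion.
```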

Multiple-Choice

Both parts of the exam emphasize higher-level learning focusing on conceptual understanding, application, and critical thinking. The exam questions are in a multiple-choice format. This format has a number of advantages over open-ended questions:

  1. It allows me to do a comprehensive, objective statistical analysis of student understanding across a range of topics in the class;
  2. It focuses student responses so that weaknesses in conceptual understanding are clearly identified; and
  3. It levels the playing field between native and non-native English speakers by eliminating the inherent writing bias of essays.

I find the ability to do rigorous statistical analysis -- linking questions with similar characteristics, or questions covering similar concepts -- a powerful assessment tool for evaluating student learning in the class as a whole.
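
As one example of what such analysis can look like, here is a minimal item-analysis sketch in Python, computing per-question difficulty and a crude discrimination index; the response matrix is an invented placeholder, and a real analysis would use the full class data:

```python
# A minimal sketch of item analysis for a multiple-choice exam:
# per-question difficulty and a crude discrimination index.
# The response matrix is a made-up placeholder (1 = correct).
from statistics import mean

responses = [        # rows = students, columns = questions
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

totals = [sum(row) for row in responses]  # each student's total score

for q in range(len(responses[0])):
    item = [row[q] for row in responses]
    difficulty = mean(item)  # fraction of students answering correctly
    # Crude discrimination: mean total score of students who got the
    # item right minus the mean total of those who got it wrong.
    right = [t for t, r in zip(totals, item) if r == 1]
    wrong = [t for t, r in zip(totals, item) if r == 0]
    disc = (mean(right) if right else 0) - (mean(wrong) if wrong else 0)
    print(f"Q{q + 1}: difficulty = {difficulty:.2f}, "
          f"discrimination = {disc:+.2f}")
```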

Examples of Exam Questions

By giving students a packet of reference materials to study, discuss, and annotate well in advance of the exam, I make it possible to write multiple-choice questions that probe their conceptual understanding and their ability to apply geological concepts to new territory. The Midterm Exam Packet for Game of Stones includes:

  • Figure 1: a map of plates and seismic distributions for the planet
  • Figure 2: a map of the seismic recording stations maintained by the Maesters of the Seven Kingdoms and seismograms from three of the stations, recording ground motion from a single event
  • Figure 3: travel time curves for the crust in this region

Referring to these figures -- and remember, the students have high-resolution versions of every figure -- I can ask multiple-choice questions such as the following:

  • What is the S-P interval for the seismogram recorded at the Maesters' station at Myr (MYR)? (Answers range from 22 to 58 seconds)
  • What is the Focal Distance (DF) for the seismogram recorded at the Maesters' station at Myr (MYR)? (Answers range from 270 to 570 km; see the sketch after this list)
  • In which of these cities would you have felt the P-wave of this earthquake first? (Several cities listed)
  • Could this be the biggest possible earthquake in the Seven Kingdoms? (Answers include yes, no, and "only if ....")
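
For readers curious how an S-P interval converts to a focal distance: on the exam, students read the distance off the travel-time curves in Figure 3. As an illustration only, here is a minimal Python sketch of the underlying relationship, using assumed generic crustal velocities (Vp = 6.0 km/s and Vs = 3.5 km/s, which are assumptions rather than the values for this fictional crust):

```python
# Hypothetical illustration: converting an S-P interval to a focal
# distance. The exam uses the travel-time curves in Figure 3; the
# velocities below are assumed generic values, not those of the
# Game of Stones crust.

VP = 6.0  # assumed P-wave speed, km/s
VS = 3.5  # assumed S-wave speed, km/s

def focal_distance(sp_interval_s):
    """Distance (km) from the S-P interval (s).

    From t_p = D/VP and t_s = D/VS, the interval is
    dt = D * (VP - VS) / (VP * VS), so D = dt * VP * VS / (VP - VS).
    """
    return sp_interval_s * VP * VS / (VP - VS)

for dt in (22, 40, 58):
    print(f"S-P interval {dt:2d} s  ->  ~{focal_distance(dt):.0f} km")
```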

What you might note in these questions is a mix of mostly higher-level questions with a few knowledge-based ones, weighted to reflect their importance relative to the overall set.

Parting Thoughts

Using this approach over the years, I have seen higher-level learning, greater student success, and overall greater student engagement. It turns the week-long exam process into a culminating learning experience rather than simply an assessment tool. In general, my approach to exams is based on those two important lessons discussed at the beginning of this post: we don't teach, students learn; and students learn to the test. I've leveraged these principles to motivate my students to learn and perform at a higher level. In order to adopt this model, I had to break through my own bias about what exams are "supposed to be about" and remember that in all that we do, in the end it's about student learning.


