Using Classroom Response Systems for Research on Learning

Nicole LaDue, Northern Illinois University, and Alix Davatzes, Temple University
published Mar 28, 2017

Do you use clickers in your classroom? If so, there are simple ways you can use the technology to pre-assess students' conceptual knowledge, ask students to make predictions, and provide timely feedback to support their learning throughout the course rather than waiting for the dreaded exam day.

The emergence of a new technology in the early 2000s allowed instructors to instantaneously collect student responses to multiple-choice questions and analyze the results on the spot. Clickers were first implemented as remote-control devices sold in kits. Today, most companies take advantage of Wi-Fi-enabled classrooms and smart devices to register student responses via web-based applications. These web-based student response systems (SRS) support a broader range of response options, including open-response and diagram questions (for a comparison of product capabilities, see http://socialcompare.com/en/comparison/student-response-systems). Advances in SRS have also improved the analytics available to instructors: real-time feedback, presented as tallies of multiple-choice responses, rankings of topics, and heat maps of student clicks on a diagram, is a powerful tool for guiding instruction.

When implemented properly, SRS supports an active learning pedagogy that can boost students' content learning and reduce failure rates (Freeman et al., 2014). Approaches to implementing SRS in the classroom range from graded quizzing to conceptual questions that facilitate peer learning (Caldwell, 2007). At the most basic level, when instructors highlight the learning objectives of their lecture by asking basic content questions throughout or at the end of a lecture, the most critical points become more salient for students. When correctness does not count toward a grade, these questions serve as formative assessment: students can gauge their own learning, and instructors gain insight into which concepts remain unclear. Research on using SRS in chemistry courses indicates that students make significant gains when questions are low-stakes (MacArthur and Jones, 2008). Another approach targets specific troublesome conceptions underlying science phenomena. ConcepTests were adapted for use in large-lecture geoscience classes (McConnell et al., 2006) to facilitate peer learning. In this case, instructors ask students to respond individually to a challenging conceptual problem that is not easily solved; students then discuss their answers with peers and re-answer the question using the SRS. This technique focuses students on known conceptual hurdles and provides gradual feedback that helps students refine their mental models of science phenomena.
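For instructors who export their SRS data, the shift between the individual vote and the post-discussion vote is easy to quantify. Below is a minimal sketch in Python; the record format, student IDs, and answers are invented for illustration and would need to be adapted to whatever export your SRS actually produces.

```python
from collections import Counter

# Hypothetical SRS export: one (student_id, round, answer) record per vote.
# "individual" is the first vote; "post_discussion" follows peer talk.
responses = [
    ("s01", "individual", "B"), ("s01", "post_discussion", "C"),
    ("s02", "individual", "C"), ("s02", "post_discussion", "C"),
    ("s03", "individual", "A"), ("s03", "post_discussion", "C"),
    ("s04", "individual", "B"), ("s04", "post_discussion", "B"),
]
CORRECT = "C"  # keyed answer for this ConcepTest

def fraction_correct(round_name):
    votes = [ans for _, rnd, ans in responses if rnd == round_name]
    return sum(ans == CORRECT for ans in votes) / len(votes)

print(f"correct before discussion: {fraction_correct('individual'):.0%}")       # 25%
print(f"correct after discussion:  {fraction_correct('post_discussion'):.0%}")  # 75%

# The distribution of wrong first votes hints at which misconception
# each distractor is capturing.
print(Counter(ans for _, rnd, ans in responses
              if rnd == "individual" and ans != CORRECT))  # Counter({'B': 2, 'A': 1})
```

A large jump between the two rounds suggests the peer discussion is doing real work; little or no jump may mean the question is too easy, too hard, or not hitting the intended conceptual hurdle.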

SRS also presents an opportunity for learning scientists to research students' conceptions. Pre-assessing students' conceptual understanding can confirm common non-scientific conceptions from the literature (Cheek, 2010) or reveal new systematic errors students make on previously unexplored topics. Using an SRS, conceptions can be probed at multiple points across the course: pre-instruction, directly post-instruction, and at the end of the course, to see whether students revert to their previous conceptual framework or have changed their framework to accommodate the new learning (Duit and Treagust, 2003). These sticky concepts are fodder for additional studies that probe students' understanding more deeply (e.g., open-ended interviews or design-based research).
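To spot reversion, it helps to follow each individual student's answer across the three probes rather than comparing only the aggregate tallies. Here is a small sketch, again with invented data and IDs, that flags students who answered correctly right after instruction but fell back to a distractor by the end of the course.

```python
# Hypothetical responses to one conceptual item asked three times:
# pre-instruction, directly post-instruction, and end of course.
answers = {
    "s01": ("A", "C", "C"),  # adopted the scientific conception and kept it
    "s02": ("A", "C", "A"),  # reverted to the initial conception
    "s03": ("B", "B", "B"),  # never changed
    "s04": ("A", "C", "A"),  # reverted
}
CORRECT = "C"

reverted = [sid for sid, (pre, post, end) in answers.items()
            if post == CORRECT and end != CORRECT]
retained = [sid for sid, (pre, post, end) in answers.items()
            if post == CORRECT and end == CORRECT]

print(f"retained the scientific conception: {len(retained)}")  # 1
print(f"reverted to a prior conception:     {len(reverted)}")  # 2
print("candidates for follow-up interviews:", reverted)        # ['s02', 's04']
```

Items with a high reversion count are exactly the sticky concepts worth pursuing with interviews or design-based research.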

SRS provides a mechanism through which faculty can quickly help students learn through prediction and immediate feedback. When students learn new information through lecture, reading, or experience, they generate an internal representation of that information (Shipley & Tikoff, 2016). SRS technology gives students the opportunity to quickly make predictions in the classroom and receive immediate information about the range of responses provided by their classmates. If students are then presented with the correct answer, they can engage in an analogical learning process (Gentner, 1983) in which they compare their prediction to the correct answer, allowing them to accept or refine their internal representation.

An example of this is a study by Resnick et al. (2017) in which students were shown a slide of the geologic time scale and asked to place Eons, Eras, or Periods onto a linear timeline. After making a prediction using clickers, they were shown the correct scaling of time. This exercise was repeated as each new Eon or Era was introduced over the course of several weeks, each time taking less than three minutes of class time. The iterative process helped students adjust their mental representations to accommodate the new information about geologic time. At the end of the semester, students were asked to estimate positions on a linear time scale spanning millions and billions of years. Students in the intervention section showed significant improvement compared to both the control sections and the sections where students made predictions without committing to them via the SRS technology. In addition, overall exam scores in the intervention section improved compared to both of those groups.

Substantial evidence supports the use of SRS technology to facilitate students' learning in undergraduate science. How could we extend the usefulness of clickers to understand what students know and how they learn? Have you tried something in your classroom?

References:

Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9-20.

Cheek, K. A. (2010). Commentary: A summary and analysis of twenty-seven years of geoscience conceptions research. Journal of Geoscience Education, 58(3), 122-134.

Duit, R., & Treagust, D. F. (2003). Conceptual change: A powerful framework for improving science teaching and learning. International Journal of Science Education, 25(6), 671-688.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410-8415.

Gentner, D. (1983). Structure-mapping: A theoretical framework for analogy. Cognitive Science, 7, 155-170.

MacArthur, J. R., & Jones, L. L. (2008). A review of literature reports of clickers applicable to college chemistry classrooms. Chemistry Education Research and Practice, 9(3), 187-195.

McConnell, D. A., Steer, D. N., Owens, K. D., Knott, J. R., Van Horn, S., Borowski, W., Dick, J., Foos, A., Malone, M., McGrew, H., & Greer, L. (2006). Using ConcepTests to assess and improve student conceptual understanding in introductory geoscience courses. Journal of Geoscience Education, 54(1), 61-68.

Resnick, I., Davatzes, A., Newcombe, N. S., & Shipley, T. F. (2017). Using analogy to learn about phenomena at scales outside human perception. Cognitive Research: Principles and Implications, 2(1), 21.

Shipley, T. F., & Tikoff, B. (2016). Linking cognitive science and disciplinary geoscience practice: The importance of the conceptual model.
