
Metacognition in Psychology

Beloit College, Kristin Bonnie

Summary

My project involved investigating two principal questions:

  1. Can metacognitive tools and measures aid introductory psychology students (first year students in particular) in the navigation, organization and mastery of course materials?
  2. Can responses to/choice of exam questions be used as a measure of metacognition? How does this measure compare to other standard measures, including knowledge surveys and exam wrappers?

Context

I carried out my work in my Psychology 100 – Introduction to Psychology course at Beloit College. Intro Psych is a survey course designed to give students broad exposure to many psychological topics; as a result, the pace is fairly quick and few topics are covered in depth. Students therefore often struggle with navigating the material and determining what they need to know. I don't believe in providing study guides or running review sessions before an exam, despite the fact that students constantly ask me to. As I began this project, I realized that some established metacognitive tools (mostly knowledge surveys and exam wrappers) could help guide students through the material in a way that would also inform my teaching.

As it turns out, the design of my course already included practices that could be fairly easily tweaked to carry out my project, and to make the course more "metacognitive". Specifically, I previously required students to complete "weekend worksheets" – 3-4 questions on the previous week's material that they completed online and submitted prior to the start of class the following Monday. I also have always given students choice on exams – for example, my intro exams have always included instructions like, "Please answer 20 of the following 22 multiple choice questions". For my project, I tweaked these practices in the manner described in the next section.

I carried out my project in the Fall of 2009. There were 24 students in the course (12 first-year, 9 sophomore, 1 junior, 1 senior, and 1 exchange student). The project ran throughout the semester, focusing mainly on the 4 exams given (all non-cumulative, weighted equally, and each covering a roughly equal amount of content). A number of students enroll in the course to fulfill a general education requirement (intro psych counts toward the requirement of 2 courses in the social sciences).

Teaching Practice

  1. Weekend worksheets became weekly knowledge surveys – Each week, students completed a short knowledge survey (6-12 questions) online (via Google Forms) on the material we had covered in class that week. Rather than providing an answer to each question (less grading for me!), students indicated their confidence in their ability to answer the question correctly (I know; I think I know; I don't know). Students received a small amount of credit for completing the knowledge survey. I used the information to review material that students seemed to struggle with. In addition, a subset of these questions appeared on their exam – the knowledge survey therefore also served as a review sheet.
  2. Exams still involved choice, but I required students to answer every multiple-choice question before choosing which questions they did not want me to grade. In addition, for each question they omitted, they were asked to give a reason for the omission, choosing among provided answers such as "I don't remember learning about this" and "I narrowed it to two, but couldn't decide".
  3. I also gave students exam wrappers – worksheets that simply asked students to reflect on their exam performance and study habits.

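As a rough illustration of how the weekly confidence responses could be turned into a "what to review" list: the sketch below tallies ratings per question from a CSV export and flags questions where most students were unsure. The file layout, column names, function names, and the 50% threshold are my own assumptions for illustration (the project used Google Forms, and the actual export format isn't described above).

```python
# Hypothetical sketch: tally confidence ratings from a knowledge-survey
# CSV export (one row per student, one column per question). All names
# and thresholds here are illustrative assumptions, not the real data.
import csv
from collections import Counter, defaultdict

RATINGS = ("I know", "I think I know", "I don't know")

def tally_confidence(path):
    """Count how many students chose each confidence rating per question."""
    counts = defaultdict(Counter)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for question, rating in row.items():
                if rating in RATINGS:  # skip timestamp/name columns
                    counts[question][rating] += 1
    return counts

def flag_weak_topics(counts, threshold=0.5):
    """Return questions where at least `threshold` of students answered
    something other than 'I know' -- candidates for in-class review."""
    flagged = []
    for question, tally in counts.items():
        total = sum(tally.values())
        if total and (total - tally["I know"]) / total >= threshold:
            flagged.append(question)
    return flagged
```

The same pattern would work for the exam-omission reasons in item 2: treat each provided reason as a rating label and count how often each one is chosen per question.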
All of these provided me with A LOT of data! In addition, at the conclusion of the semester I asked students about the purpose and usefulness of the knowledge surveys and exam wrappers, so I have that information too.

Conclusions and Evidence

Overall, I think that weekly knowledge surveys and exam wrappers helped intro psych students manage the material throughout the semester. Their responses to evaluation questions at the end of the semester suggest that the degree of benefit varied – some students found these activities very useful, others less so. Unfortunately, I'm not able to link the responses to individual students.

I think exam questions do give insight into metacognition, but I need to continue looking into the data (and to gather more data) before I can draw a concrete conclusion about this.

Implications

I found knowledge surveys in general to be a really useful tool. Although I used them weekly in this class, I've used them as pre-tests in other courses and in all cases they help me and appear also to help the students. One thing I have learned is to deal with the data as it comes in, and to share it with students. They are generally very willing participants in the process, and they want to know about the research they are participating in.

I maintain that question choice is probably a measure of metacognition, although I'm not sure I fully captured it here. Having compared my practices to another exam format, I'm confident that I'm using a sound process. Figuring out how best to use it for research purposes is a separate issue.

This project as a whole has been really interesting and inspiring. I've really valued the collaborations with other faculty, and the insights that other collegium participants have had throughout the process.

Looking Ahead

I'm teaching Intro again in the spring, and plan to continue using weekly knowledge surveys, question choice on exams, and exam wrappers. One principal change I plan to make is to cover metacognition more explicitly and to share with students the data I collect from them. Given that this is an Intro Psychology course, this should be fairly easy to do. I also plan to continue incorporating metacognitive practices into all my courses. And I hope to do more with the data I have - there seem to be a number of questions to ask and answer with it.

