
Assessment of Student Learning in the Geology Program at Muskegon Community College

John Bartley, Muskegon Community College, Department of Mathematics and Physical Sciences

Before I began writing this essay, I thought it advisable to look at the participants list to see who might be in the audience. As I suspected, most of the participants are from 4-year colleges and universities, with programs ranging from exclusively undergraduate to doctoral. My perspective on assessment in the geosciences therefore differs considerably from that of most of you.

The geology "department" at MCC consists of one faculty member (me), in a department that includes the disciplines of astronomy, chemistry, mathematics, and physics. We do not offer a major in geology; in fact, the entire geology curriculum consists of two courses, introductory physical geology and historical geology, each with a lab. Most of the students who take these classes do not intend to become geoscientists, but instead enroll to meet a general education requirement for the associate degree, or a requirement at their intended transfer institution. The only prerequisite for enrolling in either course is a 10th-grade reading comprehension level, determined by placement testing.

As a result, the assessment measures that I employ are somewhat limited in scope, addressed exclusively at the course level. I use a variety of methods to assess student learning, from traditional exams and quizzes to individual written reports to collaborative exercises involving small groups of students. Because my classes are small (24 students in a full section), it is fairly easy to get to know each student over the course of a semester and to develop some idea of what they have learned by the end of it. Quantifying these impressions, however, is problematic. Each class is also assessed through student opinion surveys administered at the conclusion of each semester. These have recently been changed from the traditional format of "what did you like/dislike about this class/instructor" to surveys with questions that address specific course learning objectives. They have been in use for only one year, however, and have produced limited data.

I confess that I have always been somewhat skeptical about assessment measures and their supposed influence on improving student learning. My impression, held for a number of years now, is that we do these things primarily as a means of justifying our own existence, whether as individual faculty, as departments within our home institutions, or at the institutional level itself. While that may be an important outcome for our profession, I am not sure that it leads to improved student learning. Instead, we seem to have verified the obvious: complex, difficult concepts are hard to learn, while simple concepts are easy.
