Assessment in the UVM Geology Department
Charlotte Mehrtens, Geology Department, University of Vermont
We assess at a variety of frequencies and scales. What follows is a review of some of our approaches, from the almost daily in-class assessment of student learning to the infrequent longitudinal surveys of alums.
With only one or two exceptions (those of us who have participated in Cutting Edge workshops), our faculty are very traditional in their assessment of students. Most give standard exams. Fortunately, with the exception of one introductory lecture class, these exams are NOT multiple choice or T/F but are problem-based. Several junior- and senior-level classes involve term papers. One of our more pedagogically innovative classes is a large introductory class in earth system science. Although the lecture is fairly traditional (predominantly lecture with some demonstrations, think-pair-share questions, etc.), the labs involve students working in groups on field trips, construction of concept maps, and a lab portfolio. The portfolio asks each student to identify common themes across the semester's labs as well as address a question that asks the student to contextualize the material. An annoying challenge is developing new, innovative portfolio questions every year so that the portfolio essays can't be retrieved from "frat files." I strive in this class to align the assessment vehicle with the student learning goals and also to ensure that students have practice with the type of assessment vehicle before it "counts."
The student learning objectives for both the B.A. and B.S. degrees include the development of written communication skills. The "writing across the curriculum" (WAC) program has been active for many years in the College of Arts and Sciences. Most of the older geology faculty (practically all of whom are now retired) participated in this program and incorporated a variety of strategies in their courses (multi-draft assignments, peer review, portfolios) to improve student writing. In the past five years I've found less interest on the part of my colleagues in doing this, because they feel it takes time away from content, and there is still a perception that writing should be taught by the English department. Attendance at WAC workshops has evaporated, replaced by an assumption that more writing equals better writing. I'm hopeful that the capstone experience we've created will provide us with meaningful feedback about student writing, so that we can have a meaningful discussion on how to improve writing through our assignments.
Course evaluations are required (by the Department) for every class we teach. The course evaluation is constructed around the student learning goals that are articulated in each course syllabus. These course evaluations are the focal point of discussions between the Chair and each faculty member in the spring. Department average values are calculated, and these averages are used by the Chair in the preparation of faculty portfolios for reappointment and tenure. Consensus among faculty came fairly easily during the development of this common course evaluation form, and I think it works pretty well.
More recently we have tried to develop assessment metrics for graduating seniors. Because we offer three degrees (B.A. Geology, B.S. Geology, and B.S. Environmental Science, Geology concentration), and only two of these require at least 3 credit hours of undergraduate research, how we evaluate what our graduating seniors know varies. For the students who have done research (about half the population), the quality of the written document and public presentation are the vehicles for assessment. For the B.A. students, for whom research is only recommended, not required, until very recently we had no assessment mechanism. Only last year did we develop a senior seminar capstone experience, which this year has only 2 students enrolled. We were limited in the type of capstone experience we could develop because of resource constraints (faculty time), so we created a seminar based on something we were already doing, a visiting speaker series. B.A. Geology majors are required to attend the visiting speaker series (about 5 or 6 talks a semester) and, at the end of the year, write a paper that synthesizes the topics discussed at these seminars with what they learned in the geology classes they took over their four years. I have no idea if this will be effective in getting students to synthesize and reflect, but at least we will be able to evaluate whether they can write. I should say that I had to drag my department to and through this process. Only one or two other colleagues were actually interested in assessing what our graduates could do. Because many colleagues are focused on their research programs, they felt comfortable with what their undergrad research advisees could do and wanted to leave it at that.
Assessments are meaningless if you don't actually examine the results and modify what you are doing. That is the problem we will face as the capstone experience we've created for the B.A. non-research students runs for a few years. At this point the faculty do not seem interested in examining the results, but since this lies in the future, we'll see what happens down the road. As part of the UVM re-accreditation process we have committed to having an annual faculty meeting devoted to review of assessment data, but I'm not sure how we are going to do this because overall interest in the process is low.
On a very infrequent basis we solicit feedback from alumni. This longitudinal survey is really only done when we must have data for program review. Since we went through such a review relatively recently, we conducted this survey. In constructing it, we were interested in two questions. First, were our alumni engaged in careers related to the earth and environmental sciences? Second, did students' evaluation of their experience in the geology department differ significantly from their perception of their overall UVM experience? (Anecdotally, I had heard that UVM alums were highly loyal to individual faculty and departments but much less so to the institution.) The results confirmed this to be the case.
In sum, most faculty assess student learning in the classroom in very traditional ways. Faculty have "bought into" our revised course evaluations, in part because they really do care about the delivery of their material and these evaluations are part of the annual review process. But as you get further away from what is of immediate interest to a faculty member, their interest in assessment wanes. This is my department's big challenge.