Outcomes Assessment in the Department of Earth and Planetary Sciences, UNM: The Past and Evolving Present

John Geissman, Department of Earth and Planetary Sciences, University of New Mexico

The University of New Mexico attempted to initiate a rigorous, meaningful Outcomes Assessment policy for all of its undergraduate academic programs in the early 1990s. The faculty of the Department of Earth and Planetary Sciences agreed that the most logical and effective means of assessing our undergraduate BS program in EPS was to conduct the assessment in our Introductory Field Geology course (EPS 319L), with the principal instructor (i.e., me) providing written summaries of each of the UNM BS majors in the course. My assessment of each of our EPS BS majors involved several rubrics, the most important of which focused on the student's abilities to (1) identify, describe, and recognize the importance of specific geologic materials in the field; (2) recognize, describe, and accurately record field relations at hand-sample, outcrop, and larger scales; (3) prepare, as accurately as possible, geologic maps and cross sections for each of the mapping projects; and (4) summarize the geology and geologic history of each mapping area as factually as possible in a four- to six-page (double-spaced) write-up of each project. My assessment of each student typically was one to two pages in length, single-spaced; these assessments were provided to the Department Chair, who submitted them to the institution's assessment office at the time.

The program of undergraduate outcomes assessment at the University of New Mexico ceased to function in the late 1990s, when support for the program was terminated.

Now to the next chapter in the history of Outcomes Assessment at the University of New Mexico. In anticipation of an upcoming re-accreditation of the institution, efforts were made to reconstitute an Outcomes Assessment program in, to the best of my knowledge, Fall 2006, during the last year of Professor Les McFadden's term as Chair of the Department. The first phase of the rejuvenated effort consisted of developing OA plans for all of the lower-division courses that fulfilled core requirements at the institution. For our Department, the appropriate courses were EPS 101 (Physical Geology, or the Way the Earth Works) and EnvSci 101 (Blue Planet). The several faculty instructors for both of these courses took their responsibility to develop objectives/goals and then assessable outcomes very seriously, resulting in two very comprehensive pilot OA plans. The next step began shortly before I became Chair of the Department (July 2007), with the inception of developing OA plans for both our undergraduate and graduate programs. Everyone likes unfunded mandates, particularly when it was at least superficially apparent that the institution, or at least part of it, was really only going through the motions in the context of the upcoming accreditation (scheduled for April 2009). That said, the faculty of Earth and Planetary Sciences took the effort as seriously as possible, as we recognized that, no matter what, this effort would be of great benefit to our undergraduates and to at least most of our faculty. Our Undergraduate Committee, chaired by Professor David Gutzler, compiled a draft list of objectives (goals) for our EPS BS, EPS BA, and EnvSci BS programs. Our Graduate Committee, chaired by Professor Jane Selverstone, compiled draft lists of objectives (goals) for our MS and PhD programs.
To my pleasant surprise, especially considering the disparate opinions often voiced by my colleagues at faculty meetings, we rather quickly came to a consensus on objectives for each program. The next step, outcomes, was a bit more complicated, in that several of my colleagues had a very difficult time recognizing the difference between objectives and outcomes; this difficulty is of course exacerbated by the fact that much of the assessment literature appears to confuse the terms: objectives are goals, whereas outcomes are measurable, at least as our Institution Outcomes Assessment Guru (IOAG), Professor Chuck Paine (English Department), states. So we forged ahead with outcomes, and eventually came to an agreement on the outcomes for all of our programs. I was tasked with preparing pilot assessment plans for all of our programs after the Spring 2008 semester, with a deadline of 1 June 2008, right in the middle of my Introductory Field Geology course. So, the deadline was not met until mid-June. I was told to choose just a few outcomes on which to concentrate examples of pilot assessment (rubrics!), and did so for all of our programs. Actually, and this might be the most important part of my "essay", it was a lot of fun! Ultimately the IOAG and the CARC (College Assessment Review Committee) met, reviewed my pilot examples, and provided feedback. This did not take place until late September/early October 2008, when our attention was turned to the survival of the United States of America. The pilot OA plans were distributed to the entire faculty in mid-October 2008, which of course was a clear and obvious mistake: individuals who had made no effort to play a role in the process up until this time now decided that they were OA experts, and comments, many quite inflammatory, were sent to the entire department.
One individual sent out a message to all the faculty late one night that prompted me, as Chair of the Department, to print the message and attach a cover memo to the Dean of the College of Arts and Sciences, stating that, unless authorized otherwise, I would NEVER respond to an electronic mail message from this individual. Oh, the joys of OA! I invited the IOAG to our full faculty meeting on 5 November 2008, and he very carefully explained what the College and institution wanted to see in our revised OA pilot assessment plans. He emphasized that a few, select outcomes should be targeted for assessment. Fascinatingly, the individuals who had been so willing to speak out via email never said a thing at this meeting. I closed the meeting by saying, "OK, our Undergraduate and Graduate Committees will work hard to take the best interests of the Department to heart in preparing revised OA plans. We can do this. YES, WE CAN!"

On a very positive note, almost all of our revised OA pilot assessment plans, which address all of the concerns of the IOAG and the CARC, have been submitted to the College. OA of all of our programs will begin during the 2009/10 academic year, assuming that the institution, in its near-present form, is still in place. The author of this report thanks Professors Chuck Paine, Dave Gutzler, and Jane Selverstone for their tremendous patience, hard work, and collegiality in producing our final OA plans for implementation in the 2009/10 academic year.