An Attempt at Assessment and Evaluation Using Portfolios

Dave Dempsey, Department of Geosciences, San Francisco State University


Our First Attempt

In 2002-03, the faculty of San Francisco State University's Department of Geosciences, led by Karen Grove (former Department Chair), collaborated to develop overall goals, a set of six learning objectives, and, for each learning objective, several performance outcomes for its B.S. program in Geology. (We were proud of these at the time. We still like them, though they need some revision.) To assess the B.S. program, the faculty chose to assemble portfolios of student work. We thought that portfolios would showcase what students had learned and allow us to judge both how well they'd learned and how their learning had evolved as they progressed through our program. (We still very much like the potential of portfolios for these purposes.)

To determine which assignments to include in the portfolios, we first considered each of our six learning objectives individually. For each one, the instructor of each of our (then) ten required core courses in geology (but not math, physics, or chemistry) tried to identify one assignment that arguably addressed the learning objective well. Not every core course had such an assignment, but for each objective the instructors of anywhere from one to six courses thought theirs did. A total of nine assignments were identified for inclusion in the portfolio. Some assignments addressed more than one learning objective; some courses contributed more than one assignment (to address different learning objectives); and nine of the ten required core courses contributed assignments. Most assignments were reports of some type; one consisted of the B.S. thesis and oral defense.

Each semester during the next two academic years (2003-05), students were asked to bring any of the designated assignments that they'd completed to their faculty academic advisor as part of the Department's mandatory advising. The faculty advisors were expected to maintain the portfolios. (This didn't work as well as we'd envisioned.)

In the summer of 2005, a committee of three Geology program faculty members tried to score the available portfolios using a rubric based on the performance outcomes for each learning objective. (This turned out to be hard to do.) After that first attempt, our assessment and evaluation efforts lapsed. We are now reviving them, spurred by an impending deadline issued by our administration.

What Did We Learn?

  • We liked the collaborative process that produced the goals, learning objectives, and performance outcomes for the B.S. in Geology program, and we liked the result, at least in principle. (They do need to be revised, for example to include quantitative skills, and fewer performance outcomes might make evaluation easier.)
  • The portfolios were incomplete, partly because only nine of the (then) ten required core courses and none of the elective courses were represented, partly because portfolios were assembled over just two years of a 3-4 year program, and partly because of breakdowns in our assignment-collection mechanism. Instructors and advisors didn't always tell students to save certain assignments for their portfolios; when they did, students didn't always save the assignments; when students did save them, they didn't always give them to their faculty advisor; and when they did, the faculty advisor didn't always maintain the portfolios consistently. There were no consequences for students who neglected to contribute to their own portfolio or for faculty advisors who neglected to maintain them. The death knell came when the Department lost focus and stopped enforcing mandatory advising, on which the assignment-gathering strategy was founded.
  • Our three evaluators had trouble scoring the portfolios, because:
    1. The portfolios tended to be incomplete.
    2. Most instructors did not write their assignments (or syllabi) to tell students (and evaluators) explicitly which learning objectives and performance outcomes each assignment addressed, and how, nor were instructors strongly encouraged or held accountable for doing so. The absence of explicit tie-ins made the scoring rubric hard to apply.
    3. The evaluators discovered that they didn't really know what to evaluate. Was it: (a) whether the assignment addressed a learning objective (in which case, just one portfolio would suffice); (b) whether the students' work met the instructor's expectations for the assignment and thus achieved the learning objectives implicit or explicit in the assignment (in which case, what could the evaluators add to the grade already assigned?); or (c) whether the grading fairly reflected the quality of student work (in which case the evaluators might infringe on instructors' prerogative to adopt their own grading standards)? That is, were the evaluators evaluating (a) course and curriculum design, (b) student performance, or (c) instructor evaluations of student performance?
    4. Evaluating the six learning objectives, with several performance outcomes for each, was a significant amount of work for which volunteers received no compensation. There was little incentive or real obligation to serve as an evaluator.
    5. The assessment strategy lacked "indirect" assessment data (for example, student or alumni surveys, reflections, or interviews) to measure aspects of our program other than content knowledge and skills.
    6. The evaluation lacked the perspectives of prospective employers, faculty members from other academic programs in geology, and other outside experts.

Ideas for a Revised Strategy

A revised and improved assessment and evaluation strategy will need to (1) get all faculty members to buy in; (2) give students a meaningful role and hold them responsible for it; (3) assign faculty responsibility for implementing the strategy; and (4) more broadly, incorporate the strategy into Departmental culture. We also believe that we need a wider range of assessment data, particularly indirect data, to complement the direct assessment data (student assignments) in the portfolios. Possible steps toward such a strategy include:

  • Assign responsibility for maintaining portfolios to the students, and hold them accountable for doing so. Students would own their portfolios, which could also serve as a marketing tool when they seek employment.
  • Reintroduce a mechanism to enforce mandatory advising each semester.
  • Assign responsibility for academic advising to one faculty member per semester, and give that person release time to do the work. Among other things, the advisor would monitor student portfolios and make sure they are maintained.
  • Create a required 1-unit, Cr/NC course for seniors in their last semester. The course would have two requirements for students: (1) attend Departmental seminars; and (2) meet with a committee of at least two faculty members in a semi-formal context to present their portfolio orally. Committee members would rate the student's presentation, including the use and contents of the portfolio, on a Likert scale, using the program performance outcomes as criteria. The course instructor would:
    1. Schedule the evaluation meetings and review results with committee members.
    2. Distribute, collect, and summarize results of exit surveys, student reflections, and other indirect assessment data.
    3. Examine assignments in the portfolios and syllabi (from Department files) for evidence that instructors are incorporating program learning objectives into them explicitly, and work with instructors who are lagging in this respect.
    4. Write a brief summary of the evaluation results for all graduating students, identifying perceived strengths and weaknesses of our program based on the results.
    The instructor and student-evaluation committee members would rotate among multiple faculty members.