
Assessing a Mid-Sized Baccalaureate through Doctoral Department

William Hart, Geology Department, Miami University

Miami University requires a full assessment of all academic programs on an approximately six-year cycle. Each cycle tends to carry a slightly different theme depending on the administration in place at the time, but the process begins with a "Self-Study" (see the Miami University Department of Geology Self-Study Report, 2003), which for all departments now includes an assessment of their Miami Plan contributions (see Global Miami Plan Core Curriculum) and, for Ph.D.-granting departments, a section mandated by State guidelines. This is followed by an in-depth, on-site evaluation by a team composed of two sets of reviewers, one internal and one external. Ultimately, reports are written, comments are exchanged between the department and the review committee, and a final summary report is issued by the Provost's Office. Beyond the excellent opportunity this process affords for internal reflection and assessment of key programmatic features, it can also provide important ammunition for future resource-allocation requests.

Obviously, the six-year process highlighted above must touch on all aspects of program assessment, including, but not limited to, undergraduate student learning outcomes from foundation through capstone experiences; undergraduate and graduate student success before and after graduation; program contributions to the university and discipline; program quality and visibility at the national level; and program financial viability. This array of assessment requirements necessitates an array of assessment approaches and tools, for example: individual faculty efforts in specific courses; collective faculty efforts addressing key learning outcomes across the curriculum (e.g., Geoscience Concept Inventory); continual acquisition of student feedback (e.g., course evaluations, exit interviews); continual tabulation and evaluation of quality, productivity, and financial measures (e.g., annual faculty/student activities reports, annual endowment/giving records, comparisons with peer programs); and periodic evaluation of alumni feedback (e.g., alumni surveys). Simply put: how well is the program doing in addressing its stated mission(s)?

While there is nothing overly novel about these approaches, the challenge lies in fostering faculty and student cooperation and establishing a culture of continual self-evaluation that allows the desired assessment to be accomplished without noticeably impacting faculty time and effort; in other words, securing department buy-in. This is particularly important in a mid-sized Ph.D.-granting department at an institution whose primary commitment is to liberal arts undergraduate education (e.g., see Miami University's mission statement). Fortunately, the university backs up this commitment with resources supporting efforts such as those required for various forms of assessment (Miami University Assessment website) and for creativity and advancement in teaching and learning (Center for the Enhancement of Teaching and Learning, Miami University).

As our department approaches the next round of Academic Program Review we do so in a better position than ever before. Why? Some key reasons are highlighted below.

  • Faculty, staff, and graduate student annual reports of professional activities and accomplishments are a routine endeavor.
  • Quantifiable faculty/staff/student productivity and quality measures have been annually updated and evaluated to facilitate internal and external comparisons and trends.
  • Programmatic contributions to the university, community, and discipline have been annually updated and evaluated.
  • Graduating students have been surveyed annually through questionnaires and exit interviews.
  • Faculty groups and the faculty as a whole routinely engage in discussions of curricular and programmatic improvements and continually seek new ways to involve students in learning outside the traditional classroom setting.
  • Assessment instruments to evaluate student learning outcomes are becoming more widely employed across the department's curriculum and programs.
  • All faculty have been involved with one or more of the following university-sponsored activities: faculty learning communities, assessment workshops, assessment working groups, and focused assessment programs.
  • All faculty support the department's involvement in a university-wide initiative, the Top 25 Project: Engaging Students in Their Learning, which aims to redesign the most popular foundation-level courses in ways that encourage greater student engagement in inquiry-driven learning. New teaching and learning strategies and assessment tools developed in the redesign of GLG 111 (The Dynamic Earth) will be extended first to other introductory (foundation) geology courses and ultimately throughout the curriculum.