Program Assessment

Tuesday 3:00pm-4:15pm Student Union: Fiesta A and B
Round Table Discussion


Karen Viskupic, Boise State University
Assessing program-level learning outcomes can drive curricular and pedagogical changes, and is likely a requirement of your institution's administration and accrediting body. We'll discuss assessment strategies, mechanisms, and instruments as well as how to use results to make program improvements. Please join the conversation to share your successes and/or your struggles, and to learn from your peers.

Questions to consider:

  1. What are the internal and external motivations for program assessment?
  2. What is your department/program doing to assess student learning at the program level?
    • What metrics do you use? Who develops them? Who evaluates the results?
    • How do you look at the attainment of outcomes at different levels across your program?
    • Over what time frame do you assess your program? Yearly, different outcomes in different years, etc.?
    • Are you satisfied with your plan? What is working and what isn't?
  3. How do you share the results of program assessment with your department? With your university?
  4. What are some examples of how program assessment data have been used to make program improvements?
  5. What could the geoscience community do to help make program assessment easier or more useful?
Report Out: Highlights from Discussion:

  • Motivation comes largely from institutional pressure, but there is value in assessment
  • Assessment is about improvement
  • Example strategies:
    • Evaluate 1-3 outcomes per year rather than evaluating all of them at once
    • Embed assessment questions within coursework
    • Alumni survey or alumni feedback
  • Several groups do pre/post assessments of students, but none are impressed with the results-- neither the student performance itself nor whether the results tell us anything
  • Some institutions might have support from an office of Institutional Research
  • Many geoscience programs are small, so looking at results from year to year with a small number of students is difficult
  • At one institution, every program is required to provide an annual evaluation report
    • Example of valid and reliable measures used by this program: students are shown an example field photograph, provided with example samples, and asked to respond to questions about the photograph. Expectations for student performance depend on where students are in their coursework. A rubric is used to evaluate student performance.
    • University sets aside one day each year that is "assessment day"
    • Student participation in "assessment day" is voluntary, department provides pizza, student participation is about 50%
    • Students have opportunity to participate in assessment day several times over course of program
    • One exercise has about 7 prompts
  • Student self-reflection-- students fill out a curriculum matrix to see whether they think they are meeting program outcomes where faculty think those outcomes are being met. Also have a set of questions on end-of-semester evaluations that map onto the program outcomes (e.g., To what degree did this course help you....)
  • Some departments have a faculty meeting dedicated to assessment results on an annual basis.
  • Having a respected faculty member who cares is important-- someone to lead the effort, but with the support of the chair
  • Sometimes informal discussion with faculty is just as helpful as a dedicated metric
  • A helpful strategy might be to use honest assessment results to recognize gaps and needs, and to leverage those to get resources from the university
  • Giving newer faculty a voice in curriculum development is important-- new ideas and agency to lead
  • What can the geoscience community do to make assessment easier for all of us?
    • list of skills/competencies needed at the bachelor's level-- this has already been developed by the Summit on the Future of Undergraduate Geoscience Education
      • defined competencies are great, but then we need a question bank that aligns with those competencies
      • different programs will have different degrees of competency that are acceptable
    • Geoscience Literacy Exam from the InTeGrate project-- intended to assess key literacies-- developed from the literacy documents for Earth Science, etc.
      • but GLE is at a literacy level not a geoscience expertise level
  • At next year's Rendezvous we should have a question writing workshop to develop program-level assessments. We need help here.
  • Develop a group of geoscience assessment consultants

Program Assessment -- Discussion  

What are we doing to assess student learning at the program level?
-- Trinity - survey alums that are 2, 4, and 6 years out; exit exam for seniors (has not seemed to work: students are nervous even though the exam is low stakes, though they do need it to graduate)
-- CWU - pre- and post-assessment using a subset of the GLE questions, but students do worse (or at least no better) on the post-test, maybe because the
-- George Mason - pre- and post-assessment, also not good results; sometimes students who do really well coming in do worse going out
-- Chemistry at UW is looking back at 15 years of data, including persistence, DFW rates, etc.
-- Assessing the program is not just assessing the majors - it includes the Gen Ed courses. How are we doing in those?
-- In some places, Gen Ed is assessed as part of the department; in others, as part of the Gen Ed program
-- CWU - quantitative skills and field notebook skills are assessed on a more regular basis, and driven by faculty interests in building those skills. Requires faculty talking to each other about threading the skills through the curriculum.

Have we looked at confounding variables? That would be a good way to check that it's not just students doing worse (could be SES, etc.)
-- At Carleton, use TRiO status
-- Institutional Research helps at institutions like Carleton and UW
-- Need to have support outside of the department

-- statistics of small numbers - most geoscience programs are pretty small, so it's hard to use the data to make decisions about the program
-- we seem to do a poor job of moving fundamental concepts from intro courses throughout the curriculum



Assessment methods:
Matrix of classes and outcomes (after Mogk)
Still struggling with what questions to ask and how to ask them.
Assessment of student products (papers, maps, cross sections) in three courses across the 4-year curriculum
SALG (Student Assessment of Learning Gains) survey
e-portfolio can be another way for students to self-assess
Use the same exam questions on sophomore- and senior-level exams (developed to be answerable at multiple levels of complexity)
Exit exam in a capstone class -- low stakes.
Embedded Assessment Cutting Edge workshop -- Karl Wirth

Oceanography Concept Inventory - Leilani Arthurs

Many administrators require only final assessments (capstone, graduates) and it's difficult to motivate faculty to build in earlier assessments.
Many disciplines fall under the umbrella of geoscience. Standardizing assessment could be challenging.



Motivations for Assessment: Almost universally from administrative pressure, but recognize its value.

What are we doing currently:
Ranges from very little, to enough to stay off the radar, to very effective (sometimes this entire range exists within one school)

Examples of Assessment Strategy:
• Qualitative assessment before/after capstone based on learning objectives.
• Every student evaluated on a rubric based on the learning objectives above, in a thesis class at the end of the program. All students give a poster; posters are evaluated by three people using a rubric developed and applied over the past 5 years. Survey of alumni with a 34-item list of things they should know and be able to do: did you need it in your job, and did you get it from your education? Professional assessments from TTI Success Insights are used to look at 'soft skills' (disposition and motivation, teamwork) at the beginning of the program (100 level) and at the end of the capstone.
• Campus assessment coordinator. Six objectives, one evaluated per year. Direct measures such as embedded exam questions, writing evaluation, student ranking in an external field camp, and alumni surveys 5 and 10 years after graduation (what was done well, what should we include). Exit survey for seniors.
• Program and course-level objectives. Embedded questions, projects, and student reflections form artifacts within an e-portfolio. An assessment team (a faculty subset) comes in and extracts information. 11 objectives; assess 2 or 3 each year. Still working to develop the functionality of the system.
• Learning outcomes defined, with identification of where these are met in the curriculum. The faculty member decides what is assessed and makes the assessments. Information is archived and a report compiled. Anecdotes and conversations are more valuable than the assessment itself, but the assessment process does inspire improvement.
• Alumni surveys - what skills were provided and what skills should have been provided. Assessment of the success of different student groups.

Modifications prompted by assessment:
• Alumni feedback calling for increased technology and field skills led to development of a new geophysics field course.



Fundamental questions:

Are students required to take the assessments? How and when?
Are the assessments reliable and valid? How are these developed?
How is faculty buy-in generated, to the extent that the results are fed back into course and program development?



Aloha, we discussed posting our names so we can potentially share assessment ideas. I will start us off.
Here is my email address: (Sarah Bean Sherman, University of British Columbia)



Including my info here for anyone willing to share program assessment resources! (LeeAnna Chapman, NC State / University of San Diego)


