REU Program Assessment
Compiled by David Mogk, Dept. of Earth Sciences, Montana State University, and Val Sloan, SOARS Center for Undergraduate Research, UCAR, Boulder, CO. Based on input from the GEO-REU email list.
What is assessment?
Data collection with the purpose of answering questions about students' understanding, attitudes, and skills; instructional design and implementation; and curricular reform (at multiple grain sizes) (Diane Ebert-May). Assessment is "collecting data with a purpose" (Integrating Research and Education: Biocomplexity Investigators Explore the Possibilities, NRC, 2003). What evidence is acceptable to demonstrate and measure progress toward goals, objectives, and outcomes?
Assessment often addresses the "what" questions about teaching and learning: what do students know, and what can they do? Assessment provides evidence that things are working or not (e.g., do students actually learn better?). Evaluation may be defined as "the systematic investigation of the merit or worth of an object" (NSF 93-152) and is often used in the context of "what value has been added through this project, and how do you know?" Project evaluation is often used to demonstrate accountability (e.g., have the project goals been met?).
What's the difference between formative and summative assessment?
The goal of formative assessment is to gather feedback that the instructor and the students can use to guide improvements in the ongoing teaching and learning context. These are low-stakes assessments for students and instructors.
The goal of summative assessment is to measure the level of success or proficiency that has been attained at the end of an instructional unit (or REU program) by comparing it against some standard or benchmark. (From Carnegie Mellon, "Enhancing Teaching.")
Why do assessment?
- Improve student learning and development by improving the program.
- Provide students and faculty with substantive feedback about student understanding and about program strengths, weaknesses, and impact.
- Accountability: find out whether you have achieved your defined goals for student learning and overall programmatic effectiveness, and include the results in annual reports to sponsors and in grant proposals.
- Gather data on your program with an eye to doing education research on REU programs.
- Effective assessments, embedded into program design, can result in students who are happier and who perform better, and can help guide faculty to create more effective, efficient and gratifying programs that have greater impact.
Assessment of education and outreach programs may be done for many reasons and at many scales, and the results may serve different interested groups in different ways: top-to-bottom project reviews, evaluation of the effectiveness of specific materials or methods, indicators of student learning, documentation of a project's long-term impacts, or confirmation that the goals of a project have been met. Assessment activities may also lead to more expansive research on learning. The scholarship of teaching and learning provides many exciting (and much needed) opportunities to form partnerships with the cognitive and social sciences.
Some of the high-level goals that have been defined by NSF can be found in these documents: the GeoVision Report (Acrobat (PDF) 3.3MB Sep20 13); AC-GEO (2009); Empowering the Nation Through Innovation and Discovery (Acrobat (PDF) 2.7MB Sep20 13), the NSF Strategic Plan; and the Strategic Framework for Education and Diversity, Facilities, International Activities, and Data and Informatics in the Geosciences (Acrobat (PDF) 11.2MB Sep20 13).
The Right Tool for the Right Job
- Clearly define project goals and expected outcomes at the start.
- A list of IRIS REU Outcomes (Microsoft Word 2007 (.docx) 71kB Sep18 13) and the IRIS Logic Model (Acrobat (PDF) 120kB Sep19 13) have been provided by Michael Hubenthal.
- What is the purpose of the assessment? Who will use the results and in what way?
- Identify the baseline data you will need to document change.
- Many assessment techniques are available; pick the tools and metrics that will provide the information required to meet your needs.
- Assessment is done throughout the course of a project for varying reasons: formative assessment is done to provide feedback for ongoing activities, and to inform any needed mid-course corrections; summative assessment is done to measure a project's overall success; longitudinal assessment tracks impacts beyond the duration or initial scope of the project.
- The assessment plan should be integral to the development and management of the project, not just added on as an afterthought.
- Develop partnerships with colleagues who have knowledge and expertise in assessment.
Assessment of student learning outcomes requires an entirely different set of instruments and metrics than overall program assessment. The following section provides some assessments developed by GEO-REU leaders to address both aspects.
Assessment Instruments Developed by REU PIs
The following is a compilation of assessment instruments, submitted by PIs on the GEO-REU listserv, that have been used to document student progress and/or program progress:
- Start of Internship questionnaire (Microsoft Word 2007 (.docx) 118kB Sep17 13)- submitted by Harmony Colella, Miami University (Ohio), for the IRIS RESESS internship program.
- Research-Focusing Questions (Microsoft Word 35kB Sep17 13)- Dallas Abbott, LDEO Columbia University.
The process of mentoring students in research can be done many ways. We use something we call research focusing. The research-focusing methodology was developed by Kim Kastens for students in a master's degree program in Earth & Environmental Sciences, and was adapted to the undergraduate intern program by author Dallas Abbott. Three research-focusing meetings are scheduled throughout the summer. The first meeting is in the third week of the program, after students have had time for orientation and, in some cases, a week-long field trip. The meeting includes a PhD-level facilitator and two intern peers. Each student has 20 minutes of time and stays for an hour. The peers are changed at each meeting so the students have the opportunity to learn about other research projects. Peers are also encouraged to ask questions. We also emphasize that we expect that the students will not know the answers to all the questions posed and that this is OK. Because undergraduates are used to tests where all the answers are supposed to be known, this is an essential feature of undergraduate research focusing. We added this feature when it became clear that many undergraduates were intimidated or discouraged when they did not know the answer to every question.
Because the peer group changes at each meeting, we ask each student for a 5 minute summary of their project and to write their question on the board at the start of both the second and third research focusing meeting. This requirement increases student facility with a simple explanation of their research that can be understood by anyone with a scientific background. It also requires students to think about the big picture of their research at each meeting. This is also important because students will be asked to give a 1 minute "commercial" about their research just before the start of the end-of-summer poster session.
The second research-focusing meeting addresses the methods that each student is planning to use. This meeting is held in the fifth week of the intern program. Typically, students view any type of instrument as a black box, and the facilitator presses them to explain the underlying physics or chemistry by which they will be, for example, turning ground-up rocks into numbers, and numbers into insight about the Earth. If sampling is involved, we require the student to articulate and defend the sampling scheme: Why not more samples? Why not more closely spaced? Why not cover a larger area? After a few students have been through this, they begin to see that there is an inherent trade-off between breadth and depth in almost any research project. Students articulate their assumptions and the limitations of their methods, and the facilitator tries to help them see that every method has limitations beyond which it is inappropriate to apply it.
The third research-focusing meeting is held at the beginning of the 8th week of the program, just before the students start to make their posters. At this time, most of the students already have some data. We found that it is too difficult for undergraduate students to address a theoretical interpretation of data without some preliminary data collection. Even then, we find that this is the most difficult research-focusing meeting. But with the guiding questions we provide, each student does, eventually, come up with something interesting and relevant to say for this meeting. This exercise gives them practice at thinking hypothetically. It makes it easier to start thinking about the meaning of their data while they are still in data-acquisition mode, rather than waiting to begin interpretation after it's too late to get more or different data. It prepares them to recognize the unexpected while it is happening, because they have already thought through the expected outcomes. In addition, we talk about displaying their data in poster format. This helps them to see where the data are going and to ask further questions that might be addressed with further data collection or in a senior thesis.
The questions for each meeting are attached. I hope you find this useful.
- Formative assessments using focus groups from Northern Ecosystem Research for Undergraduates (NERU)–from Erik Froburg, University of New Hampshire; Focus Group 1 Questionnaire (Acrobat (PDF) 413kB Sep17 13) and Focus Group 2 Questionnaire (Acrobat (PDF) 386kB Sep17 13)
- NSF/BIO REU Assessment and Evaluation Instruments – They have developed a lengthy, standardized set of questions and a centralized system to collect responses from across the REU network.
- Participant Feedback Form (Microsoft Word 36kB Sep17 13) from David Fields, Research Experience for Undergraduate, Bigelow Laboratory for Ocean Sciences
- Mentor Feedback Survey (Microsoft Word 2007 (.docx) 18kB Sep17 13), Workshop Feedback Survey (Acrobat (PDF) 71kB Sep18 13), Pre-Interview Outline (Acrobat (PDF) 40kB Sep18 13), Post-Interview Outline (Acrobat (PDF) 26kB Sep18 13), and REU Exit Interview (Acrobat (PDF) 79kB Sep18 13)- Shelley Pressley, Washington State University
- Evolution of the Precambrian Rocks of Yellowstone National Park and Surrounding Areas- an NSF/REU project; a case study that illustrates the philosophy, goals, development, and outcomes of this REU project. The End-of-Project Assessment (Microsoft Word 28kB Sep18 13) includes a skills confidence log, feedback on the REU experience, and personal reflections.
- End of REU Questionnaire (Microsoft Word 32kB Sep19 13) – submitted by Daphne LaDue, Real-World Research Experiences for Undergraduates at the National Weather Center, Center for Analysis and Prediction of Storms, University of Oklahoma.
- REU Experience Evaluation (Microsoft Word 29kB Sep19 13) – submitted by Russell Cuhel, Univ. Wisconsin-Milwaukee
Related Resources on Mentoring
"Sometimes the most valuable contribution a mentor can make is just time and attention. It is always surprising to talk to former mentees about their experiences and what they found valuable. Often, their comments focus on a few themes: (1) it helped to have someone believe in my potential, (2) it helped my confidence to know that I could talk or write to someone of your stature, (3) it helped to have you listen to some of my professional development plans and then hear your suggestions.
"When mentoring, don't forget that just your time and attention can have a very significant impact. The combination of the mentor's accessibility and approachability is critical and even small actions can be impactful. Examples may include having lunch with a student and establishing an open-door policy, or in a class setting learning students' names and making a point of requesting student feedback on course material during class time (Gall et al. 2003)."
- Mentoring Agreement (Acrobat (PDF) 64kB Sep17 13) and Student Reflection Document (Acrobat (PDF) 73kB Sep17 13)- submitted by Michael Hubenthal, IRIS
- Mentoring Manual- from Pathways to Science, Institute for Broadening Participation
- Mentoring Underrepresented Students- from COSEE Ocean Sciences
- Why do we Mentor?, Brian Bingham, Western Washington University
- Building Diversity in STEM through Mentoring and Outreach, Ashanti Johnson, University of South Florida
- Strategies for Engaging Scientist Mentors in a Sustained Mentoring Community, Sharon Ziegler-Chong and Noelani Puniwai (University of Hawaii at Hilo)
- Preparation for Your Internship (Microsoft Word 2007 (.docx) 175kB Sep17 13)- document from IRIS to help students prepare for a successful REU experience.
- What Makes a Good STEM Mentor?- by John Platt, IEEE-USA Today's Engineer, May 2013.
- Five Effective Strategies for Mentoring Undergraduates: Students' Perspectives- Mario Pita, Christopher Ramirez, Nathanaelle Joacin, Sarah Prentice, and Christy Clarke, University of Central Florida; CUR Quarterly, Spring 2013 v33 #3.
Some Suggested Resources on Assessment
- User-Friendly Handbook for Project Evaluation - Science, Mathematics, Engineering and Technology Education, 1993, Floraline Stevens, Frances Lawrenz and Laure Sharp, edited by Joy Frechtling. NSF 93-152
- User-Friendly Handbook for Mixed Method Evaluations, 1997, Edited by Joy Frechtling and Laure Sharp. NSF 97-153
- The Role of Formative Evaluation in the Development of an Interdisciplinary Academic Center, Susan B. Millar, NISE Occasional Paper 8, 2000 (PDF File)
- Field Tested Learning Assessment Guide (FLAG) (http://web.archive.org/web/20160313233723/http://www.flaguide.org/), for science, mathematics, engineering, and technology instructors. The FLAG offers broadly applicable, self-contained, modular classroom assessment techniques (CATs) and discipline-specific tools for SMET instructors interested in new approaches to evaluating student learning, attitudes, and performance. Each has been developed, tested, and refined in real college and university classrooms. The FLAG also contains an assessment primer, a section to help you select the most appropriate assessment technique(s) for your course goals, and other resources.
- On-Line Evaluation Resource Library (OERL)- a library developed by SRI for professionals seeking to design, conduct, document, or review project evaluations. OERL is funded by the National Science Foundation (NSF).
- Understanding What Our Students are Learning: Observing and Assessing- a comprehensive web-based module on Assessment from the On the Cutting Edge program.
- The Whys and Hows of Assessment- Carnegie Mellon Eberly Center, Teaching Excellence and Educational Innovation
- Scientific Teaching, by Jo Handelsman, Sarah Miller, and Christine Pfund, the Wisconsin Program for Scientific Teaching; includes a good introduction to assessment.