Evaluation and Education Research
We partner with projects to create human-centered evaluations built on trust, curiosity, and shared learning.
Custom Evaluation for Actionable Insights
The SERC Evaluation Team works with each partner to provide evaluation services tailored to their situation and designed to accelerate change. We design evaluation plans that, wherever possible, draw on multiple methods and data sources to reduce bias and strengthen projects, and we use proven tools to organize, analyze, and communicate what we learn. Our work is grounded in deep experience within higher education and a commitment to helping partners create meaningful, sustainable change.
What makes us distinct is how we work. As a small, collaborative team, the evaluators who design your evaluation plan remain connected throughout the lifetime of your project, with the flexibility to bring in additional evaluators when specialized expertise or added capacity is needed. This continuity ensures the project's goals, history, and evolving context are always understood and reflected in our work.
Embedded within SERC, our team bridges the worlds of higher education, educational research, and technology. We draw on the expertise of SERC's broader professional staff in faculty professional development, online resource design, and collaborative infrastructure. This allows us to design evaluations that integrate the human, technical, and pedagogical dimensions of change. SERC also has particular expertise in professional development initiatives that help faculty adopt effective and active teaching practices. Our evaluations often encompass online learning environments and open educational resources (OER), documenting their role, accessibility, and effectiveness in supporting broad participation and instructional change.
Across all projects, we emphasize clear, effective communication of findings. Our reports and visualizations are crafted to resonate with a variety of audiences, from project teams to funders and policymakers, translating data into insight through creative visual storytelling.
Our Team
Our team brings decades of experience in STEM education research, program evaluation, and teaching assessment, offering a distinctive perspective on evaluation in higher education and beyond. Grounded in both evaluation theory and education research, we can help you articulate a clear theory of change, identify the most appropriate analytical methods, and design assessment tools that meet rigorous standards of validity and reliability. Drawing on our experience as both internal and external evaluators, we take a pragmatic, utilization-focused approach to ensure that our instruments, analyses, and communications generate actionable insight to support data-informed decisions that strengthen strategy, enhance implementation, and demonstrate impact.
Our Approach
Collaborative and practical: we partner with you to clarify your goals, identify useful evaluation or research questions, and select robust evaluation methods, while honoring the voices and experiences of participants.
- Co-created plans that reflect your program's theory of change and context
- Methods aligned with project goals and decision points for maximum usefulness
- Rapid feedback cycles so you gain insights when you need them
- Agility to meet the needs of small pilot projects or large-scale multi-institutional initiatives
- Clarity and transparency in evaluation designs that are rigorous, inclusive, and useful to decision makers
- Trust, care, and curiosity, built through relationships that support honest reflection
Our Services
SERC's evaluation services span the full cycle, from design through communication of results.
- Evaluation Design & Planning: Collaborate with teams to develop logic models, theory of change, indicators, and tailored mixed-methods designs
- Implementation & Data Collection: Coordinate interviews, surveys, classroom observations, focus groups, website analytics, and usability studies
- Analysis & Synthesis: Integrate qualitative and quantitative evidence to uncover patterns, assess progress, and surface actionable insights
- Capacity Building: Facilitate interactive workshops and mentoring to strengthen teams' ability to plan, collect, and interpret evidence effectively
- Data Visualization & Communication: Create accessible, compelling reports and visualizations tailored to different audiences
- Web Design & Usability Consultation: Assess usability, accessibility, and effectiveness of user-centered educational websites
Types of Evaluation and Assessment
We bring expertise in a wide range of evaluation designs to help programs answer the "What?", "So what?", and "Now what?" questions, including:
- Formative and developmental evaluation: Support learning and improvement as programs innovate.
- Implementation and process evaluation: Examine how programs operate in real-world settings, identifying what supports success and where barriers arise to inform improvements.
- Outcomes and impact evaluation: Assess what changes occur and why, using mixed methods to connect activities with measurable impacts on people, practices, and systems.
- Rapid-cycle and real-time feedback: Deliver timely insights to guide decisions through streamlined data collection and visualization.
- Multi-site and systems-level evaluation: Identify patterns across institutions, communities, and networks with coordinated frameworks that honor local contexts while showing system-wide impact.
- Student assessment: Measure student learning and development through shared instruments and rubrics, producing reliable, actionable results across diverse courses and institutions.
Special Focus: Evaluating and Researching Teaching and Learning
We bring deep experience helping faculty design, implement, and assess evidence-based teaching practices and flexible instructional materials. Whether you're piloting active learning strategies or scaling curriculum innovations, we can help you:
- Design rubrics and instruments to measure student learning, engagement, or fidelity of implementation
- Assess how teaching innovations work across diverse institutional contexts
- Analyze short- and long-term impacts on student outcomes
- Develop actionable feedback for instructors and program leaders
Our team members are also active researchers in discipline-based education research (DBER) and STEM teaching and learning, giving us insight into both practical constraints and research-grade rigor.
Evidence in Action
Faculty Change Agents & Instructional Innovation
Challenge: A national STEM initiative sought to understand how faculty can act as catalysts for improving teaching and advancing departmental change across diverse institutions.
Approach: SERC conducted a longitudinal mixed-methods internal evaluation (alongside external evaluation and education research) using surveys, classroom observations, and interviews to track shifts in teaching practices, leadership roles, and institutional engagement.
Outcome: The evaluation documented increases in evidence-based instruction and faculty leadership, along with reduced student equity gaps. These insights informed program improvements and strengthened institutional support for sustained teaching transformation.
Publications:
- Bragg, D.D., Eddy, P., Iverson, E.R., Hao, Y., & O'Connell, K. (2022). Lessons from research and evaluation on faculty as change agents of teaching and campus reform. New Directions for Community Colleges, 2022(199), 215-228.
- O'Connell, K., Eddy, P.L., Iverson, E.R., & Macdonald, R.H. (2022). Faculty Change Agent Model: Cultivating Faculty to Catalyze Change. Change: The Magazine of Higher Learning, 54(3), 11-18.
- Iverson, E., Bragg, D.D., & Eddy, P.L. (2020). How faculty change agents enact mid-level leadership in STEM. New Directions for Community Colleges, 191, 67-79.
- Macdonald, R.H., Beane, R.J., Baer, E.M.D., Eddy, P., Emerson, N.R., Hodder, J., Iverson, E.R., McDaris, J.R., O'Connell, K., & Ormand, C.J. (2019). Accelerating Change: The Power of Faculty Change Agents to Promote Diversity and Inclusive Teaching Practices. Journal of Geoscience Education, 67, 330-339.
- Eddy, P.L., Hao, Y., Markiewicz, C., & Iverson, E. (2018). Faculty change agents as adult learners: The power of situated learning. Community College Journal of Research and Practice.
Multi-Site Student Assessment in a Multi-Institution Collaboration
Challenge: A network of universities piloting an innovative STEM curriculum needed rigorous evidence of its effectiveness across varied courses and institutional contexts.
Approach: SERC co-led the assessment team, partnering with hundreds of faculty across institutions to co-develop shared instruments and robust rubrics that held meaning across settings. The team built trust, fostered faculty ownership of the process, and managed the logistics of large-scale data collection, scoring thousands of student assessments (both multiple-choice and essay) while maintaining IRB compliance, data security, and efficiency.
Outcome: The collaborative effort produced reliable, comparable evidence of student learning gains and generated actionable insights that informed curriculum improvement, faculty development, and broader dissemination.
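To illustrate what "reliable, comparable evidence" means operationally, the sketch below shows one standard agreement check for rubric scoring: a weighted Cohen's kappa between two raters. The scores and the 0-4 scale are synthetic stand-ins, not data or code from this project.

```python
# Illustrative only: a minimal inter-rater reliability check for rubric
# scoring, using synthetic 0-4 rubric levels from two raters on the
# same set of student responses.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 2, 4, 1, 3, 2, 0, 4, 3, 2]
rater_b = [3, 2, 3, 1, 3, 2, 1, 4, 3, 3]

# Quadratic weighting credits near-misses on an ordinal scale, so a
# 3-vs-4 disagreement costs less than a 0-vs-4 disagreement.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"Quadratically weighted kappa: {kappa:.2f}")
```

Agreement checks like this, run early and repeated as scoring scales up, are what allow scores from many raters at many institutions to be pooled with confidence.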
Publications:
- Iverson, E.R., Steer, D., Gilbert, L.A., Kastens, K.A., O'Connell, K., & Manduca, C.A. (2019). Measuring Literacy, Attitudes, and Capacities to Solve Societal Problems. In D. Gosselin, A. Egger, & J. Taber (Eds.), Interdisciplinary Teaching About Earth and the Environment for a Sustainable Future. AESS Interdisciplinary Environmental Studies and Sciences Series. Springer, Cham.
- Iverson, E.R., & Wetzstein, L. (2020). Connecting learning about the earth to societal issues: Downstream effects on faculty teaching. In J. Ostrow (Ed.), Teaching about Sustainability across Higher Education Coursework, New Directions for Teaching and Learning. San Francisco: Jossey-Bass.
- O'Connell, K., Bruckner, M.Z., Manduca, C.A., & Gosselin, D.C. (2015). Supporting Interdisciplinary Teaching about the Earth with the InTeGrate Website. Journal of Environmental Studies and Sciences, 6, 354.
Impact of Website Design on Teaching Practice
Challenge: Determine whether a long-standing, faculty-authored collection of open educational resources (OER) contributes to improved undergraduate geoscience teaching and supports sustained use of active-learning approaches.
Approach: Using longitudinal national survey data, web analytics, and instructor interviews, we examined how faculty engagement with the OER collection related to changes in classroom practice and to interactions with colleagues.
Outcome: Findings show that well-designed OER are directly associated with increased active-learning adoption and indirectly support it by stimulating conversations with colleagues about content and teaching strategies. This demonstrates how trusted, well-designed, community-authored resources can serve as a proxy for professional development, spark ideas and conversation, and contribute to instructional change at scale.
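As a schematic illustration of how joined survey and analytics data can probe the direct association described above, consider the sketch below. The data file and column names (faculty_survey_with_analytics.csv, active_learning_freq, oer_minutes, course_size) are hypothetical, and this is not the study's actual model.

```python
# Hypothetical sketch: relate faculty OER engagement (from web analytics)
# to self-reported active-learning frequency (from surveys), controlling
# for course size. File and column names are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("faculty_survey_with_analytics.csv")

# Ordinary least squares with a simple covariate; a real analysis would
# weigh measurement choices, clustering, and model diagnostics.
model = smf.ols("active_learning_freq ~ oer_minutes + course_size", data=df).fit()
print(model.summary())
```

Testing the indirect pathway, where OER use sparks collegial conversations that in turn shift practice, would call for a mediation model beyond this sketch.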
Publications:
- Manduca, C.A., O'Connell, K., Fox, S., Iverson, E.R., Altermatt, E., & Huyck Orr, C. (in preparation). Open Education Resources that Improve Faculty Teaching Practices.
- Manduca, C.A., Iverson, E., Luxenberg, M., Macdonald, R.H., McConnell, D., Mogk, D., & Tewksbury, B. (2017). Improving undergraduate STEM education: The efficacy of discipline-based professional development. Science Advances, 3(2).
- Manduca, C.A., Fox, S., & Iverson, E.R. (2006). Digital Library as Network and Community Center. D-Lib Magazine, 12(12).
- Fox, S., Manduca, C.A., & Iverson, E. (2005). Building Educational Portals atop Digital Libraries. D-Lib Magazine, 11(1).
Assessing the Broader Impacts of Research through a National Toolkit
Challenge: A national initiative sought to improve how STEM researchers plan and implement Broader Impacts (BI) activities to increase the societal impact of research. The team needed evidence on whether BI planning tools were meaningful, usable across disciplines and institutions, and scalable.
Approach: SERC's evaluation team worked with the ARIS center to design and implement a multi-phase study that included development of the ARIS BI Toolkit and Rubric, website usability testing, institutional pilots, and validity testing of the rubric with a researcher audience.
Outcome: The toolkit and rubric demonstrated reliability and validity and broadened access to capacity-building resources for BI professionals and researchers. Users reported higher confidence and improved BI plans; institutions adopted the tools to support research-education engagement.
Publications:
- Iverson, E.R., O'Connell, K., McDonnell, J., Renoe, S., & Hotaling, L. (2024). The Reliability and Validity of the ARIS Broader Impacts Rubric. Journal of Community Engagement and Scholarship, 17(2).
- Hotaling, L., Lichtenwalner, C.S., McDonnell, J., O'Connell, K., & Ferraro, C. (2024). The ARIS Broader Impacts Toolkit: An Online Guide to Support Broader Impact Project Development and Evaluation for Researchers and BI Professionals. Journal of Community Engagement and Scholarship, 17(2).
- Bosley, J., Dwyer, M., Mullen, T.G., Bohlin, A., Tan, W.A., Pelland, C.M., Iverson, E.R., & Meier, N. Leveraging the ARIS BI Toolkit to Equip Faculty for Career - and CAREER - Success. Journal of Community Engagement and Scholarship, 17(2).