Evaluation and Education Research

We partner with projects to create human-centered evaluations built on trust, curiosity, and shared learning.

Custom Evaluation for Actionable Insights

The SERC Evaluation Team works with each partner to provide evaluation services tailored to their situation and designed to accelerate change. We strategically design evaluation plans that, whenever possible, incorporate multiple methods and data sources to reduce bias and strengthen projects. We use robust tools to organize, analyze, and communicate data effectively. Our work is grounded in deep experience within higher education and a commitment to helping partners create meaningful, sustainable change.

What makes us distinct is how we work. As a small, collaborative team, the evaluator(s) who design your evaluation plan remain connected throughout the lifetime of your project, while maintaining the flexibility to introduce additional evaluators when specialized expertise or added capacity is needed. We offer continuity and contextual awareness, ensuring the project's goals, history, and evolving context are always understood and reflected in our work.

Embedded within SERC, our team bridges the worlds of higher education, educational research, and technology. We draw on the expertise of SERC's broader professional staff in faculty professional development, online resource design, and collaborative infrastructure. This allows us to design evaluations that integrate the human, technical, and pedagogical dimensions of change. SERC also has particular expertise in professional development initiatives that help faculty adopt effective and active teaching practices. Our evaluations often encompass online learning environments and open educational resources (OER), documenting their role, accessibility, and effectiveness in supporting broad participation and instructional change.

Across all projects, we emphasize clear, effective communication of findings. Our reports and visualizations are crafted to resonate with a variety of audiences, from project teams to funders and policymakers, translating data into insight through creative visual storytelling.

Our Team

Our team brings decades of experience in STEM education research, program evaluation, and teaching assessment, offering a distinctive perspective on evaluation in higher education and beyond. Grounded in both evaluation theory and education research, we can help you articulate a clear theory of change, identify the most appropriate analytical methods, and design assessment tools that meet rigorous standards of validity and reliability. Drawing on our experience as both internal and external evaluators, we take a pragmatic, utilization-focused approach to ensure that our instruments, analyses, and communications generate actionable insight to support data-informed decisions that strengthen strategy, enhance implementation, and demonstrate impact.




Monica Bruckner
she/her
Project Specialist & Evaluation Associate


Ashley Carlson
they/them
Program Analyst


Ellen Iverson
she/her
Director


Kristin O'Connell
she/her
Evaluation & Education Associate


Kerry Vachta
she/they
Evaluation & Education Associate

Our Approach

Collaborative and practical: we partner with you to clarify your goals, identify useful evaluation or research questions, and select robust evaluation methods, while honoring the voices and experiences of participants.

  • Co-created plans that reflect your program's theory of change and context
  • Methods aligned with project goals and decision points for maximum usefulness
  • Rapid feedback cycles so you gain insights when you need them
  • Agility to meet the needs of small pilot projects or large-scale multi-institutional initiatives
  • Clarity and transparency in evaluation designs that are rigorous, inclusive, and useful to decision makers
  • Trust, care, and curiosity through building relationships that support honest reflection

Our Services

SERC's evaluation services span the full cycle, from design through communication of results.

  • Evaluation Design & Planning: Collaborate with teams to develop logic models, theories of change, indicators, and tailored mixed-methods designs
  • Implementation & Data Collection: Coordinate interviews, surveys, classroom observations, focus groups, website analytics, and usability studies
  • Analysis & Synthesis: Integrate qualitative and quantitative evidence to uncover patterns, assess progress, and surface actionable insights
  • Capacity Building: Facilitate interactive workshops and mentoring to strengthen teams' ability to plan, collect, and interpret evidence effectively
  • Data Visualization & Communication: Create accessible, compelling reports and visualizations tailored to different audiences
  • Web Design & Usability Consultation: Assess the usability, accessibility, and effectiveness of user-centered educational websites

Types of Evaluation and Assessment

We bring expertise in a wide range of evaluation designs to help programs understand the "What?", "So what?", and "Now what?", including:

  • Formative and developmental evaluation: Support learning and improvement as programs innovate.
  • Implementation and process evaluation: Examine how programs operate in real-world settings, identifying what supports success and where barriers arise to inform improvements.
  • Outcomes and impact evaluation: Assess what changes occur and why, using mixed methods to connect activities with measurable impacts on people, practices, and systems.
  • Rapid-cycle and real-time feedback: Deliver timely insights to guide decisions through streamlined data collection and visualization.
  • Multi-site and systems-level evaluation: Identify patterns across institutions, communities, and networks with coordinated frameworks that honor local contexts while showing system-wide impact.
  • Student assessment: Measure student learning and development through shared instruments and rubrics, producing reliable, actionable results across diverse courses and institutions.

Special Focus: Evaluating and Researching Teaching and Learning

We bring deep experience helping faculty design, implement, and assess evidence-based teaching practices and flexible instructional materials. Whether you're piloting active learning strategies or scaling curriculum innovations, we can help you:

  • Design rubrics and instruments to measure student learning, engagement, or fidelity of implementation
  • Assess how teaching innovations work across diverse institutional contexts
  • Analyze short- and long-term impacts on student outcomes
  • Develop actionable feedback for instructors and program leaders

Our team members are also active researchers in discipline-based education research (DBER) and STEM teaching and learning, giving us insight into both practical constraints and research-grade rigor.

Evidence in Action

Faculty Change Agents & Instructional Innovation

Challenge: A national STEM initiative sought to understand how faculty can act as catalysts for improving teaching and advancing departmental change across diverse institutions.

Approach: SERC conducted a longitudinal mixed-methods internal evaluation (alongside external evaluation and education research) using surveys, classroom observations, and interviews to track shifts in teaching practices, leadership roles, and institutional engagement.

Outcome: The evaluation documented increases in evidence-based instruction and faculty leadership, along with reduced student equity gaps. These insights informed program improvements and strengthened institutional support for sustained teaching transformation.

Multi-Site Student Assessment in a Multi-Institution Collaboration

Challenge: A network of universities piloting an innovative STEM curriculum needed rigorous evidence of its effectiveness across varied courses and institutional contexts.

Approach: SERC co-led the assessment team, partnering with hundreds of faculty across institutions to co-develop shared instruments and robust rubrics that held meaning across settings. The team built trust, fostered faculty ownership of the process, and managed the logistics of large-scale data collection and the scoring of thousands of student assessments, both multiple-choice and essay, while maintaining IRB compliance, data security, and efficiency.

Outcome: The collaborative effort produced reliable, comparable evidence of student learning gains and generated actionable insights that informed curriculum improvement, faculty development, and broader dissemination.

Impact of Website Design on Teaching Practice

Challenge: Determine whether a long-standing, faculty-authored collection of Open Educational Resources (OER) contributes to improving undergraduate geoscience teaching and supports sustained use of active-learning approaches.

Approach: Using longitudinal national survey data, web analytics, and instructor interviews, we examined how faculty engagement with the OER related to changes in classroom practice and interactions with colleagues.

Outcome: Findings show that well-designed OER is associated with increased active-learning adoption both directly and indirectly, by stimulating conversations with colleagues about content and teaching strategies. This demonstrates how trusted, well-designed resources authored by the community can serve as a proxy for professional development, spark ideas and conversation, and contribute to instructional change at scale.

Assessing Broader/Research Impacts through a National Toolkit

Challenge: A national initiative sought to improve how STEM researchers plan and implement Broader Impacts (BI) to increase the impact of research in society. The team needed evidence on whether BI planning tools were meaningful, usable across disciplines and institutions, and scalable.

Approach: SERC's evaluation team worked with the ARIS center to design and implement a multi-phase study that included developing the ARIS BI Toolkit and Rubric, testing website usability, running institutional pilots, and validating the rubric with a researcher audience.

Outcome: The toolkit and rubric demonstrated reliability and validity and broadened access to capacity-building resources for BI professionals and researchers. Users reported higher confidence and improved BI plans; institutions adopted the tools to support research-education engagement.

Evaluation Project Examples

See how our evaluations help programs learn, adapt, and demonstrate impact.