Evaluation, Assessment, & Research
Comprehensive Project Evaluation
Designing From Experience
Designing strong programs that meet measurable objectives and show evidence of reaching impact goals can be challenging. We bring more than a decade of experience in guiding the design of projects, websites, curricula, and programs. With a strong grounding in the program evaluation literature, we can help you articulate a theory of change that underpins your program objectives, determine which analytical methods are best suited to your needs, and develop assessment tools and instruments.
Sophisticated Data Collection and Analysis
Because the research questions we address are varied and cover a wide range of teaching and learning topics, our approaches must be equally varied. We are highly skilled at designing and executing projects that include both qualitative and quantitative methods. Our survey experience is extensive, ranging from national surveys aimed at understanding large-scale community change to smaller-scale participant or web pop-up surveys. From descriptive statistics to statistical hypothesis testing, we provide full analysis of quantitative data. When a project calls for a qualitative approach, we have over 20 years of experience in conducting grounded theory analysis of data from focus groups, interview studies, and cognitive walkthroughs. In addition to these traditional methods, we have specialized knowledge in web analytics, observation protocols, and artifact analysis.
SERC has conducted evaluation and assessment studies on projects ranging from small-scale efforts or single aspects of a program to large-scale, complex programs. Whether the project takes place at a single site or across multiple sites, our approaches are scalable and tailored to the needs of the program. We take a situational analysis approach, fitting the evaluation plan to the context, interests, and needs of intended users. Our experience working in different contexts, across disciplines, and at a range of scales helps us design an evaluation plan that documents the maximum impact of your program.
Effective programs use sustained approaches that iterate over time. Projects that work with us benefit from strategies that take the long view in identifying when and how to gather data to inform program improvements and highlight early indicators of success. Everything we learn from our evaluation and assessment efforts informs our work on similar projects. In this way, collaborators gain from our cumulative understanding of how a program evolves and how best to develop evaluation strategies that build on previous work and inform future work throughout the life of the program.
Game-Changing Research on Education
Our experience encompasses a range of educational research topics, including the impact of faculty development on teaching, the nature of teaching practice and teacher beliefs, and the relationship between specific pedagogies, curricula, or textbooks and student learning. Examples of our research include ethnographic studies of the culture of teaching and learning on a specific campus, paired comparison artifact studies measuring the relationship between faculty development, assignment design, and student writing, and a matched cohort study characterizing the influence of whole-student strategies and their relation to persistence in STEM. In addition, we have particular knowledge of the research on thinking and learning in the geosciences.
Research on Spatial Cognition
We are recognized leaders and sought-after collaborators in research relating to spatial reasoning in the STEM disciplines. We have strong ties to the geoscience education community, the cognitive science community, and partners in the petroleum industry. With these collaborators, we are able to explore the processes of spatial cognition, tools and strategies for supporting spatial thinking, and the expert-novice continuum within the context of spatial thinking in STEM.
Communication of Results
We synthesize evidence and place it within a situational context. When collaborating with us, you have a partner in disseminating your findings to the right audience with outstanding analysis, writing, and presentation skills. We present findings via professional meeting presentations, reports, blogs, web pages, and other means. Our published work can be found in journals such as Research in Science Education, the Journal of College Science Teaching, the Journal of Geoscience Education, Assessing Writing, Change: The Magazine of Higher Learning, the Journal of Structural Geology, the International Journal on Digital Libraries, D-Lib Magazine, and the AAPG Explorer. We regularly present our findings at professional society meetings, including the Geological Society of America annual meeting, the American Geophysical Union Fall Meeting, and meetings of the Association of American Colleges and Universities, the American Evaluation Association, the Professional and Organizational Development Network in Higher Education, and many more.
Get Started Working with SERC
We conduct evaluation for a wide range of programs, including formative and summative assessments of program activities, website impact studies, and team evaluations.
Formative assessments are conducted at key points during the program timeline, allowing for adjustments and informing decisions as the program evolves. Summative assessments take place at the completion of key activities; they help measure outcomes and determine the value of the program, providing data for project deliverables such as website or report text.
For website impact, we work to characterize website use, compare traffic and page views, and generally provide information about the success and efficacy of a project website. Cognitive walkthroughs of website usability, combined with analytics data, can characterize how a site is being used, reveal strengths and potential cross-linkages, and describe how users spend their time on the site and whether they are able to achieve their goals.
We conduct team evaluations to determine how members of a team are working toward project goals and to provide accurate information about bottlenecks and organizational problems. Evaluating effective practices can greatly improve a team's performance by revealing issues such as whether all team leaders share a vision for achieving the project's goals and whether team members see ways that communication strategies could be improved.
The base cost for most evaluation efforts typically starts at around ten percent of the total budget for a given project. For projects that require more time-intensive methods, or where evaluation is central to the work of the project, a more substantive share of the project budget may be needed to support the work.
We can help you design and use instruments and measurements of student learning to support the development of curriculum, assess new pedagogical approaches, support educational research methods, and accurately measure the intended goals of a particular curriculum. We do this by consulting with faculty in designing measurable goals, developing high-quality curriculum that addresses the stated goals, and identifying existing assessments or designing new ones that align to the goals and material taught. Additionally, we can provide guidance in developing assessment instruments for your particular research study and validate and test the reliability of these instruments.
The Serckit content management system gives us a safe and efficient means to help researchers and faculty members manage and analyze student assessment data. We understand that preserving students' anonymity is essential in many data collection situations, and we have systems in place to ensure that student identities are protected at every stage of data collection and analysis. The SERC assessment team can help write IRB applications and discuss appropriate options for your specific situation.
Projects that require our Serckit data collection system, which preserves students' anonymity, provides tools for independent grading, and includes technical consulting and web support, typically start at $20,000 per year, with costs increasing depending on the nature and size of the collection activities.
Contact us to discuss assessing student learning »
The SERC assessment team serves both as consultants on projects led by our collaborators and as primary investigators in education research. Areas where we are actively conducting research include:
- Teaching Practice & Philosophy Research
We work to describe, contextualize, and understand the way faculty teach undergraduate classes. We seek to understand what faculty view as assumptions, barriers, and motivators for using active teaching approaches in their classes. We are also trained in the Reformed Teaching Observation Protocol (RTOP) and have a cadre of other educational researchers trained in this protocol. Using RTOP allows us to gather a direct measure of faculty teaching practice, which can be combined with self-report methods (surveys and interviews) to provide a more comprehensive picture of practice.
- Research on Spatial Thinking and Learning
We seek to use information about how the brain works to develop methods to improve teaching and learning. Specifically, we look to find ways to improve the teaching of spatial skills in geoscience courses. We also work to understand what the common challenges are to students' ability to think spatially in order to determine best practices that faculty can use to help students build spatial skills. A major question in this work is "what does cognitive science research tell us about spatial cognition, and how can STEM faculty use that to inform their teaching of spatial skills and concepts?"
- Website Impact Research
For projects in which the website is a critical component of the theory of change, we develop research designs that investigate the role of the website in supporting change. We measure use and characterize impact using standard web analytic measures and grounded theory analyses of web logs, combined with more traditional surveys and interviews. In addition, our team has extensive experience with scenario-based cognitive walkthroughs as a method for exploratory research designs that aim to identify and test assumptions about sample populations.
- First-Phase Research and Needs Assessments
We can help design and conduct studies in the exploratory phases of a project or assist in building a research base for larger proposals. For example, one goal of this type of work may be to determine how developing, administering, and scoring embedded assessments during a pilot study can yield "lessons learned" that guide the research design of the proposed project. Needs assessments can also determine what unmet needs exist in a program or project, helping to focus resources efficiently on the most pressing needs.