Develop an Evaluation Plan
How will the evaluation meet the goals of your center? The evaluation plan should provide the feedback needed to confirm that the center's programming is delivering on targeted outcomes, as well as to show where improvements can be made.
Evaluation plans need to include both practices and tools. One resource to help you with this is the Better Evaluation website and its Rainbow Framework.
Stages of Evaluation
All centers evolve over time and will find themselves in different stages of development. Assessment indicators should be developmentally appropriate for where a center currently finds itself. Centers should anticipate the measurable impacts at each stage and plan for them, while remaining flexible as situations continually change. Updating an existing plan is always more effective than trying to develop assessment retroactively. This will allow the center to strategically and coherently collect and use evaluation data (from initial baseline through long-term impacts) throughout its life cycle.
Missioning and visioning phase - Has the center captured a clear consensus from stakeholders? How has this been confirmed? Possible ways to assess this include feedback forms throughout the visioning process or focus groups with outside groups to field-test your mission and vision.
Initial implementation phase - Collect information about individuals who are applying for and participating in center programs. It is important to build a database (demographics, contact information, levels of participation) early and to revisit its structure regularly as the center grows. Think about how to systematize data collection across center activities: a common trap is neglecting to view assessment systemically, which produces orphaned datasets that limit the ability to talk about the center's larger work. Relational database tools (Access, Oracle, FileMaker Pro) will ultimately allow for more powerful analyses of long-term programmatic impact; one possible schema is sketched below. Locate and initiate partnerships with professional networks (NSEC, POD) to leverage resources.
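As a minimal sketch of what a systematized, center-wide database might look like, the snippet below builds a small relational schema using SQLite from Python's standard library (a lightweight stand-in for the tools named above). The table and column names are illustrative assumptions, not a prescribed design.

```python
import sqlite3

# Illustrative schema: one shared participant record referenced by every
# center activity, so no dataset becomes orphaned from the rest.
conn = sqlite3.connect("center_evaluation.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS participants (
    participant_id INTEGER PRIMARY KEY,
    name           TEXT NOT NULL,
    email          TEXT,   -- keep current (see intermediate phases)
    demographics   TEXT    -- e.g., coded categories
);

CREATE TABLE IF NOT EXISTS programs (
    program_id INTEGER PRIMARY KEY,
    title      TEXT NOT NULL
);

-- One row per participant per program event, so levels of participation
-- can be aggregated across the center's whole portfolio.
CREATE TABLE IF NOT EXISTS participation (
    participant_id INTEGER REFERENCES participants(participant_id),
    program_id     INTEGER REFERENCES programs(program_id),
    event_date     TEXT,   -- ISO format, e.g., '2024-03-15'
    role           TEXT    -- e.g., attendee, facilitator
);
""")
conn.commit()
```

The design choice to key everything to a single participant ID is what later enables cross-activity and longitudinal analyses.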
Intermediate phases - Develop and validate assessment tools. Where applicable and available, prefer pre-existing validated tools; it may be necessary to validate a tool for a particular population if that has not already been done. This is an opportunity to test connections across multiple data sources and to validate the evaluation process itself. Other data sources for triangulating the evaluation can include surveys, interviews, focus groups, and observations, used to monitor and refine programming and to begin building a descriptive account of programmatic experiences and center impacts (see the sketch below). Thinking forward, it is important to ensure that contact information remains current and durable.
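To illustrate triangulation in practice, the sketch below assumes the schema from the previous example and adds a hypothetical survey_responses table keyed to the same participant IDs, so instrument scores can be read alongside participation records. The instrument and score fields are assumptions for illustration.

```python
import sqlite3

conn = sqlite3.connect("center_evaluation.db")
# Hypothetical table of survey responses keyed to the shared participant IDs,
# so survey, interview, and observation data can all be joined later.
conn.execute("""
CREATE TABLE IF NOT EXISTS survey_responses (
    participant_id INTEGER REFERENCES participants(participant_id),
    instrument     TEXT,   -- which validated tool was administered
    administered   TEXT,   -- date, to support pre/post comparisons
    score          REAL
)
""")
conn.commit()

# Triangulation query: pair each survey score with the participant's
# recorded level of participation, so responses are read in context.
rows = conn.execute("""
    SELECT p.name,
           s.instrument,
           s.score,
           COUNT(pa.program_id) AS events_attended
    FROM participants p
    JOIN survey_responses s   ON s.participant_id = p.participant_id
    LEFT JOIN participation pa ON pa.participant_id = p.participant_id
    GROUP BY p.participant_id, s.instrument, s.administered
""").fetchall()
```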
Advanced stages - Determine what longitudinal descriptive indicators are saying about the impacts of center programming; one such indicator is sketched below. If results are promising, begin planning experimental studies to test hypotheses about these programs. This is where partnerships that provide access to long-term indicators become valuable (e.g., offices of institutional research, partner school districts, service providers; see the partnerships page). Publish the results, but also think more broadly about dissemination beyond scholarly publications, and don't forget to report the center's work back to stakeholders.
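As one hedged example of a longitudinal descriptive indicator, the query below tallies unique participants and total event attendance by year from the participation table sketched earlier. It assumes event_date is stored as ISO-format text; the metric itself is illustrative, not mandated.

```python
import sqlite3

conn = sqlite3.connect("center_evaluation.db")
# Descriptive longitudinal indicator: reach and engagement by year.
for year, unique_participants, total_events in conn.execute("""
    SELECT strftime('%Y', event_date) AS year,
           COUNT(DISTINCT participant_id) AS unique_participants,
           COUNT(*) AS total_events
    FROM participation
    GROUP BY year
    ORDER BY year
"""):
    print(year, unique_participants, total_events)
```

A flat or rising trend here is only descriptive; the text above rightly treats it as grounds for planning experimental studies, not as proof of impact.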
The ability to accomplish the identified assessment goals does depend upon resources and constraints; however, considering how the center's assessment needs will grow allows for longer-term planning.
Evolution of the Evaluation Plan
How do you plan for the evolution of an assessment and evaluation plan? Building a robust plan can be accelerated by identifying a model to guide the development of your assessment capacity. One way to conceptualize this evolution is to map out how your assessment needs will emerge over time. While initially challenging, spending the time and effort to construct an assessment and evaluation plan up front will prove highly beneficial in the long run.
At the early stages in a center's life cycle, consider outsourcing assessment and evaluation projects to an External Evaluator when a project's budget includes the resources. This allows center personnel to learn by watching and to develop familiarity with assessment and evaluation in a low-stakes situation.
As resources grow, consider a center position dedicated to assessment and evaluation (e.g., an Evaluation Coordinator).
With sufficient experience and success, a center can serve as an external evaluator for other programs. This can be a powerful way of supporting a collaborative vision and mission.
The Discovery Learning Research Center has developed significant expertise in evaluation and offers that experience to partners on- and off-campus.