Not everything that can be counted counts: or how I learned to quit worrying and love the evaluation.
"If you don't know where you are going, you'll end up someplace else." – Yogi Berra
Volumes could be (and have been) written about assessment in higher education. As much as anything, the questions around assessment and evaluation are political. I don't suggest that they are not essential in the development of effective programming, operation, and outcomes from our centers. However, what counts depends upon who is asking.
By and large, we do not act as an evaluation resource for other programs on campus. However, there is a need, and there are resources: a new Center for Assessment, Design, Research and Evaluation (CADRE), and the longstanding efforts of Ethnography and Evaluation Research (E&ER) at the University of Colorado. We also do consulting and development for individual faculty (especially through the new TRESTLE effort, which is housed within our Center).
We also require that any CSL-sponsored effort engage in some form of evaluation – at minimum summative, so that there is documented evidence of the outcomes, impacts, and accomplishments of programs. This is true of our sponsored seed grants (Chancellor's Awards for Excellence in STEM Education) and of initiatives that we run (annual symposia). Often these evaluations are akin to the NSF style of demonstrating outcomes and impacts.
We have engaged in some significant efforts in formative evaluation, both of individual efforts and of our Center. Much of our informal and formal faculty development work is aimed at promoting capacity for formative evaluation. And, through annual reviews of activities, we assess whether or not our CSL programming is achieving the goals we establish. As a result of such evaluations, we have curtailed and modified programmatic offerings.
Of course, a grand challenge is the evaluation of the impact of the Center itself. While we have undergone a number of informal and formal (paid consultant) efforts to identify the strategic roles, mission, and outcomes of the Center, lately we have been focusing our attention not on our own mission but rather on how our efforts (and mission) align with the strategic priorities of the Chancellor and Provost (noting there are priorities that are stated and those that are enacted).
While there is some variation, our campus strategic priorities are: reputation (of the institution), retention (of students), and revenue (new revenue streams).
Reputation: As national attention continues to focus on STEM education, CU-Boulder is seen as a national resource and innovator in this space.
Student Success / Retention / Investing in the Student Experience: The Center incubates, hosts, and advances new models of educational change and effective practices.
Models of Revenue: The Center seeds new funding streams, supports extramurally funded work from foundations and federal sources, and allows for agile and innovative approaches to revenue development.
Within each of these buckets, we have provided examples, such as:
- The Center provides the collective home for many of the most-cited DBER scholars in the NRC 2015 Reaching Students report, as well as for our weekly DBER seminar series. [Reputation]
- The Center serves as resource, connector, and advocate for the nearly 100 programs in STEM education on the CU Boulder campus, advancing our collective mission for excellence and inclusion in STEM education and success for students across initiatives. [Retention]
- Chancellor's Awards to 35 faculty have resulted in 11 NSF grants totaling roughly $5M, and more than $1.5M in F&A (indirect) to this institution. [Revenue]
But of course, not everything that counts can be counted... I firmly believe that the capacity building and cultural development supported by the Center are among its most impactful and long-lasting outcomes. If we are too reductionist in our accounting, we will miss out on the promise of these centers and the foundational purposes of higher education.
Center Profile: Center for STEM Learning - University of Colorado Boulder