Assessment and Project Evaluation
An external evaluation team focuses on measuring the project's impact on programming, the associated impact on student learning, and the ultimate impact on students' ability and willingness to engage in societal roles addressing the sustainability of our civilization and our environment. The external evaluation team comprises a geoscientist, who is independent of the development teams, and a professional evaluation group. This combination provides an understanding of the nuances of the program's strategies and goals, and of the community that is striving to attain those goals.
The InTeGrate materials development process is a unique model for creating pedagogically strong, adoptable, and adaptable teaching materials. The assessment team assists throughout the materials development process using the following tools and processes:
- A review and revision process to ensure that the materials are of highest quality, align with project-level goals, and have measurable outcomes, and
- Field tests in three different classrooms settings where formative and summative assessments inform the instructors of needed revisions, while project-level assessment instruments measure the impact on student learning related to project-level themes.
Materials Review and Revision
A materials rubric that uses a constructive alignment conceptual framework guides developers at inception, during testing, and through final field analysis of curricular materials to align the materials with project goals.
Before field-testing the materials in the classroom, all curricula are reviewed by three members of the assessment team using the InTeGrate Materials Development Rubric v6, which incorporates the broad goals of the InTeGrate project and research-based guidelines for best practices in curriculum development. All curricula must receive the full score for the overarching goals of the project to ensure that they:
- address one or more grand challenges involving Earth and society,
- develop student ability to address interdisciplinary problems,
- improve student understanding of the nature and methods of geoscience and develop geoscientific habits of mind,
- make use of authentic and credible data to learn central concepts in the context of scientific methods of inquiry,
- incorporate systems thinking.
Measuring Impact of Curriculum on Student Learning
All materials are field-tested by a minimum of three faculty in different classroom settings. All field-testing classrooms collect measures of student learning that include:
Geoscience Literacy Exam (GLE)
As part of the InTeGrate goal of improving geoscience literacy, the assessment team developed the Geoscience Literacy Exam (GLE) as one of the tools to quantify the effectiveness of these materials on students' geoscience literacy. The GLE instrument aligns with the content and concepts of the Earth, Climate, Atmosphere, and Ocean Science literacy documents. The full instrument (60 multiple-choice questions and 30 essay questions) is used to measure geoscience literacy from introductory, non-science students to upper-level geoscience majors. A subset of 8 GLE questions, the GLE-8, was administered in all field tests of InTeGrate materials; it includes two multiple-choice questions related to each of the four literacy documents.
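The GLE-8 structure described above (two multiple-choice items per literacy document) lends itself to per-document subscores. The sketch below illustrates that scoring scheme; the item IDs, answer key, and item-to-document mapping are hypothetical, not the actual instrument.

```python
# Hypothetical sketch of GLE-8 subscoring: 2 multiple-choice items per
# literacy document (Earth, Climate, Atmosphere, Ocean). The item IDs and
# mapping below are illustrative only.
from collections import defaultdict

ITEM_TO_DOCUMENT = {
    "q1": "Earth", "q2": "Earth",
    "q3": "Climate", "q4": "Climate",
    "q5": "Atmosphere", "q6": "Atmosphere",
    "q7": "Ocean", "q8": "Ocean",
}

def gle8_subscores(responses, answer_key):
    """Return per-literacy-document counts of correct answers (0-2 each)."""
    scores = defaultdict(int)
    for item, doc in ITEM_TO_DOCUMENT.items():
        if responses.get(item) == answer_key[item]:
            scores[doc] += 1
        else:
            scores[doc] += 0  # ensure every document appears in the result
    return dict(scores)

key = {"q1": "a", "q2": "b", "q3": "c", "q4": "d",
       "q5": "a", "q6": "b", "q7": "c", "q8": "d"}
student = {"q1": "a", "q2": "b", "q3": "c", "q4": "a",
           "q5": "a", "q6": "x", "q7": "c", "q8": "d"}
print(gle8_subscores(student, key))  # e.g. {'Earth': 2, 'Climate': 1, ...}
```

Subscores like these make it possible to report literacy gains per document rather than only a single total.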
Interdisciplinary and Systems Thinking Essay Questions
Two essay questions were designed to measure two of the overarching project themes of 1) developing students' ability to address interdisciplinary problems, and 2) fostering systems thinking. Administered at the end of the term as a post-content measure, the essays examine the impact of the InTeGrate materials on students' understanding of interdisciplinary societal issues and thinking about systems.
Student Attitudinal Survey
In addition to covering geoscience content knowledge and understanding, the attitudinal survey is intended to probe InTeGrate students' ability and motivation to use their geoscience expertise to address problems of environmental sustainability. The InTeGrate online survey of student attitudes was refined and administered in courses taught by 11 instructors across 4 modules.
The survey collects demographic data and probes (a) students' interest in careers related to the Earth and environment and (b) their motivations toward taking actions in their personal and professional lives that will contribute to solving grand challenges of natural resources and environmental sustainability. Surveys are administered pre- and post-instruction, using an anonymizing procedure to maintain student confidentiality while allowing matching of pre- and post-instruction responses. Demographic items ask about gender, race/ethnicity, age, year in school, and college major. Career questions ask students to rate their interest in specific careers and to indicate how important it is to them to have a career in which they use their knowledge of the Earth and environment. Motivation towards environmental sustainability is probed directly through questions that ask about concern or attitude and indirectly through questions about behaviors.
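The anonymizing procedure mentioned above is not specified in detail. One common approach, sketched here purely as an assumption, is to have each student supply a self-generated code and store only a hash of it, so that raw identifiers are never retained but pre- and post-instruction responses can still be paired.

```python
# Illustrative sketch of one common anonymized pre/post matching scheme.
# The actual InTeGrate procedure is not documented here; the salt, code
# format, and record fields below are hypothetical.
import hashlib

SALT = "course-section-salt"  # per-course salt, hypothetical

def anonymous_id(student_code: str) -> str:
    """Hash a student-generated code so raw identifiers are never stored."""
    normalized = student_code.strip().lower()
    return hashlib.sha256((SALT + normalized).encode()).hexdigest()[:12]

def match_pre_post(pre, post):
    """Pair pre- and post-instruction records that share an anonymous ID."""
    post_by_id = {r["id"]: r for r in post}
    return [(p, post_by_id[p["id"]]) for p in pre if p["id"] in post_by_id]

pre = [{"id": anonymous_id("AB-14"), "interest": 3},
       {"id": anonymous_id("CD-02"), "interest": 2}]
post = [{"id": anonymous_id("ab-14 "), "interest": 4}]  # normalization matches

pairs = match_pre_post(pre, post)
print(len(pairs))  # 1 matched pair
```

Normalizing the code before hashing (trimming whitespace, lowercasing) reduces lost matches from students typing their code slightly differently on the two occasions.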
Analysis of responses from the 258 students who provided both pre- and post- responses shows small but statistically significant improvements across instruction on some measures of career interest and motivation towards environmental sustainability, with considerable variability between instructors and modules.
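An analysis of matched pre/post responses like the one reported above typically computes the mean gain per item and tests it against zero. The sketch below shows that style of paired analysis on synthetic data (the real dataset had 258 matched students), using only the standard library; comparing the t statistic to a t distribution for a p-value would normally follow.

```python
# Sketch of a paired pre/post analysis on synthetic Likert-style data.
# Data values are invented; only the method is illustrated.
import math
import statistics

def paired_t(pre_scores, post_scores):
    """Return (mean gain, paired t statistic) for matched samples."""
    diffs = [post - pre for pre, post in zip(pre_scores, post_scores)]
    n = len(diffs)
    mean_gain = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)          # sample standard deviation
    t = mean_gain / (sd / math.sqrt(n))   # paired t statistic
    return mean_gain, t

pre_scores = [2, 3, 3, 4, 2, 3, 3, 2]
post_scores = [3, 3, 4, 4, 3, 4, 3, 3]
gain, t = paired_t(pre_scores, post_scores)
print(f"mean gain={gain:.3f}, t={t:.2f}")
```

The "considerable variability between instructors and modules" noted above would correspond to running this per instructor or per module and comparing the resulting gains.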
Summative and Formative Assessments
Summative assessments help evaluate whether instructional materials and strategies are meeting the stated learning objectives, including the overall project goals. These assessments are part of the flow of the class, are graded by faculty using a rubric, and demonstrate students' summative achievement of the learning outcomes. A single summative assessment replaced the three embedded assessments after field testing of the first round of modules, because the embedded assessments did not provide sufficient evidence of the materials' ability to meet learning goals.
Formative assessments appear throughout the InTeGrate teaching materials, giving instructors feedback about student learning while materials are being taught. These assessments have benefited from assessment team input ensuring alignment with module or course goals and have been tested in multiple settings, as per the InTeGrate materials development process.
Surveys were administered to faculty who participated in InTeGrate following their use of InTeGrate materials or participation in an event. The surveys aim to help the project better understand the contexts in which adaptation and adoption are taking place. These context-rich reflections are included as "instructor stories" with each module or course to facilitate the use of InTeGrate materials in many different contexts.
Additionally, follow-up "reach" faculty surveys have been distributed to internal and external faculty who have participated in InTeGrate, either by using materials in their classrooms or by attending an InTeGrate-related workshop or webinar. These surveys allow the InTeGrate project to gauge how many faculty and students are using InTeGrate materials and provide a mechanism for feedback to the project.
The project evaluation is being undertaken by Dr. Kim Kastens (Education Development Center) and Carol Baldassari (Senior Research Associate, Program Evaluation and Research Group, Lesley University). Dr. Frances Lawrenz (University of Minnesota) serves as a consultant to the evaluation team.
Project evaluation focuses on the measurable impact on programming, the associated impact on student learning, and the ultimate impact on students' ability and willingness to engage in societal roles addressing the sustainability of our civilization and our environment. Because the community-based project design depends critically on successful collaboration among partners who are dispersed by geography, discipline, and institution type, the external evaluation team provides formative feedback on the evolution of the partnership and sub-partnerships, their roles and responsibilities, and their ability to work together effectively. Key areas of interest are the alignment of members' work with the project design and intended goals, potential benefits and costs of their participation in the project, their commitment to project goals and activities over time, and the effectiveness of the structures and processes the Center creates and maintains to foster communication and collaboration. Data collection efforts include in-depth interviews, surveys, participation in on-line meetings and phone conferences, and reviews of program artifacts.
To evaluate the project's effectiveness at "...expanding the number of students who enroll..." (NSF, 2010, p. 5), patterns and trends in the numbers and attributes of enrolled students are analyzed throughout the grant period, building on data from AGI and using data collected from participating faculty and students in a database maintained by the project management team. The project's enrollment impact over time will be tracked by demography (gender, race/ethnicity), by type of InTeGrate interaction (original development teams, implementation grant, or professional development workshop), by institution type (e.g., R1, 2YC, MSI), and by student specialization (general education, geoscience major, or geoscience-related field). To probe the reasons behind students' enrollment decisions, pre- and post-instruction assessments of selected classes will include survey questions about students' reasons for taking the class, what alternatives they considered, and their interest in future geoscience classes. The Center seeks to impact 400,000 students during its lifetime and to collect enrollment and assessment data from courses enrolling 75,000 students. To date, over 200 educators have been involved in the project.
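Tracking enrollment along the dimensions named above amounts to cross-tabulating enrollment records. The sketch below shows a minimal version of that tally; the record fields and sample data are hypothetical.

```python
# Minimal sketch of enrollment tracking by institution type and student
# specialization (categories taken from the text; data invented).
from collections import Counter

enrollments = [
    {"institution": "R1",  "specialization": "general education"},
    {"institution": "2YC", "specialization": "general education"},
    {"institution": "R1",  "specialization": "geoscience major"},
    {"institution": "MSI", "specialization": "geoscience-related field"},
    {"institution": "2YC", "specialization": "general education"},
]

# Marginal counts per institution type, and a two-way cross-tabulation.
by_institution = Counter(r["institution"] for r in enrollments)
by_cell = Counter((r["institution"], r["specialization"]) for r in enrollments)

print(by_institution)
print(by_cell[("2YC", "general education")])  # count for one cell
```

The same pattern extends to the other tracking dimensions (demography, type of InTeGrate interaction) by adding fields to the records and keys to the Counter.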
To evaluate the project's effectiveness at achieving "...enhanced learning..." (NSF, 2010, p. 5), the evaluation team will analyze assessments of student learning, the same pre- and post-instruction assessments developed and deployed by the assessment team during their review of the effectiveness of modules. However, the evaluation team will take a cross-institutional, project-wide view of these data. The evaluation team will look for patterns and trends in student learning gains, asking whether learning gains are evenly distributed across demographic groups, across institution types, and across modes of InTeGrate involvement (original development sites, implementation grant recipients, professional development workshop attendees). Based on the initial data collection and analysis, in Year Two, the assessment team set learning benchmarks for students enrolled in single courses and those enrolled in programs in the areas of geoscience literacy, understanding the process of science, and interdisciplinary problem solving. The Center seeks to have all students enrolled in courses supported by the Center make progress toward these benchmarks, and for 75% of students to meet the benchmark.
To evaluate the project's effectiveness at achieving "...significant progress towards addressing the national challenge" of environmental sustainability (NSF, 2010, p. 6), the evaluation team will consider students' ability and motivation to use insights from the geosciences to address grand challenges of sustainability. Motivation will be assessed by including a career interest component on pre- and post-instruction surveys (the student attitudinal survey). At present, many science educators are ambivalent about including human/environment interactions in science courses (Kastens and Turrin, 2006) or about devoting instructional time to "soft skills" such as interdisciplinary collaboration. Questions were included on the faculty survey administered by the On the Cutting Edge project to gauge the degree of support among the geoscience professoriate for teaching towards each of the three InTeGrate learning goals. This survey was administered in fall 2012. The Center seeks to have 30% of geoscience faculty supportive of this goal.
To evaluate the program model, we will conduct a series of site studies probing the relationships among InTeGrate-supplied materials and activities, contextual and other factors influencing the faculty and students at the adoption site, and the consequent changes in programming, faculty practice, enrollment, and student learning. Integrating across all of the intensive study sites, we will identify circumstances or actions that either favor or undercut the likelihood of successful adoption or adaptation of InTeGrate's materials and methods.
The evaluation is being guided by the following logic model, developed by the evaluation team and project leaders: InTeGrate logic model (PDF).