
Assessment and Project Evaluation


Two major efforts are aimed at understanding the impact of the InTeGrate program: internal assessment of project materials and external evaluation of the project's measurable impact. An internal assessment team focuses on understanding the impact of new materials and courses on student learning. The most important responsibility of the internal assessment team is to ensure that the materials InTeGrate disseminates effectively promote the project's goals: increasing students' geoscience literacy, deepening their understanding of the process of science, and improving their ability to solve interdisciplinary environmental and resource problems.

An external evaluation team focuses on measuring the project's impact on programming, the associated impact on student learning, and the ultimate impact on students' ability and willingness to engage in societal roles addressing the sustainability of our civilization and our environment. The external evaluation team comprises a geoscientist who is independent of the development teams and a professional evaluation group. This combination provides an understanding of the nuances of the program's strategies and goals, and of the community that is striving to attain those goals.

Assessment

The project takes a two-fold approach to assessing the quality of the materials and courses. First, all curricula are independently reviewed prior to field-testing in the classroom. Second, all curricula are field-tested in three different classroom settings, using a range of assessment measures to gauge student learning gains, changes in student attitudes and aspirations, and the role of teaching circumstances in the success of the curriculum.

Materials Review

A materials rubric based on a constructive alignment conceptual framework guides developers at inception, during testing, and through final field analysis of curricular materials, keeping the materials aligned with project goals.

Before field-testing the materials in the classroom, all curricula are reviewed by three members of the assessment team using the InTeGrate Materials Development Rubric (v6), which incorporates the broad goals of the InTeGrate project and research-based guidelines for best practices in curriculum development. All curricula must receive the full score for the overarching goals of the project to ensure that they:

  1. address one or more grand challenges involving Earth and society,
  2. develop student ability to address interdisciplinary problems,
  3. improve student understanding of the nature and methods of geoscience and develop geoscientific habits of mind,
  4. make use of authentic and credible data to learn central concepts in the context of scientific methods of inquiry,
  5. incorporate systems thinking.

In addition, to successfully pass the rubric, materials must show strong alignment of instruction with learning goals, include assessments that will allow measurement of progress towards these goals, and incorporate research-based teaching methods.

Measuring Impact of Curriculum on Student Learning

All materials are field-tested by a minimum of three faculty members in different classroom settings. All field-test classrooms collect measures of student learning, including:

Geoscience Literacy Exam (GLE)

As part of the InTeGrate goal of improving geoscience literacy, the assessment team developed the Geoscience Literacy Exam (GLE) as one of the tools to quantify the effect of these materials on students' geoscience literacy. The GLE instrument addresses content and concepts in the Earth, Climate, and Ocean Science literacy documents. The instrument will be used to measure geoscience literacy across the range from introductory non-science students to upper-level geoscience majors.

The GLE testing schema is organized into three levels of increasing complexity.

  • Level 1 questions are single-answer multiple-choice questions at the understanding or application level; for example, selecting which type of energy transfer is most responsible for the movement of tectonic plates. They are designed such that most introductory-level students should be able to answer them correctly after taking an introductory geoscience course.
  • Level 2 questions are more advanced multiple-answer or matching questions at the understanding through analysis levels. Students might be asked to determine the types of earth-atmosphere interactions that could result in changes to global temperatures in the event of a major volcanic eruption. Because the answers are more complicated, some introductory students and most advanced students should be able to respond correctly.
  • Level 3 questions are short-essay questions at the analyzing through evaluating levels, such as 'describe the ways in which the atmosphere sustains life on Earth.' These questions are designed such that introductory students can formulate at least a rudimentary response; we anticipate that the detail and sophistication of responses will increase as students progress through the InTeGrate curriculum.

In year one of curriculum testing, eight Level 1 questions were used to gauge students' prior understanding of geoscience content. Two Level 3 questions are being used to assess the development of geoscience literacy: one addressing the specific literacy area (Earth, Ocean, Atmosphere, or Climate) covered by the module and one addressing systems thinking. The systems thinking question is currently under development.

In year two, InTeGrate participants collected baseline Geoscience Literacy Exam data for 8 multiple-choice questions from ~250 students, administered pre- and post-course in a variety of settings. The questions probed content and concepts from various aspects of the Climate, Earth Science, Atmosphere, and Ocean literacy documents. Discrimination indices calculated from the data suggest these 8 questions provide a valid measure of geoscience literacy within the scope of the concepts covered. Normalized student gains across a semester with limited InTeGrate exposure (typically two or fewer weeks of InTeGrate curriculum out of 14) averaged 16%. Additional studies are underway to confirm reliability.
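For readers unfamiliar with these statistics, the sketch below shows how they are conventionally computed: the Hake normalized gain, g = (post − pre) / (max − pre), and an upper-versus-lower-group discrimination index. This is an illustration under stated assumptions (an 8-point GLE score and the classic 27% group split), not the project's actual analysis code.

```python
# Illustrative sketch only -- not the project's analysis code.
# Assumes an 8-point GLE score (one point per multiple-choice question)
# and the classic upper/lower 27% split for the discrimination index.

def normalized_gain(pre: float, post: float, max_score: float = 8.0) -> float:
    """Hake normalized gain: fraction of the available improvement realized."""
    return (post - pre) / (max_score - pre)

def discrimination_index(item_correct: list, total_scores: list) -> float:
    """Proportion answering the item correctly among the top 27% of total
    scorers minus the proportion among the bottom 27%."""
    n = len(total_scores)
    k = max(1, round(0.27 * n))
    ranked = sorted(range(n), key=lambda i: total_scores[i])
    lower, upper = ranked[:k], ranked[-k:]
    return (sum(item_correct[i] for i in upper) -
            sum(item_correct[i] for i in lower)) / k

# A class moving from an average of 4.0/8 to 4.64/8 shows g = 0.16,
# consistent with the ~16% average normalized gain reported above.
print(round(normalized_gain(4.0, 4.64), 2))  # 0.16
```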

Student Attitudinal Survey

While the GLE addresses geoscience content knowledge and understanding, the attitudinal survey is intended to probe InTeGrate students' ability and motivation to use their geoscience expertise to address problems of environmental sustainability. The InTeGrate online survey of student attitudes was refined and administered in courses taught by 11 instructors across 4 modules.

The survey collects demographic data and probes (a) students' interest in careers related to the Earth and environment and (b) their motivations toward taking actions in their personal and professional lives that will contribute to solving grand challenges of natural resources and environmental sustainability. Surveys are administered pre- and post-instruction, using an anonymizing procedure to maintain student confidentiality while allowing matching of pre- and post-instruction responses. Demographic items ask about gender, race/ethnicity, age, year in school, and college major. Career questions ask students to rate their interest in specific careers and to indicate how important it is to them to have a career in which they use their knowledge of the Earth and environment. Motivation towards environmental sustainability is probed directly through questions that ask about concern or attitude and indirectly through questions about behaviors.

Analysis of responses from the 258 students who provided both pre- and post-instruction responses shows small but statistically significant improvements across instruction on some measures of career interest and motivation towards environmental sustainability, with considerable variability between instructors and modules.
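The project does not specify the statistical test behind these results; for matched pre/post responses like these, a paired t-test is one standard approach. The sketch below, using hypothetical Likert-scale data, illustrates that design.

```python
# Hypothetical sketch: the survey analysis method is not specified above;
# a paired t-test on matched pre/post responses is one standard choice.
from scipy import stats

# Hypothetical mean attitudinal scores (e.g., on a 1-5 Likert scale),
# matched per student via the anonymized pre/post identifier.
pre  = [3.1, 2.8, 3.5, 3.0, 2.6, 3.2, 3.4, 2.9]
post = [3.3, 3.0, 3.6, 3.1, 2.9, 3.3, 3.5, 3.2]

t_stat, p_value = stats.ttest_rel(post, pre)  # same students, two time points
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # small p -> significant change
```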

Summative Assessment

Summative assessments play a critical role for the InTeGrate project in evaluating whether instructional materials and strategies are meeting the stated learning objectives. These assessments are part of the flow of the class, are graded by faculty based on a rubric, and provide a summative measure of students' achievement of the learning outcomes. The use of a summative assessment replaced the use of three embedded assessments following field-testing of the first round of modules, as embedded assessment did not give sufficient evidence of the ability of materials to meet learning goals.

Additional faculty surveys providing structured reflection, along with student engagement measures, will be collected to build case studies describing the context in which faculty used the materials in their courses.

The GLE and attitudinal assessments will be made available for use throughout the geoscience community in exchange for access to the resulting data. This will allow a better understanding of the current state of geoscience literacy across the community and provide data that can be used to situate the project results in a larger sample. If you are interested in using these instruments, please contact Professor David Steer.

Evaluation

The project evaluation is being undertaken by Dr. Kim Kastens (Education Development Center) and Carol Baldassari (Senior Research Associate, Program Evaluation and Research Group, Lesley University). Dr. Frances Lawrenz (University of Minnesota) serves as a consultant to the evaluation team.

Project evaluation will focus on the measurable impact on programming, the associated impact on student learning, and the ultimate impact on students' ability and willingness to engage in societal roles addressing the sustainability of our civilization and our environment. Because the community-based project design depends critically on successful collaboration among partners who are dispersed by geography, discipline, and institution type, the external evaluation team will provide formative feedback on the evolution of the partnership and sub-partnerships, their roles and responsibilities, and their ability to work together effectively. Key areas of interest will be the alignment of members' work with the project design and intended goals, potential benefits and costs of their participation in the project, their commitment to project goals and activities over time, and the effectiveness of the structures and processes the Center creates and maintains to foster communication and collaboration. Data collection efforts will include in-depth interviews, surveys, participation in on-line meetings and phone conferences, and reviews of program artifacts.

To evaluate the project's effectiveness at "...expanding the number of students who enroll..." (NSF, 2010, p. 5), patterns and trends in the numbers and attributes of enrolled students will be analyzed throughout the grant period, building on data from AGI and using data collected from participating faculty and students in a database maintained by the project management team. The project's enrollment impact over time will be tracked by demography (gender, race/ethnicity), by type of InTeGrate interaction (original development teams, implementation grant, or professional development workshop), by institution type (e.g., R1, 2YC, MSI), and by student specialization (general education, geoscience major, or geoscience-related field). To probe the reasons behind students' enrollment decisions, pre- and post-instruction assessments of selected classes will include survey questions about students' reasons for taking the class, what alternatives they considered, and their interest in future geoscience classes. The Center seeks to impact 400,000 students during its lifetime and to collect enrollment and assessment data from courses enrolling 75,000 students. To date, over 200 educators have been involved in the project.
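A cross-tabulation along these dimensions might look like the sketch below. The column names and category codes are illustrative assumptions, not the project's actual database schema.

```python
# Hypothetical sketch of the enrollment cross-tabulation described above.
# Column names and category codes are illustrative, not the project schema.
import pandas as pd

enrollment = pd.DataFrame({
    "gender":         ["F", "M", "F", "M", "F"],
    "interaction":    ["dev_team", "impl_grant", "workshop", "workshop", "dev_team"],
    "institution":    ["R1", "2YC", "MSI", "R1", "2YC"],
    "specialization": ["gen_ed", "geo_major", "gen_ed", "geo_related", "gen_ed"],
    "students":       [120, 45, 80, 60, 95],
})

# Trend table: enrolled students by institution type and InTeGrate interaction.
print(enrollment.pivot_table(values="students", index="institution",
                             columns="interaction", aggfunc="sum", fill_value=0))
```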

To evaluate the project's effectiveness at achieving "...enhanced learning..." (NSF, 2010, p. 5), the evaluation team will analyze the same pre- and post-instruction assessments of student learning that the assessment team develops and deploys in its review of module effectiveness. However, the evaluation team will take a cross-institutional, project-wide view of these data, looking for patterns and trends in student learning gains and asking whether gains are evenly distributed across demographic groups, across institution types, and across modes of InTeGrate involvement (original development sites, implementation grant recipients, professional development workshop attendees). Based on the initial data collection and analysis, in Year Two the assessment team set learning benchmarks, in the areas of geoscience literacy, understanding the process of science, and interdisciplinary problem solving, for students enrolled in single courses and for those enrolled in programs. The Center seeks to have all students enrolled in courses supported by the Center make progress toward these benchmarks, and for 75% of students to meet the benchmark.

To evaluate the project's effectiveness at achieving "...significant progress towards addressing the national challenge" of environmental sustainability (NSF, 2010, p. 6), the evaluation team will consider students' ability and motivation to use insights from the geosciences to address grand challenges of sustainability. Motivation will be assessed by including a career interest component on pre- and post-instruction surveys (the student attitudinal survey). At present, many science educators are ambivalent about including human/environment interactions in science courses (Kastens and Turrin, 2006) or about devoting instructional time to "soft skills" such as interdisciplinary collaboration. Questions have been included on the faculty survey administered by the On the Cutting Edge project to gauge the degree of support among the geoscience professoriate for teaching towards each of the three InTeGrate learning goals. This survey was administered in Fall 2012. The Center seeks to have 30% of geoscience faculty supportive of this goal.

To evaluate the program model, we will conduct a series of site studies probing the relationships among InTeGrate-supplied materials and activities, contextual and other factors influencing the faculty and student at the adoption site, and the consequent changes in programming, faculty practice, enrollment, and student learning. Integrating across all of the intensive study sites, we will identify circumstances or actions that either favor or undercut likelihood for successful adoption or adaptation of InTeGrate's materials and methods.

The evaluation is being guided by the following logic model, developed by the evaluation team and project leaders: InTeGrate logic model (PDF)


