Essay Assessments: Interdisciplinary and Systems Thinking

Purpose of the Essay Questions:

The essay questions are a project-wide post-course measure of students' interdisciplinary and systems thinking in courses that use InTeGrate materials. The essays address higher levels of Bloom's taxonomy, consistent with GLE level 3 questions. Since the courses testing InTeGrate materials include both geoscience courses and interdisciplinary courses in a range of disciplines (engineering, humanities, social science), the questions were written for a broad audience. Responses were analyzed to understand their alignment with module/course learning goals and with the InTeGrate guiding principles.

Current Version of the Instrument

Both the interdisciplinary question (question 1) and the systems thinking question (question 2) were revised based on student responses and analysis by the assessment team. The current versions of the questions were established by 2014 and have shown promising results (see analysis below).

InTeGrate Essay Questions published in: Iverson E.R., Steer D., Gilbert L.A., Kastens K.A., O'Connell K., Manduca C.A. (2019) Measuring Literacy, Attitudes, and Capacities to Solve Societal Problems.

How the Essay Questions are used:

The InTeGrate assessment team collaborated to write the interdisciplinary and systems thinking essay questions. These essays were then tested in InTeGrate classrooms as a post-course measure (see below). In all enactments, the essays were given in a high-stakes exam so that the students would put effort into their responses. If a student had opted out of the InTeGrate study, they were still required to take the test and would receive course credit, but their responses were not included in the research study.

Usage to date:

  • Pilots: Courses piloting InTeGrate curriculum materials, a required step in the InTeGrate materials development process (100+ enactments at 100+ institutions)
  • IPs: Four of InTeGrate's Implementation Programs taught with InTeGrate materials and collected and submitted essay data. These materials had already been piloted in at least three courses, the data analyzed, and revision plans put in place.
  • Post-publication enactments: After InTeGrate materials were piloted and revised, the InTeGrate research team collected essay data on the published modules. Seven instructors taught without InTeGrate materials in Fall 2015 and then taught the same courses incorporating InTeGrate materials in Spring and Fall 2016.

Development of the Essay Questions:

The InTeGrate materials are designed to reach students within and beyond the geosciences at all types of institutions, in service of the goal of increasing Earth literacy among all undergraduate students (about InTeGrate). To increase the likelihood that the essays would be usable in these varied environments, the essay questions were designed through a community approach that included the assessment team, PIs, and year 1 module/course authors. Pilot data were collected and analyzed, leading to revisions of both questions.


A sample of essay question responses was drawn randomly and analyzed by the assessment team to examine how well student responses aligned with the project's interdisciplinary and systems thinking principles.

Systems Thinking

Initially, responses to the systems thinking question in particular were very weak. This led both to strengthening the question and to working with materials authors to incorporate systems thinking more explicitly in the InTeGrate materials (Iverson et al., manuscript). Following these instrument and materials revisions and re-focusing, student data were collected and analyzed, showing a much stronger result (Gilbert et al., 2017).

Process of revising and testing the systems thinking essay prompt

In response to the weak responses to the systems thinking prompt in the 2013–2014 course enactments, the assessment team revised the question during its face-to-face meeting in the summer of 2014. The team designed two different systems thinking questions to pilot. One prompt intentionally remained broadly defined but was revised to use more interdisciplinary language. A second prompt tested language more specific to the Earth system and included a series of sub-prompts. Faculty were recruited from among the materials developers, who taught different types of courses (teacher in-service, introductory geoscience, and sociology), to test both prompts during the summer term. In addition, the same prompts were piloted at Barnard College and in multiple geoscience courses at the University of Nebraska-Lincoln. Expert responses to both prompts were also solicited.

In November of 2014, the assessment team scored and evaluated all of the summer and early fall student and expert responses. Based on the findings of that scoring and analysis, the team developed a systems thinking essay question that remained broadly defined but scaffolded the response through a series of three sub-prompts. The team then conducted just-in-time piloting of the revision with students enrolled in team members' courses. These final pilot tests confirmed that the revised prompt could elicit student responses in which full marks were attainable and learning about systems thinking could be discriminated. The finalized systems thinking essay prompt was piloted as a post-course assessment starting in Fall 2014.


An analysis of the interdisciplinary essay results compared responses from students taught using InTeGrate modules with those from a control group of students taught without those materials. Interestingly, control group responses focused on global climate change and fossil fuel issues, while many students in the InTeGrate groups focused instead on the topic of their two-week module. This suggests that InTeGrate materials helped broaden the scope of student understanding of global grand challenges (Caulkins et al., 2014).