The InTeGrate Attitudinal Instrument (IAI)
by Kim Kastens, InTeGrate external evaluator.
Purpose of the IAI:
The IAI was developed by the InTeGrate external evaluation team to elicit three different kinds of information from students who were participating in InTeGrate-impacted instruction:
- Students' interest in careers and college majors related to the Earth and environment
- Students' motivation to contribute to solving grand challenges of environmental sustainability, depletion of natural resources, and natural hazards
- Demographic information
Audience for this document:
- People who are analyzing or interpreting data from InTeGrate's deployment of the IAI
- People who are thinking about adopting or adapting the IAI for their own projects.
Current version of the instrument:
The links below access screen shots of the version of the IAI that has been in use since January 2014. Students see an interactive version of these same views. (Warning, these files can be slow to load.) Enactments between Fall 2012 and Fall 2013 used a slightly different version; see Revision History below.
- Pre-instruction survey
- Post-instruction survey
- Convenient overview of pre- and post-instruction IAI forms (Acrobat (PDF) 83kB Oct27 16)
How the IAI is used:
- Online: InTeGrate administers the IAI online. Some professors have the students complete the survey in class, but most have them do it out of class. Some give extra credit for completing the survey. In the early days of the project, a few instructors preferred to give the surveys on paper, but that option has been discontinued.
- Data acquisition and storage: When the student fills out the survey, the data are automatically captured by the Science Education Resource Center (SERC) at Carleton College.
- Anonymization: The IAI online form has a place for students to enter an identifying code, but no place for them to enter their name. The professor tells the students what that code should be, typically the student's university ID number. At SERC, the code entered by the student is passed through an algorithm that generates an InTeGrate student ID number. People working on data analysis cannot determine who the student was. However, if other data are later collected from the same student (e.g. post-instruction IAI, essay questions, Geoscience Literacy Exam), the algorithm will generate the same InTeGrate student ID number, and all of the products from that student can be associated and compared. (A sketch of one way such deterministic ID generation can work appears after this list.)
- Pre-post: The IAI is intended to be administered pre-instruction and then again post-instruction. The pre- and post- versions of the instrument differ. Demographic questions and a question about motivation for taking the course are presented only in the pre-instruction version. Several questions asking for reflection across the course are offered only post-instruction. In InTeGrate administration, the surveys have been given at the beginning and end of the course (not at the beginning and end of the module).
- Undergraduates: The IAI was designed for and tested with undergraduates. We have advised against using it with middle school students because of the reading level. High school would be a possibility.
- IRB: Development and project-level data analysis for the IAI have been under the jurisdiction of the Institutional Review Board on Human Subjects Research of Columbia University, under the direction of InTeGrate external evaluator Kim Kastens. On individual campuses, instructors have obtained either implied consent or written consent (as required by their institution's IRB) from their students for collection of the IAI and other InTeGrate data.
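SERC's actual ID-generation algorithm is not published on this page. Purely as a minimal sketch, assuming a keyed hash is an acceptable stand-in, the Python fragment below shows one standard way to obtain the two properties described under "Anonymization": the ID cannot be traced back to the student, yet the same student code always yields the same ID, so pre- and post-instruction records can be linked.

```python
# Hypothetical illustration only; this is NOT SERC's actual algorithm.
# A keyed hash (HMAC-SHA256) is deterministic, so the same student code
# always maps to the same project ID, but without the secret key the
# mapping cannot be run backwards to recover the original code.
import hashlib
import hmac

SECRET_KEY = b"secret-held-only-by-the-data-center"  # invented for this sketch

def integrate_student_id(student_code: str) -> str:
    """Map a student-entered code (e.g., a university ID number) to a
    stable, non-reversible project ID usable for pre/post matching."""
    normalized = student_code.strip().lower().encode("utf-8")
    return hmac.new(SECRET_KEY, normalized, hashlib.sha256).hexdigest()[:12]

# The same student, surveyed pre- and post-instruction, links up:
assert integrate_student_id("U1234567") == integrate_student_id(" u1234567")
```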
Usage to date:
The IAI has been used in the following contexts:
- Pilots: Courses that are piloting InTeGrate curriculum materials, which is a required step in the InTeGrate materials development process (100+ enactments at 100+ institutions)
- IPs: InTeGrate Implementation Programs
- InTeGrate Research Team: 7 instructors taught without InTeGrate materials in Fall of 2015, and then taught the same courses incorporating InTeGrate materials in Spring and Fall 2016.
- Non-InTeGrate courses: courses enlisted to test InTeGrate's Geoscience Literacy Exam and the IAI
Development of the IAI:
Sources for college major and career items:
- IAI's pre- and post-instruction item about undergraduate majors (pre- item 2; post- item 1 on 2014 version) was modified from Fuhrman (n.d.) question 5. The choice of majors was influenced by Houlton (2010) Table 4.4, which lists the previous majors students studied before entering the geosciences.
- IAI's pre- and post-instruction item about interest in specific professions (pre- item 3, post- item 2 on 2014 version) was based on Houlton (2010), Table 5.5, career ambitions of Houlton's full population set, plus additional professions informed by AGI (2009).
- IAI's pre- and post-instruction item about importance of generic "work in which you use your knowledge of earth and environment" (pre- item 4a, post- item 3a, 2014 version) was developed by the InTeGrate team, faced with the impossibility of listing every possible relevant career in the previous question.
- IAI's post-instruction graph question about respondents' level of interest in a career in Earth or Environmental Sciences before and after taking the class (post- item 4 on 2014 version) was modified from Fuhrman (n.d.) post-test question 11. The question had been verbal in Fuhrman's survey and was changed to a graphic version for InTeGrate. See cautionary note below about using this item with a less graph-literate population.
Sources & rationale for environmental motivation items:
- The following sources were mined in creating the IAI's environmental concerns items:
- Clarkson University Energy Literacy Survey: This instrument was developed by the Energy Literacy Assessment Project within the Clarkson Center for Sustainable Energy Systems at Clarkson University, and used with 3,708 middle and high school students in New York state. The instrument has cognitive, affective, and behavioral subscales.
- The Cloud Institute for Sustainability Education: The Cloud Institute has a wide-reaching document of 'standards and performance indicators' for their nine core focus areas of Education for Sustainability. The areas include: cultural preservation and transformation; responsible local and global citizens; the dynamics of systems & change; sustainable economics; healthy commons; natural laws and ecological principles; inventing and affecting the future; multiple perspectives; and a strong sense of place.
- Meredith College: 2011 Sustainability Literacy Assessment, developed for both students and faculty; includes cognitive, affective, and behavioral questions.
- Middlebury College Environmental Studies Student Survey: This is an instrument developed at Middlebury College to assess various aspects of their Environmental Studies major, such as learning goals, learning opportunities, and the capstone senior seminar. It is a self-assessment that, for each question area, asks how important students consider the area, how confident they feel in their abilities, and whether they have improved as a result of their major.
- North Carolina State University: 2010 survey on Attitudes toward Sustainability Issues, to which over 300 students responded.
- University of Colorado, Boulder, Climate Assessments: Two connected assessments were reviewed: one focused on student knowledge of climate content, and an associated part focused on student attitudes toward the topic of climate change. Questions included whether the respondent feels that lifestyle changes to reduce impacts are warranted and, if so, whether such changes should be left to the individual or whether government, business, or international efforts should be considered.
- University of Michigan Sustainability Cultural Indicators Project (SCIP): Michigan has a longitudinal study focusing on sustainability cultural indicators, which include: community awareness, climate action, waste prevention, and healthy environments.
- Washington Center for Improving the Quality of Undergraduate Education: Pre-survey conducted in advance of curriculum development for the Center's Curriculum for the Bioregion initiative. The Center is an initiative of Evergreen State College in Washington state. They administered a brief online survey to 490 undergraduate students at a mix of 18 institutions that participated in the initiative's "inquiry and planning phase" (January 2005-June 2006), and met with ~500 students on campuses (leadership groups and classes).
- Westchester University Values Framework: A values-of-nature framework, based on Stephen Kellert's (1996) values of life, that provides a set of concepts and a terminology to help students recognize the values they bring to environmental issues. [Kellert, S.R., 1996, The value of life: Biological diversity and human society: Washington, DC: Island Press, 263 p.]
- Informed by the sources above, and discussions with InTeGrate's leadership, the IAI comes at the construct of "motivation to contribute to solving grand challenges of environmental sustainability, depletion of natural resources, and natural hazards" from several directions:
- One question (pre- item 4b, post- item 3b on 2014 version) asks about the importance students place on working "in an organization committed to environmentally sustainable practices (independent of the field)." This question is formatted the same as, and placed adjacent to, the question on importance of "work in which you use your knowledge of earth and environment," to allow us to tease apart movement toward earth-related careers from movement toward earth-sustaining values.
- One group of questions (pre- item 5, post- item 5 on the 2014 version) asks about "level of concern about ... potential developments in the Earth." All but one of the items offered are developments that geoscientists would consider to be worthy of concern, such as global climate change and energy resource limitations. "Meteor impact," which most geoscientists would consider to be a very low risk on human timescales, was included as an intentional distractor to test for the possibility of students thoughtlessly clicking "major problem" for everything (one way to screen for such response patterns is sketched after this list).
- Because saying you are concerned requires almost no cost or effort, we also ask about an array of environmentally sustainable behaviors (pre- item 6, post- item 6a on 2014 version). We consider that it is easier to deceive oneself about how concerned or motivated one is than about what actions one did or did not take in the last week; "actions speak louder than words." We aren't considering sustainability behaviors as a separate construct from motivation; rather we are taking behavior as an indicator of motivation. Another instrument, with room for more items, might seek to separate out behaviors from motivation.
- A graph question (post- item 7 on 2014 version) asks respondents about their degree of motivation before and after the course to take action in their personal and professional lives to create a more environmentally sustainable society. See cautionary note below about using this item with a less graph-literate population.
- A final free-response question (post- item 8 on 2014 version) asks "As you think about your future, can you envision using what you have learned in this course to help society overcome problems of environmental degradation, natural resources limitations, or other environmental issues? If yes, how? If no, why not?" This question was designed to capture insights that hadn't emerged in the earlier items, and to hear students' own voices in the context of the time-frame when they will become society's leaders and decision-makers.
- One question is somewhat orthogonal to the basic "motivation" construct. Rather than probing how motivated respondents are towards environmental sustainability, it probes what factors or sources of information influence respondents' decision to engage in sustainability behaviors (post- item 6b on 2014 version). This item was designed as an evaluation question, to compare how much "this course" was influencing student behavior relative to "other college courses" and non-academic factors such as "friends and family" and "media, including the internet." However, it has emerged as an interesting item in its own right, as we see how different factors are differently weighted by various sub-populations and by the national sample.
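The project materials do not prescribe an analysis for the "Meteor impact" distractor described in the concerns item above. The sketch below, with invented column names and response wording, simply illustrates one way a "major problem for everything" response pattern could be flagged.

```python
# Hypothetical sketch: flag respondents who rate every concern item,
# including the low-risk "meteor impact" distractor, as a major problem.
# All column names and data are invented for illustration.
import pandas as pd

MAJOR = "major problem"
concern_cols = ["climate_change", "energy_limits", "water_availability",
                "natural_hazards", "meteor_impact"]  # hypothetical names

df = pd.DataFrame([
    [MAJOR, MAJOR, MAJOR, MAJOR, MAJOR],                      # straight-liner?
    [MAJOR, MAJOR, "minor problem", MAJOR, "not a problem"],  # differentiated
], columns=concern_cols)

df["flag_straight_liner"] = (df[concern_cols] == MAJOR).all(axis=1)
print(df["flag_straight_liner"])  # True, False
```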
Sources & rationale for demographic items:
- The demographic items (at the end of the pre-instruction survey), asking about gender, race, ethnicity, age, and year in school, use wording adopted from earlier instruments used by the Science Education Resource Center. In particular, we wanted the race/ethnicity data collected from students to divide into the same categories as the demographic data collected from instructors and other members of the InTeGrate professional community.
- Demographic items were placed at the end of the survey so as to avoid triggering stereotype threat. Demographic items were asked on the pre-survey so that we could test whether students who dropped the course or otherwise failed to complete the post-survey differed demographically from those who completed (one form such a check could take is sketched below).
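The page does not say which statistical test was used for this attrition check; a chi-square test of independence is one conventional choice for comparing categorical demographics of completers versus non-completers. A minimal sketch with invented counts:

```python
# Hedged illustration of an attrition check; all counts are invented.
# Rows: completed the post-survey vs. did not; columns: demographic
# categories (e.g., gender) reported on the pre-survey.
from scipy.stats import chi2_contingency

table = [[180, 210, 12],   # completed post-survey
         [40,  35,  5]]    # dropped or did not complete
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # large p: no detectable difference
```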
Process of selecting, vetting, and testing items:
The items that became the IAI were developed over the Spring and Summer of 2012, through the following processes:
- Gathering prior work: Through a literature review and inquiries among InTeGrate's Leadership Team, we obtained access to a substantial body of prior work, including nine earlier surveys and explications of sustainability values and learning goals. Much of this work was unpublished, and we thank the colleagues who generously shared their unpublished surveys and findings with InTeGrate.
- Database of potential items: We identified items from the prior surveys that were potentially suitable for InTeGrate use. For the motivation-towards-environmental-problem-solving construct, we created an Excel database (Excel 2007 (.xlsx) 36kB Oct12 16) of candidate items. Each item was coded according to whether it probes:
- Respondent's attitude, which we defined as "evaluative judgements about an object, phenomenon, event, or assertion that pertains to environmental sustainability." (Sinatra, G. M., Kardash, C., Taasoobshirazi, G., & Lombardi, D. (2011). Promoting attitude change and expressed willingness to take action toward climate change in college students. Instructional Science, 40(1), 1-17).
- Respondent's motivation, which we operationalized as queries about whether students plan to or are willing to take specific actions to contribute to creation of a more environmentally sustainable society.
- Respondent's actions, which we operationalized as queries about whether respondents have taken or are taking specific actions in their own lives to contribute to creation of a more environmentally sustainable society.
- In all, we coded 80 items as purely about attitude, 69 as purely about motivation, 62 as purely about action, 3 about action + attitude, 43 about attitude + motivation, and 17 about motivation + action.
- Winnowing and prioritizing items: During the June 2012 face-to-face meeting of the InTeGrate Assessment Team, a 3-person subcommittee of the Assessment Team worked intensively with the external evaluator to consider the database of candidate items. Assessment Team members have dual expertise in both geoscience and student assessment. Each item was reviewed for its clarity and relevance to InTeGrate. Similar items were clustered, and in some cases combined into multi-part items. We crafted a draft instrument, and tested it for length and face validity with other members of the Assessment Team. With this feedback, we eliminated, shortened, and clarified items, and emerged with a complete working draft.
- Think-alouds: In July 2012, we did think-alouds with four undergraduate summer interns, each interviewed individually. We asked them to recall a specific Earth or Environmental Science course that they remembered clearly, and answer the questions with respect to that course. We gave them 14 questions, which were close to the wording that ultimately became the 2012 version of the IAI. After each question or group of questions, we asked them to "explain what you were thinking about as you answered each question." We were listening for lack of understanding, interpretations other than what we intended, discomfort with wording, and other symptoms that might require a tweak of the wording. (Think aloud protocol (Excel 2007 (.xlsx) PRIVATE FILE 13kB Jan18 13))
- Division into pre- and post-instruction forms: To further shorten the response time, we segregated some items to appear only pre-instruction (reason for taking course, demographics) and others to appear only post-instruction (items that required reflection about the course or about the future). At this time we finalized the wording.
- Preparation of online version: The final questions were delivered to the Science Education Resource Center, where they were converted into an online form, which was tested for functionality, and deployed in time for the Fall 2012 semester.
Coding of the Open Response Question:
The IAI includes one open response question, administered post-instruction: "As you think about your future, can you envision using what you have learned in this course to help society overcome problems of environmental degradation, natural resources limitations, or other environmental issues? If no, why not? If yes, how?" A coding scheme for student responses was developed through a grounded theory inductive approach, in which the researcher examines the responses for patterns and trends that emerge from the data and then categorizes these according to codes or concept indicators (Chi, 1997; Feig, 2011). Answers from those who had responded "yes" and "no" to the initial question were coded separately. As is typical in such work, the coding scheme passed through multiple iterations involving two researchers comparing, reconciling, and revising.
- Coding Schema Version 1: Based on the first two semesters of pilot data, a first coding schema for the "No" responses was developed by Jackie DeLisi and Kim Kastens. Four broad themes were identified among the 47 "No" responses coded: issues with the course, respondent did not feel empowered to effect change, personal beliefs, and respondent would not be in a career where they could impact the environment. These findings were used to provide formative feedback to the Materials Development teams and InTeGrate leadership.
- Coding Schema Version 2: By Spring of 2016 we had accumulated slightly over 1000 student responses to this question. A coding scheme was developed for "yes" answers, and the "no" coding scheme was modified to accommodate the wider range of responses in the larger data set. Coders/schema developers were Margie Turrin and Kim Kastens. When this coding scheme had stabilized, a third researcher re-coded 22% of the responses to test for inter-rater consistency (a sketch of one common agreement statistic appears after this list). Although the consistency was adequate, the scheme was found to be somewhat difficult to apply and yielded too many lightly populated categories.
- Coding Schema Version 3 (final): We therefore developed a simplified scheme (version 3) by merging several categories and clarifying the descriptions of other categories. This version is intended to be useable by Materials Development teams or Implementation Program leadership to analyze data from their own students. Below is an abbreviated version of the final coding scheme, followed by links to the full instructions for coders with examples. (13 Feb 2017: a few clarifications and additional examples were added to the Yes coding scheme.)
- Final coding scheme for "No" responses (Acrobat (PDF) 90kB Oct12 16)
- Final coding scheme for "Yes" responses (Acrobat (PDF) 115kB Mar1 17)
Revision History of the IAI:
There have been two operational versions of the IAI. The old version (referred to as the "2012" or "2012-2013" version) was used in InTeGrate enactments between fall 2012 and fall 2013. The new version was introduced for InTeGrate enactments that began with the spring semester of 2014. The motivations for the changes were to simplify some unnecessarily complex wording, to better accommodate the lack of skip-coding in the online version, to allow more nuanced answers to the sustainable behaviors question, and to incorporate two stand-alone items into a cluster of similar items.
Here are the items as asked in the old version, as well as a summary of what was changed between the old and new versions:
- 2012-2013 version of pre-instruction IAI survey (Acrobat (PDF) 237kB Oct27 16)
- 2012-2013 version of post-instruction IAI survey (Acrobat (PDF) 295kB Oct27 16)
- Comparison of old versus new versions of IAI (Microsoft Word 2007 (.docx) 102kB Oct27 16)
For the project-wide evaluation of the entirety of InTeGrate, we have combined data from the old and new versions of the survey for some findings, as well as reporting other findings from the new version alone. The link below explains how the combining was done. The combined data should not be used to make longitudinal comparisons between the earlier and later years of the project, because of the change in instrument wording.
- Combining of data from old and new versions of the IAI (Microsoft Word 2007 (.docx) 129kB Oct27 16)
The IAI Analysis Toolkit:
The IAI was developed as a project-wide evaluation tool, and thus the evaluation team looks only at project-wide data and subsamples of project-level interest (such as under-represented minorities or pre-service teachers). However, some teams within the project are concerned with smaller specific subpopulations, such as all students in the pilot testing of a particular instructional module. To enable such teams to easily analyze their own IAI data, the IAI Analysis Toolkit has been developed. The Toolkit is an Excel spreadsheet pre-populated with data from a national sample of InTeGrate-using students. Teams can cut and paste their own IAI data into the Toolkit, and it automatically generates a suite of graphs and tables comparing the team's sample with the national sample. (A minimal sketch of the underlying comparison appears below the download links.)
- Slidedeck describing the IAI Analysis Toolkit (PowerPoint 2007 (.pptx) 1.9MB Jun20 16) for Materials Developers' meeting June 2016
- Download the toolkit (Excel 2007 (.xlsx) 872kB Jun30 16)
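The Toolkit itself is an Excel workbook; purely as an illustration of the comparison it automates, and using invented counts and category labels, the same side-by-side percentages could be computed as follows:

```python
# Hypothetical sketch of the Toolkit's core comparison: a team's response
# distribution beside the national sample, as percentages so that samples
# of very different sizes can be compared. All numbers are invented.
import pandas as pd

likert = ["not important", "somewhat important", "very important"]
national = pd.Series([1200, 2300, 1500], index=likert, name="national")
team = pd.Series([14, 41, 25], index=likert, name="team")

comparison = pd.concat([national, team], axis=1)
comparison = 100 * comparison / comparison.sum()  # counts -> percent
print(comparison.round(1))
```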
Criticisms, cautions, and limitations:
- Need for graph literacy: The two post-instruction graph questions (post-instruction questions 4 and 7 on the 2014 version) have been criticized for pre-supposing that students have strong graph-reading ability. We think that this is OK in the InTeGrate context, where students have completed an instructional module where they were required to make extensive use of "authentic, credible geoscience data." All InTeGrate modules include graph-reading activities that are much more challenging than the graphs on the IAI survey. But if the survey is adopted for other populations, the graph-literacy level of your audience would be something you should consider, just as you would consider whether the vocabulary of a verbal question is too challenging.
- Range of majors: When the IAI has been given at 2-year colleges or other institutions with a strong career and technical education mission, many respondents end up selecting "other" for the item about college major (pre- question 2; post- question 1 on 2014 version). For such institutions, the existing list of majors is too oriented towards traditional liberal arts and sciences. If you are planning to adopt the IAI for such a context, consider modifying the list of majors to better match your target population. A useful resource for doing so may be the NSF STEM Classification of Instructional Programs Crosswalk from the Louis Stokes Alliances for Minority Participation. Their documentation explicitly notes that the codes have been expanded to include majors/disciplines at community colleges.
- Self-selection of instructors in the national sample: Although InTeGrate's sample of IAI respondents is nation-wide, including students from more than 100 institutions, it is not random. The instructors who chose to affiliate with the InTeGrate project by developing or testing innovative curriculum materials are a self-selected group, who may be expected, on average, to have a stronger interest in student-centered pedagogy and/or teaching in the context of societal issues than the professoriate as a whole.
- Uneven access to sustainability behaviors: There have been two criticisms of the suite of sustainability behaviors that students are asked about (pre- item 6, post- item 6a on 2014 version). The first is that some students do not have access to all of the behaviors, for example "use public transportation instead of a car." We think that this is OK when the IAI is used pre-/post- at the aggregate level, as has been done in InTeGrate, since the same campuses and regions that were poorly served by transit at the beginning of the semester would still be poorly served by transit at the end of the semester. However, this limitation means that the instrument should not be used to compare campus-to-campus (and of course it should never be used to compare student-vs-student). The other criticism is that weather patterns may make outdoor behaviors, notably "walk or ride a bike instead of using a car," more pleasant at the beginning of the semester than at the end, or vice versa. If you like the format of this question but would prefer to substitute different sustainability behaviors more suitable for your campus situation, you may find ideas in Cooler Smarter: Practical Steps for Low-Carbon Living from the Union of Concerned Scientists or in the EcoChallenge from Northwest Earth Institute.
References cited:
- American Geosciences Institute (2009). Status of the Geoscience Workforce.
- Chi, M. T. H., 1997, Quantifying Qualitative Analyses of Verbal Data: A Practical Guide: Journal of the Learning Sciences, v. 6, no. 3, p. 271-315. doi: 10.1207/s15327809jls0603_1
- Feig, A. D., 2011, Methodology and location in the context of qualitative data and theoretical frameworks in geoscience education research, in Feig, A. D., and Stokes, A., eds., Qualitative Inquiry in Geoscience Education Research, Volume 474: Boulder, CO, Geological Society of America, p. 1-10.
- Fuhrman, M. (n.d.). Questionnaire to measure indicators for recruitment/retention in geoscience careers. American Institutes for Research; developed to evaluate projects in NSF's Opportunities for Enhancing Diversity in the Geosciences (OEDG) program.
- Houlton, H. (2010). Academic provenance: Investigation of pathways that lead students into the Geosciences. Unpublished Master of Science thesis, Purdue University, West Lafayette, Indiana.