Beliefs about Investigation and Design Survey

Introduction

Students come into introductory college-level science courses with beliefs about science and about their own abilities; they take science courses for different reasons and have different plans for what they want to do. We hope that they learn during the course, and that we might see changes in their beliefs, attitudes, and future plans that we can attribute to their experiences in the course. The Beliefs about Investigation and Design Survey (BIDS) was designed to assess changes in students' beliefs, particularly about their own abilities to participate in and do science, and in their attitudes towards science and science teaching. By comparing matched pre- and post-surveys, we can examine what changes in beliefs and attitudes towards teaching and learning science, if any, occur among undergraduate students after the development and/or implementation of an investigation and design curriculum.

Survey overview

BIDS is an online, Qualtrics-based pre-/post-survey instrument. Both surveys begin by asking for a student ID (used by the research team to anonymously match pre- and post-responses) and the institution where the student is taking the course (used to report to instructors which students have completed the surveys). The pre-survey then asks why the student is taking the course, with several options to select from. Both surveys ask about the major(s) that students have declared or intend to declare.

The next section is identical between the two surveys and provides the core for assessing the impact of instruction:

  • A set of 13 statements that reflect different attitudes towards science and engineering and towards science learning; students are asked to reflect upon their previous science courses and indicate the extent to which they agree with each statement.
  • A set of 20 statements that describe science and engineering-related tasks phrased as "I can...". Students are asked to rate their confidence in being able to address each task, or to indicate that they do not understand it. These statements were selected from the Next Generation Science Standards (NGSS) performance expectations (NGSS Lead States, 2013).

The pre-survey ends with a demographic section asking students about their race, gender, and whether they are the first generation in their family to attend college. The post-survey ends with two questions about how students' interest in a career in science or engineering and their interest in teaching science and engineering have changed, and a final open-ended question about how, if at all, they envision using the skills learned or practiced in the course to address questions and solve problems relevant to them, their families, or their communities.

Developing BIDS

We used a pre-/post-instrument developed for a similar purpose, the InTeGrate Attitudinal Instrument (IAI) (Kastens, 2016), as a starting point. The IAI, which similarly sought to assess the impact of instruction, provided the framework for our surveys, including the strategy of using student IDs to match pre- and post-responses. The framework consists of a set of core questions asked in both the pre- and post-surveys, a set of course context and demographic questions asked only in the pre-instruction survey, and a set of reflective questions asked only in the post-instruction survey.
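The ID-matching strategy can be sketched in a few lines of Python. This is a hypothetical illustration, not the project's actual analysis code; the function names and record format are invented here. The idea is to hash the raw IDs so matched records stay anonymous, then join pre- and post-records on the hashed key:

```python
import hashlib

def anonymize(student_id: str) -> str:
    """Hash a raw student ID so matched records stay anonymous.

    Normalizing (strip/lowercase) tolerates minor typing differences
    between a student's pre- and post-survey entries.
    """
    return hashlib.sha256(student_id.strip().lower().encode()).hexdigest()[:12]

def match_responses(pre, post):
    """Pair pre- and post-survey records that share a student ID.

    `pre` and `post` are lists of dicts, each with a "student_id" key.
    Returns a list of (pre_record, post_record) pairs; records without
    a match in the other survey are dropped.
    """
    post_by_id = {anonymize(r["student_id"]): r for r in post}
    return [(r, post_by_id[anonymize(r["student_id"])])
            for r in pre if anonymize(r["student_id"]) in post_by_id]
```

Only the matched pairs enter a pre/post comparison; the number of unmatched records is worth reporting separately as attrition.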

Core questions

Beliefs

We gathered and explored questions from several instruments that assess attitudes towards science, including:

  • the Colorado Learning Attitudes about Science Survey (CLASS; Adams et al., 2005);
  • the CLASS for Use in Biology (Semsar et al., 2011); and
  • the Changes in Attitude about the Relevance of Science (CARS) questionnaire (Siegel & Ranney, 2003).

We also considered questions that address not only attitudes towards science, but attitudes towards participation in the science classroom, including comfort voicing ideas and the value of diverse voices.

We avoided offering a neutral midpoint and instead chose a 4-point scale of agreement: strongly disagree, somewhat disagree, somewhat agree, strongly agree.

Confidence

We made use of the concept of the knowledge survey (e.g., Bell & Volckmann, 2011; Nuhfer & Knipp, 2003), in which students are asked to reflect on their confidence in completing specific tasks, in developing the confidence questions. For the tasks, we selected 20 performance expectations (PEs) from the Next Generation Science Standards (NGSS Lead States, 2013). The PEs were approximately equally distributed across the three disciplinary domains of Earth and space science (6 PEs), life science (8), and physical science (6), and included 12 high-school and 8 middle-school level PEs. The selected PEs represent seven of the eight science and engineering practices (National Research Council, 2012). We selected practices with redundancy both for internal validity and to assess students' beliefs about their abilities to transfer skills from one science discipline to another. Each confidence statement was modified to begin with an "I can..." phrase, congruent with self-efficacy beliefs (Bandura et al., 1977), and students could select one of three levels of confidence, "highly confident," "somewhat confident," or "not at all confident," or a fourth option, "I do not understand the statement."
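When scoring these items, the fourth option needs special handling. Below is a minimal sketch of one possible coding scheme; the numeric codes and the choice to exclude "I do not understand" responses from the mean are assumptions made for illustration, not something prescribed by the survey itself.

```python
# Hypothetical numeric coding for the three confidence levels.
CONFIDENCE_CODES = {
    "not at all confident": 1,
    "somewhat confident": 2,
    "highly confident": 3,
}

def item_confidence(responses):
    """Summarize one "I can..." item.

    `responses` is a list of the literal response strings.
    Returns (mean confidence over students who rated the item,
    fraction who selected "I do not understand the statement").
    """
    coded = [CONFIDENCE_CODES[r] for r in responses if r in CONFIDENCE_CODES]
    n_unclear = sum(1 for r in responses if r not in CONFIDENCE_CODES)
    mean = sum(coded) / len(coded) if coded else None
    return mean, n_unclear / len(responses)
```

Tracking the "I do not understand" fraction separately preserves a signal that would be lost if those responses were simply coded as low confidence.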

Choice of major

In both surveys, we also asked students whether they had declared a major and asked them to select their chosen or anticipated major(s). We used the list of 10 fields defined by the National Survey of Student Engagement (NSSE) (Ewell & McCormick, 2020) as potential majors; for each field, students selected one of three options: "will not choose," "might choose," or "will definitely choose or have chosen."

Pre-instruction survey only: Course context and demographic questions

In the pre-instruction survey only, we asked a set of questions about the context of the course within the student's college pathway: their reason(s) for taking the course, how many previous college-level science courses they had taken, and the number of years they had been in college. At the end of the pre-instruction survey, to avoid stereotype threat (Kaplowitz & Laroche, 2021; Steele et al., 2002), we asked students to select their gender, race/ethnicity, and parents' educational attainment.

Post-instruction survey only: Reflection and open-ended questions 

In the post-instruction survey only, we included reflection and open-ended questions. Students were asked to reflect on how their interest in pursuing a career in science and engineering, and in teaching science and engineering, had changed since the beginning of the course. In addition, students were asked, "As you think about your future, can you envision ways in which you use the skills you learned or practiced in this course to address questions and solve problems that are relevant to you, your family, or your community?" Response options were "yes" or "no," and a follow-up question asked them to explain their response.

Testing BIDS

To establish face validity of the complete survey, we implemented a think-aloud protocol (Willis, 2005; Willis, 2015) with students from a demographic group similar to our expected participants (college students taking introductory-level science courses). We conducted nine think-alouds via Zoom (five on the pre-survey and four on the post-survey) with students in the STEM Teaching Program at Central Washington University (CWU). The nine students were asked to read the survey questions aloud and to discuss their responses and their thoughts about the survey as they completed it. We looked especially for issues of readability, ease of understanding, and survey fatigue. We also pilot-tested the instrument in four introductory science courses at CWU; 121 students completed the pre-survey and 50 completed the post-survey.

From the think-alouds and pilot test, we made the following changes:

  • Added the phrase "reflecting on your previous college classes" to the pre-survey question about attitudes.
  • Added two questions to the pre-survey: "What year in college are you in?" and "Before this course, how many courses have you taken before at the college level?"
  • Changed the language in the pre-survey question about first-generation college students from "Did one or both of your parents complete a four-year degree?" to "Have any of your parents completed a four-year degree?"
  • Added the instruction "please name the skills in your response" to the open-ended question in the post-survey.

For content validity, we collected written reflections from instructors in which they compared their course content to the 20 statements and described which ones were explicitly covered in their courses.

Current Versions of BIDS

TIDeS Pre-Survey_20211021.pdf: Pre-instruction survey given at the beginning of the course.
TIDeS Post-Survey_20211021.pdf: Post-instruction survey given at the end of the course.

References Cited

Adams, W. K., Perkins, K. K., Podolefsky, N. S., Dubson, M., Finkelstein, N. D., & Wieman, C. E. (2005). New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey. Physical Review Special Topics - Physics Education Research, 2. https://doi.org/10.1103/PhysRevSTPER.2.010101

Bandura, A., Adams, N. E., & Beyer, J. (1977). Cognitive processes mediating behavioral change. Journal of Personality and Social Psychology, 35(3), 125-139. https://doi.org/10.1037/0022-3514.35.3.125

Bell, P., & Volckmann, D. (2011). Knowledge Surveys in General Chemistry: Confidence, Overconfidence, and Performance. Journal of Chemical Education, 88(11), 1469-1476. https://doi.org/10.1021/ed100328c

Ewell, P. T., & McCormick, A. C. (2020). The National Survey of Student Engagement (NSSE) at Twenty. Assessment Update, 32(2), 1-16. https://doi.org/10.1002/au.30204

Kaplowitz, R., & Laroche, J. (2021). More than Numbers: A Guide toward Diversity, Equity, and Inclusion (DEI) in Demographic Data Collection. Charles and Lynn Schusterman Family Philanthropies.

Kastens, K. (2016). The InTeGrate Attitudinal Instrument (IAI). Northfield, MN: SERC, Carleton College. https://serc.carleton.edu/integrate/about/iai.html

National Research Council. (2012). A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. The National Academies Press. http://www.nap.edu/catalog/13165/a-framework-for-k-12-science-education-practices-crosscutting-concepts

NGSS Lead States. (2013). Next Generation Science Standards: For States, By States. Washington, DC: The National Academies Press. https://doi.org/10.17226/18290

Nuhfer, E., & Knipp, D. (2003). The knowledge survey: A tool for all reasons. To Improve the Academy, 21(1), 59-78.

Semsar, K., Knight, J. K., Birol, G., & Smith, M. K. (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for use in biology. CBE—Life Sciences Education, 10(3), 268-278. https://doi.org/10.1187/cbe.10-10-0133

Siegel, M. A., & Ranney, M. A. (2003). Developing the Changes in Attitude about the Relevance of Science (CARS) questionnaire and assessing two high school science classes. Journal of Research in Science Teaching, 40, 757-775. https://doi.org/10.1002/tea.10110

Steele, C. M., Spencer, S. J., & Aronson, J. (2002). Contending with group image: The psychology of stereotype and social identity threat. In Advances in experimental social psychology, Vol. 34. (pp. 379-440). Academic Press. https://doi.org/10.1016/S0065-2601(02)80009-0

Willis, G. B. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. SAGE Publications, Inc. https://doi.org/10.4135/9781412983655

Willis, G. B. (2015). Analysis of the Cognitive Interview in Questionnaire Design. Oxford University Press.