Beliefs about Investigation and Design Survey
Introduction
Students come into introductory college-level science courses with beliefs about science and about their own abilities; they take science courses for different reasons and have different plans for what they want to do. We hope that they learn during the course, and that we might see changes in their beliefs, attitudes, and future plans that we can attribute to their experiences in the course. The Beliefs about Investigation and Design Survey (BIDS) was designed to assess changes in students' beliefs, particularly about their own abilities to participate in and do science, and their attitudes towards science and science teaching. By comparing matched pre- and post-surveys, we can examine whether undergraduate students' beliefs and attitudes towards teaching and learning science change, if at all, following the development and/or implementation of investigation and design curriculum.
Survey overview
BIDS is an online, Qualtrics-based pre-/post-survey instrument. Both surveys begin by asking for a student ID (so the research team can anonymously match pre- and post-responses) and the institution where the student is taking the course (so instructors can be told which students have completed the surveys). The pre-survey then asks why the student is taking the course; students can select from several options. Both surveys ask about the majors that students have declared or intend to declare.
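As a rough illustration of how the matched responses might be handled downstream, the sketch below (Python with pandas) joins pre- and post-survey exports on the student ID field. The file and column names are assumptions for illustration, not part of the BIDS instrument or the Qualtrics export format.

```python
import pandas as pd

# Hypothetical file and column names; actual Qualtrics exports will differ.
pre = pd.read_csv("bids_pre_responses.csv")    # pre-instruction responses
post = pd.read_csv("bids_post_responses.csv")  # post-instruction responses

# Match pre- and post-responses anonymously on the student ID field,
# keeping only students who completed both surveys.
matched = pre.merge(post, on="student_id", suffixes=("_pre", "_post"))

print(f"{len(matched)} students completed both surveys")
```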
The next section is identical between the two surveys and provides the core for assessing the impact of instruction:
- A set of 13 statements that reflect different attitudes towards science and engineering and towards science learning; students are asked to reflect upon their previous science courses and indicate the extent to which they agree with each statement.
- A set of 20 statements that describe science and engineering-related tasks phrased as "I can...". Students are asked to rate their confidence in being able to address the task, or if they do not understand it. These statements were selected from the Next Generation Science Standards (NGSS) performance expectations (NRC, 2013).
The pre-survey ends with a demographic section asking students about their race, gender, and whether they are the first generation in their family to attend college. The post-survey ends with two questions about how students' interest in a career in science or engineering and their interest in teaching science and engineering have changed, and a final open-ended question about how, if at all, they envision using the skills learned or practiced in the course to address questions and solve problems that are relevant to them, their families, or their communities.
Developing BIDS
We used a pre-/post-instrument developed for a similar purpose, the InTeGrate Attitudinal Instrument (IAI), as a starting point (Kastens, 2016). The IAI, which similarly seeks to assess the impact of instruction, provided a framework for our surveys, including the strategy of using student IDs to match pre- and post-responses. We also adopted the IAI's questions about reasons for taking the course and intended majors. However, we modified the list of potential majors, using the ten major fields defined by the National Survey of Student Engagement (NSSE).
Attitudes statements
We gathered and explored questions from several instruments that assess attitudes towards science, including:
- Changes in Attitudes towards Relevance of Science (CARS) questionnaire (Siegel and Ranney, 2003)
- Colorado Learning Attitudes about Science Survey (CLASS) (Adams et al., 2006), particularly the modification for biology (Semsar et al., 2011)
We also considered questions that address not only attitudes towards science, but attitudes towards participation in the science classroom, including comfort voicing ideas and the value of diverse voices.
We avoided using a neutral response and instead chose a 4-point scale of agreement: strongly disagree, somewhat disagree, somewhat agree, strongly agree.
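For analysis, responses on this scale can be coded numerically so that pre-to-post shifts on each attitude statement are straightforward to compute. The coding below is a minimal sketch of one plausible approach, not a prescribed part of BIDS.

```python
# One plausible numeric coding of the 4-point agreement scale
# (an assumption for illustration, not part of the published instrument).
AGREEMENT = {
    "Strongly disagree": 1,
    "Somewhat disagree": 2,
    "Somewhat agree": 3,
    "Strongly agree": 4,
}

def attitude_shift(pre_response: str, post_response: str) -> int:
    """Change on a single attitude item for one matched student."""
    return AGREEMENT[post_response] - AGREEMENT[pre_response]

print(attitude_shift("Somewhat disagree", "Strongly agree"))  # prints 2
```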
Confidence rating
We made use of the concept of the knowledge survey (e.g., Nuhfer and Knipp, 2003) in developing the confidence questions. We selected 20 high-school-level performance expectations (PEs) from the NGSS so as to include PEs from all three disciplines and to span the range of science and engineering practices. We purposefully built in some redundancy, with the same skill appearing across disciplines, to assess students' perceived ability to transfer skills from one course to another. We modified the PEs to begin with the phrase, "I can..." as suggested by ...
In addition to three levels of confidence (highly confident, somewhat confident, not at all confident), students could indicate that they did not understand the statement.
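When coding these responses for analysis, the "did not understand" option is best kept separate from low confidence rather than folded into the numeric scale. A minimal sketch of one way to do this follows; the response labels and numeric values are assumptions for illustration.

```python
# Confidence responses coded numerically, with "do not understand" kept
# distinct (None) so it is not conflated with low confidence.
from typing import Optional

CONFIDENCE = {
    "Highly confident": 3,
    "Somewhat confident": 2,
    "Not at all confident": 1,
    "I do not understand the statement": None,
}

def code_confidence(response: str) -> Optional[int]:
    """Return a numeric confidence score, or None if the statement was not understood."""
    return CONFIDENCE[response]
```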
Demographic questions (pre- only)
We placed the demographic questions at the end of the survey in order to avoid stereotype threat, and we followed best practices for defining gender identity and race/ethnicity (ref from Ellen). We also added a question to assess first-generation status. Because we match pre- and post-instruction surveys, there is no need to ask demographic questions on both surveys.
Reflection and open-ended questions (post- only)
We utilized a style of question from the IAI that uses a set of graphs to prompt students to reflect on their interest in a topic at the beginning and end of the course and on how it has changed. The open-ended question in the post-survey is also modeled after the IAI.
Testing BIDS
We implemented a think-aloud protocol to validate the survey questions and revise them accordingly. We conducted nine think-alouds via Zoom (five on the pre-survey and four on the post-survey) with students in the STEM Teaching Program at Central Washington University (CWU). Students were asked to read the survey questions aloud and discuss their responses and their thoughts about the survey as they completed it. We looked especially for issues of readability, ease of understanding, and survey fatigue. We also pilot-tested the instrument in four introductory science courses at CWU; 121 students completed the pre-survey and 50 completed the post-survey.
From the think-alouds and pilot test, we made the following changes:
- Added the phrase "reflecting on your previous college classes" to the pre-survey question about attitudes.
- Added two questions to the pre-survey: "What year in college are you in?" and "Before this course, how many courses have you taken before at the college level?"
- Changed the language in the first-generation college student pre-survey question from "Did one or both of your parents complete a four-year degree?" to "Have any of your parents completed a four-year degree?"
- Added "please name the skills in your response" to the open-ended question in the post-survey.
Current Versions of BIDS
TIDeS Pre-Survey_20211021.pdf: Pre-instruction survey given at the beginning of the course.
TIDeS Post-Survey_20211021.pdf: Post-instruction survey given at the end of the course.
References Cited
Adams, W. K., Perkins, K. K., Podolefsky, N. S., Dubson, M., Finkelstein, N. D., and Wieman, C. E. (2006). New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey. Phys. Rev. ST Phys. Educ. Res., 2, 010101. https://doi.org/10.1103/PhysRevSTPER.2.010101
Kastens, K. (2016). The InTeGrate attitudinal instrument (IAI). Northfield, MN: SERC, Carleton College. https://serc.carleton.edu/integrate/about/iai.html
National Research Council (2013). Next Generation Science Standards: For States, By States. Washington, DC: The National Academies Press. https://doi.org/10.17226/18290
Nuhfer, E. and Knipp, D. (2003). The knowledge survey: A tool for all reasons. To Improve the Academy, 21(1), 59-78.
Semsar, K., Knight, J. K., Birol, G., and Smith, M. K. (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for Use in Biology. CBE—Life Sciences Education, 10(3), 268-278. https://doi.org/10.1187/cbe.10-10-0133
Siegel, M. A. and Ranney, M. A. (2003). Developing the changes in attitude about the relevance of science (CARS) questionnaire and assessing two high school science classes. Journal of Research in Science Teaching, 40, 757-775. https://doi.org/10.1002/tea.10110