Questionnaire to measure indicators for recruitment/retention in geoscience careers
Miriam Fuhrman, American Institutes for Research
This partial survey contains generic items that can be used to evaluate indicators associated with enhanced likelihood of students remaining in the geosciences. This is a DRAFT document and has not yet been piloted or tested.
What learning is this evaluation activity designed to assess?
The instrument is designed to measure changes in attitudes and behaviors related to indicators associated with an enhanced likelihood of students remaining in the geosciences. These indicators were developed by AIR staff as part of a conceptual framework that enables us to assess the short-term effectiveness of NSF Opportunities for Enhancing Diversity in the Geosciences (OEDG) projects. The framework is based on a general literature review of STEM college major/career choice by underrepresented minorities, together with a critical incident study focusing on specific indicators for geoscience career choice (Fuhrman et al., 2004). These geoscience indicators include, for example, attitude toward outdoor activities, ability to work in groups, and geoscience faculty accessibility. The specific indicators assessed in the draft survey are listed in the separate indicator/survey linking document.
What is the nature of the teaching/learning situation for which your evaluation has been designed?
This instrument is designed for use as a pre-/post-evaluation of an individual workshop or course. It is based on similar surveys used by OEDG grantees as one way to evaluate the effectiveness of OEDG programs, workshops, and courses in encouraging members of underrepresented minorities to enter and be retained along a pathway toward a geoscience career.
What advice would you give others using this evaluation?
Customized items can be added to address the specific goals of an individual workshop or course; see the optional items at the end of the instrument for some examples. Because these items are designed to measure changes in attitudes and behaviors, it is important to administer exactly the same items in the pre- and post- versions of the survey; using different versions of items in the pre- and post- administrations makes the data difficult to interpret.
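As an illustration of why identical pre/post items matter, the minimal Python sketch below pairs each student's pre- and post-course responses to the same Likert-scale item and tests the change. The item wording, scores, and student data are invented for illustration and are not drawn from the draft survey.

```python
# Hypothetical sketch: comparing pre- and post-course responses to the SAME
# Likert-scale item (1 = strongly disagree ... 5 = strongly agree).
# Scores below are invented; one entry per student, matched across surveys.
from scipy import stats

pre_scores  = [2, 3, 3, 4, 2, 3, 1, 4]   # e.g., "I enjoy working outdoors" (pre)
post_scores = [3, 4, 3, 5, 3, 4, 2, 4]   # same item, same students, post-course

# Because the item wording is identical, each pre/post pair is directly
# comparable, and a paired test of the score change is meaningful.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
mean_change = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
print(f"mean change = {mean_change:+.2f}, paired t-test p = {p_value:.3f}")
```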
Note that these items are all closed-ended. It may be tempting to add open-ended items, but the analysis is greatly streamlined when as many items as possible are closed-ended. Closed-ended questions can always include a catch-all option such as "Other (explain): ___"; common "other" answers can then become formal options in subsequent versions of the instrument.
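As a hypothetical sketch of how closed-ended items streamline analysis, the short Python example below tallies responses to an invented closed-ended item that includes an "Other (explain)" catch-all; the option labels and write-in text are illustrative only.

```python
# Hypothetical sketch of tabulating a closed-ended item with an
# "Other (explain)" catch-all; option labels and responses are invented.
from collections import Counter

responses = [
    {"choice": "Field trip", "other_text": ""},
    {"choice": "Lab work",   "other_text": ""},
    {"choice": "Other",      "other_text": "Talking with a geoscience professor"},
    {"choice": "Field trip", "other_text": ""},
    {"choice": "Other",      "other_text": "Talking with a professor"},
]

# Closed-ended choices tabulate directly, with no coding of free text required.
print(Counter(r["choice"] for r in responses))

# "Other" write-ins are reviewed separately; answers that recur frequently can
# be promoted to formal options in the next version of the instrument.
print(Counter(r["other_text"].lower() for r in responses if r["choice"] == "Other"))
```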
Are there particular things about this evaluation that you would like to discuss with the workshop participants? Particular aspects on which you would like feedback?
As presented, this is a DRAFT document and has not yet been piloted or tested. Although some items have been used in evaluation instruments for OEDG projects, others have not. We are currently using cognitive laboratory testing to determine whether the wording is clear and elicits appropriate responses from college students. We welcome feedback from workshop participants on specific items in terms of how useful they might be in evaluating the effectiveness of the workshops and courses in which participants are involved.
Evaluation Materials
- Survey Instrument (Microsoft Word 216kB, May 2, 2005)
- Linking document: indicators and items (Microsoft Word 44kB, May 2, 2005)
- Conceptual Model (with indicators) -- draft document, as this model is still being refined (Microsoft Word 128kB, May 2, 2005)