Program
Thursday
8:30 Welcome, Introduction, and Opening Activity
8:50 Overview of Quantitative Research Design and Reorganize Groups
9:15 Face Validity/Content Validity
9:45 Criterion Validity
10:15 Break
10:30 Construct Validity
11:00 Wrap up discussion
11:20 Roadcheck (closed)
11:30 Adjourn for the day
Friday
8:30 Roadcheck Response, Overview, and Validity Review
9:10 Threats to validity, demographic considerations, and available resources
9:40 Reliability and choose your next critique
- Internal Reliability of Scales of Items OR
- Inter-Rater Reliability
10:15 Break
10:30 Choose your own adventure:
- Workshop your research study (e.g., question, sampling, instrument) with a group
- Work in a group on a checklist for reviewing for validity and reliability in manuscripts
11:00 Report out and Wrap up
11:20 End of Workshop Survey
11:30 Adjourn
Resource list
Slides and Excerpted Articles
- Thursday Slides for EER Validity and Reliability Workshop (PDF)
- Friday Slides for EER Validity and Reliability Workshop (PDF)
- Hanauer, D. I., & Dolan, E. L. (2014). The Project Ownership Survey: Measuring Differences in Scientific Inquiry Experiences. CBE-Life Sciences Education, 13(1).
- Kahn, S., Vertesi, J., Adriaenssens, S., Byeon, J., Fixdal, M., Godfrey, K., . . . Wagoner, K. (2022). The Impact of Online STEM Teaching and Learning During COVID-19 on Underrepresented Students' Self-Efficacy and Motivation. Journal of College Science Teaching, 51(6).
- Clements, T. P., Friedman, K. L., Johnson, H. J., Meier, C. J., Watkins, J., Brockman, A. J., & Brame, C. J. (2022). "It made me feel like a bigger part of the STEM Community": Incorporation of Learning Assistants Enhances Students' Sense of Belonging in a Large Introductory Biology Course. CBE-Life Sciences Education, 21(2).
- Zumbrunn, S., McKim, C., Buhs, E. S., & Hawley, L. R. (2014). Support, belonging, motivation, and engagement in the college classroom: A mixed method study. Instructional Science. doi:10.1007/s11251-014-9310-0
- Forrester, C., Schwikert, S., Foster, J., & Corwin, L. (2022). Undergraduate R Programming Anxiety in Ecology: Persistent Gender Gaps and Coping Strategies. CBE-Life Sciences Education, 21(2).
- Ankrum, J. W., Morewood, A. L., Parsons, S. A., Vaughn, M., Parsons, A. W., & Hawkins, P. M. (2020). Documenting Adaptive Literacy Instruction: The Adaptive Teaching Observation Protocol (ATOP). Reading Psychology, 41(2), 71-86.
Websites
- NAGT GER Toolbox - Includes foundational information for geoscience education researchers from beginners to veterans
- Rossman/Chance Applet Collection - A collection of applets for visualizing statistical methods; see especially the randomizing-subjects applet
- Course Resources from Principles of Educational & Psychological Measurement - Dr. Michael Rodriguez's relevant course resources. Dr. Rodriguez is a national expert on psychometrics and accessibility of assessments. He is now Dean of the College of Education and Human Development at the University of Minnesota.
- Example of how an instrument's face validity ("at face value") may be very low even though the instrument shows high validity of other types
- StatisticsHowTo.com - A commercial website (i.e., it has ads) produced by an instructor at Jacksonville University; includes validity explanations, sample calculators, and more statistical resources
Articles or Reports
- Anatomy of an Education Study - Articles published in the CBE-Life Sciences Education journal that have been annotated to illustrate various aspects of designing, conducting, interpreting, and presenting education research studies.
- More than Numbers: A Guide Toward Diversity, Equity, and Inclusion (DEI) in Data Collection
- Taber, K. S. (2018). The Use of Cronbach's Alpha When Developing and Reporting Research Instruments in Science Education. Research in Science Education, 48, 1273-1296.
- Yusoff, M. S. B. (2019). ABC of Content Validation and Content Validity Index Calculation. Education in Medicine Journal, 11(2), 49-54.
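For readers who want to see the arithmetic, below is a minimal sketch of the item-level content validity index (I-CVI) in the spirit of Yusoff (2019). The expert ratings are hypothetical, and the convention shown (ratings of 3 or 4 on a 4-point relevance scale count as "relevant") is the one that article describes; this is an illustration, not the article's code.

```python
# Minimal sketch of the item-level content validity index (I-CVI),
# in the spirit of Yusoff (2019). All ratings below are hypothetical.
# Each expert rates each item's relevance on a 4-point scale
# (1 = not relevant ... 4 = highly relevant).

ratings = {
    "item_1": [4, 3, 4, 4, 3, 4],  # six hypothetical expert ratings
    "item_2": [2, 3, 4, 3, 2, 3],
}

def i_cvi(item_ratings):
    """Proportion of experts rating the item relevant (3 or 4)."""
    relevant = sum(1 for r in item_ratings if r >= 3)
    return relevant / len(item_ratings)

for item, item_ratings in ratings.items():
    print(item, round(i_cvi(item_ratings), 2))

# Scale-level CVI, average approach: the mean of the item-level values.
s_cvi_ave = sum(i_cvi(r) for r in ratings.values()) / len(ratings)
print("S-CVI/Ave:", round(s_cvi_ave, 2))
```

Running this prints an I-CVI of 1.0 for item_1 and 0.67 for item_2, flagging item_2 as a candidate for revision before the scale-level index is interpreted.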
Books
- The Standards for Educational and Psychological Testing
- Creswell & Creswell's Research Design
- The SAGE Handbook of Applied Social Research Methods
- Scale Development: Theory and Applications
Accessibility Resources
Crawford et al. (2021), "Equity & Inclusion in Accessible Survey Design"
A slide presentation that provides both the rationale and evidence for accessible design (the "why"), including changing accessibility law, the impact on validity of failing to integrate accessible design, and the improvements seen when survey design reflects good practice, as well as instructions on integrating accessible design (the "how"), using a survey designed by the authors as an example.
Tools
WAVE: Web Accessibility Evaluation Tool
A tool that reviews and provides feedback on the accessibility of an existing web page (for example, your completed survey prior to launch!). The page includes a video summarizing the tool's functioning.
WebAIM
A tool for checking color contrast (important for users with low vision or color-vision deficiencies)
Additional sources – guidelines, checklists
University of California, Office of the President
A good checklist of dos and don'ts, though with little explanation. Includes links to tools that can be used to improve accessibility and to test instruments for accessibility.
California State University Northridge: Universal Design Center
An extensive list of accessibility-improvement strategies, though it does not always explain the why or the how. Included here because it covers quite a few guidelines not listed in other resources; some may require additional searching for instructions.
Checklist for Reviewing a Manuscript for Validity and Reliability (also what you might consider for your own study)
This list was developed by participants as part of the workshop experience. Note that this workshop focused on quantitative methods. We acknowledge that qualitative methods are also strong approaches, but they were not covered in this workshop.
- What are you trying to study? Who are you trying to study? What is your context? How can you effectively measure what you are trying to study?
- What are the theoretical underpinnings, and how do they relate to your population of interest and context?
- The theoretical section should already scaffold the argument and start to justify the methods.
- If using a validated instrument, are the researchers considering a DEI lens in reviewing their methods, including the instrument?
- Are the questions asked in a way that measures what you want? (e.g., think critically about whether a question relies on the student living in an affluent suburb.)
- Did the researchers show evidence for face validity?
- The instrument itself should be included in supplemental materials or made available to other researchers in some way (e.g., by contacting the first author).
- Content validity: state that the instrument was reviewed by X experts and describe their expertise/qualifications.
- Which lines of evidence for validity are most necessary to make the argument? Are they presented and aligned with the methods?
- Is there sufficient data/evidence to support the argument?
- Findings should clearly state the significance and magnitude of the effect.
- Does the presentation of the findings and validity evidence say enough about the context (setting and population of interest) to judge whether interpretations could be generalized, and if so, in what ways?
- For reliability: if some aspect of the data collection requires coding, are there multiple coders (inter-rater reliability)? If a curricular treatment is tested in multiple courses, does the instrument demonstrate consistency (via reliability measures)? A brief sketch of both checks follows this list.
- Are confounding elements identified in the limitations?
- There is never going to be a perfect study (there is always measurement error). Is what is presented sufficient evidence to support the argument for the intended interpretation?
- Identifying where the study is not perfect is just as important as the evidence of support. Are you acknowledging your limitations? Identify why you made one choice over another (it might have been better to conduct the study in person, but due to the pandemic . . .).
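To make the reliability items in the checklist concrete, here is a minimal sketch of two common checks: Cronbach's alpha for the internal reliability of a scale, and simple percent agreement between two coders (a common precursor to chance-corrected statistics such as Cohen's kappa). All data below are hypothetical, and this is an illustration under those assumptions rather than a prescribed workshop method.

```python
# Minimal sketch of two reliability checks from the list above,
# using small hypothetical datasets (not workshop data).

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """Internal reliability of a scale.

    `items` is a list of item-score lists, one list per item,
    aligned across the same respondents.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(variance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

def percent_agreement(coder_a, coder_b):
    """Simplest inter-rater reliability: share of codes that match.

    Cohen's kappa, which corrects for chance agreement, is a
    common next step once raw agreement looks reasonable.
    """
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical 3-item scale answered by 5 respondents (Likert 1-5).
scale = [
    [4, 5, 3, 4, 2],  # item 1
    [4, 4, 3, 5, 2],  # item 2
    [5, 5, 2, 4, 3],  # item 3
]
print("Cronbach's alpha:", round(cronbach_alpha(scale), 2))  # ~0.89 here

# Two hypothetical coders applying categorical codes to 8 excerpts.
print("Agreement:", percent_agreement(list("AABBCABC"), list("AABBCBBC")))
```

The common rules of thumb (e.g., alpha around 0.7 or higher) should be applied with the cautions Taber (2018) raises: alpha depends on the number of items and on the sample, so report the context alongside the coefficient.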