Advancing Assessment of Scientific and Quantitative Reasoning
Donna L. Sundre, James Madison University
Christopher Murphy, James Madison University
Mary Handley, James Madison University
For more information
email to: firstname.lastname@example.org
This NSF-funded project (DUE 0618599) furthers the development of collegiate scientific and quantitative reasoning assessment tools and procedures. It would seem logical to expect that institutions would use direct measures of student learning to assess important collegiate outcomes; however, this is not the case. Chun (2001) listed four general assessment methods used in higher education: actuarial data, ratings of institutional quality, surveys, and direct measures of learning. It is disappointing to find that direct measures are the least systematically used of the four approaches, yet they are the methods that can best inform us about student growth and development and help us modify instructional delivery to foster and maintain learning. Chun concluded that, "The key is to focus on developing better methods to directly assess student learning" (p. 25). Klein (2001) concurred, insisting that "development of appropriate measures of the quality of undergraduate education is the missing but essential ingredient needed to improve our understanding of the tradeoffs between access, productivity, and quality in American higher education" (p. 26). Without appropriate assessment methods, the nation will continue to rely on less desirable indicators such as student self-reports, actuarial reports, and external ratings of institutional quality. These methods are not suitable for informing us about actual student learning or for improving STEM teaching and learning. Through exploration of the generalizability of our instruments to other diverse institutions, this project has contributed to the knowledge of undergraduate STEM education, developed faculty expertise in assessment practice, and helped build an interdisciplinary community of scholars from five diverse institutions.
The four partner institutions that joined James Madison University were carefully selected to represent institutions with distinct missions serving diverse student populations: Michigan State University, East Lansing, Michigan; St. Mary's University, San Antonio, Texas; Truman State University, Kirksville, Missouri; and Virginia State University, Petersburg, Virginia.
James Madison University (JMU) is uniquely qualified to contribute to the development and dissemination of psychometrically sound instruments and assessment practice due to its long-term commitment to this work via the Center for Assessment and Research Studies (CARS). The project builds upon highly successful work conducted over several years by CARS faculty, with significant collaboration by JMU STEM faculty members, through which objectives for scientific and quantitative reasoning have been carefully crafted and innovative items have been created and mapped to these objectives. The institution is currently using the ninth version of instruments designed to measure collegiate scientific reasoning (SR) and quantitative reasoning (QR) skills and knowledge. The project has built upon our existing research base, which has demonstrated the reliability of scores and the validity of the inferences we wish to make. Our work on this project further supports the hypothesis that the scientific and quantitative reasoning goals and objectives we crafted, and the instruments we developed, generalize successfully to other institutions in need of sound assessment methods and practices. We have generated considerable supporting evidence and have begun disseminating our findings to the scholarly community.
b) Building improved and scientifically based assessment plans for adoption at home institutions through consultation and participation in Faculty Institutes.
c) Building assessment capacity at participating institutions through professional development in assessment practice, analytic methods, and data presentation to enhance curricular reflection and improvement.
d) Developing new assessment models and designs for adoption or adaptation by other institutions.
e) Documenting potential barriers to effective assessment practice and exploring solutions.
f) Creating scholarly communities of assessment practitioners to sustain work at participating institutions and beyond.
The Quantitative and Scientific Reasoning assessment instruments were designed by faculty and assessment specialists at JMU to measure the eight general education QR and SR objectives. To date, there have been nine versions of these tests. The Scientific Reasoning (SR) test assesses six learning objectives with 49 items; the Quantitative Reasoning (QR) test assesses two learning objectives with 26 items. The tests are intended to assess the quantitative and scientific reasoning skills of college-level students.
Partner institution teams gathered at James Madison University for two Faculty Institutes: the first in July 2007 and the second in July 2008. These institutes provided the opportunity for all teams to map items to their home institutions' student learning objectives, develop sound data collection plans, and craft meaningful research questions to guide data analyses. We also reported our findings at the 2008 Faculty Institute and made plans for subsequent presentation and writing projects.
Through participation in this project, the partner institutions have improved the direct assessment of student scientific and quantitative reasoning. Further, each institution's capacity for and sustainability of student assessment has been strengthened.
Evaluation and Assessment Strategies
Products, Key Findings, Publications
Sundre, D. L., Smith, K., Winston, M. & Faison, M. (2009, April). Advancing assessment of quantitative and scientific reasoning: A progress report on an NSF Project. Presentation at the NC State Assessment Symposium. Cary, NC.
Sundre, D. L. (2008, November). Making assessment a catalyst for higher achievement. An invited session presented to the Commonwealth College Learning and Virginia's Shared Future Summit. Co-hosted by the AAC&U and the University of Richmond. Richmond, VA.
Sundre, D. L. (2008, October). Challenges of Quantitative Reasoning Assessment. An invited plenary session presented to the NSF QuiRK/PKAL Quantitative Reasoning Workshop. Carleton College, MN.
Sundre, D. L. (2008, October). The James Madison University story. An invited session presented to the NSF QuiRK/PKAL Quantitative Reasoning Workshop. Carleton College, MN.
Sundre, D. L., Thelk, A. D., Murphy, C. G., Handley, M. K., & Hall, M. D. (2008, October). Advancing assessment of quantitative and scientific reasoning: A progress report. Poster session presented at JMU CISAT Faculty Research Day. Harrisonburg, VA.
Sundre, D. L. (2008, August). Institution-wide assessment of scientific and quantitative reasoning learning gains. An invited workshop for the NSF Principal Investigators Conference. Washington, DC.
Sundre, D. L. (2008, August). Advancing assessment of scientific and quantitative reasoning: Year 2. A poster session presented at the NSF Principal Investigators Conference. Washington, DC.
Sundre, D. L. (2008, July). Using the results of assessment to improve student learning. An invited workshop for the Southern Association of Colleges and Schools Commission on Colleges Institute on Quality Enhancement and Accreditation. Orlando, FL.
Sundre, D. L. (2008, July). Assessment: An invitation to engage. Plenary welcome address for the 2008 Assessment Institute hosted by the Center for Assessment and Research Studies. Harrisonburg, VA.
Sundre, D. L., Thelk, A., Kaplan, J., Ridley, E., & Lindevald, I. (2008, June). Advancing the assessment of quantitative and scientific reasoning: First-year results. Presented at the International Assessment and Retention Conference. Scottsdale, AZ.
Sundre, D. L., & Thelk, A. D. (2008, March). Advancing assessment of quantitative and scientific reasoning. Paper presented to the American Educational Research Association. New York, NY.
Sundre, D. L., & Thelk, A. D. (2008, February). Content alignment and standard setting techniques for general education assessment. An invited workshop for the General Education Requirements Committee at the University of Miami. Coral Gables, FL.
Miller, B. J., Setzer, C., Sundre, D. L., & Zeng, X. (2007, April). Content validity: A comparison of two methods. Paper presented to the National Council on Measurement in Education. Chicago, IL.