QuIRK's Example Responses to Application Prompts

1. What is the status of Quantitative Reasoning programming on your campus?
In 2003, Carleton faculty created a working group to discuss quantitative reasoning in student work. After sharing anecdotes that suggested reason for concern, the group discussed where we might look for evidence to either confirm or allay our concerns. A member of the working group suggested we examine student writing portfolios. Completed at the end of the sophomore year as a graduation requirement, the portfolios contain authentic student work submitted to courses across the curriculum. We reasoned that if quantitative reasoning (QR) is the habit of mind of considering the power and limitations of numerical evidence to inform problems in a wide variety of contexts, then it should be evident (to the extent it is happening) in the "natural" environment of student work throughout the curriculum.

The initial portfolio reading identified four student learning goals with nine associated outcomes we would hope to see in student work. (See below.) With support from the US Department of Education's Fund for the Improvement of Postsecondary Education, we developed a scoring rubric to evaluate the extent and quality of QR in the student writing portfolios. (See http://serc.carleton.edu/quirk/Assessment.html.)

The results of early assessment were both encouraging and sobering. First, we found that existing assignments provided ample opportunity for students to work on QR: about 1/3 of the papers in the sophomore portfolio were centrally related to a quantitative issue. Another 1/3 were "peripherally related" to QR in the sense that judicious use of numerical evidence could significantly strengthen the argument by providing context. In other words, we did not need to fundamentally alter the curriculum in order to enhance QR instruction.

While the potential for QR use was prevalent, actual student use of QR was much less frequent. In papers for which QR was centrally relevant to the topic, about 2/3 of students actually used numerical evidence. Perhaps more importantly, we found that in papers for which QR was peripherally relevant, only 1/8 of students employed QR. In place of precise information, students too often fell back on ambiguous terms such as "many," "few," and "frequently."

We interpret these findings as evidence that students do not fully appreciate the power of QR to strengthen written argument. In response, we have crafted a series of professional development workshops to train professors to use new and existing assignments to teach QR. In addition, we have invited a series of campus speakers whose work exemplifies effective use of QR in topics ranging from public health to voting systems to evaluating the consequences of the Iraq War.

We continue to consult data from an annual assessment of student work to guide our programming. (We also find that involving faculty in assessment is an effective way to motivate changes in teaching practice.) With support from the National Science Foundation and the W. M. Keck Foundation, we continue to fund curricular reforms with a particular emphasis on the Arts, Literature, and Humanities.

2. What are the key learning goals that shape your current programming or that you hope to achieve?
We have four learning goals with nine associated outcomes:

  1. Thinks quantitatively
    • States questions and issues under consideration in numerical terms
    • Identifies appropriate quantitative or numerical evidence to address questions and issues
    • Investigates questions by selecting appropriate quantitative or numerical methods
  2. Implements competently
    • Generates, collects, or accesses appropriate data
    • Uses quantitative methods correctly
    • Focuses analysis appropriately on relevant data
  3. Interprets and evaluates thoughtfully
    • Interprets results to address questions and issues under consideration
    • Assesses the limitations of the methods employed, if appropriate to the task or assignment
  4. Communicates effectively
    • Presents and/or reports quantitative data appropriately

3. Do you have QR assessment instruments in place? If so, please describe:
At Carleton we assess QR in student writing using a rubric that sequentially considers whether QR is demanded by the assignment, whether the student takes the assignment in a direction that involves QR, whether the student actually uses QR, and how well the student employs QR if used. (A schematic sketch of this sequential logic appears after the list below.) Because certain errors have shown up repeatedly in past assessments, we also code for the presence or absence of several specific issues:

  • Uses ambiguous words rather than numbers.
  • Fails to describe own or others' data collection methods.
  • Doesn't evaluate source or methods credibility and limitations.
  • Shows inadequate scholarship on the origins of quantitative information cited.
  • Makes an unwarranted claim about the causal meaning of findings.
  • Presents numbers without comparisons that might give them meaning.
  • Presents numbers but doesn't weave them into a coherent argument.
With support from the National Science Foundation, we are adapting this rubric for use at a wide variety of institutions.
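
For illustration only, here is a minimal Python sketch of the rubric's sequential gating: each question is meaningful only once the previous one has been answered affirmatively. All names here (PaperScore, code_paper, the issue labels) are hypothetical shorthand invented for this example, not part of our actual scoring instrument.

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Recurring issues coded for each paper (labels are illustrative).
    RECURRING_ISSUES = [
        "ambiguous_words",           # ambiguous words rather than numbers
        "methods_undescribed",       # data collection methods not described
        "credibility_unevaluated",   # source/method credibility not evaluated
        "inadequate_scholarship",    # weak sourcing of quantitative information
        "unwarranted_causal_claim",  # unwarranted causal interpretation
        "no_comparisons",            # numbers lack comparisons that give meaning
        "not_integrated",            # numbers not woven into the argument
    ]

    @dataclass
    class PaperScore:
        assignment_demands_qr: bool          # is QR demanded by the assignment?
        student_direction_involves_qr: bool  # does the student's approach invite QR?
        student_uses_qr: bool                # does the student actually use QR?
        quality: Optional[int] = None        # how well QR is employed, if used
        issues: List[str] = field(default_factory=list)

    def code_paper(score: PaperScore) -> dict:
        """Apply the rubric's questions in sequence; stop at the first gate
        that closes, since later questions are then moot."""
        result = {"relevance": "none", "used_qr": False,
                  "quality": None, "issues": []}
        if not (score.assignment_demands_qr or score.student_direction_involves_qr):
            return result  # QR neither demanded nor invited
        result["relevance"] = ("central" if score.assignment_demands_qr
                               else "peripheral")
        if not score.student_uses_qr:
            return result  # an opportunity for QR, but unused
        result["used_qr"] = True
        result["quality"] = score.quality
        result["issues"] = [i for i in score.issues if i in RECURRING_ISSUES]
        return result

For example, a paper that uses numbers but offers no comparisons might be recorded as code_paper(PaperScore(True, True, True, quality=2, issues=["no_comparisons"])).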

We have also been involved in a consortium longitudinally testing the Collegiate Learning Assessment (CLA). While the CLA attempts to measure critical thinking more broadly, many of its "performance tasks" are heavily quantitative.

4. Considering your campus culture, what challenges or barriers do you anticipate in implementing or extending practices to develop and assess QR programming on your campus?
Faculty time is always scarce. This can limit faculty involvement in assessment, professional development workshops, and curricular revision.

A small group of about 40 faculty members is deeply involved with assessment. Within this group, assessment is accepted when its benefits (learning) are believed to outweigh its costs (time). The rest of the faculty are not hostile toward assessment but are somewhat skeptical about whether the institution is committing too many resources in this direction.

While the evaluation of younger faculty is clearly tied to teaching practice, it is less clear how the college rewards senior faculty for involvement in our work.

5. Considering your campus culture, what opportunities or assets will be available to support your QR initiatives?
We are in the middle of reaccreditation. This process has underscored the need for regular assessment of student learning to guide future innovation. Because many faculty are involved in the reaccreditation process, campus awareness of the use and value of regular assessment has been heightened.

We are also in the middle of a curriculum review. While it is too soon to know exactly what will come out of that process, all three curriculum design teams proposed a new QR graduation requirement. As any new requirement moves forward, we expect faculty to be increasingly interested in how well the resulting policy meets our learning goals.

Our Learning and Teaching Center hosts weekly brown-bag seminars for faculty to learn about and discuss innovations in teaching. We have found these to be effective venues for disseminating assessment findings and galvanizing faculty to change.

The college places strong emphasis on teaching in formal evaluations. Younger faculty can be encouraged to engage with our program as a way to demonstrate active involvement in professional development.