Evaluating Learning

Evaluation and Assessment

We have undertaken various evaluations to measure the success of the SENCER QL course (1) against our original course goals and (2) in comparison to the standard QL course on our campus. These include the Student Assessment of Learning Gains (SALG), focus groups, course survey comparisons, pre/post test results, and pre/post knowledge survey results. A summary of the results of each follows.


As a means of obtaining student feedback on various aspects of the course, we made use of the on-line Student Assessment of Learning Gains (SALG) tool (http://www.wcer.wisc.edu/salgains/instructor/), which was developed under the SENCER project grant (http://www.sencer.net/). This free site is offered as a service to the college-level teaching community. The instrument is a powerful tool: it can be easily individualized, provides instant statistical analysis of the results, and facilitates formative evaluation. At the site, any instructor in any discipline who would like feedback about how various course elements are helping students learn can begin with a general template survey and modify it to meet their own needs. However, all projects supported by SENCER were directed to use a special SENCER SALG survey template. When we began work on our project, there was a single version of the SENCER SALG template, and we found a number of its questions too oriented toward science courses to be useful in a math class. We therefore created more math-oriented versions of these questions and used the resulting math-oriented SENCER SALG.

At the beginning of the semester, students were assigned to complete the pre-course SENCER MATH SALG survey; at the end of the semester, they were assigned the post-course survey. While the SALG was easy to administer and most students completed the assignments, we encountered a problem with the instrument: when students log on to complete the survey, the system defaults to the pre-course survey. We discovered that a number of students accidentally completed the pre-course survey when they should have been completing the post-course survey. As a result, the pre-course survey results were somewhat biased and the post-course survey results were incomplete.

Students in both the standard and the SENCER QL courses completed the MATH SALG survey. Comparing the results of the two groups, we see only small differences, perhaps largely because of the similarity in course content and the fact that the standard course was already fairly successful in meeting its basic goal of developing quantitative literacy. In addition, it was not possible to include a specific question about how the projects influenced student learning, since the same MATH SALG instrument was used with students in the standard course, where projects were not assigned. We therefore used other forms of assessment in addition to the MATH SALG, and we describe below where those assessment tools did show greater differences between the two groups of students.

Three Focus Groups

Prior to developing the SENCER QL course, we held a small focus group in Fall 2004 with students in the standard QL course to find out what they thought of the projects concept. From this we learned that they were most concerned about collaboration issues (would they end up doing all the work?) and about the additional time and workload demands of adding a project to the existing coursework.

In Fall 2005 we conducted a focus group with 19 volunteers (out of a total enrollment of 99) from the four sections of the SENCER QL course. The students generally agreed that mathematics was NOT a favorite subject. They observed that the projects approach afforded them an opportunity to demonstrate their skills and abilities beyond what "cramming for a final exam" would. This group replied quite positively when asked if the projects approach should be retained (15 yes, 3 no, 1 not sure).

In Spring 2006 we conducted a focus group during class time that was attended by 10 (out of a total enrollment of 11) students. This group enlightened us about their discomfort with the open-ended nature of the projects and the lack of direction about exactly which mathematical methods to apply at each stage.

Course Survey Comparisons

During 2005-6 we surveyed 62 students in several sections of the standard QL course and 29 students in the SENCER QL course. They gave strikingly similar responses about the usefulness of mathematics, which demonstrates that the standard version of the course was already doing a good job of reaching one of its goals. However, students in the SENCER QL course reported much greater awareness of community issues after the course, as the following chart shows.

[Chart: awareness of community issues, standard vs. SENCER QL courses]

We also asked students in the SENCER QL class some questions that related directly to the projects. We found that 79% agreed that their projects enabled them to connect their classroom learning (elementary statistics, mathematics of finance, computer spreadsheets) to campus or community issues, and 79% also said the projects helped them practice and learn mathematical or analytical skills. In addition, 59% reported that the project experience taught them non-mathematical skills (e.g., working with people, time management, writing and distributing a survey).

Pre and Post Test Results

All students in the SENCER QL course took a nine-problem multiple-choice pre- and post-test. The problem topics were: percentages, putting numbers in perspective, savings plans, credit cards, loans, statistical studies, the normal distribution, margin of error, and spreadsheets. The pre- and post-tests did not have identical problems, but problem types were matched so that pre/post comparisons could be made for each problem type. A sample pre-test can be found in Appendix A. The following bar chart summarizes the results by question type.

[Chart: pre- and post-test results by question type]

Every question type shows a gain in the percentage of students who were able to answer correctly, although the areas of credit cards, spreadsheets, and representative samples did not show as large a gain as the other question types. Note, however, that students did fairly well on the pre-test questions on spreadsheets and representative samples, so we would not expect to see a very large gain there. The disappointing post-test performance on the credit card question may be due to the fact that the topic is covered by a handout that supplements the text, so students may not consider it important.

Students in the standard course also took pre- and post-tests; theirs was a 12-problem multiple-choice test. Nine of the twelve problem types correspond to problem types that appeared on the SENCER QL pre- and post-test. (In fact, the problems came from the same problem bank.) The mean percents correct on both the pre- and post-test are very similar for the standard and SENCER QL courses, as the chart below indicates.

[Chart: mean percent correct on pre- and post-tests, standard vs. SENCER QL, 2005-6]

The following chart compares the standard and the SENCER QL courses in these nine problem areas. The greatest difference in post-test results is that the percentage of students from the SENCER QL course who correctly answered a question on interpreting the margin of error exceeded the percentage from the standard course by approximately 20 percentage points. We speculate that the survey-based projects encouraged students to think more carefully about interpreting a margin of error.

[Chart: pre- and post-test comparison by problem area, standard vs. SENCER QL]
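The margin-of-error interpretation that the projects appear to have reinforced can be made concrete with a short calculation. The sketch below uses the common 1/sqrt(n) rule of thumb for the approximate 95% margin of error of a survey proportion; the sample size and poll result are hypothetical illustrations, not taken from the course data.

```python
import math

def margin_of_error(n):
    """Approximate 95% margin of error for a proportion estimated
    from a simple random sample of size n (1/sqrt(n) rule of thumb)."""
    return 1 / math.sqrt(n)

# Hypothetical poll: 400 respondents, 62% favor a proposal.
# The margin of error is about 5 percentage points, so the true
# proportion is plausibly anywhere from roughly 57% to 67%.
n, p = 400, 0.62
moe = margin_of_error(n)
print(f"n = {n}: margin of error = {moe:.1%}")
print(f"plausible range: {p - moe:.0%} to {p + moe:.0%}")
```

The interpretive point the projects drove home is the last line: a single reported percentage is really an interval, and quadrupling the sample size only halves its width.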

Pre and Post Knowledge Survey Results

A knowledge survey contains content questions that test mastery of particular course learning objectives. The questions are presented in the same order as the material is presented during the semester. Students take knowledge surveys at the beginning and the end of the semester. They respond to the questions, not by providing answers, but by indicating on a three-point scale how confident they are that they could correctly answer the question. Knowledge surveys can address both basic skills and complex open-ended questions. Time limitations would prevent students from directly answering all the questions on a thorough knowledge survey, but they can rate their confidence to provide answers in a fairly short time period. Tests can only address a limited portion of a course, while knowledge surveys can cover one in depth. Class averages on knowledge surveys have been shown to be good representations of the class's knowledge and abilities. The knowledge survey we constructed for our SENCER QL course in Spring 2006 can be found in Appendix B. The bar chart below summarizes the results.

[Chart: pre- and post-knowledge survey results by question]

Questions 1-9 on the knowledge survey relate to number sense (percents, significant digits, putting numbers in perspective), and many of these are topics students have seen in the past, so it is not surprising that the pre-knowledge survey ratings are fairly high. Question 10 relates to Simpson's paradox, a topic that was not actually covered in the course, so we should not expect the post-knowledge survey to show improvement on that item. Questions 12-18 relate to financial matters (savings plans, loans, taxes, investments), and the results show good improvement in students' confidence in their ability to solve these problems. Questions 19-26 relate to statistics and again show good improvement in students' confidence in their ability to solve these problems.
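Because a knowledge survey is scored by averaging confidence ratings rather than by grading answers, the pre/post comparison reduces to per-question class averages. The sketch below shows that computation on made-up ratings (1 = could not answer, 3 = confident I could answer); the students, questions, and numbers are purely illustrative, not the course's actual data.

```python
# Hypothetical ratings: one inner list per student, one entry per question,
# on a three-point confidence scale (1 = could not answer, 3 = confident).
pre  = [[1, 2, 1], [2, 2, 1], [1, 3, 2]]
post = [[3, 3, 2], [3, 2, 2], [2, 3, 3]]

def class_averages(ratings):
    """Average confidence per question across all students."""
    n_students = len(ratings)
    n_questions = len(ratings[0])
    return [sum(student[q] for student in ratings) / n_students
            for q in range(n_questions)]

for q, (a, b) in enumerate(zip(class_averages(pre), class_averages(post)), 1):
    print(f"Q{q}: pre {a:.2f} -> post {b:.2f} (gain {b - a:+.2f})")
```

A question like the Simpson's paradox item above would show up here as a near-zero gain, which is exactly how uncovered topics serve as a built-in control.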

Conclusions About Student Learning

The pre- and post-tests and knowledge surveys give evidence of students' improved quantitative skills. On the course surveys, nearly 80% of students in the SENCER QL sections agreed that the projects approach helped them connect their classroom learning to the real world and gave them an opportunity to practice and learn mathematical or analytical skills. But the "best college teachers" hold, and we agree, that meaningful learning involves not only content mastery but also changes in attitude and beliefs.5 The typical student taking the quantitative literacy course to meet her general education requirement in mathematics often does not like mathematics very much. This underlying dislike can still come through at the end of the course; for example, one student agreed with the statement "The project helped me connect my classroom learning to the real world" and then went on to write: "More so than any other boring math class." However, the course surveys also indicate that after the course students are more aware of the usefulness of mathematics and of local community issues. What may be the most significant impact is the hardest to assess: namely, the potential benefits gained from the project experience of tackling an open-ended question as a member of a team. We do know that nearly 60% of those surveyed reported that the project experience taught them non-mathematical skills.

Conclusions and Reflections

In this section we first address the question of whether we met our project goals and then reflect on the value of incorporating group projects into the course.

Did we meet our project goals?

The specific goals stated in our original SENCER proposal were to:

  1. develop an alternate version of our current math core class (MATH 102 Quantitative Skills for the Modern World) that would be accessible to students with only a high school math background and in which students learn and apply mathematics to address problems in the greater Los Angeles area (and hence become 'civically engaged');
  2. have each of the three mathematics faculty on the team teach at least one section of the course during academic year 2005-6;
  3. recruit other faculty to teach this SENCERIZED version of the course.

Goal (1) has been met, with the qualification that our projects drew on the very local campus and community rather than the greater Los Angeles area for all of the project topics. We exceeded Goal (2): the three mathematics faculty on the team taught a total of five sections of the revised course during 2005-6. Our success at Goal (3) remains to be seen. Fifteen faculty attended the mathematics department seminar talk we gave on April 10, 2006, and 9 faculty (6 full-time tenure-track, 2 part-time, and 1 full-time non-tenure-track) attended the two-hour dissemination workshop we presented on May 10, 2006. The latter was intended to familiarize potential instructors of the course with the projects-based approach and materials we have developed. Of the 9 faculty present, 100% "Agreed" or "Strongly Agreed" with all three of the following statements:

As a result of attending this workshop I have a better idea about:

  1. The rationale for teaching MATH 102 with projects;
  2. The materials that are available to teach MATH 102 with projects;
  3. How to teach MATH 102 with projects.

When asked if they would willingly volunteer to teach the SENCER version of the quantitative literacy course, 56% said "yes" and 44% said they were "not sure." Every person who responded "not sure" went on to comment that it was the extra time commitment that made him or her hesitate. Those who responded positively commented: "It would be fun." "It sounds cool." "Great way to engage students with material - gives students valuable experience." "Seems rewarding, but wouldn't do it with other new preparations."

One of the original developers will be teaching the SENCER QL course in Fall 2006. The final version of the materials we have developed will go to faculty teaching this course along with an invitation to adopt the SENCER projects approach.

Reflection: Making a case for group projects

To make a case for group projects, we turn to a well-known and provocative article, "Seven Principles for Good Practice in Undergraduate Education," written by Chickering and Gamson for the entire spectrum of higher education (faculty members, campus administrators, state agencies, and government policy makers) and later amplified into a book.6 After reflecting on our year's experience using group projects in a quantitative literacy course, we find that this approach measures up rather well to the seven principles. We give the details in Appendix C.

Appendix A: Sample Pre/Post Test (PDF)

Appendix B: Knowledge Survey (PDF)