
SENCER E-Newsletter, November 2003, Volume 2, Issue 5

A Brief History of the Development of the SALG and the SENCER-SALG

The Student Assessment of their Learning Gains (SALG) instrument was developed in response to classroom innovators' expressed need for a valid student evaluation instrument. Faculty sought an instrument that would let them adjust and improve their teaching methods, assess their learning objectives and related class activities, and focus students' attention on how well specific class features and activities enabled their learning. This was especially critical in innovative classrooms, where faculty were concerned that their work would be judged by inappropriate criteria.

The SALG instrument was originally developed to meet the needs articulated by participants in the ChemLinks Coalition and the Modular Chemistry Consortium (now allied as "ChemConnections"), which formed part of the National Science Foundation's Systemic Change Initiative in Chemistry. Data collected as part of the formative evaluation of ChemConnections' work also informed the development of the SALG. This included interviews with faculty who were developing and testing the chemistry modules, and with 345 students in a matched sample of modular and more traditionally taught introductory chemistry classes at eleven participating institutions.

The interviews examined two broad categories of assessment: assessments of course pedagogy and of teacher performance (expressed in terms of what students "liked"), and assessments of students' own learning gains from aspects of the class. Student likes or dislikes were expressed in phrases like these:
"I thought the teacher was very organized and presented the material well,"
"The tests were fair,"
"The teacher was very approachable," and
"Some of the demonstrations didn't work well."

Although statements like these may imply that teacher characteristics such as organization, fairness, approachability, and technical competence have some impact on student learning, the connection is left unstated, and such comments offer limited feedback about what actually enables learning.

Students' observations on "what they liked" were also less useful than their estimates of "what they gained." Analysis of student judgments of faculty performance revealed that when all students' observations were compiled (for both the modular and the comparative classes), positive and negative comments about what students liked were almost evenly split. Neither group of faculty got a clear picture of the overall effectiveness of their teaching when judged on the perceived quality of their professional performance.

This finding reflects a common faculty experience: asking students what they "liked" about their classes (especially where no criteria for these judgments are offered) provides little useful information to faculty (or their departments) about the impact of their classroom work on students.

By contrast, when all observations about learning gains were totaled and divided into positive (things gained), negative (things not gained), and mixed reviews (qualified assessments of gain), the teachers of both the modular and comparative classes got clear overall evaluations from their students. Approximately 57 percent of the observations for both types of class were positive, 31 percent were negative, and 12 percent were "mixed." The feedback that the innovative and the more traditional teachers received differed according to the nature of each course's pedagogy, but both positive and negative learning-gains statements proved useful to the module developers and testers and resulted in improvements to their work.

The first version of the current SALG instrument was developed in 1997 and was successfully piloted in classes in three participating chemistry departments. With a grant from the Exxon-Mobil Educational Foundation, the SALG was tested with a sample of 28 modular chemistry classes and then by wider groups of ChemConnections faculty and others. Questions were initially derived from modular learning objectives. However, as faculty in other disciplines began to use the instrument, it became clear that many of these learning objectives were widely shared across the sciences and other disciplines. Question groupings explored concepts and skills, application of knowledge, connections with other bodies of knowledge, use of resources, appreciation of the subject, and estimates of what students would carry away from the class.

The instrument's structure focuses on questions about how much particular aspects of a course enable student learning. From the outset, students were asked to respond to the instrument's questions on a five-point Likert-style scale. Likert scales are commonly used both in institutional classroom evaluation instruments and in questionnaires exploring degrees of agreement or disagreement with position statements. A "not applicable" option and questions inviting students' write-in comments were subsequently added.
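To make the item structure concrete, here is a minimal sketch in Python of how a single gains question response might be represented. The field names and scale labels are illustrative assumptions; the actual SALG wording and implementation are not reproduced here.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative five-point gains scale; the real SALG labels may differ.
GAIN_LABELS = {1: "no gain", 2: "a little gain", 3: "moderate gain",
               4: "good gain", 5: "great gain"}

@dataclass
class SalgResponse:
    """One student's answer to one gains question."""
    rating: Optional[int] = None   # 1-5, or None for "not applicable"
    comment: str = ""              # optional write-in comment
```

The key design point is that every question asks how much a course feature enabled learning, so a single response shape (a rating plus an optional comment) covers the whole instrument.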

In 1998, with funding from the Exxon-Mobil Educational Foundation, Sue Lottridge of the LEAD Center at the University of Wisconsin-Madison began to develop a website to streamline the use of the SALG paper instrument. The website gives faculty easy access to the instrument and lets them modify it, while students complete it online. The website automatically provides the teacher with both the raw data and a set of standard survey results. This resolves implementation problems related to compiling results and allows easy dissemination of the SALG instrument to many users. Used as a formative tool (to revise courses), the online SALG instrument can easily be administered mid-semester as well as at the end of a course.
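As a rough sketch of the kind of summary a "standard survey results" report might contain (an assumption for illustration, not the SALG site's actual output), one could tally each question's ratings into counts, a mean, and a percentage distribution:

```python
from collections import Counter

def summarize(ratings):
    """Tally 1-5 ratings (None = not applicable) for one question.
    Illustrative only; not the actual SALG site's report format."""
    applicable = [r for r in ratings if r is not None]
    n = len(applicable)
    counts = Counter(applicable)
    return {
        "responses": n,
        "not_applicable": len(ratings) - n,
        "mean": round(sum(applicable) / n, 2) if n else None,
        "percent_by_rating": {k: round(100 * counts[k] / n) for k in sorted(counts)},
    }

# Example: eight students answer one question.
print(summarize([4, 5, 3, None, 4, 4, 2, 5]))
```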

By December 2002, the site had 2,043 registered users and 2,019 registered courses. Currently, 649 registered courses are linked with 293 SALG users. The majority of these users (167) registered only one course, and the vast majority (245) registered three or fewer courses. A total of 18,084 students completed surveys in these 649 courses.

In 2001, the SENCER organizing group, led by David Burns and Karen Oates, asked Elaine Seymour and Sue Daffinrud to help them develop core versions of the online instrument addressing the learning objectives, class activities, and items about science skills and interests that are shared by SENCER courses and their users. In addition to the customization of the SALG for SENCER courses, a significant innovation sponsored by the SENCER project is the development of "pre" and "post" versions of the instrument. This innovation allows instructors to measure changes on certain critical indicators over the course of each semester. During the last year, pilot versions of these instruments have been tested with limited samples of SENCER course teachers. This fall and in the coming spring, the resulting beta versions of the instrument will be used with a larger sample.
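To illustrate the pre/post idea (a hypothetical sketch, not the SENCER project's actual analysis method), a paired comparison might compute the mean change on one indicator from matched student ratings:

```python
def pre_post_change(pre, post):
    """Mean change on one indicator for students with matched
    pre- and post-course ratings. Hypothetical illustration only."""
    diffs = [after - before for before, after in zip(pre, post)]
    return sum(diffs) / len(diffs) if diffs else None

# Example: 1-5 ratings from five matched students.
print(pre_post_change([2, 3, 2, 4, 3], [4, 4, 3, 5, 4]))  # 1.2
```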

-Based on a report prepared by Dr. Elaine Seymour, University of Colorado at Boulder