Literacy Assessment

James D. Myers, University of Wyoming
These activities are pre- and post-course surveys to assess improvement in the fundamental, technical and citizenship literacies necessary to turn scientific content into scientific knowledge and understanding. They collect background information, test literacy proficiency quantitatively and assess the student's confidence level.

What learning is this evaluation activity designed to assess?

Our literacy surveys are designed to determine if students' fundamental, technical and citizenship literacy proficiencies improved during a course. The fundamental and technical literacies survey assesses the skills, e.g. quantitative calculations, spatial visualization, map reading, etc., necessary to turn scientific content into scientific understanding. The citizenship literacy survey evaluates those tools (recognition of hidden costs, predicting consequences, recognizing impacts) needed to use Earth science to address societal issues with geoscience components.

What is the nature of the teaching/learning situation for which your evaluation has been designed?

Our courses are designed to produce informed citizens who can use sound geoscience principles to address societal problems with a geologic component, e.g. building in a floodplain, siting a dam or opening a new mine. To accomplish this goal, they explicitly emphasize a set of supporting literacies as well as the traditional geoscience content. Mastery of the supporting literacies allows students to use their scientific knowledge to address the many societal issues they will face later in life as citizens of a democracy. To determine if we are successful, we need to assess not only our students' geoscience knowledge but their literacy proficiencies as well. These surveys are designed to assess literacy proficiency independent of scientific content. They provide us with information about improvement in skill mastery as well as changes in attitude and confidence.

What advice would you give others using this evaluation?

To ensure student participation and good-faith effort, we award points for completing the surveys. A student who turns in both the pre- and post-surveys receives 10 points (5 for the initial survey, 5 for the final), counted toward the total required for the class grade. This encourages students, at the least, to turn in the survey. Participation is also encouraged by administering the surveys in the first and last lab sessions of the course. To encourage students to do their best, we award extra credit points based on individual performance, up to a maximum of 5 points on the initial survey and 5 on the final survey. So as not to penalize students who have not worked with the literacies recently, we calculate the class average on the initial survey and assign everyone the same amount of extra credit. On the final survey, each student receives the grade they earned, normalized to five points.
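The point scheme above can be sketched as a short calculation. This is only an illustration of the scheme as described; the function names and the maximum survey score are hypothetical, not the authors' actual grading tools.

```python
# Hypothetical sketch of the survey point scheme described above.
# Function names and the max_score parameter are illustrative assumptions.

def completion_points(turned_in_pre: bool, turned_in_post: bool) -> int:
    """5 points each for turning in the initial and final surveys."""
    return (5 if turned_in_pre else 0) + (5 if turned_in_post else 0)

def pre_survey_extra_credit(class_scores: list, max_score: float) -> float:
    """Everyone gets the same extra credit: the class average, scaled to 5 points."""
    class_avg = sum(class_scores) / len(class_scores)
    return 5 * class_avg / max_score

def post_survey_extra_credit(student_score: float, max_score: float) -> float:
    """Each student's own final-survey score, normalized to 5 points."""
    return 5 * student_score / max_score

# Example: a student completes both surveys in a class averaging 12/20
# on the pre-survey, then scores 16/20 on the post-survey.
print(completion_points(True, True))              # 10
print(pre_survey_extra_credit([10, 12, 14], 20))  # 3.0
print(post_survey_extra_credit(16, 20))           # 4.0
```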

Although these surveys can be used in conjunction with any course, we feel they are best suited for classes that actively emphasize the importance of the supporting literacies. Only if a course explicitly and repeatedly addresses these literacies will students gain an appreciation for these skills and the practice necessary to acquire proficiency in them. Given the consistently poor performance of students on the pre-course survey, it is clear students do not simply pick up these skills through casual exposure during their academic careers.

Are there particular things about this evaluation that you would like to discuss with the workshop participants? Particular aspects on which you would like feedback?

We are particularly interested in comments on the citizenship survey. We have not yet administered it and are having trouble determining how to use it without overwhelming students with both surveys at the beginning of the course. We would also like suggestions on how to convert the questions related to the press release into multiple-choice questions. Early versions of the fundamental and technical literacy survey used short-answer questions, but grading them proved to be a major burden, so we converted them to multiple choice to make analysis quicker and more manageable.

Comments on the structure and content of the fundamental and technical literacy survey would also be valuable.

Evaluation Materials

Additional Instructions and Rubrics