
Assessment in the Context of Evaluation at James Madison University

Eric J. Pyle, Department of Geology & Environmental Science, James Madison University


The Department of Geology & Environmental Science at James Madison University is a large one among undergraduate departments, with 14 tenured or tenure-track faculty and approximately 75-80 majors in two degree programs. Within the College of Science & Mathematics, however, it is the smallest in terms of faculty, though equal to the Physics & Astronomy Department in number of majors. Geology faculty nevertheless generate at least 50% of the student credit hours for science coursework in the college's general education program. As a result, the assessment of department activity relative to teaching and learning is an important factor in program evaluation.

In the recent past, assessment within the Department of Geology & Environmental Science at James Madison University has been defined in the broadest terms. In the absence of a set of standards provided by a professional society, this may not be uncommon among geoscience departments. Assessment has been confined to measurements of either (a) specific course goals and student mastery of those goals, or (b) general statements about the needs of the programs and department vision statements. These data have been largely retrospective and provide less information on future directions than one might expect.

Goals & Objectives

As part of the development of our Academic Progress Report during the 2007-2008 year, program objectives were carefully examined and vetted among the faculty. General program goals and objectives for JMU Geology & Environmental Science degree programs were developed. The sub-committee charged with defining the goals and objectives also defined specific goals and objectives for degree candidates, such that a general set of expectations reflecting the cognitive, affective, and psychomotor (skills) domains could be applied in a measurable manner. As applied to degree candidates, the attainment of these goals would represent the satisfactory completion of a program, and a primary avenue by which the success of each program might be evaluated.

There is a general recognition among faculty in the department of the importance of Cognitive, Affective, and Psychomotor (skills) objectives in the degree programs. These objectives have been mapped through a set of matrices, stipulating in what courses the knowledge, skills, and dispositions are introduced, reinforced, and subsequently built upon. Faculty members were asked to define the core courses that they are responsible for in terms of the subset of the knowledge/skills/dispositions most appropriate to the course. These data are being compiled currently, with a new assessment committee charged with this task.

Assessment of Students in the Department

The actual number of geoscience degree candidates in any given year has been relatively small compared with other majors. There is relatively little literature on program assessment practices in the geosciences, and the professional societies have not developed a standardized set of instruments to provide such information. Given the complex and interdisciplinary nature of the geosciences, especially as applied to our programs, it became evident that a performance assessment protocol was desirable. Therefore a performance assessment was developed for all majors, representing components of the cognitive, affective, and psychomotor domains. In developing a rubric in conjunction with this protocol, it was anticipated that levels of performance would reflect introductory, transitional, and expert levels among participating students. The assessment protocol was piloted on JMU Assessment Day 2008, with all BS and BA candidates strongly encouraged to participate in a performance assessment task.

The objective mapping exercise contributed substantially to the development of this performance assessment protocol, whereby students at different stages of degree progress are expected to perform at differing levels of expertise. The protocol provides a visual representation of a geologic situation and a verbal description, and allows access to samples of materials from the site. Students provide written responses to a set of prompts asking for information that expresses declarative and procedural knowledge, habits of mind, the application of skills, and the clear communication of each aspect. The protocol was initially piloted at the JMU Assessment Day in 2008, and participation will be mandatory for students in 2009. Preliminary analysis indicates that the general premise of documenting student development in this manner is valid, although inter-rater reliability is receiving close examination.

Other Sources of Data

The General Education testing program at JMU has been able to provide little data to the department, as often too few department majors are selected to take the Cluster 3 – Scientific Perspectives examination on Assessment Day in a given year for the results to have statistical meaning. These assessments are administered by the JMU Center for Assessment and Research Studies (CARS) on the second Tuesday of each February. Students who have not yet completed 70 hours are required to participate in Assessment Day and are randomly selected to take an examination in 1-2 of the General Education program clusters. As an ethical matter, however, it is the policy of CARS not to release data related to the instruction provided by any individual faculty member or department. Thus, data tied to a large portion of the instructional activity of the department are not available for program assessment or evaluation, for either department majors or general education students.


As a primary data source, student evaluations of instruction (SEIs) have traditionally been used to gauge the value of instruction in a particular course, and implicitly the efficacy of faculty members in delivering that instruction. That said, the direct utility of SEIs is limited by the reflective nature of the questions, the range of student performance in a class, and the immediacy of student experience in an individual class relative to the overall curriculum, such that students see only the "trees" and not the "forest." Development of this instrument continues, with an analysis of individual questions as well as consideration of how the somewhat complex item responses may best be presented.

One area of attention can be specifically linked to science teacher education. JMU has a history of teacher education as the former public normal school for Virginia, and teacher education captures a large share of majors at JMU to this day. To maintain accreditation, the College of Education at JMU undergoes periodic review by NCATE, which assigns the review of science teacher education programs to NSTA. The standards established by NSTA require that a clear and explicit role for content departments be documented, in terms of establishing the competence of teacher education candidates in content knowledge as well as in the skills and dispositions valued within the discipline. This is a current focus of department assessment activities, one that will remain a challenge in the near future, but one in which the department has taken a clear lead within the College.
