In-Depth Assessment of a Large-Enrollment Scientific Computing Workshop to Evaluate Instructional Design and Foster an Online Experiential Learning Cycle

Poster Session (part of the Monday Poster Session)
Monday, 4:30 pm-6:00 pm, Quad

Authors

Gillian B Haberli, EarthScope Consortium
Mike Brudzinski, Miami University-Oxford
Michael Hubenthal, EarthScope Consortium

The increasingly digital nature of course assignments has created an opportunity for more in-depth assessment of student learning and for evaluation of course design with respect to learning outcomes. The online Seismology Skill Building Workshop was first offered in Summer 2020, with minor changes each summer since. The workshop's goal is to help advanced undergraduates and recent graduates build scientific computing skills through seismology-specific programming in a Massive Open Online Course (MOOC) format. The online delivery and extensive set of assignments enable a uniquely detailed analysis of student performance across a variety of tasks. For each of the ~1,000 assignment questions, we developed a coding system that categorizes the learning skills required and the level of revised Bloom's taxonomy. These categories were combined with each question's facility and discrimination indices to assess students' hands-on learning of seismology and scientific computing.
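The facility and discrimination indices follow standard item-analysis definitions: facility is the fraction of students answering a question correctly, and discrimination compares performance on a question between high- and low-scoring students. The sketch below is a minimal illustration of those standard definitions, not the authors' implementation; the binary (correct/incorrect) scoring, the NumPy-based layout, and the upper/lower 27% grouping are assumptions.

import numpy as np

def facility_index(scores):
    """Facility index per question: fraction of students answering correctly.
    `scores` is a (students x questions) array of 0/1 values (assumed format)."""
    return scores.mean(axis=0)

def discrimination_index(scores, frac=0.27):
    """Discrimination index per question: facility among the top-scoring group
    minus facility among the bottom-scoring group, where groups are the top and
    bottom `frac` of students ranked by total score (27% is a common convention,
    assumed here rather than taken from the abstract)."""
    totals = scores.sum(axis=1)
    order = np.argsort(totals)
    n_group = max(1, int(round(frac * scores.shape[0])))
    lower = scores[order[:n_group]]
    upper = scores[order[-n_group:]]
    return upper.mean(axis=0) - lower.mean(axis=0)

# Hypothetical example: 6 students x 3 questions
scores = np.array([
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 1],
    [0, 1, 0],
])
print(facility_index(scores))        # per-question fraction correct
print(discrimination_index(scores))  # higher values indicate questions that better separate strong and weak performers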
This study identified that the degree of higher-order thinking required was lower than previously assumed, which likely impeded the goal of hands-on learning. Drawing on Bloom's taxonomy and the Experiential Learning Cycle, we posit that online hands-on learning should involve a cycle of observing, remembering, applying, and analyzing. Changes were enacted to add more analysis-focused questions and to ensure that specific skills are exercised, so that the learning objectives are better met. Question categorization was also used to study the effect of fading supportive pedagogical elements on students' subject mastery, with the aim of improving participants' independence. We accomplished this by progressively reducing prompting and recall of prior information, scaling up to higher-order levels of Bloom's taxonomy, and increasing skill requirements. Evaluation of these improvements will be conducted in Summer 2023.
The framework we developed for evaluating assignments and student performance at a granular level has successfully identified mismatches between course design and learning outcomes. Our work indicates that an online experiential learning cycle is an important concept for instructional design as online formats are increasingly used to broaden access to scientific training.