Designing Shared Cloud-Based Geophysical Courses: Early Evaluation of Infrastructure, Pedagogy, and Learner Confidence
Authors
Tammy Bravo, EarthScope
Gillian B. Haberli, EarthScope
Melissa Weber, EarthScope
Michael Hubenthal, EarthScope
As geoscience research becomes increasingly computational, many students encounter barriers related to software installation, hardware access, and limited prior coding experience. These barriers can restrict access to authentic data workflows and limit participation in computationally intensive subfields. Building on research from the Seismology Skill Building Workshop (SSBW), which has demonstrated gains in learner skills and confidence as well as longer-term impacts on broadening participation, the NSF National Geophysical Facility operated by EarthScope is developing a Computing and Data Science Academy that integrates shared cloud infrastructure with research-informed course design.
Within this Academy, a growing portfolio of technical courses uses a JupyterHub-based cloud environment (GeoLab) paired with scaffolded learning activities. By eliminating local installation requirements and standardizing computing environments across institutions, the Academy aims to reduce technical friction while expanding access to authentic seismological and geodetic datasets held by the Facility. This effort also addresses a broader ecosystem gap: computational geophysical workflows are often not systematically taught within traditional curricula.
Our current work focuses on documenting instructional design principles, course structures, and early evaluation strategies across an expanding catalog of both Facility-developed and community-led courses. These courses serve a broad audience of undergraduate students, graduate students, and early-career researchers developing computational skills in geophysics. Because the Academy supports both centrally developed offerings and courses led by community instructors, its evaluation strategies must balance shared instructional goals with the flexibility needed for independently designed courses. We are developing instruments and collecting baseline data to examine learner confidence, perceived competence, and persistence in computational geoscience contexts. This contribution shares our emerging evaluation framework, early patterns observed across courses, and open questions about how shared infrastructure and intentional pedagogy may support learner confidence and continued engagement.
We invite feedback on research design, evaluation strategies, and collaborative opportunities as this work scales.


