Scientists learn from data. Learning to learn from data is obviously an essential aspect of the education of a future scientist.
These days, however, many other kinds of people also learn from data--including business people, investors, education leaders, and people who care about pollution, disease, or the quality of their local schools. My daily newspaper is rich in data-based graphs and maps--and so is the newsletter from my local library. Increasingly, learning to learn from data is a necessary part of everyone's education.
However, learning to learn from data is not a typical part of everyone's education. This post explores what might be required to construct a thorough learning progression for learning from Earth Science data, beginning where a good elementary school leaves off and carrying on through to what an upper level college course or adult job might demand.
A learning progression is a "hypothesized description of the successively more sophisticated ways [that] student thinking about an important domain of knowledge or practice develops as children learn about and investigate that domain over an appropriate span of time" (Corcoran et al., 2009). Neither I nor anyone else is ready to flesh out a full-fledged, research-supported learning progression about learning from data, but these thoughts from my December 2011 AGU talk pin down aspects of what are sometimes called the "lower anchor" and "upper anchor" of such a learning progression: the learning performances that exemplify the beginning and end of the progression.
In a good elementary or middle school science program, students have opportunities to get out in nature and collect data themselves about their local environment. Classic data collection opportunities for kids include making weather observations with hand-held instruments or, as shown below (left), making measurements in their local stream or estuary. By the time they get to college, however, students are expected to interpret professionally collected data, drawn from the treasure trove freely provided via the Internet by data-rich agencies and universities.
This distinction is important because when people collect data themselves, they have a chance to pick up a deeper understanding of the process by which this particular aspect of Nature was turned into numbers, and in particular what some limits might be on the validity of the data. In the process of making "first inscriptions," they can develop an embodied, holistic sense of the setting or environment from which the data were extracted, and then draw on this understanding when it comes time to interpret what Earth processes caused their data set to be the way it is. When working from data collected by others, a sense of the data-acquisition process has to come indirectly, from bloodless metadata and a theoretical understanding of the instrumentation used.
When kids collect data themselves, they collect a few dozen or maybe a few hundred data points. They can create an appropriate data display by hand, with pencil and paper. College-level manipulation of professionally collected data can involve millions of data points, impossible to contemplate one by one. Data visualization software of one sort or another comes into play, and each data visualization tool comes with a learning curve. In my opinion, the hardest part is not learning to manipulate the software to make an appropriate display, but rather learning to see the display in terms of trends and processes rather than as dots, wiggles, or blotches of color.
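To make that shift concrete, here is a minimal sketch, assuming Python with numpy and matplotlib (none of which is named in the post, and the temperature record is synthetic): a year of one-minute readings that no one could plot by hand, and that only becomes meaningful when the eye moves from individual dots to the seasonal trend.

```python
import numpy as np
import matplotlib.pyplot as plt

# A synthetic year of one-minute readings: 525,600 points, far too many
# to plot by hand or to contemplate one by one.
minutes = np.arange(365 * 24 * 60)
temperature = (
    10.0                                                  # annual mean
    + 8.0 * np.sin(2 * np.pi * minutes / minutes.size)    # one seasonal cycle
    + np.random.normal(0.0, 0.5, minutes.size)            # instrument noise
)

fig, ax = plt.subplots()
ax.plot(minutes / (24 * 60), temperature, linewidth=0.2)
ax.set_xlabel("Day of year")
ax.set_ylabel("Temperature (degrees C)")
ax.set_title("The skill is seeing the seasonal trend, not the half-million dots")
plt.show()
```

The software does the drawing; the learner still has to supply the interpretation.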
Kids' data interpretation activities tend to focus on one data set and one data type at a time. Data interpretation tasks faced by college students and adults frequently involve two or more data sets, which may be of varied types. Questions about the interactions between the aspects of reality represented by the two or more data sets call for a new set of skills, both logical and statistical, to sort through the range of possibilities: Are A and B related? Might A be causing or influencing B? Might B be causing or influencing A? Might some other factor C be causing or influencing both?
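As a hedged illustration of the statistical side of that skill set (Python with numpy and scipy; the series are synthetic stand-ins for A, B, and a hidden C, not anything from the post), the sketch below shows how two variables can be strongly correlated without either one causing the other. Deciding among the causal possibilities is a reasoning task the statistic alone cannot settle.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
c = rng.normal(size=500)                       # a hidden common influence C
a = 0.7 * c + rng.normal(scale=0.5, size=500)  # A responds to C (plus noise)
b = 0.6 * c + rng.normal(scale=0.5, size=500)  # B also responds to C

r, p = pearsonr(a, b)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
# A and B come out strongly correlated even though neither causes the other;
# both are driven by C. The statistic flags a relationship, but sorting out
# A->B, B->A, or C->both requires reasoning beyond the correlation itself.
```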
When kids are first taught about graphs, they are typically taught that graphs are useful for looking up stuff. This skill can be taught and learned in cookbook fashion (below, left). Q: "What was the salinity of the Hudson River at the Beczak Station at noon on April 16?" A: Go across the horizontal axis until you find noon on April 16. Go up until you hit the data line. Go across until you hit the vertical axis. Read off the value: 7000 ppm. A harder-to-teach skill is to interpret the pattern or trend of a graph taken in its entirety (below, right).
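The same contrast shows up when the data live in software rather than on paper. In the hypothetical sketch below (Python with pandas; the timestamps and all values except the 7000 ppm reading are invented, not actual Beczak Station data), the look-up question is a one-liner, while interpreting the whole record has no one-line answer.

```python
import pandas as pd

# Invented salinity record for a single day; only the noon value (7000 ppm)
# comes from the example in the post.
salinity = pd.Series(
    [5200, 6100, 7000, 6800, 6300],
    index=pd.to_datetime([
        "2011-04-16 06:00", "2011-04-16 09:00", "2011-04-16 12:00",
        "2011-04-16 15:00", "2011-04-16 18:00",
    ]),
    name="salinity_ppm",
)

# The look-up skill: what was the salinity at noon on April 16?
print(salinity.loc["2011-04-16 12:00"])  # -> 7000

# The harder skill has no one-liner: it means examining the whole series
# and asking what the rise and fall says about tides or freshwater input.
print(salinity.describe())
```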
In the data interpretation activities set up for kids, students can typically rely on the common-sense lines of reasoning that serve them in everyday life. In college or adulthood, the requisite lines of reasoning become more complicated, involving multi-step chains of inference and drawing on temporal, spatial, quantitative, and systems thinking.
In summary then, we find that learning to learn from data at a sophisticated level is a complicated cognitive challenge, encompassing many sub-challenges.
In reflecting back on my own education, I don't know how this transformation was accomplished. I don't remember being explicitly taught any of this in an earth science course. It seems like I picked it up by osmosis, or maybe by trial and error--but that seems unlikely. How did you learn to learn from data?
See also: Kastens, K. A., and Turrin, M., 2010, Earth Science Puzzles: Making Meaning from Data: Washington, D.C., National Science Teachers Association, 186 p. Available from the NSTA Press bookstore.
References:
Clement, J., 2002, Graphing, in Lehrer, R., and Schauble, L., eds., Investigating Real Data in the Classroom: New York, Teachers College Press, p. 63-73.
Corcoran, T., Mosher, F. A., and Rogat, A., 2009, Learning Progressions in Science: An Evidence-based Approach to Reform: Center on Continuous Improvement of Instruction, Teachers College.
Kastens, K., and Turrin, M., 2011, Geoscience data puzzles: Developing students' ability to make meaning from data, Abstract ED11C-04, in 2011 Fall Meeting, AGU San Francisco, Calif., 13-17 December.
Wainwright, S., 2002, Shadows, in Lehrer, R., and Schauble, L., eds., Investigating Real Data in the Classroom: New York, Teachers College Press.
I learned how to reason from data by going to college, majoring in Physics, then graduate school in physics. The best lab class I had was an electricity and magnetism lab where I had to be especially careful to determine measurement errors and understand how my measuring equipment worked.
As I taught Earth Science at UCSB, I always wished I could get students involved with real Earth data, and, finally, computer technology allowed me to do this. I also observe that in many kinds of data investigations, the collection of data seems to dominate, while the analysis and the effort of relating it to a larger science picture are not given the attention they deserve.
For my introductory oceanography class at UCSB, I selected plate tectonics to develop my data exploration activities, and later expanded to the monsoon, climate, and global fisheries. I developed a collaboration with Greg Kelly, a science education prof in UCSB's graduate school of education. This collaboration was very useful in identifying problems students had in creating a science argument.
If you want to know more about the materials and ideas I developed for my oceanography class at UCSB, go to:
es.earthednet.org/
and
es.earthednet.org/node/32