published June 26, 2013

SSI 2013 Opening Plenary Speaker to Address Complexities of Scientific Communication and Cultural Cognition

Dan Kahan, the Elizabeth K. Dollard Professor of Law and Professor of Psychology at Yale Law School, will join us at Santa Clara University to deliver the SSI 2013 opening plenary address, exploring research in cultural cognition and scientific communication. In addition to the opening plenary, Professor Kahan will also address participants in the pre-Institute Symposium on Science and Public Policy.

David Burns, NCSCE executive director, notes, "We in the SENCER community and at the National Center for Science and Civic Engagement have a stake in understanding Dan's research. It is also easy to imagine our interest in a corollary challenge, the question of what it is that determines or influences a person's 'processing' of academic and scientific claims for his/her own use as a citizen in a democratic society. Thus, we are truly honored and excited that Dan has agreed to help us think about these things as a participant in our science and public policy symposium and as our opening keynote speaker at the SENCER Summer Institute 2013."

Professor Kahan's areas of research include risk perception, criminal law, and evidence. Prior to coming to Yale in 1999, he was on the faculty of the University of Chicago Law School. He also served as a law clerk to Justice Thurgood Marshall of the U.S. Supreme Court (1990-91) and to Judge Harry Edwards of the United States Court of Appeals for the D.C. Circuit (1989-90). He received his B.A. from Middlebury College and his J.D. from Harvard University.

For more of David's reflections on Professor Kahan's significant work and contributions to the NCSCE community, please continue reading.

The National Academy of Sciences has been justifiably concerned about what it has termed "science communication."

Last year's NAS Arthur M. Sackler Colloquium, The Science of Science Communication, provided a comprehensive overview of the challenges the scientific community faces as it engages with the American people, be they students, opinion leaders, policy-makers, or members of the larger body politic. Videos of colloquium presentations by noted scholars, such as Nobel Laureate Daniel Kahneman of Princeton University, who delivered the Annual Sackler Lecture, "Thinking That We Know," can be found on the NAS website. I heartily recommend a visit to everyone in the SENCER community.

One featured speaker at last year's event, whose work has been especially interesting to me since I was introduced to it by David Targan at Brown and Matt Fisher at St. Vincent's, is the distinguished researcher and teacher Dan Kahan of Yale Law School. As you can see from his bio, Dan is a professor of both psychology and law and has organized a center focused on what is called "cultural cognition." One question on the minds of the Center's research team has been: what accounts for the fact that higher degrees of scientific literacy do not predict a greater likelihood of "believing" or "accepting" scientific conclusions? (I refer here to conclusions that are well established and essentially noncontroversial within the scientific community.)

It is easy to understand why the National Academy, which was chartered specifically to render scientific advice to the executive branch and the Congress, would be interested in Dan and his team's research. It is also fairly easy to imagine why we in the SENCER community and at the National Center for Science and Civic Engagement would have a stake in understanding Dan's research. It is also easy to imagine our interest in a corollary challenge, the question of what it is that determines or influences a person's "processing" of academic and scientific claims for his/her own use as a citizen in a democratic society.

Thus, we are truly honored and excited that Dan has agreed to help us think about these things as a participant in our science and public policy symposium and as our opening keynote speaker at the SENCER Summer Institute 2013.

It is tempting, I think, to believe that those with more advanced "literacy" in science and mathematics would be more inclined to "understand" and then to believe, trust, and even act upon or follow the advice of scientific experts when they make claims and recommendations about important matters, especially on issues of public controversy.

Who in higher education would like to think anything other than that more knowledge, more capacity for discernment, more ability at data analysis, and more facility at evaluating and weighing evidence will lead to greater trust in, and willingness to accept, the vetted, peer-reviewed, tested, and proven claims of a reputable scientific establishment (or, for that matter, high-quality scholarship from any field)? One could say that efforts designed to improve general education, particularly in the STEM fields, are justifiable precisely because they advance a specific goal: to increase our population's "scientific literacy" and, as one would reason, thus increase our capacity to deal with and make decisions about important questions where scientific knowledge is relevant.

It turns out, however, that higher degrees of scientific literacy do not necessarily entail, nor do they necessarily predict, the acceptance of claims by the scientific establishment, particularly about those topics and issues for which there is a counter narrative. This raises a whole welter of questions:

- If this is true, then are the efforts on our part—to teach greater powers of analysis, critical thinking, and discernment—in vain? If not, how do we need to change what and how we teach?
- What's behind this phenomenon? Why doesn't higher "literacy" lead to increased credibility for claims that meet the highest "tests" of legitimacy?
- If literacy and scientific knowledge don't predict a person's views on controversial topics, then what does?
- How broad is this problem? Why do some people accept certain scientific conclusions while rejecting others that "experts" regard as being just as legitimate as the ones they accept?
- What does knowing about "System 1 and System 2 thinking" and "cultural cognition" tell us about what we need to do to improve scientific communication?
- Finally, what does all this have to do with teaching and learning?

Teaching is certainly a form of communication. Though students may be able to demonstrate mastery of the material "taught," this demonstration may or may not signal a willingness or likelihood to trust what has been learned and to follow the implications of that learning. (Students may tell you what you want to hear, and give you the right answer on the test, but that answer may reflect something very different from what they actually believe and what they will act upon sometime in the future.) This problem is not one that simply calls for alternate pedagogical or assessment techniques.

Why one kind of knowledge can be taught, tested, and "proven" and yet be discarded when human decisions must be made is a very interesting question, but it isn't a new one. What may qualify as new is something else: the tendency right now to construct and offer a "counter science" to contend and contest with "establishment" science, in a deliberate effort to call into question (or introduce sufficient doubt about) scientific claims. The novelty is the manufactured appearance of an argument within science when, in fact, there is none and the debate is largely settled. There is a debate, to be sure, but to pretend that it takes place within science helps no one, except perhaps those who would prefer to have their interests remain opaque. This needn't be so.

What I like about Dan Kahan's work, just as what I like about the approach to learning and to the civic sphere that I think is embodied in SENCER, is that both of our approaches acknowledge (without necessarily agreeing with or admiring the content of) the plurality of beliefs, interests, motives, and needs that people have, as well as what we could call the contexts and communities in which people are situated. It follows, for me, at least, that these interests need to be understood, appreciated, and to an extent greater than is the case now, respected, in the most basic sense of that word.

Let me give you an example of what I mean: living here in central New Jersey, I am inclined to support a ban on hydrofracking in New York and Pennsylvania because that gas extraction process appears to pose a potential threat to rivers I like and to the water table. Fracking might impair the quality of drinking water, even from my own well. I may be entertaining an exaggerated sense of risk, because, in the most immediate sense, nothing but potential risk seems to inhere in the proposed drilling, at least for me (the price of natural gas is affordable to me right now). I don't really need to know the science, the geology, the hydrology, whatever; a whiff of scientific legitimacy may be enough for me. With that whiff of science to help me, I can stay comfortable in that most basic of conservative inclinations: leave well enough alone.

On the other hand, if I were a struggling dairy farmer in Pennsylvania, I might prefer to discount the risks of hydrofracking (or subscribe to a "counter-science" that denies risk or asserts there is no risk altogether), if that meant that I could, in good conscience, lease the gas and mineral rights to my land so that, with that supplemental income, my family and I could continue to earn a living doing what we have done for generations. Without the income from the leases, my dairy farm would be facing bankruptcy. (This latter example comes from a video advertisement that makes this precise claim: that the income from gas extraction has enabled the continuity of dairy farming, complete with arresting visuals of beautiful fields and the cleanest Holsteins you have ever seen! The alternative: the abandonment of a centuries-old tradition, one central to an almost Jeffersonian agrarian narrative; the threat of the loss of a way of life.)

In How People Learn, John Bransford reminds us that children, students, indeed all of us, have myriad prior ideas and beliefs (knowledge) in our minds about the matters we are being taught. We are not empty vessels waiting to be filled with new information, but rather people who often have views, not to mention whole epistemological frameworks, that will impede the acceptance of new knowledge. Similarly, the work that Dan and his colleagues are doing acknowledges that certain human inclinations (orientations toward authority, degrees of identification with others, and other dispositions, along with what I would call more traditional identifications of interest, as in my hydrofracking example) will trump, or at least override, other truth claims.

For me, at least, one reason for paying attention to Dan's research and thinking is that it offers an alternative to "counter science"—or what some have called pseudoscience—in its frank acknowledgment that factors that have legitimacy, or at least saliency, are in play when matters of controversy are presented. One role for us as educators might very well be to make space in our arguments for these interests to be present and acknowledged, while at the same time, we work to eliminate or at least reduce the "pollution," to use Dan's term, manufactured in the service of, or catering to, certain interests and inclinations, but sometimes masquerading as a critique of high-quality, established science.

In enlarging the context in which scientific arguments are proffered and made, and in including considerations that have little or nothing to do with science but a lot to do with other things that people think matter, whether in the classroom or a legislative body, we can find ways to protect a whole range of interests. We can come to understand a complex web of motives and needs, and to protect and preserve, if you will, the boundaries of different ways of knowing. In short, when we make this kind of space available, we will be enacting or realizing some of the original SENCER ideals (2000), paraphrased here:
- reveal[ing] the limits of science by identifying the elements of public issues where science doesn't help us decide what to do,
- show[ing] the power of science by identifying the dimensions of a public issue that can be better understood with certain mathematical and scientific ways of knowing,
- locat[ing] the responsibilities (the burdens and the pleasures) of discovery as the work of the student, and
- encourag[ing]...engagement with the "multidisciplinary trouble" [of complex, capacious, and largely unsolved problems] to help students overcome both unfounded fears and unquestioning awe of science.

We welcome Dan Kahan's help, and the contributions of all who will be attending SSI 2013, as we think about how to improve STEM learning in the service of our democracy.

Wm. David Burns
June 2013
