[I’m no scientist. I even stopped being any good at it in school when we hit physics in my junior year. But science has always fascinated me—especially when it comes to things like new discoveries and using the techniques of the arts and humanities to teach or communicate science. People who ignore or even disparage science are another interest of mine—I don’t get how even moderately educated people can take that attitude. So the two recent articles below, one from the Washington Post and the other from the New York Times, caught my attention. I downloaded and saved them for use on ROT, and now’s a good opportunity to share them with the blog’s readers.]
“WHY AMERICANS ARE SO DUBIOUS ABOUT SCIENCE”
by Joel Achenbach
[The article below first appeared in the “Outlook” section of the Washington Post of 15 February 2015.]
The Post’s Joel Achenbach says the evidence
often conflicts with our experience
There’s a scene in Stanley Kubrick’s comic masterpiece
“Dr. Strangelove” in which Jack D. Ripper, an American general who’s gone rogue
and ordered a nuclear attack on the Soviet Union, unspools his paranoid
worldview — and the explanation for why he drinks “only distilled water, or
rainwater, and only pure grain alcohol” — to Lionel Mandrake, a
dizzy-with-anxiety group captain in the Royal Air Force.
Ripper: “Have you ever heard of a thing called fluoridation?
Fluoridation of water?”
Mandrake: “Ah, yes, I have heard of that, Jack. Yes, yes.”
Ripper: “Well, do you know what it is?”
Mandrake: “No. No, I don’t know what it is, no.”
Ripper: “Do you realize that fluoridation is the most
monstrously conceived and dangerous communist plot we have ever had to face?”
The movie came out in 1964, by which time the health
benefits of fluoridation had been thoroughly established and anti-fluoridation
conspiracy theories could be the stuff of comedy. Yet half a century later,
fluoridation continues to incite fear and paranoia. In 2013, citizens in
Portland, Ore., one of only a few major American cities that don’t
fluoridate, blocked a plan by local officials to do so. Opponents
didn’t like the idea of the government adding “chemicals” to their water. They
claimed that fluoride could be harmful to human health.
Actually, fluoride is a natural mineral that, in the weak
concentrations used in public drinking-water systems, hardens tooth enamel and
prevents tooth decay — a cheap and safe way to improve dental health for
everyone, rich or poor, conscientious brushers or not. That’s the scientific
and medical consensus.
To which some people in Portland, echoing anti-fluoridation
activists around the world, reply: We don’t believe you.
We live in an age when all manner of scientific knowledge —
from the safety of fluoride and vaccines to the reality of climate
change — faces organized and often furious opposition. Empowered by their own
sources of information and their own interpretations of research, doubters have
declared war on the consensus of experts. There are so many of these
controversies these days, you’d think a diabolical agency had put something in
the water to make people argumentative.
Science doubt has become a pop-culture meme. In the recent
movie “Interstellar,” set in a futuristic, downtrodden America where NASA
has been forced into hiding, school textbooks say the Apollo moon landings were
faked.
In a sense this is not surprising. Our lives are permeated
by science and technology as never before. For many of us this new world is
wondrous, comfortable and rich in rewards — but also more complicated and
sometimes unnerving. We now face risks we can’t easily analyze.
We’re asked to accept, for example, that it’s safe to eat
food containing genetically modified organisms (GMOs) because, the experts
point out, there’s no evidence that it isn’t and no reason to believe that
altering genes precisely in a lab is more dangerous than altering them
wholesale through traditional breeding. But to some people, the very idea of
transferring genes between species conjures up mad scientists running amok —
and so, two centuries after Mary Shelley wrote “Frankenstein,” they talk about
Frankenfood.
The world crackles with real and imaginary hazards, and
distinguishing the former from the latter isn’t easy. Should we be afraid that
the Ebola virus, which is spread only by direct contact with bodily fluids,
will mutate into an airborne super-plague? The scientific consensus says
that’s extremely unlikely: No virus has ever been observed to completely
change its mode of transmission in humans, and there’s zero evidence that the
latest strain of Ebola is any different. But Google “airborne Ebola” and you’ll
enter a dystopia where this virus has almost supernatural powers, including the
power to kill us all.
In this bewildering world we have to decide what to believe
and how to act on that. In principle, that’s what science is for. “Science is
not a body of facts,” says geophysicist Marcia McNutt, who once headed the U.S.
Geological Survey and is now editor of Science, the prestigious journal.
“Science is a method for deciding whether what we choose to believe has a basis
in the laws of nature or not.”
The scientific method leads us to truths that are less than
self-evident, often mind-blowing and sometimes hard to swallow. In the early
17th century, when Galileo claimed that the Earth spins on its axis and orbits
the sun, he wasn’t just rejecting church doctrine. He was asking people to
believe something that defied common sense — because it sure looks like the
sun’s going around the Earth, and you can’t feel the Earth spinning. Galileo
was put on trial and forced to recant. Two centuries later, Charles Darwin
escaped that fate. But his idea that all life on Earth evolved from a
primordial ancestor and that we humans are distant cousins of apes, whales and
even deep-sea mollusks is still a big ask for a lot of people.
Even when we intellectually accept these precepts of
science, we subconsciously cling to our intuitions — what researchers call our
naive beliefs. A study by Andrew Shtulman of Occidental College showed that even students with an
advanced science education had a hitch in their mental gait when asked to
affirm or deny that humans are descended from sea animals and that the Earth
goes around the sun. Both truths are counterintuitive. The students, even those
who correctly marked “true,” were slower to answer those questions than
questions about whether humans are descended from tree-dwelling creatures (also
true but easier to grasp) and whether the moon goes around the Earth (also true
but intuitive).
Shtulman’s research indicates that as we become
scientifically literate, we repress our naive beliefs but never eliminate them
entirely. They nest in our brains, chirping at us as we try to make sense of
the world.
Most of us do that by relying on personal experience
and anecdotes, on stories rather than statistics. We might get a
prostate-specific antigen test, even though it’s no longer generally
recommended, because it caught a close friend’s cancer — and we pay less
attention to statistical evidence, painstakingly compiled through multiple
studies, showing that the test rarely saves lives but triggers many unnecessary
surgeries. Or we hear about a cluster of cancer cases in a town with a
hazardous-waste dump, and we assume that pollution caused the cancers. Of
course, just because two things happened together doesn’t mean one caused the
other, and just because events are clustered doesn’t mean they’re not random.
Yet we have trouble digesting randomness; our brains crave pattern and meaning.
Even for scientists, the scientific method is a hard
discipline. They, too, are vulnerable to confirmation bias — the tendency to
look for and see only evidence that confirms what they already believe. But
unlike the rest of us, they submit their ideas to formal peer review before
publishing them. Once the results are published, if they’re important
enough, other scientists will try to reproduce them — and, being
congenitally skeptical and competitive, will be very happy to announce that
they don’t hold up. Scientific results are always provisional, susceptible to
being overturned by some future experiment or observation. Scientists rarely
proclaim an absolute truth or an absolute certainty. Uncertainty is inevitable
at the frontiers of knowledge.
That provisional quality of science is another thing a lot
of people have trouble with. To some climate-change skeptics, for example, the
fact that a few scientists in the 1970s were worried (quite reasonably, it
seemed at the time) about the possibility of a coming ice age is enough to
discredit what is now the consensus of the world’s scientists: The
planet’s surface temperature has risen by about 1.5 degrees Fahrenheit in the
past 130 years, and human actions, including the burning of fossil fuels, are
extremely likely to have been the dominant cause since the mid-20th century.
It’s clear that organizations funded in part by the
fossil-fuel industry have deliberately tried to undermine the public’s
understanding of the scientific consensus by promoting a few skeptics. The
news media gives abundant attention to such mavericks, naysayers, professional
controversialists and table thumpers. The media would also have you believe
that science is full of shocking discoveries made by lone geniuses. Not so. The
(boring) truth is that science usually advances incrementally, through the
steady accretion of data and insights gathered by many people over many years.
So it has been with the consensus on climate change. That’s not about to go poof
with the next thermometer reading.
But industry PR, however misleading, isn’t enough to explain
why so many people reject the scientific consensus on global warming.
The “science communication problem,” as it’s blandly called
by the scientists who study it, has yielded abundant new research into how
people decide what to believe — and why they so often don’t accept the expert
consensus. It’s not that they can’t grasp it, according to Dan Kahan of Yale
University. In one study he asked 1,540 Americans, a representative sample, to
rate the threat of climate change on a scale of zero to 10. Then he correlated
that with the subjects’ science literacy. He found that higher literacy
was associated with stronger views — at both ends of the spectrum. Science
literacy promoted polarization on climate, not consensus. According to Kahan,
that’s because people tend to use scientific knowledge to reinforce their
worldviews.
Americans fall into two basic camps, Kahan says. Those with
a more “egalitarian” and “communitarian” mind-set are generally suspicious of
industry and apt to think it’s up to something dangerous that calls for
government regulation; they’re likely to see the risks of climate change. In
contrast, people with a “hierarchical” and “individualistic” mind-set respect
leaders of industry and don’t like government interfering in their affairs;
they’re apt to reject warnings about climate change, because they know what
accepting them could lead to — some kind of tax or regulation to limit
emissions.
In the United States, climate change has become a litmus
test that identifies you as belonging to one or the other of these two
antagonistic tribes. When we argue about it, Kahan says, we’re actually arguing
about who we are, what our crowd is. We’re thinking: People like us believe
this. People like that do not believe this.
Science appeals to our rational brain, but our beliefs are
motivated largely by emotion, and the biggest motivation is remaining tight
with our peers. “We’re all in high school. We’ve never left high school,” says
Marcia McNutt. “People still have a need to fit in, and that need to fit in is
so strong that local values and local opinions are always trumping science. And
they will continue to trump science, especially when there is no clear downside
to ignoring science.”
Meanwhile the Internet makes it easier than ever for science
doubters to find their own information and experts. Gone are the days when a
small number of powerful institutions — elite universities, encyclopedias and
major news organizations — served as gatekeepers of scientific information. The
Internet has democratized it, which is a good thing. But along with cable TV,
the Web has also made it possible to live in a “filter bubble” that lets in
only the information with which you already agree.
How to penetrate the bubble? How to convert science
skeptics? Throwing more facts at them doesn’t help. Liz Neeley, who helps train
scientists to be better communicators at an organization called Compass, says
people need to hear from believers they can trust, who share their fundamental
values. She has personal experience with this. Her father is a climate-change
skeptic and gets most of his information on the issue from conservative media.
In exasperation she finally confronted him: “Do you believe them or me?” She
told him she believes the scientists who research climate change and knows many
of them personally. “If you think I’m wrong,” she said, “then you’re telling me
that you don’t trust me.” Her father’s stance on the issue softened. But it
wasn’t the facts that did it.
If you’re a rationalist, there’s something a little
dispiriting about all this. In Kahan’s descriptions of how we decide what to
believe, what we decide sometimes sounds almost incidental. Those of us in the
science-communication business are as tribal as anyone else, he told me. We
believe in scientific ideas not because we have truly evaluated all the
evidence but because we feel an affinity for the scientific community. When I
mentioned to Kahan that I fully accept evolution, he said: “Believing in
evolution is just a description about you. It’s not an account of how you
reason.”
Maybe — except that evolution is real. Biology is
incomprehensible without it. There aren’t really two sides to all these issues.
Climate change is happening. Vaccines save lives. Being right does matter — and
the science tribe has a long track record of getting things right in the end.
Modern society is built on things it got right.
Doubting science also has consequences, as seen in recent
weeks with the measles outbreak that began in California. The people who
believe that vaccines cause autism — often well educated and affluent, by the
way — are undermining “herd immunity” to such diseases as whooping cough and
measles. The anti-vaccine movement has been going strong since a prestigious
British medical journal, the Lancet, published a study in 1998 linking a common
vaccine to autism. The journal later retracted the study, which was
thoroughly discredited. But the notion of a vaccine-autism connection has been
endorsed by celebrities and reinforced through the usual Internet filters.
(Anti-vaccine activist and actress Jenny McCarthy famously said on “The Oprah
Winfrey Show,” “The University of Google is where I got my degree from.”)
In the climate debate, the consequences of doubt are likely
to be global and enduring. Climate-change skeptics in the United States have
achieved their fundamental goal of halting legislative action to combat global
warming. They haven’t had to win the debate on the merits; they’ve merely had
to fog the room enough to keep laws governing greenhouse gas emissions from
being enacted.
Some environmental activists want scientists to emerge from
their ivory towers and get more involved in the policy battles. Any scientist
going that route needs to do so carefully, says Liz Neeley. “That line between
science communication and advocacy is very hard to step back from,” she says.
In the debate over climate change, the central allegation of the skeptics is
that the science saying it’s real and a serious threat is politically tinged,
driven by environmental activism and not hard data. That’s not true, and it
slanders honest scientists. But the claim becomes more likely to be seen as
plausible if scientists go beyond their professional expertise and begin
advocating specific policies.
It’s their very detachment, what you might call the
cold-bloodedness of science, that makes science the killer app. It’s the way
science tells us the truth rather than what we’d like the truth to be.
Scientists can be as dogmatic as anyone else — but their dogma is always
wilting in the hot glare of new research. In science it’s not a sin to change
your mind when the evidence demands it. For some people, the tribe is more
important than the truth; for the best scientists, the truth is more important
than the tribe.
[Joel
Achenbach is a science reporter at the Washington Post. A
version of this essay appears on the cover of National Geographic’s
March 2015 issue.]
* * * *
“NEW STAGE OF PROGRESS IN SCIENCE”
by Kenneth Chang
[This
report was originally published in “Science Times” of the New York Times on 3 March 2015.]
STONY BROOK, N.Y. —
Martha Furie stormed into the room and huffily sat down in a chair.
“Well, you know,
I’ve been working really hard, studying Lyme disease,” she said, her voice
tinged with disdain, to the woman sitting in the next chair. “It’s been a long
process. It’s hard to talk about it.”
The other woman,
Bernadette Holdener, was somewhat befuddled. “How does it make you feel?” she
asked.
“Lyme disease?” Dr.
Furie sneered. “It can have all sorts of bad things.”
The two were
participating in an improvisational acting exercise a couple of Fridays ago [20
February]. But they are not aspiring actresses or comedians. Dr. Furie is a
professor of pathology at Stony Brook University [State University of New
York at Stony Brook], Dr. Holdener a professor of biochemistry and cell
biology.
“Anyone have any
inkling what is going on?” asked one of the instructors for the session
— Alan Alda, the actor who played Hawkeye in the television series
“M*A*S*H” more than three decades ago.
The exercise, called
“Who am I?,” challenges one of the participants — Dr. Furie, in this case — to
convey an unstated relationship with another, and everyone else must try to
deduce the relationship. “She sounded very angry,” Dr. Holdener said.
People guessed
variously that Dr. Furie was a Lyme researcher who had contracted the disease,
that she had just been denied tenure and was venting to the head of her
department, or that she was expressing passive-aggressive anger toward her spouse.
“You’re so close,”
Mr. Alda said.
Dr. Furie explained
that Dr. Holdener “was my long-lost sister who stole my husband away.” The
other participants laughed at the convoluted, unlikely setup.
Mr. Alda said that
Dr. Furie, focusing on her role as a wronged sister, intently observed her
audience — Dr. Holdener — and the effect of her words. “What I find interesting
about this is you’re suddenly talking about your work in a way you’ve never
talked about it before,” Mr. Alda said.
The idea of teaching
improv to scientists came from Mr. Alda, now a visiting professor. The
objective is not to make them funny, but to help them talk about science to
people who are not scientists. The exercises encourage them to pay attention to
the audience’s reaction and adjust. “Not jokes, not cleverness,” Mr. Alda said.
“It’s the contact with the other person.”
Mr. Alda has long
held a deep interest in science. In the 1990s, he collaborated on “QED,” a play
about the physicist Richard Feynman, with Mr. Alda playing Dr. Feynman.
He also hosted 11
seasons of the PBS program “Scientific American Frontiers.” In
interviews with hundreds of scientists, he found that he could draw out
engaging explanations. “I didn’t go in with a list of questions,” Mr. Alda said
during a public lecture at Stony Brook the night before the workshop. “I just
listened to what they had to say and asked them questions that would help me
understand what their work was.”
But he recalled one
scientist who would switch from conversing with Mr. Alda to lecturing to the
camera. “And immediately, the tone of her voice changed,” Mr. Alda said. “Her
vocabulary changed. I couldn’t understand what she was saying.”
Mr. Alda started
suggesting to university presidents that they teach scientists how to present
their research to the public.
No one expressed
interest until 2007, when Mr. Alda visited Stony Brook and met Shirley Strum
Kenny, then the university’s president. “I thought, here’s my chance, I’ll go into
my pitch,” Mr. Alda said. “I said, ‘What do you think? Do you think both could
be taught at the same time so you can graduate accomplished scientists who are
also accomplished communicators?’ And she was interested.”
The next year, he
tested his improv idea at the University of Southern California on 20 graduate
engineering students. The students first talked briefly about their work. “It
was O.K.,” Mr. Alda said.
Then came three
hours of improvisational acting exercises. At the end, the students talked
about their work again. “The difference was striking,” Mr. Alda said. “They
came to life, and I thought, ‘This is going to work.’ ”
Stony Brook
established the Center for Communicating Science in 2009 as part of
its journalism school. In addition to classes, the center started the Flame
Challenge, a contest seeking compelling explanations of seemingly simple
phenomena. The first year, the question was “What is a flame?” Mr. Alda asked
his teacher this when he was 11, and the answer — “oxidation” — was his first
experience with confusing scientific jargon. This year, the question is “What
is sleep?” The winners will be named at the World Science Festival in New York
in May.
In 2013, the Stony
Brook program was officially named the Alan Alda Center for Communicating
Science.
Howard Schneider,
the dean of the journalism school, said science departments were initially
skeptical, with many thinking improv would be a distraction.
That has changed.
Two graduate programs now require students to take the center’s classes. All
medical school students receive 10 hours of training.
“This is a big
cultural shift,” Mr. Schneider said. In addition, four organizations —
Dartmouth College, the University of Vermont, the Robert Wood Johnson Medical
School in New Jersey and the American Chemical Society — have become affiliates
of the center. Other universities, inspired by Stony Brook, are considering
setting up similar programs.
The ability to
describe science effectively could prove key to winning research financing in
the future. Last year, Stony Brook ran a competition among its younger
scientists for a $200,000 prize. The four finalists, who were coached at
the Alan Alda Center, pitched to a panel of distinguished scientists. The
winner was Laurie T. Krug, a professor of molecular genetics and microbiology,
who proposed studying herpes viruses associated with cancer and using
nanoparticles to deliver molecules that act as scissors to cut up viral DNA.
The recent workshop
was for about 40 members of the Stony Brook faculty. For the improv sessions,
the group with Mr. Alda threw around imaginary balls of varying weights,
mirrored one another’s movements, tried to explain a smartphone to a time
traveler from the past, and talked of cherished photographs while holding up a
blank white folder. In the afternoon, they broke into smaller groups to talk
about how to distill and describe their own research.
Dr. Furie, who
directs the graduate program in genetics, said she had started the day unsure
the center’s offerings were a good use of time for her graduate students.
“Now, I’m
convinced,” she said. And she got to play the role of the wronged sister.
“That was crazy,”
Dr. Furie said. “I’m actually not a person who puts myself out there. I can’t
believe I did that.”
[Kenneth Chang is a science
reporter for the New York Times, covering chemistry,
geology, solid state physics, nanotechnology, Pluto, plague and other
scientific miscellany. He attended the science writing program at the University of
California at Santa Cruz. He worked at The Los Angeles Times, the Greenwich Time in Connecticut, The Newark Star-Ledger and ABCNEWS.com prior to joining the Times in 2000.]