Showing posts with label science. Show all posts

02 July 2018

Science, Curiosity, and God


[I've watched the PBS NewsHour, as it’s now called, pretty much every night since it was the McNeil/Lehrer Report.  In the last several years, the program’s introduced two regular features that are essentially oral essays by guest contributors.  They’re generally opinion pieces on subjects of particular interest to the essayist, but they cover a wide range of topics and presentation styles.  Recently there have been several of these segments that especially caught my attention, and two of them, one from “Brief but Spectacular” and one from “In My Humble Opinion,” seemed somewhat related.  So I’ve decided to post them together and let them play off one another for the benefit of readers of Rick On Theater.  ~Rick]

“WHY NEIL DEGRASSE TYSON WANTS TO FIX THE ADULT CURIOSITY PROBLEM”
by Neil deGrasse Tyson

[This essay, part of NewsHour’s “Brief but Spectacular” feature, was originally aired on Thursday, 24 May 2018.]

Neil deGrasse Tyson says he is like a “smorgasbord of science food” – he’s recognized hundreds of times every day and people are always hungry for more knowledge. DeGrasse Tyson, who spends much of his professional life encouraging science literacy in adults, gives his Brief but Spectacular take on bringing the universe down to Earth.

Judy Woodruff:  Finally, we turn to another installment of our weekly “Brief but Spectacular” series. Tonight, author and astrophysicist, Neil DeGrasse Tyson.

For more than two decades, he has served as director of the Hayden Planetarium in his home town of New York City. Tyson’s latest book, “Astrophysics for People in a Hurry,” is available now.

Neil DeGrasse Tyson:  What I think actually happened, was that the universe chose me. I know that’s not a very scientific sentence, but that’s what it felt like. The universe said, come, Neil, join us. And yes, I never looked back, back at earth. I kept looking up.

I was star struck at age nine. A visit to my local planetarium. Having been born in the Bronx, I thought I knew how many stars there were in the night sky, about a dozen.

Then you go into the dome of the planetarium and then thousands of stars come out. I just thought it was a hoax.

By age 11, I had an answer to that annoying question adults always ask children, what do you want to be when you grow up? I said, astrophysicist.

That usually just shut them up right there. Nobody knew anybody who was an astrophysicist and then I’d get back to the telescope.

So, deniers are people who wish the world were a way that does not agree with the operations of nature.

Believe what you want. I’m not going to even stop you. I would just hope you don’t rise to power over legislation and laws that then affect other people who do understand how science works. That’s dangerous.

Skepticism is: I will only believe what you tell me in proportion to the weight of the evidence you present. If you start speaking in ways where no known law of physics supports it, then I’m going to be all over you with my skepticism.

I’m recognized basically several hundred times a day. I wish I could put on a mustache and not be noticed but, of course, I have a mustache. They don’t care about me. Tell me about that black hole you mentioned on a program I saw the other day. Or, will we ever travel through space?

It’s like, I’m just this, this smorgasbord of science food and I got them hungry from something I did before and they’re still hungry and they want more. Most of my professional effort is trying to get adults scientifically literate. I think kids are born curious and if you fix the adult problem, the kids’ problem gets fixed overnight.

Part of my confidence is I see this generation who’s been born since 1995, teens, low 20s. That generation has only ever known the Internet as a source of access to knowledge. I have very high hope and expectations for what world they will create when they actually assume the mantles of power.

It’s the gap between when they do and what’s going on now that concerns me. It’s the adults that may have once been curious and forgot, or there’s a flame that has been tamped down, and you want to fan that flame and reawaken a sense of wonder about this world that we so often take for granted.

When I see eyes light up because that moment was reached, I’m done.

I’m Neil DeGrasse Tyson, your personal astrophysicist, and this is my “Brief But Spectacular” take on bringing the universe down to earth.

*  *  *  *
“A SCIENTIST STARES INTO INFINITY AND FINDS SPACE FOR SPIRITUALITY”
by Alan Lightman

[This comment from “In My Humble Opinion,” essays by thinkers, writers and artists, was broadcast on PBS NewsHour on 4 June 2018.]

Amna Nawaz:  One conflict in the ongoing culture wars seems to suggest that science and religion cannot coexist peacefully.

Alan Lightman is a distinguished physicist and a novelist who teaches at MIT. Tonight, he shares his Humble Opinion on how to make space for both facts and spirituality.

Alan Lightman:  I have worked as a physicist for many years. And I have always held a purely scientific view of the world.

And by that, I mean that the universe is made of matter and nothing more, that the universe is governed by a small number of laws, and that everything in the world eventually disintegrates and passes away.

And then, one summer night, I was out in the ocean in a small boat. It was a dark, clear night, and the sky vibrated with stars. I lay down in the boat and looked up. After a few minutes, I found myself falling into infinity.

I lost all track of myself, and the vast expanse of time extending from the far distant past to the far distant future seemed compressed to a dot. I felt connected to something eternal and ethereal, something beyond the material world.

In recent years, some scientists have attempted to use scientific arguments to question the existence of God. I think these people are missing the point. God, as conceived by most religions, lies outside time and space. You can’t use scientific arguments to either disprove or prove God.

And for the same reason, you can’t use scientific arguments to analyze or understand the feeling I had that summer night when I lay down in the boat and looked up and felt part of something far larger than myself.

I’m still a scientist. I still believe that the world is made of atoms and molecules and nothing more. But I also believe in the power and validity of the spiritual experience.

Is it possible to be committed to both without feeling a contradiction? I think so. We understand that everything in the physical world is material, fated to pass away. Yet we also long for the permanent, some grand and eternal unity.

We’re idealists and we’re realists. We’re dreamers and we’re builders. We experience and we do experiments. We long for certainties, and yet we ourselves are full of the ambiguities of the Mona Lisa and the I Ching.

We ourselves are part of the yin-yang of the world.

04 February 2018

"Laser Scans Reveal Maya ‘Megalopolis’ Below Guatemalan Jungle"

by Tom Clynes

[On the NBC Nightly News on Saturday, anchor Lester Holt presented a report on “Lost Treasures of the Maya Snake Kings,” a special airing on the National Geographic channel on Tuesday, 6 February.  The special details how lasers on planes were used to reveal a massive complex of Maya ruins covered for centuries by jungle foliage.  With this high-tech help, scientists have found that Maya civilization was more advanced and populous than previously imagined. 

[From time to time on Rick On Theater, I run articles on subjects that simply interest me.  One of those topics is scientific discoveries that reveal something new about our world despite decades and even centuries of exploration and examination.  It might be a previously unknown species of fish or a hitherto undiscovered fossil, but what intrigues me even more is when the discovery makes scientists change their previously held understanding of their field, making them formulate new rules and laws or rewrite history.  That’s what this revelation about the Maya promises to do, even though the hidden cities have been lying unexamined for over a millennium.  The article below was originally posted on the website National Geographic on 1 February 2018 (https://news.nationalgeographic.com/2018/02/maya-laser-lidar-guatemala-pacunam/).]

A vast, interconnected network of ancient cities was home to millions more people than previously thought.

*  *  *  *
In what’s being hailed as a “major breakthrough” in Maya archaeology, researchers have identified the ruins of more than 60,000 houses, palaces, elevated highways, and other human-made features that have been hidden for centuries under the jungles of northern Guatemala.

Using a revolutionary technology known as LiDAR (short for “Light Detection And Ranging”), scholars digitally removed the tree canopy from aerial images of the now-unpopulated landscape, revealing the ruins of a sprawling pre-Columbian civilization that was far more complex and interconnected than most Maya specialists had supposed.

“The LiDAR images make it clear that this entire region was a settlement system whose scale and population density had been grossly underestimated,” said Thomas Garrison, an Ithaca College archaeologist and National Geographic Explorer who specializes in using digital technology for archaeological research.

Garrison is part of a consortium of researchers who are participating in the project, which was spearheaded by the PACUNAM Foundation, a Guatemalan nonprofit that fosters scientific research, sustainable development, and cultural heritage preservation.

The project mapped more than 800 square miles (2,100 square kilometers) of the Maya Biosphere Reserve in the Petén region of Guatemala, producing the largest LiDAR data set ever obtained for archaeological research.

The results suggest that Central America supported an advanced civilization that was, at its peak some 1,200 years ago, more comparable to sophisticated cultures such as ancient Greece or China than to the scattered and sparsely populated city states that ground-based research had long suggested.

In addition to hundreds of previously unknown structures, the LiDAR images show raised highways connecting urban centers and quarries. Complex irrigation and terracing systems supported intensive agriculture capable of feeding masses of workers who dramatically reshaped the landscape.

The ancient Maya never used the wheel or beasts of burden, yet “this was a civilization that was literally moving mountains,” said Marcello Canuto, a Tulane University archaeologist and National Geographic Explorer who participated in the project.

“We’ve had this western conceit that complex civilizations can’t flourish in the tropics, that the tropics are where civilizations go to die,” said Canuto, who conducts archaeological research at a Guatemalan site known as La Corona. “But with the new LiDAR-based evidence from Central America and [Cambodia’s] Angkor Wat, we now have to consider that complex societies may have formed in the tropics and made their way outward from there.”

SURPRISING INSIGHTS

“LiDAR is revolutionizing archaeology the way the Hubble Space Telescope revolutionized astronomy,” said Francisco Estrada-Belli, a Tulane University archaeologist and National Geographic Explorer. “We’ll need 100 years to go through all [the data] and really understand what we’re seeing.”

Already, though, the survey has yielded surprising insights into settlement patterns, inter-urban connectivity, and militarization in the Maya Lowlands. At its peak in the Maya classic period (approximately A.D. 250-900), the civilization covered an area about twice the size of medieval England, but it was far more densely populated.

“Most people had been comfortable with population estimates of around 5 million,” said Estrada-Belli, who directs a multi-disciplinary archaeological project at Holmul, Guatemala. “With this new data it’s no longer unreasonable to think that there were 10 to 15 million people there—including many living in low-lying, swampy areas that many of us had thought uninhabitable.”

Virtually all the Mayan cities were connected by causeways wide enough to suggest that they were heavily trafficked and used for trade and other forms of regional interaction. These highways were elevated to allow easy passage even during rainy seasons. In a part of the world where there is usually too much or too little precipitation, the flow of water was meticulously planned and controlled via canals, dikes, and reservoirs.

Among the most surprising findings was the ubiquity of defensive walls, ramparts, terraces, and fortresses. “Warfare wasn’t only happening toward the end of the civilization,” said Garrison. “It was large-scale and systematic, and it endured over many years.”

The survey also revealed thousands of pits dug by modern-day looters. “Many of these new sites are only new to us; they are not new to looters,” said Marianne Hernandez, president of the PACUNAM Foundation.

Environmental degradation is another concern. Guatemala is losing more than 10 percent of its forests annually, and habitat loss has accelerated along its border with Mexico as trespassers burn and clear land for agriculture and human settlement.

“By identifying these sites and helping to understand who these ancient people were, we hope to raise awareness of the value of protecting these places,” Hernandez said.

The survey is the first phase of the PACUNAM LiDAR Initiative, a three-year project that will eventually map more than 5,000 square miles (14,000 square kilometers) of Guatemala’s lowlands, part of a pre-Columbian settlement system that extended north to the Gulf of Mexico.

“The ambition and the impact of this project is just incredible,” said Kathryn Reese-Taylor, a University of Calgary archaeologist and Maya specialist who was not associated with the PACUNAM survey. “After decades of combing through the forests, no archaeologists had stumbled across these sites. More importantly, we never had the big picture that this data set gives us. It really pulls back the veil and helps us see the civilization as the ancient Maya saw it.”

[A National Geographic Explorer is a scientist, conservationist, educator, or storyteller funded and supported by the National Geographic Society.  According to the organization’s own description, “Every one of them is infinitely curious about our planet, committed to understanding it, and passionate about helping make it better.”

[Tom Clynes is an author and photojournalist who travels the world covering the adventurous sides of science, the environment, and education for publications such as National Geographic, Nature, the New York Times, and Popular Science.  His work has also appeared in The Atlantic, Newsweek, Scientific American, the Sunday Times Magazine of London, and many other publications.]

28 March 2015

Perspectives on Science


[I’m no scientist.  I even stopped being any good at it in school when we hit physics in my junior year.  But it’s always fascinated me—especially when it comes to things like new discoveries and using the techniques of the arts and humanities to teach or communicate science.  People who ignore or even disparage science are another interest of mine—I don’t get how even moderately educated people can take that attitude.  So the two recent articles below, one from the Washington Post and the other from the New York Times, caught my attention.  I downloaded and saved them for use on ROT, and now’s a good opportunity to share them with the blog’s readers.] 

“WHY AMERICANS ARE SO DUBIOUS ABOUT SCIENCE”
by Joel Achenbach

[The article below first appeared in the “Outlook” section of the Washington Post of 15 February 2015.]

The Post’s Joel Achenbach says the evidence often conflicts with our experience

There’s a scene in Stanley Kubrick’s comic masterpiece “Dr. Strangelove” in which Jack D. Ripper, an American general who’s gone rogue and ordered a nuclear attack on the Soviet Union, unspools his paranoid worldview — and the explanation for why he drinks “only distilled water, or rainwater, and only pure grain alcohol” — to Lionel Mandrake, a dizzy-with-anxiety group captain in the Royal Air Force.

Ripper: “Have you ever heard of a thing called fluoridation? Fluoridation of water?”

Mandrake: “Ah, yes, I have heard of that, Jack. Yes, yes.”

Ripper: “Well, do you know what it is?”

Mandrake: “No. No, I don’t know what it is, no.”

Ripper: “Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?”

The movie came out in 1964, by which time the health benefits of fluoridation had been thoroughly established and anti-fluoridation conspiracy theories could be the stuff of comedy. Yet half a century later, fluoridation continues to incite fear and paranoia. In 2013, citizens in Portland, Ore., one of only a few major American cities that don’t fluoridate, blocked a plan by local officials to do so. Opponents didn’t like the idea of the government adding “chemicals” to their water. They claimed that fluoride could be harmful to human health.

Actually fluoride is a natural mineral that, in the weak concentrations used in public drinking-water systems, hardens tooth enamel and prevents tooth decay — a cheap and safe way to improve dental health for everyone, rich or poor, conscientious brushers or not. That’s the scientific and medical consensus.

To which some people in Portland, echoing anti-fluoridation activists around the world, reply: We don’t believe you.

We live in an age when all manner of scientific knowledge — from the safety of fluoride and vaccines to the reality of climate change — faces organized and often furious opposition. Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts. There are so many of these controversies these days, you’d think a diabolical agency had put something in the water to make people argumentative.

Science doubt has become a pop-culture meme. In the recent movie “Interstellar,” set in a futuristic, downtrodden America where NASA has been forced into hiding, school textbooks say the Apollo moon landings were faked.

In a sense this is not surprising. Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable and rich in rewards — but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze.

We’re asked to accept, for example, that it’s safe to eat food containing genetically modified organisms (GMOs) because, the experts point out, there’s no evidence that it isn’t and no reason to believe that altering genes precisely in a lab is more dangerous than altering them wholesale through traditional breeding. But to some people, the very idea of transferring genes between species conjures up mad scientists running amok — and so, two centuries after Mary Shelley wrote “Frankenstein,” they talk about Frankenfood.

The world crackles with real and imaginary hazards, and distinguishing the former from the latter isn’t easy. Should we be afraid that the Ebola virus, which is spread only by direct contact with bodily fluids, will mutate into an airborne super-plague? The scientific consensus says that’s extremely unlikely: No virus has ever been observed to completely change its mode of transmission in humans, and there’s zero evidence that the latest strain of Ebola is any different. But Google “airborne Ebola” and you’ll enter a dystopia where this virus has almost supernatural powers, including the power to kill us all.

In this bewildering world we have to decide what to believe and how to act on that. In principle, that’s what science is for. “Science is not a body of facts,” says geophysicist Marcia McNutt, who once headed the U.S. Geological Survey and is now editor of Science, the prestigious journal. “Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not.”

The scientific method leads us to truths that are less than self-evident, often mind-blowing and sometimes hard to swallow. In the early 17th century, when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense — because it sure looks like the sun’s going around the Earth, and you can’t feel the Earth spinning. Galileo was put on trial and forced to recant. Two centuries later, Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales and even deep-sea mollusks is still a big ask for a lot of people.

Even when we intellectually accept these precepts of science, we subconsciously cling to our intuitions — what researchers call our naive beliefs. A study by Andrew Shtulman of Occidental College showed that even students with an advanced science education had a hitch in their mental gait when asked to affirm or deny that humans are descended from sea animals and that the Earth goes around the sun. Both truths are counterintuitive. The students, even those who correctly marked “true,” were slower to answer those questions than questions about whether humans are descended from tree-dwelling creatures (also true but easier to grasp) and whether the moon goes around the Earth (also true but intuitive).

Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They nest in our brains, chirping at us as we try to make sense of the world.

Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics. We might get a prostate-specific antigen test, even though it’s no longer generally recommended, because it caught a close friend’s cancer — and we pay less attention to statistical evidence, painstakingly compiled through multiple studies, showing that the test rarely saves lives but triggers many unnecessary surgeries. Or we hear about a cluster of cancer cases in a town with a hazardous-waste dump, and we assume that pollution caused the cancers. Of course, just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not random. Yet we have trouble digesting randomness; our brains crave pattern and meaning.

Even for scientists, the scientific method is a hard discipline. They, too, are vulnerable to confirmation bias — the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them. Once the results are published, if they’re important enough, other scientists will try to reproduce them — and, being congenitally skeptical and competitive, will be very happy to announce that they don’t hold up. Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or an absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.

That provisional quality of science is another thing a lot of people have trouble with. To some climate-change skeptics, for example, the fact that a few scientists in the 1970s were worried (quite reasonably, it seemed at the time) about the possibility of a coming ice age is enough to discredit what is now the consensus of the world’s scientists: The planet’s surface temperature has risen by about 1.5 degrees Fahrenheit in the past 130 years, and human actions, including the burning of fossil fuels, are extremely likely to have been the dominant cause since the mid-20th century.

It’s clear that organizations funded in part by the fossil-fuel industry have deliberately tried to undermine the public’s understanding of the scientific consensus by promoting a few skeptics. The news media gives abundant attention to such mavericks, naysayers, professional controversialists and table thumpers. The media would also have you believe that science is full of shocking discoveries made by lone geniuses. Not so. The (boring) truth is that science usually advances incrementally, through the steady accretion of data and insights gathered by many people over many years. So it has with the consensus on climate change. That’s not about to go poof with the next thermometer reading.

But industry PR, however misleading, isn’t enough to explain why so many people reject the scientific consensus on global warming.

The “science communication problem,” as it’s blandly called by the scientists who study it, has yielded abundant new research into how people decide what to believe — and why they so often don’t accept the expert consensus. It’s not that they can’t grasp it, according to Dan Kahan of Yale University. In one study he asked 1,540 Americans, a representative sample, to rate the threat of climate change on a scale of zero to 10. Then he correlated that with the subjects’ science literacy. He found that higher literacy was associated with stronger views — at both ends of the spectrum. Science literacy promoted polarization on climate, not consensus. According to Kahan, that’s because people tend to use scientific knowledge to reinforce their worldviews.

Americans fall into two basic camps, Kahan says. Those with a more “egalitarian” and “communitarian” mind-set are generally suspicious of industry and apt to think it’s up to something dangerous that calls for government regulation; they’re likely to see the risks of climate change. In contrast, people with a “hierarchical” and “individualistic” mind-set respect leaders of industry and don’t like government interfering in their affairs; they’re apt to reject warnings about climate change, because they know what accepting them could lead to — some kind of tax or regulation to limit emissions.

In the United States, climate change has become a litmus test that identifies you as belonging to one or the other of these two antagonistic tribes. When we argue about it, Kahan says, we’re actually arguing about who we are, what our crowd is. We’re thinking: People like us believe this. People like that do not believe this.

Science appeals to our rational brain, but our beliefs are motivated largely by emotion, and the biggest motivation is remaining tight with our peers. “We’re all in high school. We’ve never left high school,” says Marcia McNutt. “People still have a need to fit in, and that need to fit in is so strong that local values and local opinions are always trumping science. And they will continue to trump science, especially when there is no clear downside to ignoring science.”

Meanwhile the Internet makes it easier than ever for science doubters to find their own information and experts. Gone are the days when a small number of powerful institutions — elite universities, encyclopedias and major news organizations — served as gatekeepers of scientific information. The Internet has democratized it, which is a good thing. But along with cable TV, the Web has also made it possible to live in a “filter bubble” that lets in only the information with which you already agree.

How to penetrate the bubble? How to convert science skeptics? Throwing more facts at them doesn’t help. Liz Neeley, who helps train scientists to be better communicators at an organization called Compass, says people need to hear from believers they can trust, who share their fundamental values. She has personal experience with this. Her father is a climate-change skeptic and gets most of his information on the issue from conservative media. In exasperation she finally confronted him: “Do you believe them or me?” She told him she believes the scientists who research climate change and knows many of them personally. “If you think I’m wrong,” she said, “then you’re telling me that you don’t trust me.” Her father’s stance on the issue softened. But it wasn’t the facts that did it.

If you’re a rationalist, there’s something a little dispiriting about all this. In Kahan’s descriptions of how we decide what to believe, what we decide sometimes sounds almost incidental. Those of us in the science-communication business are as tribal as anyone else, he told me. We believe in scientific ideas not because we have truly evaluated all the evidence but because we feel an affinity for the scientific community. When I mentioned to Kahan that I fully accept evolution, he said: “Believing in evolution is just a description about you. It’s not an account of how you reason.”

Maybe — except that evolution is real. Biology is incomprehensible without it. There aren’t really two sides to all these issues. Climate change is happening. Vaccines save lives. Being right does matter — and the science tribe has a long track record of getting things right in the end. Modern society is built on things it got right.

Doubting science also has consequences, as seen in recent weeks with the measles outbreak that began in California. The people who believe that vaccines cause autism — often well educated and affluent, by the way — are undermining “herd immunity” to such diseases as whooping cough and measles. The anti-vaccine movement has been going strong since a prestigious British medical journal, the Lancet, published a study in 1998 linking a common vaccine to autism. The journal later retracted the study, which was thoroughly discredited. But the notion of a vaccine-autism connection has been endorsed by celebrities and reinforced through the usual Internet filters. (Anti-vaccine activist and actress Jenny McCarthy famously said on “The Oprah Winfrey Show,” “The University of Google is where I got my degree from.”)

In the climate debate, the consequences of doubt are likely to be global and enduring. Climate-change skeptics in the United States have achieved their fundamental goal of halting legislative action to combat global warming. They haven’t had to win the debate on the merits; they’ve merely had to fog the room enough to keep laws governing greenhouse gas emissions from being enacted.

Some environmental activists want scientists to emerge from their ivory towers and get more involved in the policy battles. Any scientist going that route needs to do so carefully, says Liz Neeley. “That line between science communication and advocacy is very hard to step back from,” she says. In the debate over climate change, the central allegation of the skeptics is that the science saying it’s real and a serious threat is politically tinged, driven by environmental activism and not hard data. That’s not true, and it slanders honest scientists. But the claim becomes more likely to be seen as plausible if scientists go beyond their professional expertise and begin advocating specific policies.

It’s their very detachment, what you might call the cold-bloodedness of science, that makes science the killer app. It’s the way science tells us the truth rather than what we’d like the truth to be. Scientists can be as dogmatic as anyone else — but their dogma is always wilting in the hot glare of new research. In science it’s not a sin to change your mind when the evidence demands it. For some people, the tribe is more important than the truth; for the best scientists, the truth is more important than the tribe.

[Joel Achenbach is a science reporter at the Washington Post. A version of this essay appears on the cover of National Geographic’s March 2015 issue.]

*  *  *  *
“NEW STAGE OF PROGRESS IN SCIENCE”
by Kenneth Chang

[This report was originally published in “Science Times” of the New York Times on 3 March 2015.]

STONY BROOK, N.Y. — Martha Furie stormed into the room and huffily sat down in a chair.

“Well, you know, I’ve been working really hard, studying Lyme disease,” she said, her voice tinged with disdain, to the woman sitting in the next chair. “It’s been a long process. It’s hard to talk about it.”

The other woman, Bernadette Holdener, was somewhat befuddled. “How does it make you feel?” she asked.

“Lyme disease?” Dr. Furie sneered. “It can have all sorts of bad things.”

The two were participating in an improvisational acting exercise a couple of Fridays ago [20 February]. But they are not aspiring actresses or comedians. Dr. Furie is a professor of pathology at Stony Brook University [State University of New York at Stony Brook], Dr. Holdener a professor of biochemistry and cell biology.

“Anyone have any inkling what is going on?” asked one of the instructors for the session — Alan Alda, the actor who played Hawkeye in the television series “M*A*S*H” more than three decades ago.

The exercise, called “Who am I?,” challenges one of the participants — Dr. Furie, in this case — to convey an unstated relationship with another, and everyone else must try to deduce the relationship. “She sounded very angry,” Dr. Holdener said.

People guessed variously that Dr. Furie was a Lyme researcher who had contracted the disease, that she had just been denied tenure and was venting to the head of her department, that she was expressing passive-aggressive anger toward her spouse.

“You’re so close,” Mr. Alda said.

Dr. Furie explained that Dr. Holdener “was my long-lost sister who stole my husband away.” The other participants laughed at the convoluted, unlikely setup.

Mr. Alda said that Dr. Furie, focusing on her role as a wronged sister, intently observed her audience — Dr. Holdener — and the effect of her words. “What I find interesting about this is you’re suddenly talking about your work in a way you’ve never talked about it before,” Mr. Alda said.

The idea of teaching improv to scientists came from Mr. Alda, now a visiting professor. The objective is not to make them funny, but to help them talk about science to people who are not scientists. The exercises encourage them to pay attention to the audience’s reaction and adjust. “Not jokes, not cleverness,” Mr. Alda said. “It’s the contact with the other person.”

Mr. Alda has long held a deep interest in science. In the 1990s, he collaborated on “QED,” a play about the physicist Richard Feynman, with Mr. Alda playing Dr. Feynman.

He also hosted 11 seasons of the PBS program “Scientific American Frontiers.” In interviews with hundreds of scientists, he found that he could draw out engaging explanations. “I didn’t go in with a list of questions,” Mr. Alda said during a public lecture at Stony Brook the night before the workshop. “I just listened to what they had to say and asked them questions that would help me understand what their work was.”

But he recalled one scientist who would switch from conversing with Mr. Alda to lecturing to the camera. “And immediately, the tone of her voice changed,” Mr. Alda said. “Her vocabulary changed. I couldn’t understand what she was saying.”

Mr. Alda started suggesting to university presidents that they teach scientists how to present their research to the public.

No one expressed interest until 2007, when Mr. Alda visited Stony Brook and met Shirley Strum Kenny, then the university’s president. “I thought, here’s my chance, I’ll go into my pitch,” Mr. Alda said. “I said, ‘What do you think? Do you think both could be taught at the same time so you can graduate accomplished scientists who are also accomplished communicators?’ And she was interested.”

The next year, he tested his improv idea at the University of Southern California on 20 graduate engineering students. The students first talked briefly about their work. “It was O.K.,” Mr. Alda said.

Then came three hours of improvisational acting exercises. At the end, the students talked about their work again. “The difference was striking,” Mr. Alda said. “They came to life, and I thought, ‘This is going to work.’ ”

Stony Brook established the Center for Communicating Science in 2009 as part of its journalism school. In addition to classes, the center started the Flame Challenge, a contest seeking compelling explanations of seemingly simple phenomena. The first year, the question was “What is a flame?” Mr. Alda asked his teacher this when he was 11, and the answer — “oxidation” — was his first experience with confusing scientific jargon. This year, the question is “What is sleep?” The winners will be named at the World Science Festival in New York in May.

In 2013, the Stony Brook program was officially named the Alan Alda Center for Communicating Science.

Howard Schneider, the dean of the journalism school, said science departments were initially skeptical, with many thinking improv would be a distraction.

That has changed. Two graduate programs now require students to take the center’s classes. All medical school students receive 10 hours of training.

“This is a big cultural shift,” Mr. Schneider said. In addition, four organizations — Dartmouth College, the University of Vermont, the Robert Wood Johnson Medical School in New Jersey and the American Chemical Society — have become affiliates of the center. Other universities, inspired by Stony Brook, are considering setting up similar programs.

The ability to describe science effectively could prove key to winning research financing in the future. Last year, Stony Brook ran a competition among its younger scientists for a $200,000 prize. The four finalists, who were coached at the Alan Alda Center, pitched to a panel of distinguished scientists. The winner was Laurie T. Krug, a professor of molecular genetics and microbiology, who proposed studying herpes viruses associated with cancer and using nanoparticles to deliver molecules that act as scissors to cut up viral DNA.

The recent workshop was for about 40 members of the Stony Brook faculty. For the improv sessions, the group with Mr. Alda threw around imaginary balls of varying weights, mirrored one another’s movements, tried to explain a smartphone to a time traveler from the past, and talked of cherished photographs while holding up a blank white folder. In the afternoon, they broke into smaller groups to talk about how to distill and describe their own research.

Dr. Furie, who directs the graduate program in genetics, said she had started the day unsure the center’s offerings were a good use of time for her graduate students.

“Now, I’m convinced,” she said. And she got to play the role of the wronged sister.

“That was crazy,” Dr. Furie said. “I’m actually not a person who puts myself out there. I can’t believe I did that.”

[Kenneth Chang is a science reporter for the New York Times, covering chemistry, geology, solid state physics, nanotechnology, Pluto, plague and other scientific miscellany. He attended the science writing program at University of California at Santa Cruz. He worked at The Los Angeles Times, the Greenwich Time in Connecticut, The Newark Star-Ledger and ABCNEWS.com prior to joining the Times in 2000.]


16 July 2011

The Rap on Darwin

[I’m always interested in the uses of theater and theater techniques in other disciplines and circumstances where theater and “the real world” overlap. When I read an article in the New York Times’s science section last month about a hip-hop performance on evolution, I was fascinated by the confluence of science, theater, and teaching. This was especially true because the piece, The Rap Guide to Evolution by Baba Brinkman, is being presented not as a science project or a teaching tool, but as a theater performance. I decided it was interesting enough to republish the article, and the review of the show that appeared on the same date in the arts section of the Times, on ROT. Just to round off the posting, I dug up an earlier New York Times blog article for an earlier appearance of Brinkman in his Darwin rap. ~Rick]

“PAYING HOMAGE TO DARWIN IN AN UNCONVENTIONAL FORMAT: RAP”
By Dennis Overbye

Don’t sleep with mean people.

That’s a lesson some of us learn painfully, if at all, in regard to our personal happiness. That there could be a cosmic evolutionary angle to this thought had never occurred to me until I heard Baba Brinkman, a rap artist and Chaucer scholar, say it the other night. Think of it as the ultimate example of thinking globally and acting very, very locally. We are all in the process of recreating our species in our most intimate acts:

Don’t sleep with mean people, that’s the anthem
Please! Think about your granddaughters and grandsons
Don’t sleep with mean people, pretty or handsome
Mean people hold the gene pool for ransom.

Imagine this to a hip-hop beat accompanied with intermittent snarls and scowls, gangster slouches and crotch grabs and you have “The Rap Guide to Evolution,” written and performed by Mr. Brinkman. The show, which just opened for a summer-long run at the SoHo Playhouse in Manhattan, is an hour-and-a-half lecture on Darwin and natural selection disguised as a rant on the history of rap, gangs and murder in Chicago, relations between the sexes and his own stubborn creationist cousins.

Evolution has had many prominent defenders and proselytizers over the years, including Thomas Huxley when Darwin was alive, and Richard Dawkins now, but few as engaging and rhythmic as Mr. Brinkman, who has performed at the prestigious Edinburgh Fringe Festival six times, winning an award for the best new theater writing there in 2009.

Writing on NYTimes.com last year, Olivia Judson, the biologist and author, called the evolution rap show “one of the most astonishing, and brilliant, lectures on evolution I’ve ever seen.” On a humid night last week the crowd spilled out of the playhouse and down the streets of SoHo after the show, chatting about the technical and social aspects of natural selection.

The scene reinforced my sense that “geek rap,” as Mr. Brinkman calls it, is becoming one of the most popular and vital forms of science communication. Few exegeses of the Large Hadron Collider match Alpinekat’s “Large Hadron Rap” for punch and rhythm, and Stephen Hawking’s robot voice and puckish wit have spawned a host of imitators, like MC Hawking, rapping about black holes and entropy.

But when it comes to mixing the personal and the cosmic, it’s hard to beat the combination of evolution and hip-hop. As an illustration of the Darwinian principle of mimicry, Mr. Brinkman compared the menacing persona of gangsta rappers to the bright colors adopted by a nonpoisonous snake to appear poisonous and thus scare off predators in a hostile environment.

Mr. Brinkman is no gangsta. By the usual cultural signifiers, Mr. Brinkman does not fit the rapper stereotype at all. A tall blond Canadian of Dutch ancestry, he was born in 1978 in a log cabin built by his hippie parents and their friends in the West Kootenays, a mountain range in British Columbia. His father runs a company replanting trees after logging operations—more than a billion replanted so far. His mother is a member of Canada’s Parliament.

Over a plate of oysters and tuna last week, Mr. Brinkman said that he had wanted to be a rapper ever since he was 10 and had performed on and off since he was 18, making up songs and singing to the rhythm of tree planting at his father’s business.

He was also a literature nerd as a child and wound up getting a master’s degree in medieval literature from the University of Victoria. Along the way he began writing a rap version of Geoffrey Chaucer’s Canterbury Tales. “Chaucer needed to be better presented,” he explained.

Mr. Brinkman took “The Rap Canterbury Tales” to the Edinburgh festival, where it sold out in 2004, which led to “a whole lot of gigs” and a book, he said.

Along the way his work came to the attention of Mark Pallen, a biologist at the University of Birmingham and author of “The Rough Guide to Evolution,” who had just done a reggae treatment of Darwin for a Jamaican colleague, and had also been using evolutionary methods to study Chaucer manuscripts. He invited Mr. Brinkman to Birmingham and, as he puts it, “we quickly slipped into an evolutionary groove.”

Dr. Pallen asked Mr. Brinkman if he could do for Darwin what he had done for Chaucer.

“Probably,” Mr. Brinkman answered. The only hitch was that it had to be done in five months, in time for the 200th anniversary of Darwin’s birth, on Feb. 12, 2009, which was the occasion for a worldwide celebration of Darwin and evolution science.

Mr. Brinkman bought an audio version of “On the Origin of Species” and listened to it. Then he transferred it to an iPod Shuffle and listened to it again, with the chapters played in random order. “New connections emerged,” he said.

The result was what Dr. Pallen called “the first peer-reviewed rap.”

Mr. Brinkman performed the show at various venues around Britain for Darwin’s February birthday bash and then later on at Edinburgh and one sold-out week in New York in 2009. At one point, he said, he did 53 shows in three and a half weeks.

Fittingly, the show itself evolves. What was once a line about not sleeping with mean people, for example, has been expanded to a whole section. But the road has not been without bumps. Mr. Brinkman said that in Texas people walked out on a section of the rap which features a call and response of “Creationism is”—“dead wrong!”

When he was 19, Mr. Brinkman said, he wanted to be Eminem, selling a million records a year, but now he thinks he can see a lot of opportunities in geek rap. He said he was thinking of doing his next rap about climate change.

He paused over his pepper-crusted tuna, and said, “I’m very keen to do it, actually.”

[This article appeared in the print edition of the New York Times on 27 June 2011 (Sec. D [“Science Times”]).
The Rap Guide to Evolution opened 26 June for an open-ended run at the SoHo Playhouse, 15 Van Dam Street, South Village. For tickets and performance information, contact (212) 352-3101 or rapguidetoevolution.com.]

* * * *

“A RAPPER WRAPS HIS MIND AROUND DARWIN”
By David Rooney

If Terrence Malick’s majestic depiction of Darwinian natural selection in “The Tree of Life” was a little too solemn and symphonic for your taste, you might consider the more loquacious hip-hop alternative of “The Rap Guide to Evolution,” at the SoHo Playhouse.

An award winner at the 2009 Edinburgh Festival Fringe, this ever-evolving show is written and performed by Baba Brinkman, an affable white rapper from Canada with a master’s in medieval and Renaissance English literature.

A 90-minute interactive musical lecture with amusing visual aids—courtesy of the projection designer Wendall K. Harrington—the show was developed at the invitation of Mark Pallen, a professor of microbial genomics at the University of Birmingham, England, after he saw Mr. Brinkman’s “Rap Canterbury Tales.”

Clearly Mr. Brinkman is not intimidated by challenging material. Nor is this simply a smarty-pants vehicle in which an erudite hipster flaunts his mad skills by molding his scholarly insights into “On the Origin of Species” to unorthodox beats (provided onstage by Jamie Simmonds, the DJ and music producer). Unlike more sophomoric hybridists of highbrow content and popular form, Mr. Brinkman brings genuine passion, curiosity and analytical skills to his subject.

Creationists may sneer, but Mr. Brinkman mounts an argument against intelligent design that is both brainy and entertaining. “It’s time to elevate your mind-state/And celebrate your kinship with the primates,” he raps.

Lest this sound purely science-geeky, the show also uses theories of natural selection and evolutionary psychology to chart developments in hip-hop: “You could thrive like Timberlake on a Timbaland beat/Or go extinct like Vanilla Ice and ’N Sync.” O.K., so the meters won’t give Stephen Sondheim sleepless nights (though pairing “huge manatee” with “humanity” has undeniable charm), but the rhythms are punchy.

Mr. Brinkman draws parallels between animal kingdom behavior and rap as a survivalist expression of power, pride, menace and sexual magnetism. And as he wryly points out, what is the ostentatious plumage of the male peacock but nature’s bling?

Tightly directed by Dodd Loomis, the production closes with a Q&A period in which audience input feeds some free-style addenda. While this stretches the performance somewhat, it also shows that Mr. Brinkman is more than an obsessively overstimulated Darwin fanboy with a talent for recitation.

His “them = us” thread about nurturing the group above the individual gives the show an overarching message. “All this hippy-dippy, love-thy-neighbor bio-socialism isn’t just me editorializing as a Canadian,” he says with disarming self-mockery, going on to explain how society might be reconfigured to eliminate hostility and fear.

Sure, it’s a rose-colored vision, but by the time Mr. Brinkman shares his “Lysistrata”-inspired anthem of sexual selection, “Don’t Sleep With Mean People,” you might start singing along.

[David Rooney’s review appeared in the New York Times on 27 June 2011 (Sec. C [“The Arts”]).]
* * * *

“DARWIN GOT IT GOING ON”
By Olivia Judson

The lights go down. The room fills with music—a pulsating hip-hop rhythm. And then, over the music, you hear the voice of Richard Dawkins reading a passage from “On the Origin of Species” by Charles Darwin: “Whoever is led to believe that species are mutable will do good service by conscientiously expressing his conviction. For only thus can the load of prejudice by which this subject is overwhelmed be removed.”

So begins one of the most astonishing, and brilliant, lectures on evolution I’ve ever seen: “The Rap Guide to Evolution,” by Baba Brinkman.

Brinkman, a burly Canadian from Vancouver, is a latter-day wandering minstrel, a self-styled “rap troubadour,” with a master’s degree in English and a history of tree-planting (according to his Web site, he has personally planted more than one million trees). His guide to evolution grew out of a correspondence with Mark Pallen, an evolutionary biologist and rap enthusiast at the University of Birmingham, in Britain; the result, as Brinkman tells us, is “the only hip-hop show to have been peer-reviewed.”

It is also, I suspect, the only hip-hop show to talk of mitochondria, genetic drift, sexual selection or memes. For Brinkman has taken Darwin’s exhortation seriously. He is a man on a mission to spread the word about evolution—how it works, what it means for our view of the world, and why it is something to be celebrated rather than feared.

To this end, he has concocted a set of mini-lectures disguised as rap songs. When he comes to human evolution, for example, he has the audience sing along in call-response fashion to “I’m a African”—a riff on an earlier song of that name by the radical, pan-Africanist hip-hop duo Dead Prez. The point of Brinkman’s version is that because humans evolved in Africa, we are all Africans: pan-Africanism meets population genetics. A few moments later, he’s showing a video of individuals of the social slime mold Dictyostelium discoidium streaming together while rapping about how cooperation evolves.

(Dictyostelium is notorious, in some circles, for its strange life-style. Usually, an individual Dictyostelium lives alone as a single cell. But when food is scarce, the single cells come together and form a being known as “the slug”; this crawls off in search of better conditions. When it finds them, the slug develops into a stalked fruiting body, and releases spores. But here’s the mystery: not all members of the slug get to make spores—and thereby contribute to the next generation—so why do they cooperate?)

It’s surreal stuff. But the clever part is that the show works at different levels. If you are up on evolution you will be amused by the in-jokes and amazed by the erudition. If you know nothing about evolution, you will certainly be entertained, and you may even learn something. (The delivery is so fast, and the material so broad, that it’s hard to tell how much will stick on one hearing; but for enthusiasts, there’s a CD. It’s good; I’ve been listening to it all afternoon.)

The lyrics are, for the most part, witty, sophisticated and scientifically accurate; and they lack the earnest defensiveness that sometimes haunts lectures on evolution. I spotted one or two small slips—a confusion of the praying mantis with the Australian redback spider (oh no!)—and there are a few moments of poetic license that a po-faced pedant might object to. Otherwise, it’s pretty rigorous.

Brinkman can’t resist taking a few pot-shots at creationists (“Darwin got it going on / Creationism is . . . dead wrong . . .”), and he devotes one rap to refutations of creationist arguments. But by and large, he proselytizes about evolution not by attacking its deniers, but by revealing the subject’s scope, from natural selection to the evolution of human culture and language. At the same time, he teases the audience, sends up post-modernism, mocks himself and satirizes the genre of hip-hop, all with fizzing energy and spell-binding charisma. Like I said, astonishing.

I saw “The Rap Guide to Evolution” last week in Barnstaple, a small town in the west of England. But this week, Darwin got it going on for a few days at the Bleecker Street Theatre, off Broadway. If you are in New York—go.

[Olivia Judson’s article appeared on her New York Times blog, “The Opinionator,” on 4 May 2010 (http://opinionator.blogs.nytimes.com/2010/05/04/darwin-got-it-going-on/).

[Dirk Murray Brinkman, Jr., was born on 22 October 1978 in Riondel, an isolated town in British Columbia. Known as a “Lit Hop” artist and poet, Brinkman seems to have used the nickname “Baba” at least as far back as college, perhaps because of the family lore that he was born with a “Buddha-like” countenance. Brinkman and his family moved to Vancouver in 1980, where he still lives, between gigs, with his brother. After his 1996 graduation from high school, where his interests in poetry and rap were born, Brinkman received a BA in English from Simon Fraser University in Vancouver and then an MA in medieval lit from the University of Victoria in Greater Victoria, B.C. His studies included rhyming, and in 2002 he wrote an essay called “The Beste Rym I Kan: The Emergence of Rhyme in English.” While working in his family’s tree-planting business—he claims to have planted over a million trees himself over ten years—Brinkman began performing at music and fringe festivals in 2001 and earned recognition as a local talent when a video he produced was aired between programs on CBC Television. He began his rap career in 2003. During the winters, Brinkman appeared at local schools, and in 2004 he produced The Rap Canterbury Tales, based on his undergrad thesis. Starting his own publishing outlet, Babasworld Productions, Brinkman published the rap, and it gathered acclaim at home and in the U.K. after he presented it at the Edinburgh Fringe Festival (three times). With BBC Slam Champion MC Dizraeli, Brinkman formed the hip-hop group Mud Sun in 2008, and they released two albums. The Rap Guide to Evolution in 2009 spread Brinkman’s renown from Canada and the U.K. across the English-speaking world, including the U.S., though it has run into disapproval in the American South when he launches into his rejection of Creationism. Brinkman claims the script has been vetted for accuracy by both scientists and historians, however, and some of his raps have been incorporated into school curricula in the U.K., Canada, and Australia. In 2010, he released the Rap Guide to Human Nature, created for that year’s Fringe Festival, at which he also premièred his newest rap, Rapconteur, drawn from oral epics including Beowulf, Gilgamesh, and the Finnish Kalevala.]

27 September 2010

Building Better Lives Through Science


[After the explosion on the BP drilling platform in the Gulf of Mexico last April, reporters, commentators, analysts, and ordinary people demonstrated two conflicting—but often simultaneously held—opinions about science and technology. On the one hand, they assumed, even demanded, that science solve the problem and became impatient, even angry, when it didn’t do it immediately and completely. Science, in the minds of many Americans, is credited with the capacity to solve any problem, no matter what. The second view is disillusionment with science and technology, seen as the cause of such catastrophes as the gulf oil spill, global warming, nuclear contamination like that from the Chernobyl and Three Mile Island melt-downs, and the Challenger and Columbia disasters. Except for global warming, those were all devices that science and technology gave us; they failed, and science hasn’t been able to stop or fix them. So we expect an almost magical power from science, which is our servant, and at the same time we believe science is flawed and inadequate and is poised to become our master. Clearly, neither of those propositions is accurate—or not wholly accurate. I’m going to examine the issue from a very personal perspective and see what, if any, benefit science and technology has provided in my own life.

[In a 1992 article called “Science’s Big Shift” in Time magazine, Dick Thompson wrote: “The public hears that we’re No. 1 in science, and they want to know why that fact isn’t making our lives better. The one thing that works in this country doesn’t seem to be paying off.” I was teaching a writing course at the time, so I quoted this statement for an essay exam and asked my students to address the idea that we don’t see science making our lives better. I also decided to answer the question myself as a model for an essay-exam response. I’ve based the essay below on that response.]

Science is the most visible indicator of how far humans have come from stone tools and wooden wheels. There’s a question, however, of whether scientific developments are actually making our lives better. While some perceive no betterment, even evils, from modern science and technology, the indisputable benefits in such fields as medicine, computers and robotics, and energy demonstrate that science makes our lives easier, healthier, safer, and more productive.

Medicine and health are fields plainly affected by new science and technology. Advances in organ transplants, immune-suppressant drugs, diagnostic tools like MRI’s and CAT scans, and new medicines and therapies mean illnesses we might have inevitably died from, like heart disease and cancer, are now routinely curable and others we couldn’t even diagnose until an autopsy, like tumors or weak blood vessels deep in the brain, are treatable. If medical technology has outstripped our ethical and legal capacities, that’s hardly a reason to disparage the usefulness of the scientific advances. It’s a reason to spur society and the law to catch up.

I can personally attest to at least one benefit of medical science and technology in my own life. Like most men my age, I’ve suffered the indignity of prostate examinations and colonoscopies, procedures that have come along only in recent decades to detect cancers that kill millions of men—but killed many more in the past because they weren’t detectable until much later in their progress. Those are common experiences for men (and women, too, in the case of colon exams) of a certain age, as they used to say. But somewhat rarer is the operation I had some years ago now that relieved me of an encumbrance with which I’d been saddled since I was about 12. I’m very astigmatic and I’ve had to wear glasses for over half a century. (I wore contacts for several years in the middle of that period, but they’re still lenses, prostheses.) I hated wearing glasses—not out of vanity (I’m not so good-looking to begin with that lenses were an additional disadvantage), but because as my eyes got worse, the lenses got thicker and heavier, and I couldn’t go anywhere without them. I always had to go about with glasses, and usually a pair of prescription sunglasses, too. If I traveled, I had to take an extra pair. Eventually, I got bifocals, which drove me nuts anyway; I took to carrying reading glasses for work in a library or other environment (because doing a lot of reading with the bifocals gave me headaches) and the bifocals for street use, plus the sunglasses. Three different pairs of glasses. I’d been asking my eye doctor about Lasik surgery, but he kept telling me it hadn’t been perfected enough to correct my vision problems yet. (Ironically, the same situation had occurred when I wanted to replace my glasses with contacts decades before.)
Finally, the doctor agreed that I was a good candidate and I had the surgery in a procedure that lasted 15 minutes, caused no discomfort, had no side effects, and left me with vision so good that I didn’t need even the drugstore reading glasses the doctor told me I might. It’s been over five years now and I haven’t worn prescription glasses of any kind since the operation (though my eyes have deteriorated in the meantime—the operation can correct the current conditions but it can’t stop your eyes from continuing to age). The operation cost several thousand dollars, which wasn’t covered by any insurance, but it was the best money I ever spent. Just waking up in the morning and not needing glasses to read the clock justifies the decision. For this alone, I can affirm that technology (for the laser device that performed the actual cutting) and science (for developing the technique of applying the laser) have benefited my life unquestionably.

Technological benefits from computers are also indisputable. Factories and offices work more efficiently, products are cheaper, retooling is easier and faster, and dangerous tasks are performed safely by non-human workers. Computer-aided design, for instance, makes possible the development of products with fewer design flaws before a prototype is even manufactured. Computer-generated models are created and tested in computer-generated environments, less costly than test laboratories and tracks, and their potential malfunctions and weaknesses detected long before metal and plastic are molded. The final design is electronically sent to the robot machine that makes the product, saving hours of expensive labor. When the time comes to change the design, the new specifications are sent to the robot, avoiding costly retooling and retraining. Though outsourcing has become an economic drawback (but not a technological one), the lost manufacturing jobs are eventually replaced with high-tech work programming, maintaining, and running the computers, and the computer glitches that occasionally plague our daily lives are becoming less and less frequent as we all become computer-literate.

I’ve written about the application of computer technology to theater and live performance (“Theater and Computers,” 5 December 2009), and that progress continues apace. (Film, which is always in the vanguard of such applications, is beginning to use performance capture—a feature of James Cameron’s Avatar—and the appearance of that tech on the live stage is probably not far away.) But once again, I can testify to the benefit computers have provided my own life. We all use ATM’s for their convenience and many of us use PDA’s and smart phones to access e-mail and the ‘Net when we’re on the move. (On my recent trip to Istanbul, many of my traveling companions were busily texting and e-mailing their friends and families back in the States from their hand-helds.) But my personal boon from computers is probably simpler and more commonplace. I’ve also written about my writing process and how I developed it (“Writing,” 9 April 2010). When I first contemplated that, I’d just started using a wordprocessor and I was very conscious of how that tool was changing how I wrote. I began to write more avidly, and sometimes more words as well, because it was fun. I couldn’t count drafts anymore, because each piece of writing was all one flexible draft that just reformed itself every time I made a change—until I decided it was finished, and sent it to the printer. Since there was no retyping every time I wanted to edit a draft, I began to do far more editing and revising. Where I’d used to give up on a change because it was too tedious to make—say, changing a small word or reversing the order of two sentences—I could push a couple of keys and I’d have exactly what I wanted, instead of what I’d used to settle for. Then conveniences like dictionaries and encyclopedias on CD became available and I had a whole reference and research library at my fingertips, obviating the need for constant trips to a library for small tasks. 
(I still go when I have major research or have saved up several smaller tasks that can’t be accomplished at home.) This was followed by the Internet and the array of reference and research sites that technology brought right to my screen. (All this eventually led to the launching of ROT, a lagniappe from the computer world that isn’t a benefit to my writing, but has become an outlet for the results. Others will have to determine whether that’s been worthwhile.)

The wordprocessor’s also been a huge asset for my teaching, as has the ‘Net. E-mail’s made communicating far easier with research clients, colleagues, students (I work by e-mail with a woman I once tutored and now help as a writing coach and editor), and, of course, friends and family. Business correspondence by e-mail is faster than by letter and more efficient and unambiguous than by telephone. We all know about the inaccuracies and even falsehoods that can be found on the Internet, but judicious use can make it a terrific research resource that saves countless hours in a library or even in my home reference collection. Small details I might once have left out, I can now verify and include because I don’t have to spend an hour or two getting to and from a library and looking for the right reference book. As a result, my writing is both more detailed and specific, and faster. No danger that the Internet poses would make me wish to go back to a typewriter and a legal pad. I may prefer a printed book to an electronic one for reading (“Books in Print,” 9 July 2010), but I see no drawback to writing on a computer.

A sidelight to computer writing that comes not from my own experience but from a friend’s is on-line publishing. I don’t mean publishing such as a blog or submission to an on-line journal, but publishing a book through an on-line service. Kirk Woodward did that with The Art of Writing Reviews, on which I commented at length on ROT (4, 8, 11, and 14 November 2009). It’s the only way I know of for a little guy like Kirk or me to get a book into print. Furthermore, Kirk can reedit the text instantly to make corrections and additions and potential readers can buy Writing Reviews on line as either a printed book or as a download. I have no idea how well the books published that way do in sales, but it’s still an accomplishment, and computers made it possible.

Communications and media are another field in which scientific and, especially, technological progress have had prominent consequences. Television has improved at both the viewing and the transmission ends with the introduction of digital and high-definition broadcasting, flat screens, green-screen technology, and satellites. (The content, however, may not seem better than 50 years ago! There’s just more of it.) Even old reliable radio has advanced with the birth of satellite transmission. In world events, we’ve seen how the fax machine brought news of the 1989 Tiananmen protest in Beijing out and sent encouragement in; two decades later, when Iran erupted after a disputed election, it was digital cameras, cell phone images, and the Internet that kept the story before the world. Perhaps the most salient—and personal—technical change in communications, however, is the cell phone, which has gone from a large, clumsy, expensive device that few but the wealthiest enthusiasts owned to the ubiquitous, nearly omnipotent, tiny object that resembles the Star Trek communicator. While we’ve all heard (and experienced) the annoyances that go along with the proliferation of cell phones on our streets (and in our restaurants, stores, theaters, and everywhere else), the convenience and, in an emergency, necessity of a mobile telephone can’t be denied. Readers of ROT may already know the recent urgent situation that prompted my mother and me finally to get our first cells; I described it in “Meeting Mom” (in “Short Takes I,” 30 March 2010). To recap briefly: I was waiting at Penn Station for my mother to arrive on a bus from Washington last Thanksgiving. She was scheduled to get in at 3 p.m., but the bus ended up 3½ hours late; she couldn’t reach me to tell me what was happening, and I couldn’t call her to ask where she was. Long story short: we now each have a cell. 
No emergency like that has occurred since (yet, halevai!), but the potential exists, hence the necessity of the cell phone in my life.

In the mid-1970’s, the oil-producing nations embargoed the source of most of the world’s energy, teaching us that relying on this eminently exhaustible product is dangerous. Now the spill in the Gulf of Mexico gives us another reason to reconsider fossil fuel, even domestically produced, as our sole energy source. For decades, scientists have been looking for new, less vulnerable energy sources. Though no replacement for hydrocarbons has yet emerged, many possibilities have. Nuclear energy, the most common next to oil and coal, has serious drawbacks, but it also has advantages. It’s efficient, cheap, and virtually inexhaustible. The radioactive waste and the danger of a leak or melt-down, as has already happened at Pennsylvania’s Three Mile Island and the former Soviet Union’s Chernobyl reactors, are very real problems. This doesn’t mean they’re insoluble, and the technology to make reactors and their waste safer is already under development. Meanwhile, scientists are working on producing power from nuclear fusion—much cleaner than the fission used in current reactors—geothermal energy, and electromagnetics. The latter is already used in Japan in trains that float on magnetic fields instead of rolling on wheels and track. The reduced friction saves wear and energy, and the high speeds will greatly reduce overland travel time. Even the old stand-by, petroleum, is being cleaned up in experiments with so-called gasohol, in which alcohol is added to the gasoline to make it burn cleaner and pollute less. Ethanol, too, has been examined as an alternative, and though the technology isn’t perfected, the prospect is still encouraging. The resulting more efficient engines and cleaner air are doubtlessly beneficial to all of us, and electric and hybrid engines are gaining popularity as that technology is improved and made less expensive. Buses that run on natural gas are appearing in cities all over the country as well.

Scientific advances can be a double-edged sword, however. In the realm of forensics, we’ve seen how scientific and technological progress has made detection and prevention of crime easier and faster, and solving crime and assigning blame have become more certain because of advances in science. DNA alone must be responsible for the convictions of hundreds of criminals across the country—as well as the exoneration of wrongly convicted defendants from the past. (A friend who’s a criminal defense attorney is very active in the Innocence Project, which has effected the release of hundreds of innocent prisoners.) But the trade-off, as I experienced it in court, is that we civilians have come to expect a level of certainty that probably can’t be met in real life. I call this the CSI Syndrome because that TV show and its like have popularized the notion that science and scientists can solve any crime. When I was on the jury of an assault trial, we deliberated for several hours and ultimately found the accused, a man charged with the attempted rape of a woman who rented a room from him, not guilty. The problem, we determined, was that there wasn’t enough evidence to convince us beyond the mandated reasonable doubt that the man had actually tried to rape the woman. We knew something had happened and that the woman had been a victim, and we suspected that the charge was true, but we couldn’t convict the defendant of such a crime solely on the testimony of the woman (who had credibility problems herself). There wasn’t any forensic evidence of an attempted rape. There was no medical evidence (the woman hadn’t gone to the hospital until after she’d reported the attack, which was some time after it was supposed to have occurred) to confirm a failed sexual assault. (There was also no corroborating testimony from other witnesses.) 
The accused hadn’t been charged with simple assault, a charge on which we could have convicted without a problem (the prosecutor explained that the DA’s office had decided that an assault charge carried too light a penalty), so we had no alternative but to acquit the man. But, I wondered afterwards, Were we putting too much reliance on criminal science? Has the CSI Syndrome convinced us non-scientists, non-criminalists that there must always be forensic proof, that there always is forensic proof and if there’s not, then conviction is impossible? Have we been taught, apparently falsely, to expect, to trust in, something that’s not really possible?

One question must be asked now, of course: Was my one experience at all typical of other juries of lay people across the country? If it was, then court cases and forensic science may be a metaphor for society’s general attitude toward science and technology: we raise it up on a pedestal, attributing to it capabilities it doesn’t have, and when science fails to meet the impossible standards we’ve set, we tear down the pedestal and cast the whole discipline into the trash bin of cultural authority. Meanwhile, of course, real forensic scientists and technicians continue to do their valuable work and to push the envelope of their field to new and useful accomplishments.

Yes, there are disadvantages to scientific developments, and some new technologies may seem frivolous now. The new computer-generated environment called virtual reality is little more than an expensive toy right now, but its potential for real benefits is unmistakable. Television was a rich man’s plaything in the 1940’s and many intellectuals dubbed it a “vast wasteland” poised to rot our brains. Its power as a means of global, even interplanetary communications, however, has been demonstrated time and time again. From the first TV war in Vietnam to the landing on the moon in July 1969, this electronic gadget has proved its value, however badly it’s sometimes used. The same is obviously true of inventions in fields like space (miniaturization), transportation (supersonic jets, bullet trains), the military (nightscopes), and communications and the media (cell and satellite phones). Clearly, science has made our lives better—as long as we know how to use what it gives us. And don’t expect too much from its gifts. Despite the horrors of apocalyptic sci-fi tales, the true dangers of technology and science are on the human side.