[After the explosion on the BP drilling platform in the Gulf of Mexico last April, reporters, commentators, analysts, and ordinary people demonstrated two conflicting—but often simultaneously held—opinions about science and technology. On the one hand, they assumed, even demanded, that science solve the problem, and they became impatient, even angry, when it didn’t do so immediately and completely. Science, in the minds of many Americans, is credited with the capacity to solve any problem whatsoever. On the other hand, they expressed disillusionment with science and technology, which they saw as the cause of such catastrophes as the Gulf oil spill, global warming, nuclear contamination like that from the Chernobyl and Three Mile Island melt-downs, and the Challenger and Columbia disasters. Global warming aside, science and technology gave us the devices behind those catastrophes, the devices failed, and science hasn’t been able to stop or fix the damage. So we expect an almost magical power from science, our servant, and at the same time we believe science is flawed and inadequate and poised to become our master. Clearly, neither of those propositions is accurate—or not wholly accurate. I’m going to examine the issue from a very personal perspective and see what benefit, if any, science and technology have provided in my own life.
[In a 1992 article called “Science’s Big Shift” in Time magazine, Dick Thompson wrote: “The public hears that we’re No. 1 in science, and they want to know why that fact isn’t making our lives better. The one thing that works in this country doesn’t seem to be paying off.” I was teaching a writing course at the time, so I quoted this statement on an essay exam and asked my students to address the idea that we don’t see science making our lives better. I also decided to answer the question myself as a model for an essay-exam response. I’ve based the essay below on that response.]
Science is the most visible indicator of how far humans have come from stone tools and wooden wheels. There is some question, however, whether scientific developments are actually making our lives better. While some perceive no betterment, even evils, from modern science and technology, the indisputable benefits in such fields as medicine, computers and robotics, and energy demonstrate that science makes our lives easier, healthier, safer, and more productive.
Medicine and health are fields plainly affected by new science and technology. Advances in organ transplants, immune-suppressant drugs, diagnostic tools like MRI’s and CAT scans, and new medicines and therapies mean that illnesses we might once have inevitably died from, like heart disease and cancer, are now often treatable, even curable, and that others we couldn’t even diagnose until an autopsy, like tumors or weak blood vessels deep in the brain, can be found and treated. If medical technology has outstripped our ethical and legal capacities, that’s hardly a reason to disparage the usefulness of the scientific advances; it’s a reason to spur society and the law to catch up.
I can personally attest to at least one benefit of medical science and technology in my own life. Like most men my age, I’ve suffered the indignity of prostate examinations and colonoscopies, procedures that have come along only in recent decades to detect cancers that kill millions of men—but killed many more in the past because they weren’t detectable until much later in their progress. Those are common experiences for men (and women, too, in the case of colon exams) of a certain age, as they used to say. But somewhat rarer is the operation I had some years ago that relieved me of an encumbrance with which I’d been saddled since I was about 12. I’m very astigmatic, and I’ve had to wear glasses for over half a century. (I wore contacts for several years in the middle of that period, but they’re still lenses, prostheses.) I hated wearing glasses—not out of vanity (I’m not so good-looking to begin with that lenses were an additional disadvantage), but because as my eyes got worse, the lenses got thicker and heavier, and I couldn’t go anywhere without them. I always had to go about with my glasses, and usually a pair of prescription sunglasses, too. If I traveled, I had to take an extra pair. Eventually, I got bifocals, which drove me nuts; I took to carrying reading glasses for work in a library or similar environment (because doing a lot of reading with the bifocals gave me headaches) and the bifocals for street use, plus the sunglasses. Three different pairs of glasses. I’d been asking my eye doctor about Lasik surgery, but he kept telling me it hadn’t been perfected enough to correct my vision problems yet. (Ironically, the same situation had occurred when I wanted to replace my glasses with contacts decades before.)
Finally the doctor agreed that I was a good candidate, and I had the surgery in a procedure that lasted 15 minutes, caused no discomfort, had no side effects, and left me with vision so good that I didn’t even need the drugstore reading glasses the doctor told me I might. It’s been over five years now, and I haven’t worn prescription glasses of any kind since the operation (though my eyes have deteriorated in the meantime—the operation can correct the current conditions, but it can’t stop your eyes from continuing to age). The operation cost several thousand dollars, which wasn’t covered by any insurance, but it was the best money I ever spent. Just waking up in the morning and not needing glasses to read the clock justifies the decision. For this alone, I can affirm that technology (the laser device that performed the actual cutting) and science (the development of the technique of applying the laser) have benefited my life unquestionably.
Technological benefits from computers are also indisputable. Factories and offices work more efficiently, products are cheaper, retooling is easier and faster, and dangerous tasks are performed safely by non-human workers. Computer-aided design, for instance, makes possible the development of products with fewer design flaws before a prototype is even manufactured. Computer-generated models are created and tested in computer-generated environments, less costly than test laboratories and tracks, and their potential malfunctions and weaknesses detected long before metal and plastic are molded. The final design is electronically sent to the robot machine that makes the product, saving hours of expensive labor. When the time comes to change the design, the new specifications are sent to the robot, avoiding costly retooling and retraining. Though outsourcing has become an economic drawback (but not a technological one), the lost manufacturing jobs are eventually replaced with high-tech work programming, maintaining, and running the computers, and the computer glitches that occasionally plague our daily lives are becoming less and less frequent as we all become computer-literate.
I’ve written about the application of computer technology to theater and live performance (“Theater and Computers,” 5 December 2009), and that progress continues apace. (Film, which is always in the vanguard of such applications, is beginning to use performance capture—a feature of James Cameron’s Avatar—and the appearance of that tech on the live stage is probably not far away.) But once again, I can testify to the benefit computers have provided in my own life. We all use ATM’s for their convenience, and many of us use PDA’s and smart phones to access e-mail and the ’Net when we’re on the move. (On my recent trip to Istanbul, many of my traveling companions were busily texting and e-mailing their friends and families back in the States from their hand-helds.) But my personal boon from computers is probably simpler and more commonplace. I’ve also written about my writing process and how I developed it (“Writing,” 9 April 2010). When I first contemplated that process, I’d just started using a wordprocessor, and I was very conscious of how that tool was changing how I wrote. I began to write more avidly, and often at greater length, because it was fun. I couldn’t count drafts anymore, because each piece of writing was one flexible draft that just reformed itself every time I made a change—until I decided it was finished and sent it to the printer. Since there was no retyping every time I wanted to edit a draft, I began to do far more editing and revising. Where I used to give up on a change because it was too tedious to make—say, changing a small word or reversing the order of two sentences—I could push a couple of keys and have exactly what I wanted, instead of what I used to settle for. Then conveniences like dictionaries and encyclopedias on CD became available, and I had a whole reference and research library at my fingertips, obviating the need for constant trips to a library for small tasks.
(I still go when I have major research or have saved up several smaller tasks that can’t be accomplished at home.) This was followed by the Internet and the array of reference and research sites that technology brought right to my screen. (All this eventually led to the launching of ROT, a lagniappe from the computer world that isn’t a benefit to my writing, but has become an outlet for the results. Others will have to determine whether that’s been worthwhile.)
The wordprocessor’s also been a huge asset for my teaching, as has the ’Net. E-mail has made communicating with research clients, colleagues, students (I work by e-mail with a woman I tutored for a while and now help as a writing coach and editor), and, of course, friends and family faster and easier. Business correspondence by e-mail is quicker than by letter and more efficient and unambiguous than by telephone. We all know about the inaccuracies and even falsehoods that can be found on the Internet, but judicious use can make it a terrific research resource that saves countless hours in a library or even in my home reference collection. Small details I might once have just left out, I can now verify and include because I don’t have to spend an hour or two getting to and from a library and looking for the right reference book. As a result, my writing is both more detailed and specific, and faster. No danger the Internet poses would make me wish to go back to a typewriter and a legal pad. I may prefer a printed book to an electronic one for reading (“Books in Print,” 9 July 2010), but I see no drawback to writing on a computer.
A sidelight to computer writing that comes not from my own experience but from a friend’s is on-line publishing. I don’t mean publishing such as a blog or submission to an on-line journal, but publishing a book through an on-line service. Kirk Woodward did that with The Art of Writing Reviews, on which I commented at length on ROT (4, 8, 11, and 14 November 2009). It’s the only way I know of for a little guy like Kirk or me to get a book into print. Furthermore, Kirk can reedit the text instantly to make corrections and additions, and potential readers can buy Writing Reviews on line as either a printed book or a download. I have no idea how well books published that way sell, but it’s still an accomplishment, and computers made it possible.
Communications and media are another field in which scientific and, especially, technological progress has had prominent consequences. Television has improved at both the viewing and the transmission ends with the introduction of digital and high-definition broadcasting, flat screens, green-screen technology, and satellites. (The content, however, may not seem better than 50 years ago! There’s just more of it.) Even old reliable radio has advanced with the birth of satellite transmission. In world events, we’ve seen how the fax machine brought news of the 1989 Tiananmen protest out of Beijing and sent encouragement back in, and then, two decades later, when Iran erupted after a disputed election, it was digital cameras, cell-phone images, and the Internet that kept the story before the world. Perhaps the most salient—and personal—technical change in communications, however, is the cell phone, which has gone from a large, clumsy, expensive device that few but the wealthiest enthusiasts owned to the ubiquitous, nearly omnipotent, tiny object that resembles the Star Trek communicator. While we’ve all heard (and experienced) the annoyances that go along with the proliferation of cell phones on our streets (and in our restaurants, stores, theaters, and everywhere else), the convenience and, in an emergency, the necessity of a mobile telephone can’t be denied. Readers of ROT may already know the recent urgent situation that prompted my mother and me finally to get our first cells; I described it in “Meeting Mom” (in “Short Takes I,” 30 March 2010). To recap briefly: I was waiting at Penn Station last Thanksgiving for my mother to arrive on a bus from Washington. She was scheduled to get in at 3 p.m., but the bus ended up 3½ hours late, and she couldn’t reach me to tell me what was happening, nor could I call her to ask where she was. Long story short: we now each have a cell.
No emergency like that has occurred since (yet, halevai!), but the potential exists, hence the necessity of the cell phone in my life.
In the mid-1970’s, the oil-producing nations embargoed the source of most of the world’s energy, teaching us that relying on this eminently exhaustible product is dangerous. Now the spill in the Gulf of Mexico gives us another reason to reconsider fossil fuel, even domestically produced, as our sole energy source. For decades, scientists have been looking for new, less vulnerable energy sources. Though no replacement for hydrocarbons has yet emerged, many possibilities have. Nuclear energy, the most common next to oil and coal, has serious drawbacks, but it also has advantages: it’s efficient, cheap, and virtually inexhaustible. The radioactive waste and the danger of a leak or melt-down, as has already happened at Pennsylvania’s Three Mile Island and the former Soviet Union’s Chernobyl reactors, are very real problems. That doesn’t mean they’re insoluble, and the technology to make reactors and their waste safer is already under development. Meanwhile, scientists are working on producing power from nuclear fusion—much cleaner than the fission used in current reactors—as well as geothermal energy and electromagnetics. The last is already used in Japan in maglev trains that float on magnetic fields instead of rolling on wheels and track. The reduced friction saves wear and energy, and the high speeds will greatly reduce overland travel time. Even the old stand-by, petroleum, is being cleaned up in experiments with so-called gasohol, in which alcohol is added to the gasoline to make it burn cleaner and pollute less. Ethanol, too, has been examined as an alternative, and though it’s not perfected, the prospect is still encouraging. The resulting more efficient engines and cleaner air are doubtless beneficial to all of us, and electric and hybrid engines are gaining popularity as the technology is improved and made less expensive. Buses that run on natural gas are appearing in cities all over the country as well.
Scientific advances can be a double-edged sword, however. In the realm of forensics, we’ve seen how scientific and technological progress has made the detection and prevention of crime easier and faster, and solving crimes and assigning blame more certain. DNA evidence alone must be responsible for the convictions of hundreds of criminals across the country—as well as the exoneration of wrongly convicted defendants from the past. (A friend who’s a criminal defense attorney is very active in the Innocence Project, which has effected the release of hundreds of innocent prisoners.) But the trade-off, as I experienced it in a courtroom, is that we civilians have come to expect a level of certainty that probably can’t be met in real life. I call this the CSI Syndrome because that TV show and its like have popularized the notion that science and scientists can solve any crime. When I was on the jury of an assault trial, we deliberated for several hours and ultimately found the accused, a man charged with the attempted rape of a woman who rented a room from him, not guilty. The problem, we determined, was that there wasn’t enough evidence to convince us beyond a reasonable doubt that the man had actually tried to rape the woman. We knew something had happened and that the woman had been a victim, and we suspected that the charge was true, but we couldn’t convict the defendant of such a crime solely on the testimony of the woman (who had credibility problems herself). There wasn’t any forensic evidence of an attempted rape. There was no medical evidence (the woman hadn’t gone to the hospital until after she’d reported the attack, which was some time after it was supposed to have occurred) to confirm a failed sexual assault. (There was also no corroborating testimony from other witnesses.)
The accused hadn’t been charged with simple assault, a charge on which we could have convicted without a problem (the prosecutor explained that the DA’s office had decided that an assault charge carried too light a penalty), so we had no alternative but to acquit the man. But I wondered afterwards: were we putting too much reliance on criminal science? Has the CSI Syndrome convinced us non-scientists, non-criminalists that there must always be forensic proof, that there always is forensic proof, and that if there’s not, then conviction is impossible? Have we been taught, apparently falsely, to expect, to trust in, something that’s not really possible?
One question must be asked now, of course: Was my one experience at all typical of other juries of lay people across the country? If it was, then court cases and forensic science may be a metaphor for society’s general attitude toward science and technology: we raise it up on a pedestal, attributing to it capabilities it doesn’t have, and when science fails to meet the impossible standards we’ve set, we tear down the pedestal and cast the whole discipline into the trash bin of cultural authority. Meanwhile, of course, real forensic scientists and technicians continue to do their valuable work and to push the envelope of their field to new and useful accomplishments.
Yes, there are disadvantages to scientific developments, and some new technologies may seem frivolous now. The computer-generated environment called virtual reality is little more than an expensive toy at the moment, but its potential for real benefits is unmistakable. Television was a rich man’s plaything in the 1940’s, and many intellectuals, echoing FCC chairman Newton Minow’s famous 1961 phrase, dubbed it a “vast wasteland” poised to rot our brains. Its power as a means of global, even interplanetary, communication, however, has been demonstrated time and time again. From the first TV war in Vietnam to the landing on the moon in July 1969, this electronic gadget has proved its value, however badly it’s sometimes used. The same is obviously true of inventions in fields like space (miniaturization), transportation (supersonic jets, bullet trains), the military (nightscopes), and communications and the media (cell and satellite phones). Clearly, science has made our lives better—as long as we know how to use what it gives us and don’t expect too much from its gifts. Despite the horrors of apocalyptic sci-fi tales, the true dangers of technology and science are on the human side.