Who Should Read Future Hype?

  • Business technology adopters who need technology for competitive advantage but can't afford to back the wrong horse. (Did you know that businesses worldwide spend almost three trillion dollars every year on information technology alone?)
  • Consumers who want to better understand technology change to guide their purchases.
  • Teachers and school administrators who want to spend scarce resources wisely, knowing that money spent on computers is money not spent on other important needs.
  • Anyone else who wants to more clearly see how technology affects us now and understand how it will change in the future.

"A wise and clear-eyed book, Future Hype challenges the conventional wisdom about technological change and provides a fresh perspective on our so-called computer age."

-- Nicholas G. Carr, author of Does IT Matter?


We're not the first generation to feel overwhelmed by technology, not the first to rearrange our lives to accommodate technology, and not the first to ask ourselves if technology's good outweighs its bad. Generations past have dealt with disruptions every bit as challenging and exciting as our own. Our awe of today's technology isn't unique--it isn't even particularly substantial.

Today, progress is quick in a few areas and slower in all the rest, as it has been for centuries. The PC and Internet are indeed unprecedented--just like every other major technology before them. The clumsy exponential-growth model must be replaced by a more accurate paradigm.

Three decades ago, Alvin Toffler's Future Shock created a sensation with its portrayal of technology spinning out of society's control. Future Hype approaches the same topic but arrives at a very different conclusion: that the popular view of technology change is wrong and that the future won't be so shocking.

Come see how technology change really works and how to better evaluate it, anticipate it, and control it.

"Future Hype takes us on a technological rollercoaster over a landscape of exaggerated promises and failed dreams. Required reading for journalists, teachers, business managers and, well, everybody else."

-- A. K. Dewdney, author of Beyond Reason and Yes, We Have No Neutrons


Excerpts

A selection of excerpts from Future Hype is included below. Download these excerpts (PDF format):

Introduction: Leveling the Exponential Curve

The further backward you look,
the further forward you can see.
-- Winston Churchill

The game of chess originated in India some 1,400 years ago. Legend says that the local ruler was so delighted by the game that he offered its inventor the reward of his choice. The inventor's request was defined by the game board itself: a single grain of rice for the first chess square, two for the next, four for the next, and so on, doubling with each square through all 64. Unaccustomed to this kind of sequence, the ruler granted this seemingly trivial request. But though the initial numbers are small, the amount builds quickly. The rice begins to be measured in cups by square 14, sacks by square 20, and tons by square 26. The total comes to about 300 billion tons--more rice than has been harvested in the history of humanity.
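For readers who want to check the arithmetic, here is a minimal sketch in Python. The weight of a grain of rice (about 0.02 grams) is an assumed figure, not from the text, so the milestones shift a little with it.

    # Chessboard rice arithmetic: grains double on each successive square.
    GRAIN_GRAMS = 0.02                       # assumed weight of one grain of rice
    total_grains = 0
    for square in range(1, 65):
        total_grains += 2 ** (square - 1)    # grains placed on this square
        if square in (14, 20, 26, 64):       # milestones mentioned in the text
            kg = total_grains * GRAIN_GRAMS / 1000
            print(f"through square {square}: {total_grains:,} grains (~{kg:.3g} kg)")
    # Square 64 works out to roughly 3.7e14 kg--several hundred billion metric tons.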

Like the king in the chess story, most of us are inexperienced with repeated doublings such as these. Let's look at a present-day example. In 1971, Intel introduced the 4004, its first microprocessor, with a performance of 0.06 MIPS (million instructions per second). Intel's Pentium Pro was introduced in 1995 with 300 MIPS, a 5000-fold performance increase in 24 years--about one doubling every two years. A car making the same speed increase would now have a top speed of about Mach 700. Give it another 24 years at the same rate of increase, and its top speed would exceed the speed of light.
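The doubling arithmetic is easy to verify. Here is a quick sketch; the 120 mph starting speed for the car is an assumed round number (the text doesn't give one), so the Mach figure is approximate.

    import math

    speedup = 300 / 0.06                   # Pentium Pro vs. 4004: a 5,000-fold increase
    doublings = math.log2(speedup)         # about 12.3 doublings
    years_per_doubling = 24 / doublings    # about two years per doubling
    car_mph = 120 * speedup                # a 120 mph car scaled by the same factor
    mach = car_mph / 767                   # Mach 1 is roughly 767 mph at sea level
    print(f"{speedup:.0f}x in 24 years = {doublings:.1f} doublings, one every {years_per_doubling:.1f} years")
    print(f"the scaled-up car: {car_mph:,.0f} mph, about Mach {mach:.0f}")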

Moore's Law, named after Intel cofounder Gordon Moore, predicts this exponential rise in computer performance. Every two years, microprocessor speed doubles. Again. This law has been startlingly accurate for three decades, unlike many other extrapolations, and the progress it predicts is expected to continue, at least for the near future.

There is no precedent for this pace of performance improvement, which is why we view computers and their rapid change with wonder.

My own career of 25 years as an engineer has been tied to the effects of Moore's Law, both as a digital hardware designer and as a programmer and software architect. Ever since high school in the 1970s, I've been immersed in computer technology and have been an energetic cheerleader of technology in general. I was in awe of the change computers caused and was delighted to be a small part of that change. Change was exciting. And it was all around us--I grew up with the space program and jumbo jets, nuclear power and skyscrapers, Future Shock and Megatrends. Exponential change seemed to be everywhere we looked.

To make sure we're all clear on what exponential change looks like, this chart contrasts no change and linear change with exponential change. The vertical axis is unlabeled--it could represent transistors if we're measuring microprocessors, dollars for compound interest, the number of bacteria in a Petri dish, or the grains of rice in the chess story. While they may start out slowly, exponential curves eventually snowball.
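Since the chart itself isn't reproduced here, a small numeric sketch of the three curves it contrasts may help; the units and number of steps are arbitrary, just as the chart's axis is unlabeled.

    # No change vs. linear change vs. exponential change, step by step.
    for t in range(11):
        constant = 1               # no change
        linear = 1 + t             # linear change: add a fixed amount each step
        exponential = 2 ** t       # exponential change: double each step
        print(f"step {t:2d}: constant={constant:4d}  linear={linear:4d}  exponential={exponential:5d}")
    # The exponential column keeps pace with the others at first, then snowballs.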

As I gained experience, I came to realize that change for its own sake wasn't as fun or desirable for the software user as imagined by the software developer. Curiously, users wanted new software to answer to bottom-line needs. Who would have guessed? Coolness alone wasn't enough--users demanded that software pull its weight, as they would for any other purchase.

They were right, of course. New software must provide sufficient additional benefits to outweigh the cost and aggravation of adopting it. This is also true for other consumer products. The consumer might think: I like that digital camera, but it uses a new type of memory card. Will it become a standard or an unsupported dead end, like so many other products? Should I make MP3 copies of my favorite songs or keep them on CD? Is HDTV really here, or is the current hype another false alarm? In general, is the latest hot product something that will last, or is it just a fad? The early adopters are quick to make this leap, but the chasm must be narrowed considerably for the majority of us. Change for its own sake wasn't as delightful as I'd thought, and I came to see things more from the user's perspective.

The high failure rate of new products challenges the inevitability of exponential change. A bigger challenge came as I studied high-tech products from the past, looking for precedents against which to compare my own projects. Why were these old products successful, and how could I apply their lessons to my own work? Then something unexpected happened. As I learned more about the history of technology, examples emerged that the exponential model could not explain. I gradually realized that there was a different way--a more accurate way--to look at technology change.

The exponential model as a universal explanation for and predictor of technology change is at best an approximation and at worst a delusion. We can support it only by selecting just the right examples and ignoring the rest. Technology does not always continuously improve. Commercial airplane speeds, for example, rose steadily until airlines realized that expensive supersonic travel didn't make business sense. Highway speed limits rose steadily but also hit a ceiling. Record building heights increased rapidly during the first third of the twentieth century but have increased only moderately since then. Use of nuclear power has peaked, and manned space exploration halted after we reached the moon.

Different areas of technology advance at different rates and come to the fore at different times. Cathedral building emerged during the 1200s while other technologies languished. Printing created dramatic change in the late 1400s. It surged again in the early 1800s as mechanized presses provided cheap books and magazines. Steam power and mills had their heyday; later, it was electricity and electrical devices. There are dozens of examples of a technology surging forward and then maturing and fading back into the commonplace.

Perhaps the most venerable use of the exponential model has been to represent world population growth. But even here the model is an imperfect fit. In the 1960s and '70s, experts warned that world population was growing exponentially and the crowding was quickly getting worse. Famine was just around the corner. The exponential model was a dramatic way to make a point but an inaccurate one. World population growth is slowing, the population is expected to peak around mid-century, and dozens of countries are already shrinking (ignoring immigration).

Despite the common perception, the impact of technology on society today is comparatively gentle. To see a truly serious example of the collision of technology and society, look at Britain during the Industrial Revolution almost two centuries ago. In 1811, armed gangs of Luddites smashed the textile machines that displaced their handmade crafts. Several years and over 10,000 men were required to put down the rebellion. The unrest spread to the Continent, where the word "sabotage" was coined. It comes from the French word sabot, the name for the wooden shoes used by workers to smash or jam machines. In the space of a generation, independent work on farms had given way to long six-day weeks in noisy and dangerous factories. Our own technology growing pains seem minor by comparison.

It's easy to focus on the recent at the expense of the old. But doing so leads to a distorted view of modern technology. New products loom disproportionately large in our minds, simply because they're new. The image of Americans a few generations ago living quiet, static lives is fiction. Societies of the past dealt with disruptions from technology every bit as challenging and exciting as our own: the telegraph and electricity, the car and railroad, anesthesia and vaccines, concrete and steel, newspapers and mail. And then we have the fundamental developments of antiquity on which society is based: agriculture, metallurgy, the beginnings of engineering, writing, textiles, transportation, timekeeping, basic tools and weapons, and so on. Are today's products really so amazing compared to those on which they were built? Too often we mistake a new technology for an important one.

Part of the problem is a narrow definition of technology. Obviously, the Internet, computer, and cell phone fit into this category. These are in the news and in our awareness. But this book will use a very broad definition of technology, including these new technologies as well as the older and less glamorous ones mentioned above. Metallurgy, textiles, and all the rest were high tech at one point, and they are still important to society. Examples from these older technologies will be used liberally in this book to illustrate that today's issues with technology have been around for a long time.

The prevailing view of reality is often an oversimplification. For example, a simple rule often taught to small children is "All ocean creatures are fish." Though incomplete, it's a step in the right direction. When the children are a little older, we might teach them, "All ocean creatures are fish--except whales and dolphins." When they are older still, we teach them "All ocean creatures are fish except marine mammals (like whales and dolphins), crustaceans (like crabs and lobsters), bivalves (like oysters and scallops), cephalopods (like nautilus and squid), ..." and so on.

We frequently hear that today's technology change is unprecedented. But like the fish simplification for children, this tells far less than the whole story. The characterization that today's change is unprecedented is easy to understand and helps explain some of what we see, but it's inaccurate--and dangerously so. You have outgrown the children's version and are ready for a grown-up look at technology.

Leave behind the hype. Come and see how technology has really affected society and how it will impact us in the future.

We live in a technology-dense world.
...We are terrifyingly naked without knowing elementary things
about how [technologies] work.
-- John Lienhard, The Engines of Our Ingenuity (2000)


 

Technology Good and Bad

Humankind is either on its way to the stars
or hurtling out of a high-rise window to the street
and mumbling, "So far, so good."
-- Edward Tenner, Why Things Bite Back (1996)

An ancient Chinese story tells of a farmer who owned a famous racehorse. One day, the horse ran away. His friends commiserated with him, but the farmer replied, "This isn't necessarily a bad thing." Soon, his horse returned and brought another fine-looking horse. His friends congratulated him, but the farmer observed, "This isn't necessarily a good thing." Later, the farmer's son was thrown while trying to tame the new horse. He broke his leg, which left him lame. The farmer's friends offered condolences, but he responded, "This isn't necessarily a bad thing." Sure enough, war broke out, and the son's lameness prevented him from being conscripted. Though many neighbors' sons were killed in the fighting, the farmer's son was spared. Sometimes it's hard to tell what's good and what's bad.

But perhaps we can be certain in some cases. For example, we can all agree that the insecticide DDT is bad. The landmark book Silent Spring made DDT's environmental crimes common knowledge in 1962. And yet DDT's discoverer won a Nobel Prize for his work in 1948, just six years after its properties were understood, and it was credited with saving five million lives by 1950. In the 1950s and '60s, DDT cut the malaria incidence in India to 15,000 cases per year, down from one hundred million. Given this remarkable progress, worldwide eradication of malaria seemed a strong possibility. Though the problems of resistance, environmental damage, and impact on human health were beginning to be understood, abandoning this insecticide was not the obvious course. Malaria kills millions of people per year even today, and DDT is still used in countries holding almost half of the world's population, including China, India, and Mexico. So, what's the moral? Is DDT a killer or a lifesaver? We could ask the same about antibiotics and vaccines--they mercifully saved lives and yet threatened widespread famine by encouraging dramatic overpopulation.

Kranzberg's First Law helps to clarify this situation: Technology is neither good nor bad--nor is it neutral. At the risk of spoiling its Zenlike nature, let me propose an interpretation: a technology isn't inherently good or bad, but it will have an impact, which is why it's not neutral. Almost every applied technology has a good side and a bad side. When you think of transportation technologies, do you think of how they enable a delightful vacation or get the family back together during the holidays--or do you think of traffic jams and pollution? Are books a source of wisdom and spirituality or a way to distribute pornography and hate? Do you applaud medical technology for curing plagues or deplore transportation technology for spreading them? Does encrypted e-mail keep honest people safe from criminals or criminals safe from the police? Are plastics durable conveniences or everlasting pollutants? Counterfeiting comes with money, obscene phone calls come with the telephone, spam comes with e-mail, and pornography comes with the Internet. Every law creates an outlaw.

Opposites create each other. You can't have an up without a down. You can't have a magnetic north pole without a south pole, or a yin without its opposite yang. And providing a technological good opens the door for the bad. Wernher von Braun observed, "Science does not have a moral dimension. It is like a knife. If you give it to a surgeon or a murderer, each will use it differently." The same could be said for applications of technology.

The dilemma of finding and maximizing technology's gifts while minimizing its harm is an important one today. It has also plagued society for centuries. Today we worry about junk on the Internet; yesterday we worried about junk on TV (and before that, junk through radio and film and books and newspapers). Today we worry about terrorists using bioengineering techniques to make new diseases; yesterday we worried about the telegraph and railroad being used to conduct the Civil War. Today, computer pioneer Bill Joy has argued that, because the potential downside of an accident is so great, we should deliberately avoid certain areas of research; yesterday, Leonardo da Vinci destroyed plans for devices like the submarine, anticipating their use as weapons.


 

Unexpected Consequences of the Internet

 

In adversity, everything that surrounds you is a kind of medicine
that helps you refine your conduct,
yet you are unaware of it.
In pleasant situations, you are faced with weapons that will tear you apart,
yet you do not realize it.
-- Huanchu Daoren, Taoist philosopher (c. 1600)

One of the greatest uses of the World Wide Web is as a universal medium of communication. Getting a message out through the Web is cheaper than through a TV station, can reach a wider audience than can a local newspaper, and is faster than printing a book. "The Cluetrain Manifesto" (1999) raved: "We embrace the Web not knowing what it is, but hoping that it will burn the org chart--if not the organization--down to the ground. Released from the gray-flannel handcuffs, we say anything, curse like sailors, rhyme like bad poets, flame against our own values, just for the pure delight of having a voice. And when the thrill of hearing ourselves speak again wears off, we will begin to build a new world. That is what the Web is for." A little overheated, perhaps, but this is a popular view of the invigorating freedom the Web provides.

While the lack of constraint may indeed thrill the author, it is a huge downside for the reader. The Web is perhaps the least reliable source of information available. With no editor or standard for most Web content, the reader is forced to burrow through this information landfill and separate the useful and accurate content from the misinformation, ads, porn, and irrelevant drivel. Information that can find no outlet other than the Web is third tier, and the writer's enthusiasm for seeing it in print is rarely shared by the reader. There are reliable sources here, of course, but their reputations were usually made in the print or broadcast worlds (consider the Encyclopedia Britannica, the New York Times, NBC News, and so on). The paradox is that the Web's publishing strength is also its weakness.

In e-mail we find a similar paradox. It's easy to pass along useful e-mail but just as easy to pass along junk. This has given us inboxes full of spam, not-very-relevant business correspondence sent FYI, and jokes and other nonsense. For example, e-mail chain letters are common. You've probably heard about the Neiman Marcus cookie recipe ("they overcharged me for this recipe so I'm taking revenge by making it public--pass it on"), an appeal with an impossible claim ("Bill Gates will send three cents to this charity every time you forward this e-mail--pass it on"), or a warning against some computer virus ("this is important news--pass it on"). Whether it's greed, an emotional appeal, or a chance to be a good citizen, these chain letters must infect you with a reason to send them along. There's probably no real computer virus behind that warning e-mail--the e-mail itself is the virus, and you are the intended host.

Even when the facts are accurate, an e-mail appeal can be impossible to rein in. In 1989 a boy with cancer had a dying wish to set the Guinness record for the most get-well cards. He broke the record, he's quite well now, and, after 15 years and 350 million cards, he'd really, really like people to stop sending them.

Through the Internet, we can read news only from targeted news sources, avoiding stories we don't care about. But do we lose something important? In the same way that avoiding exposure to germs can lead to a weaker immune system, avoiding exposure to a random array of news can leave us poorly informed. Only by scanning unfiltered headlines do we find the small fraction of serendipitous and useful stories we would have missed otherwise.

The Web was invented to allow easy international collaboration among scientists, and yet here we find a similar problem. The bandwagon effect is stronger when ideas can be debated and a consensus reached earlier in the scientific process. Fringe ideas are dropped quickly when they can be aired more easily. This may often be more efficient, but it can head off surprising discoveries that the slower, old-fashioned approach might have produced.


 

High Tech Myth #6: Products are Adopted Faster

 

Future Hype analyzes and debunks the High Tech Myths, nine fashionable but deceptive explanations for how technology works today. This is Myth #6.

----------------

Web time [is] seven times faster than normal time.
-- Cluetrain Manifesto (1999)

Another popular argument states that products are reaching us increasingly quickly. The US Department of Commerce outlined the argument this way: "Radio was in existence 38 years before 50 million people tuned in; TV took 13 years to reach that benchmark. Sixteen years after the first PC kit came out, 50 million people were using one. Once it was opened to the general public, the Internet crossed that line in four years."

This is hardly a fair comparison. Fifty million people were half the US population when radio was introduced, but only 20 percent when the Web started. The "once it was opened to the general public" caveat for the Internet is also important. The Internet began in 1969. This means that 22 years of money and research from the government and universities nurtured it before it was opened to the public in 1991. And even at its starting point in 1969, the Internet wasn't built from scratch, like radio or the telegraph, but was built on the infrastructure and experience of the telephone industry.
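To see why the 50-million benchmark is a moving target, consider this quick sketch. The US population figures (roughly 106 million around 1920 and 253 million in 1991) are assumed round numbers, not figures from the text.

    benchmark = 50_000_000
    us_population = {
        "when radio arrived (c. 1920)": 106_000_000,   # assumed round figure
        "when the Web started (1991)": 253_000_000,    # assumed round figure
    }
    for era, pop in us_population.items():
        print(f"{era}: 50 million people = {benchmark / pop:.0%} of the US population")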

This is rather like a bamboo plant that builds its root infrastructure for years and then bursts forth with a new shoot that grows a foot or more per day. It can be said that the bamboo grows to full height in a month, but that ignores the years of preparation that made it possible. Not only was the Internet nurtured for decades before the Web was introduced, but by the time it was opened to the public, the home PC industry was already well established. From the consumer standpoint, the Internet was born with the technological equivalent of a silver spoon in its mouth.

This quote could be more honestly written as follows.

 The first radio broadcast was in 1906. About 23 years later, radio was mature enough for consumer use and receivers were in 2% of American households. Radio was in 50% of households in seven more years. Television, invented two decades later, had a similar progression: 24 years to reach 2% penetration and six more years to reach 50%.
 If we take the year the first microprocessor was built (1971) as the start of the PC industry, it took little more than a decade to reach 2%. Its gestation was much faster than that of radio and TV because the PC did not need as much infrastructure. Nevertheless, it took almost two more decades for PCs to reach 50% penetration, three times longer than radio or TV.
 The Internet was begun in 1969 as a government-funded research project. It was opened to commercial use the same year the Web was launched, in 1991. To reach 2% household penetration took 24 years, and it hit 50% after an additional seven years.

What conclusions can we draw? The evidence for accelerating technology change has evaporated: successful products over the past century have had similar gestation times and growth rates, and modern inventions have not reached the market unusually quickly. However, it's interesting that the PC, one of the poster children of the our-times-are-unprecedented mindset, grew so much more slowly than radio and TV. Don't think that the PC carried a heavier burden because it was expensive. A 1981 PC was half the relative cost of a 1939 television and one tenth that of a 1908 Model T (these three dates mark the first time each product was made available to the general public).

We'll take a final look at this question of how fast technology moves by examining a very old example: cathedral construction in the medieval period. The difficulty of these projects is staggering. Imagine a cavernous, handmade stone building over 400 feet long with 12 stories of open space inside. The primitive cement of the time does little more than fill the gaps between the stones and will break if tension (pulling force) develops. Builders can validate new techniques and designs only through experiment. Most are illiterate, having learned their skills through apprenticeship rather than from books or schools. Architecture is not yet a science, and failed experiments can cost lives and years of work. There are no cranes, trucks, or power tools--there are not even blueprints. This was the challenge facing the town of Chartres, France, in 1194.

Despite these difficulties, the stunning Chartres cathedral, which still stands today, was almost completely built within 30 years. Salisbury cathedral in England, of similar dimensions and begun a few decades later, took less than 40 years. It is humbling to note that Washington's National Cathedral took more than twice this long, and New York City's Cathedral of St. John the Divine is still unfinished after over a century of work. The common perception of medieval cathedrals requiring centuries of work from generation after generation of stonemasons is quaint but not always true. When funds were available, as they were for Chartres and Salisbury, work proceeded quickly; when they are not, as with these modern examples, work halts. Only by picking and choosing examples can one argue that technology--cathedrals or anything else--changes ever faster.


 

Computers in Schools

 

Billions have been spent on adding computers to schools. This money goes for the computers themselves, plus networking, maintenance, training, software, and so on. And the money continues to flow. Are we certain that it is well spent?

----------------

What's wrong with education cannot be fixed with technology.
-- Steve Jobs, cofounder of Apple Computer

Seymour Papert illustrates how little technology has helped education with this example. Imagine that a doctor and a teacher were transported from a century ago to the present. Technology has so changed today's medical landscape, with new tests, drugs, knowledge, techniques, and equipment, that the doctor would be unable to practice medicine. A teacher from a century ago, by contrast, would fit well into today's classroom after only a few small adjustments. Technology has been a huge expense for schools as well as a big disappointment.

Schools have had a long-standing immunity to the introduction of new technologies. In 1922 Thomas Edison predicted that movies would replace textbooks. In 1945, one forecaster imagined radios as common as blackboards in classrooms. In the 1960s, B. F. Skinner predicted that teaching machines and programmed instruction would double the amount of information students could learn in a given time. Filmstrips and other audio-visual aids were fads 30 years ago, and television, now seen as a supplier of brain candy, once had a sterling reputation as an education machine.

Public education has tried repeatedly to extract the potential of the PC. The Congressional Office of Technology Assessment analyzed the evolution of these frustrating attempts. Its report notes that in the early days of the IBM PC, teachers, parents, and school administrators were told that we needed to teach students to program in BASIC, since that tool came with PCs. Then, the focus moved to the computer language Logo: let's teach students to think, not just program. Oops--a few years later, we were told that computers were best used for drill and practice. Then another correction: since PCs are tools, students should be taught word processing. Later phases emphasized curriculum-specific tools (a history database, a science simulation, and so on), then Web page design, and then the Internet. The progression reads like an implausible story--how can people see the PC's role in education fail and get redefined over and over and over and still maintain the faith? Wouldn't the joke wear thin after a while? And yet the faith endures.

The fortunes of The Learning Company, one of the most successful education software companies, parallel those of education software in general. With popular titles like Reader Rabbit and Carmen Sandiego, TLC was bought by Mattel in 1998 for almost $4 billion. Three years later, it was resold for one percent of that price.

The generous organization that donates a million dollars of PCs to a school district may be killing with kindness. The total cost of ownership of a PC is much more than the cost of the PC itself. The million-dollar donation condemns the school to spend perhaps half that much each year, forever, to satisfy ongoing needs for software, training, support, and upgrades (or, if not spend, then scrounge an equivalent amount through donations and extra work).
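A rough sketch of that ownership arithmetic follows. The $1 million gift and the roughly half-a-million-a-year follow-on cost come from the paragraph above; the five-year horizon is an assumption added for illustration.

    donation_value = 1_000_000                # value of the donated PCs
    annual_follow_on = donation_value / 2     # ongoing software, training, support, upgrades
    years = 5                                 # assumed horizon
    district_outlay = annual_follow_on * years
    print(f"A ${donation_value:,} gift obliges about ${district_outlay:,.0f} "
          f"of the district's own spending over {years} years--and the bill keeps running.")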

Seeing PCs' educational promise as largely empty, the editor of Issues in Science and Technology commented in 2000 on the overemphasis on the digital divide: "These students who have less access to computers and the Net also have less access to everything else. Why among all their deprivations should we focus on their lack of computers? Is this what separates the underclass from the upwardly mobile? Hardly.... At this stage in the development of educational technology, the computer and Net are a condiment or a dessert on the educational menu."

I'm optimistic about the long-term benefit that computers can give to education. However, we should expect more false starts, each with proponents convinced that (despite the failures in the past) they have finally discovered educational nirvana. Expect them to also shrilly proclaim that neglecting the latest approach will dramatically shortchange the future of our children.


 

Technology and Popular Culture

 

Much of Future Hype contrasts the change we see today with change from the past. Is change today bigger, better, faster than it used to be? We can only know by making the comparison. Here's a look at how popular culture has seen technology.

--------

Whoever wishes to foresee the future must consult the past;
for human events ever resemble those of preceding times.
-- Niccolo Machiavelli, The Prince (1513)

The novel Frankenstein tells the story of a scientist who brings the dead back to life. The subtitle of Mary Shelley's 1818 novel is The Modern Prometheus, likening Victor Frankenstein to the Titan who dared to bring the gift of fire to humanity. But the scientist's hoped-for gift goes very wrong, and the novel explores the public concerns in Shelley's time over the impact of science and industrialization on society.

Literature has reflected and explored current concerns for centuries. Just a few years after the American Civil War, during which civilian technologies were drafted into war service, Jules Verne wrote 20,000 Leagues Under the Sea (1869). In it, Captain Nemo tries to recapture a few of the demons from Pandora's box by building a submarine to destroy warships. The Time Machine (1895) by H. G. Wells considered the evolution of mankind as Darwin's ideas of evolution were taking hold. Sir Arthur Conan Doyle's The Lost World (1912) told of the discovery of a lost dinosaur habitat in an isolated part of South America, at a time when paleontologists were uncovering the fossils of enormous animals and explorers were uncovering new parts of the Earth. Contrast this with the approach to dinosaurs in Jurassic Park (1990). By then, new parts of the Earth were no longer being discovered, but genetic research was exploring new domains, and this played a central role in the book.

After earlier books about individuals mishandling technology, such as Frankenstein and The Strange Case of Dr. Jekyll and Mr. Hyde (1886), and early science fiction such as The Time Machine and A Journey to the Center of the Earth (1871), literature and culture moved on to explore the larger issue of technology used to the detriment of society. Movies such as Metropolis (1927) and Modern Times (1936) showed the darker view of a society with technology out of control. Brave New World (1932), Nineteen Eighty-Four (1949), and Fahrenheit 451 (1953) illustrated other negative utopias with varying amounts of technology used to keep people in their places.

Modern literature such as Future Shock (1970), the many books predicting The Coming Crash or The Coming End of <fill in the blank>, and movies such as The Terminator (1984) or The Matrix (1999) are recent examples of technology anxiety. Literature changes to reflect what's on the public's mind and preserves a record--a long one--of how society frets about technology. Hype can convince us not only that technology is moving so fast that it's leaving us all in the dust, but also that it's the fast track to hell.

Popular culture is another place to see what's high priority to us. Marvel Comics launched the Spider-Man comic in 1962. There we learned that Peter Parker became Spider-Man after being bitten by a radioactive spider. Forty years later, the movie version also showed Peter obtaining his superpowers from a spider bite, but this time it was a genetically modified spider. Hollywood tracks what's on the public mind, and that had changed: radioactivity wasn't interesting anymore, but the dangers of genetically modified animals were. The Hulk comic (also 1962) shows Bruce Banner transformed into the Hulk by radiation from a nuclear explosion, but in the movie version (2003), radiation triggers nanobots that make the transformation. We see another cultural fossil of the public mindset when, in the 1967 movie The Graduate, the title character is given hushed, almost clandestine career advice: "Plastics."

Western culture can become more sinister when technology conveys it into another culture. Television came to the Basque region in the Pyrenees mountains in southwestern France in the late 1960s. Euskara, the ancient Basque language that survived invasion by the Romans and by the Celts before them, has had difficulty competing with modern French culture that effortlessly penetrates into the remote mountains on TV signals.

Bhutan, the tiny land hidden high in the Himalayas and ruled by its Dragon King, has had roads, electricity, and public schools for only a few decades. In 1999, it became the last country on earth to get television. This gentle country with Gross National Happiness as its guiding principle has had a difficult time coping with televised violence and the lure of western goods.

Popular culture has tried to predict the changes that we can expect in our own culture, though with limited success. The Jetsons TV cartoon, shown first in 1962 and set a century in the future, is a familiar bit of contemporary science fiction. George Jetson works three hours per day pushing buttons as a "digital index operator." Calisthenics for Jane Jetson are morning button-pushing exercises. These extrapolations of the 1960s' diminishing amount and intensity of labor may yet come true, but that seems unlikely anytime soon. The number one song for 1969, "In the Year 2525," predicts atrophy of the human body, actions controlled by pills, and robots replacing humans. ("In the year 5555 / Your arms are hanging limp at your sides / Your legs got nothing to do / Some machine's doing that for you.") The novel 2001 (1968) was also off the mark in predicting human-like intelligence for the computer HAL. Even its ominous (though perhaps accidental) warning in the name HAL--take the succeeding letters of the acronym HAL to get IBM--seems quaint now.

[Television] is an art which shines like a torch of hope in the troubled world.
It is a creative force which we must learn to utilize for the benefit of all mankind.
-- David Sarnoff, at the public introduction of RCA's television in 1939

I wish goddamned television had never been invented.
-- Edward R. Murrow, radio journalist

 

Table of Contents

The book is divided into two parts. Part I looks at how and why we see technology incorrectly. While exploring its downsides, how it bites back, its surprising fragility, and its unpredictability, we will review some tools and insights to make our sometimes tense relationship more pleasant. The nine High Tech Myths are analyzed and debunked. Once the errors are chiseled away, a new and more accurate way of seeing technology change can emerge from the debris.

Part II looks at the constancy of technology change in a broad range of areas. Popular culture, health and safety, fear and anxiety, personal technologies, business--in all of these, history gives us repeated examples that make our experiences today seem unexceptional. This survey, illustrated with stories from thousands of years of technology, should lay to rest the notion that technology change is unique today.

--------

 

Preface

Introduction: Leveling the Exponential Curve

Part I: The Ways we See Technology Incorrectly

1 The Birthday Present Syndrome

Technology Good and Bad

Man versus Machine Contests

Acting like a Human

The Ever-moving Goal

Technological Myopia: Revisiting the Birthday Present Syndrome

2 The Perils of Prediction

Poor Predictions

Don't Get Stuck in the Present

Avoid Technology Infatuation

New Products Don't Win on Every Point

Finding the Next Big Thing

3 The Unintended Wager

Unexpected Consequences in Health

Unexpected Consequences on the Internet

And Unexpected Consequences Everywhere Else

Dealing with Systems

4 If it Ain't Broke, Be Grateful

Fragile Digital Storage

Bugs

Risks of Monoculture

Technology Dependence

5 More Powerful than a Locomotive

Myth #1: Change is Exponential

Attack on the Exponential Model

Excusing Exponential Failures

Myth #2: Technology is Inevitable

The Greater Change was Often in the Past

6 Faster than a Speeding Bullet

Myth #3: Important New Products Arrive Ever Faster

Myth #4: The Rising Tide of Valuable Information

Myth #5: Today's High-Tech Price Reductions are Unprecedented

Myth #6: Products are Adopted Faster

Myth #7: Invention Gestation Time is Decreasing

Press Overhype

7 Leap Tall Buildings in a Single Bound

Myth #8: The Internet Changes Everything

Sense from Statistics

The Internet--the Plastic of the 21st Century?

Myth #9: Moore's Law is Really Important

Computers in Schools

Computers in the Home and Office

8 Corrective Lenses

Technology Hierarchy

Technology's Family Tree

Technology Spotlight

Part II: The More Things Change ...

9 For Better or For Worse

Ages Through the Ages

Technology Defines Society

Technology and Popular Culture

Technology Can be a Tight Fit

Social Stereotypes

Social Conventions

10 Playing with Matches

Environment

Health

Human Physiology

Safety

Risk

Medical Ethics

11 Fear and Anxiety

Incredulity and Naiveté

Fears and Weird Beliefs

Adult Learning Curve

Skill Loss and Encapsulation

12 Technologies That Touch Us

Home Innovations

Timekeeping

Calendars

Writing

Technology Words

Standardization of Language

Units of Measurement

13 Innovation Stimulation

Gold Rush

Prizes

Transportation and Exploration

Industrial Revolution

14 What's Mine is Mine

Trade Secrets and Industrial Espionage

Patent Battles

Piracy

Conclusion: Vaccinate Against the Hype

Logical Fallacies

Technology for the Rest of Us

Want to Learn More About Technology Change?

Buy the Book!

Order at these online stores:

Amazon

Barnes & Noble

Read Future Blog

Explore the intersection of technology and society. See the historic precedents for today's technology. Learn more about why technology both delights and frustrates us.


Speaking of the Future

In his presentations, Bob Seidensticker shows audiences how we've been deceived about technology change and how it really works.
