
12-11-1999

GOVERNMENT: False Prophets

Televangelists, credit cards, shopping malls, just-in-time inventory,
occupational health and safety, greater personal liberty, financial
independence for women, prosperity for all--oh yes, the year 2000 promised
to be a paradisiacal scene. Or so it appeared to a prescient Edward
Bellamy in his best-selling 1888 novel, Looking Backward: 2000-1887.
Bellamy was a journalist and a utopian socialist from Massachusetts, who
saw the future as he wished it to be. He predicted that ordinary Americans would live to age 85 or 90 ("old age approaches many years later and has an aspect far more benign than in past times," he wrote), and that they could "look forward to an eventual unification of the world as one nation."

Well, not yet, but keep checking back.

Jules Verne, too, got a lot of things right. As early as the 1860s, the world's first modern science fiction author wrote about motorized carriages fueled by internal gas (that is, hydrogen) combustion, and even fax machines--"photographic telegraphy [that] permitted transmission of the facsimile of any form of writing or illustration." He laid out technical specifications for sending men to the moon that proved eerily similar--in the escape velocity, the travel time, the shape of the capsule and its component materials, even the splashdown site--to those of NASA's first manned lunar flight. The many parallels "will blow you away," says Graham T.T. Molitor, the vice president and secretary of the World Future Society.

Given the pace and unpredictability of change in this mad, mad world, in which even the weather--a closed physical system--is orders of magnitude too complex for the most powerful computer to comprehend, it is astonishing that prognosticators ever get anything right. Yet occasionally they do. Even as George F. Kennan, in 1947, offered his diplomatic prescription for containing the Soviet Union, he declared that the emerging superpower "bears within it the seeds of its own decay." In 1959, science fiction writer Robert A. Heinlein wrote about a high-tech suit of armor that would also add to a soldier's strength, an idea that U.S. military labs are now pondering. Herman Kahn, the futurist and nuclear strategist, predicted in 1967 that the year 2000 would see the widespread use of computers, lasers, personal pagers, VCRs, satellite television, and organ transplants.

The same year, an intellectually august Commission on the Year 2000 published several papers, among them the views of sociologist David Riesman on the prospect of an unrelentingly competitive society; anthropologist Margaret Mead on a narrowing of the gap in gender roles; then-professor Daniel Patrick Moynihan on a mechanism much like vouchers as a market-based way to help the poor; and law professor Harry Kalven Jr. on the threat that technology posed to individual privacy. Alvin Toffler, a best-selling futurist, accurately foresaw (in his 1980 book, The Third Wave) the shift from an Industrial to an Information Age that would favor customization over standardization, decentralized over centralized authority, and small-is-beautiful over bigger-is-better. "He just nailed it, every time," says Jeffrey A. Eisenach, the president of the Progress and Freedom Foundation, a libertarian think tank.

But Toffler also predicted the use of paper wedding dresses, and Kahn expected human hibernation, control of the weather, and possibly the elimination of arthritis by the century's end. Jules Verne proposed using a cannon to launch men toward the moon, then wrote an adventure taking place inside a hollow Earth. Edward Bellamy imagined a blissful socialism that knew no money, poverty, jails, or social distinctions--in which lawyers had vanished because, as a character in Looking Backward marvels, "the world has outgrown lying."

Well, no. Politicians, policy wonks, and prophets who have sought to foresee the world as the millennium turns have been wrong--often laughably wrong--much more than they've been right. Whatever happened to the robot maids that then-Atomic Energy Commission Chairman Glenn T. Seaborg, in the 1960s, expected to see by 2000, or the chic woman's ornamental use of "live butterflies fluttering around her hairdo," as The New York Times foretold, or an auto-plane for every adult? Earthlings aren't exploring in person the limits of the solar system or living underground, as science fiction writer Isaac Asimov once predicted. Nor have hovercraft replaced automobiles, as Marshall McLuhan, the author of Understanding Media, divined. In 1983, Vincent DeVita Jr., the director of the National Cancer Institute, said that annual cancer deaths could be cut almost in half by 2000; they've fallen, but by less than 10 percent.

Prophecy is hard. It was believed in the '60s that a War on Poverty might soon succeed. The '70s saw a revival of Malthusian fears--predictions of famine and long-term oil shortages, as population outstripped resources. Wrong, wrong, wrong.

People who assumed stormy days ahead in geopolitics have found some sunshine instead. Anyone who counted on improving human nature in hopes of improving society has seen those hopes dashed. The successful prognosticators, the few of them, have relied on artful extrapolations of existing trends, or on deep insight into the dynamics of things--or on dumb luck.

But luck comes and goes, deep insight is rare, and extrapolations don't always work. Recall country-and-western singer Ray Stevens' 1992 ditty, "Working for the Japanese," which predicted that soon we'd "drink nothing but Kawasaki sake and Honda wine and Mitsubishi light beer."

Predictions, in other words, aren't really about the future at all, but about the present. Some soothsayers look at the time in which they write and figure the future has to be better. Bellamy's vision of a socially harmonious 2000 was a reaction against the rude industrialization and labor strife of his day. Other foretellers believe little will change but the faces. In 1968, the president of the Carnegie Corp. of New York, Alan Pifer, predicted that the year 2000 would see "campus demonstrations led by militant youngsters whose badge of defiance to the corrupt and conservative world of their elders (the present student generation) will be crew cuts." He was right about the crew cuts.

Truly foretelling the future, after all, requires something more than peering into the distance; it takes seeing around corners, too. For it's the changes that no one can foresee--in technology, in social habits, and from cataclysmic events--that tend to alter things beyond recognition.

Probably that was why, at a conference on the future of warfare hosted by the Air Force in 1985, the pilots and scientists in attendance found it "more productive" to hear what science fiction writers had to say than to rely on academics and hard-headed consultants who try predicting the future, recounts Joe Haldeman, the author of 15 science fiction novels. Military strategists, he says, figured to learn more from an hour of "off-the-wall, unpredictable, blue-sky" musings--touching on then-outlandish notions, such as computer bugs and weapons operated from afar--than from a conventional, fact-trapped view of what the future may hold.

Jeremiahs of the Oil Patch

What chance do we have of foretelling the future, when we don't even have a clue about what's happening now? Why is crime down? Why are teen births on the wane? Experts haven't the foggiest. So the politicians are free to say whatever they like (and they do).

Even the simplest of demographic extrapolations are tricky to get right. What could be easier than predicting the nation's population a couple of decades ahead, given that most of the people who'll be living then are already alive? But census projections are notoriously inaccurate. In 1964, Hans H. Landsberg of Resources for the Future predicted that the U.S. population would reach 331 million by 2000. In 1967, the Census Bureau projected a population of 282.6 million to 361.4 million, a range that rested on different assumptions about immigration restrictions, abortion laws, and such social customs as the spacing of children. Latest census estimates put the U.S. population at 273.6 million.

On demographics, as on most other things, "we get it wrong all the time," says Irwin M. Stelzer, the director of regulatory studies at the Hudson Institute. One reason, he says, is the likelihood of errors in economic projections; a modest misjudgment in the rate of growth in gross national product--which drives so many other projections--can yield huge errors in analytical conclusions. And something else is at work: When forecasts carry implications for policy, "everybody has a stake in a certain outcome," Stelzer points out. "They don't lie or torture the figures, but they bring to it a set of assumptions," which helps them to contemplate a set of competing, equally plausible options and to choose among them.

Consider, for instance, what happened to expectations about the future of oil during the turmoil of the 1970s. After the Arab oil embargo and the Iranian revolution touched off successive oil shocks, it became almost an article of faith--among oil companies, environmental activists, and federal energy analysts alike--that oil reserves were in dangerously short supply and that prices would jump to as high as $100 a barrel and never return to where they had been before. "The era of `cheap energy' has ended," Texaco's then-chairman, Maurice F. Granville, penned in 1977, in an Oil & Gas Journal supplement called "Petroleum/2000." Electric utilities, calculating (for state regulators' benefit) the value of a proposed nuclear power plant throughout its presumed life, assumed that oil would cost $100 a barrel. With a high-falutin' market price in mind, Exxon spent more than $1 billion on buying oil-shale lands.

James R. Schlesinger, who was Energy Secretary in the Carter Administration, went so far in 1978 as to present to a House subcommittee a graph (based on CIA findings) predicting that the free world's demand for oil would overtake its productive capacity in the early 1980s. "Unless our nation has planned wisely and well," Schlesinger warned, "it will face difficulties as severe as anything we have experienced since the 1930s."

All of this proved to be nonsense, of course. OPEC's power collapsed, the world is awash in oil, and prices are currently much lower in real terms--even with the recent run-up--than they were in the late '70s and early '80s. So why was everyone so wrong? For one thing, "governments need crises," Stelzer ventured. Oil companies and environmentalists, each for their own reasons, found solace in high prices.

Even a Republican such as Schlesinger had a hard time seeing a market at work. "OPEC ministers were known for being very, very arrogant at the time," Schlesinger recalls, "and we tended to believe them" when they vowed to keep oil production restrained and prices high. "This was a different kind of cartel," or so it seemed, in Schlesinger's account, and oil was considered a different sort of commodity--a driver of the world's economy, consumed rather than reprocessed, and fairly impervious to price changes.

Michael C. Lynch, a political scientist at the Massachusetts Institute of Technology, offers an explanation he says might be found "in the `Journal of Abnormal Psychology,' not the `Journal of Energy Economics.' " The anxiety over the OPEC-induced shortage, he suggests, somehow turned into a panic over a possible physical shortage.

Such a jump in logic, "in retrospect, [is] hard to explain," acknowledges Jay E. Hakes, the chief of the government's Energy Information Administration, but "when you first look at the data, [the conclusion] makes a lot of sense." After all, oil resources are finite, production in the continental United States had been sliding since 1970, no big U.S. fields remained to be found (even in Alaska), and domestic oil demand had more than doubled every decade since 1945. Worldwide, oil production capability looked strictly limited, as well as politically unreliable, as Mideast oil kingdoms bad-mouthed foreign influences, and bankers worried about recycling petro-dollars.

Besides, pessimism about oil's future was nothing new. "The sky is always falling," observes Edward D. Porter, the research manager at the American Petroleum Institute. First, it was whale oil that was being disastrously depleted because of overhunting. Then the oil seeps in Pennsylvania threatened to give out, before the first well was drilled in 1859. Late in the 19th century, Pennsylvania officials saw just four or five years of supply remaining; then came the Spindletop gusher in Beaumont, Texas, in 1901, and the roughnecks soon ruled the Lone Star state.

Each time, the prophets of doom failed to count on high prices to curb demand and to bring forth new supply. "There's nothing new about the process of innovation--it's been going on ever since the industry got started," says economist Morris A. Adelman, an MIT professor emeritus and one of the few oil analysts who declared in the 1970s--to hoots and hollers--that prices would settle back down and that the harder people looked for oil, the more they would find. ("Give him a couple of gold stars," Hakes says now.) "The big question," Adelman explains, "is who is winning the race between depletion, on one hand, and knowledge, on the other."

So far, knowledge is. Rising prices carried oil drilling to new and costlier places, such as the North Sea, the tar sands of Venezuela, and now the Caspian Sea. Even more crucial has been the role of technology--some of it serendipitous, much of it a result of oil's preciousness. "We're getting so much smarter in how to extract oil," says Paul R. Portney, the president of Resources for the Future, a nonpartisan Washington think tank on environmental and natural resource issues. He cites the computer-based 3-D seismic technology, which has vastly reduced the cost of locating oil; deep-water drilling, such as in the Gulf of Mexico; and "slant drilling" in several directions, to recover more oil from a given formation than through a single, vertical hole. Contrary to the experts' best guesses, says Hakes, "oil has sort of hung in there."

The Soviet Union Forever

It wasn't only about oil and food that midcentury predictions were almost universally gloomy--and wrong. When it came to geopolitics, where the facts are iffier and the ideological blinders are thicker, the predilection to expect more of the same proved stronger still.

The Cold War was at times alarmingly grim, and a hard realism set in. Democratic presidential candidate John F. Kennedy, during a debate with Richard M. Nixon in their 1960 campaign, suggested that "10, 15, or 20 nations will have a nuclear capacity" by 1964, up from four. Herman Kahn wrote in 1967 that as many as 50 nations (including Germany, Indonesia, Japan, Poland, Romania, and Yugoslavia) "might have access" to nuclear weapons by the 1990s. He was off by 42.

The proliferation of nuclear weapons wasn't the only geopolitical fear: The hostile superpower that already had the bomb looked scarier still. Even Westerners who might have scoffed at Soviet leader Nikita S. Khrushchev's 1956 prediction--"We will bury you!"--saw no reason to anticipate the reverse. The hard-nosed Kahn assumed the Soviet Union would survive at least until 2000. A fantasy history called The Third Millennium: A History of the World, AD 2000-3000, published in 1985, figured on a Soviet Union until 2800 or later. "Having a major world power vanish from one's life is not a thing you [expect to] experience," sighs Henry J. Aaron, a venerable economic analyst at the Brookings Institution.

Except that it was the fantasy that, in the end, proved realistic. Who knew? Well, Kennan did, for one. Writing as X in Foreign Affairs in 1947, he described the Soviet Union as essentially fragile, in that the Kremlin's ruthlessness in suppressing its citizenry was eventually bound to inflame a popular reaction. If anything disrupted the unity of the Communist Party, he wrote, "Soviet Russia might be changed overnight from one of the strongest to one of the weakest and most pitiable of national societies."

Even as a powerful Soviet Union became an undeniable, shoe-banging fact, the occasional analyst glimpsed something beyond. "Barring a war," Robert Conquest, a Soviet historian at the Hoover Institution on War, Revolution, and Peace at Stanford University, wrote in 1960, "another decade or two may see enormous changes. ...The regime must evolve or perish." Again and again over the years, he predicted that the Soviet Union wouldn't last, reasoning that a totalitarian government was sure to ossify and prevent the economy from changing as needed. "The situation resembles the Marxist one," he explained in a recent interview, "of a society with an economy and politics that don't fit. It was bound to blow up."

Edward N. Luttwak, a senior fellow at the Center for Strategic and International Studies and a self-described "dabbler" in Soviet matters, began during the 1980s to predict the USSR's demise, but for a different reason. Born in Transylvania, Luttwak said he became acutely aware at an early age of "nationality," which became "the chief engine" of the strung-together empire's disintegration.

The traditional foreign-policy thinkers--the Henry A. Kissingers and the like, who believe in balances of power and in conventional means of diplomacy--are by nature ill-equipped to color outside of the lines. Which raises the most curious case of all: Ronald Reagan. Not only did the 40th President, unschooled in diplomatic practicalities, call for the downfall of Communism--he predicted it. In 1982, while speaking before the British House of Commons, he borrowed Karl Marx's terminology to declare that democracy "will leave Marxism-Leninism on the ash heap of history." Was he prescient, or merely so naive as to believe that whatever he wished for would come to pass?

"Ronald Reagan never got caught up in all the complexities," says Donald Kagan, a historian at Yale University. "He thought simply. And when you do think simply about this question, things really are clear. ...He knew the [Soviet] system was wrong, and that right would be done. All it took was for us to do our job." Analysts generally credit Reagan for increasing the strain on the Kremlin by casting the Soviet Union as the Evil Empire and by upping defense spending by many billions of dollars that the Soviets felt obliged to match.

In so successfully imagining an unimaginable world, Reagan surely must have been helped by having been a denizen of Hollywood. Only someone so agile at creating his own reality would find himself discussing (with Soviet leader Mikhail S. Gorbachev, in Reykjavik, Iceland) a world without nuclear weapons. "Ronald Reagan was an idealist in the classic sense of the word," Luttwak ventures. "The movie will not end with the bad guys victorious."

Pollyannas About Poverty

If a prone-to-be-paranoid public is apt to expect the worst about the excesses of the marketplace or the dynamics of inscrutable superpowers, it can also show a certain lofty--if misplaced--faith in the possibilities of human nature.

Recall, for example, some of the expectations about education from the 1960s. Such best sellers as Summerhill School and Education and Ecstasy pointed to a free-style education that rested on an assumption that schoolchildren are naturally self-motivated to learn. Optimism about education wasn't new. Many have touted radio, then television, and most recently the Internet as instruments to transform the nation's classrooms, notes Jeffrey Mirel, the director of education studies at Emory University. "No technology," he says, "is going to impact knowledge, let alone wisdom."

An even greater hubris inspired the War on Poverty. "In our lifetime," President Lyndon B. Johnson told Democratic contributors in 1964, a few weeks after he had proposed an anti-poverty program to Congress, "God willing and with your help, we are going to wipe out poverty in America." That year's annual report by the Council of Economic Advisers had described the "conquest of poverty" as a simple matter of arithmetic--an achievement "well within our power. About $11 billion a year would bring all poor families up to the $3,000 income level we have taken to be the minimum for a decent life." In 1966, anti-poverty czar Sargent Shriver wrote a memo to Charles L. Schultze, LBJ's budget director, describing "the realistic goal of ending poverty by 1976." In 1967, economist James Tobin wrote an article in The New Republic that was headlined: "It Can Be Done! Conquering Poverty in the U.S. by 1976."

Such optimism was perhaps understandable. Sheldon Danziger, a professor at the University of Michigan and an expert on poverty, points out that World War II had been won, the economy was booming, and Americans put "great faith" in science and in social science. The War on Poverty, he recalls, "was really a noble calling." Why be cowed by the biblical prediction that the poor will always be with us?

Even at the time, though, people close to the problem had their doubts. "I was in the trenches, and it was discouraging," recalls H. Ralph Taylor, who had been involved in housing in Somerville, Mass., and New Haven, Conn., before running LBJ's Model Cities Program from 1966 to 1968. Part of the problem was too few federal resources, he says, especially after the escalation of the Vietnam War.

Worse, though, was that urban problems proved more intractable than the do-gooders imagined, complicated by race and the dark mysteries of human motivation. The economists' straightforward arithmetic didn't take into account the disincentives and changes in habit that government programs may encourage. "In the era of the '60s, and even into the Nixon Administration, [people] were just generally much more optimistic about what could be done to deal with these problems by government programs than is now the case," says Schultze, currently a senior fellow emeritus at Brookings. The government is pretty good at building highways or at the bureaucratic task of running Medicare for the elderly, he explains. But, as has been learned during the past 30 to 40 years, "it's just a hell of a lot harder...to make major changes in human behavior and to inculcate knowledge and all of that." Indeed, he adds, "some things you do may make the problem worse."

This insight has left us sadder about the future, perhaps, but wiser as well.

Dull Dreams

Or so you'd think. Then why don't foiled forecasters learn from past mistakes? Colin J. Campbell, a Geneva-based oil consultant, got much notice as the co-author of a piece in Scientific American last year predicting that world oil production will peak before 2010. Did Paul Ehrlich, the gloomy ecologist, feel chastened after he had wagered $1,000--and lost--in a bet with Julian L. Simon, a University of Maryland economist, who believed that human ingenuity could make up for any strains on the Earth's resources? To measure shortage or plenty, they bet in 1980 on whether the prices of five commodities that Ehrlich picked (chrome, copper, nickel, tin, and tungsten) would rise or fall by 1990. Ten years later, after the prices of all five had fallen--by more than half overall--Ehrlich paid off. "The bet doesn't mean anything," he told a reporter for The New York Times at the time. "Julian Simon is like the guy who jumps off the Empire State Building and says how great things are going, so far, as he passes the 10th floor," he added, contending that the prices might yet rise.

Someday, of course, Ehrlich may yet be proved right, much like the wizened British officer of lore, perched on a bar stool somewhere, who had been predicting a major war every year for the past 50: Twice, he had been correct. Almost anything can happen, and people are wont to apply something other than their best rational judgment in looking ahead.

Which is why the mundaneness of our current imaginings seems a trifle depressing. Americans once dreamed about abolishing poverty and colonizing space, and preferred big fins on their cars as a reminder of their grand ambitions. But now, even the science fiction is prosaic. It's "similar to regular fiction," says author Haldeman, in its "concern about lives that are static and meaningless and concern about large institutions that are out of control. ...I see very little predictions of actual utopia."

Celebration over the impending turn of the millennium, which has been pretty much left to advertisers, has shown precious few signs of becoming a meaningful event. Not that it should. Paul Boyer, a historian at the University of Wisconsin who has written a book on biblical prophecy, notes that the millennial themes of Christianity refer to the span of a paradisiacal reign, not to any particular starting date. But he also reports that Jonathan Edwards, the 18th-century Calvinist preacher, speculated in his personal journals that the year 2000 would touch off the world's seventh--and last--millennial epoch, a time of bliss before history ends and eternity begins.

Surely that's just another prediction bound to miss its mark.

Burt Solomon National Journal