Part 4. Confusion and Dismay 2000-2012: “Is Earth F**ked?”
In June of 1988 the director of NASA's Goddard Institute for Space Studies, James Hansen, testified before a congressional hearing that the earth's climate was threatened by a buildup of heat-trapping "greenhouse gases" in the atmosphere. The phenomenon was not newly discovered; for several decades climatologists at Hawaii's Mauna Loa Observatory had tracked steadily rising concentrations of carbon dioxide in the atmosphere, and many scientists had warned of its effect on atmospheric temperatures. By 1988 sufficient data was available to make a credible case (to scientists, at least) that "greenhouse gases" were changing climate with serious, deleterious consequences for humanity.
Thus began humanity's engagement with the most serious threat to its continued presence on earth that our species has encountered in its 200,000-year tenure on the planet. The response would not be encouraging.
Before the end of the year, the first World Conference on the Changing Atmosphere was held in Toronto, and the United Nations created the Intergovernmental Panel on Climate Change (IPCC) of scientists to advise the world’s governments on the looming threat. Early studies focused on the rise of sea levels, with consequent flooding of coastal cities and salinization of water supplies, and decline of crop yields due to heat stress of plants.
Serious environmental threats had been countered by prompt action before: depletion of the ozone layer by chlorofluorocarbons (CFCs), and a resultant increase of lethal ultraviolet radiation bombarding the planet's surface, had led to the Montreal Protocol in 1987, which accomplished the gradual replacement of the offending CFCs in refrigerants and other uses.
Climate change, though, proved to be far more difficult to address. Whereas the corporations producing the ozone-depleting CFCs were able to relatively easily switch to other compounds to accomplish the same industrial uses, earning them comparable profits, that was not the case now. Some of the largest, most powerful corporations in the world—Chevron, Exxon, Shell, Mobil, BP—had enormous infrastructure investments in fossil fuel extraction and processing. These investments could not be converted to sustainable, renewable sources of energy, such as solar, water, or wind. Responding to climate change thus threatened the valuation and profits of these corporate giants.
Another problem was the timing. During the Reagan presidency (1981-1989), business interests had seized the moment to entrench unregulated capitalism (and its attendant profits) as the unchallenged economic paradigm of the world. Indeed, in America free-market capitalism was presented in semi-religious terms as the essence (with democracy) of the nation—a notion that would have left the Founding Fathers scratching their heads.
Free trade negotiations between the United States and Canada had begun in the very year James Hansen raised the issue of climate change, and in the ensuing years the North American Free Trade Agreement (NAFTA) was forged and signed, and soon the World Trade Organization (WTO) signatories had locked in the globalization of unregulated capitalism. The political elite of the world was happy to agree with the economic elite that booming, export-driven economies were good not just for the transnational corporations, but for the people of the world.
Wealth would “trickle down,” President Reagan assured the nation, as a result of the globalization of free-market capitalism. In fact, what began was the long, steady impoverishment of the middle class in America and elsewhere, negating three decades of progress after the Second World War. Income inequality within two decades would become marked: the rich grew fabulously richer, and everyone else lagged.
From the beginning, funding from large corporations flowed to entities denying that the combustion of fossil fuels was primarily responsible for increasing atmospheric temperatures and the attendant changes in climate. Another tactic was denying the scientific consensus that climate was changing at all. Soon after the turn of the new millennium, a right-wing Chicago-based think-tank assumed the lead role: the Heartland Institute, founded in the midst of the Reagan glory-years. After devoting its early years to the noble cause of denying the link between cigarette smoking and lung cancer (with Philip Morris a generous contributor), the institute wheeled smoothly to denying the link between fossil fuel combustion and the readily documented climate change.
Heartland used the same tactics for climate change they had used in the pro-smoking tobacco campaign: search exhaustively to ferret out a few scientists willing to express doubts about the data, put them on the payroll of Heartland or one of its many corporate sponsors, then trumpet their questioning of the science. Its annual Climate Conferences from 2008 to the present spotlighted denial of fossil fuel’s connection to climate change from an impressive (and well-recompensed) range of policy officials, legislators, lawyers, and the occasional scientist—all of which confirmed Upton Sinclair’s observation that “It is difficult to get a man to understand something, when his salary depends upon his not understanding it!”
For example, an investigation by Greenpeace revealed that a prominent speaker at the 2011 Heartland Institute Conference on Climate Change, astrophysicist Willie Soon, had for the eight previous years (2002 to 2010) received 100 percent of his new "research" grants from fossil fuel interests. A 2012 leak of internal documents from Heartland's headquarters showed direct monthly payments from the institute to a range of climate skeptics: physicist Fred Singer ($5,000 plus expenses per month), geologist Robert M. Carter ($1,667 per month), founder of the climate-skeptic Center for the Study of Carbon Dioxide and Global Change Craig Idso ($11,600 per month), and a one-off $90,000 payment to blogger and one-time TV weatherman Anthony Watts.
As the scientific evidence for climate change caused primarily by fossil fuel combustion mounted, the climate deniers ramped up their well-oiled (and well-paid) campaign casting doubt on the science. The result in the first years of the 21st century was widespread public confusion and doubt regarding not just the issue of climate change, but environmental issues in general. This uncertainty had its effect upon the environmental community itself. The first signs of dramatic questioning of the entire environmental enterprise soon appeared.
In 2004, two people long engaged in America's mainstream environmental movement reluctantly came to a startling conclusion. In an article provocatively entitled The Death of Environmentalism, Michael Shellenberger and Ted Nordhaus argued that although the century-old tactics of America's mainstream environmental organizations had achieved notable results, they were woefully inadequate to the great global challenges then facing the earth, such as climate change, the loss of biological diversity, and human overpopulation.
With climate change as one example, they claimed that “Not one of America’s environmental leaders is articulating a vision of the future commensurate with the magnitude of the crisis. Instead they are promoting technical policy fixes like pollution controls and higher vehicle mileage standards—proposals that provide neither the popular inspiration nor the political alliances the community needs to deal with the problem. By failing to question their most basic assumptions about the problem and the solution, environmental leaders are like generals fighting the last war—in particular the war they fought and won for basic environmental protections more than 30 years ago.”
“Environmentalism is today more about protecting a supposed ‘thing’—‘the environment’—than advancing the worldview articulated by Sierra Club founder John Muir, who nearly a century ago observed, ‘When we try to pick out anything by itself, we find it hitched to everything else in the Universe.’”
Shellenberger and Nordhaus did not fault today’s leaders of the environmental movement. “As individuals, environmental leaders are anything but stupid. Many hold multiple advanced degrees in science, engineering, and law from the best schools in the country. But as a community, environmentalists suffer from a bad case of groupthink, starting with shared assumptions about what we mean by ‘the environment’—a category that reinforces the notions that a) the environment is a separate ‘thing’ and b) human beings are separate from and superior to the ‘natural world.’”
The challenges of overpopulation, climate change, and preserving biological diversity are global, they claimed, and much too vast to yield to technical fixes deriving from compromise and negotiation with commercial and political powers. Something more—something more fundamental—is required. “Environmentalists are in a culture war whether we like it or not. It’s a war over our core values as Americans and over our vision for the future, and it won’t be won by appealing to the rational consideration of our collective self-interest.”
Though Shellenberger and Nordhaus did not in their article advance specific steps or frameworks regarding the new direction required by environmental groups, they did suggest that “Environmentalists need to tap into the creative worlds of myth-making, even religion, not to better sell narrow and technical policy proposals but rather to figure out who we are and who we need to be.” To my mind, it is telling that they quote John Muir, and specifically “call out” as problematic the anthropocentric view of humans as superior to the natural world, and the inadequacy of appeals to high, informed self-interest.
This radical critique of 20th century foundations for 21st century environmental problems was given an unexpected and resounding “second” later in 2004, when a former president of the Sierra Club, Adam Werbach, announced in a talk before The Commonwealth Club of California in San Francisco (entitled “The Death of Environmentalism and the Birth of the Commons Movement”) that “I am done calling myself an environmentalist,” due to outdated thinking and approaches. His about-face provoked a storm within the movement, and after receiving death threats Werbach would no longer speak in public without special security.
The concerns of Shellenberger, Nordhaus, and Werbach, while grabbing the attention (and often the anger) of mainline environmentalists at the time, generated no significant changes in the way the Big Green groups were constituted or operated. The strikingly meager progress in dealing with climate change and overpopulation early in the new century, however, seemed to confirm the concern of the critics. It was as if the sheer inertia of groups with memberships numbering in the hundreds of thousands precluded the sort of fundamental shifts proposed.
As for the critics themselves, Shellenberger and Nordhaus had formed The Breakthrough Institute shortly before the publication of their essay. In 2007 they coauthored Break Through: From the Death of Environmentalism to the Politics of Possibility. Those who had seen the 2004 essay as accurate and constructive eagerly greeted the book to discover what solution the critics would advance to further their earlier criticisms.
Surprisingly, none was proffered. Break Through is a curiously cerebral book, predominantly a long, discursive analysis of various writings of Rachel Carson, Jared Diamond, John Gray, and particularly Francis Fukuyama’s 1992 The End of History and the Last Man and his 2006 America at the Crossroads. These literary sections alternate with breathless opinions on contemporary political events in America, mainly the 2006 congressional elections and George Bush’s actions in Iraq.
Break Through has a bit of everything—except what the authors called for in The Death of Environmentalism: a new foundation for the environmental movement. Their only grand proposal, put forth in the last chapter of the book, modestly entitled “Greatness,” is their own Apollo project envisioning massive governmental subsidies and investment to advance renewable energy technologies, which they claim is positive and far preferable to punishing the fossil fuel economy.
While their Apollo project seems like a good idea (though it has largely languished in the halls of Congress and the boardrooms of private foundations), it bears a certain resemblance to the very technological fixes and policy proposals that Shellenberger and Nordhaus decried in their 2004 essay. Certainly Break Through is conspicuously lacking in the sort of basic new foundation for protecting the planet (“myth-making, even religion”) they called for in 2004. The index of Break Through contains no entries at all for “John Muir” or “anthropocentric view,” despite both being highlighted as so important in the 2004 essay.
The frustration building in the environmental movement in the first decade of the new century was heightened by a 2009 paper by Stanford’s Mark Jacobson and University of California, Davis’ Mark Delucchi, which detailed how “100 percent of the world’s energy, for all purposes, could be supplied by wind, water and solar resources, by as early as 2030.” Two years later the University of Melbourne’s Energy Institute published a blueprint for Australia achieving a 60-percent-solar and 40-percent-wind electricity system in only ten years. One year after that, the U.S. Department of Energy’s National Renewable Energy Laboratory outlined how wind, solar, and other currently available green technologies could provide 80 percent of U.S. electrical needs by 2050.
The technology is available to wean ourselves off fossil fuels, in other words. As Jacobson, coauthor of the 2009 study, put it, “It would require an effort comparable to the Apollo moon project or constructing the interstate highway system. But it is possible, without even having to go to new technologies. We really need to just decide collectively that this is the direction we want to head as a society.”
Unfortunately, that collective decision remained stubbornly elusive in the face of widespread opposition from fossil fuel and mining corporations, and the efforts of the Heartland Institute to demean science and sanctify free-market capitalism. No matter that 97% of the world’s climate scientists and hundreds of their peer-reviewed studies agreed that combustion of fossil fuels was causing climate change threatening human civilization. Even among economic geologists, scientists employed by extractive industries to locate commercially exploitable fossil fuel deposits, surveys established that 47 percent believed the evidence points to human-caused climate change—an astonishingly high number, given Upton Sinclair’s astute observation about the difficulty of understanding something when one’s salary depends on not understanding it.
An impressive array of international and national scientific organizations around the world agreed that fossil fuel combustion has the key role in climate change threatening human civilization: the U.N.’s Intergovernmental Panel on Climate Change (IPCC), the U.S. National Academy of Sciences (NAS), the American Association for the Advancement of Science (AAAS), the National Aeronautics and Space Administration (NASA), Britain’s Royal Society, the World Bank, and the International Energy Agency.
No matter. The economic and political power of the fossil fuel and mining corporations kept the simple solution—transitioning from fossil to renewable fuels to generate power—off the table, by creating discord and confusion, even within the environmental movement. So successful was the Heartland Institute at discrediting the overwhelming scientific consensus that the United States in the first decade of the 21st century seemed to have reverted to a pre-scientific society, characterized by the ready belief in nonsensical assertions contrary to common sense and rational analysis. Not since the Salem Witch Trials had irrationality held such sway in America. The Heartland Institute had dragged America more than three centuries into the past.
A low point came in 2009, when the United Nations climate summit in Copenhagen, billed as the last hope to avert calamity, failed miserably to agree to any binding emissions limits by any country. As before, in Rio de Janeiro in 1992 and Kyoto in 1997, nothing could be mustered beyond platitudes, and promises to try really, really hard to get close to unenforceable goals. Disappointed delegates to the conference wept. Big Oil and Big Coal had carried the day; one imagines champagne flutes clinking together at the Heartland Institute.
The only notable accomplishment at Copenhagen was the agreement to mark 2℃ as the maximum rise above pre-industrial average global temperature that could be tolerated by human civilization. Even though no mechanisms were stipulated to achieve this goal, it was felt that just naming a specific goal would be helpful. Even this, though, was fraught with controversy. Island nations, those with significant low coastal regions, and African countries already suffering from droughts protested vigorously, pushing instead for a goal of 1.5℃. They pointed out that even a 2℃ rise would be a severe challenge, triggering salinization of much of civilization’s fresh water supplies, frequent tidal inundation of great coastal cities and important delta agricultural areas, and decline in crop yields worldwide due to intensifying droughts. But in the end, 2℃ was adopted, for whatever good setting a goal with no plans to reach it might do.
By 2010, climatologists and other scientists had become sufficiently concerned at the lack of progress to sound the alarm more forcefully about what their data indicated. “Why then are climatologists speaking out about the dangers of global warming?” asked Ohio State University’s Lonnie G. Thompson. “The answer is that virtually all of us are now convinced that global warming poses a clear and present danger to civilization.”
In 2011 the International Energy Agency indicated the planet was on track for 6℃ warming by 2100. “Everybody, even the school children, know that this will have catastrophic implications for all of us,” declared the report.
In 2012 the rush of data and predictions crested. A World Bank study indicated that a 4℃ temperature rise by 2100 would result from current trends, “marked by extreme heat waves, declining global food stocks, loss of ecosystems and biodiversity, and life-threatening sea level rise...There is also no certainty that adaptation (by human civilization) to a 4℃ world is possible.”
Kevin Anderson, deputy director of Britain’s Tyndall Centre for Climate Change Research, summarized that such a 4℃ increase “is incompatible with any reasonable characterization of an organized, equitable and civilized global community.”
A 2012 study by the international accounting and auditing firm PricewaterhouseCoopers concluded that the climate studies “mean, quite simply, that climate change has become an existential crisis for the human species.” A manifesto issued that year by 21 past winners of the Blue Planet Prize, including James Hansen (who had first raised the alarm in 1988) and former Norway prime minister Gro Harlem Brundtland, declared that “In the face of an absolutely unprecedented emergency, society has no choice but to take dramatic action to avert a collapse of civilization.”
Together, the rising tide of studies made it clear that unacceptable consequences to human civilization accompanied an atmospheric temperature increase beyond 2℃: sea rise by 2100 of between 1 and 2 meters, rendering coastal cities virtually uninhabitable (think New York City, Boston, Los Angeles, San Francisco, London, Vancouver, Mumbai, Hong Kong, Shanghai, among many others); vast inundation of coastal areas on many continents (Ecuador to Brazil; the Netherlands; much of Florida, California, and New England; Bangladesh, the Philippines, and Vietnam); drastic declines in world production of grains due to floods, inundation of delta regions, drought, and heat stress; consequent widespread starvation, particularly in tropical and subtropical regions of Africa and South Asia; wide-ranging, frequent droughts and raging wildfires; pest and disease outbreaks as new pathogens follow the changed climate regimes; fisheries collapses as the oceans become warmer and more acidic (already occurring earlier than anticipated); and dramatic water shortages worldwide as wells are invaded by salt water (on the coast) or water tables plummet (also already occurring).
Then there are the increased frequencies and intensities of storms and hurricanes. Since every 0.6℃ of warming expands the atmosphere’s capacity to hold water by 4 percent, a 3.5℃ increase (roughly 5.8 such increments) would mean about 23 percent more water in the atmosphere, driving the predicted increase in frequency and intensity of storms worldwide (though not evenly spread; some areas, such as those under the Asian monsoons, will bear the brunt of the devastation).
As if all this were not enough, scientists say that somewhere between a 2 and 4℃ rise in temperature, several catastrophic “tipping points” would likely occur. These would include the thawing of the permafrost in northern regions, releasing millions of tons of methane into the atmosphere. Methane is 25 times more powerful a greenhouse gas than carbon dioxide; thus global warming would suddenly spike significantly higher. Other tipping points predicted in the 2012 World Bank study included “disintegration of the West Antarctic ice sheet, leading to more rapid sea-level rise; or large-scale Amazon dieback, drastically affecting ecosystems, rivers, agriculture, energy production, and livelihoods...and impact entire continents.”
Clearly the data indicated not just challenges to civilization, but its widespread collapse around the world as temperatures increased above 2℃, accompanied by horrific death rates from famine and disease worldwide, but especially in the already-warm equatorial belt. How many will die? Studies of the impact of severe weather precipitated by past volcanic eruptions indicate the toll would not be in the thousands, nor hundreds of thousands, but in the millions. The 1783 eruption of Iceland’s Laki volcano, for example, produced a death toll estimated at between 1.5 million and 6 million worldwide, including drought and famine in Japan and India, and flooding and brutally cold winters in central and western Europe.
Had meaningful responses to climate change begun to be gradually phased in shortly after the initial 1988 announcement by Hansen, the threat could have been handled relatively easily and catastrophe averted. Three decades of refusal to seriously address the issue, however, due primarily to the very successful efforts of Big Oil, Big Coal, the Heartland Institute, and the pro-business Big Green groups, had presented humanity with a very different scenario by 2012. What was now required to avert the specter of civilization’s collapse on the planet was wrenching change focused into a short window of time. Serious global warming and its consequences, many scientists said, had already been locked in as a result of the utter failure to reduce emissions since 1988. Whether the increase fell between 2 and 4℃, humanity’s tenure on the planet was in serious jeopardy.
At the American Geophysical Union meeting in San Francisco in the fall of 2012, all this was graphically illustrated in a session by a young, pink-haired geophysicist and computer modeler named Brad Werner. His presentation, entitled “Is Earth F**ked?”, delved into highly technical “complex systems theory,” charting perturbations, bifurcations, systems boundaries, inputs, dissipations, and attractors. At the end, when asked by a light-headed journalist whether all this meant the earth was, indeed, “f**ked,” Werner sighed, and nodded. “More or less,” he answered.
Indeed. And many in the environmental movement, as a result, were uneasy. Confused. Dismayed at the lack of meaningful response to the stark predictions of the scientists. Wondering whether they were doing the right things for the right reasons. Unsure whether, in spite of all their efforts, the planet wasn’t, after all, f**ked.
Living and Writing in the Natural World
John Muir's Legacy: a history of the American environmental movement.
June 16, 2016