April 17, 2013

EU Climate Policy on Verge of Collapse

The rejection by the European Parliament of a proposal to salvage the EU’s carbon-emissions trading system has left the program in disarray. The 334-315 vote on April 16th, widely regarded as a surprise, came against a backdrop of anemic economic growth and European fears of lost competitiveness. Prices, which stood above €6 a metric ton at the beginning of the year and had been sliding since, fell yet further to around €2.50 a ton. The survival of the emissions trading system is now very much in doubt. This now-dated chart from Spiegel Online shows the general tendency:

Notes Schumpeter at the Economist: “The ETS has long been troubled. The scheme is the world’s biggest carbon market, trading allowances to produce carbon which cover about half the European Union’s total carbon emissions. Partly because of weak industrial demand and partly because the EU gave away too many allowances to pollute in the first place, there is massive oversupply in the carbon-emissions market. Prices fell from €20 a tonne in 2011 to just €5 a tonne in February 2013. The European Commission, the EU’s executive arm, therefore hatched a plan to take about 900m tonnes of carbon allowances off the market now and reintroduce them in about five years’ time when, it was hoped, demand would be stronger (“backloading” in the jargon). This was the proposal the European Parliament turned down.”
Dave Keating, at EuropeanVoice.com, gives some immediate reactions:
Stig Schjølset, head of EU carbon analysis for PointCarbon, said the proposal is now “effectively dead”. “This means there will be no changes to the current system until 2020,” he said. “Prices will stay really low up to then. The EU ETS will not bring about any additional greenhouse gas reductions, so it will be irrelevant in terms of reducing total emissions in Europe.” 
"The EU's carbon market is at crisis point,” said Green MEP Bas Eickhout after the vote. He called the combination of centre-right MEPs, German Liberals and some hard left MEPs from the GUE group [European United Left] who voted to reject the proposal an “irresponsible and unholy alliance of MEPs.” . . .

Hans ten Berge, secretary general of electricity industry association Eurelectric, said the vote “is a dangerous set-back for the internal energy market and for EU carbon goals.” “Immediate carbon market reactions to the vote show how low the credibility of the ETS has fallen," he added. "Only urgent action by the Commission to put forward structural proposals on ETS can now stop Member States from each legislating their own alternative policies: 27 different carbon floor prices, coal taxes, carbon taxes.”

Climate skeptics are gloating: “The EU has been the global laboratory testing the green agenda to see how it works,” writes Walter Russell Mead. The vote “means that the guinea pig died; the most important piece of green intervention in world history has become an expensive and embarrassing flop.  It’s hard to exaggerate the importance of this for environmentalists everywhere; if the EU can’t make the green agenda work, it’s unlikely that anybody else will give it a try.”

Climate hawks are depressed: in an interview with Spiegel Online, Felix Matthes says it will have grave consequences and predicts, with Hans ten Berge, a re-nationalization of climate policy: “The decision means the end of a European approach to climate policy. The paradox is that all the politicians who are constantly calling for more harmonization of climate policy in the EU and internationally are sending the policy back to the national level. That is an enormous step backwards -- also for global climate policy. Even China is now starting to pursue emissions trade. South Korea and Australia have already implemented it, and California has started a very ambitious system.”

* * *

April 19, 2013

This chart from The Economist gives a longer view of the price movement:

April 16, 2013

Whales and Dolphins At Risk from Seismic Surveys

From Yale Environment360, a summary of a new report from environmental group Oceana on the dangers of seismic testing for offshore energy:

The proposed use of seismic air guns in the search for offshore oil and gas reserves along the U.S. East Coast could injure or kill nearly 140,000 marine animals annually and disrupt the vital activities of other species, a new study says. The seismic testing, in which guns filled with compressed air are fired repeatedly over deep-sea target areas to provide energy companies an image of the deposits below, would threaten marine species of all sizes, from tiny fish eggs to large whales, according to an analysis by the conservation group Oceana. The group said that the powerful air gun blasts, which it describes as “100,000 times more intense than a jet engine,” could disturb the breathing, feeding, and mating habits for dolphins and whales and cause injury or death to endangered species such as the North Atlantic right whale. The analysis comes as the U.S. Interior Department’s Bureau of Ocean Energy Management completes an environmental study on the potential effects of seismic activities from Delaware to Florida. Oil industry officials point to other research that shows seismic testing is unlikely to threaten marine mammals.
The industry case, with some rebuttals by Oceana, is given in this longer report by Jennifer A. Dlouhy at Fuelfix:

Industry representatives note that seismic technology has advanced dramatically in recent years — one reason that oil companies are eager for a look at data from the East Coast, where research is decades old. Geophysical survey companies also can tailor the timing of their studies to avoid animal migrations and minimize disruption. 

Industry officials also point to research that shows slim prospects of physical harm to marine life from seismic surveys. For example, during a 2012 study by scientists in San Diego aimed at studying the way marine mammals experience temporary losses in hearing sensitivity, the researchers could not induce the problem after exposing a dolphin to 10 impulses from an air gun. “None of the dolphins has exhibited significant behavioral reactions,” the scientists concluded. “These data suggest that the potential for seismic surveys using air guns to cause auditory effects on dolphins and similar delphinids may be lower than previously predicted.” 

Chip Gill, the president of the International Association of Geophysical Contractors, stressed that seismic analysis helps boost the odds that oil and gas companies will drill promising wells — rather than dry holes — effectively limiting the industry’s potential footprint. “We used to explore with a drill bit,” Gill said. “There’s a strong argument that seismic surveys could be the preferred environmental tool.” 

Oceana recommends that federal regulators require geophysical contractors to adopt mitigation techniques if they allow any seismic research along the East Coast. That could include use of less-disruptive seismic technology — not dependent on air guns — even though it may be a few years away. “If seismic testing is going to occur, (the Department of Interior) should require it be done using the least harmful technology available,” Oceana said in its report. Regulators also “should permanently close large areas to seismic surveying and drilling to protect vulnerable habitats and species.” 

Marine biologists say the government statistics don’t capture the potential damage, some of which manifests slowly over time. “For marine mammals that are more sensitive to sound and depend greatly on their hearing, such as whales and dolphins, the airgun noise can be a severe threat,” Oceana said. In the case of low-frequency noise, “the sound can travel thousands of miles away from the airgun source, interrupting whale calls and altering their behavior even at great distances. Fin and humpback whales in a 100,000 square mile area stopped singing in the North Atlantic because of such noise, and bowhead whales have abandoned their habitat because of it in Alaska.” 

Although the Obama administration’s five-year plan for selling offshore oil and gas leases through 2017 does not include any planned auctions of Atlantic waters, a new generation of seismic research could pave the way for future drilling in the region. Data indicating potential big untapped resources could add pressure for future administrations to lease Atlantic tracts and help plan any auctions in the area. The geological and geophysical surveys also would be used to dictate the siting of future renewable energy installations offshore and help pinpoint areas for sand and gravel mining.

* * *

Jennifer A. Dlouhy, "Seismic research on East Coast could harm 140,000 whales & dolphins," Fuelfix, April 16, 2013.

April 15, 2013

Tragedy of the Commons In Reverse

The journalist Fred Pearce writes in The Land Grabbers that his chosen title may seem pejorative, but is descriptively accurate. He defines land grabbing as "any contentious acquisition of large-scale land rights by a foreigner or other 'outsider,' whatever the legal status of the transaction." It's not all bad, he acknowledges, "but it all merits attention." Estimates of the amount of land taken over from indigenous inhabitants vary: The World Bank says 120 million acres, but other groups say a lot more.

Pearce's argument raises vital questions. It also splits the environmental movement wide open:

It is important to know what agribusiness can and cannot deliver. But it is equally important to be angered by the appalling injustice of people having their ancestral land pulled from beneath their feet. And to question the arrogance and ignorance surrounding claims, by home governments and Western investors alike, that huge areas of Africa are "empty" lands only awaiting the magic of foreign hands and foreign capital. And to balk at the patina of virtue that often surrounds environmentalists eagerly taking other people's land in the interests of protecting wildlife. What right do "green grabbers" have to take peasant fields and pastures to grow biofuels, cordon off rich pastures for nature conservation, shut up forests as carbon stores, and fence in wilderness as playpens and hunting grounds for rich sponsors? They are cooking up a "tragedy of the commons" in reverse.
Over the next few decades I believe land grabbing will matter more, to more of the planet's people, even than climate change. The new land rush looks increasingly like a final enclosure of the planet's wild places, a last roundup on the global commons. Is this the inevitable cost of feeding the world and protecting its surviving wildlife? Must the world's billion or so peasants and pastoralists give up their hinterlands in order to nourish the rest of us? Or is this a new colonialism that should be confronted--the moment when localism and communalism fight back? (ix-x)

 Fred Pearce, The Land Grabbers: The New Fight over Who Owns Earth (Beacon Press, 2012)

April 13, 2013

Frowning Through the Aquacalypse

Herewith extracts from a few stellar essays on the crisis of global fisheries. Daniel Pauly is a professor at the Fisheries Centre of the University of British Columbia, which sponsors (with the Pew Charitable Trusts) the Sea Around Us Project; his "Aquacalypse Now” appeared in The New Republic, September 28, 2009:

Our oceans have been the victims of a giant Ponzi scheme, waged with Bernie Madoff–like callousness by the world’s fisheries. Beginning in the 1950s, as their operations became increasingly industrialized--with onboard refrigeration, acoustic fish-finders, and, later, GPS--they first depleted stocks of cod, hake, flounder, sole, and halibut in the Northern Hemisphere. As those stocks disappeared, the fleets moved southward, to the coasts of developing nations and, ultimately, all the way to the shores of Antarctica, searching for icefishes and rockcods, and, more recently, for small, shrimplike krill. As the bounty of coastal waters dropped, fisheries moved further offshore, to deeper waters. And, finally, as the larger fish began to disappear, boats began to catch fish that were smaller and uglier--fish never before considered fit for human consumption. Many were renamed so that they could be marketed: The suspicious slimehead became the delicious orange roughy, while the worrisome Patagonian toothfish became the wholesome Chilean seabass. Others, like the homely hoki, were cut up so they could be sold sight-unseen as fish sticks and filets in fast-food restaurants and the frozen-food aisle.

The scheme was carried out by nothing less than a fishing-industrial complex--an alliance of corporate fishing fleets, lobbyists, parliamentary representatives, and fisheries economists. By hiding behind the romantic image of the small-scale, independent fisherman, they secured political influence and government subsidies far in excess of what would be expected, given their minuscule contribution to the GDP of advanced economies--in the United States, even less than that of the hair salon industry. In Japan, for example, huge, vertically integrated conglomerates, such as Taiyo or the better-known Mitsubishi, lobby their friends in the Japanese Fisheries Agency and the Ministry of Foreign Affairs to help them gain access to the few remaining plentiful stocks of tuna, like those in the waters surrounding South Pacific countries. Beginning in the early 1980s, the United States, which had not traditionally been much of a fishing country, began heavily subsidizing U.S. fleets, producing its own fishing-industrial complex, dominated by large processors and retail chains. Today, governments provide nearly $30 billion in subsidies each year--about one-third of the value of the global catch--that keep fisheries going, even when they have overexploited their resource base. As a result, there are between two and four times as many boats as the annual catch requires, and yet, the funds to “build capacity” keep coming.

The jig, however, is nearly up. In 1950, the newly constituted Food and Agriculture Organization (FAO) of the United Nations estimated that, globally, we were catching about 20 million metric tons of fish (cod, mackerel, tuna, etc.) and invertebrates (lobster, squid, clams, etc.). That catch peaked at 90 million tons per year in the late 1980s, and it has been declining ever since. Much like Madoff’s infamous operation, which required a constant influx of new investments to generate “revenue” for past investors, the global fishing-industrial complex has required a constant influx of new stocks to continue operation. Instead of restricting its catches so that fish can reproduce and maintain their populations, the industry has simply fished until a stock is depleted and then moved on to new or deeper waters, and to smaller and stranger fish. And, just as a Ponzi scheme will collapse once the pool of potential investors has been drained, so too will the fishing industry collapse as the oceans are drained of life.

Unfortunately, it is not just the future of the fishing industry that is at stake, but also the continued health of the world’s largest ecosystem. While the climate crisis gathers front-page attention on a regular basis, people--even those who profess great environmental consciousness--continue to eat fish as if it were a sustainable practice. But eating a tuna roll at a sushi restaurant should be considered no more environmentally benign than driving a Hummer or harpooning a manatee. In the past 50 years, we have reduced the populations of large commercial fish, such as bluefin tuna, cod, and other favorites, by a staggering 90 percent. One study, published in the prestigious journal Science, forecast that, by 2048, all commercial fish stocks will have “collapsed,” meaning that they will be generating 10 percent or less of their peak catches. Whether or not that particular year, or even decade, is correct, one thing is clear: Fish are in dire peril, and, if they are, then so are we.

The extent of the fisheries’ Ponzi scheme eluded government scientists for many years. They had long studied the health of fish populations, of course, but typically, laboratories would focus only on the species in their nation’s waters. And those studying a particular species in one country would communicate only with those studying that same species in another. Thus, they failed to notice an important pattern: Popular species were sequentially replacing each other in the catches that fisheries were reporting, and, when a species faded, scientific attention shifted to the replacement species. At any given moment, scientists might acknowledge that one-half or two-thirds of fisheries were being overfished, but, when the stock of a particular fish was used up, it was simply removed from the denominator of the fraction. For example, the Hudson River sturgeon wasn’t counted as an overfished stock once it disappeared from New York waters; it simply became an anecdote in the historical record. The baselines just kept shifting, allowing us to continue blithely damaging marine ecosystems.

It was not until the 1990s that a series of high-profile scientific papers demonstrated that we needed to study, and mitigate, fish depletions at the global level. They showed that phenomena previously observed at local levels--for example, the disappearance of large species from fisheries’ catches and their replacement by smaller species--were also occurring globally. It was a realization akin to understanding that the financial meltdown was due not to the failure of a single bank, but, rather, to the failure of the entire banking system--and it drew a lot of controversy. . . .

To some Western nations, an end to fish might simply seem like a culinary catastrophe, but for 400 million people in developing nations, particularly in poor African and South Asian countries, fish are the main source of animal protein. What’s more, fisheries are a major source of livelihood for hundreds of millions of people. A recent World Bank report found that the income of the world’s 30 million small-scale fishers is shrinking. The decrease in catch has also dealt a blow to a prime source of foreign-exchange earnings, on which impoverished countries, ranging from Senegal in West Africa to the Solomon Islands in the South Pacific, rely to support their imports of staples such as rice.

And, of course, the end of fish would disrupt marine ecosystems to an extent that we are only now beginning to appreciate. Thus, the removal of small fish in the Mediterranean to fatten bluefin tuna in pens is causing the “common” dolphin to become exceedingly rare in some areas, with local extinction probable. Other marine mammals and seabirds are similarly affected in various parts of the world. Moreover, the removal of top predators from marine ecosystems has effects that cascade down, leading to the increase of jellyfish and other gelatinous zooplankton and to the gradual erosion of the food web within which fish populations are embedded. This is what happened off the coast of southwestern Africa, where an upwelling ecosystem similar to that off California, previously dominated by fish such as hake and sardines, has become overrun by millions of tons of jellyfish.

Jellyfish population outbursts are also becoming more frequent in the northern Gulf of Mexico, where the fertilizer-laden runoff from the Mississippi River fuels uncontrolled algae blooms. The dead algae then fall to a sea bottom from which shrimp trawling has raked all animals capable of feeding on them, and so they rot, causing Massachusetts-sized “dead zones.” Similar phenomena--which only jellyfish seem to enjoy--are occurring throughout the world, from the Baltic Sea to the Chesapeake Bay, and from the Black Sea in southeastern Europe to the Bohai Sea in northeastern China. Our oceans, having nourished us since the beginning of the human species some 150,000 years ago, are now turning against us, becoming angry opponents.

That dynamic will only grow more antagonistic as the oceans become warmer and more acidic because of climate change. Fish are expected to suffer mightily from global warming, making it essential that we preserve as great a number of fish and of fish species as possible, so that those which are able to adapt are around to evolve and propagate the next incarnations of marine life. In fact, new evidence tentatively suggests that large quantities of fish biomass could actually help attenuate ocean acidification. In other words, fish could help save us from the worst consequences of our own folly--yet we are killing them off. The jellyfish-ridden waters we’re seeing now may be only the first scene in a watery horror show. . . .

The truth is that governments are the only entities that can prevent the end of fish. For one thing, once freed from their allegiance to the fishing-industrial complex, they are the ones with the research infrastructure capable of prudently managing fisheries. For another, it is they who provide the billions of dollars in annual subsidies that allow the fisheries to persist despite the lousy economics of the industry. Reducing these subsidies would allow fish populations to rebuild, and nearly all fisheries scientists agree that the billions of dollars in harmful, capacity-enhancing subsidies must be phased out. Finally, only governments can zone the marine environment, identifying certain areas where fishing will be tolerated and others where it will not. In fact, all maritime countries will have to regulate their exclusive economic zones (the 200-mile boundary areas established by the U.N. Law of the Sea Treaty within which a nation has the sole right to fish). The United States has the largest exclusive economic zone in the world, and it has taken important first steps in protecting its resources, notably in the northwest Hawaiian islands. Creating, or re-creating, un-fished areas within which fish populations can regenerate is the only opportunity we have to repair the damage done to them. . . .

* * *

The state of the oceans is also explored in Elizabeth Kolbert, “The Scales Fall: Is There Any Hope for Our Overfished Oceans?” The New Yorker, August 2, 2010:
The sorry state of ocean life has led to a new kind of fish story—a lament not for the one that got away but for the countless others that didn’t. In “Saved by the Sea: A Love Story with Fish” (St. Martin’s; $25.99), David Helvarg notes that each year sharks kill some five to eight humans worldwide; meanwhile we kill a hundred million of them. Dean Bavington, the author of “Managed Annihilation: An Unnatural History of the Newfoundland Cod Collapse” (University of British Columbia; $94), observes that two hundred billion pounds’ worth of cod were taken from Canada’s Grand Banks before 1992, when the cod simply ran out. In “Four Fish: The Future of the Last Wild Food” (Penguin Press; $25.95), Paul Greenberg estimates that somewhere in the range of a hundred million salmon larvae used to hatch in the Connecticut River each year. Now the number’s a lot easier to pin down: it’s zero. “The broad, complex genetic potential of the Connecticut River salmon,” Greenberg writes, has “vanished from the face of the earth.”

The assumption of the inexhaustibility of the sea’s resources is found in Hugo Grotius’s Mare Liberum (The Free Sea) of 1609; it received repeated affirmation thereafter. Kolbert draws attention to a lecture of Thomas Huxley at the opening of the 1883 Great International Fisheries Exhibition, affirming the belief that “probably all the great sea fisheries are inexhaustible; that is to say that nothing we do seriously affects the number of the fish.”  

Huxley’s views dominated thinking about fisheries for most of the next century. In 1955, Francis Minot, the director of the Marine and Fisheries Engineering Research Institute, in Woods Hole, Massachusetts, co-wrote a book titled “The Inexhaustible Sea.” As yet, he observed, “we do not know the ocean well enough. Much must still be learned. Nevertheless, we are already beginning to understand that what it has to offer extends beyond the limits of our imagination.” In 1964, the annual global catch totalled around fifty million tons; a U.S. Interior Department report from that year predicted that it could be “increased at least tenfold without endangering aquatic stocks.” Three years later, the department revised its estimate; the catch could be increased not by a factor of ten but by a factor of forty, to two billion tons a year. This, it noted, would be enough to feed the world’s population ten times over. Michael L. Weber observes, in “From Abundance to Scarcity” (2002), that as recently as the nineteen-nineties U.S. policy was predicated “on the belief that the ocean’s productivity was almost limitless.” 

In the meantime, “machinery” beyond Huxley’s wildest imagining was being developed. Purse seines were introduced in the nineteen-thirties. These giant nets can be played out around entire schools of fish, then gathered up with drawstrings, like huge laundry bags. Factory freezer trawlers, developed after the Second World War, grew to be so gargantuan that they amounted to, in effect, seafaring towns. In the nineteen-fifties, many fleets added echo-sounding sonar, which can detect fish schools long before they surface. Today, specially designed buoys known as “fish aggregating devices,” or FADs, are deployed to attract species like yellowfin tuna and blue marlin. So-called “smart” FADs come equipped with sonar and G.P.S., so operators can detect from afar whether they are, in fact, surrounded by fish. 

In the short term, the new technology worked, much as Huxley had predicted, to swell catches. But only in the short term. In the late nineteen-eighties, the total world catch topped out at around eighty-five million tons, which is to say, roughly 1.9 billion tons short of the Interior Department’s most lunatic estimate. This milestone—the point of what might be called “peak fish”—was passed without anyone’s quite realizing it, owing to inflated catch figures from the Chinese. (These fishy figures were later exposed as politically motivated fabrications.) For the past two decades, the global catch has been steadily declining. It is estimated that the total take is dropping by around five hundred thousand tons a year. 

Meanwhile, as the size of the catch has fallen, so, too, has the size of the creatures being caught. This phenomenon, which has become known as “fishing down the food web,” was first identified by Daniel Pauly, a fisheries biologist at the University of British Columbia. In “Five Easy Pieces: How Fishing Impacts Marine Ecosystems” (Island Press; $50), Pauly follows this trend to its logical—or, if you prefer, illogical—conclusion. Eventually, all that will be left in the oceans are organisms that people won’t, or can’t, consume, like sea slugs and toxic algae. It’s been argued that humans have become such a dominant force on the planet that we’ve ushered in a new geological epoch. Pauly proposes that this new epoch be called the Myxocene, from the Greek muxa, meaning “slime.” . . .

* * *

There is a useful archive of magazine essays on the fisheries crisis at the Sea Around Us Project.

April 12, 2013

Carbon Bubbles

A report last year from Carbontracker, in London, draws attention to the financial solvency of major fossil fuel producers. Its study, Unburnable Carbon, has had a big impact. Bill McKibben, of 350.org, has rallied students at the nation's leading colleges and universities, 250 at last count, to demand that their schools' endowments divest from the 200 companies listed on world stock exchanges that hold fossil fuel reserves. (The divestment would take place over a five-year period.) The divestment movement raises a lot of questions, but here I want to focus on the Carbontracker report, both to summarize its conclusions and to point out some weaknesses in it.

According to Carbontracker, “The total carbon potential of the Earth’s known fossil fuel reserves comes to 2795 GtCO2 [gigatons of carbon dioxide]. 65% of this is from coal, with oil providing 22% and gas 13%. This means that governments and global markets are currently treating as assets, reserves equivalent to nearly 5 times the carbon budget for the next 40 years. . . .We estimate the fossil fuel reserves held by the top 100 listed coal companies and the top 100 listed oil and gas companies represent potential emissions of 745 GtCO2. This exceeds the remaining carbon budget of 565 GtCO2 by 180 GtCO2.”


Such reserves, the report notes, cannot be burned if the world is to stay within its carbon budget. "Only 20% of the total reserves can be burned unabated, leaving up to 80% of assets technically unburnable."
This has profound implications for the world’s energy finance structures and means that using just the reserves listed on the world’s stock markets in the next 40 years would be enough to take us beyond 2°C of global warming. This calculation also assumes that no new fossil fuel resources are added to reserves and burnt during this period – an assumption challenged by the harsh reality that fossil fuel companies are investing billions per annum to find and process new reserves. It is estimated that listed oil and gas companies had CAPEX budgets of $798 billion in 2010. In addition, over two-thirds of the world’s fossil fuels are held by privately or state owned oil, gas and coal corporations, which are also contributing even more carbon emissions.
Given that only one fifth of the total reserves can be used to stay below 2°C warming, if this is applied uniformly, then only 149 of the 745 GtCO2 listed can be used unmitigated. This is where the carbon asset bubble is located. If applied to the world’s stock markets, this could result in a repricing of assets on a scale that would dwarf past profit warnings and revaluation of reserves.
Let us extrapolate a bit from Carbontracker's figures, which are admittedly astounding in showing the scale of the challenge. If we are to stay below two degrees of warming, the world can burn only 886 GtCO2 from 2000 to 2050. That is “the global carbon budget.” But in the first decade of this century the world used up over a third of its budget, having burned, according to one calculation, some 282 GtCO2, with land use change adding another 39 GtCO2. So that reduces to 565 GtCO2 the carbon budget for the years 2011-2050. Ignoring the “land use change” factor for the moment, that basically indicates that 28.2 GtCO2 was the amount burned per year in the first decade. In the next four decades, the amount would have to fall by half (565/40), to average 14.125 GtCO2 per year. Barring catastrophe, of course, an immediate reduction of fossil fuel use by half is impossible, so one would have to imagine a sharply descending line like that projected by Greenpeace in a recent report, with emissions in 2050 far below the 14.125 figure. (See further here.)
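The extrapolation above can be checked in a few lines. The figures are Carbontracker's as quoted (not independently verified), and the one-fifth rule is the report's uniform assumption about how the remaining budget is allocated across listed reserves:

```python
# Checking the carbon-budget arithmetic. All figures in GtCO2,
# taken from the Carbontracker report as quoted above.

TOTAL_RESERVES = 2795     # carbon potential of known fossil fuel reserves
BUDGET_2000_2050 = 886    # global carbon budget for staying below 2 degrees C
BURNED_2000_2010 = 282    # fossil fuel emissions in the first decade
LAND_USE_2000_2010 = 39   # additional emissions from land-use change
LISTED_RESERVES = 745     # reserves held by the top 200 listed companies

# Remaining budget for 2011-2050
remaining = BUDGET_2000_2050 - BURNED_2000_2010 - LAND_USE_2000_2010
assert remaining == 565

# Known reserves are "nearly 5 times" the remaining budget
print(round(TOTAL_RESERVES / remaining, 2))   # ~4.95

# Average annual emissions: first decade vs. what the next 40 years allow
first_decade_rate = BURNED_2000_2010 / 10     # 28.2 GtCO2 per year
allowed_rate = remaining / 40                 # 14.125 GtCO2 per year

# The report's uniform one-fifth rule: only 149 of the 745 GtCO2
# listed on exchanges can be burned unmitigated
print(LISTED_RESERVES / 5)                    # 149.0
```

The halving from 28.2 to roughly 14.1 GtCO2 per year is only an average; since an immediate 50 percent cut is impossible, any realistic trajectory must end well below that average by 2050.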

Carbontracker then asks fossil fuel companies--together with regulators and pension funds--to consider the following questions in stating and understanding their financial accounts:  

• Which of the assets you have an interest in are amongst the 20% of fossil fuel reserves we can afford to burn in the next 40 years?

• If you sanction capital expenditure on finding and developing more reserves, just how likely is it that those new reserves can ever be burned?

• What discount rates would it be prudent for investors to use when valuing reserves? Are historical discount rates too optimistic given the likely haircut to reserves values that corporate owners of fossil fuels are likely to have to take?

Furthermore, as the regulators of the capital markets will need to look closely at disclosures and reporting requirements around how reserves are presented, accountants and auditors will need to revise guidelines on how value is recorded:

• If not all reserves that are exchange listed can be burnt, how should auditors account for these stranded assets?
• What assumptions need to be reviewed in order to create a reliable assessment of which assets are contingent or impaired?

* * *

Carbontracker undoubtedly raises a set of important questions, but its approach also has some serious limitations.

First, it mixes together the reserves situation for coal, oil, and gas in a way that is distorting. Coal reserves are enormous and constitute, as Carbontracker notes, some 65% of the carbon potential of proven fossil fuel reserves. But the "reserves to production" ratio for coal is far larger than for natural gas or for oil: BP pegs it at 112 years for coal, against 63.6 years for natural gas and 54.2 years for oil, and even those aggregate figures conceal important regional variations.
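The reserves-to-production ratio is just proved reserves divided by current annual production, i.e. years of supply at today's rate. A minimal sketch; the reserve and production quantities below are illustrative placeholders, not BP data (only the resulting ratios correspond to the figures cited above):

```python
# Reserves-to-production (R/P) ratio: years of remaining supply
# if production continues at the current annual rate.
def rp_ratio(proved_reserves, annual_production):
    """Return reserves divided by annual production, in years."""
    return proved_reserves / annual_production

# Illustrative only: 1,000 units of reserves produced at 8.93 units/year
# yields an R/P ratio of roughly 112 years, coal's cited figure.
print(rp_ratio(1000.0, 8.93))
```

Note that the ratio assumes constant production; if output grows, the horizon shrinks, and newly proved reserves push it back out, which is why R/P figures drift over time rather than counting down.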

Second, the fossil fuel reserves of companies listed on stock exchanges account for only 26.6% of total fossil fuel reserves.

Both of those points are relevant to the movement to divest from fossil fuel companies. The former raises the question of whether it makes sense to target all fossil fuels indiscriminately. There are trade-offs among all fuel sources in terms of environmental impact, but King Coal is undoubtedly the dirtiest and the most prolific producer of CO2 emissions. Surely some discrimination among fossil fuels is desirable: Is there really no room to replace coal with natural gas in electricity generation? Are we to build more dams, produce more biofuels, or build more nuclear reactors to make up for the lost production? I just don't see "no fossil fuels," never, ever, as a viable energy policy.

The second point reflects a significant real-world conundrum: pressure on the western oil majors, if effective, could simply redound to the benefit of the National Oil Companies (or NOCs), which hold some seventy-five percent of world oil reserves. (Some studies put the figure even higher, at 90 percent.) The majors are much diminished from their former days of glory in control of the world oil industry; the nationalizations of the 1970s saw to that. Aiming at them illuminates only a limited portion of the intended target.

But the biggest reservation about Carbontracker's approach--and the approach, as well, of those who have followed its methodology in the divestment movement--is its one-sided focus on the supply side.

Unless we make heroic assumptions about the complete transformation of energy infrastructure in a short period of time, the sectors of the economy that depend on energy consumption would clearly be greatly affected by a radical reduction in fossil fuel use, perhaps even more than the fossil fuel producers themselves. Leave aside that oil-service firms, like Halliburton, do not fall under the proscription of Carbontracker or the divestment movement; what about oil refineries, steel producers, automakers, airlines? What would be their enterprise value under the assumption (a radical reduction in fossil fuel use) that Carbontracker is applying to the 200 companies it singles out? Since the entire economy is fossil fuel dependent, it seems strange that Carbontracker, ostensibly concerned with such things as fiduciary responsibility and the accurate reporting of assets, should be so incurious about the implications of its projections for the larger financial structure.


The challenges posed by the energy transition are undoubtedly enormous, and Carbontracker's calculations have the merit of drawing attention to the contradiction between "business as usual" approaches and the forecasts of impending doom coming from climate scientists. If its financial analysis seems rather otherworldly (with no adequate discussion of alternative supply, likely demand, and expected price), it nevertheless raises vital questions.

Carbontracker does not call the fossil fuel companies evil, but rather actuarially unsound. The divestment movement, which has run with Carbontracker's report, has not hesitated to call them exactly that. By contrast, I object to any approach that draws a ring of fire around the fossil fuel companies and denominates them as wicked, while in effect leaving the consumers of energy off the hook. Such an approach seems to me unbalanced and myopic. It also ignores the fact that the provision of energy, in the modern industrial civilization we inhabit, makes indispensable contributions to human welfare. We have excellent grounds for believing that such consumption, on current trends, will likely produce grim environmental consequences in the future, but also ample reason to believe that going cold turkey from fossil fuels would produce delirium tremens in the patient, and probably kill him. A resolution to this profound dilemma, distant though such a resolution may now appear, is one of the great challenges of the coming century, but I resist the idea that absolutist strategies constitute progress in dealing with it.

April 11, 2013

The Next Big Pandemic

Florence Williams explains “How Animals May Cause the Next Big One” in the New York Review of Books, reviewing David Quammen’s Spillover: Animal Infections and the Next Human Pandemic. Here are some excerpts:

For better or worse, Homo sapiens has become the most abundant large mammal ever to roam the planet. We have spread into nearly every conceivable terrestrial habitat. We have increased our fertility and decreased our mortality. We have reengineered ecosystems and food webs and disinterred fossil stores to produce our calories and condition our dwellings. We are seven billion strong, growing at a rate of 70 million people a year.

As E.O. Wilson, both an entomologist and a conservationist, put it, “When Homo sapiens passed the six-billion mark we had already exceeded by perhaps as much as 100 times the biomass [i.e., the mass of the living organism] of any large animal species that ever existed on the land.” He was talking about wild animals. We are only about five times more numerous and probably a little less massive than our livestock—herded, fattened, and medically dosed just for us. Or, as David Quammen puts it in his masterful new book Spillover: Animal Infections and the Next Human Pandemic: we are an “outbreak,” a species that has undergone a “vast, sudden population increase.” “And here’s the thing about outbreaks,” warns Quammen: “They end…. In some cases they end gradually, in other cases they end with a crash.”

If this sounds alarming, it’s meant to. Undergirding his book’s structure, and right there in the subtitle, is the prospect of another major pandemic, what he and various epidemiologists he has consulted call the Next Big One. What will cause it? Most likely, a virus. What kind of virus? A brand new one, or new, at least, to humans. It will likely be a coronavirus. These, like HIV, have genes written in RNA, not DNA. This means it will be quickly mutating and elusive to treat. Where will it come from? Another animal. When a virus from an animal host “spills over” onto us, this is called zoonosis. Quammen estimates that roughly 60 percent of human infectious diseases have originated with animals, including Lyme disease, West Nile fever, the bubonic plague, and all influenzas. Zoonosis is “a word of the future,” he writes, “destined for heavy use in the twenty-first century.”

This insight, of course, is not really new for anyone who has read such books as Laurie Garrett’s The Coming Plague (1994) or Richard Preston’s The Hot Zone (1994) or seen Steven Soderbergh’s film Contagion (2011), in which a virus found in pigs and bats rapidly spreads around the world. Peter Heller’s 2012 novel The Dog Stars takes place after an influenza strain has killed 99.9 percent of humanity. In the films 28 Days Later (2002) and I Am Legend (2007), a rabies-like infection has turned the civilized world into the eaters and the eaten, plus a small band of survivors.

The problem with many bio- apocalyptic scenarios is that they’re ahistorical and unscientific. Viruses that are effective killers, like Ebola, tend to burn out quickly because they annihilate their hosts before germs can spread too far. Viruses that are highly transmissible, like the so-called Spanish flu of 1918, tend to kill only a small percentage of those infected. (The Spanish flu infected 30 percent of the world’s population. It killed about 2 percent.)

Quammen knows this, and is too rigorous a journalist to overdramatize dangers. His central idea is that the study of viruses must also be a study of ecology. By limiting his scope to diseases transmitted by animals as opposed to, say, polio and smallpox (which, though devastating, are diseases only found in humans), he can explore the complex and fascinating connections between us and the animals around us, both wild and domestic. As humans have swarmed the planet, we’ve altered habitats both large and microscopic. The natural world is rapidly disintegrating, or at least reorganizing in vastly unpredictable ways, and Quammen has for years been writing about the consequences. In The Song of the Dodo: Island Biogeography in an Age of Extinction, he identifies his subject as “the extinction of species in a world that has been hacked into pieces.”

Spillover is a logical sequel. Although diseases can “reside undetected” within intact ecosystems, “ecological disturbance causes diseases to emerge.” We have not only disrupted, fragmented, and interrupted the web of relationships with animals in these places, but we’ve presented ourselves—our very own tissues and cells—as alternative targets for opportunistic microbes. . . .

As Quammen brilliantly portrays them, the lessons of the SARS virus are unnerving. We are a populous and hungry species eating our way across the taxonomic map. Our livestock are kept in close quarters with wild animals, and we travel over oceans in a day. The SARS outbreak could easily have been worse. A bigger disaster was averted because Chinese authorities were ultimately organized and ruthless about quarantining (even public spitters were fined $300). The hospitals in China and in Toronto were excellent. (What if the disease had broken out in New Delhi?) Moreover, as Quammen explains, SARS patients got very sick before the height of infection, helping them get off the streets and buses before they were too contagious. With many other viruses, the reverse is true. “When the Next Big One comes,” he writes, “we can guess, it will likely conform to the same perverse pattern, high infectivity preceding notable symptoms. That will help it to move through cities and airports like an angel of death.” . . .

Viruses mutate constantly. Many expire. But the strains we should fear are those that randomly manage to stay alive in a new host and keep replicating. Fifty years ago, Rachel Carson wrote, “If we are going to live so intimately with these chemicals—eating and drinking them, taking them into the very marrow of our bones—we had better know something about their nature and their power.” The same applies to emerging viruses, and it’s why people should read Quammen’s book. . . .

In a revealing passage, Quammen roundly criticizes the science writer Richard Preston for narrative exaggeration. Preston’s The Hot Zone was a gripping account of an Ebola-like outbreak among lab monkeys in a corporate research facility in Reston, Virginia, in 1989. The book was a best seller and the basis for a movie starring Dustin Hoffman. “There’s no question that it did more than any journal article or newspaper story to make ebolaviruses infamous and terrifying to the general public,” Quammen says. The problem, he explains, is that the catalog of horrors described by Preston—liquefying organs, people dissolving in their beds—wasn’t quite accurate. . . .

Spillover leaves the impression that viruses are terribly scary. Quammen makes the qualifying point that sometimes they are neutral or even salubrious, but then he drops it. This doesn’t seem quite right. In fact, humans are part virus. Our genome carries about 100,000 fragments of retrovirus DNA, making up 8 percent of our total genetic material. As Carl Zimmer explains in A Planet of Viruses:
Many scientists now argue that viruses contain a genetic archive that’s been circulating the planet for billions of years. When they try to trace the common ancestry of virus genes, they often work their way back to a time before the common ancestor of all cell-based life.

He notes that the French virologist Patrick Forterre has suggested that viruses may have “invented the double-stranded DNA molecule as a way to protect their genes from attack.” The mammalian placenta is made possible thanks to genes contributed by an ancient virus. Viral DNA is intertwined with ours and has been from our earliest beginnings. Viruses don’t just attack us; they are us. 

If this is the case, and if, as we know, humans have been invading habitats since we left the cradle of Africa tens of thousands of years ago, how new a threat is zoonosis, really? After all, far fewer humans are dying of infectious disease than ever before. Quammen anticipates this criticism and has a convincing answer: HIV. That family of viruses has killed 30 million people worldwide and another 34 million are currently infected. Decades after its discovery, we still can’t effectively treat or contain it in many parts of the world. Uniquely modern factors, including changing sexual and social patterns, poor public health, and easy global transmission, amplified the pandemic. Aside from HIV, we penetrate more deeply and more destructively into remote ecosystems through large-scale mining operations, deforestation, oil and gas exploration, modern agriculture, and, of course, human-caused climate change. 

World War I and globalization helped the Spanish flu become humankind’s worst viral outbreak. Despite its lethality rate of between one and two people per hundred infected, it ultimately killed around 50 million people. In the US, the virus was powerful enough to reduce the national average lifespan by ten years. More recently at home, West Nile virus infections were up 19 percent last summer. The Washington Post reported that the virus appears to be mutating to stronger strains capable of damaging the central nervous system. Mosquitoes, bearers of so many bad tidings for the human immune system, thrive in hotter, wetter places and longer warm seasons. . . .

* * *

New York Review of Books, April 25, 2013. Florence Williams is a Contributing Editor at Outside Magazine. Her book Breasts: A Natural and Unnatural History was published in 2012. (By the way, that's an excellent book on women's health issues, and an entertaining if depressing read.)

April 4, 2013

Reconsidering Hydro

A new study highlights the grim ecological consequences of hydroelectric power, noting the limited capacity of fish ladders at hydroelectric dams to sustain migratory fish runs. The results underline the risks of the global expansion of hydropower. Writes John Waldman:

Six colleagues and I undertook a study of the success — or, rather, failure — of Atlantic salmon, American shad, river herring, and other species in migrating from the sea to their spawning grounds past a gauntlet of dams on three rivers in the northeastern U.S. — the Susquehanna, Connecticut, and Merrimack. What we found was grimmer than we expected. For one species, American shad, less than 3 percent of the fish made it past all the dams in these rivers to their historical spawning reaches. 

Results for other anadromous species (those that spawn in fresh water and migrate to the ocean and back again) were nearly as bad. And the sobering aspect of these contemporary studies is that they are based on the insubstantial number of fish today as compared to earlier massive migrations of these species, which numbered in the many millions. While investigating fish passage on the Merrimack River in New Hampshire, our project’s lead researcher, Jed Brown of the U.S. Fish & Wildlife Service, was struck by the long-term lack of recovery of the targeted fish populations — at some fish restoration meetings there were more people in the room than salmon in the river. 

What has happened on the U.S. East Coast, as reported in our study published in the journal Conservation Letters in January, is of more than regional or national interest. There are important lessons, as well. Even as some large dams in the U.S. begin to be removed for environmental reasons, a hydropower boom is occurring worldwide. Thirty large dams have been announced for the Amazon River alone. Eleven major dams are planned for the lower Mekong River. The dam industry in Canada wants to dramatically expand its recent hydropower initiative.

And dam projects are proposed, planned, or in the works for Africa’s upper Nile, the Patuca in Honduras, the Teesta in India, the upper Yangtze in China, the Tigris in Turkey, the Selenge in Mongolia, and many others. Though most of these rivers lack anadromous fishes, many are home to richly diverse freshwater fish communities that make important seasonal migrations within these river systems.

 For the international community, the record of fish passage on rivers in the northeastern U.S. is a cautionary tale. Hydropower has often been billed as a clean source of renewable energy, and generating electricity without polluting the air or producing greenhouse gases is commendable. But “clean” is in the eye of the beholder, and any claims to being sustainable ignore its multifarious aquatic effects, including blocking fish passage, fragmenting habitat, and undermining a river’s fundamental ecological services. . . .

* * *

John Waldman, “Blocked Migration: Fish Ladders on U.S. Dams Are Not Effective,” Yale Environment 360, April 4, 2013.

April 3, 2013

Unsettled Science

These two charts from the March 30, 2013 Economist show that global mean temperatures have flattened over the past 10 to 15 years and are at the low end of projections. The Economist has hitherto been a stout proponent of mainstream climate science, and so its expression of doubts about the settled character of the science is quite notable. The major anomaly is that while temperatures have flattened, emissions have continued their meteoric rise. "The world added roughly 100 billion tonnes of carbon to the atmosphere between 2000 and 2010," about a quarter of all the carbon dioxide released by human activity since 1750. And yet, concedes climate hawk James Hansen, "the five-year mean temperature has been flat for a decade." Also stalling out is the long term rise in surface seawater temperatures, as another chart from the article illustrates:

How to explain these anomalies? What follows is the first half of the Economist's explanation:

The mismatch might mean that—for some unexplained reason—there has been a temporary lag between more carbon dioxide and higher temperatures in 2000-10. Or it might be that the 1990s, when temperatures were rising fast, was the anomalous period. Or, as an increasing body of research is suggesting, it may be that the climate is responding to higher concentrations of carbon dioxide in ways that had not been properly understood before. This possibility, if true, could have profound significance both for climate science and for environmental and social policy.

The term scientists use to describe the way the climate reacts to changes in carbon-dioxide levels is “climate sensitivity”. This is usually defined as how much hotter the Earth will get for each doubling of CO2 concentrations. So-called equilibrium sensitivity, the commonest measure, refers to the temperature rise after allowing all feedback mechanisms to work (but without accounting for changes in vegetation and ice sheets).

Carbon dioxide itself absorbs infra-red at a consistent rate. For each doubling of CO2 levels you get roughly 1°C of warming. A rise in concentrations from preindustrial levels of 280 parts per million (ppm) to 560ppm would thus warm the Earth by 1°C. If that were all there was to worry about, there would, as it were, be nothing to worry about. A 1°C rise could be shrugged off. But things are not that simple, for two reasons. One is that rising CO2 levels directly influence phenomena such as the amount of water vapour (also a greenhouse gas) and clouds that amplify or diminish the temperature rise. This affects equilibrium sensitivity directly, meaning doubling carbon concentrations would produce more than a 1°C rise in temperature. The second is that other things, such as adding soot and other aerosols to the atmosphere, add to or subtract from the effect of CO2. All serious climate scientists agree on these two lines of reasoning. But they disagree on the size of the change that is predicted.
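The relation the passage above describes is logarithmic: warming scales with the number of doublings of CO2 concentration. A minimal sketch, where the sensitivity-per-doubling parameter is roughly 1°C with no feedbacks and about 3°C on the IPCC's best estimate once feedbacks are included:

```python
import math

# Warming implied by a given CO2 concentration, relative to the
# preindustrial baseline of 280 ppm. Each doubling of concentration
# adds one "unit" of the sensitivity parameter.
def warming(c_ppm, c0_ppm=280.0, sensitivity_per_doubling=1.0):
    """Temperature rise in degrees C for concentration c_ppm."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

print(warming(560.0))                                 # one doubling, no feedbacks: 1.0
print(warming(560.0, sensitivity_per_doubling=3.0))   # IPCC best estimate: 3.0
```

The logarithm is why the debate over sensitivity matters so much: going from 280 to 560 ppm and from 560 to 1,120 ppm each buy the same temperature increment, so the whole policy calculation turns on the size of that single per-doubling number.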

The Intergovernmental Panel on Climate Change (IPCC), which embodies the mainstream of climate science, reckons the answer is about 3°C, plus or minus a degree or so. In its most recent assessment (in 2007), it wrote that “the equilibrium climate sensitivity…is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C and is very unlikely to be less than 1.5°C. Values higher than 4.5°C cannot be excluded.” The IPCC’s next assessment is due in September. A draft version was recently leaked. It gave the same range of likely outcomes and added an upper limit of sensitivity of 6°C to 7°C.

A rise of around 3°C could be extremely damaging. The IPCC’s earlier assessment said such a rise could mean that more areas would be affected by drought; that up to 30% of species could be at greater risk of extinction; that most corals would face significant biodiversity losses; and that there would be likely increases of intense tropical cyclones and much higher sea levels.

Other recent studies, though, paint a different picture. An unpublished report by the Research Council of Norway, a government-funded body, which was compiled by a team led by Terje Berntsen of the University of Oslo, uses a different method from the IPCC’s. It concludes there is a 90% probability that doubling CO2 emissions will increase temperatures by only 1.2-2.9°C, with the most likely figure being 1.9°C. The top of the study’s range is well below the IPCC’s upper estimates of likely sensitivity.

This study has not been peer-reviewed; it may be unreliable. But its projections are not unique. Work by Julia Hargreaves of the Research Institute for Global Change in Yokohama, which was published in 2012, suggests a 90% chance of the actual change being in the range of 0.5-4.0°C, with a mean of 2.3°C. This is based on the way the climate behaved about 20,000 years ago, at the peak of the last ice age, a period when carbon-dioxide concentrations leapt. Nic Lewis, an independent climate scientist, got an even lower range in a study accepted for publication: 1.0-3.0°C, with a mean of 1.6°C. His calculations reanalysed work cited by the IPCC and took account of more recent temperature data. In all these calculations, the chances of climate sensitivity above 4.5°C become vanishingly small.

If such estimates were right, they would require revisions to the science of climate change and, possibly, to public policies. If, as conventional wisdom has it, global temperatures could rise by 3°C or more in response to a doubling of emissions, then the correct response would be the one to which most of the world pays lip service: rein in the warming and the greenhouse gases causing it. This is called “mitigation”, in the jargon. Moreover, if there were an outside possibility of something catastrophic, such as a 6°C rise, that could justify drastic interventions. This would be similar to taking out disaster insurance. It may seem an unnecessary expense when you are forking out for the premiums, but when you need it, you really need it. Many economists, including William Nordhaus of Yale University, have made this case.

If, however, temperatures are likely to rise by only 2°C in response to a doubling of carbon emissions (and if the likelihood of a 6°C increase is trivial), the calculation might change. Perhaps the world should seek to adjust to (rather than stop) the greenhouse-gas splurge. There is no point buying earthquake insurance if you do not live in an earthquake zone. In this case more adaptation rather than more mitigation might be the right policy at the margin. But that would be good advice only if these new estimates really were more reliable than the old ones. And different results come from different models.

The issue brief goes on to describe the differing climate models extant. Neither the "general circulation models" nor the "energy balance models" are without serious limitations. So far as the science is concerned, the general conclusion is that "a small reduction in estimates of climate sensitivity would seem to be justified: a downwards nudge on various best estimates from 3°C to 2.5°C, perhaps; a lower ceiling (around 4.5°C), certainly. If climate scientists were credit-rating agencies, climate sensitivity would be on negative watch. But it would not yet be downgraded."

The accompanying leader from the Economist ("Apocalypse Perhaps a Little Later") notes that the recent flattening of temperatures is not a good reason to stop worrying. Though the risk of extreme warming is arguably diminished, not-so-extreme warming still represents a real risk. Other bits of evidence, like the death spiral of Arctic sea ice, continue to point in a more alarmist direction. Moreover, the world has not really begun to take measures to counteract the risk of warming: though "climate rhetoric has been based on fears of high sensitivity, climate policy has not been."
If climate policy continues to be this impotent, then carbon-dioxide levels could easily rise so far that even a low-sensitivity planet will risk seeing changes that people would sorely regret. There is no plausible scenario in which carbon emissions continue unchecked and the climate does not warm above today’s temperatures.

Bad climate policies, such as backing renewable energy with no thought for the cost, or insisting on biofuels despite the damage they do, are bad whatever the climate’s sensitivity to greenhouse gases. Good policies—strategies for adapting to higher sea levels and changing weather patterns, investment in agricultural resilience, research into fossil-fuel-free ways of generating and storing energy—are wise precautions even in a world where sensitivity is low. So is putting a price on carbon and ensuring that, slowly but surely, it gets ratcheted up for decades to come.
* * *

"Climate Science: A Sensitive Matter,"  The Economist, March 30, 2013

Climate hawks dispute the significance of the flattening of the upper ocean heat content anomaly: about 90% of global warming goes into heating the oceans, they argue, and much of the "missing heat" has been found in the deep oceans below 700 meters.
