January 30, 2011

The Anthropocene and All That

From Yale Environment 360, "Living in the Anthropocene":
It’s a pity we’re still officially living in an age called the Holocene. The Anthropocene — human dominance of biological, chemical and geological processes on Earth — is already an undeniable reality. Evidence is mounting that the name change suggested by one of us more than ten years ago is overdue. It may still take some time for the scientific body in charge of naming big stretches of time in Earth’s history, the International Commission on Stratigraphy, to make up its mind about this name change. But that shouldn’t stop us from seeing and learning what it means to live in this new Anthropocene epoch, on a planet that is being anthroposized at high speed.
For millennia, humans have behaved as rebels against a superpower we call “Nature.” In the 20th century, however, new technologies, fossil fuels, and a fast-growing population resulted in a “Great Acceleration” of our own powers. Albeit clumsily, we are taking control of Nature’s realm, from climate to DNA. We humans are becoming the dominant force for change on Earth. A long-held religious and philosophical idea — humans as the masters of planet Earth — has turned into a stark reality. What we do now already affects the planet of the year 3000 or even 50,000.
Changing the climate for millennia to come is just one aspect. By cutting down rainforests, moving mountains to access coal deposits and acidifying coral reefs, we fundamentally change the biology and the geology of the planet. While driving uncountable numbers of species to extinction, we create new life forms through gene technology, and, soon, through synthetic biology.
This essay from the Economist (May 26, 2011) has more on the concept:

According to studies by Erle Ellis, an ecologist at the University of Maryland, Baltimore County, the vast majority of ecosystems on the planet now reflect the presence of people. There are, for instance, more trees on farms than in wild forests. And these anthropogenic biomes are spread about the planet in a way that the ecological arrangements of the prehuman world were not. The fossil record of the Anthropocene will thus show a planetary ecosystem homogenised through domestication. 
More sinisterly, there are the fossils that will not be found. Although it is not yet inevitable, scientists warn that if current trends of habitat loss continue, exacerbated by the effects of climate change, there could soon be a dramatic wave of extinctions.
All these things would show future geologists that humans had been present. But though they might be diagnostic of the time in which humans lived, they would not necessarily show that those humans shaped their time in the way that people pushing the idea of the Anthropocene want to argue. The strong claim of those announcing the recent dawning of the age of man is that humans are not just spreading over the planet, but are changing the way it works.  
Such workings are the province of Earth-system science, which sees the planet not just as a set of places, or as the subject of a history, but also as a system of forces, flows and feedbacks that act upon each other. This system can behave in distinctive and counterintuitive ways, including sometimes flipping suddenly from one state to another. To an Earth-system scientist the difference between the Quaternary period (which includes the Holocene) and the Neogene, which came before it, is not just what was living where, or what the sea level was; it is that in the Neogene the climate stayed stable whereas in the Quaternary it swung in and out of a series of ice ages. The Earth worked differently in the two periods. 
The clearest evidence for the system working differently in the Anthropocene comes from the recycling systems on which life depends for various crucial elements. In the past couple of centuries people have released quantities of fossil carbon that the planet took hundreds of millions of years to store away. This has given them a commanding role in the planet’s carbon cycle.  
Although the natural fluxes of carbon dioxide into and out of the atmosphere are still more than ten times larger than the amount that humans put in every year by burning fossil fuels, the human addition matters disproportionately because it unbalances those natural flows. . . . The result of putting more carbon into the atmosphere than can be taken out of it is a warmer climate, a melting Arctic, higher sea levels, improvements in the photosynthetic efficiency of many plants, an intensification of the hydrologic cycle of evaporation and precipitation, and new ocean chemistry. 
All of these have knock-on effects both on people and on the processes of the planet. More rain means more weathering of mountains. More efficient photosynthesis means less evaporation from croplands. And the changes in ocean chemistry are the sort of thing that can be expected to have a direct effect on the geological record if carbon levels rise far enough.
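The flux accounting in the passage above can be sketched as a toy one-box model. The numbers below are illustrative round figures, not the actual measured fluxes; the point is simply that when the large natural flows nearly cancel, even a comparatively small human flux accumulates year after year:

```python
# Toy one-box carbon model: the large natural fluxes roughly cancel,
# so the net human addition accumulates in the atmosphere.
# All numbers are illustrative round figures, not measured values.

def run_box_model(years, natural_in=120.0, natural_out=120.0, human=9.0,
                  sink_fraction=0.5, start=850.0):
    """Return the atmospheric carbon stock (GtC) after `years` years.

    natural_in / natural_out: large, nearly balanced natural fluxes (GtC/yr)
    human: fossil-fuel emissions (GtC/yr)
    sink_fraction: share of human emissions absorbed by ocean/land sinks
    """
    stock = start
    for _ in range(years):
        net_natural = natural_in - natural_out        # ~0 when balanced
        net_human = human * (1 - sink_fraction)       # the part that stays
        stock += net_natural + net_human
    return stock

if __name__ == "__main__":
    print(run_box_model(100))  # grows 4.5 GtC/yr despite the huge gross flows
```

With balanced natural fluxes of 120 GtC/yr each way and 9 GtC/yr of emissions, half absorbed by sinks, the stock grows by 4.5 GtC every year no matter how large the gross natural flows are, which is the sense in which the small human flux "matters disproportionately."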

The Food Revolution

From Ambrose Evans-Pritchard:
The immediate cause of this food spike was the worst drought in Russia and the Black Sea region for 130 years, lasting long enough to damage winter planting as well as the summer harvest. Russia imposed an export ban on grains. This was compounded by late rains in Canada, La Niña disruptions in Argentina, and a series of acreage downgrades in the US. The world’s stocks-to-use ratio for corn is nearing a 30-year low of 12.8pc, according to Rabobank.
The deeper causes are well-known: an annual rise in global population by 73m; the “exhaustion” of the Green Revolution as the gains in crop yields fade, to cite the World Bank; diet shifts in Asia as the rising middle class switch to animal-protein diets, requiring 3-5 kilos of grain feed for every kilo of meat produced; the biofuel mandates that have diverted a third of the US corn crop into ethanol for cars.
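The 3-5 kilo feed ratio quoted above implies a multiplier on grain demand that is easy to sketch. The figures in the example call below are hypothetical, chosen only to show the arithmetic:

```python
# Back-of-envelope grain demand from a diet shift toward meat.
# The quoted ratio: 3-5 kg of feed grain per kg of meat produced.

def extra_grain_demand(people, meat_kg_per_person, feed_ratio=4.0):
    """Tonnes of additional grain needed per year to feed the animals
    behind a given increase in per-capita meat consumption."""
    feed_kg = people * meat_kg_per_person * feed_ratio
    return feed_kg / 1000.0  # kg -> tonnes

# Hypothetical example: 100 million people each adding 20 kg of meat a year,
# at the midpoint feed ratio of 4:
demand = extra_grain_demand(100e6, 20)
print(f"{demand / 1e6:.0f} million tonnes of grain")  # 8 million tonnes of grain
```

Every kilo of meat added to Asian diets thus pulls several kilos of grain off the world market, which is why the diet shift weighs so heavily among the "deeper causes."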

Add the loss of farmland to Asia’s urban sprawl, and the depletion of the non-renewable aquifers for irrigation of North China’s plains, and the geopolitics of global food supply starts to look neuralgic.

Can the world head off mass famine? Yes, with leadership. The regions of the ex-Soviet Union farm 30m hectares less today than in the Khrushchev era, and yields are half western levels.

There are untapped hinterlands in Brazil, and in Africa where land titles and access to credit could unleash a great leap forward. The global reservoir of unforested cropland is 445m hectares, compared to 1.5 billion in production. But the low-hanging fruit has already gone, and the vast investment needed will not come soon enough to avoid a menacing shift in the terms of trade between the land and the urban poor.

We are on a thinner margin of food security, as North Africa is discovering painfully, and China understands all too well. Perhaps it is a little too early to write off farm-rich Europe and America.

Arctic Warming and the Hockey Stick

From the New York Times' Green blog, a new study from the University of Colorado:
Water flowing from the North Atlantic into the Arctic Ocean is warmer today than at any time in the past 2,000 years, a new study shows.
The waters of the Fram Strait, which runs between Greenland and the Arctic archipelago of Svalbard, have warmed roughly 3.5 degrees Fahrenheit over the past 100 years, the study’s authors said. The water temperatures are about 2.5 degrees higher than during the Medieval Warm Period, a time of elevated warmth from A.D. 900 to 1300.

The findings are another indication that recent global warming is atypical in the context of historical climate fluctuations, said Thomas Marchitto, a paleoclimatologist at the University of Colorado, Boulder, and a co-author of the study.

“It doesn’t necessarily prove that the change that we see is man-made, but it does strongly point toward this being an unusual event,” Dr. Marchitto said. “On a scale of 2,000 years, it stands out dramatically as something that does not look natural.”

The scientists used cores of ocean sediment containing fossils of microscopic shelled organisms called foraminifera to reconstruct past water temperatures in the strait. They found that the abundance of a species of warmer-water foraminifera rose sharply in the last 100 years, becoming dominant over a cold-water variety for the first time in 2,000 years.

The scientists also tested the shells for levels of magnesium, which rise in tandem with water temperature.

“Both of those approaches gave us the same answers,” Dr. Marchitto said.

Scientists called the study yet another validation of the so-called “hockey stick” graph, a historical reconstruction of global temperatures first published in the late 1990s that showed a steep rise in temperatures in modern times.

“It’s one more piece of evidence that the past 100 years has been an anomaly,” Joshua Willis, an oceanographer at the Jet Propulsion Laboratory in Pasadena, Calif., told a science writer with the journal Nature.

The hockey stick graph has long been a target of climate skeptics, who assert that temperatures during the Medieval Warm Period, when vineyards were planted in England and southern Greenland was settled by Norse colonists, were probably higher than they are today. Were temperatures higher in the past, skeptics argue, then recent warming is more likely to be a result of natural variability than a consequence of human activity.

January 29, 2011

Egypt: Food, Population, and Crisis

Many poor countries are suffering from the rise in food prices, just as in 2008, when food riots shook many governments. The situation is particularly acute in Egypt, despite food subsidies provided by the Egyptian government.

From The Oil Drum:
As population grows, the amount of land needed for housing and businesses rises, and the amount of land for agriculture falls. So Egypt can produce less of its own food, as time goes on.
Egypt is reported to be the world’s largest importer of wheat. In 2010, the oil minister stated that Egypt imports 40% of its food, and 60% of its wheat. The problem this year is that world wheat production is down (at least in part due to weather problems in Russia) so world exports are down.



This graph from a Credit Suisse presentation, showing that 40% of monthly expenditures for Egyptians goes for food, is also quite revealing:


January 28, 2011

Oil and Food Prices: The Close Correlation




This graph from Paul Chefurka plots together the Food Price Index data from the UN Food and Agriculture Organization and the monthly average oil price from the U.S. Energy Information Administration. As Chefurka observes, there is an astonishingly close relationship: "the correlation coefficient of the two data sets is 0.93!" Chefurka also notes that food prices seem to lead oil prices by a few months, indicating that they are responding to the same underlying set of circumstances.
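The 0.93 figure is a Pearson correlation coefficient. A minimal sketch of the computation, using made-up illustrative series rather than the actual FAO and EIA data:

```python
# Pearson correlation, the statistic behind Chefurka's 0.93 figure.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly series, not the real FAO/EIA data:
food_index = [100, 105, 112, 130, 160, 140, 120, 135]
oil_price  = [ 50,  55,  60,  75,  95,  80,  65,  70]
print(f"r = {pearson(food_index, oil_price):.2f}")
```

A coefficient of 0.93 on monthly data is very high for two economic series, which is why Chefurka reads it as evidence of a common underlying driver rather than coincidence.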

Hat tip: Energy Bulletin

January 24, 2011

China's Bid for World Resources

This neat map from Forbes (click on link for the interactive version) shows Chinese purchases of energy and mineral resources over the last several years.


Seen in isolation, such purchases seem enormous. However, as Charles Wolf points out in The Wall Street Journal (Jan. 24, 2011), the scale, compared with the acquisitions of other countries, has been modest thus far but is set to grow rapidly.
From 2007 through the first half of this year, Chinese buyers—state, private and in between—acquired 400 companies located outside of the country. The acquisitions span a wide range: mining companies in Australia, Vietnam and South America; oil and gas companies in Africa and the Middle East; banking, financial services and insurance companies in Europe; and electronics, telecommunications and lab testing companies in the U.S. The total cost? $86 billion.

That may sound like a big number, but in fact it's relatively small. The number of companies at play in global cross-border M&A markets during this period exceeded 12,400, with acquisition costs of more than $1.3 trillion. China's share of the total number was 3.2%, and its share of total acquisition value was 6.6%. China ranks sixth in both number of deals and in acquisition value: behind the U.S., U.K., France, Germany and Japan, though just ahead of the United Arab Emirates. Cross-border acquisitions by U.S. investors numbered over 5,000 (42.1% of all transactions), and nearly $400 billion by value (30.1% of aggregate value).
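Wolf's percentages follow directly from the figures he quotes (400 of roughly 12,400 deals; $86 billion of more than $1.3 trillion):

```python
# Checking Wolf's shares against the figures quoted above.
china_deals, total_deals = 400, 12_400
china_value, total_value = 86e9, 1.3e12

share_count = china_deals / total_deals * 100
share_value = china_value / total_value * 100
print(f"deal-count share: {share_count:.1f}%")  # 3.2%
print(f"deal-value share: {share_value:.1f}%")  # 6.6%
```

Note that China's share by value is roughly double its share by deal count, reflecting the large average size of its resource acquisitions.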

China's acquisitions beyond its borders are also modest compared with foreign investors' acquisitions within China. Currently, China's annual cross-border acquisitions are about half of annual foreign direct investments in the country.

What is significant about China's acquisitions over the past few years is the change they represent from the negligible amounts in the past. Prior to 2007, nearly all China's foreign investments involved buying U.S. debt, along with lesser purchases of euro and yen debt. China currently holds more than $1.6 trillion of U.S. government debt and an additional $1 trillion of non-U.S. government debt and other assets.

It didn't use to be in the business of acquiring foreign companies. That's changed, and I expect that China's acquisitions will at least double in the next five years, and perhaps quadruple by 2020.

Weather-Related Natural Catastrophes on the Rise

Has global warming contributed to a greater number of weather-related natural catastrophes? The argument that it has was made by Munich Re, the global reinsurer, in a recent report. The evidence is complicated: some of the rise in damage, for example, reflects larger populations and more built-up areas along hurricane-prone coasts. On the other hand, improvements in warning times and in building codes should work in the opposite direction. One of the co-authors of the Munich Re study, Dr. Peter Höppe, offered this intriguing piece of evidence to Joe Romm of Climate Progress: 
For me the most convincing piece of evidence that global warming has been contributing already to more and more intense weather related natural catastrophes is the fact that while we find a steep increase in the number of loss relevant weather events (about tripling in the last 30 years) we only find a slight increase in geophysical (earthquake, volcano, tsunami) events, which should not be affected by global warming. If the whole trend we find in weather related disaster should be caused by reporting bias, or socio-demographic or economic developments we would expect to find it similarly for the geophysical events. 

January 18, 2011

Bombing Iran Less Likely

Over the past three years, yours truly has been greatly alarmed by the prospect that Israel or the United States would bomb Iran's nuclear facilities. I regarded that course of action as imprudent, immoral, and illegal. It seemed to me, and still seems, to be incredibly dangerous and wholly unnecessary. Happily, that prospect has greatly eased over the last several months, with the most important development being the apparent success of a U.S.-backed Israeli effort, via a computer virus, to make Iran's centrifuges go haywire. How far that has set back the Iranian program is still disputed, but it seems likely to have delayed the prospect of an Iranian bomb by several years.

That marks a striking change. As late as last fall, it was not unreasonable to think that such an attack might be forthcoming in late 2010 or 2011. Jeffrey Goldberg, in a piece in the Atlantic, warned that Israel would likely launch a strike against Iran were sanctions and diplomatic pressure to fail. Even then, to be sure, there was contrary evidence: George Bush, it was widely known, had vetoed an Israeli attack in 2008, despite being importuned by Dick Cheney to do the deed. And though the columnist David Broder suggested in late 2010 that President Obama would rally the country and ensure re-election by approving an attack on Iran, that prospect seemed quite contrary to the president's innate caution and prudence. If Bush wouldn't approve it, could we really expect that Obama would? Still, the thing could definitely not be ruled out. A weak president might find Israel's importunings irresistible, or impossible to stop.

But the Stuxnet computer virus, if reports of its success are to be credited, constitutes a pretty dramatic game-changer.

Here are parts of the New York Times' initial report on the story, "Israeli Test on Worm Called Crucial in Iran Nuclear Delay," January 15, 2011:

Though American and Israeli officials refuse to talk publicly about what goes on at Dimona, the operations there, as well as related efforts in the United States, are among the newest and strongest clues suggesting that the virus was designed as an American-Israeli project to sabotage the Iranian program.


In recent days, the retiring chief of Israel’s Mossad intelligence agency, Meir Dagan, and Secretary of State Hillary Rodham Clinton separately announced that they believed Iran’s efforts had been set back by several years. Mrs. Clinton cited American-led sanctions, which have hurt Iran’s ability to buy components and do business around the world.

The gruff Mr. Dagan, whose organization has been accused by Iran of being behind the deaths of several Iranian scientists, told the Israeli Knesset in recent days that Iran had run into technological difficulties that could delay a bomb until 2015. That represented a sharp reversal from Israel’s long-held argument that Iran was on the cusp of success.

The biggest single factor in putting time on the nuclear clock appears to be Stuxnet, the most sophisticated cyberweapon ever deployed.

In interviews over the past three months in the United States and Europe, experts who have picked apart the computer worm describe it as far more complex — and ingenious — than anything they had imagined when it began circulating around the world, unexplained, in mid-2009.

The worm itself now appears to have included two major components. One was designed to send Iran’s nuclear centrifuges spinning wildly out of control. Another seems right out of the movies: The computer program also secretly recorded what normal operations at the nuclear plant looked like, then played those readings back to plant operators, like a pre-recorded security tape in a bank heist, so that it would appear that everything was operating normally while the centrifuges were actually tearing themselves apart. . . .


The attacks were not fully successful: Some parts of Iran’s operations ground to a halt, while others survived, according to the reports of international nuclear inspectors. Nor is it clear the attacks are over: Some experts who have examined the code believe it contains the seeds for yet more versions and assaults.

“It’s like a playbook,” said Ralph Langner, an independent computer security expert in Hamburg, Germany, who was among the first to decode Stuxnet. “Anyone who looks at it carefully can build something like it.” Mr. Langner is among the experts who expressed fear that the attack had legitimized a new form of industrial warfare, one to which the United States is also highly vulnerable. . . .

The project’s political origins can be found in the last months of the Bush administration. In January 2009, The New York Times reported that Mr. Bush authorized a covert program to undermine the electrical and computer systems around Natanz, Iran’s major enrichment center. President Obama, first briefed on the program even before taking office, sped it up, according to officials familiar with the administration’s Iran strategy. So did the Israelis, other officials said. Israel has long been seeking a way to cripple Iran’s capability without triggering the opprobrium, or the war, that might follow an overt military strike of the kind they conducted against nuclear facilities in Iraq in 1981 and Syria in 2007.

Two years ago, when Israel still thought its only solution was a military one and approached Mr. Bush for the bunker-busting bombs and other equipment it believed it would need for an air attack, its officials told the White House that such a strike would set back Iran’s programs by roughly three years. Its request was turned down.

Now, Mr. Dagan’s statement suggests that Israel believes it has gained at least that much time, without mounting an attack. So does the Obama administration.
A few other developments of the last several months may also be noted here.

The Wikileaks disclosures showed that King Abdullah of Saudi Arabia had been hot for the United States to take action against Iran's nuclear program and "cut off the head of the snake" (though the lower levels of the Saudi foreign policy establishment, as elsewhere in the Arab world, were much more cautious). Such importunings from Saudi and other Arab leaders had been previously reported, but the Wikileaks disclosures put the official seal on the matter. (Too bad that members of the U.S. military and intelligence apparatus are forbidden from reading them, because they're classified!)

Two Iranian nuclear scientists have been assassinated--in all probability by Israeli agents.

Serious rifts have appeared in Iran itself. According to this piece in the Atlantic, it is Ahmadinejad, the reputed crazy man, who has favored making a deal with the West over the nuclear question, and who has emerged in Iran's internal battles as both favoring greater recognition of human rights and emphasizing Iran's Persian (rather than Islamic) heritage.
That is the surprising impression one gets reading the latest WikiLeaks revelations, which portray Ahmadinejad as open to making concessions on Iran's nuclear program and far more accommodating to Iranians' demands for greater freedoms than anyone would have thought. Two episodes in particular deserve special scrutiny not only for what they reveal about Ahmadinejad but for the light they shed on the question of who really calls the shots in Iran.

In October 2009, Ahmadinejad's chief nuclear negotiator, Saeed Jalili, worked out a compromise with world power representatives in Geneva on Iran's controversial nuclear program. But the deal, in which Iran agreed to ship nearly its entire stockpile of low enriched uranium to Russia and France for processing, collapsed when it failed to garner enough support in Iran's parliament, the Majles.

According to a U.S. diplomatic cable recently published by WikiLeaks, Ahmadinejad, despite all of his tough talk and heated speeches about Iran's right to a nuclear program, fervently supported the Geneva arrangement, which would have left Iran without enough enriched uranium to make a nuclear weapon. But, inside the often opaque Tehran government, he was thwarted from pursuing the deal by politicians on both the right and the left who saw the agreement as a "defeat" for the country and who viewed Ahmadinejad as, in the words of Ali Larijani, the conservative Speaker of the Majles, "fooled by the Westerners."

Despite the opposition from all sides, Ahmadinejad, we have learned, continued to tout the nuclear deal as a positive and necessary step for Iran. In February 2010, he reiterated his support for the Geneva agreement saying, "If we allow them to take [Iran's enriched uranium for processing], there is no problem." By June, long after all parties in the Geneva agreement had given up on the negotiations and the Iranian government had publicly taken a much firmer line on its nuclear program, Ahmadinejad was still trying to revive the deal. "The Tehran declaration is still alive and can play a role in international relations even if the arrogant (Western) powers are upset and angry," he declared. Even as late as September, Ahmadinejad was still promising that "there is a good chance that talks will resume in the near future," despite statements to the contrary from Iran's Supreme Leader, Ayatollah Khamenei. . . .
It might seem shocking to both casual and dedicated Iran-watchers that the bombastic Ahmadinejad could, behind Tehran's closed doors, be playing the reformer. After all, this was the man who, in 2005, generated wide outrage in the West for suggesting that Israel should be "wiped from the map." But even that case said as much about our limited understanding of him and his context as it did about Ahmadinejad himself. The expression "wipe from the map" means "destroy" in English but not in Farsi. In Farsi, it means not that Israel should be eliminated but that the existing political borders should literally be wiped from a literal map and replaced with those of historic Palestine. That's still not something likely to win him cheers in U.S. policy circles, but the distinction, which has been largely lost from the West's understanding of the Iranian president, is important.
Iran's position has been weakened considerably, as this report from Frontline makes clear:

Sanctions alone have often proved ineffective in toppling a government. After all, Fidel Castro has endured 50 years of unilateral U.S. sanctions. However, in concert with other efforts, and given the U.N. and E.U. endorsements of sanctions against Iran, the blockades are having a discernible impact. As can be expected, ordinary Iranians are the primary casualties. Denied food and merchandise imports, raw materials needed for production and manufacturing, they suffer skyrocketing inflation and unemployment that has left the economy in shambles. The infrastructure, too, is deteriorating. Resources are seriously depleted, and the price of gasoline has risen as countries obeying sanctions no longer export it to Iran. Even Iran-friendly Turkey has increased its price. Planes built nearly 40 years ago cannot be adequately serviced without access to replacement parts, leaving Iran with a reputation as one of the most dangerous places to fly. Just last week a plane crashed in northern Iran, killing at least 78 and spurring protests against the sanctions, but also underscoring the impact of the government's resistance to compromise. Hospitals and medical treatment suffer from lack of equipment and medicine, preventing Iranians from receiving proper everyday treatment and even life-saving medical care.

While a profound lack of humanity exists in imposing sanctions at the sacrifice of the lives of innocent people, they are nevertheless weakening the foundation of the current regime. The government has tried unsuccessfully to hamper sanctions by subsidizing; unfortunately, purchasing materials through secondary or tertiary sources has proved very expensive, as they are often paying three-fold for items once acquired from the West. Even Russia and China have begun to cut back on their oil purchases.

Iran is also losing its nuclear battle. Despite rhetoric from Ahmadinejad proclaiming the peaceful aim of the country's nuclear program, inherent risks come with enrichment of uranium, and there is no guarantee of Iran's intentions. And as reported by the New York Times, the program appears to have been undermined by a complex computer virus -- Stuxnet, introduced by Israel in cooperation with the United States, has set Iran's nuclear ambitions back years, although it has failed to disable the program completely. And there may be more to come, as the virus may contain embedded codes set to strike at a later date. Iran has thus been outmaneuvered in its nuclear game. And the government may not be capable of fulfilling the pronouncement General Mohammad Ali Jafari, commander of the Islamic Revolutionary Guard Corps, made in a Frontline interview: "You will not find a single instance in which a country has inflicted harm on us and we have left it without a response. So if the United States makes such a mistake, they should know that we will definitely respond. And we don't make idle threats."

The arguably ingenious approach taken by Israel outsmarted those who predicted its strategy would come in the form of military strikes on Iran's nuclear facilities. Stuxnet had the same impact without dropping a single bomb. So how is Iran left to respond to a disabling virus? Then there are the assassination of a key nuclear scientist and an attempt on the life of another. Iran is charging Israel with responsibility for both. But aside from an inconsequential lawsuit, does Iran have the capacity to respond either in kind or otherwise?

Perhaps retaliation will come through other branches, such as Lebanon's Hezbollah, which recently withdrew from the country's cabinet, resulting in the collapse of Prime Minister Saad al-Hariri's coalition government. But can an assault by Israel's enemy based in southern Lebanon effectively challenge the force that, in 2006, decimated their efforts even as they lobbed missiles over the border? While Hassan Nasrallah may have visions of grandiosity, if he is Iran's arm against Israel, the response will be weak and no doubt short-lived. And it won't reverse the damage done to Iran's crippled nuclear program.

Internal threats further weaken Iran's position. While the strong-arm tactics of the government effectively stifled the Green Movement, there is no rescinding the revolt that took place following the 2009 presidential election. It's doubtful that minds have changed regarding the state of the government under the current Supreme Leader, and the potential for a resurgence of the insurrection always exists. Even with a rash of imprisonments and accompanying harsh sentences, it is likely impossible to squelch the reform movement entirely. And no doubt others will take the place of the exiled and executed. Demonstrations that proved the largest rebellion since the Revolution still reverberate not only within Iran, but in the shattered perception of Iran's government as monolithic, as the drama played out on the international stage.

Wind Power Fails When It's Most Needed

The Daily Mail reports on a weak wind blowing in these parts:
Britain’s wind farms almost ground to a halt during the coldest spells in December, it has emerged.
As temperatures plunged below zero and demand for electricity soared, figures reveal that most of the country’s 3,000 wind turbines were virtually still, energy experts say. During some of the chilliest weather, they were working at less than one-hundredth of capacity, producing electricity for fewer than 30,000 homes.

The National Grid was forced to compensate for the still, cold conditions by cranking up conventional coal and gas-fired power stations. December was the coldest month in more than a century – and yesterday, as some in northern England, the Midlands and Wales were hit with more snow, residents will have been switching on the heating again. But critics have warned that the UK is becoming too dependent on wind for power.

There are 3,153 working turbines in 283 wind farms across the UK, capable of generating more than 5.2 gigawatts of electricity – enough to power almost three million homes, the wind industry says. Over the next decade, another 10,000 turbines will go up to meet Europe’s climate change targets. By 2020, the Government says 30 per cent of all Britain’s electricity will be generated by wind.

But at best, turbines work at just 30 to 40 per cent of their capacity. And in cold winter snaps, often caused by vast, slow-moving high-pressure systems over Northern Europe, winds drop to almost nothing. Helen Chivers, of the Met Office, said cold spells were often accompanied by low winds. ‘It is fairly common in winter.'
During December’s cold snaps, the windfarms’ output repeatedly fell sharply, National Grid data shows. On the coldest day, December 20, the average temperature was minus 5.6C. But just as demand for electricity to heat homes was rising, the winds failed. That evening the recorded output from the UK’s wind farms dipped to 59 megawatts. . . .

John Constable, of the Renewable Energy Foundation, which argues against wind farm expansion, said: ‘When you get a high pressure system at this time of year it can cover most of the UK. The whole of the UK is becalmed just when it gets really cold and when demand for electricity goes up. Regardless of how much wind you have installed you need to have the same amount of conventional stations ready to switch on if the wind fails.’ The wind industry insisted wind was reliable – and that still spells are rare. Nick Medic, of Renewables UK, said if the wind does drop, we can import energy from overseas, or use energy stored in dams.
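The capacity arithmetic behind the quoted figures is straightforward. A minimal check using the 5.2 GW installed figure and the 59 MW reading from December 20:

```python
# Capacity-factor arithmetic from the figures quoted above.
installed_mw = 5200  # ~5.2 GW of installed UK wind capacity
output_mw = 59       # recorded output on the evening of December 20

cf = output_mw / installed_mw
print(f"instantaneous capacity factor: {cf:.1%}")  # 1.1%

# A typical 30% annual capacity factor would imply this average output:
print(f"average output at 30% CF: {installed_mw * 0.30:.0f} MW")  # 1560 MW
```

So the 59 MW reading puts the fleet at roughly one-hundredth of its nameplate capacity, against an average of about 1,500-2,000 MW at the 30-40 per cent capacity factors the article cites, which is the gap Constable's argument about conventional backup turns on.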

Captured Carbon "Fizzing like Soda Pop"

That's great. From the Winnipeg Free Press, "Land Fizzing like Soda Pop: Farmer Says CO2 Injected Underground is Leaking":
A Saskatchewan farm couple whose land lies over the world's largest carbon capture and storage project says greenhouse gases seeping from the soil are killing animals and sending groundwater foaming to the surface like shaken soda pop. 
The gases were supposed to have been injected permanently underground.

Cameron and Jane Kerr own nine quarter-sections of land above the Weyburn oilfield in eastern Saskatchewan. They released a consultant's report Tuesday that links high concentrations of carbon dioxide in their soil to 6,000 tonnes of the gas injected underground every day by energy giant Cenovus (TSX:CVE) in an attempt to enhance oil recovery and fight climate change.

"We knew, obviously, there was something wrong," said Jane Kerr.

A Cenovus spokeswoman said the company doubts those findings. She pointed out they contradict years of research from other scientists.

"It's not what we believe," said Rhona Delfrari.

Since 2000, Cenovus has injected about 16 million tonnes of carbon dioxide underground to force more oil from an aging field and safely store greenhouse gases that would otherwise contribute to climate change.

But in 2005, the Kerrs began noticing algae blooms, clots of foam and multicoloured scum in two ponds at the bottom of a gravel quarry on their land. Sometimes, the ponds bubbled. Small animals — cats, rabbits and goats — were regularly found dead a few metres away.

Then there were the explosions.

"At night we could hear this sort of bang like a cannon going off," said Jane Kerr, 58. "We'd go out and check the gravel pit and, in the walls, it (had) blown a hole in the side and there would be all this foaming coming out of this hole."

"Just like you shook up a bottle of Coke and had your finger over it and let it spray," added her husband.

The water, said Jane Kerr, came out of the ground carbonated.

"It would fizz and foam."

Alarmed, the couple left their farm and moved to Regina.

"It was getting too dangerous to live there," Cameron Kerr said.

He said provincial inspectors did a one-time check of air quality. Eventually, the Kerrs paid a consultant for a study.

Paul Lafleur of Petro-Find Geochem found carbon dioxide concentrations in the soil last summer that averaged about 23,000 parts per million — several times those typically found in field soils. Concentrations peaked at 110,607 parts per million.

Lafleur also used the mix of carbon isotopes he found in the gas to trace its source.

"The ... source of the high concentrations of CO2 in the soils of the Kerr property is clearly the anthropogenic CO2 injected into the Weyburn reservoir," he wrote.

"The survey also demonstrates that the overlying thick cap rock of anhydrite over the Weyburn reservoir is not an impermeable barrier to the upward movement of light hydrocarbons and CO2 as is generally thought."
The story has been disputed and will no doubt be subject to further controversy and tests. Here's an update from the Green blog of the New York Times. It notes the bleak conclusion drawn by The Globe and Mail newspaper--“What started as a series of worrisome problems on a rural Saskatchewan property has now raised serious questions about the safety of carbon sequestration and storage, a technology that has drawn billions in spending from governments and industry, which have promoted it as a salve to Canada’s growth in greenhouse-gas emissions”--but adds:
Not so fast, some top geologists say.

Sally Benson, a geologist at Stanford University, described the report, by Paul Lafleur, president of Petro-Find Geochem, a Saskatoon-based geological consulting firm, as far from comprehensive and said that other causes unconnected to the Weyburn project could be the source of the Kerrs’ problems.

“It’s a very short report and it’s a very complex issue,” Dr. Benson said.

Susan D. Hovorka, a geologist at the University of Texas at Austin, went further, saying that Mr. Lafleur’s declarations that a firm link had been found between carbon dioxide on the Kerrs’ property and the storage project were “misleading.”

“He may be certain, but he’s wrong about his certainty,” Dr. Hovorka said. “His confidence is not justified by the data.”

The Petroleum Research Technology Center, a Saskatchewan-based research group overseeing the Weyburn project, has also issued a report strongly rebutting the consultant’s report.

Scientists also questioned the conclusion that the type of leakage allegedly occurring on the Kerrs’ property — even if it were confirmed — undermined the case for carbon capture and storage as a climate-change solution.

“That’s ridiculous — people have been saying from the beginning that there are things that could go wrong of this nature,” Dr. Benson said. “This is nothing radically out of the range of expectations.”

“Does this mean that carbon dioxide storage projects will leak back enough carbon into the atmosphere that they would cease to be effective? Absolutely not,” she added.

An independent investigation of the Kerr property has already been proposed by IPAC-CO2, a carbon-storage research institute, and both Cenovus and a lawyer for the Kerr family have signaled their approval.

“It certainly warrants looking into,” Dr. Benson said. “If there’s damage occurring, a remedy should be made.”
So: it's not happening, but if it's happening, it's no big deal; and if it is a big deal, there should be a remedy.

January 17, 2011

Solar: Green vs. Green

From Reuters, "With Solar Power, It's Green vs. Green," January 5, 2011
The push to create an alternative to carbon-based fuel has hit an unlikely snag: environmentalists. . . . To a growing number of environmental advocates, the dozens of large solar plants that are springing up in vast areas of the western wilderness are more scourge than savior.

The upshot is that those who on paper seem to be perfect allies for solar are turning into its biggest enemies.

That includes the Sierra Club, which last week filed what senior attorney Gloria Smith says is its first suit against a solar plant, a giant 664-megawatt project called Calico that is slated to go up in the desert near Barstow, California. It would lie smack in the middle of habitat for rare plants and animals, in an area Smith calls "a very unfortunate site." . . .

For the solar industry overall, the situation marks a fundamental shift in attitude. Where previously almost any bare patch of desert seemed like a prospective solar plant, now the reality is that much of the nation's most fertile ground for alternative power and energy independence may well remain undeveloped. 
And the backlash is likely to reduce the number of big plants developers try to push through. Some 142 U.S. solar plants are under development, according to the Solar Energy Industry Association, up from just 28 two years ago. Many of these are well over 500 megawatts; a handful are over 1,000 megawatts, meaning they would cover hundreds of acres of land and power at least 300,000 homes each. . . .

California lies at the center of the U.S. solar industry, thanks to a confluence of sunlit land and a legal requirement for 33 percent of its electricity to come from renewable sources by 2020. More than 40 solar utility plants are in development, according to the state's public utilities commission. Almost all of them have or will run into problems with environmentalists or people who simply don't want the plants in their backyard . . .

Among the plants [Christine Hersey, a solar analyst at Wedbush Securities] considers at high risk is First Solar's 300-megawatt Stateline project, which has high numbers of threatened desert tortoises.
Several other projects are already mired in litigation or under threat of it.

The Quechan Tribe, a Native American group centered around the border between Arizona and California, has sued the Bureau of Land Management over a 709-megawatt plant planned for its ancestral land in the Imperial Valley, citing animals such as the flat-tailed horned lizard. The tribe charges the BLM approval of the project didn't follow appropriate procedures. Last month, it secured an injunction blocking the plant, under development by NTR plc's Tessera Solar.

Just last week, La Cuna de Aztlan, a Native American advocacy group, and its co-plaintiffs filed a lawsuit over federal approval of six solar plants, citing the cultural environment, among other issues.

Among the six is the 370-megawatt Ivanpah plant in the Mojave Desert, for which BrightSource Energy broke ground in October. BrightSource already made some concessions after the Center for Biological Diversity, known for litigation on development it believes threatens the environment, raised concerns. The Tucson, Arizona-based group is keeping a close eye on other proposed solar projects, according to biologist Ileene Anderson.

In its suit filed last week in the Supreme Court of California, the Sierra Club sued the California Energy Commission over its approval of the Calico Solar Project. Among the Sierra Club's worries: the plant is going in an area rich with desert tortoises, which are threatened under federal law and endangered under California law, and other species. CEC officials "look forward to defending our position in court," said spokeswoman Sandy Louey. The developer, Tessera Solar, sold the project to New York-based K Road Power late last month.

Groups ranging from the Audubon Society to the Defenders of Wildlife to the Natural Resources Defense Council are also lobbing out objections against other projects.

About half of all plants in development now are having issues concerning plant and animal habitat, culture sites, or water demand, Hersey estimates. Many of those could end up in court. And just the threat of litigation seems likely to affect the scale of solar, analysts say.

Aging Nuclear Plants



Source: Energy Information Administration

January 16, 2011

Ethanol Madness

Selections from Jim Quinn's diatribe against the US government's ethanol policies, at The Burning Platform:
Corn is the most widely produced feed grain in the United States, accounting for more than 90% of total U.S. feed grain production. 81.4 million acres of land are utilized to grow corn, with the majority of the crop grown in the Midwest.  Although most of the crop is used to feed livestock, corn is also processed into food and industrial products including starch, sweeteners, corn oil, beverage and industrial alcohol, yogurt, latex paint, cosmetics, and last but not least, fuel ethanol. Of the 10,000 items in your average grocery store, at least 2,500 items use corn in some form during production or processing. The United States is the major player in the world corn market, providing more than 50% of the world’s corn supply. In excess of 20% of our corn crop had been exported to other countries, but the government ethanol mandates have reduced the amount that is available to export.

This year, the US will harvest approximately 12.5 billion bushels of corn. More than 42% will be used to feed livestock in the US, another 40% will be used to produce government mandated ethanol fuel, 2% will be used for food products, and 16% is exported to other countries. Ending stocks are down 963 million bushels from last year. The stocks-to-use ratio is projected at 5.5%, the lowest since 1995/96 when it dropped to 5.0%. . .

The United States is the big daddy of the world food economy. It is far and away the world’s leading grain exporter, exporting more than Argentina, Australia, Canada, and Russia combined. In a globalized food economy, increased demand for corn, to fuel American vehicles, puts tremendous pressure on world food supplies. Continuing to divert more food to fuel, as is now mandated by the U.S. federal government in its Renewable Fuel Standard, will lead to higher food prices, rising hunger among the world’s poor, and social chaos across the globe. By subsidizing the production of ethanol, now to the tune of $6 billion each year, U.S. taxpayers are subsidizing skyrocketing food bills at home and around the world.

The energy bill signed by that free market capitalist George Bush in 2008 mandates that increasing amounts of corn based ethanol must be used in gasoline sold in the U.S. This energy legislation requires a five-fold increase in ethanol use by 2022. Some 15 billion gallons must come from traditional corn-blended ethanol. . . .
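The $6-billion-a-year subsidy figure can be roughly cross-checked against the federal blender's tax credit (VEETC), which stood at 45 cents per gallon in 2010; the 13-billion-gallon production estimate below is an approximation I am supplying, not a figure from Quinn's piece:

```python
# Cross-checking the "$6 billion each year" ethanol subsidy figure.
# Assumptions: the 2010 VEETC blender's credit of $0.45/gallon and
# U.S. ethanol production of roughly 13 billion gallons/year.
CREDIT_PER_GALLON = 0.45   # dollars per gallon, 2010 VEETC rate
PRODUCTION_BGAL = 13.0     # billion gallons per year, approximate

subsidy_b = CREDIT_PER_GALLON * PRODUCTION_BGAL   # ~$5.9 billion/year
```

The product lands just under $6 billion, consistent with the figure quoted above.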

The Sorry State of the Renewables

Back in February 2008, Eric Janszen wrote an article in Harper's on "The Next Bubble," in which he gamely predicted that alternative energy fit the bill better than any other sector of the economy. Janszen's piece is an excellent dissection of the technology and real estate bubbles, focusing, in the case of the latter, on the growth of the FIRE (Financial, Insurance, Real Estate) sector of the economy. As yet, however, his prediction of a bubble in alternative energy has been far off the mark.

Here is what he wrote back then:
There are a number of plausible candidates for the next bubble, but only a few meet all the criteria. Health care must expand to meet the needs of the aging baby boomers, but there is as yet no enabling government legislation to make way for a health-care bubble; the same holds true of the pharmaceutical industry, which could hyperinflate only if the Food and Drug Administration was gutted of its power. A second technology boom—under the rubric “Web 2.0”—is based on improvements to existing technology rather than any new discovery. The capital-intensive biotechnology industry will not inflate, as it requires too much specialized intelligence.

There is one industry that fits the bill: alternative energy, the development of more energy-efficient products, along with viable alternatives to oil, including wind, solar, and geothermal power, along with the use of nuclear energy to produce sustainable oil substitutes, such as liquefied hydrogen from water.
Alas, it has not worked out that way. On the contrary, alternative energy, especially the wind and solar sectors, has busted pretty badly.

Herewith a visual tour of the sad state of wind and solar.

This chart shows the price of TAN (the solar etf or exchange-traded fund) and FAN (the wind etf) over the last three years:


Also revealing are ratio charts of TAN and FAN in relation to other energy indexes. As these charts show, the capital markets have been punishing alternative energy and rewarding coal and oil. (A lot of natural gas companies have also been slaughtered in the last few years, but that's another story.)


Here's a longer term view showing the ratio chart of the Wilderhill Clean Energy Index with the Dow Jones Coal Index:


Just to add insult to injury, here's the Clean Energy Index in relation to British Petroleum:


The past is not necessarily prologue; perhaps "clean energy," having been down so long, now presents a terrific buying opportunity. But in the last few years, investors in green technologies have pretty much gotten crushed.

Oil Prices and Inventories

With oil again over $90 a barrel, the question arises whether we are likely to see another spike in prices comparable to what occurred in the summer of 2008, when oil reached $145 a barrel.


The following three charts, from the Energy Information Administration's Short Term Energy Outlook in January 2011, suggest the answer is "no."

OECD commercial oil stocks are considerably higher than in the lean years preceding the 2008 spike in prices:



The same is true for US oil inventories:


Since the marginal demand for oil is now coming from emerging markets like China and India, neither of the above charts is necessarily determinative. But also bearing on the question is the state of surplus capacity among OPEC producers. According to the EIA, it's much higher than in the 2004-2008 period.

Solar Going to China

From the New York Times, "Solar Panel Maker Moves Work to China":

Aided by at least $43 million in assistance from the government of Massachusetts and an innovative solar energy technology, Evergreen Solar emerged in the last three years as the third-largest maker of solar panels in the United States.

But now the company is closing its main American factory, laying off the 800 workers by the end of March and shifting production to a joint venture with a Chinese company in central China. Evergreen cited the much higher government support available in China. . . .

Although solar energy still accounts for only a tiny fraction of American power production, declining prices and concerns about global warming give solar power a prominent place in United States plans for a clean energy future — even if critics say the federal government is still not doing enough to foster its adoption.

Beyond the issues of trade and jobs, solar power experts see broader implications. They say that after many years of relying on unstable governments in the Middle East for oil, the United States now looks likely to rely on China to tap energy from the sun.

Evergreen, in announcing its move to China, was unusually candid about its motives. Michael El-Hillow, the chief executive, said in a statement that his company had decided to close the Massachusetts factory in response to plunging prices for solar panels. World prices have fallen as much as two-thirds in the last three years — including a drop of 10 percent during last year’s fourth quarter alone.

Chinese manufacturers, Mr. El-Hillow said in the statement, have been able to push prices down sharply because they receive considerable help from the Chinese government and state-owned banks, and because manufacturing costs are generally lower in China.

“While the United States and other Western industrial economies are beneficiaries of rapidly declining installation costs of solar energy, we expect the United States will continue to be at a disadvantage from a manufacturing standpoint,” he said. . . .

Other solar panel manufacturers are also struggling in the United States. Solyndra, a Silicon Valley business, received a visit from President Obama in May and a $535 million federal loan guarantee, only to say in November that it was shutting one of its two American plants and would delay expansion of the other.

First Solar, an American company, is one of the world’s largest solar power vendors. But most of its products are made overseas.

Chinese solar panel manufacturers accounted for slightly over half the world’s production last year. Their share of the American market has grown nearly sixfold in the last two years, to 23 percent in 2010 and is still rising fast, according to GTM Research, a renewable energy market analysis firm in Cambridge, Mass.

In addition to solar energy, China just passed the United States as the world’s largest builder and installer of wind turbines. . . .

Evergreen was selling solar panels made in Devens for $3.39 a watt at the end of 2008 and planned to cut its costs to $2 a watt by the end of last year — a target it met. But Evergreen found that by the end of the fourth quarter, it could fetch only $1.90 a watt for its Devens-made solar panels. Chinese manufacturers were selling them for as little as $1.60 a watt after reducing their costs to as little as $1.35 or less per watt. . . .
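The per-watt figures in the paragraph above tell the whole competitive story in two subtractions. A minimal sketch of the implied margins, using the article's own numbers:

```python
# Gross margins implied by the per-watt prices quoted in the article.
evergreen_cost, evergreen_price = 2.00, 1.90   # $/watt, Devens plant
chinese_cost, chinese_price = 1.35, 1.60       # $/watt, low end cited

evergreen_margin = evergreen_price - evergreen_cost   # -$0.10/W: a loss
chinese_margin = chinese_price - chinese_cost         # +$0.25/W
```

Even after hitting its $2-per-watt cost target, Evergreen was losing money on every panel made in Devens, while Chinese producers undercut its price and still cleared a positive margin.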

Factory labor is cheap in China, where monthly wages average less than $300. That compares to a statewide average of more than $5,400 a month for Massachusetts factory workers. But labor is a tiny share of the cost of running a high-tech solar panel factory, Mr. El-Hillow said. China’s real advantage lies in the ability of solar panel companies to form partnerships with local governments and then obtain loans at very low interest rates from state-owned banks.

Evergreen, with help from its partners — the Wuhan municipal government and the Hubei provincial government — borrowed two-thirds of the cost of its Wuhan factory from two Chinese banks, at an interest rate that under certain conditions could go as low as 4.8 percent, Mr. El-Hillow said in August. Best of all, no principal payments or interest payments will be due until the end of the loan in 2015.

By contrast, a $21 million grant from Massachusetts covered 5 percent of the cost of the Devens factory, and the company had to borrow the rest from banks, Mr. El-Hillow said.

Banks in the United States were reluctant to provide the rest of the money even at double-digit interest rates, partly because of the financial crisis. “Therein lies the hidden advantage of being in China,” Mr. El-Hillow said.
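The financing contrast above can be made concrete: if the $21 million Massachusetts grant covered 5 percent of the Devens factory's cost, the total cost and the bank-financed remainder follow directly. This is a back-of-the-envelope inference from the article's figures, not numbers it states outright:

```python
# Implied cost of the Devens factory from the grant figures above.
GRANT_M = 21          # $ million grant from Massachusetts
GRANT_SHARE = 0.05    # the grant covered 5% of the factory's cost

devens_cost_m = GRANT_M / GRANT_SHARE     # ~$420 million total
borrowed_m = devens_cost_m - GRANT_M      # ~$399 million from banks
```

Roughly $400 million borrowed at double-digit US rates, versus two-thirds of the Wuhan plant financed at as little as 4.8 percent with payments deferred to 2015, is the "hidden advantage" Mr. El-Hillow describes.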

January 15, 2011

Oil Exploration: The Trend Toward Ever Deeper Waters

A fascinating piece in the New York Times magazine by Benjamin Wallace-Wells describes the innately hazardous and hugely expensive game of oil exploration:
Oil reserves have been declining for a decade, and it is an article of faith among petroleum geologists that the easy oil — easier to find, less complicated to drill — has all been extracted and that the explorers are now into the hard oil. When the Deepwater Horizon rig, drilling an exploratory well deep into rock through a mile of water and three miles into the ocean floor off the Louisiana coast, struck a highly pressurized pocket of oil and gas, causing an explosion, it was in some ways a consequence of this iterative, competitive game, each generation of discoveries pushing further into the unknown.

A few years ago, the industry norm was to drill at depths of 15,000 or 20,000 feet. Now the frontier is 35,000 feet, where engineers find higher temperatures and pressures. “The scarcity of new reserves has been driving companies into plays that have previously been seen as extremely high risk and high cost,” said Brian Maxted, the chief executive officer of Kosmos Energy, a deepwater-exploration company in Dallas. “The trend recently has been in going toward ever-deeper waters and ever-more challenging environments.” . . .
The possibility of a boom commands particular attention now, because the industry’s faith in a limitless future has begun to diminish. The International Energy Agency — which had until recently been optimistic about oil — concluded last fall that the world has very likely already passed its peak oil production.

“The deepwater was one of the last big exploration plays on the planet,” says Gerald Kepes, a partner and head of upstream and gas at PFC Energy, a consulting firm. “We’re now looking at the second half of the global deepwater play. You can see the end of it, maybe 25 years from now.”

This is not the only way of looking at the data; other analysts, recalling the technological advances and the unforeseen finds that have marked exploration’s history, are more positive. But that optimism also depends, in some part, on whether the mass of subsalt off Angola’s coast approaches the size of that off Brazil’s. And the first clues to answer that question will be revealed, in part, by what Cobalt finds at Gold Dust. . . .

Last February, Richard Sears, a geophysicist who was vice president for exploration and deepwater technical evaluation at Shell and is now senior science adviser to the National Oil Spill Commission, appeared at the TED ideas conference in Long Beach, Calif., to give a talk about the future of energy. Sears says that there are between 30 and 50 years left before a broad gap opens between worldwide oil supply and demand. It is hard, he says, to describe a situation that is either a lot more optimistic or pessimistic than that. At TED, Sears held up a pincushion of the globe, with red thumbtacks stuck in. Each thumbtack represented an oil basin. “This is it,” he said. “This is the oil in the world. Geologists have a pretty good idea of where it is.”

These last 650 billion barrels are the hardest. “There are still some areas — Iran or Iraq or Russia — where you can literally fly over in a plane and see big structures lying right out there, and they are undrilled,” Farnsworth told me. But much of that territory has been reserved for national oil companies, and so in the last decade 43 percent of the industry’s new reserves have come from the deep water. “It’s gotten harder all along. And the structures, generally, have gotten smaller.”

But as the map has compressed, and the possibility of finding new basins has dwindled, explorers have returned their attention to the regions where vast deposits of oil have already been found, in the belief that new technologies might allow them to drill deeper. These aspirations drove new finds earlier this decade in offshore Brazil, and the continued work in the Gulf of Mexico. They also compelled the industry toward Angola, where Western oil companies have sustained production onshore and in shallow water for decades, even through a long civil war. . . .

One geologist told me he would show me the tools he used to find oil when all else failed; he then produced a divining rod and a crystal ball. It was a joke, but one made frequently enough that he had a divining rod and a crystal ball lying around. The essential mystery of exploration — that it is impossible to know precisely what exists below until you begin to drill — and the push into more complicated depths and environments mean that an engineer often has only a dim read on what he is drilling into: unexpected pressures, slopes and formations. The enduring oddity of the Deepwater Horizon spill was that it took place in a prospect where major problems seemed unlikely — comparatively shallow, very well mapped, mature.

“It’s frustrating to me,” Farnsworth told me. “It’s never going to change, but the general public always thinks, I should be able to get a gallon of gasoline, and it should be damn cheap, and whether I choose to drive a 10-mile-per-gallon car or a 40-mile-per-gallon car should have no impact on that price. We know how hard it is to explore for oil, and we know how hard it is to get it out of the deep water. And there’s been this incredible disconnect, which might have been lessened by the spill, between what people think it takes to get gasoline in their car and what we do.” . . .

Drilling engineers suffer from the tyrannies of darkness and depth, and these conditions limit what they can do when these systems do fail. “The real problem with going so deep is not that the possibility of failure is greater,” Bea said. “But if something does fail, the consequences are so much greater. In shallow water, you can physically contain a spill. But at this kind of depth that becomes dramatically more difficult.”

Oil Price Shocks and Economic Recessions

Jim Hamilton of Econbrowser has a new paper on oil price shocks and post-World War II recessions, in which he argues that "Every recession (with one exception) was preceded by an increase in oil prices, and every oil market disruption (with one exception) was followed by an economic recession."
The correlation between oil shocks and economic recessions appears to be too strong to be just a coincidence. And although demand pressure associated with the later stages of a business cycle expansion seems to have been a contributing factor in a number of these episodes, statistically one cannot predict the oil price changes prior to 1973 on the basis of prior developments in the U.S. economy. Moreover, supply disruptions arising from dramatic geopolitical events are prominent causes of a number of the most important episodes. Insofar as events such as the Suez Crisis and first Persian Gulf War were not caused by U.S. business cycle dynamics, a correlation between these events and subsequent economic downturns should be viewed as causal. This is not to claim that the oil price increases themselves were the sole cause of most postwar recessions. Instead the indicated conclusion is that oil shocks were a contributing factor in at least some postwar recessions.
Hat tip: Calculated Risk

Ice Shelves Disappearing in Antarctica




From the U.S. Geological Survey, Department of the Interior, "Ice Shelves Are Disappearing in Antarctic Peninsula," February 22, 2010:
Ice shelves are retreating in the southern section of the Antarctic Peninsula due to climate change. This could result in glacier retreat and sea-level rise if warming continues, threatening coastal communities and low-lying islands worldwide.


Research by the U.S. Geological Survey is the first to document that every ice front in the southern part of the Antarctic Peninsula has been retreating overall from 1947 to 2009, with the most dramatic changes occurring since 1990. The USGS previously documented that the majority of ice fronts on the entire Peninsula have also retreated during the late 20th century and into the early 21st century.

The ice shelves are attached to the continent and already floating, holding in place the Antarctic ice sheet that covers about 98 percent of the Antarctic continent. As the ice shelves break off, it is easier for outlet glaciers and ice streams from the ice sheet to flow into the sea. The transition of that ice from land to the ocean is what raises sea level.

“This research is part of a larger ongoing USGS project that is for the first time studying the entire Antarctic coastline in detail, and this is important because the Antarctic ice sheet contains 91 percent of Earth’s glacier ice,” said USGS scientist Jane Ferrigno. “The loss of ice shelves is evidence of the effects of global warming. We need to be alert and continually understand and observe how our climate system is changing.”

The Peninsula is one of Antarctica’s most rapidly changing areas because it is farthest away from the South Pole, and its ice shelf loss may be a forecast of changes in other parts of Antarctica and the world if warming continues.

Retreat along the southern part of the Peninsula is of particular interest because that area has the Peninsula’s coolest temperatures, demonstrating that global warming is affecting the entire length of the Peninsula.

The Antarctic Peninsula’s southern section as described in this study contains five major ice shelves: Wilkins, George VI, Bach, Stange and the southern portion of Larsen Ice Shelf. The ice lost since 1998 from the Wilkins Ice Shelf alone totals more than 4,000 square kilometers, an area larger than the state of Rhode Island.


See here for other reports and maps in the USGS's Coastal Change and Glaciological Maps of Antarctica series.

See also:

"Large Antarctic Glacier Thinning..." at Climate Progress

Earth Observatory: Pine Island Glacier

January 14, 2011

Obfuscatory Data from the IEA and EIA

Gregor McDonald, in a post at The Oil Drum, complains of "channel stuffing" by the two main public energy data sources--the Paris-based International Energy Agency and the Washington-based Energy Information Administration.
In the case of Canada, the high-cost tar sands production has now been aggregated into that country’s measures of “crude oil.” While not as egregious as including ethanol into publicly released data measures of oil, the alchemy and energy inputs required to turn oily dirt into usable petroleum can hardly be deemed as conventional crude oil production. To this point, one of the core methods EIA Washington and IEA Paris have increasingly relied on in recent years–to obscure the very serious and now very real problem of oil depletion–is to include biofuels and natural gas liquids in the accounting of global oil production. The technique that both agencies use to conduct this obfuscation is a familiar one, in which the key information is aggregated (buried) into a much larger barrage of data and presentations. For a scholarly look at the methods governments use to work around their obligations to inform the public, do watch the one hour lecture that Jay Rosen gave to the World Bank earlier this year. Rosen’s deconstructions of the media have been very helpful to me, over the past two years. See his blog here: PressThink.org. Rosen describes the use of opacity as a kind of hiding in plain sight, or secrecy by complexity.

In order to rebut this Secrecy by Complexity, it’s the obligation of responsible energy analysts to explain the falsehood of adding biofuels and natural gas liquids (NGLs) to measures of oil production. The reason is simple: natural gas liquids are not oil. They are not oil in any sense, and, most important of all, NGLs contain only 65% of the BTU of oil. Worse, biofuels are barely an energy source themselves and are the product of a conversion process of other energy inputs. Accordingly, the world is not producing 84, or 85, or 86 million barrels of oil per day. The world is instead producing 73.436 million barrels of crude oil per day. The depletion of oil will not be solved either by the production of biofuels and NGLs or by their inclusion in oil data, as the world economy moves into the future.
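McDonald's barrel-counting point can be made concrete with a little arithmetic. A minimal sketch: only the ~73.4 million barrel per day crude figure and the 65% BTU discount for NGLs come from the post above; the NGL and biofuel volumes below are hypothetical round numbers used purely for illustration.

```python
# Illustrative sketch: restating a headline "total liquids" number on a
# BTU-weighted, crude-equivalent basis. The 73.4 mb/d crude figure and the
# 65% BTU discount for NGLs come from the post; the NGL and biofuel
# volumes are hypothetical round numbers, not reported data.
crude_mbd = 73.4       # crude oil, million barrels/day (from the post)
ngl_mbd = 9.0          # natural gas liquids (hypothetical)
biofuel_mbd = 2.0      # ethanol and other biofuels (hypothetical)

headline_liquids = crude_mbd + ngl_mbd + biofuel_mbd

NGL_BTU_FRACTION = 0.65   # NGLs carry only ~65% of the BTU of crude

# Biofuels are excluded entirely here, since the post treats them as a
# conversion of other energy inputs rather than a primary energy source.
crude_equivalent = crude_mbd + ngl_mbd * NGL_BTU_FRACTION

print(f"Headline 'liquids':      {headline_liquids:.1f} mb/d")
print(f"Crude-equivalent (BTU):  {crude_equivalent:.1f} mb/d")
```

The gap between the two printed figures is the inflation the post objects to: several million nominal barrels that carry less energy than the headline suggests, or none at all.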

When the EIA in Washington falsely composes such forecasts, aggregating future natural gas liquids and ethanol into a supply picture for “oil” as it does each year in its various projections, it disables the public’s ability to accurately understand the true outlook for global oil supply. While it’s still the case that EIA Washington produces data each month for Crude Oil production only, the predominant reporting and forecasting is now weighted towards “liquids”, the unhelpful aggregation of oil with NGLs and biofuels. But most unhelpful of all is that, in forecasts, the EIA has essentially dropped projections for global crude oil supply. For example, in the International Energy Outlook (IEO) 2010 public press conference this May the EIA released to the press the following slide-deck: International Energy Outlook 2010–With Projections to 2035. No accounting or forecast of oil is contained in the document. See Slide 6: OPEC producers maintain an approximate 40% share of total liquids production in the Reference case.


Little information of any use is actually provided by the projection shown above for one simple reason: the chart does not tell us about the actual energy that will be available to society. More egregious is that even in the main body of the IEO 2010 report, more false aggregation occurs with yet another term of complexity: Conventional Liquids. Indeed, it’s not surprising that OECD governments use opacity and secrecy by complexity to handle this extremely important issue. OECD economies are now structurally short energy supply, having lost access to the cheap BTU in oil that built out their societies over the past 100 years. The loss of cheap energy, the loss of the cheap BTU that oil has provided to OECD nations for the past century, is a crucial factor in the dilemma the West now faces: a newly chronic economic restraint that refuses to go away.
McDonald's post provoked some interesting comments. In response to the idea that if it can run motors and be made into plastics, it ought to be counted as a "liquid" equivalent to oil, RockyMtgGuy noted the following:
There is an argument for including condensates from gas wells and pentanes plus (C5+) from gas plants in the "liquid fuels" category, since these are liquids under atmospheric conditions and can be blended into the crude oil stream going into the refinery. In fact, many refineries would want to do this because it improves their yield of gasoline, which will be low if they are processing a lot of heavy oil. This is particularly helpful in the US where the demand for gasoline and asphalt is high, but less so in Europe which wants a higher cut of diesel than a condensate+heavy oil mix would provide.


But the lighter NGLs - ethane, propane, and butane - are gases under normal atmospheric conditions. They have different markets, such as petrochemicals, or heating/cooking fuels where natural gas is not available. You can't just indiscriminately blend them into gasoline because they will cause fuel systems to vapor lock under hot conditions. A refinery would prefer to sell them to a petrochemical plant, which is why they are often built side-by-side.

And biofuels, which mostly means fuel ethanol, are problematic because they really involve some double-counting of "oil" production. Refineries are producing diesel fuel, which farmers then burn in their equipment to produce corn for ethanol, which the refineries blend into their gasoline. This is just an indirect and expensive way of converting diesel fuel into gasoline. It would be more efficient for the refineries to enhance their refining processes to produce less diesel fuel and more gasoline, but then the total volume of "oil" produced would be lower.

So, the EIA in Washington and the IEA in Paris are producing numbers which are highly misleading. They purport to show that oil production is remaining stable, when in reality it is starting to decline.
One commentator objected that RMG had obscured the nature of ethanol: "Ethanol is a Gas-to-Liquids process, with a (small) net energy gain, a huge economic gain, and a lot of dry distillers grains feeding livestock. The bulk of the energy used is natgas to make ammonia fertilizer, and natgas to run the distillation. The diesel used to run farm equipment is damn near in the noise. Trucking feedstocks and products around takes way more diesel than minimal-tillage corn farming does."

RMG replied by acknowledging that the "use of natural gas to create nitrogen fertilizer is the biggest energy input into creating fuel ethanol. Coal to generate electricity is another big input. The use of diesel fuel for farm equipment is a smaller input, but the diesel fuel to transport crops and run irrigation pumps is also significant."

But it's not really a gas-to-liquids process, it is a food-to-fuel process and has the secondary effect of driving up world food prices. This is not a minor effect - much of the world has become dependent on imports of cheap American corn, which is becoming less cheap as it is turned into automobile fuel.

I don't see the economic gain, because agricultural production and irrigation are heavily subsidized in the US. Turning corn into fuel ethanol just adds even more subsidies to get rid of the corn surpluses caused by the agricultural and irrigation subsidies. It's piling subsidies on subsidies.
At the end of it all, the whole fuel ethanol system is just a huge mass of subsidies that could be eliminated by forcing car companies to build and people to buy more fuel-efficient vehicles. Instead of putting 10% ethanol into gasoline, Americans could just use 10% less gasoline by buying smaller vehicles - which of course people in almost all other countries already do. The main losers would be industrial-scale corn farmers, oil companies, and automobile companies. However, the latter groups have much better lobbying organizations than taxpayers do.

January 13, 2011

Worse Than You Thought

From the National Center for Atmospheric Research in Boulder:
The magnitude of climate change during Earth’s deep past suggests that future temperatures may eventually rise far more than projected if society continues its pace of emitting greenhouse gases, a new analysis concludes. The study, by National Center for Atmospheric Research (NCAR) scientist Jeffrey Kiehl, will appear as a “Perspectives” piece in this week’s issue of the journal Science.
Building on recent research, the study examines the relationship between global temperatures and high levels of carbon dioxide in the atmosphere tens of millions of years ago. It warns that, if carbon dioxide emissions continue at their current rate through the end of this century, atmospheric concentrations of the greenhouse gas will reach levels that last existed about 30 million to 100 million years ago, when global temperatures averaged about 29 degrees Fahrenheit (16 degrees Celsius) above pre-industrial levels.

Kiehl said that global temperatures may gradually rise over the next several centuries or millennia in response to the carbon dioxide. Elevated levels of the greenhouse gas may remain in the atmosphere for tens of thousands of years, according to recent computer model studies of geochemical processes that the study cites.

The study also indicates that the planet’s climate system, over long periods of time, may be at least twice as sensitive to carbon dioxide as currently projected by computer models, which have generally focused on shorter-term warming trends. This is largely because even sophisticated computer models have not yet been able to incorporate critical processes, such as the loss of ice sheets, that take place over centuries or millennia and amplify the initial warming effects of carbon dioxide. . . .

Kiehl focused on a fundamental question: when was the last time Earth’s atmosphere contained as much carbon dioxide as it may by the end of this century?
If society continues on its current pace of increasing the burning of fossil fuels, atmospheric levels of carbon dioxide are expected to reach about 900 to 1,000 parts per million by the end of this century. That compares with current levels of about 390 parts per million, and pre-industrial levels of about 280 parts per million.
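As a rough consistency check (my arithmetic, not the study's), the 900 to 1,000 parts per million end-of-century range implies roughly 1 percent annual growth in atmospheric CO2 concentration from today's ~390 ppm:

```python
# Back-of-the-envelope: the compound annual growth rate in atmospheric CO2
# implied by reaching 900-1,000 ppm by 2100, starting from ~390 ppm in 2011.
# The ppm figures are from the NCAR release; the calculation is my own.
current_ppm = 390.0
years = 2100 - 2011

for target_ppm in (900.0, 1000.0):
    rate = (target_ppm / current_ppm) ** (1.0 / years) - 1.0
    print(f"{target_ppm:.0f} ppm by 2100 implies ~{rate * 100:.1f}% growth per year")
```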

Since carbon dioxide is a greenhouse gas that traps heat in Earth’s atmosphere, it is critical for regulating Earth’s climate. Without carbon dioxide, the planet would freeze over. But as atmospheric levels of the gas rise, which has happened at times in the geologic past, global temperatures increase dramatically and additional greenhouse gases, such as water vapor and methane, enter the atmosphere through processes related to evaporation and thawing. This leads to further heating.

Kiehl drew on recently published research that, by analyzing molecular structures in fossilized organic materials, showed that carbon dioxide levels likely reached 900 to 1,000 parts per million about 35 million years ago.

At that time, temperatures worldwide were substantially warmer than at present, especially in polar regions—even though the Sun’s energy output was slightly weaker. The high levels of carbon dioxide in the ancient atmosphere kept the tropics at about 9-18 degrees F (5-10 degrees C) above present-day temperatures. The polar regions were some 27-36 degrees F (15-20 degrees C) above present-day temperatures.

Kiehl applied mathematical formulas to calculate that Earth’s average annual temperature 30 to 40 million years ago was about 88 degrees F (31 degrees C)—substantially higher than the pre-industrial average temperature of about 59 degrees F (15 degrees C).

The study also found that carbon dioxide may have at least twice the effect on global temperatures as currently projected by computer models of global climate.

The world’s leading computer models generally project that a doubling of carbon dioxide in the atmosphere would have a heating impact in the range of 0.5 to 1.0 degrees C per watt per square meter. (The unit is a measure of the sensitivity of Earth’s climate to changes in greenhouse gases.) However, the published data show that the comparable impact of carbon dioxide 35 million years ago amounted to about 2 degrees C per watt per square meter.
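A quick worked example may help (my own arithmetic, not the article's). Reading the quoted figures as climate sensitivity parameters in degrees C per watt per square meter, and using the standard logarithmic approximation for CO2 radiative forcing (an assumption on my part; the release does not cite it), the implied warming per CO2 doubling is:

```python
import math

# Standard simplified expression for CO2 radiative forcing (Myhre et al. 1998);
# this formula is my assumption, not something given in the NCAR release.
def co2_forcing(c_ppm, c_ref_ppm=280.0):
    """Radiative forcing in W/m^2 for CO2 at c_ppm relative to c_ref_ppm."""
    return 5.35 * math.log(c_ppm / c_ref_ppm)

doubling_forcing = co2_forcing(2 * 280.0)   # ~3.7 W/m^2 for a CO2 doubling

# Sensitivities quoted in the release, in degrees C per W/m^2.
for label, sensitivity in [("models, low end", 0.5),
                           ("models, high end", 1.0),
                           ("paleoclimate record", 2.0)]:
    warming = sensitivity * doubling_forcing
    print(f"{label}: ~{warming:.1f} degrees C per CO2 doubling")
```

The two model values bracket the familiar range of roughly 2 to 4 degrees C per doubling; the paleoclimate value implies about twice that, which is the article's central point.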

Computer models successfully capture the short-term effects of increasing carbon dioxide in the atmosphere. But the record from Earth's geologic past also encompasses longer-term effects, which accounts for the discrepancy in findings. The eventual melting of ice sheets, for example, leads to additional heating because exposed dark surfaces of land or water absorb more heat than ice sheets.

“This analysis shows that on longer time scales our planet may be much more sensitive to greenhouse gases than we thought,” Kiehl says.
See the further discussion in Climate Progress, which cites other alarming studies showing that "the paleoclimate data is considerably more worrisome than" the (much-criticized) computer models.

Kiehl's conclusion in his Science article:
The above arguments weave together a number of threads in the discussion of climate that have appeared over the past few years. They rest on observations and geochemical modeling studies. Of course, uncertainties still exist in deduced CO2 and surface temperatures, but some basic conclusions can be drawn. Earth’s CO2 concentration is rapidly rising to a level not seen in ∼30 to 100 million years, and Earth’s climate was extremely warm at these levels of CO2. If the world reaches such concentrations of atmospheric CO2, positive feedback processes can amplify global warming beyond current modeling estimates. The human species and global ecosystems will be placed in a climate state never before experienced in their evolutionary history and at an unprecedented rate. Note that these conclusions arise from observations from Earth’s past and not specifically from climate models. Will we, as a species, listen to these messages from the past in order to avoid repeating history?

Methane Leakage from Arctic Shelf

From the National Science Foundation, March 4, 2010, "Methane Releases From Arctic Shelf May be Much Larger and Faster Than Anticipated"


A section of the Arctic Ocean seafloor that holds vast stores of frozen methane is showing signs of instability and widespread venting of the powerful greenhouse gas, according to the findings of an international research team led by University of Alaska Fairbanks scientists Natalia Shakhova and Igor Semiletov.

The research results, published in the March 5 edition of the journal Science, show that the permafrost under the East Siberian Arctic Shelf, long thought to be an impermeable barrier sealing in methane, is perforated and is starting to leak large amounts of methane into the atmosphere. Release of even a fraction of the methane stored in the shelf could trigger abrupt climate warming.

"The amount of methane currently coming out of the East Siberian Arctic Shelf is comparable to the amount coming out of the entire world's oceans," said Shakhova, a researcher at UAF's International Arctic Research Center. "Subsea permafrost is losing its ability to be an impermeable cap."

Methane is a greenhouse gas more than 30 times more potent than carbon dioxide. It is released from previously frozen soils in two ways. When the organic material (which contains carbon) stored in permafrost thaws, it begins to decompose and, under anaerobic conditions, gradually releases methane. Methane can also be stored in the seabed as methane gas or methane hydrates and then released as subsea permafrost thaws. These releases can be larger and more abrupt than those that result from decomposition.

The East Siberian Arctic Shelf is a methane-rich area that encompasses more than 2 million square kilometers of seafloor in the Arctic Ocean. It is more than three times as large as the nearby Siberian wetlands, which have been considered the primary Northern Hemisphere source of atmospheric methane. Shakhova's research results show that the East Siberian Arctic Shelf is already a significant methane source, releasing 7 teragrams of methane yearly, which is as much as is emitted from the rest of the ocean. A teragram is equal to about 1.1 million tons.
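The unit conversion in that last sentence, together with the ~30x potency factor mentioned earlier in the release, can be checked with a few lines of arithmetic (the CO2-equivalent figure at the end is my own rough calculation, not from the release):

```python
# Unit check on the figures in the NSF release: 7 Tg/yr of methane from the
# East Siberian Arctic Shelf, and methane's ~30x potency relative to CO2.
GRAMS_PER_TG = 1e12
GRAMS_PER_METRIC_TON = 1e6
SHORT_TONS_PER_METRIC_TON = 1.10231   # "1.1 million tons" uses US short tons

methane_tg_per_year = 7.0
metric_tons = methane_tg_per_year * GRAMS_PER_TG / GRAMS_PER_METRIC_TON
short_tons = metric_tons * SHORT_TONS_PER_METRIC_TON

# Rough CO2-equivalent using the release's "more than 30 times" potency factor.
co2_equivalent_tg = methane_tg_per_year * 30

print(f"7 Tg/yr = {metric_tons:.1e} metric tons (~{short_tons:.1e} US tons)")
print(f"~{co2_equivalent_tg:.0f} Tg CO2-equivalent per year")
```

So a teragram is one million metric tons, which is about 1.1 million US tons, matching the release's rule of thumb.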

"Our concern is that the subsea permafrost has been showing signs of destabilization already," she said. "If it further destabilizes, the methane emissions may not be teragrams, it would be significantly larger."

Shakhova notes that the Earth's geological record indicates that atmospheric methane concentrations have varied between about 0.3 to 0.4 parts per million during cold periods and 0.6 to 0.7 parts per million during warm periods. Methane concentrations in the Arctic currently average about 1.85 parts per million, the highest in 400,000 years, she said. Concentrations above the East Siberian Arctic Shelf are even higher.