February 27, 2011

Fracking Hazards

The New York Times has a big piece on the environmental dangers of hydraulic fracturing, or fracking, the technique that has doubled U.S. natural gas reserves over the last few years:
While the existence of the toxic wastes has been reported, thousands of internal documents obtained by The New York Times from the Environmental Protection Agency, state regulators and drillers show that the dangers to the environment and health are greater than previously understood.
The documents reveal that the wastewater, which is sometimes hauled to sewage plants not designed to treat it and then discharged into rivers that supply drinking water, contains radioactivity at levels higher than previously known, and far higher than the level that federal regulators say is safe for these treatment plants to handle.

Other documents and interviews show that many E.P.A. scientists are alarmed, warning that the drilling waste is a threat to drinking water in Pennsylvania. Their concern is based partly on a 2009 study, never made public, written by an E.P.A. consultant who concluded that some sewage treatment plants were incapable of removing certain drilling waste contaminants and were probably violating the law.

The Times also found never-reported studies by the E.P.A. and a confidential study by the drilling industry that all concluded that radioactivity in drilling waste cannot be fully diluted in rivers and other waterways.

But the E.P.A. has not intervened. In fact, federal and state regulators are allowing most sewage treatment plants that accept drilling waste not to test for radioactivity. And most drinking-water intake plants downstream from those sewage treatment plants in Pennsylvania, with the blessing of regulators, have not tested for radioactivity since before 2006, even though the drilling boom began in 2008.

In other words, there is no way of guaranteeing that the drinking water taken in by all these plants is safe.

That has experts worried.

“We’re burning the furniture to heat the house,” said John H. Quigley, who left last month as secretary of Pennsylvania’s Department of Conservation and Natural Resources. “In shifting away from coal and toward natural gas, we’re trying for cleaner air, but we’re producing massive amounts of toxic wastewater with salts and naturally occurring radioactive materials, and it’s not clear we have a plan for properly handling this waste.” . . .
There were more than 493,000 active natural-gas wells in the United States in 2009, almost double the number in 1990. Around 90 percent have used hydrofracking to get more gas flowing, according to the drilling industry.
Gas has seeped into underground drinking-water supplies in at least five states, including Colorado, Ohio, Pennsylvania, Texas and West Virginia, and residents blamed natural-gas drilling.

Air pollution caused by natural-gas drilling is a growing threat, too. Wyoming, for example, failed in 2009 to meet federal standards for air quality for the first time in its history partly because of the fumes containing benzene and toluene from roughly 27,000 wells, the vast majority drilled in the past five years.
In late 2008, drilling and coal-mine waste released during a drought so overwhelmed the Monongahela that local officials advised people in the Pittsburgh area to drink bottled water. E.P.A. officials described the incident in an internal memorandum as “one of the largest failures in U.S. history to supply clean drinking water to the public.”
In Texas, which now has about 93,000 natural-gas wells, up from around 58,000 a dozen years ago, a hospital system in six counties with some of the heaviest drilling said in 2010 that it found a 25 percent asthma rate for young children, more than three times the state rate of about 7 percent.

“It’s ruining us,” said Kelly Gant, whose 14-year-old daughter and 11-year-old son have experienced severe asthma attacks, dizzy spells and headaches since a compressor station and a gas well were set up about two years ago near her house in Bartonville, Tex. The industry and state regulators have said it is not clear what role the gas industry has played in causing such problems, since the area has had high air pollution for a while.

“I’m not an activist, an alarmist, a Democrat, environmentalist or anything like that,” Ms. Gant said. “I’m just a person who isn’t able to manage the health of my family because of all this drilling.” . . .
[A] review by The Times of more than 30,000 pages of federal, state and company records relating to more than 200 gas wells in Pennsylvania, 40 in West Virginia and 20 public and private wastewater treatment plants offers a fuller picture of the wastewater such wells produce and the threat it poses.
Most of the information was drawn from drilling reports from the last three years, obtained by visiting regional offices throughout Pennsylvania, and from documents or databases provided by state and federal regulators in response to records requests.

Among The Times’s findings:

¶More than 1.3 billion gallons of wastewater was produced by Pennsylvania wells over the past three years, far more than has been previously disclosed. Most of this water — enough to cover Manhattan in three inches — was sent to treatment plants not equipped to remove many of the toxic materials in drilling waste.

¶At least 12 sewage treatment plants in three states accepted gas industry wastewater and discharged waste that was only partly treated into rivers, lakes and streams.

¶Of more than 179 wells producing wastewater with high levels of radiation, at least 116 reported levels of radium or other radioactive materials 100 times as high as the levels set by federal drinking-water standards. At least 15 wells produced wastewater carrying more than 1,000 times the amount of radioactive elements considered acceptable.

Results came from field surveys conducted by state and federal regulators, year-end reports filed by drilling companies and state-ordered tests of some public treatment plants. Most of the tests measured drilling wastewater for radium or for “gross alpha” radiation, which typically comes from radium, uranium and other elements.
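The "cover Manhattan in three inches" comparison above holds up as a rough order-of-magnitude claim. A quick back-of-the-envelope check (the Manhattan land area of about 22.8 square miles is my assumption, not a figure from the article):

```python
# Rough check: does 1.3 billion gallons cover Manhattan about three inches deep?
GALLONS_PER_CUBIC_FOOT = 7.48

area_sq_ft = 22.8 * 5280**2                       # assumed Manhattan land area
volume_cubic_ft = 1.3e9 / GALLONS_PER_CUBIC_FOOT  # wastewater volume from the article

depth_inches = volume_cubic_ft / area_sq_ft * 12
print(f"{depth_inches:.1f} inches")               # about 3.3 inches
```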

Industry officials say they are not concerned.
“These low levels of radioactivity pose no threat to the public or worker safety and are more a public perception issue than a real health threat,” said James E. Grey, chief operating officer of Triana Energy.
In interviews, industry trade groups like the Marcellus Shale Coalition and Energy in Depth, as well as representatives from energy companies like Shell and Chesapeake Energy, said they were producing far less wastewater because they were recycling much of it rather than disposing of it after each job.

But even with recycling, the amount of wastewater produced in Pennsylvania is expected to increase because, according to industry projections, more than 50,000 new wells are likely to be drilled over the next two decades.
The Times piece is the first part of a longer series, "Drilling Down"; it also has a couple of terrific interactive charts attached. Below is a snapshot of an interactive feature showing the amount by which tested wells exceeded the federal limit for radium (click the link for other toxic wastes). Another explains how fracking is done.

February 25, 2011

Biofuels and the Food Crisis

From Tim Searchinger in The Washington Post:
Each year, the world demands more grain, and this year the world's farms will not produce it. World food prices have surged above the food crisis levels of 2008. Millions more people will be malnourished, and hundreds of millions who are already hungry will eat less or give up other necessities. Food riots have started again.
Nearly all assessments of the 2008 food crisis assigned biofuels a meaningful role, but much of academia and the media ultimately agreed that the scale of the crisis resulted from a "perfect storm" of causes. Yet this "perfect storm" has re-formed not three years later. We should recognize the ways in which biofuels are driving it.

Demand for biofuels is almost doubling the challenge of producing more food. Since 2004, for every additional ton of grain needed to feed a growing world population, rising government requirements for ethanol from grain have demanded a matching ton. Brazil's reliance on sugar ethanol and Europe's on biodiesel have comparably increased growth rates in the demand for sugar and driven up demand for vegetable oil.

Agricultural production is keeping up in general with the growing demand for food - but it keeps up with the added demand for biofuels only if growing weather is good. A good growing year in 2008 helped end that year's crisis, but average-to-poor weather since then has stressed inventories and confidence. Higher fuel costs for farmers and a weaker dollar contribute to higher prices, but prices soar only when large consumers, fearing that production will continue to fall short, bid up prices to secure their supplies.

Much of today's discussion focuses only on the challenge of meeting rising food demand because of factors such as rising meat consumption in China and long-term underinvestment in agricultural research. Droughts in Russia and floods in Australia over the past year may be early harbingers of climate change. But if it is hard to meet rising food demands, it must be harder to meet demands for both food and biofuels.

So why has attention shifted away from biofuels? The answer probably lies in the confusing explanations of 2008, when the problem and its "cause" were defined in different ways.

For example, some studies evaluated the effect of biofuels on retail food prices in the United States rather than on wholesale crop prices worldwide. Not surprisingly, they found little impact. The price of corn in your corn flakes and other retail products is so small that even a tripling of crop prices has little effect at U.S. grocery stores. But the world's poor do not eat processed, packaged corn flakes; they spend more than half of their incomes on staples such as corn meal.

Several reports tried to segregate the precise role of biofuels from weather and other factors. That's not possible because the causes multiply each other. Just as a political tremor in the Middle East makes oil prices jump in tight markets, so drought in Russia sends wheat futures soaring once biofuels have stressed grain markets. In 2008 and again recently, some governments have responded by banning grain exports to keep domestic prices down. This has the effect of forcing prices higher for everyone else. You can blame national self-interest and the inevitable vagaries of weather, but the key is to avoid tight markets in the first place.

A broad misunderstanding has also arisen from economic models predicting price increases from biofuels that are still far lower than those of the past decade. In fact, these models do not estimate biofuel effects on prices today but those in a future market "equilibrium," which will exist only after farmers have ample time to increase production to match demand. Today, the market is out of equilibrium. Biofuels have grown rapidly, from consuming 2 percent of world grain and virtually no vegetable oil in 2004 to more than 6.5 percent of grain and 8 percent of vegetable oil last year. Governments worldwide seek to triple production of biofuels by 2020, and that implies more moderately high prices after good growing years and soaring prices after bad ones.

The good news is that relief is possible. The same economic studies imply that food prices should come down if we can just limit biofuel growth. Corn ethanol is nearing Congress's requirement for 15 billion gallons a year, and lawmakers need to hold it there. Similarly, Europe must rethink its mandates. For "advanced biofuels" required by Congress, the Obama administration needs to focus on fuel sources that do not compete with food, such as garbage and crop residues, and not grasses grown on good cropland. Otherwise, the sequel to the food crisis is likely to turn into a series.

The writer is a research scholar at Princeton University and a fellow of the German Marshall Fund of the United States.

February 20, 2011

Unprecedented Spread in Oil Prices

The spread between the price of West Texas Intermediate crude, which trades in New York, and Brent, which trades in London, reached nearly $19 last week (though it subsequently fell to around $13). A spread in the teens is unprecedented. Here's a chart from Jim Bianco showing what's up with prices:

Fortune explains the reasons for the difference:
Increased flows from Canadian tar sands and the Bakken shale fields in the northern Great Plains have sent oil flooding into Cushing, Okla., where the WTI crude contract is priced. But because pipelines are set to run into Cushing, not out, much of that oil is going into storage rather than into refineries. Oil stockpiles in Cushing hit their highest level in seven years last month. 
Gasoline prices have followed the Brent price upward. East Coast gasoline is priced off Brent crude because the region's refineries import oil from European terminals.

The South is also feeling the squeeze, as Fortune explains:
Take Louisiana Light Sweet crude, a fuel blend favored by refiners in the south who are cut off from the WTI market. The spread between WTI and Louisiana Light recently hit $21 a barrel, another record. 
Running off the WTI glut could ease the pressure on Brent and Louisiana Light prices. But the glut isn't going anywhere any time soon.

Pipelines that take oil out of Cushing are at least two years away, and oil companies that stand to rake in fat refining profits aren't exactly looking to rush that timeline.
Here's another piece of bad news:
A study released last month by IHS Global Economics says a 25-cent rise in the gasoline price, all else equal, will reduce employment by some 600,000 jobs over the following two years. And the steeper the rise, the more jobs that stand to be lost.
Early Warning has a chart of the Brent-WTI spread over the last twenty years:

It's Official: You Should Worry About Solar Flares

From the FT, reporting an official warning that an intense electromagnetic storm would be a "global Katrina" and could, "in the worst case," cost the world economy $2,000bn.
Senior officials responsible for policy on solar storms – also known as space weather – in the US, UK and Sweden urged more preparedness at the annual meeting of the American Association for the Advancement of Science in Washington.

“We have to take the issue of space weather seriously,” said Sir John Beddington, UK chief scientist. “The sun is coming out of a quiet period, and our vulnerability has increased since the last solar maximum [around 2000].”

“Predict and prepare should be the watchwords,” agreed Jane Lubchenco, head of the US National Oceanic and Atmospheric Administration. “So much more of our technology is vulnerable than it was 10 years ago.” . . .

[An] extreme storm can shut down communications satellites for many hours – or even cause permanent damage to their components. On the ground, the intense magnetic fluctuations can induce surges in power lines, leading to grid failures such as the one that blacked out the whole of Quebec in 1989.

The 11-year cycle of solar activity is quite variable and the present one is running late, with the next maximum expected in 2013.

The peak was not expected to be very strong but that should not cause complacency, said Tom Bogdan, director of the US Space Weather Prediction Center.

The most intense solar storm on record, which ruined much of the world’s newly installed telegraph network in 1859, took place during an otherwise weak cycle. An 1859-type storm today could knock out the world’s information, communications and electricity distribution systems, at a cost estimated by the US government at $2,000bn.

February 19, 2011

Cheer Up, The Weather's Fine

Futurist Ray Kurzweil predicts the world will get 100 percent of its energy from solar in 20 years. (Alas, he also thinks, according to an interviewer from Grist, that "eventually we'll be able to live forever, and maybe even bring people back from the dead.")
One of my primary theses is that information technologies grow exponentially in capability and power and bandwidth and so on. If you buy an iPhone today, it's twice as good as two years ago for half the cost. That is happening with solar energy -- it is doubling every two years. And it didn't start two years ago, it started 20 years ago. Every two years, we have twice as much solar energy in the world.
Today, solar is still more expensive than fossil fuels, and in most situations it still needs subsidies or special circumstances, but the costs are coming down rapidly -- we are only a few years away from parity. And then it's going to keep coming down, and people will be gravitating towards solar, even if they don't care at all about the environment, because of the economics.
So right now it's at half a percent of the world's energy. People tend to dismiss technologies when they are half a percent of the solution. But doubling every two years means it's only eight more doublings before it meets 100 percent of the world's energy needs. So that's 16 years. We will increase our use of electricity during that period, so add another couple of doublings: In 20 years we'll be meeting all of our energy needs with solar, based on this trend which has already been underway for 20 years.

People say we're running out of energy. That's only true if we stick with these old 19th-century technologies. We are awash in energy from the sunlight.
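Kurzweil's doubling arithmetic is easy to verify; a minimal sketch:

```python
# Starting from 0.5% of world energy and doubling every two years,
# count the doublings needed to reach 100%.
share = 0.005
doublings = 0
while share < 1.0:
    share *= 2
    doublings += 1

print(doublings, "doublings =", doublings * 2, "years")  # 8 doublings = 16 years
# Kurzweil adds roughly two more doublings for growing electricity
# demand, which is where the 20-year figure comes from.
```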

February 18, 2011

The Risk to Saudi Arabia

From Ambrose Evans-Pritchard:
At least four protesters were killed in a bloody crackdown in Bahrain after tanks entered the capital and security forces smashed a tent city in the main square, opening fire with grapeshot. The situation is fraught with risk since a Sunni monarchy rules a Shia majority with mixed Iranian ancestry and sympathetic ties to Tehran.

"Bahrain is the main danger, not because it is intrinsically important, but because it could trigger intervention by Saudi Arabia," said Faysal Itani, a Mid-East expert at consultants Exclusive Analysis. "We have heard reports that the Saudis have already dispatched troops and equipment to put down the uprising". . . .

BNP Paribas said events in the Gulf were setting off a scramble for scarce energy supplies. "Rising oil prices are becoming an increasing threat to the global economy, hitting the net oil consumers of China, Europe, and Japan the most."

Credit default swaps (CDS) measuring bond risk in Bahrain have been rising all week, surging another 24 basis points yesterday to 285. The CDS on Saudi Arabia rose 3 to 127, and Egypt pushed back 9 to 360 as the political euphoria gives way to industrial strikes. The Mid-East holds 60pc of the world's proven oil reserves, and makes up 36pc of current supply.

"The real issue is whether this will spread to Saudi Arabia," said Mr Itani. "The Kingdom's Eastern Province is also mostly Shi'ite and contains the vast majority of its oil reserves. When markets start to make the connection between the region's political geography and the location of the oil, that is when they might react."

The province is the headquarters of the Saudi oil giant Aramco and holds the huge Safaniya, Shaybah and Ghawar oilfields. The area rose up in 1979 in sympathy with the Iranian revolution, leading to 21 deaths. The Saudis have since made efforts to address grievances.

Mr Itani said Western observers are complacent about Saudi Arabia. "We do not believe things are as docile as presented. There is very high unemployment, deeply-aggrieved minorities, and the army has been kept weak and divided for political reasons," he said.

Mr Itani said the Shia majority in Bahrain were kept out of sensitive jobs and denied a real voice in the island's parliament, where an appointed upper house calls the shots. "This uprising has been brewing for two years but the authorities were able to confine protests to the villages until now. This time it reached Manama," he said.

Oil analysts say crude prices would become a major drag on growth if they punch above $120 a barrel and stay there for long. It was this sort of level that sapped the foundations of the global economy in mid 2008, though prices did briefly spike to $147.

Rising oil costs are a major headache for China and other emerging powers in Asia that have a much higher dependence on energy for each unit of output than the West. They rely on cheap shipping costs to transport exports with wafer-thin profit margins.

Lighter US crude has not risen as much, though it jumped $2 yesterday to $86.50. However, US crude and Brent prices rarely diverge for long. A major disruption in the Gulf would ultimately cause oil to rise across the global system.

Most analysts expect Brent to drop back to US crude levels once the Mid-East jitters subside. "Prices are expected to severely correct downward as current prices are too high for fundamentals," said Christophe Barret from Credit Agricole CIB.

However, OPEC is already overshooting its output target by some 2m barrels-a-day (b/d). It is not clear whether the cartel has the spare capacity to meet fast-growing demand in Asia for much longer, leaving a thinner margin of security in the energy markets.

The International Energy Agency (IEA) expects consumption to jump by over 2m b/d to 89.3m this year, the fifth upgrade since August. The global "oil burden" will rise sharply to 4.7pc of GDP, "close to levels that have coincided in the past with a marked economic slowdown."

"The combination of higher prices, emerging inflationary pressures and instability in the Middle East is not a healthy one," said the IEA's monthly report.
Two additional data points underline the potential for disruption from the Persian Gulf:

Seventeen million barrels of oil passes through the Persian Gulf and the Strait of Hormuz every day, according to an IEA official. The IEA's chief economist, Fatih Birol, estimates that "90 percent of growth in oil production" will have to be met by Middle East and North African countries.

Record Melt in Greenland

From the NASA Earth Observatory site:  
2010 was an exceptional year for Greenland’s ice cap. Melting started early and stretched later in the year than usual. Little snow fell to replenish the losses. By the end of the season, much of southern Greenland had set a new record, with melting that lasted 50 days longer than average.
This image was assembled from microwave data from the Special Sensor Microwave/Imager (SSM/I) of the Defense Meteorological Satellites Program. Snow and ice emit microwaves, but the signal is different for wet, melting snow than for dry. Marco Tedesco, a professor at the City College of New York, uses this difference to chart the number of days that snow is melting every year. The image above shows 2010 compared to the average number of melt days per year between 1979 and 2009.

The long melt season primarily affected southern and western Greenland, where communities experienced their warmest year on record. After a warm, dry winter, temperatures were particularly high in the spring, getting the melt season off to a strong start. The early melting set the tone for the rest of the season, leading to more melting all the way into mid-September.

When snow melts, the fine, bright powder turns to larger-grained, gravelly snow. These large grains reflect less light, which means that they can absorb more energy and melt even faster. When the annual snow is melted away, parts of the ice cap are exposed. The surface of the ice is also darker than snow. Since dark ice was exposed earlier and longer in 2010, it absorbed more energy, leading to a longer melt season. A fresh coat of summer snow would have protected the ice sheet, but little snow fell.

Melting ice in Greenland freshens the seas near the Arctic and contributes to rising sea levels around the world. It is unclear just how much melting ice from Greenland will push sea levels up, largely because the melting is occurring much more quickly than scientists predicted. Current estimates call for an increase of up to 0.6 meters by 2100.

February 17, 2011

New National Intelligence Estimate on Iran's Nuclear Program

From the Wall Street Journal:
A new classified U.S. intelligence assessment concludes that Iran's leaders are locked in an increasingly heated debate over whether to move further toward developing nuclear weapons, saying the bite of international sanctions may be sowing discord.
The new national intelligence estimate, or NIE, says Tehran likely has resumed work on nuclear-weapons research in addition to expanding its program to enrich uranium—updating a contested 2007 estimate that concluded the arms program had all but halted in 2003.

But it doesn't conclude that Iran has relaunched a full-blown program to try to build bombs. According to the assessment, Iran's debate over whether to do so suggests international sanctions may be causing divisions in Tehran, U.S. officials said. . . .

The NIE's findings suggest that, in the U.S. view, at least some Iranian leaders are worried that economic turmoil fueled in part by international sanctions could spur opposition to the regime—though officials acknowledge it is impossible for outsiders to determine the precise effect of sanctions on decision-making in Tehran.

Iran's government has also taken steps to stifle any possible unrest in response to its own economic measures, after Tehran significantly cut subsidies for fuel, electricity and basic food items in late December.

An NIE is considered the consensus view of all the various U.S. intelligence agencies, and, as a result, carries more weight than an analysis coming from any one part of the intelligence community.

The new assessment is the first full, new analysis by the intelligence community since the 2007 estimate, which concluded that Iran had halted its nuclear-weapon design and weaponization work, as well as its covert uranium enrichment-related activities. Those findings were disputed by some European spy agencies. Iran denies it is trying to develop nuclear weapons.

U.S. officials say at least some of those 2007 assertions have been revised in the new NIE. But the new assessment stops short of rejecting the earlier findings.

"The bottom line is that the intelligence community has concluded that there's an intense debate inside the Iranian regime on the question of whether or not to move toward a nuclear bomb," a U.S. official said. "There's a strong sense that a number of Iranian regime officials know that the sanctions are having a serious effect."

Such conclusions are likely to stiffen the resolve of Obama administration officials to tighten sanctions further. The White House declined to comment on the new intelligence assessment. . . .

The new intelligence findings also reflect a growing consensus among the U.S. and its allies that Tehran's suspected effort to obtain a warhead has been significantly slowed by the combination of sanctions and problems at its nuclear facilities.

Senior Israeli officials said last month that Tehran may be at least four years away from being able to produce a nuclear weapon because of technological difficulties, a notably longer timeline than Israelis had used previously. Soon after, U.S. Secretary of State Hillary Clinton said Washington believed Iran's nuclear program faced mounting "technical" problems.

In a separate threat assessment presented to the Senate Intelligence Committee Wednesday, the director of national intelligence, James Clapper, said the U.S. believes Iran should be capable of producing enough highly enriched uranium for a weapon "in the next few years," provided Tehran makes the decision to do so.

Officials in the U.S., Europe and Asia credit, in part, an international campaign that they say has restricted Iran's ability to procure the raw materials needed to build an atomic bomb.

Officials say Iran has had difficulty acquiring carbon fiber and a particular high-strength steel, two critical components for making machinery used in producing enriched uranium.

Iran's nuclear program also appears to have been slowed by problems in the computer system used to run its enrichment equipment, officials said.

Officials say Tehran is encountering problems deploying advanced centrifuge machines that could drastically accelerate the production of highly enriched uranium, which is needed for a nuclear bomb.

Experts attribute equipment failures among such machines at Iran's main uranium-enrichment plant to a computer worm known as Stuxnet. Iran has acknowledged the computer attacks. Experts speculate that Stuxnet was developed by Israel or the U.S., or both, though neither government has confirmed any role.

A report issued on Wednesday by David Albright, an expert on Iran's nuclear program who heads the Institute for Science and International Security, said it is increasingly accepted that a successful Stuxnet attack in late 2009 or early 2010 destroyed about 1,000 Iranian centrifuges out of about 9,000 at the site.

"The effect of this attack was significant," Mr. Albright said in the report. "It rattled the Iranians, who were unlikely to know what caused the breakage, delayed the expected expansion of the plant, and further consumed a limited supply of centrifuges to replace those destroyed."

Mr. Clapper, in testimony to Congress, said the intelligence community believes Iran is "keeping open the option to develop nuclear weapons in part by developing various nuclear capabilities that better position it to produce such weapons, should it choose to do so. We do not know, however, if Iran will eventually decide to build nuclear weapons."

High Costs of Coal

From Reuters, a report on a new study by an associate of Harvard's Center for Health and the Global Environment:
The United States' reliance on coal to generate almost half of its electricity costs the economy about $345 billion a year in hidden expenses not borne by miners or utilities, including health problems in mining communities and pollution around power plants, a study found.
Those costs would effectively triple the price of electricity produced by coal-fired plants, which are prevalent in part due to their low cost of operation, the study led by a Harvard University researcher found.

"This is not borne by the coal industry, this is borne by us, in our taxes," said Paul Epstein, a Harvard Medical School instructor and the associate director of its Center for Health and the Global Environment, the study's lead author.

"The public cost is far greater than the cost of the coal itself. The impacts of this industry go way beyond just lighting our lights."

Coal-fired plants currently supply about 45 percent of the nation's electricity, according to U.S. Energy Department data. Accounting for all the ancillary costs associated with burning coal would add about 18 cents per kilowatt hour to the cost of electricity from coal-fired plants, shifting it from one of the cheapest sources of electricity to one of the most expensive. 
In the year that ended in November, the average retail price of electricity in the United States was about 10 cents per kilowatt hour, according to the Energy Department. 
Advocates of coal power have argued that it is among the cheapest of fuel sources available in the United States, allowing for lower-cost power than that provided by the developing wind and solar industries.
"The Epstein article ignores the substantial benefits of coal in maintaining lower energy prices for American families and businesses," said Lisa Camooso Miller, a spokeswoman for the American Coalition for Clean Coal Electricity, an industry group. "Lower energy prices are linked to a higher standard of living and better health."

The estimate of hidden costs takes into account a variety of side-effects of coal production and use. Among them are the cost of treating elevated rates of cancer and other illnesses in coal-mining areas; environmental damage and lost tourism opportunities in coal regions where mountaintop removal is practiced; and climate change resulting from elevated emissions of carbon dioxide from burning the coal. . . .

The $345 billion annual cost figure was the study's best estimate of the costs associated with burning coal. The study said the costs could be as low as $175 billion or as high as $523 billion.
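As a rough cross-check of the "about 18 cents per kilowatt hour" figure, dividing the study's $345 billion best estimate by annual coal-fired generation lands in the same neighborhood. The sketch below uses approximate 2010 totals (about 4,100 TWh of total U.S. generation, 45 percent from coal); those inputs are my assumptions, not numbers from the study:

```python
# Rough check of the study's "about 18 cents per kWh" hidden-cost figure.
# Assumed inputs (approximate 2010 values, not taken from the study itself):
US_GENERATION_TWH = 4_100        # total U.S. electricity generation, TWh/year
COAL_SHARE = 0.45                # coal's approximate share of generation
HIDDEN_COST_BILLION = 345        # study's best estimate of hidden costs, $/year

coal_kwh = US_GENERATION_TWH * COAL_SHARE * 1e9      # TWh -> kWh
added_cents_per_kwh = HIDDEN_COST_BILLION * 1e9 / coal_kwh * 100
print(round(added_cents_per_kwh, 1))                 # roughly 18-19 cents/kWh
```

Added to the roughly 10-cent average retail price quoted above, that is how the study arrives at a near-tripling of coal's effective cost.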
There is more from the NYT Green blog:
“We really don’t appreciate the public health dimension of what this is costing us,” said Paul Epstein, lead author of the study and a public health expert at Harvard Medical School. “I think we’ve been sticking our heads into the sand.”
Even the study’s most conservative estimate of the uncounted cost of coal — $175 billion per year — would more than double the average cost of coal-fired electricity, its authors found. At this lower range, roughly 80 percent of the costs were from well-documented public health impacts like lung and heart disease, with the rest of the costs attributed to climate change and other environmental impacts as well as local economic effects like lost tourism in coal-mining areas. . . .
Dr. Epstein maintains that the true costs of coal are probably even higher than the study’s worst-case estimate of more than $500 billion per year. Much remains unknown about the public health dimensions of coal mining, processing and combustion, particularly the effect on groundwater, and with a lack of firm data, the study ignored a host of probable pollution-related health impacts.
“Part of the epidemic of cancer can be attributable to some of these carcinogens that we’re pouring into the groundwater from extracting fossil fuels,” he said.
The infiltration of carcinogens into the residential water supply of Appalachian communities may be particularly acute, he said, and public health studies are under way to determine the severity of the contamination.

“We see the accidents and the deaths of some of the miners. We see some of the impacts of mountaintop removal,” he said. “We don’t see the benzene and lead and mercury and arsenic — the whole slew of carcinogenic materials affecting household waters.”

February 16, 2011

Solar Flares Gonna Get Your Mama

In December I posted an item about the Carrington Event, the Big Solar Flare of 1859 that fried telegraph offices and that would, if repeated, do tremendous damage to our communications and energy infrastructure. Chet Nagle of the Daily Caller has more detail on recent predictions and on what a repeat would mean.
On August 1st and 2nd of last year, an entire hemisphere of the solar disk erupted. The side of the sun facing the earth convulsed in flares, waves of plasma, radio storms and glowing magnetic arcs that blew billions of tons of the sun into space. Astronomers saw solar mayhem so unusual, so violent, that its significance to civilization and life on earth is now being debated by astrophysicists, chemists, engineers, and even clerics. Our alert media did not tell you about it? Then watch the recording of the event by NASA's Solar Dynamics Observatory here.
What you will see if you click on the link is a movie of the sun taken in the extreme ultraviolet part of the spectrum; a clock in the lower left of the screen shows the time and date.

“Wow,” you say. “What a light show!” Yes, indeed, and when the solar tsunami reached the earth on August 3rd, it created an aurora borealis display as far south as Iowa. Earth’s magnetic field reverberated from the impact like a big bell hit by a sledgehammer. “Okay,” you say. “I’m sure glad that’s over.” Unfortunately, it is not over. In fact, it is just beginning. Analysts believe a second, greater solar flare is coming — in 2012.

Solar flares appear in eleven-year cycles. The science of why they happen, and how they affect the earth, is complex and replete with words like: coronal mass ejections, auroral electrojet, geomagnetic storm, and solar maximum — or solar max. And scientists, like Mausumi Dikpati of the National Center for Atmospheric Research, believe the next solar max will likely be a “doozy.”

Solar max events are fairly common. One in 1958 blew Northern Lights all the way down to Mexico. A 1972 solar max knocked out telephone lines across Illinois. In 1989, six million people in Canada were blacked out when their power lines were overloaded and fried. That solar max melted huge transformers as far south as New Jersey. In 2005, an x-ray storm left the sun at the speed of light, reached us in eight minutes, and interrupted satellite communications and GPS signals.

The biggest recorded solar max occurred in 1859, the famous Carrington White Light Solar Flare. That “doozy” caused night skies to erupt in auroras so brilliant that newspapers could be read as easily as in daylight. Northern Lights were seen in Cuba, Jamaica, and the Bahamas. Telegraph systems went wild, and sparks shocked telegraph operators and set telegraph paper on fire. Even when telegraphers disconnected batteries, powerful solar max currents in the wires still allowed messages to be transmitted.

In 1859, there were no satellites, of course. There were no cell phones, radio, television, GPS, or Internet. The national power grid did not exist either, and we weren’t dependent on electricity and electronics. Now imagine what would happen if a geomagnetic storm like the 1859 event occurred today, a solar max scientists estimate was driven by the power of 100 billion atomic explosions the size of the Hiroshima bomb!

Somehow the question of whether this will help Obama get reelected, which Nagle subsequently gets into, seems secondary to me. Googling "Mausumi Dikpati of the National Center for Atmospheric Research" yielded the following story from NASA Science News, dated March 10, 2006:
It's official: Solar minimum has arrived. Sunspots have all but vanished. Solar flares are nonexistent. The sun is utterly quiet.

Like the quiet before a storm.

This week researchers announced that a storm is coming--the most intense solar maximum in fifty years. The prediction comes from a team led by Mausumi Dikpati of the National Center for Atmospheric Research (NCAR). "The next sunspot cycle will be 30% to 50% stronger than the previous one," she says. If correct, the years ahead could produce a burst of solar activity second only to the historic Solar Max of 1958.

That was a solar maximum. The Space Age was just beginning: Sputnik was launched in Oct. 1957 and Explorer 1 (the first US satellite) in Jan. 1958. In 1958 you couldn't tell that a solar storm was underway by looking at the bars on your cell phone; cell phones didn't exist. Even so, people knew something big was happening when Northern Lights were sighted three times in Mexico. A similar maximum now would be noticed by its effect on cell phones, GPS, weather satellites and many other modern technologies.

Dikpati's prediction is unprecedented. In the nearly two centuries since the 11-year sunspot cycle was discovered, scientists have struggled to predict the size of future maxima—and failed. Solar maxima can be intense, as in 1958, or barely detectable, as in 1805, obeying no obvious pattern.

The key to the mystery, Dikpati realized years ago, is a conveyor belt on the sun.

We have something similar here on Earth—the Great Ocean Conveyor Belt, popularized in the sci-fi movie The Day After Tomorrow. It is a network of currents that carry water and heat from ocean to ocean--see the diagram below. In the movie, the Conveyor Belt stopped and threw the world's weather into chaos.

The sun's conveyor belt is a current, not of water, but of electrically-conducting gas. It flows in a loop from the sun's equator to the poles and back again. Just as the Great Ocean Conveyor Belt controls weather on Earth, this solar conveyor belt controls weather on the sun. Specifically, it controls the sunspot cycle.
Solar physicist David Hathaway of the National Space Science & Technology Center (NSSTC) explains: "First, remember what sunspots are--tangled knots of magnetism generated by the sun's inner dynamo. A typical sunspot exists for just a few weeks. Then it decays, leaving behind a 'corpse' of weak magnetic fields."

Enter the conveyor belt.

"The top of the conveyor belt skims the surface of the sun, sweeping up the magnetic fields of old, dead sunspots. The 'corpses' are dragged down at the poles to a depth of 200,000 km where the sun's magnetic dynamo can amplify them. Once the corpses (magnetic knots) are reincarnated (amplified), they become buoyant and float back to the surface." Presto—new sunspots!

All this happens with massive slowness. "It takes about 40 years for the belt to complete one loop," says Hathaway. The speed varies "anywhere from a 50-year pace (slow) to a 30-year pace (fast)."

When the belt is turning "fast," it means that lots of magnetic fields are being swept up, and that a future sunspot cycle is going to be intense. This is a basis for forecasting: "The belt was turning fast in 1986-1996," says Hathaway. "Old magnetic fields swept up then should re-appear as big sunspots in 2010-2011."

Like most experts in the field, Hathaway has confidence in the conveyor belt model and agrees with Dikpati that the next solar maximum should be a doozy. But he disagrees with one point. Dikpati's forecast puts Solar Max at 2012. Hathaway believes it will arrive sooner, in 2010 or 2011.

"History shows that big sunspot cycles 'ramp up' faster than small ones," he says. "I expect to see the first sunspots of the next cycle appear in late 2006 or 2007—and Solar Max to be underway by 2010 or 2011."
From NASA's Earth Observatory site comes news that will no doubt improve scientific understanding of the sun's mysteries.
For the first time in history, the world has a full view of the far side of the Sun—and of the entire 360-degree sphere at once, for that matter—thanks to NASA’s Solar Terrestrial Relations Observatory (STEREO). On February 6, 2011, the twin satellites reached opposite sides of the Sun, allowing space weather watchers to detect activity at any point on the sphere and to image eruptions that might be headed toward Earth.
“For the first time ever, we can watch solar activity in its full 3-dimensional glory,” said Angelos Vourlidas, a member of the STEREO science team from the U.S. Naval Research Laboratory. . . (download high definition animation here)

It has been a long journey to a full 360 view of our nearest star. For hundreds of years, ground-based astronomers could only observe the Earth-facing side of the Sun. (Unlike the Moon, the Sun rotates, but we can only see half of the sphere at a time.) With the launch of the Solar and Heliospheric Observatory in 1995, researchers developed methods of helioseismology—studying wave propagation from inside the Sun—to model what was happening on the far side.
It has only been since October 2006, with the launch of STEREO, that scientists have been able to get a true “view” around the earth-visible limb of the Sun. For nearly five years, the two spacecraft slowly moved out in opposite arcs from a common point in the line from Sun to Earth. It took until this month to get to the full separation of 180 degrees.

Inner Niger Delta Threatened By Libya Water Plan

From Yale Environment360
The floods in what geographers call the inner Niger delta nurture abundant fish for the Bozo people, who lay their nets in every waterway and across the lakes. . . . This inland delta is Africa’s second-largest floodplain and one of its most unique wetlands. Seen from space, it is an immense smudge of green and blue on the edge of the Sahara.

But this rare and magnificently productive ecosystem is now facing an unprecedented threat, as a Libyan-backed enterprise has begun construction of a project inside Mali that will divert large amounts of Niger River water for extensive irrigation upstream.

This is all part of a grand plan by Libyan leader Moammar Gaddafi to make his desert nation self-sufficient in food through long-term deals with nearby countries to grow food for Libya. Mali’s president has agreed to the scheme, which numerous experts say will enhance Libyan food security at the expense of Malian food security by sucking dry the river that feeds the inland delta, diminishing the seasonal floods that support rich biodiversity — and thriving agriculture and fisheries vital to a million of Mali’s poorest citizens — on the edge of the Sahara desert.

February 15, 2011

Confuting the Peak Oil Skeptics

Richard Heinberg writes with great clarity and conviction about, among other things, Peak Oil. This is from the latest installment of his new work on the limits to growth.
The trends in the oil industry are clear and undisputed: exploration and production are becoming more costly, and are entailing more environmental risks, while competition for access to new prospective regions is generating increasing geopolitical tension. The rate of oil discoveries on a worldwide basis has been declining since the early 1960s, and most exploration and discovery are now occurring in inhospitable regions such as in ultra-deepwater (at ocean depths of up to three miles) and the Arctic, where operating expenses and environmental risks are extremely high. This is precisely the situation we should expect to see as the low-hanging fruit disappear and global oil production nears its all-time peak in terms of flow rate. 
While the U.S. Department of Energy and the IEA continue to produce mildly optimistic forecasts suggesting that global liquid fuels production will continue to grow until at least 2030 or so, these forecasts now come with a semi-hidden caveat: as long as implausibly immense investments in exploration and production somehow materialize. This hedged sanguinity is echoed in statements from ExxonMobil and Cambridge Energy Research Associates, as well as a few energy economists. Nevertheless, it is fair to say that most serious analysts now expect a near-term (i.e., within the current decade) commencement of decline in global crude oil and liquid fuels production. A survey last year of about a hundred of the world’s most respected petroleum geologists by the Association for the Study of Peak Oil (ASPO) found that the vast majority expected world oil production to peak between 2010 and 2020. Prominent oil industry figures such as Charles Maxwell and Boone Pickens say the peak either already has happened or will do so soon. And recent detailed studies by governments and industry groups reached this same conclusion. Toyota, Virgin Airlines, and other major fuel price-sensitive corporations routinely include Peak Oil in their business forecasting models.

Examined closely, the arguments of the Peak Oil naysayers actually boil down to a tortuous effort to say essentially the same things as the Peaksters do, but in less dramatic (some would say less accurate and useful) ways: Cornucopian pundits like Daniel Yergin of Cambridge Energy Research Associates speak of a peak not in supply, but in demand for petroleum (but of course, this reduction in demand is being driven by rising oil prices—so what exactly is the difference?). Or they emphasize that the world is seeing the end of cheap oil, not of oil per se. They point to enormous and, in some cases, growing petroleum reserves worldwide—yet close examination of these alleged reserves reveals that most consist of “paper reserves” (claimed numbers based on no explicit evidence), or bitumen and other oil-related substances that require special extraction and processing methods that are slow, expensive, and energy-intensive. Read carefully, the statements of even the most ebullient oil boosters confirm that the world has entered a new era in which we should expect prices of liquid fuels to remain at several times the inflation-adjusted levels of only a few years ago.

Quibbling over the exact meaning of the word “peak” or the exact timing of the event, or what constitutes “oil” is fairly pointless. The oil world has changed. And this powerful shock to the global energy system has just happened to coincide with a seismic shift in the world’s economic and financial systems. 
Heinberg also explores how markets respond to resource scarcity by considering the boom and bust cycle of the petroleum industry. "The standard economic assumption is that, as a resource becomes scarce, prices will rise until some other resource that can fill the same need becomes cheaper by comparison." But oil is different:
Once upon a time (about a dozen years past), oil sold for $20 a barrel in inflation-adjusted figures, and The Economist magazine ran a cover story explaining why petroleum prices were set to go much lower. The U.S. Department of Energy and the International Energy Agency were forecasting that, by 2010, oil would probably still be selling for $20 a barrel, but they also considered highly pessimistic scenarios in which the price could rise as high as $30 (those forecasts are in 1996 dollars).

Instead, as the new decade wore on, the price of oil soared relentlessly, reaching levels far higher than the “pessimistic” $30 range. Demand for the resource was growing, especially in China and some oil exporting nations like Saudi Arabia; meanwhile, beginning in 2005, actual world oil production hit a plateau. Seeing a perfect opportunity (a necessary commodity with stagnating supply and growing demand), speculators drove the price up even further.

As prices lofted, oil companies and private investors started funding expensive projects to explore for oil in remote and barely accessible places, or to make synthetic liquid fuels out of lower-grade carbon materials like bitumen, coal, or kerogen.

But then in 2008, just as the price of a barrel of oil reached its all-time high of $147, the economies of the OECD countries crashed. Airlines and trucking companies downsized and motorists stayed home. Demand for oil plummeted. So did oil’s price, bottoming out at $32 at the end of 2008.

But with prices this low, investments in hard-to-find oil and hard-to-make substitutes began to look tenuous, so tens of billions of dollars’ worth of new energy projects were canceled or delayed. Yet the industry had been counting on those projects to maintain a steady stream of liquid fuels a few years out, so worries about a future supply crunch began to make headlines.

It is the financial returns on their activities that motivate oil companies to make the major investments necessary to find and produce oil. There is a long time lag between investment and return, and so price stability is a necessary condition for further investment.

Here was a conundrum: low prices killed future supply, while high prices killed immediate demand. Only if oil’s price stayed reliably within a narrow—and narrowing—“Goldilocks” band could serious problems be avoided. Prices had to stay not too high, not too low—just right—in order to avert economic mayhem. . . .

Not many resources, when they become scarce, have the capability of choking off economic activity as directly as oil shortages can. But as more and more resources acquire the Goldilocks syndrome, general commodity prices will likely spike and crash repeatedly, making a hash of efforts to stabilize the economy.

February 14, 2011

Obama Proposes Nuclear Loan Guarantees of $54.5 Billion

From Bloomberg:
President Barack Obama’s 2012 budget almost triples U.S. loan guarantees for nuclear power-plant construction, funds development of a new breed of smaller reactors and spends more on “breakthrough” energy research.

The Energy Department would get $29.5 billion, a 12 percent increase, to support Obama’s proposed “clean-energy standard,” which requires 80 percent of U.S. electricity to come from low-pollution sources by 2035, according to the fiscal 2012 budget released today. The proposed standard is part of Obama’s policy “to ensure strong American leadership in the clean-energy economy,” according to the budget plan.

A program that guarantees as much as $18.5 billion in loans for construction of nuclear reactors would expand by $36 billion in 2012, to backstop $54.5 billion in lending. An initiative to develop designs for “small modular reactors” would get $67 million. Such reactors would be about a third the size of those now used by U.S. power companies, according to the department.

“We don’t think nuclear loan guarantees are a very good bet,” given the industry’s history of cost overruns, said Robert Cowin, a Washington representative for the Union of Concerned Scientists based in Cambridge, Massachusetts. “The $36 billion the president has proposed is excessive.”

Obama’s budget, which requires approval by Congress, would also expand loan guarantees for energy-efficiency and renewable-energy programs by as much as $2 billion. . . .

Obama also proposed a $36 billion expansion in nuclear loan guarantees last year for fiscal 2011. Congress failed to pass a budget last year, and lawmakers have been funding the federal government with a series of stop-gap spending measures, or continuing resolutions, which haven’t expanded the nuclear loan-guarantee program.

February 13, 2011

The Institutional Framework for Water Policy

This table from a recent report by Peter Gleick suggests the sheer complexity of the institutional environment for water policy:

February 12, 2011

Israel's Natural Gas Find

From the Weekly Standard:
At the end of December, a huge natural gas discovery was confirmed in the Eastern Mediterranean inside Israel’s territorial waters. Once referred to as an “energy island” that not only lacked energy reserves itself but was also cut off from the huge energy resources of the nearby Arab nations, Israel may well become over the next decade an energy exporter. The discovery of the Leviathan gas deposit in the Levant Basin marks a major development for Israel, with the potential for significant economic and strategic advantages, as well as implications for Europe, Russia, and the natural gas market.

Natural gas was first discovered off Israel’s coast in 1999, but the quantity was so small that until recently Israel was still contemplating importing natural gas from Russia by pipeline and liquid natural gas by tanker. Now Jerusalem’s plans are beginning to change. The Leviathan field, discovered by a consortium led by Houston-based Noble Energy, is the world’s largest offshore gas find in the past decade and vaults Israel into the ranks of the largest gas reserve holders in the world. (There are some indications that Leviathan might contain a world-scale oil deposit as well.)

Analysts believe that Leviathan could provide Israel with anywhere from 50 to 200 years of gas, at current levels of consumption, and more than meet growing demand for decades. In a few years Israel will no longer need gas from Egypt, which since 2008 has fueled 16 percent of Israel’s electricity and provided 40 percent of its natural gas. Israel plans to continue to buy Egyptian gas for the purpose of diversification and political ties, but the recent cutoff following sabotage of the gas pipeline in the Sinai highlights the dangers of dependence on Cairo. . . .

Leviathan’s abundance means Israel could export natural gas later this decade, most likely to Europe, which will face a widening gap between supply and demand. The most economical way to export to Europe would be by converting the gas to liquified natural gas (LNG) and shipping it by tanker. An LNG terminal could be built on Israel’s Mediterranean coast, float at sea, or be built in Cyprus. . . .

Even so, it won’t be entirely smooth sailing for Israel. According to the U.S. Geological Survey, Israel has a portion of the Levant Basin, but it is shared by Gaza, Lebanon, Cyprus, and the Turkey-dominated Turkish Republic of Northern Cyprus. Lebanon and Israel have exchanged tough rhetoric over border demarcation, and Beirut has already taken its case to the U.N.—and that’s not the worst of it. If there is another round of hostilities between Israel and Lebanon, a Hezbollah armed with tens of thousands of rockets might well target Israeli gas facilities.

Leviathan will also influence international relations through its impact on the global natural gas market. Israeli gas exports to Europe would compete with, and lead to reduced demand for, Russian gas, and thereby reduce Russia’s political influence in European capitals. And since Israeli gas exports would be priced by the gas market, they would further erode Russia’s beneficial gas export pricing, which has been uniquely pegged to oil prices, which are higher than gas prices. Reflecting Moscow’s interest in protecting its pricing and markets, its gas giant, Gazprom, which once wanted to sell Israel gas through Turkey, now wants to buy part of Israel’s gas fields. Reduced Russian influence in Europe is good for Israel’s chief ally, the United States. Washington has sought to undercut Russia’s dominant supply of natural gas to Europe, which is why it has supported construction of pipelines from Central Asia and the Middle East, like the proposed Nabucco line, that skirt Russia and Iran.

"Without Mubarak There is No Israeli Attack on Iran"

Writing in Haaretz, Aluf Benn argues that Mubarak's departure rules out an Israeli strike on Iran:
Egypt played a key role in the Sunni, the "moderate," axis, which lined up alongside Israel and the United States against Mahmoud Ahmadinejad and his allies in Lebanon, Syria and the Gaza Strip. The toppling of the regime in Cairo does not alter this strategic logic. The revolutionaries at Tahrir Square were motivated by Egyptian national pride and not by their adoration of the Islamic Revolution in Iran. Whoever succeeds Mubarak will want to follow this line, even bolster Egyptian nationalism, and not transform Egypt into an Iranian satellite. This does not mean that Mubarak's successor will encourage Israel to strike the Iranian nuclear installations. On the contrary: he will listen to Arab public opinion, which opposes a preemptive war against Iran. Israel will find it difficult to take action far to the east when it cannot rely on tacit agreement to its actions on its western border.

Without Mubarak there is no Israeli attack on Iran. His replacement will be concerned about the rage of the masses if they see him as a collaborator in such an operation. Whoever is opposed to a strike, or fears its consequences - even those who appear to be in favor, like Netanyahu and Defense Minister Ehud Barak - now has the ultimate excuse. We wanted to strike Iran, they will write in their memoirs, but we could not because of the revolution in Egypt. Just as Ehud Olmert says that he nearly made peace, they will say that they nearly made war. In his departure Mubarak prevented a preemptive Israeli war. This appears to have been his last contribution to regional stability.

February 11, 2011

Water Crisis in the Southwest

From the Stockholm Environmental Institute, a projection of the water shortfall in the Southwestern United States over the next 100 years:

The authors explain:
At today's rates of water use, the Southwest is projected to use 1,303 million acre feet of groundwater in the 100 years starting in 2010 (the blue plus green segments of the top bar in Figure 1); of this a conservatively estimated 260 million acre feet would be overdraft (shown in green). Taking into consideration only baseline growth of population and income, the Southwest's shortfall of water (today's overdraft plus additional water needed beyond today's annual rates, or green plus yellow in Figure 1) reaches 1,815 million acre feet over the 100-year period. Using the B1 climate assumptions – the least climate change that is still thought to be possible – the Southwest's shortfall grows to 2,096 million acre feet (green, yellow, and orange). Under the A2 climate assumptions – the temperature increase expected if the current trend in global greenhouse gas emissions continues – the shortfall reaches 2,253 million acre feet (adding the red segment). This shortfall must be met either from increases to supply (perhaps the most difficult and most expensive options as discussed below), additional groundwater withdrawals, or reductions to use – planned or unplanned.
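Because the quoted totals nest inside one another, the individual colored segments of the authors' stacked bar can be recovered by simple subtraction. A minimal sketch (the segment names follow the colors described in the passage; the decomposition itself is my inference from the quoted totals):

```python
# Recovering the stacked-bar segments from the totals quoted in the passage
# (all figures in million acre-feet over the 100 years starting in 2010).
groundwater_use = 1303      # blue + green: projected groundwater use
overdraft = 260             # green: conservatively estimated overdraft
baseline_shortfall = 1815   # green + yellow: shortfall with baseline growth
b1_shortfall = 2096         # + orange: shortfall under B1 climate assumptions
a2_shortfall = 2253         # + red: shortfall under A2 climate assumptions

blue = groundwater_use - overdraft          # sustainable groundwater use
yellow = baseline_shortfall - overdraft     # added need from population/income growth
orange = b1_shortfall - baseline_shortfall  # increment from B1 climate change
red = a2_shortfall - b1_shortfall           # further increment from A2 climate change
```

Notably, even under the harsher A2 scenario, climate change adds far less to the century-scale shortfall than baseline population and income growth do.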
Update, May 13, 2011. From Miller-McCune, via Energy Bulletin:
Between 1920 and 2000, the seven states that share the Colorado River grew from 5.7 million to almost 50 million people. Peter Gleick, co-founder and president of the Pacific Institute for Studies in Development, Environment, and Security, says 23 million more people will be added by 2030 amid mounting evidence that our current practices in water use and management are unsustainable.

As Gleick points out in “Roadmap for Sustainable Water Resources in Southwestern North America,” it’s not land, energy, mining, or climate that is going to be the most difficult issue to address in the Western United States — it’s water.

So given the Intergovernmental Panel on Climate Change’s warnings about anthropogenic global warming and a well-documented expanding drought, residents of the West are not just living on borrowed water, they’re also living on borrowed time.

February 9, 2011

Wikileaks and Saudi Oil Capacity

The Peak Oilers are abuzz with the latest Wikileaks revelations reported in the UK Guardian, which consist of a series of cables from the U.S. embassy in Saudi Arabia reporting conversations with a well-placed observer of Saudi oil prospects. The confidant, one Sadad al-Husseini, is a geologist and former head of exploration at the Saudi oil giant Aramco. Though al-Husseini is careful to distance himself from the peak oilers and from the views of Matt Simmons regarding an imminent decline in Saudi production, he gives an estimate of Saudi reserves some 40 percent lower than the official figure. In a cable dated December 10, 2007, his views are summarized as follows:
It is al-Husseini's belief that while Aramco can reach 12 million b/d within the next 10 years, it will be unable to meet the goal of 12.5 million b/d by 2009. The former EVP added that sustaining 12 million b/d output will only be possible for a limited period of time, and even then, only with a massive investment program.
According to al-Husseini, the crux of the issue is twofold. First, it is possible that Saudi reserves are not as bountiful as sometimes described and the timeline for their production not as unrestrained as Aramco executives and energy optimists would like to portray. In a December 1 presentation at an Aramco Drilling Symposium, Abdallah al-Saif, current Aramco Senior Vice President for Exploration and Production, reported that Aramco has 716 billion barrels (bbls) of total reserves, of which 51 percent are recoverable. He then offered the promising forecast - based on historical trends - that in 20 years, Aramco will have over 900 billion barrels of total reserves, and future technology will allow for 70 percent recovery.
Al-Husseini disagrees with this analysis, as he believes that Aramco's reserves are overstated by as much as 300 billion bbls of "speculative resources." He instead focuses on original proven reserves, oil that has already been produced or which is available for exploitation based on current technology. All parties estimate this amount to be approximately 360 billion bbls. In al-Husseini's view, once 50 percent depletion of original proven reserves has been reached and the 180 billion bbls threshold crossed, a slow but steady output decline will ensue and no amount of effort will be able to stop it. By al-Husseini's calculations, approximately 116 billion barrels of oil have been produced by Saudi Arabia, meaning only 64 billion barrels remain before reaching this crucial point of inflection. At 12 million b/d production, this inflection point will arrive in 14 years. Thus, while Aramco will likely be able to surpass 12 million b/d in the next decade, soon after reaching that threshold the company will have to expend maximum effort to simply fend off impending output declines. Al-Husseini believes that what will result is a plateau in total output that will last approximately 15 years, followed by decreasing output.

Al-Husseini elaborated that oil field depletion rates also play a significant role in determining the Aramco - and global - production timeline. Increasing output is not simply a function of adding new capacity to already existing operations. Instead, due to depletion rates, new reserves must be brought online to both replace depleted production and satisfy growth in consumption. The International Energy Agency (IEA) has estimated global depletion rates at 4 percent, while a 2006 Aramco statement has estimated Saudi Arabia's overall depletion rate at 2 percent. Al-Husseini estimates that moving forward, satisfying increases in global demand will require bringing online annually at least 6 million b/d of worldwide output, 2 million to satisfy increased demand and 4 million to compensate for declining production in existing fields.
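The 6 million b/d figure follows from simple depletion arithmetic: applying the IEA's 4 percent depletion estimate to roughly 85 million b/d of global supply (the figure the cable cites later) implies 3 to 4 million b/d of existing production must be replaced each year before any demand growth is met. A rough sketch:

```python
# Rough depletion arithmetic behind al-Husseini's 6 million b/d estimate.
# Both inputs are taken from the cable; the result is an order-of-magnitude check.
global_supply = 85e6      # b/d, approximate world output cited in the cable
depletion_rate = 0.04     # IEA's 4 percent global depletion estimate
demand_growth = 2e6       # b/d of new demand per year, per the cable

replacement = global_supply * depletion_rate   # capacity lost to decline each year
new_capacity_needed = replacement + demand_growth
print(f"Replacement needed: {replacement / 1e6:.1f} million b/d per year")
print(f"Total new capacity: {new_capacity_needed / 1e6:.1f} million b/d per year")
```

This lands at roughly 5.4 million b/d, in line with al-Husseini's rounder "at least 6 million b/d."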
A couple of points may be inserted parenthetically here. When the cable was written, the price of oil had just recently reached $100 a barrel. The world was on the cusp of the extraordinarily volatile year of 2008, when the price reached $145 a barrel in the summer and then collapsed in the fall. We're now back to the levels reached in late 2007. In 2008 the International Energy Agency released an estimate of global depletion rates far more alarming (at 9 percent) than the 4 percent figure al-Husseini cited, while also projecting that Saudi oil production would grow to 15.6 mbd by 2030.

The immediate import of the cable was that the increases in the price of oil reflected real constraints and not the influence of speculators.
Considering the rapidly growing global demand for energy - led by China, India and internal growth in oil-exporting countries - and in light of the above mentioned constraints on expanding current capacity, al-Husseini believes that the recent oil price increases are not market distortions but instead reflect the underlying reality that demand has met supply (global energy supply having remained relatively stagnant over the past years at approximately 85 million barrels/day). He estimates that the current floor price of oil, removing all geopolitical instability and financial speculation, is approximately 70 - 75 USD/barrel. Due to the longer-term constraints on expanding global output, al-Husseini judges that demand will continue to outpace supply and that for every million b/d shortfall that exists between demand and supply, the floor price of oil will increase 12 USD. Al-Husseini added that new oil discoveries are insufficient relative to the decline of the super-fields, such as Ghawar, that have long been the lynchpin of the global market. 
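Al-Husseini's pricing rule amounts to a simple linear floor. A sketch, taking the midpoint of his 70 - 75 USD range as the zero-shortfall floor (the midpoint is my assumption; the $12 per million b/d slope is his):

```python
def floor_price(shortfall_mbd, base=72.5, slope=12.0):
    """Al-Husseini's rule of thumb: the floor price of oil rises $12/bbl
    for every 1 million b/d by which demand exceeds supply.
    base is the midpoint of his 70-75 USD zero-shortfall range (an assumption)."""
    return base + slope * shortfall_mbd

for s in (0, 1, 2):
    print(f"shortfall {s} million b/d -> floor ~${floor_price(s):.2f}/bbl")
```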
While al-Husseini believes that Saudi officials overstate capabilities in the interest of spurring foreign investment, he is also critical of international expectations. He stated that the IEA's expectation that Saudi Arabia and the Middle East will lead the market in reaching global output levels of over 100 million barrels/day is unrealistic, and it is incumbent upon political leaders to begin understanding and preparing for this "inconvenient truth." Al-Husseini was clear to add that he does not view himself as part of the "peak oil camp," and does not agree with analysts such as Matthew Simmons. He considers himself optimistic about the future of energy, but pragmatic with regards to what resources are available and what level of production is possible. While he fundamentally contradicts the Aramco company line, al-Husseini is no doomsday theorist. His pedigree, experience and outlook demand that his predictions be thoughtfully considered.

The second issue that will limit any proposed Aramco output expansion can be broadly defined as a lack of supporting resources. For example, in al-Husseini's estimation, it is not the amount of oil available that will prevent Aramco from reaching 12.5 million b/d by 2009, but rather issues such as a lack of available skilled engineers, a shortage of experienced construction companies, insufficient refining capacity, underdeveloped industrial infrastructure, and a need for production management (if too much oil is extracted from a well without proper planning and technique, a well's potential output will be significantly damaged). As previously reported by post (Reftel), the Eastern Province economy is facing severe industrial expansion limits, and despite Aramco's willingness to invest up to 50 billion USD to achieve the 2009 goal, availability of labor, materials and housing may end up as determinative factors.

All in all, al-Husseini's views seem to split the difference between the "peak oilers" and the CERA projections. He gives aid and comfort to the peak oil camp by sharply reducing the estimates of Saudi reserves, but on the other hand seems to accept the CERA view of an undulating plateau of oil production that will be reached in the next decade or so. His estimate that the Saudis will only be able to produce 12 million barrels per day is, on the one hand, well short of the estimates contained in the most recent IEA report (which projected the Saudis would be able to add 5 mbd to their existing capacity over the 2009-2035 period) but, on the other hand, far in excess of the immediate crisis projected by Simmons in his Twilight in the Desert. Saudi Arabia's current production is a bit short of 8.5 mbd; it is uncertain how much spare capacity they have now (though official estimates put it at 2-3 mbd). In its most recent forecast, CERA also makes the point, like al-Husseini, that constraints to increasing production are technological, not geological, at least in the near term (though CERA is referring to the global production of oil liquids more generally rather than, as with al-Husseini, the prospects for Saudi production).

The Early Warning blog (Stuart Staniford) regularly updates Saudi production. Here is EW's latest estimate. "The graph shows five different data sources, together with an average index (thick black line) which summarizes the various data sources into something that is hopefully within hailing distance of the truth."

Staniford also includes a graph of the rig count in Saudi Arabia. The decline since 2008 he interprets as indicating that the Saudis are comfortable with their current level of spare capacity (whatever that is), though it might also be related to the technical constraints that al-Husseini details.

The Guardian also reports the conclusions of some cables sent subsequently.
Seven months later, the US embassy in Riyadh went further in two more cables. "Our mission now questions how much the Saudis can now substantively influence the crude markets over the long term. Clearly they can drive prices up, but we question whether they any longer have the power to drive prices down for a prolonged period."
A fourth cable, in October 2009, claimed that escalating electricity demand by Saudi Arabia may further constrain Saudi oil exports. "Demand [for electricity] is expected to grow 10% a year over the next decade as a result of population and economic growth. As a result it will need to double its generation capacity to 68,000MW in 2018," it said.
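The cable's claim that 10 percent annual growth forces a doubling of generating capacity within a decade is consistent with compound-growth arithmetic: at 10 percent per year, demand doubles in roughly seven years. A sketch:

```python
import math

growth = 0.10  # 10% annual electricity demand growth, per the cable

# Doubling time under compound growth: solve (1 + g)^n = 2 for n.
doubling_years = math.log(2) / math.log(1 + growth)
print(f"Doubling time at 10%/yr: {doubling_years:.1f} years")

# Demand after n years, relative to today:
for n in (5, 7, 10):
    print(f"after {n} years: {(1 + growth) ** n:.2f}x")
```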

It also reported major project delays and accidents as "evidence that the Saudi Aramco is having to run harder to stay in place – to replace the decline in existing production."
Al-Husseini's views had been reported before, so, as The Oil Drum observed, it was nothing new to specialists that he repeated them to the American Embassy. Perhaps the most interesting revelation is the strategic conclusion reached by the US Embassy that Saudi Arabia can drive prices up but cannot keep them down.

Al-Husseini himself subsequently issued a press release in which he claimed that US officials had misinterpreted what he was saying and that the summary included "many patently inaccurate statements."
I do not and did not question in any manner the reported reserves of Saudi Aramco which are in fact based on the highest levels of sound and well established engineering and economic principles and practices. Since Saudi Aramco’s proven oil reserves are 260 billion barrels, there is no way I could have said they are in error by 300 billion barrels, a number that exceeds the actual reserves estimate itself. . . . [Note: The US representative evidently mistook Saudi reserves for world reserves.]

In regards to Saudi Aramco’s oil production capacity, the giant multi-billion dollar expansion projects which were funded in recent years are now all a visible reality for the whole industry to see across the Saudi oil fields. . . .All these oil production projects were completed by the end of 2009 and the Kingdom’s total oil production capacity does in fact now stand firmly at 12.5 million barrels per day.
All clear now?

February 7, 2011

Doubts About the Shale Revolution

Doubts about the cost, efficiency and reserves of shale plays, writes Arthur Berman, "are shared among many petroleum industry scientists and financial analysts despite the enthusiasm for these plays by large public companies."
Critics of our position on shale gas plays have focused on methods of decline-curve analysis, and the projections of estimated ultimate recovery (EUR) that result. The problem with this debate from all sides is that we are uncertain about how to apply decline models to newer shale plays because there is insufficient production history to satisfy all of our questions. I will, therefore, focus on some stubborn facts about Barnett Shale cumulative production and approaches to play development.

Major operators claim that their average Barnett EUR will reach 2.2-3.3 Bcf/well. Figure 1 shows that those levels of EUR are unlikely to occur in an economically meaningful timeframe based on cumulative production to date. Figure 2 shows that well performance has been erratic since operators began drilling horizontal wells, though the trend has been improving slightly in recent years. This is probably due to drilling outside of what are now known to be the core areas. The "manufacturing" paradigm that is prevalent in shale plays has led many companies to assume that all areas in the Barnett Shale and other plays are uniformly attractive.

Shale plays typically begin with a leasing frenzy whereby major players accumulate hundreds of thousands of acres, often at astronomical bonus prices. Next, a drilling campaign ensues driven more by lease expiration schedules—typically in the 3-year range—than by science. Only after considerable capital has been destroyed in this manner are the core areas recognized. This "Braille method" is completely opposite to the customary approach to E&P projects, where a cautious approach based on science is used to high-grade focus areas.

The methods used to obtain decline rates and reserve estimates for shale plays presented in this column employ best practices in the petroleum industry. Yet a group of professionals believe that some shale plays are exceptions to the methods of decline-curve analysis established by peer-reviewed papers published by the Society of Petroleum Engineers (SPE). It does not seem logical that type-curve methods should be more reliable than individual well decline-curve analysis. If the pattern of well decline is empirically exponential, it makes no sense that it should be treated as hyperbolic for conceptual reasons or because of a preference based on production from higher permeability reservoirs that are not comparable to those in the Barnett or other recent shale gas plays.
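Berman's point about exponential versus hyperbolic decline can be made concrete. The sketch below uses the standard Arps rate equations with illustrative, invented parameters (not Barnett data): treating the same early production as hyperbolic (here b = 1.5) rather than exponential multiplies the 30-year recovery several times over, which is why the choice of model dominates EUR debates.

```python
import math

def exp_rate(qi, D, t):
    """Arps exponential decline: q(t) = qi * e^(-D t). qi in MMcf/d, t in years."""
    return qi * math.exp(-D * t)

def hyp_rate(qi, D, b, t):
    """Arps hyperbolic decline: q(t) = qi / (1 + b D t)^(1/b)."""
    return qi / (1.0 + b * D * t) ** (1.0 / b)

def cum_bcf(rate, years, steps_per_year=2000):
    """Trapezoidal integration of a daily rate (MMcf/d) into cumulative Bcf."""
    dt = 1.0 / steps_per_year
    total = 0.0
    for i in range(int(years * steps_per_year)):
        total += 0.5 * (rate(i * dt) + rate((i + 1) * dt)) * dt * 365.0
    return total / 1000.0  # MMcf -> Bcf

# Invented illustrative parameters: initial rate 2 MMcf/d, 60%/yr initial decline.
qi, D, b = 2.0, 0.6, 1.5
eur_exp = cum_bcf(lambda t: exp_rate(qi, D, t), 30)
eur_hyp = cum_bcf(lambda t: hyp_rate(qi, D, b, t), 30)
print(f"30-yr recovery, exponential: {eur_exp:.2f} Bcf")
print(f"30-yr recovery, hyperbolic:  {eur_hyp:.2f} Bcf")
```

Note that with b greater than 1 the hyperbolic cumulative grows without bound as time extends, which is one reason critics like Berman object to applying hyperbolic type curves to wells whose observed declines look empirically exponential.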

We recognize that it may take many years before true pseudo-steady-state flow is reached. But in the Barnett, decline trends are well developed in thousands of wells, and we must forecast reserves based on those trends, and not on some future, model-driven expectation of flattening decline rates. Let me be clear. We do not dispute the volume of gas resources claimed by operators. We do question the reserves that, by definition, must be commercial on a full-cycle economic basis.

The time has come for the companies that operate in the shale plays to show the data that supports their optimistic forecasts for natural gas supply in the US. The economic viability of shale gas is a serious issue with profound implications for capital investment, alternate energy research funding and national policy. To simply say that those that have doubts about shale plays are wrong will no longer satisfy the many intelligent people who follow this debate.

Stuxnet and the Safety of Iran's Bushehr Reactor

From the Associated Press, January 31, 2011:
The control systems of Iran's Bushehr nuclear plant have been penetrated by a computer worm unleashed last year, according to a foreign intelligence report that warns of a possible Chernobyl-like disaster once the site becomes fully operational.
Russia's envoy to NATO, Dmitry Rogozin, also has raised the specter of the 1986 reactor explosion in Ukraine, but suggested last week that the danger had passed.

The report, drawn up by a nation closely monitoring Iran's nuclear program and obtained by The Associated Press, said such conclusions were premature and based on the "casual assessment" of Russian and Iranian scientists at Bushehr.

With control systems disabled by the virus, the reactor would have the force of a "small nuclear bomb," it said.

"The minimum possible damage would be a meltdown of the reactor," it says. "However, external damage and massive environmental destruction could also occur ... similar to the Chernobyl disaster."

The virus, known as Stuxnet, has the ability to send centrifuges spinning out of control and temporarily crippled Iran's uranium enrichment program. Some computer experts believe Stuxnet was the work of Israel or the United States, two nations convinced that Iran wants to turn nuclear fuel into weapons-grade uranium.

Iran has acknowledged that the malware — malicious software designed to infiltrate computer systems — hit the laptops of technicians working at Bushehr, but has denied that the plant was affected or that Stuxnet was responsible for delays in the startup of the Russian-built reactor.

The Islamic Republic is reluctant to acknowledge setbacks to its nuclear activities, which it says are aimed at generating energy but are under U.N. sanctions because of concerns they could be channeled toward making weapons. Only after outside revelations that its enrichment program was temporarily disrupted late last year by the mysterious virus did Iranian officials acknowledge the incident.

Ali Asghar Soltanieh, Iran's chief representative to the IAEA, cut short attempts by AP to seek comment on possible damage by Stuxnet at Bushehr.

But Rogozin, the Russian envoy, described how engineers at Bushehr "saw on their screens that the systems were functioning normally, when in fact they were running out of control," conjuring up a frightening dimension to the potential fallout from the virus.

"The virus which is very toxic, very dangerous, could have had very serious implications," Rogozin told reporters, adding it "could have led to a new Chernobyl."

Experts are split on how powerful the Stuxnet virus might prove.

Olli Heinonen, who retired last year as head of investigations of Iran's nuclear programs at the International Atomic Energy Agency, believes it could have infected control systems at Bushehr, or elsewhere, causing "a lot of havoc."

But German cybersecurity researcher Ralph Langner says that, while the virus has infested the reactor's computers, "Stuxnet cannot technically mess with the systems in Bushehr.

"Bottom line: A thermonuclear explosion cannot be triggered by something like Stuxnet," said Langner, who has led research into Stuxnet's effects on the Siemens equipment running Iran's nuclear programs.

A spokeswoman for Atomstroyexport, the Russian company in charge of construction at Bushehr, also cast doubt on there being major damage at the plant, saying its control system is fully autonomous and virus-proof.

The IAEA — the U.N. monitor of Iran's nuclear activities — declined comment on damage at Bushehr. But officials, who asked for anonymity because they were not authorized to discuss the issue, have said the agency is unhappy with safety and operating standards at the reactor.
Update: On February 25, the New York Times reported further difficulties at Bushehr:
Iran told atomic inspectors this week that it had run into a serious problem at a newly completed nuclear reactor that was supposed to start feeding electricity into the national grid this month, raising questions about whether the trouble was sabotage, a startup problem, or possibly the beginning of the project’s end.
In a report on Friday, the International Atomic Energy Agency said Iran told inspectors on Wednesday that it was planning to unload nuclear fuel from its Bushehr reactor — the sign of a major upset. For years, Tehran has hailed the reactor as a showcase of its peaceful nuclear intentions and its imminent startup as a sign of quickening progress.

But nuclear experts said the giant reactor, Iran’s first nuclear power plant, now threatens to become a major embarrassment, as engineers remove 163 fuel rods from its core.

Iran gave no reason for the unexpected fuel unloading, but it has previously admitted that the Stuxnet computer worm infected the Bushehr reactor. On Friday, computer experts debated whether Stuxnet was responsible for the surprising development.

Russia, which provided the fuel to Iran, said earlier this month that the worm’s infection of the reactor should be investigated, arguing that it might trigger a nuclear disaster. Other experts said those fears were overblown, but noted that the full workings of the Stuxnet worm remained unclear.

In interviews Friday, nuclear experts said the trouble behind the fuel unloading could range from minor safety issues and operational ineptitude to serious problems that would bring the reactor’s brief operational life to a premature end.

“It could be simple and embarrassing all the way to ‘game over,’ ” said David A. Lochbaum, a nuclear engineer at the Union of Concerned Scientists and a former official at the Nuclear Regulatory Commission, which oversees nuclear reactors in the United States.

Mr. Lochbaum added that having to unload a newly fueled reactor was “not unprecedented, but not an everyday occurrence.” He said it happened perhaps once in every 25 or 30 fuelings. In Canada, he added, a reactor was recently fueled and scrapped after the belated discovery of serious technical problems.

“This could represent a substantial setback to their program,” David Albright, president of the Institute for Science and International Security, a private group in Washington that tracks nuclear proliferation, said of the problem behind the Bushehr upset.

“It raises questions of whether Iran can operate a modern nuclear reactor safely,” he added. “The stakes are very high. You can have a Chernobyl-style accident with this kind of reactor, and there’s lots of questions about that possibility in the region.”

The new report from the I.A.E.A. — a regular quarterly review of the Iran nuclear program to the agency’s board — gave the reactor unloading only brief mention and devoted its bulk to an unusually toughly worded indictment of Iranian refusals to answer questions about what the inspectors called “possible military dimensions” of its nuclear program.

The report alluded to “new information recently received,” suggesting continuing work toward a nuclear warhead.

But the inspectors provided no details about the new information or how it was received. The I.A.E.A. frequently gets its data from the intelligence agencies of member countries, including the United States, but it also tries to collect data from its own sources.

The report on Friday referred directly to concerns that Iran was working on “the development of a nuclear payload for a missile.” But it noted that all of its requests for information had been ignored for years, with Iranian officials arguing that whatever information the agency possessed, it was based on forgeries.

The White House said Friday that the report cast new light on what it called Iran’s covert movement toward nuclear arms.

“The I.A.E.A.’s reports of obstruction and Iran’s failure to cooperate are troubling,” said Tommy Vietor, spokesman for the National Security Council. “We will continue to hold Iran accountable to its international nuclear obligations, including by deepening the international pressure on Iran.”

The reactor is located outside the Iranian city of Bushehr on the nation’s Persian Gulf coast. Priced at more than a billion dollars, it is ringed by dozens of antiaircraft guns and large radar stations meant to track approaching jets.

Its tangled history began around 1975 with a West German contract. After the Islamic Revolution in 1979, the West Germans withdrew. Iraq repeatedly bombed the half-built reactor between 1984 and 1988.

Iran signed a rebuilding accord with Russia in 1995 that should have had the project completed in 1999. But the plan bogged down in long delays.

The United States once opposed the plant. But Washington dropped its objections after Russia agreed to take back the spent rods, removing the possibility that Iran could reprocess them for materials that could fuel nuclear arms.

The loading of uranium fuel into the reactor was initially planned to start soon after its shipment to Bushehr last August, but was delayed by what the Iranians said was a leak in a pool near the central reactor.

In October, Iranian officials said the Stuxnet worm had infected the reactor complex, but they played down the issue. Mohammad Ahmadian, an Iranian Atomic Energy Organization official, said the affected computers had been “inspected and cleaned up.”

Later in October, as the fueling at last got under way, after three decades of delay, the head of Iran’s Atomic Energy Organization, Ali Akbar Salehi, called the Bushehr reactor “the most exceptional power plant in the world.”

In December, he predicted that the plant would be connected to the national power grid by Feb. 19. “This phase,” he said, according to The Tehran Times, “is the most important operational work of the plant.”

In an interview on Friday, a European diplomat familiar with Iran’s nuclear program called the fueling problem a major setback, even if the technical cause proves to be less than monumental.

“It’s clearly a significant setback to the startup of the reactor,” said the diplomat, who spoke on the condition of anonymity because of the diplomatic delicacy of the matter.

He said that engineers at Bushehr had identified a technical failure, but were struggling to understand its cause.

“It’s too early to know,” the diplomat said. “I’m sure the Iranians are studying that question quite desperately.”
The whole subject remains shrouded in mystery, but if the Stuxnet virus is the cause of Iran's difficulties it does throw an interesting light on Albright's comment. Did 9/11 show that the United States had difficulty operating a modern skyscraper safely?