July 28, 2011

If At First You Don't Succeed, Give Up

Perhaps that is too harsh, but a new report from the Breakthrough Institute on the failure of climate mitigation strategies seems to suggest as much.

It is true that the global strategy to curb emissions through international agreements (Kyoto and Copenhagen) has been a bust, but it surely does not follow that an indirect approach, with climate mitigation as a secondary effect, will be successful.

Nor does the unlikelihood of international agreement diminish the case for carbon or other taxes on energy consumption--in fact, the United States is very much the outlier among the advanced democracies in its religious aversion to energy taxes. Breakthrough has very interesting things to say about the nature of technological innovation; however, it does not take a genius to see that a two-track approach, combining subsidies and taxes, is better than one that relies on subsidies alone.

The problem with subsidies is that they are likely to be allocated via political weight rather than intrinsic merit; Breakthrough recognizes the problem but can only resolve it by recommending that the process for allocating subsidies be undertaken by disinterested experts (presumably, themselves) rather than the body (Congress) that is to vote the funds.

From the Breakthrough Institute report, Climate Pragmatism: Innovation, Resilience, and No Regrets:
The old climate framework failed because it would have imposed substantial costs associated with climate mitigation policies on developed nations today in exchange for climate benefits far off in the future -- benefits whose attributes, magnitude, timing, and distribution are not knowable with certainty. Since they risked slowing economic growth in many emerging economies, efforts to extend the Kyoto-style UNFCCC framework to developing nations predictably deadlocked as well.
The new framework now emerging will succeed to the degree to which it prioritizes agreements that promise near-term economic, geopolitical, and environmental benefits to political economies around the world, while simultaneously reducing climate forcings, developing clean and affordable energy technologies, and improving societal resilience to climate impacts. This new approach recognizes that continually deadlocked international negotiations and failed domestic policy proposals bring no climate benefit at all. It accepts that only sustained effort to build momentum through politically feasible forms of action will lead to accelerated decarbonization.
Here is a summation of an earlier Breakthrough Institute report on American innovation:

Driving directions from your iPhone. The cancer treatments that save countless lives. The seed hybrids that have slashed global hunger. A Skype conversation while flying on a Virgin Airlines jet across the continent in just five hours.

Where did these everyday miracles come from? As soon as the question is asked we know to suspect that the answer is not as simple as Apple, Amgen, or General Electric. We might recall something about microchips and the Space Race, or know that the National Institutes of Health funds research into new drugs and treatments.

But most of us remain unaware of the depth and breadth of American government support for technology and innovation. . . .

Where do good technologies come from?

One answer is visionary presidents. From George Washington to George W. Bush, under presidents both Republican and Democrat, the unbroken history of American innovation is one of active partnership between public and private sectors. Washington helped deliver interchangeable parts, which revolutionized manufacturing. Lincoln, the railroads and agricultural centers at land grant colleges. Eisenhower, interstate highways and nuclear power; Kennedy, microchips. But some of America's most important technologies came out of programs that spanned multiple presidents, as in the case of medical and biotechnology research; President Richard Nixon launched the quest to cure cancer in 1971, while funding for the National Institutes of Health tripled under Presidents Bill Clinton and George W. Bush.

Another answer is war. Interchangeable parts were developed at public armories, originally for rifles. One hundred and fifty years later, microchips, computing, and the Internet were created to guide rockets and communicate during nuclear war; today those technologies power our laptops and smartphones.

But outside of war, the United States has made decades-long investments in medicine, transportation, energy, and agriculture that resulted in blockbuster drugs, railroads and aviation, new energy technologies, and food surpluses.

America's brilliant inventors and firms played a critical role, but it is the partnerships between the state and private firms that delivered the world-changing technologies that we take for granted today.

There may be no better example of the invisible hand of government than the iPhone.

Launched in 2007, the iPhone brought many of the now familiar capabilities of the iPod together with other communications and information technologies made possible by federal funding:

    The microchips powering the iPhone owe their emergence to the U.S. military and space programs, which constituted almost the entire early market for the breakthrough technology in the 1960s, buying enough of the initially costly chips to drive down their price by a factor of 50 in a few short years.

    The early foundation of cellular communication lies in radiotelephony capabilities progressively advanced throughout the 20th century with support from the U.S. military.

    The technologies underpinning the Internet were developed and funded by the Defense Department's Advanced Research Projects Agency in the 1960s and 70s.

    GPS was originally created and deployed by the military's NAVSTAR satellite program in the 1980s and 90s.

    Even the revolutionary multitouch screen that provides the iPhone's intuitive interface was first developed by University of Delaware researchers supported by grants and fellowships provided by the National Science Foundation and the CIA.

The iPhone is emblematic of the public-private partnerships that have driven America's technological leadership. Historically, this partnership has taken two general forms. First, the government has long acted as an early funder of key basic science and applied research and development. So it was in agriculture, when the government created new land-grant colleges and expanded funding for agricultural science, leading to the development of new and better crops. In medicine, many of today's blockbuster drugs can trace their existence to funding from the National Science Foundation (NSF) and the National Institutes of Health (NIH).

In addition to providing robust funding for new scientific discovery and technological advancement, the government has also routinely helped develop new industries by acting as an early and demanding customer for innovative, high-risk technologies that the private sector was unable or unwilling to fund. Military procurement during and after World War I helped America catch up to its European rivals in aerospace technology and was key to the emergence of the modern aviation industry. Decades later, the modern semiconductor and computer industries were created with the help of government procurement for military and space applications.

The case studies herein also demonstrate that when this vital partnership between the public and private sector is severed, so too is American economic leadership. Once a global leader in wind and solar energy technology, the United States faltered and never fully recovered as public support ceased and other governments - Denmark, Germany, and Japan - increased their investments and stepped in to assume the mantle of leadership in the emerging sectors. . . .

July 26, 2011

Third Iranian Scientist Assassinated

From Xinhua:
Iranian officials on Sunday accused the United States and Israel of being involved in the assassination of an Iranian academic who was shot dead on Saturday in front of his house.

Iran's Majlis (parliament) Speaker Ali Larijani called the assassination of an Iranian scholar a "U.S.-Zionist terrorist act." "Yesterday's U.S.-Zionist terrorist act that targeted one of the elites of Iran is another instance demonstrating the U.S. hostility (toward Iran)," Larijani was quoted as saying by the satellite Press TV Sunday.
Larijani said that "the U.S. has resorted to terrorist acts because of failing in its adventures in the region." The Iranian speaker also called on Iran's security forces to respond to terrorists with "stronger will," said the report.
Also, the rapporteur of the National Security and Foreign Policy Commission of Iran's parliament, Kazzem Jalali, condemned the assassination of the Iranian scientist and held the U.S. and Israel responsible for the killing of the Iranian elites, local Fars news agency reported. "Assassination of the scientific elites means that enemies of the Iranian nation are desperate," Jalali told reporters on Sunday.
"32 years after the victory of the Islamic Revolution, the Iranian nation is now well informed of the identity of those who are behind such terrorist attacks," Jalali was quoted as saying. "They (Iranians) know that Americans and Zionists are behind the assassination of Iran's scientific elites," he added. . . .
On Saturday, local media reported that "Dariush Rezaei," a PhD in nuclear physics and a university professor, was killed by motorcyclists in front of his home in eastern Tehran. According to the reports, the assassinated scientist was an expert linked to Iran's Atomic Energy Organization and a nuclear physics professor at Mohaqeq Ardebili University in Iran's northwestern Ardebil city. Later reports, however, corrected the earlier news, saying that "Dariush Rezaeinejad," 35, worked for the Iranian Defense Ministry. The victim majored in electro-techniques at Tehran's Khajeh Nasir University, Fars said on Sunday.
No group or individual has claimed responsibility for the attack and no reports have been issued about the arrests of the suspects so far.

In January 2010, the Iranian nuclear scientist and physics professor at the prestigious University of Tehran, Massoud Ali-Mohammadi, was killed by a remote-controlled bomb attached to a motorbike parked near his house.
In November 2010, another Iranian nuclear scientist, Majid Shahriari, was killed by a bomb attached to his car on his way to work.
In January, Iran said it dismantled an Israeli spying network and arrested a group of people who were linked to the assassination of its nuclear scientist Massoud Ali-Mohammadi.
Iran's Foreign Ministry Spokesman Ramin Mehmanparast said that Iran would legally sue Israel over assassinating the nuclear scientist.
Also that month, Iranian Vice President Nasrin Soltankhah announced that Iran had formed security teams to protect its nuclear scientists. Protecting the Iranian scientists was being taken seriously by the country's security services, Soltankhah said, adding that the scientists were also trained to confront possible dangers that may threaten their lives. The Iranian authorities have been accusing the United States and Israeli intelligence services of being behind the assassinations. . . .
DCH:

I find these acts immoral, illegal, and reckless. They should be condemned as vehemently as any terrorist action. Plainly, they are acts of war. The sequence of assassinations is another instance of the defenders of civilization resorting to the most odious and barbaric of means. And it hardly arouses a flutter of comment in America.

July 25, 2011

Mid-Continental U.S. Oil Production to Soar, Says Barron's

In a bullish piece on the mid-continental refiners, Barron's relates some interesting predictions about expected increases in oil production in the mid-continent, including such areas as the Bakken region of North Dakota, the Permian basin in Texas, and the Alberta oil sands. One analyst estimates that mid-continental production will grow from 4.7 million barrels a day now to 7.6 mbd by 2020; another says that the spread between West Texas Intermediate and Brent could reach $50 in 2012.
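A quick cross-check of that production figure (a rough sketch using only the numbers cited; the straight-line growth and the 2011 start year are my assumptions, not the analyst's):

    # Cross-check: does 300,000 b/d of added production each year get from
    # 4.7 to 7.6 million barrels per day by 2020?
    # Assumptions (mine): growth starts in 2011 and is linear.
    current_mbd = 4.7          # million barrels per day today
    annual_growth_mbd = 0.3    # 300,000 b/d added per year (the analyst's figure)
    years = 2020 - 2011        # nine years of growth

    projected_mbd = current_mbd + annual_growth_mbd * years
    print(f"Projected 2020 output: {projected_mbd:.1f} million barrels per day")
    # -> 7.4 mbd, close to the 7.6 mbd cited; the small gap suggests the
    #    analyst assumes slightly faster growth or a different start point.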

Oil from these regions is flowing through pipelines into the middle of the country, but can't easily reach the Gulf, Pacific or Atlantic coasts, where higher world-market oil prices prevail. There is ample pipeline capacity to bring oil into the center of the country, but limited capacity to take it out. All the trapped oil is giving mid-continent refiners a cheap crude source.

There are plans for pipelines to bring crude out of the mid-continent to the Gulf Coast. The planned Keystone XL pipeline would run from Canada to the Houston area with capacity of more than 500,000 barrels per day, but environmental opposition could kill Keystone and potentially delay other pipelines. There isn't enough other transportation to move out a sizable amount of oil.

Paul Sankey, Deutsche Bank's energy analyst, wrote recently that the spread between WTI and Brent crude "can stay wider for much longer than the market currently anticipates." He argues that oil production in Canada and the mid-continental U.S. could rise by 300,000 barrels per day each year through 2020. Current daily production is 4.7 million barrels per day, potentially increasing to 7.6 million barrels per day by 2020. Morgan Stanley's Calio says the gap between WTI and a Gulf Coast crude similarly priced to Brent could get as wide as $50 per barrel in 2012, given all the new domestic and Canadian production.



Even as politicians debate U.S. energy policy, oil and natural-gas companies have been actively drilling in the continental U.S. As a result, gas production has surged in recent years, depressing prices, while oil output also has increased, lifting U.S. production to near a 10-year high despite declining output in decades-old Alaskan fields and drilling restrictions in the Gulf of Mexico. . . .

Credit the surge in crude supply to so-called unconventional oil sources in shale formations in the Bakken and Permian regions that are being accessed through horizontal drilling and controversial hydraulic fracturing, or hydrofracking techniques like those used in tapping large new domestic sources of natural gas. The Bakken region, for instance, could be producing one million barrels a day in five years, triple the current output. Oil production from the Alberta oil sands also is rising steadily and flowing south. . . .

* * *

Platts says that an even bigger play exists in the Monterey Shale in California, not mentioned in the Barron's piece:
In the Energy Information Administration's recently released "Review of Emerging Resources," the federal agency pegs the Monterey play at 15.42 billion barrels of technically recoverable oil, compared to the Bakken's 3.59 billion and the Eagle Ford's 3.35 billion.

The Monterey shale overlaps much of California's traditional crude producing regions. It generally runs in two swaths: a roughly 50-mile wide ribbon running the length of the San Joaquin Valley and coastal hills, and a Pacific Coast strip of similar width between Santa Barbara and Orange County.

Add to the mix an improved regulatory climate with faster permitting, an uptick in rig counts, and increased interest from independents, and California's oil fields have become "a great place for investors to look," McPherson said. "Their day in the sun is about to come."

Challenges persist, however, such as community resistance to new drilling. . . .

July 24, 2011

Two Wrongs Do Make a Right


From Yale Environment 360, "Stratospheric Pollution Is Slowing Global Warming, Study Says":

Sulfur dioxide being spewed into the stratosphere by coal-fired power plants and volcanic eruptions has blunted the impact of global warming over the past decade, offsetting roughly one third of the increased heat caused by greenhouse gas emissions, according to a new study. Researchers at the U.S. National Oceanic and Atmospheric Administration (NOAA) found that stratospheric aerosols formed by sulfur dioxide have nearly doubled in the past decade, and that those aerosols have been reflecting a significant amount of heat back into space. “Aerosols acted to keep warming from being as big as it would have been,” said NOAA atmospheric scientist John Daniel, a co-author of the study published online in Science. Much of the sulfur dioxide that rose into the stratosphere six miles above the Earth came from coal-fired power plants, the study said. The aerosol finding, along with a weaker sun due to the most recent solar cycle, may help explain why global warming has not accelerated as rapidly in the past decade as it did in the 1990s. But scientists say that if major polluters such as China add better scrubbing technology to their power plants, then aerosols in the stratosphere are likely to decrease, which could accelerate warming.

July 23, 2011

Classical Economics and the Limits to Growth

The following extract comes from an essay by Tony Wrigley, a Cambridge economic historian who has a new book, Energy and the English Industrial Revolution (2010).

The most fundamental defining feature of the industrial revolution was that it made possible exponential economic growth – growth at a speed that implied the doubling of output every half-century or less. This in turn radically transformed living standards. Each generation came to have a confident expectation that they would be substantially better off than their parents or grandparents. Yet, remarkably, the best informed and most perspicacious of contemporaries were not merely unconscious of the implications of the changes which were taking place about them but firmly dismissed the possibility of such a transformation. The classical economists Adam Smith, Thomas Malthus, and David Ricardo advanced an excellent reason for dismissing the possibility of prolonged growth.

They thought in terms of three basic factors of production, i.e. land, labour, and capital. The latter two were capable of indefinite expansion in principle but the first was not. The area of land which could be used for production was limited, yet its output was basic – not just to the supply of food but of almost all the raw materials which entered into material production. This was self-evidently true of animal and vegetable raw materials – wool, cotton, leather, timber, etc. But it was also true of all mineral production since the smelting of ores required much heat and this was obtained from wood and charcoal. Expanding material production meant obtaining a greater volume of produce from the land but that in turn meant either taking into cultivation land of inferior quality, or using existing land more intensively, or both. This necessarily meant at some point that returns both to capital and labour would fall. In short, the very process of growth ensured that it could not be continued indefinitely. This was a basic characteristic of all “organic” economies, those which were universal before the industrial revolution. Adam Smith summarised the problem as follows:

In a country which had acquired that full complement of riches which the nature of its soil and climate, and its situation with respect to other countries, allowed it to acquire; which could, therefore, advance no further, and which was not going backwards, both the wages of labour and the profit of stock would probably be very low. (Smith 1789)

He went on to spell out in greater detail what his statement implied for the living standards of the bulk of the population and for the return on capital. When Ricardo tackled the same issue he came to the same conclusion and was explicit in insisting that the resulting situation “will necessarily be rendered permanent by the laws of nature, which have limited the productive powers of the land” (Ricardo 1817). . . .

Access to energy that did not spring from the annual product of plant photosynthesis was a sine qua non for breaking free from the constraints afflicting all organic economies. By an intriguing paradox, this came about by gaining access to the products of photosynthesis stockpiled over a geological time span. It was the steadily increasing use of coal as an energy source which provided the escape route. . . .

In recent years, it has become possible to quantify the phasing and scale of the energy revolution since scholars in a number of European countries have agreed a common set of conventions for the description and measurement of energy consumption. They have produced illuminating data. . . . [A]s more information becomes available for other European countries the strong similarities between all countries whose economies remained organic is striking. None could break free from the constraint which Adam Smith described unless they turned to the accumulated product of photosynthesis in the past rather than depending on the annual cycle of current photosynthesis. . . . .

The implications of the new age, which were invisible to the classical economists, were only fully appreciated much later by the generation of Karl Marx. He and his contemporaries saw clearly that output was rising rapidly and that it was reasonable to expect it to continue to do so, though they differed widely about the implications of this new situation.

* * *

In Greek mythology, Pandora was created by Zeus to enable him to punish Prometheus for having stolen fire from the sun to animate his man of clay. Zeus intended that Pandora should marry Prometheus and had given her a jar with the instruction that she should present the jar to the man she married. She was ignorant of its contents. Prometheus was suspicious and repulsed her. She instead married his brother Epimetheus, who ignored a warning about acting imprudently and opened the jar. In so doing he released into the world a host of previously unknown and malign forces.

The story has parallels with the occurrence of the industrial revolution. Contemporaries were not aware of the radical and irreversible nature of the changes which were in train. The analogy is not, of course, exact. On balance, the forces released by the industrial revolution may be thought beneficial rather than malign but the balance is a fine one.

Every increase in the powers of production has been offset by a matching increase in the powers of destruction, exemplified perhaps most vividly by the atomic bomb. And the possible impact of the massive increase in the burning of fossil fuels on the environment may also call in question the future stability of the gains which have been made in productive power.

The great bulk of the literature about the industrial revolution has been devoted to explaining how it began. This has been to the neglect of the equally important question of why the growth did not grind to a halt as all previous experience suggested was inevitable. It is in this context that the history of energy usage is critical to the understanding of the changes which took place.

Societies whose productive capacities were limited by the annual product of photosynthesis operated within severe and seemingly immovable constraints. Societies which switched to depending on the stored products of photosynthesis in the form of fossil fuels were released from these constraints, though whether the immense benefits which were thus made possible will prove durable and stable remains an open question.
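An arithmetic aside on Wrigley's benchmark of output "doubling every half-century or less": here is a minimal sketch of the implied annual growth rate (the 50-year doubling period is the only input; the rest is standard compound-growth algebra).

    # Compound growth: output doubles in T years when (1 + r)**T = 2,
    # i.e. r = 2**(1/T) - 1. Wrigley's benchmark is T = 50 years or less.
    doubling_years = 50
    annual_rate = 2 ** (1 / doubling_years) - 1
    print(f"Doubling every {doubling_years} years implies ~{annual_rate:.2%} annual growth")
    # -> roughly 1.4 percent per year, sustained indefinitely -- modest by
    #    modern standards, but unattainable in an "organic" economy limited
    #    by the land's annual output, which is Wrigley's point.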

h/t: Early Warning

July 22, 2011

More Reserve, Please, in Releasing Strategic Reserves

This piece from the Financial Times reports the end of the International Energy Agency’s release of strategic oil reserves, which amounted to 60 million barrels over the 30 days following June 23, 2011. It notes, interestingly, that the IEA may have felt compelled to act when OPEC did not raise its own output, but then says that Saudi Arabia is now pumping an extra 700,000 barrels per day, or roughly 21 million barrels per month.
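For scale, a back-of-the-envelope comparison of the two numbers (the 30-day month is my simplification):

    # Scale of the IEA release versus the Saudi production increase.
    release_total_mb = 60      # million barrels released by the IEA
    release_days = 30          # duration of the release
    saudi_increase_mbd = 0.7   # extra Saudi output, million barrels per day

    release_rate_mbd = release_total_mb / release_days
    saudi_monthly_mb = saudi_increase_mbd * 30   # assuming a 30-day month

    print(f"IEA release: {release_rate_mbd:.1f} million barrels/day, for one month only")
    print(f"Saudi increase: ~{saudi_monthly_mb:.0f} million barrels/month, ongoing")
    # -> the release ran at 2 mb/d but stopped after 30 days; the Saudi
    #    increase (~21 million barrels a month) keeps flowing.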

Analysts said the IEA’s unexpected decision to employ strategic stockpiles had exerted only a temporary restraint on oil prices. Immediately before the announcement, Brent crude, the most important benchmark, was trading at around $112.50 per barrel. When the IEA board made its decision on June 23, the price fell as low as $103.62, only to rebound strongly in the weeks that followed.

Brent was trading at $118.25 on Thursday, significantly above the level seen before the IEA chose to use the stockpiles held in its member states. “If their objective was to lower oil prices, it clearly hasn’t succeeded,” said Caroline Bain, senior commodities editor at the Economist Intelligence Unit.

The original decision may have had as much to do with politics as market conditions, she added. On May 19, the IEA board publicly threatened to release strategic reserves if Opec failed to increase its own output. The oil cartel then declined to raise production quotas during an acrimonious meeting in Vienna on June 8. “They had threatened Opec that they would act, so they perhaps felt they had to go ahead with it,” added Ms Bain.

The IEA secretariat cited a more recent increase in production from individual Opec members as a further reason for not repeating the decision to draw on reserves. Last month, Saudi Arabia is believed to have pumped an extra 700,000 b/d, bringing the kingdom’s total output to 9.7m b/d.

This helped total Opec production to climb to just above 30m b/d – slightly higher than the level before Libya’s civil war deprived the market of most of that country’s production in February.

I thought the IEA’s original release was a dumb idea. If impressing OPEC was the purpose, a one-month release surely would not do it. Did anyone at this august organization, at the time the original decision was made, ask the question: what about month two? Better yet: have you ever heard of the Boneless Wonder?

The timing was also weird. Prices were well off the highs of the spring. Perhaps this was an instance of a multilateral body of 28 members acting with all deliberate speed, or with such speed as it can muster. The IEA did nothing when panic gripped the markets from late February to the end of April, and acted bravely seven weeks later, when oil had already fallen 15 percent! (When the oil market did crack on April 30 or so, it coincided with a market call by Goldman Sachs saying it was time to sell oil and three other commodities, suggesting that the Vampire Squid has more market-moving moxie than the IEA.)


The chart above (I hope you can make it out in your browser) covers six months in four-hour increments. You can spot June 23 by the big volume surge about a sixth of the way in from the right. The ETF, which trades at a different price than the Brent futures contract, fell from 76 to 69 after the announcement.

An objection that goes unmentioned by the FT is the potential for insider trading that the IEA’s decision created. Given that the decision was the product of a multilateral body, a lot of governments had to know the details of the forthcoming announcement, which caused (as the FT noted) nearly a $10 short-term move in the Brent futures price, and some serious drift in the days preceding. I’m not alleging insider trading, but rather insisting that any decision with vast potential for insider trading must have some serious countervailing merits. The IEA’s original release had none.

July 21, 2011

Carbon Capture Fizzles Further

From Bloomberg, a report on diminished prospects for carbon capture:

New Haven, an Ohio River coal mining town of 1,550, is home to American Electric Power Co. Inc.’s 1,300-megawatt Mountaineer Plant. The plant has a 1,103-foot-tall chimney and burns 12,000 tons of coal a day to generate electricity for AEP’s 11-state grid, which supplies power to 5.3 million customers.

In the process, it annually belches out 8.5 million metric tons of greenhouse gas carbon dioxide. Abutting the plant is an ambitious $100 million experiment: a seven-story, steel-and-fiberglass rectangle, corralled by dull metal catwalks and rattled by motors and pumps. The apparatus traps a portion of the plant's carbon-dioxide-rich exhaust using an ammonia-based catalyst. (Hence the acrid smell.) The reclaimed CO2 is pumped 8,000 feet underground, where, in theory, it will remain harmlessly out of the atmosphere. The goal of the experiment is to prove that carbon capture and storage technology, or CCS, works, and in so doing, provide one possible solution to global warming, Bloomberg Businessweek reports in its July 25 issue. . . .

[W]hen the New Haven project -- hitching one of America’s largest power companies to the U.S. Energy Department’s carbon capture bandwagon -- was launched in 2009, there was a sense of determination and common cause. “It’s time to advance this technology for commercial use,” declared AEP Chief Executive Officer Michael G. Morris. The company planned to replace its pilot with a larger $668 million CCS facility, which would bury more than 1 million metric tons of CO2 a year, splitting construction costs evenly with the Energy Department. Environmentalists heralded the project. . . .

Yet only two years in, the future of CCS is in jeopardy. On July 14, AEP pulled the plug on its CCS efforts, citing a weak economy and the “uncertain status of U.S. climate policy.” CEO Morris said AEP and its partners “have advanced CCS technology more than any other power generator with our successful two-year project to validate the technology. But at this time it doesn’t make economic sense to continue.”

The dimming of CCS’s promise reflects a broader national retreat from the goal of reversing climate change. In private and, to some degree, in public, the company and its executives express frustration that they tried to do the right thing -- only to end up burned. . . .

 “Two years ago was the height of optimism,” says Howard J. Herzog, senior research engineer for the Massachusetts Institute of Technology Energy Initiative who has tracked CCS technologies and research from the outset. Now, without a price on carbon, “the finances are tough. Every other week you hear of a project being canceled. It’s not a pretty picture.”

It’s easy to see why carbon capture once seemed so appealing. It could significantly reduce carbon emissions while keeping coal, still the nation’s chief source of electric power, central to the energy mix. In 2010, President Barack Obama unveiled a goal to bring five to 10 commercial-size CCS demonstration plants on line in the U.S. by 2016. Leaders of the Group of Eight, which includes the U.S., Russia, and Japan, embraced in 2008 a goal to launch 20 large-scale CCS demonstration projects by 2010 with “broad deployment” of the technology by 2020. All told, governments worldwide committed $22.5 billion to support CCS since the start of 2008, according to Bloomberg New Energy Finance. An MIT website that tracks worldwide CCS projects lists 68 scattered across 15 countries, 45 of them associated with coal-fired power plants. Yet since the beginning of the fourth quarter of 2010, at least five large-scale CCS projects have been canceled or postponed, while the fate of several others remains doubtful . . .

For all its hype and promise, the challenges of extracting carbon dioxide from smokestacks, compressing it, transporting it, and pumping it underground, where it is supposed to stay for eons, remain daunting.

Costs are a core obstacle, notably those related to what’s called the parasitic load, defined as the amount of energy consumed in the process of removing CO2 from power plant exhaust. Estimated to be $60 to $95 per metric ton of CO2 captured, these costs could add 81 percent or more to consumer power bills, according to a November 2010 Energy Department report. The DOE says its goal is to get those costs down to no more than 30 percent of the price of electricity generated by conventional coal plants and 10 percent more than the price of coal-gasification plants.
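To see how a per-ton capture cost translates into a power-price adder, here is a minimal sketch; the emission intensity and the baseline generation cost are illustrative assumptions of mine, not figures from the DOE report.

    # Translate a CO2 capture cost ($ per metric ton) into a rough adder on
    # the cost of coal-fired electricity. The emission intensity and baseline
    # cost below are illustrative assumptions, not DOE numbers.
    capture_costs = (60, 95)      # $/metric ton CO2, the range cited above
    tons_co2_per_mwh = 1.0        # assumed emission intensity of a coal plant
    baseline_cost_per_mwh = 90.0  # assumed delivered cost of coal power, $/MWh

    for cost in capture_costs:
        adder = cost * tons_co2_per_mwh          # $/MWh added by capture
        share = adder / baseline_cost_per_mwh    # fraction of the baseline cost
        print(f"${cost}/t CO2 -> +${adder:.0f}/MWh, about {share:.0%} on top of the baseline")
    # With these assumptions the adder runs from roughly two-thirds of the
    # baseline cost to more than all of it, which makes a figure like
    # "81 percent or more" plausible for capturing a plant's full exhaust.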

Why is CCS so expensive? Based on results so far, storage capacity isn’t the driving cost factor. A 2010 DOE report estimated that between underground saline formations, oil and gas fields, and unmineable coal areas, the U.S. and Canada alone have up to 5,700 years of carbon sequestration capacity.

But capturing carbon is another matter. The clattering, odiferous Mountaineer pilot required the company to add the equivalent of a small, energy-intensive refinery on to the side of the power plant. As AEP’s Spitznogle explains, power plant exhaust is sucked into the capture unit, cooled, and mixed with a chilled ammonia-based solvent, causing the CO2 to precipitate out as a slurry that gets reconverted to a gas. It is compressed into a liquid state, and then pumped into deep, porous underground formations by a series of injection wells. All of which increases the parasitic load.

When the AEP project went on line in 2009, the goal was modest: to capture and bury up to 1.5 percent of Mountaineer’s carbon dioxide. When the project shut down, the New Haven plant had captured about 37,000 metric tons of CO2, a fraction of its target. Spitznogle says that separating out the carbon using “complex chemistry” proved challenging. . . .
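A quick check of the quoted figures (the roughly two-year operating window is my approximation of the pilot's run):

    # What did the 1.5 percent capture goal mean in tonnes, and how does the
    # actual figure compare? The ~2-year operating period is an approximation.
    plant_co2_per_year = 8_500_000   # metric tons CO2 emitted annually (quoted above)
    capture_target_share = 0.015     # "up to 1.5 percent" of the plant's CO2
    captured_total = 37_000          # metric tons actually captured (quoted above)
    years_operating = 2              # assumed pilot lifetime

    target_per_year = plant_co2_per_year * capture_target_share
    target_total = target_per_year * years_operating
    print(f"Target: ~{target_per_year:,.0f} t/year, ~{target_total:,.0f} t over the pilot")
    print(f"Captured: {captured_total:,} t, about {captured_total / target_total:.0%} of that")
    # -> the goal works out to roughly 128,000 t a year; 37,000 t over the
    #    whole project is indeed only a fraction of it.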

If CCS is to work out in the end, he and others agree, the number of pilot projects must increase. “The only way to bring down the cost is to start building a lot of these projects,” says David G. Victor, director of the Laboratory on International Law and Regulation at the University of California, San Diego. Again, “the absence of a price on carbon puts everything in limbo.”

The fate of CCS technology is inextricably tied to the controversy over clean coal, a debate which has helped drive a wedge in the green community since many environmentalists -- citing the ecological harm in coal’s extraction, including the process of mountaintop removal -- simply don’t believe coal can ever be made “clean.” But perhaps the biggest blow to CCS’s reputation as a clean technology originated at a farm in southeastern Saskatchewan, about 40 miles above the North Dakota border. . . . [See further here.]

At present, no other technology comes close to matching the potential of CCS in the fight against global warming. The International Energy Agency projects that CCS will have to account for 20 percent of all CO2 reductions if the 2050 goal to cut worldwide greenhouse gases by half of current levels is to be met. Without a successful CCS program, the costs will be 70 percent greater, the agency says. “If we don’t implement carbon capture and storage,” says John Thompson of the Clean Air Task Force, an environmental advocacy group, “it’s probably game over on climate change.”

* * *

Ken Wells and Benjamin Elgin, "Carbon Capture Hopes Dim as AEP Says It Got Burned at Coal Plant," Bloomberg, July 20, 2011.
 

July 15, 2011

Forests Remove One-Third of Anthropogenic Carbon Dioxide

A new study published in Science and reported by AFP provides "the first complete and global evidence of the overwhelming role of forests in removing anthropogenic carbon dioxide," notes Joseph Canadell, a co-author of the study and a scientist at Australia's national climate research center. Canadell described the findings as  both "incredible" and "unexpected".
Wooded areas across the planet soak up fully a third of the fossil fuels released into the atmosphere each year, some 2.4 billion tonnes of carbon, the study found.
At the same time, the ongoing and barely constrained destruction of forests -- mainly in the tropics -- for food, fuel and development was shown to emit 2.9 billion tonnes of carbon annually, more than a quarter of all emissions stemming from human activity.
Up to now, scientists have estimated that deforestation accounted for 12 to 20 percent of total greenhouse gas output.
The big surprise, said Canadell, was the huge capacity of tropical forests that have regenerated after logging or slash-and-burn land clearance to purge carbon dioxide from the atmosphere. "We estimate that tropical forest regrowth is removing an average of 1.6 billion tonnes of carbon each year," he said in an e-mail exchange.
Adding up the new figures reveals that all the world's forests combined are a net "sink", or sponge, for 1.1 billion tonnes of carbon, the equivalent of 13 percent of all the coal, oil and gas burned across the planet annually. "That's huge. These are 'savings' worth billions of euros a year if that quantity had to be paid out by current mitigation (CO2 reduction) strategies or the price of carbon in the European market," Canadell said.
The international team of climate scientists combined data -- covering the period 1990 through 2007 -- from forest inventories, climate models and satellites to construct a profile of the role global forests have played as regulators of the atmosphere.
In terms of climate change policy, the study has two critically important implications, said Canadell. The fact that previous science underestimated both the capacity of woodlands to remove CO2, and the emissions caused by deforestation, means that "forests are even more at the forefront as a strategy to protect our climate", he said. It also follows that forests should play a larger role in emerging carbon markets, he added. "The amount of saving which are up for grabs is very large, certainly larger than what we thought," Canadell said.
The UN-backed scheme known as REDD -- Reduced Emissions from Deforestation and Degradation -- allots credit to tropical countries in Latin America, Asia and Africa that slow rates of forest destruction. It also provides a mechanism for rich countries to offset their own carbon-reduction commitments by investing in that process. . . .
The breakdown over the last decade for CO2 removal was 1.8 billion tonnes each year for boreal forests at high latitudes, 2.9 billion for temperate forests, and 3.7 billion for tropical forests.
Once deforestation and regrowth are taken into account, however, tropical forests have been essentially carbon neutral. . . .
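The numbers in the piece do reconcile, for what it's worth; a quick check using only the figures reported above:

    # Reconciling the forest carbon figures (billion tonnes of carbon per year).
    established_forest_uptake = 2.4   # absorbed by established forests
    regrowth_uptake = 1.6             # absorbed by tropical regrowth after clearance
    deforestation_emissions = 2.9     # released by ongoing deforestation

    net_sink = established_forest_uptake + regrowth_uptake - deforestation_emissions
    print(f"Net global forest sink: {net_sink:.1f} billion tonnes of carbon per year")
    # -> 1.1 billion tonnes, matching the article's net "sink" figure.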

July 14, 2011

Increasing Wind's Output Tenfold

From Yale Environment 360:
A new analysis by the California Institute of Technology (Caltech) finds that the power output of wind farms can be increased tenfold — and with fewer environmental impacts — through better positioning of vertical-axis turbines. Because the large turbines used in most modern wind farms are placed far apart to prevent aerodynamic interference, much of the potential wind energy that enters the farms is wasted, according to the paper, published in the Journal of Renewable and Sustainable Energy. And the common steps taken to compensate for that problem, including construction of bigger blades or taller towers, generate higher costs and greater environmental impacts. Using a test array of vertical-axis wind turbines on a Southern California field, the Caltech researchers showed that more strategic placement of turbines closer to the ground maximizes energy production. The vertical-axis turbines can be placed closer together without causing aerodynamic interference, and researchers found that having each turbine spin in the opposite direction of its closest neighbor increased efficiency, perhaps because the opposing spins decreased the drag on each turbine, allowing them to spin faster.
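The geometric point is easy to illustrate: what matters for a wind farm is output per unit of land, which rises sharply as turbines can be packed closer together. Below is a toy comparison; the spacings and per-turbine ratings are illustrative assumptions of mine, not figures from the Caltech paper.

    # Toy comparison of wind-farm power density (power per unit of land).
    # All spacings and turbine ratings are illustrative assumptions, not
    # numbers from the Caltech study.
    def land_power_density(rated_kw, spacing_m):
        """Rated power per square metre of land for turbines laid out on a
        square grid with the given centre-to-centre spacing."""
        return rated_kw * 1000 / spacing_m ** 2   # watts per square metre

    # Large horizontal-axis turbine: big per-unit output, but spaced far
    # apart to limit wake interference.
    hawt = land_power_density(rated_kw=2000, spacing_m=700)

    # Small vertical-axis turbine: tiny per-unit output, but tolerant of
    # close, counter-rotating neighbours.
    vawt = land_power_density(rated_kw=1.2, spacing_m=5)

    print(f"Horizontal-axis array: ~{hawt:.1f} W per square metre of land")
    print(f"Vertical-axis array:   ~{vawt:.1f} W per square metre of land")
    # With these hypothetical numbers the dense vertical-axis layout yields
    # an order of magnitude more power per unit of land even though each
    # turbine is tiny -- the packing effect behind the Caltech result.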