July 28, 2011

If At First You Don't Succeed, Give Up

Perhaps that is too harsh, but a new report from the Breakthrough Institute on the failure of climate mitigation strategies seems to suggest as much.

It is true that the global strategy to curb emissions through international agreements (Kyoto and Copenhagen) has been a bust, but it surely does not follow that an indirect approach, with climate mitigation as a secondary effect, will be successful.

Nor does the unlikelihood of international agreement diminish the case for carbon or other taxes on energy consumption--in fact, the United States is very much the outlier among the advanced democracies in its religious aversion to energy taxes. Breakthrough has very interesting things to say about the nature of technological innovation; however, it does not take a genius to see that a two-track approach, combining subsidies and taxes, is better than one that relies on subsidies alone.

The problem with subsidies is that they are likely to be allocated via political weight rather than intrinsic merit; Breakthrough recognizes the problem but can only resolve it by recommending that the process for allocating subsidies be undertaken by disinterested experts (presumably, themselves) rather than by the body (Congress) that is to vote the funds.

From the Breakthrough Institute report, Climate Pragmatism: Innovation, Resilience, and No Regrets:
The old climate framework failed because it would have imposed substantial costs associated with climate mitigation policies on developed nations today in exchange for climate benefits far off in the future -- benefits whose attributes, magnitude, timing, and distribution are not knowable with certainty. Since they risked slowing economic growth in many emerging economies, efforts to extend the Kyoto-style UNFCCC framework to developing nations predictably deadlocked as well.
The new framework now emerging will succeed to the degree to which it prioritizes agreements that promise near-term economic, geopolitical, and environmental benefits to political economies around the world, while simultaneously reducing climate forcings, developing clean and affordable energy technologies, and improving societal resilience to climate impacts. This new approach recognizes that continually deadlocked international negotiations and failed domestic policy proposals bring no climate benefit at all. It accepts that only sustained effort to build momentum through politically feasible forms of action will lead to accelerated decarbonization.
Here is a summation of an earlier Breakthrough Institute report on American innovation:

Driving directions from your iPhone. The cancer treatments that save countless lives. The seed hybrids that have slashed global hunger. A Skype conversation while flying on a Virgin Airlines jet across the continent in just five hours.

Where did these everyday miracles come from? As soon as the question is asked we know to suspect that the answer is not as simple as Apple, Amgen, or General Electric. We might recall something about microchips and the Space Race, or know that the National Institutes of Health funds research into new drugs and treatments.

But most of us remain unaware of the depth and breadth of American government support for technology and innovation. . . .

Where do good technologies come from?

One answer is visionary presidents. From George Washington to George W. Bush, under presidents both Republican and Democrat, the unbroken history of American innovation is one of active partnership between public and private sectors. Washington helped deliver interchangeable parts, which revolutionized manufacturing. Lincoln, the railroads and agricultural centers at land grant colleges. Eisenhower, interstate highways and nuclear power; Kennedy, microchips. But some of America's most important technologies came out of programs that spanned multiple presidents, as in the case of medical and biotechnology research; President Richard Nixon launched the quest to cure cancer in 1971, while funding for the National Institutes of Health tripled under Presidents Bill Clinton and George W. Bush.

Another answer is war. Interchangeable parts were developed at public armories, originally for rifles. One hundred and fifty years later, microchips, computing, and the Internet were created to guide rockets and communicate during nuclear war; today those technologies power our laptops and smartphones.

But outside of war, the United States has made decades-long investments in medicine, transportation, energy, and agriculture that resulted in blockbuster drugs, railroads and aviation, new energy technologies, and food surpluses.

America's brilliant inventors and firms played a critical role, but it is the partnerships between the state and private firms that delivered the world-changing technologies that we take for granted today.

There may be no better example of the invisible hand of government than the iPhone.

Launched in 2007, the iPhone brought many of the now familiar capabilities of the iPod together with other communications and information technologies made possible by federal funding:

    The microchips powering the iPhone owe their emergence to the U.S. military and space programs, which constituted almost the entire early market for the breakthrough technology in the 1960s, buying enough of the initially costly chips to drive down their price by a factor of 50 in a few short years.

    The early foundation of cellular communication lies in radiotelephony capabilities progressively advanced throughout the 20th century with support from the U.S. military.

    The technologies underpinning the Internet were developed and funded by the Defense Department's Advanced Research Projects Agency in the 1960s and 70s.

    GPS was originally created and deployed by the military's NAVSTAR satellite program in the 1980s and 90s.

    Even the revolutionary multitouch screen that provides the iPhone's intuitive interface was first developed by University of Delaware researchers supported by grants and fellowships provided by the National Science Foundation and the CIA.

The iPhone is emblematic of the public-private partnerships that have driven America's technological leadership. Historically, this partnership has taken two general forms. First, the government has long acted as an early funder of key basic science and applied research and development. So it was in agriculture, when the government created new land-grant colleges and expanded funding for agricultural science, leading to the development of new and better crops. In medicine, many of today's blockbuster drugs can trace their existence to funding from the National Science Foundation (NSF) and the National Institutes of Health (NIH).

In addition to providing robust funding for new scientific discovery and technological advancement, the government has also routinely helped develop new industries by acting as an early and demanding customer for innovative, high-risk technologies that the private sector was unable or unwilling to fund. Military procurement during and after World War I helped America catch up to its European rivals in aerospace technology and was key to the emergence of the modern aviation industry. Decades later, the modern semiconductor and computer industries were created with the help of government procurement for military and space applications.

The case studies herein also demonstrate that when this vital partnership between the public and private sector is severed, so too is American economic leadership. Once a global leader in wind and solar energy technology, the United States faltered and never fully recovered as public support ceased and other governments - Denmark, Germany, and Japan - increased their investments and stepped in to assume the mantle of leadership in the emerging sectors. . . .
