Another wealthy VC tries explaining what the current GOP can't wrap its head around.
Raise Taxes on Rich to Reward True Job Creators: Nick Hanauer - Bloomberg
By Nick Hanauer Nov 30, 2011 4:01 PM PT
It is a tenet of American economic beliefs, and an article of faith for Republicans that is seldom contested by Democrats: If taxes are raised on the rich, job creation will stop.
Trouble is, sometimes the things that we know to be true are dead wrong. For the larger part of human history, for example, people were sure that the sun circles the Earth and that we are at the center of the universe. It doesn’t, and we aren’t. The conventional wisdom that the rich and businesses are our nation’s “job creators” is every bit as false.
I’m a very rich person. As an entrepreneur and venture capitalist, I’ve started or helped get off the ground dozens of companies in industries including manufacturing, retail, medical services, the Internet and software. I founded the Internet media company aQuantive Inc., which was acquired by Microsoft Corp. (MSFT) in 2007 for $6.4 billion. I was also the first non-family investor in Amazon.com Inc. (AMZN).
Even so, I’ve never been a “job creator.” I can start a business based on a great idea, and initially hire dozens or hundreds of people. But if no one can afford to buy what I have to sell, my business will soon fail and all those jobs will evaporate.
That’s why I can say with confidence that rich people don’t create jobs, nor do businesses, large or small. What does lead to more employment is the feedback loop between customers and businesses. And only consumers can set in motion a virtuous cycle that allows companies to survive and thrive and business owners to hire. An ordinary middle-class consumer is far more of a job creator than I ever have been or ever will be.
Theory of Evolution
When businesspeople take credit for creating jobs, it is like squirrels taking credit for creating evolution. In fact, it’s the other way around.
It is unquestionably true that without entrepreneurs and investors, you can’t have a dynamic and growing capitalist economy. But it’s equally true that without consumers, you can’t have entrepreneurs and investors. And the more we have happy customers with lots of disposable income, the better our businesses will do.
That’s why our current policies are so upside down. When the American middle class defends a tax system in which the lion’s share of benefits accrues to the richest, all in the name of job creation, all that happens is that the rich get richer.
And that’s what has been happening in the U.S. for the last 30 years.
Since 1980, the share of the nation’s income for fat cats like me in the top 0.1 percent has increased a shocking 400 percent, while the share for the bottom 50 percent of Americans has declined 33 percent. At the same time, effective tax rates on the superwealthy fell to 16.6 percent in 2007, from 42 percent at the peak of U.S. productivity in the early 1960s, and about 30 percent during the expansion of the 1990s. In my case, that means that this year, I paid an 11 percent rate on an eight-figure income.
One reason this policy is so wrong-headed is that there can never be enough superrich Americans to power a great economy. The annual earnings of people like me are hundreds, if not thousands, of times greater than those of the average American, but we don’t buy hundreds or thousands of times more stuff. My family owns three cars, not 3,000. I buy a few pairs of pants and a few shirts a year, just like most American men. Like everyone else, I go out to eat with friends and family only occasionally.
It’s true that we do spend a lot more than the average family. Yet the one truly expensive line item in our budget is our airplane (which, by the way, was manufactured in France by Dassault Aviation SA (AM)), and those annual costs are mostly for fuel (from the Middle East). It’s just crazy to believe that any of this is more beneficial to our economy than hiring more teachers or police officers or investing in our infrastructure.
More Shoppers Needed
I can’t buy enough of anything to make up for the fact that millions of unemployed and underemployed Americans can’t buy any new clothes or enjoy any meals out. Or to make up for the decreasing consumption of the tens of millions of middle-class families that are barely squeaking by, buried by spiraling costs and trapped by stagnant or declining wages.
If the average American family still got the same share of income they earned in 1980, they would have an astounding $13,000 more in their pockets a year. It’s worth pausing to consider what our economy would be like today if middle-class consumers had that additional income to spend.
It is mathematically impossible to invest enough in our economy and our country to sustain the middle class (our customers) without taxing the top 1 percent at reasonable levels again. Shifting the burden from the 99 percent to the 1 percent is the surest and best way to get our consumer-based economy rolling again.
Significant tax increases on the about $1.5 trillion in collective income of those of us in the top 1 percent could create hundreds of billions of dollars to invest in our economy, rather than letting it pile up in a few bank accounts like a huge clot in our nation’s economic circulatory system.
Consider, for example, that a puny 3 percent surtax on incomes above $1 million would be enough to maintain and expand the current payroll tax cut beyond December, preventing a $1,000 increase on the average worker’s taxes at the worst possible time for the economy. With a few more pennies on the dollar, we could invest in rebuilding schools and infrastructure. And even if we imposed a millionaires’ surtax and rolled back the Bush-era tax cuts for those at the top, the taxes on the richest Americans would still be historically low, and their incomes would still be astronomically high.
We’ve had it backward for the last 30 years. Rich businesspeople like me don’t create jobs. Middle-class consumers do, and when they thrive, U.S. businesses grow and profit. That’s why taxing the rich to pay for investments that benefit all is a great deal for both the middle class and the rich.
So let’s give a break to the true job creators. Let’s tax the rich like we once did and use that money to spur growth by putting purchasing power back in the hands of the middle class. And let’s remember that capitalists without customers are out of business.
(Nick Hanauer is a founder of Second Avenue Partners, a venture capital company in Seattle specializing in early stage startups and emerging technology. He has helped launch more than 20 companies, including aQuantive Inc. and Amazon.com, and is the co-author of two books, “The True Patriot” and “The Gardens of Democracy.” The opinions expressed are his own.)
To contact the writer of this article: Nick Hanauer at Nick@secondave.com.
To contact the editor responsible for this article: Max Berley at mberley@bloomberg.net.
Thursday, December 01, 2011
Boston Review — Kenneth Arrow: Economics and Inequality
It is pleasing to be hearing these thoughts and concerns growing out of the Occupy movement.
Economics and Inequality
Kenneth Arrow
This article is part of Occupy the Future, a forum on lessons to be drawn from the Occupy movement.
The specific problems of the current U.S. economy—the drastic increase in unemployment and sluggish increase in output—overlay a tendency of much longer duration, a drastic and rapid increase in the inequality of income. Every economy of complexity produces an unequal distribution of the good things in life. But the period immediately following World War II showed a considerably increased equality of income compared with either the Great Depression or the previous period of relative prosperity.
Since the middle 1980s, this tendency has been reversed. In the United States, median family income (adjusted for size) has remained virtually constant since 1995, while per capita income has risen at about 2 percent per annum. The difference in income between college graduates and those with only high school degrees increased at a rapid rate, even during the period before 1990 when per capita income grew very slowly. Further, the proportion of the college-age population enrolled in college, which had been rising rapidly, stopped increasing and has remained the same for thirty years.
Clearly, the bulk of the gains from increased productivity went to a small group of upper-income recipients. Indeed, closer study has shown that the bulk of the increase went to the top 1 percent of income recipients and much of that to those in the top .1 percent.
The causes of this growing inequality are varied. There has been a steady attack on the use of the tax system as a means of equalizing income. Income and estate taxes were once the most directly effective factors in redistribution. The top rate in the federal income tax was over 90 percent in the 1950s and is about 35 percent today. The exemption level for estate taxes has risen steadily, ensuring that less and less can be taxed. On the other hand, the earned-income tax credit has actually permitted negative income taxes (payments by the government to the tax filer) at the lowest end.
Shifts in the composition of goods and services have reduced income opportunities for many. Skilled industrial jobs have disappeared, while growing information services require a different set of skills. This shift has undoubtedly been augmented by globalization, which has resulted in considerable imports of manufactured goods. The weakening of unions is in good measure attributable to the relative decline in manufacturing, where unionization is easier.
Contemporaneous with the decline of manufacturing has been the increase of two service industries, finance and health. Profits from the finance sector, which historically have been about 10 percent of all profits, have risen to an extraordinary 40 percent. The sector’s labor needs are, of course, directed in considerable measure to the best-educated.
The notion of a well-running market is applicable to manufactured goods; different items are produced to be alike and can be evaluated by consumers. But the products of the finance and health industries are individualized and complex. The consumer cannot seriously evaluate them—a situation that economists call “asymmetric information.”
This casts light on the claim that the problem is one of personal ethics, of greed. After all, the search for improvement in technology, and consequently in the general standards of living, is motivated by greed. When the market system works properly, greed is tempered by competition. Hence, most of the gains from innovation and good service cannot be retained by the providers.
But in situations of asymmetric information, the forces of competition are weakened. The individual patient or financial client does not have access to all the relevant information. Indeed, when the information is sufficiently complex, it may be impossible to provide adequate information.
In these circumstances, greed becomes more relevant. There arises an obligation to present the relevant information as fully as possible, an obligation that has been violated in the financial industry. In the medical field, this challenge has to a considerable extent been met historically by standards of proper practice. These may involve revelation of all information, or at least the requirement that differences in information not be exploited.
It is clear that the financial industry is well behind the medical in this respect. A proper sense of responsibility has to be enforced by legislation, as it was in the 1930s. There has been some erosion in the law, for example under the Clinton administration, and in enforcement. The Dodd-Frank law is a step in the right direction, but the influence of the financial industry watered it down and created unnecessary complications.
It is not superfluous to argue that steepening the income tax progression, removing a number of blatant loopholes, such as the special treatment of capital gains, and reducing the exemption level for estates would add considerably to post-tax equality.
Serious times call for more bread and circuses - Part I
It's unlikely that the current Republican party even knows how to look up the word "shame", but it's insightful to see how our election circus plays in foreign countries, in this case, Germany.
A Club of Liars, Demagogues, and Fools
By Scott Horton
The German newsweekly Spiegel takes the latest disclosures concerning Herman Cain and the rise of Newt Gingrich as an opportunity to offer a foreign bird’s-eye view of the current Republican Party and the American media froth around it. My translation:
“Africa is a country. The Taliban rule in Libya. Muslims are terrorists. Immigrants are mostly criminals, Occupy Wall Street protesters are always dirty. And women who claim to have been sexually molested should kindly keep quiet.”
Welcome to the wonderful world of the Republican Party. Or rather: to the distorted world of its presidential campaign. For months it has coiled through the country like a traveling circus, from debate to debate, from scandal to scandal, contesting the mightiest office in the world — and nothing is ever too unfathomable for them… These eight presidential wannabes are happy enough not only to demolish their own reputations but also that of their party, the once worthy party of Abraham Lincoln. They are also ruining the reputation of the United States.
They lie, deceive, scuffle and speak every manner of idiocy. And they expose a political, economic, geographic and historical ignorance compared to which George W. Bush sounds like a scholar. Even the party’s boosters are horrified by the spectacle…
Platitudes in lieu of programs: in serious times that demand the smartest, these clowns offer blather that is an insult to the intelligence of all Americans. But as with all freak shows, it would be impossible without a stage, the U.S. media, which has been neutered by the demands of political correctness, and a welcoming audience, a party base that seems to have been lobotomized overnight. Notwithstanding the subterranean depths of the primary process, the press and broadcasters proclaim one clown after the next to be the new frontrunner, in predictable news cycles of forty-five days.
Spiegel ties the disintegration of the Republican Party to the Tea Party, “a ‘popular movement’ that was sponsored by Fox News and never showed any interest in the business of government — neither in information nor intellect, which are its requisites, but rather in a self-marketing exercise driven by commissions and millions.”
The most important observation Spiegel offers is this: At a time of mounting crisis, when much of the world is looking to the United States for leadership and initiative, the celebration of sleaze and ignorance that has marked the Republican primary is damaging the reputation of the nation as a whole. Even those who despise the G.O.P. should be concerned about the depths to which the party has sunk.
http://harpers.org/archive/2011/11/hbc-90008328 (paywall'd)
Wednesday, November 23, 2011
The Wisdom of Retrenchment
Another good read from Foreign Affairs Journal
The Wisdom of Retrenchment: The United States can no longer afford a world-spanning foreign policy. Retrenchment -- cutting military spending, redefining foreign priorities, and shifting more of the defense burden to allies -- is the only sensible course. Luckily, that does not have to spell instability abroad. History shows that pausing to recharge national batteries can renew a dominant power’s international legitimacy.
In the wake of the Cold War, U.S. foreign policy underwent a profound transformation. Unrestrained by superpower competition, the United States' ambitions spilled over their former limits. Washington increased its military spending far faster than any of its rivals, expanded NATO, and started dispatching forces around the world on humanitarian missions while letting key allies drift away. These trends accelerated after 9/11, as the United States went to war in Afghanistan and Iraq, ramped up its counterterrorism operations around the world, sped up its missile defense program, and set up new bases in distant lands.
Today, however, U.S. power has begun to wane. As other states rise in prominence, the United States' undisciplined spending habits and open-ended foreign policy commitments are catching up with the country. Spurred on by skyrocketing government debt and the emergence of the Tea Party movement, budget hawks are circling Washington. Before leaving office earlier this year, Secretary of Defense Robert Gates announced cuts to the tune of $78 billion over the next five years, and the recent debt-ceiling deal could trigger another $350 billion in cuts from the defense budget over ten years. In addition to fiscal discipline, Washington appears to have rediscovered the virtues of multilateralism and a restrained foreign policy. It has narrowed its war aims in Afghanistan and Iraq, taken NATO expansion off its agenda, and let France and the United Kingdom lead the intervention in Libya.
But if U.S. policymakers have reduced the country's strategic commitments in response to a decline in its relative power, they have yet to fully embrace retrenchment as a policy and endorse deep spending cuts (especially to the military), redefine Washington's foreign policy priorities, and shift more of the United States' defense burdens onto its allies. Indeed, Secretary of Defense Leon Panetta has warned that a cut in defense spending beyond the one agreed to in the debt-ceiling deal would be devastating. "It will weaken our national defense," he said. "It will undermine our ability to maintain our alliances throughout the world." This view reflects the conventional wisdom of generations of U.S. decision-makers: when it comes to power, more is always better. Many officials fear that reducing the country's influence abroad would let tyranny advance and force trade to dwindle. And various interest groups oppose the idea, since they stand to lose from a sudden reduction in the United States' foreign engagements.
In fact, far from auguring chaos abroad and division at home, a policy of prudent retrenchment would not only reduce the costs of U.S. foreign policy but also result in a more coherent and sustainable strategy. In the past, great powers that scaled back their goals in the face of their diminishing means were able to navigate the shoals of power politics better than those that clung to expensive and overly ambitious commitments. Today, a reduction in U.S. forward deployments could mollify U.S. adversaries, eliminate potential flashpoints, and encourage U.S. allies to contribute more to collective defense -- all while easing the burden on the United States of maintaining geopolitical dominance. A policy of retrenchment need not invite international instability or fuel partisan rancor in Washington. If anything, it could help provide breathing room for reforms and recovery, increase strategic flexibility, and renew the legitimacy of U.S. leadership.
DECLINE: DELUSION OR DESTINY?
Power is multifaceted and difficult to measure, but the metrics that matter most over the long term are a country's military capability and economic strength relative to rivals. Using those benchmarks, there is a strong case to be made that although U.S. decline is real, its rate is modest.
The United States invests more in its military manpower and hardware than all other countries combined. As the political scientist Barry Posen argues, this has allowed it to exercise "command of the commons." With its vast fleet of attack submarines and aircraft carriers, the United States controls the seas -- even those that are not its territorial waters and those outside its exclusive economic zone. Its fighter aircraft and unmanned aerial vehicles give it unrivaled air superiority. And its dominance of outer space and cyberspace is almost as impressive.
But the United States' return on its military investment is falling. Manpower and technology costs are increasing rapidly. The Government Accountability Office reports that since the end of the Cold War, funding for weapons acquisition has increased by 57 percent while the average acquisition cost has increased by 120 percent. According to the Congressional Research Service, between 1999 and 2005, the real cost of supporting an active-duty service member grew by 33 percent. Meanwhile, the benefits of unrestricted defense spending have not kept up with the costs. As Gates put it, U.S. defense institutions have become "accustomed to the post-9/11 decade's worth of 'no questions asked' funding requests," encouraging a culture of waste and inefficiency he described as "a semi-feudal system -- an amalgam of fiefdoms without centralized mechanisms to allocate resources."
The trend of the last decade is disturbing: as military spending soared, U.S. success abroad sagged. To be clear, the United States continues to field the best-armed, most skilled military in the world. The wars in Afghanistan and Iraq have bent, but not broken, the all-volunteer force, and the burden of maintaining this formidable force is not unacceptably onerous. The proposed $553 billion base-line defense budget for 2012 represents just 15 percent of the federal budget and less than five percent of GDP. (To put that figure in perspective, consider that the proposed 2012 budget for Social Security spending tops $760 billion.) Yet current trends will make it harder for the United States to continue to purchase hegemony as easily as it has in the past. Changes in military tactics and technology are eroding the United States' advantages. The proliferation of antiship cruise missiles makes it harder for the U.S. Navy to operate near adversaries' shores. Advanced surface-to-air missiles likewise raise the cost of maintaining U.S. air superiority in hostile theaters. Nationalist and tribal insurgencies, fueled by a brisk small-arms trade, have proved difficult to combat with conventional ground forces. U.S. defense dominance is getting more expensive at a moment when it is becoming less expensive for other states and actors to challenge the sole superpower.
Beyond these challenges to the country's military dominance, a weakened economic condition is contributing to the decline of U.S. power. The U.S. economy remains the largest in the world, yet its position is in jeopardy. Between 1999 and 2009, the U.S. share of global GDP (measured in terms of purchasing power parity) fell from 23 percent to 20 percent, whereas China's share of global GDP jumped from seven percent to 13 percent. Should this trend continue, China's economic output will surpass the United States' by 2016. China already consumes more energy than the United States, and calls are growing louder to replace the dollar as the international reserve currency with a basket of currencies that would include the euro and the yuan.
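The 2016 crossover claim can be sanity-checked with the two data points the article quotes. Here is a minimal back-of-the-envelope sketch, assuming (crudely, for illustration only) that each country's PPP share of global GDP keeps changing at the compound rate implied by its 1999 and 2009 figures:

```python
# Shares of global GDP (PPP) quoted in the article:
#   1999: US 23%, China 7%;  2009: US 20%, China 13%.
# Assumption: each share keeps changing at its implied
# 1999-2009 compound annual rate. A crude extrapolation,
# not a forecast.

us_1999, us_2009 = 23.0, 20.0
cn_1999, cn_2009 = 7.0, 13.0

us_rate = (us_2009 / us_1999) ** (1 / 10)  # annual factor, ~0.986
cn_rate = (cn_2009 / cn_1999) ** (1 / 10)  # annual factor, ~1.064

# Roll both shares forward until China's share overtakes the US share.
year, us, cn = 2009, us_2009, cn_2009
while cn < us:
    year += 1
    us *= us_rate
    cn *= cn_rate

print(year)  # → 2015 with these inputs
```

Under these assumptions the naive extrapolation lands within a year of the article's 2016 figure; modest changes to the assumed growth rates shift the crossover by a year or two in either direction, which is why the article hedges with "should this trend continue."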
The fiscal position of the United States is alarming, whether or not one believes that Standard & Poor's was justified in downgrading U.S. Treasury bonds. Between 2001 and 2009, U.S. federal debt as a percentage of GDP more than doubled, from 32 percent to 67 percent, and state and local governments have significant debts, too. The United States' reliance on imports, combined with high rates of borrowing, has led to a considerable current account deficit: more than six percent of GDP in 2006. Power follows money, and the United States is leaking cash.
The news is not all doom and gloom. Despite massive federal debt, the United States spent less than five percent of its 2010 budget on net interest payments, limiting the extent to which debt servicing costs have crowded out other spending. The United States still exports more goods and services than any other country and is close behind China as the world's largest manufacturer. In terms of market exchange rate, the U.S. economy is still more than double the size of the Chinese economy, and China faces a raft of obstacles that could slow its rise: domestic unrest, stock and housing bubbles, corruption, an aging population, high savings, and an unproven track record of innovation. Yet the overall picture is clear: the United States' economic supremacy is no longer assured, and this uncertainty will reduce its geopolitical dominance.
In essence, the United States has fallen into a familiar pattern for hegemonic powers: overconsumption, overextension, and overoptimism. But the country also has a resourceful economy and a resilient military; it is not in free fall. Now, it needs a foreign policy to match.
RESISTING THE MYTHS OF EMPIRE
Despite the erosion of U.S. military and economic dominance, many observers warn that a rapid departure from the current approach to foreign policy would be disastrous. The historian Robert Kagan cautions that "a reduction in defense spending . . . would unnerve American allies and undercut efforts to gain greater cooperation." The journalist Robert Kaplan even more apocalyptically warns that "lessening [the United States'] engagement with the world would have devastating consequences for humanity." But these defenders of the status quo confuse retrenchment with appeasement or isolationism. A prudent reduction of the United States' overseas commitments would not prevent the country from countering dangerous threats and engaging with friends and allies. Indeed, such reductions would grant the country greater strategic flexibility and free resources to promote long-term growth.
A somewhat more compelling concern raised by opponents of retrenchment is that the policy might undermine deterrence. Reducing the defense budget or repositioning forces would make the United States look weak and embolden upstarts, they argue. "The very signaling of such an aloof intention may encourage regional bullies," Kaplan worries. This anxiety is rooted in the assumption that the best barrier to adventurism by adversaries is forward defenses -- the deployment of military assets in large bases near enemy borders, which serve as tripwires or, to some eyes, a Great Wall of America.
There are many problems with this position. For starters, the policies that have gotten the United States in trouble in recent years have been activist, not passive or defensive. The U.S.-led invasion of Iraq alienated important U.S. allies, such as Germany and Turkey, and increased Iran's regional power. NATO's expansion eastward has strained the alliance and intensified Russia's ambitions in Georgia and Ukraine.
More generally, U.S. forward deployments are no longer the main barrier to great-power land grabs. Taking and holding territory is more expensive than it once was, and great powers have little interest in expanding further. The United States' chief allies have developed the wherewithal to defend their territorial boundaries and deter restive neighbors. Of course, retrenchment might tempt reckless rivals to pursue unexpected or incautious policies, as states sometimes do. Should that occur, however, U.S. superiority in conventional arms and its power-projection capabilities would preserve the option of quick U.S. intervention. Outcomes of that sort would be costly, but the risks of retrenchment must be compared to the risks of the status quo. In difficult financial circumstances, the United States must prioritize. The biggest menace to a superpower is not the possibility of belated entry into a regional crisis; it is the temptation of imperial overstretch. That is exactly the trap into which opponents of the United States, such as al Qaeda, want it to fall.
Nor is there good evidence that reducing Washington's overseas commitments would lead friends and rivals to question its credibility. Despite some glum prophecies, the withdrawal of U.S. armed forces from western Europe after the Cold War neither doomed NATO nor discredited the United States. Similar reductions and repositioning of U.S. forces in South Korea have improved the sometimes tense relationship between Washington and Seoul. Calls for Japan to assume a greater defense burden have likewise resulted in deeper integration of U.S. and Japanese forces. Faith in forward defenses is a holdover from the Cold War, rooted in visions of implacable adversaries and falling dominoes. It is ill suited to contemporary world politics, where balancing coalitions are notably absent and ideological disputes remarkably mild.
Others warn that the U.S. political system is too fragmented to implement a coordinated policy of retrenchment. In this view, even if the foreign policy community unanimously subscribed to this strategy, it would be unable to outmaneuver lobbying groups and bureaucracies that favor a more activist approach. Electoral pressures reward lucrative defense contracts and chest-thumping stump speeches rather than sober appraisals of declining fortunes. Whatever leaders' preferences are, bureaucratic pressures promote conservative decisions, policy inertia, and big budgets -- none of which is likely to usher in an era of self-restraint.
Despite deep partisan divides, however, Republicans and Democrats have often put aside their differences when it comes to foreign policy. After World War II, the United States did not revert to the isolationism of earlier periods: both parties backed massive programs to contain the Soviet Union. During the tempestuous 1960s, a consensus emerged in favor of détente with the Soviets. The 9/11 attacks generated bipartisan support for action against al Qaeda and its allies. Then, in the wake of the global financial crisis of 2008, politicians across the spectrum recognized the need to bring the wars in Afghanistan and Iraq to an end. When faced with pressing foreign policy challenges, U.S. politicians generally transcend ideological divides and forge common policies, sometimes expanding the United States' global commitments and sometimes contracting them.
Today, electoral pressures support a more modest approach to foreign affairs. According to a 2009 study by the Pew Research Center, 70 percent of Americans would rather the United States share global leadership than go it alone. And a 2010 study by the Chicago Council on Global Affairs found that 79 percent of them thought the United States played the role of world policeman more than it should. Even on sacrosanct issues such as the defense budget, the public has demonstrated a willingness to consider reductions. In a 2010 study conducted by the Program for Public Consultation at the University of Maryland, 64 percent of respondents endorsed reductions in defense spending, supporting an average cut of $109 billion to the baseline defense budget.
Institutional barriers to reform do remain. Yet when presidents have led, the bureaucrats have largely followed. Three successive administrations, beginning with that of Ronald Reagan, were able to tame congressional opposition and push through an ambitious realignment program that ultimately resulted in the closure of 100 military bases, saving $57 billion. In its 2010 defense budget, the Obama administration succeeded in canceling plans to acquire additional F-22 Raptors despite fierce resistance by lobbyists, members of Congress, and the air force brass. The 2010 budget also included cuts to the navy's fleet of stealth destroyers and various components of the army's next generation of manned ground vehicles.
Thus, claims that retrenchment is politically impractical or improbable are unfounded. Just as a more humble foreign policy will invite neither instability nor decline, domestic political factors will not inevitably prevent timely reform. To chart a new course, U.S. policymakers need only possess foresight and will.
THE VIRTUES OF RESTRAINT
Even if a policy of retrenchment were possible to implement, would it work? The historical record suggests it would. Since 1870, there have been 18 cases in which a great power slipped in the rankings, as measured by its GDP relative to those of other great powers. Fifteen of those declining powers implemented some form of retrenchment. Far from inviting aggression, the policy made those states more likely to avoid militarized disputes and to recover their former rank than the three declining great powers that did not retrench: France in the 1880s, Germany in the 1930s, and Japan in the 1990s. Those three states never recovered their former positions, whereas almost half of the 15 states that did retrench -- among them Russia in the 1880s and the United Kingdom in the first decade of the twentieth century -- eventually did.
Retrenchment works in several ways. One is by shifting commitments and resources from peripheral to core interests and preserving investments in the most valuable geographic and functional areas. This can help pare back the number of potential flashpoints with emerging adversaries by decreasing the odds of accidental clashes, as well as reducing the incentives of regional powers to respond confrontationally. Whereas primacy forces a state to defend a vast and brittle perimeter, a policy of retrenchment allows it to respond to significant threats at the times and in the places of its choosing. Conflict does not become entirely elective, as threats to core interests still must be met. But for the United States, retrenchment would reduce the overall burden of defense, as well as the danger of becoming bogged down in a marginal morass.
It would also encourage U.S. allies to assume more responsibility for collective security. Such burden sharing would be more equitable for U.S. taxpayers, who today shoulder a disproportionate load in securing the world. Every year, according to Christopher Preble of the Cato Institute, they pay an average of $2,065 each in taxes to cover the cost of national defense, compared with $1,000 for Britons, $430 for Germans, and $340 for Japanese.
Despite spending far less on defense, the United States' traditional allies have little trouble protecting their vital interests. No state credibly threatens the territorial integrity of western Europe or Japan, and U.S. allies do not need independent power-projection capabilities to protect their homelands. NATO's intervention in Libya has been flawed in many respects, but it has demonstrated that European member states are capable of conducting complex military operations with the United States playing a secondary role. Going forward, U.S. retrenchment would compel U.S. allies to improve their existing capabilities and bear the costs of their altruistic impulses.
The United States and its allies have basically the same goals: democracy, stability, and trade. But the United States is in the awkward position of both being spread too thin around the globe and irritating many states by its presence on, or near, their soil. Delegating some of its responsibilities to allies would permit the U.S. government to focus more on critical objectives, such as ensuring a stable and prosperous economy. Regional partners, who have a greater stake in and knowledge of local challenges, can take on more responsibility. With increased input from others and a less invasive presence, retrenchment would also allow the United States to restore some luster to its leadership.
A MORE FRUGAL FUTURE
To implement a retrenchment policy, the United States would have to take three main steps: reduce its global military footprint, change the size and composition of the U.S. military, and use the resulting "retrenchment dividend" to foster economic recovery at home.
First, the United States must reconsider its forward deployments. The top priority should be to deter aggression against its main economic partners in Europe and Asia. This task is not especially burdensome; there are few credible threats to U.S. allies in these regions, and these states need little help from the United States.
Although Russia continues to meddle in its near abroad and has employed oil and gas embargoes to coerce its immediate neighbors, western Europe's resources are more than sufficient to counter an assertive Russia. A more autonomous Europe would take some time to develop a coherent security and defense policy and would not always see events through the same lens as Washington. But reducing Europe's dependence on the United States would create a strong incentive for European states to spend more on defense, modernize their forces, and better integrate their policies and capabilities. U.S. forces in the European theater could safely be reduced by 40-50 percent without compromising European security.
Asia is also ready for a decreased U.S. military presence, and Washington should begin gradually withdrawing its troops. Although China has embarked on an ambitious policy of military modernization and engages in periodic saber rattling in the South China Sea, its ability to project power remains limited. Japan and South Korea are already shouldering greater defense burdens than they were during the Cold War. India, the Philippines, and Vietnam are eager to forge strategic partnerships with the United States. Given the shared interest in promoting regional security, these ties could be sustained through bilateral political and economic agreements, instead of the indefinite deployments and open-ended commitments of the Cold War.
In the event that China becomes domineering, U.S. allies on its borders will act as a natural early warning system and a first line of defense, as well as provide logistical hubs and financial support for any necessary U.S. responses. Yet such a state of affairs is hardly inevitable. For now, there are many less expensive alternatives that can strengthen the current line of defense, such as technology transfers, arms sales, and diplomatic mediation. Defending the territorial integrity of Japan and South Korea and preventing Chinese or North Korean adventurism demand rapid-response forces with strong reserves, not the 30,000 soldiers currently stationed in each country. Phasing out 20 percent of those forces while repositioning others to Guam or Hawaii would achieve the same results more efficiently.
Reducing these overseas commitments would produce significant savings. A bipartisan task force report published in 2010 by the Project on Defense Alternatives estimated that the demobilization of 50,000 active-duty soldiers in Europe and Asia alone could save as much as $12 billion a year. Shrinking the U.S. footprint would also generate indirect savings in the form of decreased personnel, maintenance, and equipment costs.
Retrenchment would also require the United States to minimize its presence in South Asia and the Middle East. The United States has an interest in ensuring the flow of cheap oil, yet armed interventions and forward deployments are hardly the best ways to achieve that goal. These actions have radicalized local populations, provided attractive targets for terrorists, destabilized oil markets, and inflamed the suspicions of regional rivals such as Iran. Similarly, the United States has a strong incentive to deny terrorist groups safe havens in ungoverned spaces. It is unclear, however, whether large troop deployments are the most cost-effective way to do so. The U.S.-led NATO mission in Afghanistan has established temporary pockets of stability, but it has enjoyed little success in promoting good governance, stamping out corruption, or eradicating the most dangerous militant networks. Nor have boots on the ground improved relations with or politics in Pakistan.
More broadly, the Pentagon should devote fewer resources to maintaining and developing its capabilities for engaging in peripheral conflicts, such as the war in Afghanistan. Nation building and counterinsurgency operations have a place in U.S. defense planning, but not a large one. The wars in Afghanistan and Iraq have raised the profile of counterinsurgency doctrine and brought prominence to its advocates and practitioners, such as David Petraeus, the retired general who is now director of the CIA. This is an understandable development, considering that the defense establishment was previously unprepared to wage a counterinsurgency war. But such conflicts require enormous commitments of blood and treasure over many years, rarely result in decisive victory, and seldom bring tangible rewards. A retrenching United States would sidestep such high-risk, low-return endeavors, especially when counterterrorism and domestic law enforcement and security measures have proved to be effective alternatives. Although they cannot solve every problem, relatively small forces that do not require massive bases can nevertheless carry out significant strikes -- as evidenced by the operation that killed Osama bin Laden.
Curbing the United States' commitments would reduce risks, but it cannot eliminate them. Adversaries may fill regional power vacuums, and allies will never behave exactly as Washington would prefer. Yet those costs would be outweighed by the concrete benefits of pulling back. A focus on the United States' core interests in western Europe would limit the risk of catastrophic clashes with Russia over ethnic enclaves in Georgia or Moldova by allowing the United States to avoid commitments it would be unwise to honor. By narrowing its commitments in Asia, the United States could lessen the likelihood of conflict over issues such as the status of Taiwan or competing maritime claims in the South China Sea. Just as the United Kingdom tempered its commitments and accommodated U.S. interests in the Western Hemisphere at the turn of the last century, the United States should now temper its commitments and cultivate a lasting compromise with China over Taiwan.
Disassociating itself from unsavory regimes in the Middle East would insulate the United States from the charges of hypocrisy that undermine public support for its foreign policy throughout the region. And an accelerated drawdown of the wars in Afghanistan and Iraq would save a considerable amount of money. The current request for $118 billion to support these operations represents a savings of $42 billion compared with last year. Moving even faster to end those conflicts would result in even larger savings. At a time when the U.S. government is under incredible pressure to justify big-ticket spending, what little return on investment these wars promise does not warrant any more patience -- or sacrifice.
FROM PROFLIGACY TO PRUDENCE
The second necessary step for retrenchment would be to change the size and composition of U.S. military forces. Despite the best efforts of Robert Gates, the former secretary of defense, the 2012 defense budget remains stuffed with allocations for weapons systems of debatable strategic value. For instance, despite delays, cost overruns, failed or deferred tests, and opposition from U.S. allies, the Obama administration has pledged more than $10 billion for various ballistic missile defense systems and close to another $10 billion to fund the F-35 Joint Strike Fighter. And such programs are merely the low-hanging fruit. A nonpartisan task force of leading experts convened by the Institute for Policy Studies recently concluded that the U.S. government could slash more than $77 billion from the 2012 defense budget across eight different programs. New submarines and preliminary payments on what would be the 11th U.S. aircraft carrier -- no other country has more than one -- cannot be the best way to spend $5 billion. Likewise, spending $100 billion over the next ten years to upgrade U.S. nuclear weapons will not alter any adversary's calculations in a positive way.
Deeper defense cuts would force the Pentagon to do what the rest of the United States is already doing: rethink the country's role in a changing world. One problem with present procurement plans is that the strategic rationale underlying certain goals -- a 320-ship fleet for the navy, 2,200 fighter aircraft for the air force -- remains murky. Protecting international trade routes against Chinese aggression is often cited as the justification for such military programs. But precisely how the United States is going to protect its economy by clashing with its third-largest trading partner is rarely explained.
The lack of a clear assessment of the costs and benefits of new weapons systems may also lead to expensive errors. The United States already has an immense lead in aircraft carriers, fourth-generation jet fighters, and mechanized land forces. There are few reasons to squander resources replacing weapons systems that already surpass those of every single rival. Moreover, the fast pace of technological change, in particular when it comes to advanced antiship and air defense capabilities, casts doubt on the wisdom of pouring money into systems that might be obsolete the moment they roll off assembly lines.
In contrast, a modest investment in proven capabilities would bolster U.S. defenses in core regions and give the United States maximum flexibility to respond to future threats. To this end, investments should continue in theater- and naval-based ballistic missile defense systems, which remain the best ways to protect U.S. allies against missile threats. The Pentagon should acquire cheap alternatives to existing systems, such as unmanned aerial vehicles, in large numbers. Congress should continue to fund research and development, but only enough to ensure that new technologies could be produced promptly when clear and present needs arise. These changes in procurement, combined with a slightly swifter drawdown in Afghanistan and Iraq and a somewhat smaller U.S. Army and Marine Corps, would save the United States a minimum of $90 billion annually.
Savings of that kind would be part of a retrenchment dividend that could be spent on reinvigorating the U.S. economy. Retrenchment begins with the curtailing of foreign policy resources, but it ends only when the resources saved are spent domestically. Although military expenditures are a productive investment, they are not infinitely or incomparably so. And the United States is already past the point of diminishing returns when it comes to defense spending. Washington should prioritize measures to more directly stimulate the U.S. economy and make it more competitive. How exactly to achieve that outcome will surely continue to be the subject of fierce debate. But that debate will be much more meaningful if it is conducted with the aim of investing a retrenchment dividend.
The modest decline of U.S. power, combined with a relatively benign international environment, has provided the United States with a unique opportunity to reduce its foreign policy commitments in a measured manner. To make a virtue of this necessity, policymakers in Washington must resist calls to tighten the United States' tenuous grasp on global affairs, ignore the stale warnings about eroded credibility, and overcome the tired protests of bloated bureaucracies. By reducing its forward deployments, sharing burdens with its allies, limiting its fights in peripheral territories, and paring back wasteful spending on unnecessary weapons, the United States can not only slow its decline but also sow the seeds of its recovery.
THE WISDOM OF RETRENCHMENT
The United States can no longer afford a world-spanning foreign policy. Retrenchment -- cutting military spending, redefining foreign priorities, and shifting more of the defense burden to allies -- is the only sensible course. Luckily, that does not have to spell instability abroad. History shows that pausing to recharge national batteries can renew a dominant power's international legitimacy.
In the wake of the Cold War, U.S. foreign policy underwent a profound transformation. Unrestrained by superpower competition, the United States' ambitions spilled over their former limits. Washington increased its military spending far faster than any of its rivals, expanded NATO, and started dispatching forces around the world on humanitarian missions while letting key allies drift away. These trends accelerated after 9/11, as the United States went to war in Afghanistan and Iraq, ramped up its counterterrorism operations around the world, sped up its missile defense program, and set up new bases in distant lands.
Today, however, U.S. power has begun to wane. As other states rise in prominence, the United States' undisciplined spending habits and open-ended foreign policy commitments are catching up with the country. Spurred on by skyrocketing government debt and the emergence of the Tea Party movement, budget hawks are circling Washington. Before leaving office earlier this year, Secretary of Defense Robert Gates announced cuts to the tune of $78 billion over the next five years, and the recent debt-ceiling deal could trigger another $350 billion in cuts from the defense budget over ten years. In addition to fiscal discipline, Washington appears to have rediscovered the virtues of multilateralism and a restrained foreign policy. It has narrowed its war aims in Afghanistan and Iraq, taken NATO expansion off its agenda, and let France and the United Kingdom lead the intervention in Libya.
But if U.S. policymakers have reduced the country's strategic commitments in response to a decline in its relative power, they have yet to fully embrace retrenchment as a policy by endorsing deep spending cuts (especially to the military), redefining Washington's foreign policy priorities, and shifting more of the United States' defense burdens onto its allies. Indeed, Secretary of Defense Leon Panetta has warned that a cut in defense spending beyond the one agreed to in the debt-ceiling deal would be devastating. "It will weaken our national defense," he said. "It will undermine our ability to maintain our alliances throughout the world." This view reflects the conventional wisdom of generations of U.S. decision-makers: when it comes to power, more is always better. Many officials fear that reducing the country's influence abroad would let tyranny advance and force trade to dwindle. And various interest groups oppose the idea, since they stand to lose from a sudden reduction in the United States' foreign engagements.
In fact, far from auguring chaos abroad and division at home, a policy of prudent retrenchment would not only reduce the costs of U.S. foreign policy but also result in a more coherent and sustainable strategy. In the past, great powers that scaled back their goals in the face of their diminishing means were able to navigate the shoals of power politics better than those that clung to expensive and overly ambitious commitments. Today, a reduction in U.S. forward deployments could mollify U.S. adversaries, eliminate potential flashpoints, and encourage U.S. allies to contribute more to collective defense -- all while easing the burden on the United States of maintaining geopolitical dominance. A policy of retrenchment need not invite international instability or fuel partisan rancor in Washington. If anything, it could help provide breathing room for reforms and recovery, increase strategic flexibility, and renew the legitimacy of U.S. leadership.
DECLINE: DELUSION OR DESTINY?
Power is multifaceted and difficult to measure, but the metrics that matter most over the long term are a country's military capability and economic strength relative to rivals. Using those benchmarks, there is a strong case to be made that although U.S. decline is real, its rate is modest.
The United States invests more in its military manpower and hardware than all other countries combined. As the political scientist Barry Posen argues, this has allowed it to exercise "command of the commons." With its vast fleet of attack submarines and aircraft carriers, the United States controls the seas -- even those that are not its territorial waters and those outside its exclusive economic zone. Its fighter aircraft and unmanned aerial vehicles give it unrivaled air superiority. And its dominance of outer space and cyberspace is almost as impressive.
But the United States' return on its military investment is falling. Manpower and technology costs are increasing rapidly. The Government Accountability Office reports that since the end of the Cold War, funding for weapons acquisition has increased by 57 percent while the average acquisition cost has increased by 120 percent. According to the Congressional Research Service, between 1999 and 2005, the real cost of supporting an active-duty service member grew by 33 percent. Meanwhile, the benefits of unrestricted defense spending have not kept up with the costs. As Gates put it, U.S. defense institutions have become "accustomed to the post-9/11 decade's worth of 'no questions asked' funding requests," encouraging a culture of waste and inefficiency he described as "a semi-feudal system -- an amalgam of fiefdoms without centralized mechanisms to allocate resources."
The trend of the last decade is disturbing: as military spending soared, U.S. success abroad sagged. To be clear, the United States continues to field the best-armed, most skilled military in the world. The wars in Afghanistan and Iraq have bent, but not broken, the all-volunteer force, and the burden of maintaining this formidable force is not unacceptably onerous. The proposed $553 billion baseline defense budget for 2012 represents just 15 percent of the federal budget and less than five percent of GDP. (To put that figure in perspective, consider that the proposed 2012 budget for Social Security spending tops $760 billion.) Yet current trends will make it harder for the United States to continue to purchase hegemony as easily as it has in the past. Changes in military tactics and technology are eroding the United States' advantages. The proliferation of antiship cruise missiles makes it harder for the U.S. Navy to operate near adversaries' shores. Advanced surface-to-air missiles likewise raise the cost of maintaining U.S. air superiority in hostile theaters. Nationalist and tribal insurgencies, fueled by a brisk small-arms trade, have proved difficult to combat with conventional ground forces. U.S. defense dominance is getting more expensive at a moment when it is becoming less expensive for other states and actors to challenge the sole superpower.
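Those budget shares are easy to sanity-check. The arithmetic below uses the article's $553 billion figure plus two round-number assumptions of my own -- roughly $3.7 trillion in FY 2012 federal outlays and roughly $15 trillion in U.S. GDP:

```python
# All figures in billions of dollars.
base_defense = 553        # proposed 2012 baseline defense budget (from the text)
federal_outlays = 3_700   # approximate FY 2012 federal outlays (my assumption)
gdp = 15_000              # approximate 2012 U.S. GDP (my assumption)

share_of_budget = base_defense / federal_outlays   # roughly 0.15, i.e. about 15 percent
share_of_gdp = base_defense / gdp                  # roughly 0.037, under five percent
```

On these rough totals, the article's "15 percent of the federal budget" and "less than five percent of GDP" figures both check out.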
More generally, U.S. forward deployments are no longer the main barrier to great-power land grabs. Taking and holding territory is more expensive than it once was, and great powers have little incentive or interest in expanding further. The United States' chief allies have developed the wherewithal to defend their territorial boundaries and deter restive neighbors. Of course, retrenchment might tempt reckless rivals to pursue unexpected or incautious policies, as states sometimes do. Should that occur, however, U.S. superiority in conventional arms and its power-projection capabilities would assure the option of quick U.S. intervention. Outcomes of that sort would be costly, but the risks of retrenchment must be compared to the risks of the status quo. In difficult financial circumstances, the United States must prioritize. The biggest menace to a superpower is not the possibility of belated entry into a regional crisis; it is the temptation of imperial overstretch. That is exactly the trap into which opponents of the United States, such as al Qaeda, want it to fall.
Nor is there good evidence that reducing Washington's overseas commitments would lead friends and rivals to question its credibility. Despite some glum prophecies, the withdrawal of U.S. armed forces from western Europe after the Cold War neither doomed NATO nor discredited the United States. Similar reductions in U.S. military forces and the forces' repositioning in South Korea have improved the sometimes tense relationship between Washington and Seoul. Calls for Japan to assume a greater defense burden have likewise resulted in deeper integration of U.S. and Japanese forces. Faith in forward defenses is a holdover from the Cold War, rooted in visions of implacable adversaries and falling dominoes. It is ill suited to contemporary world politics, where balancing coalitions are notably absent and ideological disputes remarkably mild.
Others warn that the U.S. political system is too fragmented to implement a coordinated policy of retrenchment. In this view, even if the foreign policy community unanimously subscribed to this strategy, it would be unable to outmaneuver lobbying groups and bureaucracies that favor a more activist approach. Electoral pressures reward lucrative defense contracts and chest-thumping stump speeches rather than sober appraisals of declining fortunes. Whatever leaders' preferences are, bureaucratic pressures promote conservative decisions, policy inertia, and big budgets -- none of which is likely to usher in an era of self-restraint.
Despite deep partisan divides, however, Republicans and Democrats have often put aside their differences when it comes to foreign policy. After World War II, the United States did not revert to the isolationism of earlier periods: both parties backed massive programs to contain the Soviet Union. During the tempestuous 1960s, a consensus emerged in favor of détente with the Soviets. The 9/11 attacks generated bipartisan support for action against al Qaeda and its allies. Then, in the wake of the global financial crisis of 2008, politicians across the spectrum recognized the need to bring the wars in Afghanistan and Iraq to an end. When faced with pressing foreign policy challenges, U.S. politicians generally transcend ideological divides and forge common policies, sometimes expanding the United States' global commitments and sometimes contracting them.
Today, electoral pressures support a more modest approach to foreign affairs. According to a 2009 study by the Pew Research Center, 70 percent of Americans would rather the United States share global leadership than go it alone. And a 2010 study by the Chicago Council on Global Affairs found that 79 percent of them thought the United States played the role of world policeman more than it should. Even on sacrosanct issues such as the defense budget, the public has demonstrated a willingness to consider reductions. In a 2010 study conducted by the Program for Public Consultation at the University of Maryland, 64 percent of respondents endorsed reductions in defense spending, supporting an average cut of $109 billion to the base-line defense budget.
Institutional barriers to reform do remain. Yet when presidents have led, the bureaucrats have largely followed. Three successive administrations, beginning with that of Ronald Reagan, were able to tame congressional opposition and push through an ambitious realignment program that ultimately resulted in the closure of 100 military bases, saving $57 billion. In its 2010 defense budget, the Obama administration succeeded in canceling plans to acquire additional F-22 Raptors despite fierce resistance by lobbyists, members of Congress, and the air force brass. The 2010 budget also included cuts to the navy's fleet of stealth destroyers and various components of the army's next generation of manned ground vehicles.
Thus, claims that retrenchment is politically impractical or improbable are unfounded. Just as a more humble foreign policy will invite neither instability nor decline, domestic political factors will not inevitably prevent timely reform. To chart a new course, U.S. policymakers need only possess foresight and will.
THE VIRTUES OF RESTRAINT
Even if a policy of retrenchment were possible to implement, would it work? The historical record suggests it would. Since 1870, there have been 18 cases in which a great power slipped in the rankings, as measured by its GDP relative to those of other great powers. Fifteen of those declining powers implemented some form of retrenchment. Far from inviting aggression, this policy resulted in those states' being more likely to avoid militarized disputes and to recover their former rank than the three declining great powers that did not adopt retrenchment: France in the 1880s, Germany in the 1930s, and Japan in the 1990s. Those states never recovered their former positions, unlike almost half of the 15 states that did retrench, including, for example, Russia in the 1880s and the United Kingdom in the first decade of the twentieth century.
Retrenchment works in several ways. One is by shifting commitments and resources from peripheral to core interests and preserving investments in the most valuable geographic and functional areas. This can help pare back the number of potential flashpoints with emerging adversaries by decreasing the odds of accidental clashes, as well as reducing the incentives of regional powers to respond confrontationally. Whereas primacy forces a state to defend a vast and brittle perimeter, a policy of retrenchment allows it to respond to significant threats at the times and in the places of its choosing. Conflict does not become entirely elective, as threats to core interests still must be met. But for the United States, retrenchment would reduce the overall burden of defense, as well as the danger of becoming bogged down in a marginal morass.
It would also encourage U.S. allies to assume more responsibility for collective security. Such burden sharing would be more equitable for U.S. taxpayers, who today shoulder a disproportionate load in securing the world. Every year, according to Christopher Preble of the Cato Institute, they pay an average of $2,065 each in taxes to cover the cost of national defense, compared with $1,000 for Britons, $430 for Germans, and $340 for Japanese.
Despite spending far less on defense, the United States' traditional allies have little trouble protecting their vital interests. No state credibly threatens the territorial integrity of either western European countries or Japan, and U.S. allies do not need independent power-projection capabilities to protect their homelands. NATO's intervention in Libya has been flawed in many respects, but it has demonstrated that European member states are capable of conducting complex military operations with the United States playing a secondary role. Going forward, U.S. retrenchment would compel U.S. allies to improve their existing capabilities and bear the costs of their altruistic impulses.
The United States and its allies have basically the same goals: democracy, stability, and trade. But the United States is in the awkward position of both being spread too thin around the globe and irritating many states by its presence on, or near, their soil. Delegating some of its responsibilities to allies would permit the U.S. government to focus more on critical objectives, such as ensuring a stable and prosperous economy. Regional partners, who have a greater stake in and knowledge of local challenges, can take on more responsibility. With increased input from others and a less invasive presence, retrenchment would also allow the United States to restore some luster to its leadership.
A MORE FRUGAL FUTURE
To implement a retrenchment policy, the United States would have to take three main steps: reduce its global military footprint, change the size and composition of the U.S. military, and use the resulting "retrenchment dividend" to foster economic recovery at home.
First, the United States must reconsider its forward deployments. The top priority should be to deter aggression against its main economic partners in Europe and Asia. This task is not especially burdensome; there are few credible threats to U.S. allies in these regions, and these states need little help from the United States.
Although Russia continues to meddle in its near abroad and has employed oil and gas embargoes to coerce its immediate neighbors, western Europe's resources are more than sufficient to counter an assertive Russia. A more autonomous Europe would take some time to develop a coherent security and defense policy and would not always see events through the same lens as Washington. But reducing Europe's dependence on the United States would create a strong incentive for European states to spend more on defense, modernize their forces, and better integrate their policies and capabilities. U.S. forces in the European theater could safely be reduced by 40-50 percent without compromising European security.
Asia is also ready for a decreased U.S. military presence, and Washington should begin gradually withdrawing its troops. Although China has embarked on an ambitious policy of military modernization and engages in periodic saber rattling in the South China Sea, its ability to project power remains limited. Japan and South Korea are already shouldering greater defense burdens than they were during the Cold War. India, the Philippines, and Vietnam are eager to forge strategic partnerships with the United States. Given the shared interest in promoting regional security, these ties could be sustained through bilateral political and economic agreements, instead of the indefinite deployments and open-ended commitments of the Cold War.
In the event that China becomes domineering, U.S. allies on its borders will act as a natural early warning system and a first line of defense, as well as provide logistical hubs and financial support for any necessary U.S. responses. Yet such a state of affairs is hardly inevitable. For now, there are many less expensive alternatives that can strengthen the current line of defense, such as technology transfers, arms sales, and diplomatic mediation. Defending the territorial integrity of Japan and South Korea and preventing Chinese or North Korean adventurism demands rapid-response forces with strong reserves, not the 30,000 soldiers currently stationed in each country. Phasing out 20 percent of those forces while repositioning others to Guam or Hawaii would achieve the same results more efficiently.
Reducing these overseas commitments would produce significant savings. A bipartisan task force report published in 2010 by the Project on Defense Alternatives estimated that the demobilization of 50,000 active-duty soldiers in Europe and Asia alone could save as much as $12 billion a year. Shrinking the U.S. footprint would also generate indirect savings in the form of decreased personnel, maintenance, and equipment costs.
Retrenchment would also require the United States to minimize its presence in South Asia and the Middle East. The United States has an interest in ensuring the flow of cheap oil, yet armed interventions and forward deployments are hardly the best ways to achieve that goal. These actions have radicalized local populations, provided attractive targets for terrorists, destabilized oil markets, and inflamed the suspicions of regional rivals such as Iran. Similarly, the United States has a strong incentive to deny terrorist groups safe havens in ungoverned spaces. It is unclear, however, whether large troop deployments are the most cost-effective way to do so. The U.S.-led NATO mission in Afghanistan has established temporary pockets of stability, but it has enjoyed little success in promoting good governance, stamping out corruption, or eradicating the most dangerous militant networks. Nor have boots on the ground improved relations with or politics in Pakistan.
More broadly, the Pentagon should devote fewer resources to maintaining and developing its capabilities for engaging in peripheral conflicts, such as the war in Afghanistan. Nation building and counterinsurgency operations have a place in U.S. defense planning, but not a large one. The wars in Afghanistan and Iraq have raised the profile of counterinsurgency doctrine and brought prominence to its advocates and practitioners, such as David Petraeus, the retired general who is now director of the CIA. This is an understandable development, considering that the defense establishment was previously unprepared to wage a counterinsurgency war. But such conflicts require enormous commitments of blood and treasure over many years, rarely result in decisive victory, and seldom bring tangible rewards. A retrenching United States would sidestep such high-risk, low-return endeavors, especially when counterterrorism and domestic law enforcement and security measures have proved to be effective alternatives. Although they cannot solve every problem, relatively small forces that do not require massive bases can nevertheless carry out significant strikes -- as evidenced by the operation that killed Osama bin Laden.
Curbing the United States' commitments would reduce risks, but it cannot eliminate them. Adversaries may fill regional power vacuums, and allies will never behave exactly as Washington would prefer. Yet those costs would be outweighed by the concrete benefits of pulling back. A focus on the United States' core interests in western Europe would limit the risk of catastrophic clashes with Russia over ethnic enclaves in Georgia or Moldova by allowing the United States to avoid commitments it would be unwise to honor. By narrowing its commitments in Asia, the United States could lessen the likelihood of conflict over issues such as the status of Taiwan or competing maritime claims in the South China Sea. Just as the United Kingdom tempered its commitments and accommodated U.S. interests in the Western Hemisphere at the turn of the last century, the United States should now temper its commitments and cultivate a lasting compromise with China over Taiwan.
Disassociating itself from unsavory regimes in the Middle East would insulate the United States from the charges of hypocrisy that undermine public support for its foreign policy throughout the region. And an accelerated drawdown of the wars in Afghanistan and Iraq would save a considerable amount of money. The current request for $118 billion to support these operations represents a savings of $42 billion compared with last year. Moving even faster to end those conflicts would result in even larger savings. At a time when the U.S. government is under incredible pressure to justify big-ticket spending, what little return on investment these wars promise does not warrant any more patience -- or sacrifice.
FROM PROFLIGACY TO PRUDENCE
The second necessary step for retrenchment would be to change the size and composition of U.S. military forces. Despite Gates' best efforts, the 2012 defense budget remains stuffed with allocations for weapons systems of debatable strategic value. For instance, despite delays, cost overruns, failed or deferred tests, and opposition from U.S. allies, the Obama administration has pledged more than $10 billion for various ballistic missile defense systems and close to another $10 billion to fund the F-35 Joint Strike Fighter. And such programs are merely the low-hanging fruit. A nonpartisan task force of leading experts convened by the Institute for Policy Studies recently concluded that the U.S. government could slash more than $77 billion from the 2012 defense budget across eight different programs. New submarines and preliminary payments on what would be the 11th U.S. aircraft carrier -- no other country has more than one -- cannot be the best way to spend $5 billion. Likewise, spending $100 billion over the next ten years to upgrade U.S. nuclear weapons will not alter any adversary's calculations in a positive way.
Deeper defense cuts would force the Pentagon to do what the rest of the United States is already doing: rethink the country's role in a changing world. One problem with present procurement plans is that the strategic rationale underlying certain goals -- a 320-ship fleet for the navy, 2,200 fighter aircraft for the air force -- remains murky. Protecting international trade routes against Chinese aggression is often cited as the justification for such military programs. But precisely how the United States is going to protect its economy by clashing with its third-largest trading partner is rarely explained.
The lack of a clear assessment of the costs and benefits of new weapons systems may also lead to expensive errors. The United States already has an immense lead in aircraft carriers, fourth-generation jet fighters, and mechanized land forces. There are few reasons to squander resources replacing weapons systems that already surpass those of every single rival. Moreover, the fast pace of technological change, in particular when it comes to advanced antiship and air defense capabilities, casts doubt on the wisdom of pouring money into systems that might be obsolete the moment they roll off assembly lines.
In contrast, a modest investment in proven capabilities would bolster U.S. defenses in core regions and give the United States maximum flexibility to respond to future threats. To this end, investments should continue in theater- and naval-based ballistic missile defense systems, which remain the best ways to protect U.S. allies against missile threats. The Pentagon should acquire cheap alternatives to existing systems, such as unmanned aerial vehicles, in large numbers. Congress should continue to fund research and development, but only enough to ensure that new technologies could be produced promptly when clear and present needs arise. These changes in procurement, combined with a slightly swifter drawdown in Afghanistan and Iraq and a somewhat smaller U.S. Army and Marine Corps, would save the United States a minimum of $90 billion annually.
Savings of that kind would be part of a retrenchment dividend that could be spent on reinvigorating the U.S. economy. Retrenchment begins with the curtailing of foreign policy resources, but it ends only when the resources saved are spent domestically. Although military expenditures are a productive investment, they are not infinitely or incomparably so. And the United States is already past the point of diminishing returns when it comes to defense spending. Washington should prioritize measures to more directly stimulate the U.S. economy and make it more competitive. How exactly to achieve that outcome will surely continue to be the subject of fierce debate. But that debate will be much more meaningful if it is conducted with the aim of investing a retrenchment dividend.
The modest decline of U.S. power, combined with a relatively benign international environment, has provided the United States with a unique opportunity to reduce its foreign policy commitments in a measured manner. To make a virtue of this necessity, policymakers in Washington must resist calls to tighten the United States' tenuous grasp on global affairs, ignore the stale warnings about eroded credibility, and overcome the tired protests of bloated bureaucracies. By reducing its forward deployments, sharing burdens with its allies, limiting its fights in peripheral territories, and paring back wasteful spending on unnecessary weapons, the United States can not only slow its decline but also sow the seeds of its recovery.
Monday, November 21, 2011
David Frum on The Troubled GOP
I've spent a lot of time looking for a sign of the honorable side of conservatism. I have always felt that the best political policy is born from the healthy tension between principled conservatism and principled liberalism. Over the last several years, and building to a fever pitch since the election of Barack Obama, the most insidious usurpation of reasoned conservatism has gripped the throat of the US and brought about a completely dysfunctional political stalemate devoid of any sign of the "art of the possible".
David Frum, formerly of the AEI and a former advisor to George W. Bush and Rudy Giuliani, laments his party's break with reality-based politics. Good read.
"What if [Obama] is so outside our comprehension that only if you understand Kenyan, anti-colonial behavior can you begin to piece together [his actions]?" Newt Gingrich
It’s a very strange experience to have your friends think you’ve gone crazy. Some will tell you so. Others will indulgently humor you. Still others will avoid you. More than a few will demand that the authorities do something to get you off the streets. During one unpleasant moment after I was fired from the think tank where I’d worked for the previous seven years, I tried to reassure my wife with an old cliché: “The great thing about an experience like this is that you learn who your friends really are.” She answered, “I was happier when I didn’t know.”
It’s possible that my friends are right. I don’t think so—but then, crazy people never do. So let me put the case to you.
I’ve been a Republican all my adult life. I have worked on the editorial page of The Wall Street Journal, at Forbes magazine, at the Manhattan and American Enterprise Institutes, as a speechwriter in the George W. Bush administration. I believe in free markets, low taxes, reasonable regulation, and limited government. I voted for John McCain in 2008, and I have strongly criticized the major policy decisions of the Obama administration. But as I contemplate my party and my movement in 2011, I see things I simply cannot support.
America desperately needs a responsible and compassionate alternative to the Obama administration’s path of bigger government at higher cost. And yet: This past summer, the GOP nearly forced America to the verge of default just to score a point in a budget debate. In the throes of the worst economic crisis since the Depression, Republican politicians demand massive budget cuts and shrug off the concerns of the unemployed. In the face of evidence of dwindling upward mobility and long-stagnating middle-class wages, my party’s economic ideas sometimes seem to have shrunk to just one: more tax cuts for the very highest earners. When I entered Republican politics, during an earlier period of malaise, in the late seventies and early eighties, the movement got most of the big questions—crime, inflation, the Cold War—right. This time, the party is getting the big questions disastrously wrong.
It was not so long ago that Texas governor Bush denounced attempts to cut the earned-income tax credit as “balancing the budget on the backs of the poor.” By 2011, Republican commentators were noisily complaining that the poorer half of society are “lucky duckies” because the EITC offsets their federal tax obligations—or because the recession had left them with such meager incomes that they had no tax to pay in the first place. In 2000, candidate Bush routinely invoked “churches, synagogues, and mosques.” By 2010, prominent Republicans were denouncing the construction of a mosque in lower Manhattan as an outrageous insult. In 2003, President Bush and a Republican majority in Congress enacted a new prescription-drug program in Medicare. By 2011, all but four Republicans in the House and five in the Senate were voting to withdraw the Medicare guarantee from everybody under age 55. Today, the Fed’s pushing down interest rates in hopes of igniting economic growth is close to treason, according to Governor Rick Perry, coyly seconded by The Wall Street Journal. In 2000, the same policy qualified Alan Greenspan as the “greatest central banker in the history of the world,” according to Perry’s mentor, Senator Phil Gramm. Today, health reform that combines regulation of private insurance, individual mandates, and subsidies for those who need them is considered unconstitutional and an open invitation to “death panels.” A dozen years ago, a very similar reform was the Senate Republican alternative to Hillarycare. Today, stimulative fiscal policy that includes tax cuts for almost every American is “socialism.” In 2001, stimulative fiscal policy that included tax cuts for rather fewer Americans was an economic-recovery program.
I can’t shrug off this flight from reality and responsibility as somebody else’s problem. I belonged to this movement; I helped to make the mess. People may very well say: Hey, wait a minute, didn’t you work in the George W. Bush administration that disappointed so many people in so many ways? What qualifies you to dispense advice to anybody else?
Fair question. I am haunted by the Bush experience, although it seems almost presumptuous for someone who played such a minor role to feel so much unease. The people who made the big decisions certainly seem to sleep well enough. Yet there is also the chance for something positive to come out of it all. True, some of my colleagues emerged from those years eager to revenge themselves and escalate political conflict: “They send one of ours to the hospital, we send two of theirs to the morgue.” I came out thinking, I want no more part of this cycle of revenge. For the past half-dozen years, I have been arguing that we conservatives need to follow a different course. And it is this argument that has led so many of my friends to demand, sometimes bemusedly, sometimes angrily, “What the hell happened to you?” I could fire the same question back: “Never mind me—what happened to you?”
"If we took away the minimum wage—if conceivably it was gone—we could potentially virtually wipe out unemployment completely." Michele Bachmann
So what did happen? The first decade of the 21st century was a crazy bookend to the twentieth, opening with a second Pearl Harbor and ending with a second Great Crash, with a second Vietnam wedged in between. Now we seem caught in the coils of a second Great Depression. These shocks radicalized the political system, damaging hawkish Democrats like Hillary Clinton in the Bush years and then driving Republicans to dust off the economics of Ayn Rand.
Some liberals suspect that the conservative changes of mind since 2008 are opportunistic and cynical. It’s true that cynicism is never entirely absent from politics: I won’t soon forget the lupine smile that played about the lips of the leader of one prominent conservative institution as he told me, “Our donors truly think the apocalypse has arrived.” Yet conscious cynicism is much rarer than you might suppose. Few of us have the self-knowledge and emotional discipline to say one thing while meaning another. If we say something often enough, we come to believe it. We don’t usually delude others until after we have first deluded ourselves. Some of the smartest and most sophisticated people I know—canny investors, erudite authors—sincerely and passionately believe that President Barack Obama has gone far beyond conventional American liberalism and is willfully and relentlessly driving the United States down the road to socialism. No counterevidence will dissuade them from this belief: not record-high corporate profits, not almost 500,000 job losses in the public sector, not the lowest tax rates since the Truman administration. It is not easy to fit this belief alongside the equally strongly held belief that the president is a pitiful, bumbling amateur, dazed and overwhelmed by a job too big for him—and yet that is done too.
Conservatives have been driven to these fevered anxieties as much by their own trauma as by external events. In the aughts, Republicans held more power for longer than at any time since the twenties, yet the result was the weakest and least broadly shared economic expansion since World War II, followed by an economic crash and prolonged slump. Along the way, the GOP suffered two severe election defeats in 2006 and 2008. Imagine yourself a rank-and-file Republican in 2009: If you have not lost your job or your home, your savings have been sliced and your children cannot find work. Your retirement prospects have dimmed. Most of all, your neighbors blame you for all that has gone wrong in the country. There’s one thing you know for sure: None of this is your fault! And when the new president fails to deliver rapid recovery, he can be designated the target for everyone’s accumulated disappointment and rage. In the midst of economic wreckage, what relief to thrust all blame upon Barack Obama as the wrecker-in-chief.
The Bush years cannot be repudiated, but the memory of them can be discarded to make way for a new and more radical ideology, assembled from bits of the old GOP platform that were once sublimated by the party elites but now roam the land freely: ultralibertarianism, crank monetary theories, populist fury, and paranoid visions of a Democratic Party controlled by ACORN and the New Black Panthers. For the past three years, the media have praised the enthusiasm and energy the tea party has brought to the GOP. Yet it’s telling that that movement has failed time and again to produce even a remotely credible candidate for president. Sarah Palin, Donald Trump, Michele Bachmann, Rick Perry, Herman Cain, Newt Gingrich: The list of tea-party candidates reads like the early history of the U.S. space program, a series of humiliating fizzles and explosions that never achieved liftoff. A political movement that never took governing seriously was exploited by a succession of political entrepreneurs uninterested in governing—but all too interested in merchandising. Much as viewers tune in to American Idol to laugh at the inept, borderline dysfunctional early auditions, these tea-party champions provide a ghoulish type of news entertainment each time they reveal that they know nothing about public affairs and have never attempted to learn. But Cain’s gaffe on Libya and Perry’s brain freeze on the Department of Energy are not only indicators of bad leadership. They are indicators of a crisis of followership. The tea party never demanded knowledge or concern for governance, and so of course it never got them.
Many hope that the tea-party mood is just a passing mania, eventually to subside into something more like the businessperson’s Republicanism practiced in the nineties by governors and mayors like George Pataki and Rudy Giuliani, Christine Todd Whitman and Dick Riordan, Tommy Thompson and John Engler. This hope tends to coalesce around the candidacies of Mitt Romney and Jon Huntsman, two smart and well-informed former governors who eschew the strident rhetoric of the tea party and who have thereby earned its deep distrust. But there are good reasons to fear that the ebbing of Republican radicalism remains far off, even if Romney (or Huntsman) does capture the White House next year.
"[Obama] grew up in a privileged way. He never had to really work for anything; he never had to go through what Americans are going through." Rick Perry
1. Fiscal Austerity and Economic Stagnation
We have entered an era in which politics increasingly revolves around the ugly question of who will bear how much pain. Conservative constituencies already see themselves as aggrieved victims of American government: They are the people who pay the taxes even as their “earned” benefits are siphoned off to provide welfare for the undeserving. The reality is, however, that the big winners in the American fiscal system are the rich, the old, the rural, and veterans—typically conservative constituencies. Squeezing the programs conservatives most dislike—PBS, the National Endowment for the Humanities, tax credits for the poor, the Department of Education, etc.—yields relatively little money. Any serious move to balance the budget, or even just reduce the deficit a little, must inevitably cut programs conservative voters do like: Medicare for current beneficiaries, farm subsidies, veterans’ benefits, and big tax loopholes like the mortgage-interest deduction and employer-provided health benefits. The rank and file of the GOP are therefore caught between their interests and their ideology—intensifying their suspicion that shadowy Washington elites are playing dirty tricks upon them.
2. Ethnic Competition
White America has been plunged into a mood of pessimism and anger since 2008. Ron Brownstein reports in the National Journal: “63 percent of African-Americans and 54 percent of Hispanics said they expected their children to exceed their standard of living. Even college-educated whites are less optimistic (only about two-fifths agree). But the noncollege whites are the gloomiest: Just one-third of them think their kids will live better than they do; an equal number think their children won’t even match their living standard. No other group is nearly that negative.” Those fears are not irrational. In postrecession America, employers seem to show a distinct preference for foreign-born workers. Eighty percent of the net new jobs created in the state of Texas since 2009 went to the foreign-born. Nationwide, foreign-born workers have experienced a net 4 percent increase in employment since January 2009, while native-born workers have seen continuing employment declines. Which may explain why President Obama’s approval rating among whites slipped to 41 percent in January 2010 and is now testing a new low of 33 percent. The president’s name and skin color symbolize the emergence of a new America in which many older-stock Americans intuit they will be left behind.
It is precisely these disaffected whites—especially those who didn’t go to college—who form the Republican voting base. John McCain got 58 percent of noncollege-white votes in 2008. The GOP polls even higher among that group today, but the party can only sustain those numbers as long as it gives voice to alienation. Birtherism, the claim that President Obama was not born in the United States, expressed the feeling of many that power has shifted into alien hands. That feeling will not be easily quelled by Republican electoral success, because it is based on a deep sense of dispossession and disinheritance.
3. Fox News and Talk Radio
Extremism and conflict make for bad politics but great TV. Over the past two decades, conservatism has evolved from a political philosophy into a market segment. An industry has grown up to serve that segment—and its stars have become the true thought leaders of the conservative world. The business model of the conservative media is built on two elements: provoking the audience into a fever of indignation (to keep them watching) and fomenting mistrust of all other information sources (so that they never change the channel). As a commercial proposition, this model has worked brilliantly in the Obama era. As journalism, not so much. As a tool of political mobilization, it backfires, by inciting followers to the point at which they force leaders into confrontations where everybody loses, like the summertime showdown over the debt ceiling.
But the thought leaders on talk radio and Fox do more than shape opinion. Backed by their own wing of the book-publishing industry and supported by think tanks that increasingly function as public-relations agencies, conservatives have built a whole alternative knowledge system, with its own facts, its own history, its own laws of economics. Outside this alternative reality, the United States is a country dominated by a strong Christian religiosity. Within it, Christians are a persecuted minority. Outside the system, President Obama—whatever his policy errors—is a figure of imposing intellect and dignity. Within the system, he’s a pitiful nothing, unable to speak without a teleprompter, an affirmative-action phony doomed to inevitable defeat. Outside the system, social scientists worry that the U.S. is hardening into one of the most rigid class societies in the Western world, in which the children of the poor have less chance of escape than in France, Germany, or even England. Inside the system, the U.S. remains (to borrow the words of Senator Marco Rubio) “the only place in the world where it doesn’t matter who your parents were or where you came from.”
"I'm ready for the gotcha questions...and when they ask me who is the president of Uzbeki-beki-beki-beki-stan-stan I'm gonna say, you know, I don't know." Herman Cain
We used to say “You’re entitled to your own opinion, but not to your own facts.” Now we are all entitled to our own facts, and conservative media use this right to immerse their audience in a total environment of pseudo-facts and pretend information.
When contemplating the ruthless brilliance of this system, it’s tempting to fall back on the theory that the GOP is masterminded by a cadre of sinister billionaires, deftly manipulating the political process for their own benefit. The billionaires do exist, and some do indeed attempt to influence the political process. The bizarre fiasco of campaign-finance reform has perversely empowered them to give unlimited funds anonymously to special entities that can spend limitlessly. (Thanks, Senator McCain! Nice job, Senator Feingold!) Yet, for the most part, these Republican billionaires are not acting cynically. They watch Fox News too, and they’re gripped by the same apocalyptic fears as the Republican base. In funding the tea-party movement, they are actually acting against their own longer-term interests, for it is the richest who have the most interest in political stability, which depends upon broad societal agreement that the existing distribution of rewards is fair and reasonable. If the social order comes to seem unjust to large numbers of people, what happens next will make Occupy Wall Street look like a street fair.
Over the past few years, I have left this alternative knowledge system behind me. What is that experience like? A personal story may be relevant here.
Through the debate over health-care reform in 2009–10, I urged that Republicans try to reach some kind of deal. The Democrats had the votes to pass something. They could not afford to lose. Providing health coverage to all is a worthy goal, and the core mechanisms of what we called Obamacare should not have been obnoxious to Republicans. In fact, they were drawn from past Republican plans. Democrats were so eager for Republican votes to provide bipartisan cover that they might well have paid a substantial price to get them, including dropping the surtaxes on work and investment that supposedly financed the Affordable Care Act. My urgings went unheeded, obviously. Senator Jim DeMint predicted that health care would become Obama’s Waterloo, the decisive defeat that would destroy his presidency, and Republicans accepted DeMint’s counsel. So they bet everything—and lost everything. A major new entitlement has been written into law, financed by redistributive new taxes. Changes in the bill that could have been had for the asking will now require years of slow, painful legislative effort, if they ever come at all. Republicans hope that the Supreme Court will overturn the Affordable Care Act. Such a decision would be the most dramatic assertion of judicial power since the thirties, and for that reason alone seems improbable. Yet absent action by the Supreme Court, outright repeal of President Obama’s health-care law is a mirage, requiring not only 60 votes in the Senate but also the withdrawal of benefits that the American people will have gotten used to by 2013.
On the day of the House vote that ensured the enactment of health-care reform, I wrote a blog post saying all this—and calling for some accountability for those who had led the GOP to this disaster. For my trouble, I was denounced the next day by my former colleagues at The Wall Street Journal as a turncoat. Three days after that, I was dismissed from the American Enterprise Institute. I’m not a solitary case: In 2005, the economist Bruce Bartlett, a main legislative author of the Kemp-Roth tax cut, was fired from a think tank in Dallas for too loudly denouncing the George W. Bush administration’s record, and I could tell equivalent stories about other major conservative think tanks as well.
I don’t complain from a personal point of view. Happily, I had other economic resources to fall back upon. But the message sent to others with less security was clear: We don’t pay you to think, we pay you to repeat. For myself, the main consequences have been more comic than anything else. Back in 2009, I wrote a piece for Newsweek arguing that Republicans would regret conceding so much power to Rush Limbaugh. Until that point, I’d been a frequent guest on Fox News, but thenceforward some kind of fatwa was laid down upon me. Over the next few months, I’d occasionally receive morning calls from young TV bookers asking if I was available to appear that day. For sport, I’d always answer, “I’m available—but does your senior producer know you’ve called me?” An hour later, I’d receive an embarrassed second call: “We’ve decided to go in a different direction.” Earlier this year, I did some volunteer speechwriting for a Republican contemplating a presidential run. My involvement was treated as a dangerous secret, involving discreet visits to hotel suites at odd hours. Thus are political movements held together. But thus is not how movements grow and govern.
Some call this the closing of the conservative mind. Alas, the conservative mind has proved itself only too open, these past years, to all manner of intellectual pollen. Call it instead the drying up of conservative creativity. It’s clearly true that the country faces daunting economic troubles. It’s also true that the wrong answers to those problems will push the United States toward a future of too much government, too many taxes, and too much regulation. It’s the job of conservatives in this crisis to show a better way. But it’s one thing to point out (accurately) that President Obama’s stimulus plan was mostly a compilation of antique Democratic wish lists, and quite another to argue that the correct response to the worst collapse since the thirties is to wait for the economy to get better on its own. It’s one thing to worry (wisely) about the long-term trend in government spending, and another to demand big, immediate cuts when 25 million are out of full-time work and the government can borrow for ten years at 2 percent. It’s a duty to scrutinize the actions and decisions of the incumbent administration, but an abuse to use the filibuster as a routine tool of legislation or to prevent dozens of presidential appointments from even coming to a vote. It’s fine to be unconcerned that the rich are getting richer, but blind to deny that middle-class wages have stagnated or worse over the past dozen years. In the aftershock of 2008, large numbers of Americans feel exploited and abused. Rather than workable solutions, my party is offering low taxes for the currently rich and high spending for the currently old, to be followed by who-knows-what and who-the-hell-cares. This isn’t conservatism; it’s a going-out-of-business sale for the baby-boom generation.
I refuse to believe that I am the only Republican who feels this way. If CNN’s most recent polling is correct, only half of us sympathize with the tea party. However, moderate-minded people dislike conflict—and thus tend to lose to people who relish conflict. The most extreme voices in the GOP now denounce everybody else as Republicans in Name Only. But who elected them as the GOP’s membership committee? What have they done to deserve such an inheritance? In the mid-sixties, when the party split spectacularly between Ripon Republicans, who embraced the civil-rights movement, and Goldwater Republicans, who opposed it, civil-rights Republicans like Michigan governor George Romney spoke forcefully for their point of view. Today, Republicans discomfited by political and media extremism bite their tongues. But if they don’t speak up, they’ll be whipsawed into a choice between an Obama administration that wants to build a permanently bigger government and a conservative movement content with permanently outraged opposition.
This is, unfortunately, not merely a concern for Republican voters. The conservative shift to ever more extreme, ever more fantasy-based ideology has ominous real-world consequences for American society. The American system of government can’t work if the two sides wage all-out war upon each other: House, Senate, president, each has the power to thwart the others. In prior generations, the system evolved norms and habits to prevent this kind of stonewalling. For example: Theoretically, the party that holds the Senate could refuse to confirm any Cabinet nominees of a president of the other party. Yet until recently, this just “wasn’t done.” In fact, quite a lot of things that theoretically could be done just “weren’t done.” Now old inhibitions have given way. Things that weren’t done suddenly are done.
We can debate when the slide began. But what seems beyond argument is that the U.S. political system becomes more polarized and more dysfunctional every cycle, at greater and greater human cost. The next Republican president will surely find himself or herself at least as stymied by this dysfunction as President Obama, as will the people the political system supposedly serves, who must feel they have been subjected to a psychological experiment gone horribly wrong, pressing the red button in 2004 and getting a zap, pressing blue in 2008 for another zap, and now agonizing whether there is any choice that won’t zap them again in 2012. Yet in the interests of avoiding false evenhandedness, it must be admitted: The party with a stronger charge on its zapper right now, the party struggling with more self-imposed obstacles to responsible governance, the party most in need of a course correction, is the Republican Party. Changing that party will be the fight of a political lifetime. But a great political party is worth fighting for.
http://nymag.com/print/?/news/politics/conservatives-david-frum-2011-11/
David Frum, formerly of the AEI and a former advisor to George W. Bush and Rudy Giuliani, laments his party's break with reality-based politics. Good read.
"What if [Obama] is so outside our comprehension that only if you understand Kenyan, anti-colonial behavior can you begin to piece together [his actions]?" Newt Gingrich
It’s a very strange experience to have your friends think you’ve gone crazy. Some will tell you so. Others will indulgently humor you. Still others will avoid you. More than a few will demand that the authorities do something to get you off the streets. During one unpleasant moment after I was fired from the think tank where I’d worked for the previous seven years, I tried to reassure my wife with an old cliché: “The great thing about an experience like this is that you learn who your friends really are.” She answered, “I was happier when I didn’t know.”
It’s possible that my friends are right. I don’t think so—but then, crazy people never do. So let me put the case to you.
I’ve been a Republican all my adult life. I have worked on the editorial page of The Wall Street Journal, at Forbes magazine, at the Manhattan and American Enterprise Institutes, as a speechwriter in the George W. Bush administration. I believe in free markets, low taxes, reasonable regulation, and limited government. I voted for John McCain in 2008, and I have strongly criticized the major policy decisions of the Obama administration. But as I contemplate my party and my movement in 2011, I see things I simply cannot support.
America desperately needs a responsible and compassionate alternative to the Obama administration’s path of bigger government at higher cost. And yet: This past summer, the GOP nearly forced America to the verge of default just to score a point in a budget debate. In the throes of the worst economic crisis since the Depression, Republican politicians demand massive budget cuts and shrug off the concerns of the unemployed. In the face of evidence of dwindling upward mobility and long-stagnating middle-class wages, my party’s economic ideas sometimes seem to have shrunk to just one: more tax cuts for the very highest earners. When I entered Republican politics, during an earlier period of malaise, in the late seventies and early eighties, the movement got most of the big questions—crime, inflation, the Cold War—right. This time, the party is getting the big questions disastrously wrong.
It was not so long ago that Texas governor Bush denounced attempts to cut the earned-income tax credit as “balancing the budget on the backs of the poor.” By 2011, Republican commentators were noisily complaining that the poorer half of society are “lucky duckies” because the EITC offsets their federal tax obligations—or because the recession had left them with such meager incomes that they had no tax to pay in the first place. In 2000, candidate Bush routinely invoked “churches, synagogues, and mosques.” By 2010, prominent Republicans were denouncing the construction of a mosque in lower Manhattan as an outrageous insult. In 2003, President Bush and a Republican majority in Congress enacted a new prescription-drug program in Medicare. By 2011, all but four Republicans in the House and five in the Senate were voting to withdraw the Medicare guarantee from everybody under age 55. Today, the Fed’s pushing down interest rates in hopes of igniting economic growth is close to treason, according to Governor Rick Perry, coyly seconded by The Wall Street Journal. In 2000, the same policy qualified Alan Greenspan as the “greatest central banker in the history of the world,” according to Perry’s mentor, Senator Phil Gramm. Today, health reform that combines regulation of private insurance, individual mandates, and subsidies for those who need them is considered unconstitutional and an open invitation to “death panels.” A dozen years ago, a very similar reform was the Senate Republican alternative to Hillarycare. Today, stimulative fiscal policy that includes tax cuts for almost every American is “socialism.” In 2001, stimulative fiscal policy that included tax cuts for rather fewer Americans was an economic-recovery program.
I can’t shrug off this flight from reality and responsibility as somebody else’s problem. I belonged to this movement; I helped to make the mess. People may very well say: Hey, wait a minute, didn’t you work in the George W. Bush administration that disappointed so many people in so many ways? What qualifies you to dispense advice to anybody else?
Fair question. I am haunted by the Bush experience, although it seems almost presumptuous for someone who played such a minor role to feel so much unease. The people who made the big decisions certainly seem to sleep well enough. Yet there is also the chance for something positive to come out of it all. True, some of my colleagues emerged from those years eager to revenge themselves and escalate political conflict: “They send one of ours to the hospital, we send two of theirs to the morgue.” I came out thinking, I want no more part of this cycle of revenge. For the past half-dozen years, I have been arguing that we conservatives need to follow a different course. And it is this argument that has led so many of my friends to demand, sometimes bemusedly, sometimes angrily, “What the hell happened to you?” I could fire the same question back: “Never mind me—what happened to you?”
"If we took away the minimum wage—if conceivably it was gone—we could potentially virtually wipe out unemployment completely." Michele Bachmann
So what did happen? The first decade of the 21st century was a crazy bookend to the twentieth, opening with a second Pearl Harbor and ending with a second Great Crash, with a second Vietnam wedged in between. Now we seem caught in the coils of a second Great Depression. These shocks radicalized the political system, damaging hawkish Democrats like Hillary Clinton in the Bush years and then driving Republicans to dust off the economics of Ayn Rand.
Some liberals suspect that the conservative changes of mind since 2008 are opportunistic and cynical. It’s true that cynicism is never entirely absent from politics: I won’t soon forget the lupine smile that played about the lips of the leader of one prominent conservative institution as he told me, “Our donors truly think the apocalypse has arrived.” Yet conscious cynicism is much rarer than you might suppose. Few of us have the self-knowledge and emotional discipline to say one thing while meaning another. If we say something often enough, we come to believe it. We don’t usually delude others until after we have first deluded ourselves. Some of the smartest and most sophisticated people I know—canny investors, erudite authors—sincerely and passionately believe that President Barack Obama has gone far beyond conventional American liberalism and is willfully and relentlessly driving the United States down the road to socialism. No counterevidence will dissuade them from this belief: not record-high corporate profits, not almost 500,000 job losses in the public sector, not the lowest tax rates since the Truman administration. It is not easy to fit this belief alongside the equally strongly held belief that the president is a pitiful, bumbling amateur, dazed and overwhelmed by a job too big for him—and yet that is done too.
On the day of the House vote that ensured the enactment of health-care reform, I wrote a blog post saying all this—and calling for some accountability for those who had led the GOP to this disaster. For my trouble, I was denounced the next day by my former colleagues at The Wall Street Journal as a turncoat. Three days after that, I was dismissed from the American Enterprise Institute. I’m not a solitary case: In 2005, the economist Bruce Bartlett, a main legislative author of the Kemp-Roth tax cut, was fired from a think tank in Dallas for too loudly denouncing the George W. Bush administration’s record, and I could tell equivalent stories about other major conservative think tanks as well.
I don’t complain from a personal point of view. Happily, I had other economic resources to fall back upon. But the message sent to others with less security was clear: We don’t pay you to think, we pay you to repeat. For myself, the main consequences have been more comic than anything else. Back in 2009, I wrote a piece for Newsweek arguing that Republicans would regret conceding so much power to Rush Limbaugh. Until that point, I’d been a frequent guest on Fox News, but thenceforward some kind of fatwa was laid down upon me. Over the next few months, I’d occasionally receive morning calls from young TV bookers asking if I was available to appear that day. For sport, I’d always answer, “I’m available—but does your senior producer know you’ve called me?” An hour later, I’d receive an embarrassed second call: “We’ve decided to go in a different direction.” Earlier this year, I did some volunteer speechwriting for a Republican contemplating a presidential run. My involvement was treated as a dangerous secret, involving discreet visits to hotel suites at odd hours. Thus are political movements held together. But thus is not how movements grow and govern.
Some call this the closing of the conservative mind. Alas, the conservative mind has proved itself only too open, these past years, to all manner of intellectual pollen. Call it instead the drying up of conservative creativity. It’s clearly true that the country faces daunting economic troubles. It’s also true that the wrong answers to those problems will push the United States toward a future of too much government, too many taxes, and too much regulation. It’s the job of conservatives in this crisis to show a better way. But it’s one thing to point out (accurately) that President Obama’s stimulus plan was mostly a compilation of antique Democratic wish lists, and quite another to argue that the correct response to the worst collapse since the thirties is to wait for the economy to get better on its own. It’s one thing to worry (wisely) about the long-term trend in government spending, and another to demand big, immediate cuts when 25 million are out of full-time work and the government can borrow for ten years at 2 percent. It’s a duty to scrutinize the actions and decisions of the incumbent administration, but an abuse to use the filibuster as a routine tool of legislation or to prevent dozens of presidential appointments from even coming to a vote. It’s fine to be unconcerned that the rich are getting richer, but blind to deny that middle-class wages have stagnated or worse over the past dozen years. In the aftershock of 2008, large numbers of Americans feel exploited and abused. Rather than workable solutions, my party is offering low taxes for the currently rich and high spending for the currently old, to be followed by who-knows-what and who-the-hell-cares. This isn’t conservatism; it’s a going-out-of-business sale for the baby-boom generation.
I refuse to believe that I am the only Republican who feels this way. If CNN’s most recent polling is correct, only half of us sympathize with the tea party. However, moderate-minded people dislike conflict—and thus tend to lose to people who relish conflict. The most extreme voices in the GOP now denounce everybody else as Republicans in Name Only. But who elected them as the GOP’s membership committee? What have they done to deserve such an inheritance? In the mid-sixties, when the party split spectacularly between Ripon Republicans, who embraced the civil-rights movement, and Goldwater Republicans, who opposed it, civil-rights Republicans like Michigan governor George Romney spoke forcefully for their point of view. Today, Republicans discomfited by political and media extremism bite their tongues. But if they don’t speak up, they’ll be whipsawed into a choice between an Obama administration that wants to build a permanently bigger government and a conservative movement content with permanently outraged opposition.
This is, unfortunately, not merely a concern for Republican voters. The conservative shift to ever more extreme, ever more fantasy-based ideology has ominous real-world consequences for American society. The American system of government can’t work if the two sides wage all-out war upon each other: House, Senate, president, each has the power to thwart the others. In prior generations, the system evolved norms and habits to prevent this kind of stonewalling. For example: Theoretically, the party that holds the Senate could refuse to confirm any Cabinet nominees of a president of the other party. Yet until recently, this just “wasn’t done.” In fact, quite a lot of things that theoretically could be done just “weren’t done.” Now old inhibitions have given way. Things that weren’t done suddenly are done.
We can debate when the slide began. But what seems beyond argument is that the U.S. political system becomes more polarized and more dysfunctional every cycle, at greater and greater human cost. The next Republican president will surely find himself or herself at least as stymied by this dysfunction as President Obama, as will the people the political system supposedly serves, who must feel they have been subjected to a psychological experiment gone horribly wrong, pressing the red button in 2004 and getting a zap, pressing blue in 2008 for another zap, and now agonizing whether there is any choice that won’t zap them again in 2012. Yet in the interests of avoiding false evenhandedness, it must be admitted: The party with a stronger charge on its zapper right now, the party struggling with more self-imposed obstacles to responsible governance, the party most in need of a course correction, is the Republican Party. Changing that party will be the fight of a political lifetime. But a great political party is worth fighting for.
http://nymag.com/print/?/news/politics/conservatives-david-frum-2011-11/
Wednesday, November 16, 2011
Automation and employment in the 21st Century
This one gave me pause... it doesn't feel that close, but then would I notice if it did? I think that it is really just a question of when.
From the Economist...
Artificial intelligence
Difference Engine: Luddite legacy
AN APOCRYPHAL tale is told about Henry Ford II showing Walter Reuther, the veteran leader of the United Automobile Workers, around a newly automated car plant. “Walter, how are you going to get those robots to pay your union dues,” gibed the boss of Ford Motor Company. Without skipping a beat, Reuther replied, “Henry, how are you going to get them to buy your cars?”
Whether the exchange was true or not is irrelevant. The point was that any increase in productivity required a corresponding increase in the number of consumers capable of buying the product. The original Henry Ford, committed to raising productivity and lowering prices remorselessly, appreciated this profoundly—and insisted on paying his workers twice the going rate, so they could afford to buy his cars.
For the company, there was an added bonus. By offering an unprecedented $5 a day in 1914, he caused the best tool-makers and machinists in America to flock to Ford. The know-how they brought boosted production efficiency still further and made Ford cars ever more affordable. With its ingenious Model T, Ford became the first car company in the world to bring motoring to the masses.
Economists see this as a classic example of how advancing technology, in the form of automation and innovation, increases productivity. This, in turn, causes prices to fall, demand to rise, more workers to be hired, and the economy to grow. Such thinking has been one of the tenets of economics since the early 1800s, when hosiery and lace-makers in Nottingham—inspired by Ned Ludd, a legendary hero of the English proletariat—smashed the mechanical knitting looms being introduced at the time for fear of losing their jobs.
Some did lose their jobs, of course. But if the Luddite Fallacy (as it has become known in development economics) were true, we would all be out of work by now—as a result of the compounding effects of productivity. While technological progress may cause workers with out-dated skills to become redundant, the past two centuries have shown that the idea that increasing productivity leads axiomatically to widespread unemployment is nonsense.
But here is the question: if the pace of technological progress is accelerating faster than ever, as all the evidence indicates it is, why has unemployment remained so stubbornly high—despite the rebound in business profits to record levels? Two-and-a-half years after the Great Recession officially ended, unemployment has remained above 9% in America. That is only one percentage point better than the country’s joblessness three years ago at the depths of the recession.
The modest 80,000 jobs added to the economy in October were not enough to keep up with population growth, let alone re-employ any of the 12.3m Americans made redundant between 2007 and 2009. Even if job creation were miraculously nearly to triple to the monthly average of 208,000 that it was in 2005, it would still take a dozen years to close the yawning employment gap caused by the recent recession, says Laura D’Andrea Tyson, an economist at the University of California, Berkeley, who was chairman of the Council of Economic Advisers during the Clinton administration.
The conventional explanation for America's current plight is that, at an annualised 2.5% for the most recent quarter (compared with an historical average of 3.3%), the economy is simply not expanding fast enough to put all the people who lost their jobs back to work. Consumer demand, say economists like Dr Tyson, is evidently not there for companies to start hiring again. Clearly, too many chastened Americans are continuing to pay off their debts and save for rainy days, rather than splurging on things they may fancy but can easily manage without.
There is a good deal of truth in that. But it misses a crucial change that economists are loth to accept, though technologists have been concerned about it for several years. This is the disturbing thought that, sluggish business cycles aside, America's current employment woes stem from a precipitous and permanent change caused by not too little technological progress, but too much. The evidence is irrefutable that computerised automation, networks and artificial intelligence (AI)—including machine-learning, language-translation, and speech- and pattern-recognition software—are beginning to render many jobs simply obsolete.
This is unlike the job destruction and creation that has taken place continuously since the beginning of the Industrial Revolution, as machines gradually replaced the muscle-power of human labourers and horses. Today, automation is having an impact not just on routine work, but on cognitive and even creative tasks as well. A tipping point seems to have been reached, at which AI-based automation threatens to supplant the brain-power of large swathes of middle-income employees.
That makes a huge, disruptive difference. Not only is AI software much cheaper than mechanical automation to install and operate, there is a far greater incentive to adopt it—given the significantly higher cost of knowledge workers compared with their blue-collar brothers and sisters in the workshop, on the production line, at the check-out and in the field.
In many ways, the white-collar employees who man the cubicles of business today share the plight of agricultural workers a century ago. In 1900, nearly half of the adult population worked on the land. Thanks to tractors, combine harvesters, crop-picking machines and other forms of mechanisation, agriculture now accounts for little more than 2% of the working population.
Displaced agricultural workers then, though, could migrate from fields to factories and earn higher wages in the process. What is in store for the Dilberts of today? Media theorist Douglas Rushkoff (“Program or Be Programmed” and “Life Inc”) would argue "nothing in particular." Put bluntly, few new white-collar jobs, as people know them, are going to be created to replace those now being lost—despite the hopes many place in technology, innovation and better education.
The argument against the Luddite Fallacy rests on two assumptions: one is that machines are tools used by workers to increase their productivity; the other is that the majority of workers are capable of becoming machine operators. What happens when these assumptions cease to apply—when machines are smart enough to become workers? In other words, when capital becomes labour. At that point, the Luddite Fallacy looks rather less fallacious.
This is what Jeremy Rifkin, a social critic, was driving at in his book, “The End of Work”, published in 1995. Though not the first to do so, Mr Rifkin argued prophetically that society was entering a new phase—one in which fewer and fewer workers would be needed to produce all the goods and services consumed. “In the years ahead,” he wrote, “more sophisticated software technologies are going to bring civilisation ever closer to a near-workerless world.”
The process has clearly begun. And it is not just white-collar knowledge workers and middle managers who are being automated out of existence. As data-analytics, business-intelligence and decision-making software do a better and cheaper job, even professionals are not immune to the job-destruction trend now underway. Pattern-recognition technologies are making numerous highly paid skills redundant.
Radiologists, who can earn over $300,000 a year in America, after 13 years of college education and internship, are among the first to feel the heat. It is not just that the task of scanning tumour slides and X-ray pictures is being outsourced to Indian laboratories, where the job is done for a tenth of the cost. The real threat is that the latest automated pattern-recognition software can do much of the work for less than a hundredth of it.
Lawyers are in a similar boat now that smart algorithms can search case law, evaluate the issues at hand and summarise the results. Machines have already shown they can perform legal discovery for a fraction of the cost of human professionals—and do so with far greater thoroughness than lawyers and paralegals usually manage.
In 2009, Martin Ford, a software entrepreneur from Silicon Valley, noted in “The Lights in the Tunnel” that new occupations created by technology—web coders, mobile-phone salesmen, wind-turbine technicians and so on—represent a tiny fraction of employment. And while it is true that technology creates jobs, history shows that it can vaporise them pretty quickly, too. “The IT jobs that are now being off-shored and automated are brand new jobs that were largely created in the tech boom of the 1990s,” says Mr Ford.
In his analysis, Mr Ford noted how technology and innovation improve productivity exponentially, while human consumption increases in a more linear fashion. In his view, Luddism was, indeed, a fallacy when productivity improvements were still on the relatively flat, or slowly rising, part of the exponential curve. But after two centuries of technological improvements, productivity has "turned the corner" and is now moving rapidly up the more vertical part of the exponential curve. One implication is that productivity gains are now outstripping consumption by a large margin.
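Mr Ford's point is easy to see with a toy calculation (this is an illustrative sketch, not his actual model: the growth rates, starting values, and function names below are assumptions chosen only to show the shape of the argument). A quantity compounding at even a modest percentage rate eventually dwarfs one that grows by a fixed increment each year:

```python
def productivity(year, base=100.0, rate=0.03):
    """Output per worker compounding at `rate` per year (exponential growth)."""
    return base * (1 + rate) ** year

def consumption(year, base=100.0, step=2.0):
    """Demand rising by a fixed increment per year (roughly linear growth)."""
    return base + step * year

# Early on the two curves track each other closely; over longer horizons the
# compounding curve pulls away -- the "turned the corner" claim in the text.
for year in (10, 50, 100, 200):
    p, c = productivity(year), consumption(year)
    print(f"year {year:3d}: productivity {p:9.1f}  consumption {c:7.1f}  gap {p - c:9.1f}")
```

With these made-up parameters the gap after a decade is small, but after two centuries productivity exceeds consumption many times over, which is the intuition behind saying the Luddite Fallacy only held while we were still on the flat part of the curve.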
Another implication is that technology is no longer creating new jobs at a rate that replaces old ones made obsolete elsewhere in the economy. All told, Mr Ford has identified over 50m jobs in America—nearly 40% of all employment—which, to a greater or lesser extent, could be performed by a piece of software running on a computer. Within a decade, many of them are likely to vanish. “The bar which technology needs to hurdle in order to displace many of us in the workplace,” the author notes, “is much lower than we really imagine.”
In their recent book, “Race Against the Machine”, Erik Brynjolfsson and Andrew McAfee from the Massachusetts Institute of Technology agree with Mr Ford's analysis—namely, that the jobs lost since the Great Recession are unlikely to return. They agree, too, that the brunt of the shake-out will be borne by middle-income knowledge workers, including those in the retail, legal and information industries. But the authors' perspective is from an ivory tower rather than from the hands-on world of creating start-ups in Silicon Valley. Their proposals for reform, while spot on in principle, expect rather a lot from the political system and other vested interests.
Unlike Mr Ford, Dr Brynjolfsson and Dr McAfee are more sanguine about the impact smart technology is having on the job market. As they see it, those threatened the most by technology should learn to work with machines, rather than against them. Do that, they suggest, and the shake-out among knowledge workers becomes less of a threat and more of an opportunity.
As an example, they point to the way Amazon and eBay have spurred over 600,000 people to earn their livings by dreaming up products for a world-wide customer base. Likewise, Apple’s App Store and Google’s Android Marketplace have made it easy for those with ideas for doing things with phones to distribute their products globally. Such activities may not create a new wave of billion-dollar businesses, but they can put food on the table for many a family and pay the rent, and perhaps even the college fees.
In the end, the Luddites may still be wrong. But the nature of what constitutes work today—the notion of a full-time job—will have to change dramatically. The things that make people human—the ability to imagine, feel, learn, create, adapt, improvise, have intuition, act spontaneously—are the comparative advantages they have over machines. They are also the skills that machines, no matter how smart, have had the greatest difficulty replicating.
Marina Gorbis of the Institute for the Future, an independent think-tank in Palo Alto, California, believes that, while machines will replace people in any number of tasks, “they will amplify us, enabling us to do things we never dreamed of doing before.” If that new “human-machine partnership” gives people the dignity of work, as well as some means for financial reward, all the better. But for sure, the world is going to be a different place.
http://www.economist.com/blogs/babbage/2011/11/artificial-intelligence?fsrc=scn/tw/te/bl/ludditelegacy
From the Economist...
Artificial intelligence
Difference Engine: Luddite legacy
AN APOCRYPHAL tale is told about Henry Ford II showing Walter Reuther, the veteran leader of the United Automobile Workers, around a newly automated car plant. “Walter, how are you going to get those robots to pay your union dues,” gibed the boss of Ford Motor Company. Without skipping a beat, Reuther replied, “Henry, how are you going to get them to buy your cars?”
Whether the exchange was true or not is irrelevant. The point was that any increase in productivity required a corresponding increase in the number of consumers capable of buying the product. The original Henry Ford, committed to raising productivity and lowering prices remorselessly, appreciated this profoundly—and insisted on paying his workers twice the going rate, so they could afford to buy his cars.
For the company, there was an added bonus. By offering an unprecedented $5 a day in 1914, he caused the best tool-makers and machinists in America to flock to Ford. The know-how they brought boosted production efficiency still further and made Ford cars ever more affordable. With its ingenious Model T, Ford became the first car company in the world to bring motoring to the masses.
Economists see this as a classic example of how advancing technology, in the form of automation and innovation, increases productivity. This, in turn, causes prices to fall, demand to rise, more workers to be hired, and the economy to grow. Such thinking has been one of the tenets of economics since the early 1800s, when hosiery and lace-makers in Nottingham—inspired by Ned Ludd, a legendary hero of the English proletariat—smashed the mechanical knitting looms being introduced at the time for fear of losing their jobs.
Some did lose their jobs, of course. But if the Luddite Fallacy (as it has become known in development economics) were true, we would all be out of work by now—as a result of the compounding effects of productivity. While technological progress may cause workers with out-dated skills to become redundant, the past two centuries have shown that the idea that increasing productivity leads axiomatically to widespread unemployment is nonsense.
But here is the question: if the pace of technological progress is accelerating faster than ever, as all the evidence indicates it is, why has unemployment remained so stubbornly high—despite the rebound in business profits to record levels? Two-and-a-half years after the Great Recession officially ended, unemployment has remained above 9% in America. That is only one percentage point better than the country’s joblessness three years ago at the depths of the recession.
The modest 80,000 jobs added to the economy in October were not enough to keep up with population growth, let alone re-employ any of the 12.3m Americans made redundant between 2007 and 2009. Even if job creation were miraculously nearly to triple to the monthly average of 208,000 that is was in 2005, it would still take a dozen years to close the yawning employment gap caused by the recent recession, says Laura D’Andrea Tyson, an economist at University of California, Berkeley, who was chairman of the Council of Economic Advisers during the Clinton administration.
The conventional explanation for America's current plight is that, at an annualised 2.5% for the most recent quarter (compared with an historical average of 3.3%), the economy is simply not expanding fast enough to put all the people who lost their jobs back to work. Consumer demand, say economists like Dr Tyson, is evidently not there for companies to start hiring again. Clearly, too many chastened Americans are continuing to pay off their debts and save for rainy days, rather than splurging on things they may fancy but can easily manage without.
There is a good deal of truth in that. But it misses a crucial change that economists are loth to accept, though technologists have been concerned about it for several years. This is the disturbing thought that, sluggish business cycles aside, America's current employment woes stem from a precipitous and permanent change caused by not too little technological progress, but too much. The evidence is irrefutable that computerised automation, networks and artificial intelligence (AI)—including machine-learning, language-translation, and speech- and pattern-recognition software—are beginning to render many jobs simply obsolete.
This is unlike the job destruction and creation that has taken place continuously since the beginning of the Industrial Revolution, as machines gradually replaced the muscle-power of human labourers and horses. Today, automation is having an impact not just on routine work, but on cognitive and even creative tasks as well. A tipping point seems to have been reached, at which AI-based automation threatens to supplant the brain-power of large swathes of middle-income employees.
That makes a huge, disruptive difference. Not only is AI software much cheaper than mechanical automation to install and operate, there is a far greater incentive to adopt it—given the significantly higher cost of knowledge workers compared with their blue-collar brothers and sisters in the workshop, on the production line, at the check-out and in the field.
In many ways, the white-collar employees who man the cubicles of business today share the plight of agricultural workers a century ago. In 1900, nearly half of the adult population worked on the land. Thanks to tractors, combine harvesters, crop-picking machines and other forms of mechanisation, agriculture now accounts for little more than 2% of the working population.
Displaced agricultural workers then, though, could migrate from fields to factories and earn higher wages in the process. What is in store for the Dilberts of today? Media theorist Douglas Rushkoff (“Program or Be Programmed” and “Life Inc”) would argue "nothing in particular." Put bluntly, few new white-collar jobs, as people know them, are going to be created to replace those now being lost—despite the hopes many place in technology, innovation and better education.
The argument against the Luddite Fallacy rests on two assumptions: one is that machines are tools used by workers to increase their productivity; the other is that the majority of workers are capable of becoming machine operators. What happens when these assumptions cease to apply—when machines are smart enough to become workers? In other words, when capital becomes labour. At that point, the Luddite Fallacy looks rather less fallacious.
This is what Jeremy Rifkin, a social critic, was driving at in his book, “The End of Work”, published in 1995. Though not the first to do so, Mr Rifkin argued prophetically that society was entering a new phase—one in which fewer and fewer workers would be needed to produce all the goods and services consumed. “In the years ahead,” he wrote, “more sophisticated software technologies are going to bring civilisation ever closer to a near-workerless world.”
The process has clearly begun. And it is not just white-collar knowledge workers and middle managers who are being automated out of existence. As data-analytics, business-intelligence and decision-making software do a better and cheaper job, even professionals are not immune to the job-destruction trend now underway. Pattern-recognition technologies are making numerous highly paid skills redundant.
Radiologists, who can earn over $300,000 a year in America, after 13 years of college education and internship, are among the first to feel the heat. It is not just that the task of scanning tumour slides and X-ray pictures is being outsourced to Indian laboratories, where the job is done for a tenth of the cost. The real threat is that the latest automated pattern-recognition software can do much of the work for less than a hundredth of it.
Lawyers are in a similar boat now that smart algorithms can search case law, evaluate the issues at hand and summarise the results. Machines have already shown they can perform legal discovery for a fraction of the cost of human professionals—and do so with far greater thoroughness than lawyers and paralegals usually manage.
In 2009, Martin Ford, a software entrepreneur from Silicon Valley, noted in “The Lights in the Tunnel” that new occupations created by technology—web coders, mobile-phone salesmen, wind-turbine technicians and so on—represent a tiny fraction of employment. And while it is true that technology creates jobs, history shows that it can vaporise them pretty quickly, too. “The IT jobs that are now being off-shored and automated are brand new jobs that were largely created in the tech boom of the 1990s,” says Mr Ford.
In his analysis, Mr Ford noted how technology and innovation improve productivity exponentially, while human consumption increases in a more linear fashion. In his view, Luddism was, indeed, a fallacy when productivity improvements were still on the relatively flat, or slowly rising, part of the exponential curve. But after two centuries of technological improvements, productivity has "turned the corner" and is now moving rapidly up the more vertical part of the exponential curve. One implication is that productivity gains are now outstripping consumption by a large margin.
Another implication is that technology is no longer creating new jobs at a rate that replaces old ones made obsolete elsewhere in the economy. All told, Mr Ford has identified over 50m jobs in America—nearly 40% of all employment—which, to a greater or lesser extent, could be performed by a piece of software running on a computer. Within a decade, many of them are likely to vanish. “The bar which technology needs to hurdle in order to displace many of us in the workplace,” the author notes, “is much lower than we really imagine.”
In their recent book, “Race Against the Machine”, Erik Brynjolfsson and Andrew McAfee from the Massachusetts Institute of Technology agree with Mr Ford's analysis—namely, that the jobs lost since the Great Recession are unlikely to return. They agree, too, that the brunt of the shake-out will be borne by middle-income knowledge workers, including those in the retail, legal and information industries. But the authors' perspective is from an ivory tower rather than from the hands-on world of creating start-ups in Silicon Valley. Their proposals for reform, while spot on in principle, expect rather a lot from the political system and other vested interests.
Dr Brynjolfsson and Dr McAfee are more sanguine than Mr Ford about the impact smart technology is having on the job market. As they see it, those threatened the most by technology should learn to work with machines, rather than against them. Do that, they suggest, and the shake-out among knowledge workers becomes less of a threat and more of an opportunity.
As an example, they point to the way Amazon and eBay have spurred over 600,000 people to earn their livings by dreaming up products for a world-wide customer base. Likewise, Apple’s App Store and Google’s Android Marketplace have made it easy for those with ideas for doing things with phones to distribute their products globally. Such activities may not create a new wave of billion-dollar businesses, but they can put food on the table for many a family and pay the rent, and perhaps even the college fees.
In the end, the Luddites may still be wrong. But the nature of what constitutes work today—the notion of a full-time job—will have to change dramatically. The things that make people human—the ability to imagine, feel, learn, create, adapt, improvise, have intuition, act spontaneously—are the comparative advantages they have over machines. They are also the skills that machines, no matter how smart, have had the greatest difficulty replicating.
Marina Gorbis of the Institute for the Future, an independent think-tank in Palo Alto, California, believes that, while machines will replace people in any number of tasks, “they will amplify us, enabling us to do things we never dreamed of doing before.” If that new “human-machine partnership” gives people the dignity of work, as well as some means for financial reward, all the better. But for sure, the world is going to be a different place.
http://www.economist.com/blogs/babbage/2011/11/artificial-intelligence?fsrc=scn/tw/te/bl/ludditelegacy
Sunday, October 30, 2011
What Occupy Wall Street Gets Wrong About Inequality | Foreign Affairs
October 24, 2011
SNAPSHOT
What Occupy Wall Street Gets Wrong About Inequality
A Better Way to Think About the Bailout, Jobs, and Taxes
Douglas Holtz-Eakin
DOUGLAS HOLTZ-EAKIN is President of the American Action Forum and is a Former Director of the Congressional Budget Office.
The protesters taking part in the Occupy Wall Street demonstrations around the country, despite their disparate backgrounds, seem to have settled on a recurring theme: fairness. It is not fair that Wall Street employees got a bailout and still have their jobs while so many workers in the United States have neither. It is not fair that the rich are not taxed at higher rates. It is not fair that some people are far richer than others.
Complaints about the bailout and jobs are ironic, because it did not have to be this way. Indeed, it is a tribute to the bad execution, not the bad intent, of policy that the Occupy Wall Street movement exists in the first place.
In 2008, when it was first conceived, the Troubled Asset Relief Program (TARP, now simply referred to as "the bailout") was supposed to save jobs across the economy -- not by bailing out banks but by solving the problem of toxic assets, the mortgage-backed securities at the heart of the financial crisis. This did not mean handing taxpayer dollars to banks. At the time, Senator Christopher Dodd (D-Conn.), then the chair of the Senate Banking Committee, called the proposal "stunning and unprecedented in scope and lack of detail." He went on, "It would allow the Secretary of the Treasury to intervene in our economy by purchasing at least $700 billion of toxic assets. It would allow the Secretary to hold on to those assets for years and to pay millions of dollars to hand-picked firms to manage those assets." Notice that there is no mention of a bailout: the focus was not banks but toxic assets anywhere in the system.
Congress held hearings to consider the TARP proposal, during which Henry Paulson, then Secretary of the Treasury, testified that "the $700 billion program we have proposed is not a spending program. It is an asset purchase program, and the assets which are bought and held will ultimately be resold, with the proceeds coming back to the government." Ben Bernanke, chair of the Federal Reserve, concurred, saying, "The Federal Reserve supports the Treasury's proposal to buy illiquid assets from financial institutions."
At the time, I was the director for domestic and economic policy for John McCain's presidential campaign; I remember the words, intentions, testimony, legislative language, press releases, and promises that Bush administration officials made as to what the Treasury Department needed to counter the shock to the economy. In short, they said that the Treasury Department needed to buy toxic assets to stop the free-fall, not to direct infusions of taxpayer money into the banks that were teetering on collapse.
But this, unfortunately, is exactly what happened. Shortly after TARP was passed into law, Paulson abandoned asset purchases and opted instead for direct equity injections into banks. The bailout began. And once the Obama administration took office and Timothy Geithner became Treasury Secretary, Washington announced plans to address toxic assets but then abandoned them. In short, TARP was hijacked by Paulson and Geithner and turned into a bailout for bankers, with no discipline for either the banks or the bankers. The unfairness of the bailout was not the original intention.
Similarly, unemployment -- another central grievance of Occupy Wall Street -- did not have to be so dismal. The 2009 stimulus bill was poorly crafted, with few "shovel-ready" infrastructure jobs, too much waste, and a bevy of ineffective pet programs. In the aftermath of the stimulus, the Obama administration took its eye off job creation and instead prioritized its social agenda (health care reform), green objectives (Waxman-Markey and EPA regulations), labor priorities (National Labor Relations Board agenda), and other legislative and regulatory initiatives that were damaging to economic growth. And as the coup de grâce, the administration placed the U.S. government on a dangerous, explosive debt trajectory that invites a return to the worst days of the financial crisis.
The Obama administration should have pushed for reforms to existing tax policy that would have created incentives for businesses and entrepreneurs to base their operations in the United States and to spend at a faster rate on innovation, workers, repairs, and new plants and equipment.
The place to start is the corporate income tax, which harms the United States' international competitiveness in two important ways. First, the current 35 percent rate is far too high: when combined with state-level taxes, U.S. corporations face the highest tax rates among developed countries. The rate should be reduced to 25 percent or lower. Second, the United States remains the only developed country to tax corporations based on their worldwide earnings. Other states follow a territorial approach in which, for example, a German corporation pays taxes to Germany only on its earnings in Germany, to the United States only on its earnings there, and so on. If the United States were to adopt the territorial approach, it would place U.S. firms on a level playing field with their competitors.
Proponents of the worldwide approach used by the United States argue that because it does not let U.S. firms enjoy lower taxes when they invest abroad, it removes a possible incentive to send jobs overseas. Imagine two Ohio firms, they say: one invests $100 million in Ohio, the other $100 million in Brazil. The worldwide approach treats the profits on these two investments equally -- both are taxed at U.S. rates -- giving the company that invests in Brazil no advantage over its competitor that invests in the United States.
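The difference between the two approaches can be sketched with hypothetical numbers. The 35 percent U.S. rate comes from the article; the 25 percent foreign rate, the $100 million profit figure, and the simplified foreign-tax-credit mechanics are illustrative assumptions:

```python
# Hypothetical comparison of the worldwide and territorial approaches
# for the two Ohio firms described above. The 35% U.S. rate is from
# the article; the 25% foreign rate, the $100m profit, and the
# simplified foreign-tax-credit mechanics are illustrative assumptions.

US_RATE = 0.35

def worldwide_tax(profit, local_rate):
    """Tax paid where the profit is earned, plus a U.S. top-up on the
    same profit (with a credit for the tax already paid locally)."""
    local = profit * local_rate
    us_topup = max(profit * US_RATE - local, 0.0)
    return local + us_topup

def territorial_tax(profit, local_rate):
    """Only the country where the profit is earned taxes it."""
    return profit * local_rate

profit = 100.0  # $100m of profit on each investment

# Worldwide approach: both investments end up taxed at the U.S. rate.
print(worldwide_tax(profit, 0.35))  # Ohio investment   -> 35.0
print(worldwide_tax(profit, 0.25))  # Brazil investment -> 35.0

# Territorial approach: the Brazilian profit bears only the local rate.
print(territorial_tax(profit, 0.25))  # -> 25.0
```

Under these assumed rates, the worldwide approach leaves the two Ohio firms with identical tax bills, which is precisely the neutrality its proponents claim; the territorial approach gives the Brazilian investment a 10-point advantage.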
But this line of reasoning is misguided and out of date. For starters, because firms all over the world pay lower taxes than the two Ohio-based companies, the likeliest outcome is that both firms would have trouble competing with global rivals. Moreover, when U.S. multinational firms invest in and expand employment abroad, they tend also to invest in and expand employment in the United States. In the end, healthy and competitive firms grow and expand, while uncompetitive firms do not. Washington's goal should be to make sure that U.S. companies do not end up overtaxed, uncompetitive, and eventually out of business. And finally, because the United States is the only holdout still using a worldwide approach, it is less attractive as a headquarters for large global firms. As the United States loses these headquarters, it will also lose the employment, research, and manufacturing that are typically located nearby.
Related to their grievances about both jobs and taxes -- and indeed, to the very notion of fairness as they see it -- the Occupy Wall Street protesters would like to deal with the fallout of the economic collapse by having the rich pay higher taxes. After all, the incomes of the country's wealthiest have continued to rise when much of the labor force is idle. Moreover, they say, the wealthy get much of their current income in the form of capital gains, which are more lightly taxed.
But these arguments ignore the fact that capital gains are already taxed once, when the income is originally earned. And beyond that, the arguments of Occupy Wall Street are especially misleading because they focus only on annual tax payments and ignore the larger picture of federal finances.
The largest tax that most Americans pay each year is the payroll tax, which is dedicated to funding Social Security and Medicare. The vast majority of them will get much more out of these programs than they will ever pay in. In effect, these individuals are subsidized by the payroll and income taxes of the remainder of U.S. taxpayers.
More important, they get their federal government for free. Everything the country's original framers envisioned for the federal government -- national security, infrastructure, basic research, and so on -- is paid for largely by income taxes. Nearly one half of Americans pay no income tax, and the top five percent pay well over 50 percent of the income taxes.
Put another way, when a tiny fraction of Americans is paying for many of the federal services that every American enjoys, why is it fair to make them pay even more? Stepping back further, the major injustice is not who pays what tax rate but that, unless the United States changes course, future generations will inherit broken social safety-net programs, enormous debt, and a crippled economy. Focusing the fairness debate on the income taxes of a handful of Americans simply distracts from the greater injustices in federal fiscal policy.
Finally, some in the Occupy Wall Street movement and its sympathizers question the fundamental fairness of the capitalist system. And indeed, regardless of whether one focuses on the well-being of the least affluent (caring about the social safety net) or on the gap between them and the rich (caring about inequality per se), there is legitimate concern about the outlook for the bottom of the U.S. income distribution.
The solution, however, is to rely more on capitalism. The seminal economic event of the early twenty-first century is not the financial crisis and the recession but the entry to the global labor market of billions of workers in China, India, and elsewhere. The simplest economics suggest that this added competition will lower the relative market earnings of unskilled laborers and raise the return to higher-skilled workers and capital investment.
This is bad news for poor wage-earners and an advantage to those with human and financial capital. Progressives have reacted by calling for pure redistribution, empowering unionized labor, or attempting to close U.S. borders to the flows of goods, capital, and labor. But global market forces will overwhelm such ill-conceived government attempts to reverse economic fundamentals.
A better strategy would be to harness those very market forces by building human and financial capital. Fundamental reforms to the K-12 education system that would emphasize parental choice and reward teacher performance would be a good place to start; among other pluses, Republicans and Democrats now agree on the basic merits of such changes.
But Washington will also need a more thoroughgoing focus on building human capital at every stage of a person's career.
From a budgetary perspective, the United States should not only rein in the over-promises of existing entitlements but also reverse the underlying strategy of focusing more on a person's needs at the end of life than at the beginning. Instead of just providing entitlements for retirement income, health care, and elder assistance, the U.S. government should prioritize entitlements for pre-K schooling, primary education, nutrition, and preventive care.
Similarly, the U.S. government should move away from making unemployment insurance, food stamps, and other low-income programs conditional on low cash flows. Instead, it should integrate those programs with individual accounts that can be managed to accumulate wealth. Such a structure would provide strong incentives for reliance on work and on a timely exit from support.
Occupy Wall Street may be right about one thing: fairness is and should be at the heart of debate over financial policy and economic well-being. But the solution to these questions of fairness -- and to the United States' economic woes -- is to use the capitalist system, not to destroy it.
Copyright © 2002-2010 by the Council on Foreign Relations, Inc.