MIT Center for International Studies

The Bush, NeoCon and Pro-War Liberal Blunders That Produced the New Mess in Iraq

And so the inevitable is unfolding: a possible collapse of the U.S.-imposed Iraqi state, the apparent triumph of the most brutal extremists in the world, and more to come in Syria, Afghanistan, and possibly Jordan, Mali, Libya, and who knows where else. The first step to recovery -- if recovery is even feasible -- is an honest reckoning of why this is happening.


Who Is the Real Leader of Russia?

Ever since Dmitri Medvedev's nomination to succeed Vladimir Putin as president of Russia, followed by his election and now his inauguration, Kremlin watchers, both Russian and Western, have been discussing the so-called "Putin-Medvedev tandem" and asking who will really lead Russia. Is the duumvirate stable? Will it degenerate into squabbling among the Kremlin clans behind the scenes?

The pundits have identified four plausible scenarios. The first is that President Medvedev will indeed hold the principal power, including the possibility of ousting Putin as prime minister or marginalizing him, since the Russian political system has been "super-presidential," i.e., strongly centered in the presidency, since Boris Yeltsin's adoption of the new constitution in 1993. The second, which I consider most likely, is that the system will remain centered on Prime Minister Putin through informal power mechanisms, which carry far more weight in this system than the formal powers granted by the constitution. The third is that the United Russia Party will emerge as dominant, able to make or break presidents through the electoral process. The fourth is that the whole country, or at least the government, will fall apart as the followers of the president and those of the prime minister feud over the division of the spoils that come with holding power in a country covering one-sixth of the Earth's land mass.

Because the corridors of power are so completely impenetrable to outsiders, no one knows what will happen. Still, Putin and his advisers' actions in the months leading up to the election and then inauguration of Medvedev as president of the Russian Federation suggest some answers.

Putin's Trajectory

In many ways, Putin has been the most transparent of Russian leaders. Immediately upon his ascension to formal power as president in spring 2000, he spoke of a "power vertical," which he then proceeded unapologetically to construct. He proposed two years ago that he might become prime minister. On Oct. 2, 2007, at the Congress of United Russia, Putin called the notion that he might head the government "completely realistic."

There have, however, been ambiguities and contradictions throughout his two terms as president, including, most recently, with the issue of succession. Beginning in the fall of 2007, Kremlin officials and United Russia leaders began consistently calling on Putin to remain a "national leader" in order to ensure the continuity of current policies. Yet at the same time there was no official clarification as to what exactly this might entail.

Recently there has also been a profound marshaling of historical symbolism, one that seems to intensify with every turn of the story. While systematically downplaying what they are doing, Putin and his handlers have gone to surprising lengths to deploy symbolism straight from the pages of Russian history.

Specifically, I argue that the solution in which Medvedev would be elected president in 2008 and would in turn name Putin as his prime minister evolved steadily behind the scenes, in ways not always transparent to outside observers, and that this evolution supports the hypothesis that Medvedev is likely to be more figurehead than real president.

Because Putin famously loves surprises, he (and his handlers) did not let the public know who would be named heir apparent to the presidency until Dec. 10, 2007. Surprise and anxious waiting had by now become de rigueur in the Putin presidency, keeping politicians and the public guessing. Who would succeed Putin, everyone was asking. Many were convinced that Putin would truly step down because he had stated so often that he would do so.

Yet my own reading of the situation was that his adamant, repeated insistence that he would step down in 2008 had begun to sound hollow as early as 2005-06. Methought the gentleman did protest too much.

In the context of what became known as the "2008 question" (would he or wouldn't he serve a third term?), Putin was claiming that he hoped there would be "continuity on policy regardless of who was in power."

He insisted that he did not want public speakers to speculate on the succession or even to use the phrase "third term."

Yet, of course, that was exactly what Kremlin watchers loved to do: speculate on what the president was going to do once his constitutionally mandated term of office was up. There was a whole cottage industry in both Russia and the United States devoted to the what-will-he-do-now question.

The Putin Plan

As early as 2001, Sergei Mironov, head of the Just Russia Party (a minor pro-Kremlin political party), was already saying that two terms would not be enough; Putin should be elected to a third. Mironov first said this, in fact, the day after he was elected chairman of the Federation Council, the upper house of parliament, showing his tremendous loyalty (or should we say sycophancy) toward the president. He then went on to repeat the argument, virtually verbatim, every year after that.

After Putin's re-election in 2004, several other federal and especially regional lawmakers also began to make noises about a constitutional amendment that would allow a third term.

The volume of these noises increased markedly after Putin appropriated the right to appoint the governors in the wake of the Beslan crisis of September 2004. Now, even the most seemingly independent of governors (Mintimer Shaimiev, for example, in Tatarstan) began praising Putin and discussing the need for a third term. Their own self-interest dictated that they praise the sitting president who could decide their fates so unilaterally.

In spring 2007, United Russia Party officials began speaking of a "Putin Plan" as if such a plan really existed and as if it contained genuine content. By late September 2007, President Putin had even formally approved the "plan," claiming characteristically that it was not he who had made up the plan but rather the United Russia Party.

But what was the "plan" made of? Only quotes and slogans from the president's addresses to the federal assembly. "The greatness of Russia" hardly constitutes a "plan." Invoking Russian "civilization" also does not make a plan.

The person most vocally committed to the Putin Plan has been Boris Gryzlov, speaker of the State Duma and head of the United Russia Party. In May 2007 Gryzlov began to speak not only of the Putin Plan but also of Putin as "national leader" for Russia. Because Putin was the national leader of Russia, the Putin Plan could work, he claimed. And, of course, because of the success of the Putin Plan, Putin must remain the national leader of the country. His logic, in other words, was perfectly circular.

For Russians, he said, "Vladimir Putin is the absolute national leader." In October 2007 he added, "I think that it is not necessary to hold a concrete post in order to be a leader."

At about the same time at the United Russia Congress, a number of people stood up, in purely Soviet fashion, to beg Putin to stay in politics. One was a female weaver named Elena Lapshina, who was introduced to express "the hopes of the simple people": "I see so many big bosses and just smart people at this congress. I appeal to all of you: Let's think of something together so that Vladimir Vladimirovich Putin will remain the president of Russia after 2008 as well."

On the eve of the United Russia Party Congress on Nov. 6, 2007, the party website published an article by Abdul-Khakim Sultygov, Putin's envoy for human rights to Chechnya and later United Russia's specialist on nationality affairs. Now the party was developing an ingenious (though discursively empty) and highly symbolic way out of the 2008 dilemma. According to the "Putin Plan," the party would strive to make Putin a "national leader." Then, an imagined "Citizens' Council of the Russian Nation" would create a "Pact of Civil Unity," which would necessitate the return of Putin to national leadership. Although Sultygov was the nominal author of this webpage, commentators have agreed that it bears all the hallmarks of Vladislav Surkov, the chief ideologist in the Kremlin. Once again, Putin and the Kremlin denied any involvement. These were said to be Sultygov's own ideas.

Here the party's and the Kremlin's goal was clearly to create two equations: Putin equals national leader, and United Russia equals party of power. For a historian this is fascinating because, of course, that was exactly what Vladimir Ilyich Lenin did in the years leading up to and right after 1917: The Social Democratic Party equals the party of power, and Lenin equals the embodiment, the personification, of the revolution.

Beyond "the Troubles"

By the fall of 2007 the party and especially the president had created another important equation: Putin does not equal Yeltsin; in short, the immediately preceding period was completely repudiated. Not only had Putin replaced Yeltsin, showing himself to be vigorous, decisive, muscular and sober, but the whole decade of the 1990s was now being portrayed as a "Time of Troubles," an analogy with the Time of Troubles that Russia experienced from 1598 to 1613. The 1990s were depicted as a Time of Troubles because of the vast political and social upheavals of the decade, the financial insecurities, and the constantly changing political figures at the top. But it was also because the boyars were ruling instead of the tsar: in the 17th-century Time of Troubles, this was the period of the seven boyars (semiboyarshchina); in the 1990s, it was the rule of the seven bankers (semibankirshchina). The solution that began to be voiced in 2007 was the call for a national leader who would rule in conjunction with the people, without the intermediaries of the bad boyars, just as Mikhail Romanov had been elected tsar of all the Russias in 1613.

On Nov. 29, 2007, just before the Dec. 2 Duma elections, when Putin's new approach to the elections and to power was unfolding, he spoke in his annual address of the forces of evil that he believed were trying to reshape plans for Russia's development, change the political course supported by the Russian people, and "return to the times of humiliation, dependence and dissolution" that followed the fall of the Soviet Union.

Many were coming to believe that such a Time of Troubles would require a strong leader for Russia's salvation.

On Dec. 2, 2007, United Russia won overwhelmingly in the Duma elections. This was vigorously trumpeted as a victory for Putin. On Dec. 10, the four parties affiliated with the government -- United Russia, Just Russia, the Agrarian Party of Russia and the Civic Force Party -- proposed Medvedev as the next president in a highly choreographed event that several Russian commentators referred to as a "show." Leaders of the four parties came out and proposed to Putin that the next president should be Medvedev, then his first deputy prime minister. According to journalists present, Putin turned to his younger colleague and asked: "Dmitri Anatol'evich, did they speak with you about this?" "Yes," answered Medvedev, "there were preliminary consultations about this. They were very positive. We will continue this conversation today and tomorrow."

Putin then commented, "Together we can form a solid power in the Russian Federation after the March 2008 elections." In other words, Putin was not handing over the reins of government to his younger colleague; rather, he was planning to rule with him, by his side. Putin also made a point of saying that he would not hang Medvedev's picture on his wall. Ostensibly this was because he knew him so well. In reality, it was a pointed statement that Medvedev would look up to him, not the other way round.

A week later, Medvedev confirmed that they would be working together when he announced that, if he was elected, he would name Putin as his prime minister.

The Dual Monarchy

Commentators immediately jumped on this relationship between Medvedev and Putin, arguing that Russia has never known a successful "dual power." Yet that view ignores the productive and long-lived dual monarchy that lasted from 1619 to 1633 when Mikhail Romanov ruled in conjunction with his father, Philaret.

The parallels today to this earlier regime are striking. In 1613, at the end of the civil war that had engulfed Russia and that became known as the Time of Troubles, the Assembly of All the Land (the Zemskii Sobor) chose Mikhail Romanov as its new tsar. Mikhail, only 16 years old and not in good health, was said to be "weak in the legs." When Philaret, who had been forcibly tonsured as a monk during the years of upheaval, returned from exile in 1619, he was made head of the Russian Orthodox Church rather than co-tsar. Yet from that moment until his death in 1633, he routinely used the title "Great Sovereign" (a title traditionally reserved for the tsar) and co-signed the majority of official state documents, especially those relating to foreign policy.

The idea of a dual monarchy under Putin and Medvedev, I submit, has long been in production (as they would say in a movie studio).

On March 23, 2005, Putin visited the city of Kostroma, a visit that was recorded and aired on the Russian television program "Russkii Vzgliad" (The Russian View), a weekly talk show designed (in its own formulation) "for those who love Russia." The visit coincided with the fifth anniversary of Putin's election as president and with the saint's day associated with the celebration of the Kostroma icon of the Virgin Mary. That icon, in turn, was the one used to bless Mikhail Romanov at his coronation in 1613. The narrator of the television program even claimed that the recent Russian Time of Troubles (the Yeltsin years) had lasted just as many years as the original Time of Troubles in the early 17th century. On the program President Putin stood for a long moment before the Kostroma icon, head silently bowed. The Kostroma bishop spoke of an unnamed saint's prophecy of the resurrection of Russia.

One might consider all this a coincidence were it not for the fact that Putin himself had called for the joint meeting of the presidium of the State Council and the President's Council on Art and Culture to meet in Kostroma on that date. It was also his choice to visit the icons from 1613.

Sometime in fall 2007, the presidential election was scheduled for March 2, 2008. That date just happens to be the anniversary of the announcement of the election of Mikhail Romanov in 1613.

The Medvedev-Putin parallel to Mikhail-Philaret has many virtues for the Kremlin today. The younger "son" figure (Medvedev has often said that Putin is like a father figure to him, and he is some 13 years Putin's junior) will rule officially, while the older father figure will rule in practice. Both Putin and the monk Philaret were barred from ruling officially (Putin because of the limit of two consecutive terms, Philaret because he had been tonsured during the Time of Troubles). Ostensibly the father figures would occupy the apparently less significant position (head of the church, head of the governmental administration), yet through charisma and connections Philaret was able to rule in practice, as undoubtedly will Putin. (Amazingly, it even turns out that Putin is probably related to Philaret and Mikhail Romanov, since his ancestors were peasants on the estates of Philaret's brother Ivan Nikitich.)

In the period leading up to the March 2 elections, many analysts and journalists expressed concern that if Putin did not remain in power, the bureaucrats would begin fighting amongst themselves over the spoils of government.

This, too, was part of the public relations campaign: to show that "the people" (unspecified, of course) are afraid that without Putin, the country will sink into civil war and anarchy. This fear, of course, has deep historical roots, dating back at least to "The Lay of Igor's Host," the famous work of literature most often cited as teaching the dangers of a divided ruling house.

This past fall and winter (2007-08), Putin supporters brought forward another historical chestnut in support of their candidate: If Putin did not remain the national leader, Russia would fall behind the "civilized world"; it would once again become "backward."

Why, ultimately, did Putin make such a big deal of not running for president for a third term?

I think a key part of the answer is that he himself did not know exactly what he was going to do. As in Soviet times, it was convenient to allow a little bit of discussion in the pages of the press and on the internet so as to garner more ideas and options.

I also think it suited Putin's notion of a union between president and people that the people themselves should call for his continuation in power -- that they should appear to ask him to remain in office as national leader.

This, of course, satisfied a condition of the classic cult of personality, especially as expressed in the Soviet era, namely that the Soviet leader should appear modest, should be called to office but not seek it himself.

But I think this also had deeper roots in the tsarist notion of a union between tsar and people. The tsar could represent the people because he was one of them, because he knew them better than his advisers did.

It also satisfied the agenda of United Russia, which sought to be the one party of power. In order to uphold that status, the party needed to be the one to appear to put forth the national leader. He was their leader; they were his party. Rather than developing a serious ideology, it was easier to assume that the very name of Vladimir Putin would have resonance in the country and bring voters to the polls.

The question on everyone's minds today is whether this union of president and prime minister, Medvedev and Putin, can hold. My guess is that it can. In the months leading up to his election and then inauguration, Medvedev was consistently given the lesser role in front of the cameras, and he accepted that role. On Feb. 14, 2008, when Putin held a conversation with journalists in Moscow that lasted a record four hours and 40 minutes, Medvedev was in Siberia proposing to revive an award for families with many children. On another occasion, when Putin was pronouncing on national strategy, Medvedev was in Kaliningrad opening a maternity hospital. The gender symbolism here was not accidental. Putin consistently put Medvedev in charge of the "national projects" -- health, education, housing and agriculture -- all secondary, one might even say "female," spheres of activity.

Putin, in the meantime, has said that the prime minister is responsible for the government, the economy, foreign relations and the military. In the few days since being named prime minister, he has already appointed seven deputy prime ministers, including the outgoing prime minister, Viktor Zubkov, who will now serve as his first deputy prime minister. By moving many of his key advisers from the Kremlin, where the president rules, to the White House, where the prime minister sits, Putin is clearly declaring to the world who will be the true boss. Dmitri Medvedev may be president of the country (a position that the constitution says should be dominant), but the real power, both formal and informal, will be in the hands of Vladimir Putin. Ultimately, in Russia today as in Soviet times, personality (and personal control) matters more than the institution or the formal definition of presidential power.

As long as Medvedev does not create any trouble, which it is not in his interests to do, Putin can benefit from having an apparently "liberal" face to attract foreign investment. Presidential elections can be held every four years, with new presidents coming to power. Medvedev can even be re-elected.

Putin, meanwhile, can remain in power indefinitely, either in his capacity as head of United Russia or in his capacity as prime minister. Should there be a misstep on Medvedev's part, however, and the relationship become unsatisfactory, there is no reason that Putin cannot be re-elected to the presidency. The constitution stipulates merely that he cannot hold more than two consecutive terms. If the country were to be plunged into some kind of "crisis," even one that was transparently manufactured, elections could be held speedily and Putin, the elder statesman, would be returned to power as president.

For all these reasons, it seems likely that Putin will continue to be the dominant figure in the duumvirate. Medvedev's selection, then election and inauguration appear to have been carefully organized so that he would be the junior partner and Putin would dominate, regardless of the formal institutional superiority of the presidency over the prime ministership. During the inauguration, Putin entered the Kremlin hall before Medvedev. Putin gave a speech before Medvedev's, and it was just as long. Then, finally, Putin walked down the stairs outside the Kremlin to review the presidential troops alongside Medvedev. At every step in the ceremonies that day, Putin was ahead of Medvedev, waiting for him.

A Critical Look at the Forced Spread of Democracy

The first subject to discuss in considering the future of the liberal internationalist agenda is the importance of the democratization project to the definition of Wilsonianism. The second is the meaning of multilateralism. On the first, Thomas Knock and Anne-Marie Slaughter argue in a forthcoming volume that democratization was never an important part of Wilsonianism and that multilateralism, instead, is the key to liberal internationalism. On the basis of this argument, they conclude that the Bush Doctrine is not in the Wilsonian tradition. In my contribution to that volume, I object to this denigration of the place of democracy in liberal internationalism, which I find fundamentally illogical. Accordingly, I find the Bush Doctrine easily identifiable as Wilsonian.

I argue for the centrality of democracy to the Wilsonian project because the domestic microfoundations of a regime seem clearly critical to a state's ability to participate effectively in multilateral organizations. That is, in order to function effectively, and ultimately to provide for a peaceful world order, a multilateral organization needs to be dominated by democratic states, known for their rule-abiding behavior, their transparency, predictability, and accountability. Wilson wanted the League of Nations to be a League under the control of democracies and concerned with expanding this form of government, but then in late February 1919 at Versailles, he abandoned that idea. From a liberal internationalist perspective, the result was that the League was undermined not only by the failure of the United States to join, but also by the role played in it by autocratic states. It is worth adding that in his drafts of the Pan American Union some three years earlier, Wilson had also looked forward to a community of American states based on the consent of the governed. In a word, a world of peace was necessarily a world dominated by what today are often called "market democracies," a type of social, economic, and political order that Wilson argued was fundamentally different from, and better than, any alternative. In such an order the place of democratic governments was central.

From a liberal perspective it is altogether logical that democratic states would make better partners in multilateral institutions than those that were autocratic (much less "totalitarian," a term and reality that only became evident after Wilson's death). That said, Wilson had to work with such material as he had at hand, whence, presumably, his capitulation to the idea that the League would not be dominated by the democracies. Such a compromise could not be satisfactory unless it were seen as a way-station on the road to the expansion of democratic government, a process that a rule-creating and abiding organization like the League might well encourage.

It is therefore altogether Wilsonian for liberal internationalists today to recognize the deficiencies of the United Nations yet at the same time not to sacrifice the notion of the paramount importance of multilateralist cooperation among democratic peoples for the sake of world peace. This is illustrated by Madeleine Albright's, and now Anne-Marie Slaughter's, notion of a "Community of Democracies" or a "Concert of Democracies" standing alongside the U.N. but capable of acting with unity and purpose in a military fashion should such a community deem it necessary.

Albright and Slaughter's position on the centrality of democratic solidarity is perfectly Wilsonian. This is what makes the Bush Doctrine so clearly Wilsonian as well. From President George W. Bush's initial speech on the matter to the West Point commencement in June 2002, through what is generally considered the best statement of the doctrine in the National Security Strategy of the United States in September 2002, it is clear that the leading element of his plan to construct a new world order (but not its only aspect) is the replacement of what he repeatedly has called "tyranny" by the spread of democratic government -- not only in Iraq but throughout the "Broader Middle East," if not beyond.

As a result, democratic government, like multilateralism and open markets, may be only one aspect of the Wilsonian project. But of its various aspects, democratic government is the most critical.

American Hegemony

Knock and Slaughter also disagree with me on the meaning of "multilateralism" in the Wilsonian agenda. Slaughter argues that such cooperation involves sacrifice of sovereignty, as if such a process will be experienced by all members of such organizations equally. What she never says is that multilateralism is, in effect, a program for American hegemony. I don't necessarily have anything against American hegemony; it may be good for the world. I don't necessarily have anything against imperialism; it may be good for those people subjected to it. It's a matter of debate. She, however, doesn't buy into the notion that her version of Wilsonianism is hegemonic or imperialistic, and sees rather the U.S. as being no more than first among equals. On this she is on solid ground, for as Knock shows, Wilson himself excluded the idea that multilateralism would be a vehicle of American power projection.

But is it realistic to suppose that American "participation" in multilateral institutions among fellow market democracies would not in fact be a "leadership" position that could easily develop into a "hegemonic" relationship? Wilson, Slaughter, and Knock may argue against such a conclusion, but I maintain they would be mistaken.

How do I arrive at the conclusion that "multilateralism" is a code word for "hegemonism"? In my book, A Pact with the Devil, I discuss the evolution of Wilsonianism over time, in contrast to others (like Knock and Slaughter) who hold an essentialist notion of Wilsonianism, as if such thinking did not evolve in important ways. For them, we can find out what Wilsonianism means by looking at Woodrow Wilson himself. For me, the doctrine changes over time. Thus, I posit a "pre-classical" period of liberal internationalism going back to the American Revolution, which drew on the one hand on Christian notions and on the other on the secular Enlightenment. But liberal internationalism becomes "classical" only with Woodrow Wilson, who had a clear project, a framework.

It involved democratization, economic interdependence, openness -- and that's why the liberal economic tradition is important -- multilateralism, and American participation, indeed, American leadership. Wilsonianism becomes much more ideological when we get into the Cold War period, because it becomes the way in which the United States structures the liberal, democratic world -- the "Free World," as it was called -- with containment of communism as a major doctrine, and liberal internationalism as what works within a community of free states. Finally, we get to the period beginning in the 1990s, when liberal internationalism becomes an ideology in any sense of the word you care to describe ideology.

Herein lies the dynamic of the self-assurance that led to hegemonism and then imperialism on the part of the United States operating under the flag of liberal internationalism. Previously, liberalism had always suffered relative to Marxism-Leninism, lacking the theoretical rigor that Marxism-Leninism enjoyed during the Cold War. In the 1990s, however, a new bundle of concepts appeared that elevated the theoretical coherence of Wilsonianism. Of these concepts, the most important was Democratic Peace Theory (DPT), championed by a variety of well-known intellectuals at Yale, Princeton, Stanford, and Harvard. It held that since democracies don't fight one another, the spread of democracy internationally would contribute to, or create, world peace. In other words, Kant was trumping Hobbes.

The problem for DPT, though, was that while it held the expansion of democratic governance to be desirable, it was not sure that expansion was actually feasible. Accordingly, a number of comparative political scientists began to argue against the warnings of an earlier generation that the transition to democracy was inherently difficult. Those reservations, it was now said, could be waved off in light of the apparent evidence of the historic moment of the 1990s. Great men -- the Pope, Nelson Mandela, Kim Dae-jung, Vaclav Havel, among others -- plus the democracy idea, plus a little help from your friends at USAID or the National Endowment for Democracy, would be enough to bring about the democratic transition. We should therefore relax all the notions of "preconditions" and "sequence" that the comparative political studies of the 1960s to 1980s had insisted upon.

In other words, what was desirable was also feasible. It was an intoxicating time: what was hoped for from the point of view of DPT was now seen as doable by liberal comparative political analysts.

Enter the group of liberal international jurists, like Thomas M. Franck or Anne-Marie Slaughter, who declared that sovereignty should be redefined to apply only to those states that rested on the consent of the governed. Governments that were non-democratic and that were either engaged in gross human rights abuses or amassing weapons of mass destruction could legitimately be attacked. A new doctrine of "just war" was born.

Once you had this volatile mix, you had a Wilsonian argument for imperialism. Consider the stance of John Rawls, in his last book, The Law of Peoples, in which he explicitly rests his argument on DPT; he writes about Kant and says, in effect, that life is not worth living if you don't think this democratization project is actually feasible, in what he calls a "realistic utopia." I am not saying that Rawls would have approved of the invasion of Iraq; I'm sure he would have been horrified by many of the things that happened there. But I think that Rawls can correctly be cited as an antecedent to the liberal imperialist democratization agenda.

What we have, then, is an evolution of Wilsonianism as a doctrine in the direction of progressive liberal imperialism, although it took the Bush administration and the enunciation of the Bush Doctrine to bring it about. Neoconservatives have been assigned far more responsibility than is their due for the consensus on the ideas behind the Bush Doctrine. With the exception of Francis Fukuyama, there was not a neoconservative who contributed to these ideas. Instead, these ideas, for the most part, belonged to individuals who were prominent within the Democratic Party. And here, I would cite particularly the Democratic Leadership Council's Progressive Policy Institute, headed by Will Marshall, and such people as Anne-Marie Slaughter, Larry Diamond, and Kenneth Pollack.

What we find in the current political cycle is that in fact the ideas of the Bush Doctrine, which might have met their death on the battlefields of Iraq, have migrated from the Republican into the Democratic Party. The neoconservatives are less welcome than they have been in the Republican Party (although their reemergence around John McCain in the spring of 2008 may show this announcement of their demise to be premature), but these "neoliberals," as I like to call them, are still alive and ready to provide an intellectual framework of a Wilsonian type to a new Democratic administration. Consider as an example the book, With All Our Might, edited by Will Marshall and including chapters by Pollack, Slaughter, Diamond, Michael McFaul, and a number of others, which was praised by the Weekly Standard.

There are also self-styled liberals at Brookings and the Rand Corporation and the Carnegie Endowment who subscribe to these ideas as well. In this vein is the Princeton Project, "Forging a World of Liberty Under Law," which struck me as quite exceptional in what it had to say when it was published late in 2006. The Project's co-directors were John Ikenberry and Anne-Marie Slaughter. Its leading concepts were essentially three. First, that the United States should have military primacy in the world. Second, that there should be a "global Concert of Democracies," led, of course, by the United States, which would act in unison and outside the United Nations, which itself cannot be counted on to organize effective collective action. And third, that this Concert would back -- by military means if necessary -- something called PAR ("Popular, Accountable and Rights-regarding" governments), thus providing a rationale for remaking governments that are recalcitrant to American hegemony. All of this adds up to a version of the Bush Doctrine, only now with unilateralism replaced by multilateralism, which will itself be hegemonic.

So, if we ask ourselves whether the Bush Doctrine represents modern Wilsonianism, my answer is unequivocally "yes." I would like to still be considered a liberal internationalist. But I'm a liberal internationalist of the Cold War period -- a person who is selective about where democracy should be pushed, a person who thinks that American imperialism in the name of democracy promotion is counterproductive. The fostering of human rights and democratic government may be good counsel where the U.S. and its democratic allies have leverage and where local circumstances favor such a process. But the United States should tread lightly in the Muslim world, sub-Saharan Africa, China, and Russia. It should be prepared where necessary and possible to cooperate with governments whose character it finds objectionable.

And it should avoid the self-confident, self-righteous, and self-defeating conceit that it represents freedom and peace in all it does. As the Bush Doctrine has demonstrated, the notion that the United States is "the last, best hope of earth" (Abraham Lincoln) is a belief from which we need to seek relief.

What We Can Learn From Woodrow Wilson

We can't do much better than reclaiming the Declaration of Independence as a fundamental foreign policy document in American history. We have a tendency to read it in a simplistic way, and to think of it only as a sort of airy declaration of what were then human rights, and a declaration of separation from England. But, in fact, the founders had a fairly well-articulated sense of what they were doing with foreign policy, and a fairly revolutionary sense of their foreign policy. So I'm quite interested in how Woodrow Wilson rediscovers the founders and makes them relevant for his time.

This thinking about Wilson began for me about ten years ago when I came to be a speechwriter in the second term of Bill Clinton's presidency. I was quite interested in which presidents were considered historically interesting to Clinton and quickly figured out it was John F. Kennedy, obviously, and Franklin D. Roosevelt a little less obviously, and Teddy Roosevelt, who was a huge influence on Bill Clinton, and always has been. It was a time in the 1990s when a lot of very favorable books were coming out about Teddy Roosevelt, and it was an attractive time to be thinking about him. At the same time, I felt Wilson was completely ignored. I don't remember Clinton ever talking about Wilson. In the collected speeches of Bill Clinton -- it's something like eighteen very fat volumes, the man enjoys speaking -- if we looked up Wilson, I'm sure we could find a few references, but very few.

As a historian, I thought that was fascinating. I looked a little into Wilson and the way people talk about him, a sort of casual dismissal of Wilsonian idealism, which is a put-down -- I don't think it's ever used favorably in the press. George Bush vigorously denies that he's a Wilsonian idealist, and it's largely an accusation leveled at him, not something he claims for himself. Henry Kissinger's book, Diplomacy, opens with a discussion of Wilson versus Theodore Roosevelt, and he states it very clearly. One is an idealist, one is a realist.

I think the tide may be about to turn for Wilson. I do think he is a pivot for all of American history before him, converting it into the twentieth century. For my research, more than anything, I read his speeches, which was a pleasure. There are a lot of Wilson's speeches, and they are fascinating. They are radically different from what came before. They are radically different from what Theodore Roosevelt was saying. We think of them as orators of roughly equal stature, but I think Wilson vastly exceeded Teddy Roosevelt, and there's nothing in the late nineteenth century like him at all. You really have to go back to Abraham Lincoln for a sense that there's a mystical power in American history that's very forceful, that is acting through Wilson and through the American people and exerting considerable force on world events.

You can call it naïve and idealistic, and it's easy to find examples of naïve phrasings in Wilson's oratory, but you can also say, this person knew American history better than any president before him, and arguably better than any president since, and knew the founding moment extremely well: Jefferson in particular, but Hamilton, also, and knew the Civil War vividly, with first-hand knowledge. It's always worth restating that he was a war child. He grew up in a ravaged South. He saw destruction around him. He was not in peril himself, but he certainly lived in a South that was basically destroyed by the Civil War and had a lifelong, profound aversion to war.

He knew -- as many people did, but I think with particular clarity -- that the moment he was living in was a moment of great destiny for the United States. He uses the word destiny and the word providence a lot. Throughout the years of his presidency, you see him changing and evolving considerably. He comes in, of course, as an anti-war president, and even in his second term he promises to keep us out of war. Events conspire against him, and in April 1917 he leads the United States into World War I, where it exerted a decisive impact -- a late entrance into the war, but a very decisive entrance, and there was nothing half-baked about it. Nearly two million soldiers were sent over. It was full war. It was the first European intervention by the United States, and it was a moment of tremendous change in the history of the United States that I don't think we give him proper credit for. It seems like our mixed memories and our largely negative memories are almost entirely rooted in the failed struggle to bring the U.S. into the League of Nations, and we overlook the considerable importance of the fact that he entered World War I at all.

World War I and Afterwards

The general consensus is that during the war itself he led effectively, and there's no better way to gauge his performance than to see the way the world looked at him as the war ended, which is perfectly captured in Erez Manela's book, The Wilsonian Moment. H.G. Wells uses the word "messiah" to describe the way people felt about him. So, I consider him a most significant president and a most significant interventionist, and not someone who was so carried away with pacifism that he didn't act in the opposite direction when circumstances required it.

He failed, of course, in the Versailles negotiations. He failed to win passage in the Senate of American entry into the League of Nations. And his health failed. That basically is the cardinal sin of American politics, failure. It taints the way we remember him. But that does not mean that the ideas he was expressing were insignificant or insincere, or that his idealism did not have a considerable realism inside it. Idealism is only idealism when it can't be achieved, but when you have the political willpower to make an idea happen, then it's not idealistic -- it's realistic.

The chief Republican critique of Wilson at the time was that he was giving away something extremely valuable, which was American sovereignty. But what he was trying to do was to create a collective arrangement, which required all parties to give up a certain amount of sovereignty to make the new arrangement work. It was a tiny amount of sovereignty, and no serious person thinks that the United States would have been threatened in any way by this. But the way he talked about it, and the way the Republicans talked about it, allowed him to be perceived as articulating a weaker position than he actually had. The ideas that he was expressing were quite radical and new, and we're still, in many ways, trying to live up to them. They're in place in many ways that we accept nowadays without even thinking about it. We gave up sovereignty in 1949, for example, when we entered NATO. We're giving up sovereignty every time we allow a new country into NATO, which we're doing often, because we are obligated by treaty to go to their defense.

Consider a few more modern ways that Wilson's legacy has been used, and effectively, since his death. F.D.R. cites him quite movingly when he comes to take the nomination at Chicago in 1932. John F. Kennedy alludes to him briefly in his American University speech in 1963. Jimmy Carter gave a number of speeches that fit right into the Wilsonian rubric. Ronald Reagan, in many ways, was profoundly Wilsonian. In his famous speech to Evangelicals in 1983, he said, "America is great because America is good," which is a very Wilsonian thought -- that virtue conveys political power and, ultimately, military power. Mikhail Gorbachev's important speech in 1988, when he proposed a new world in which the Soviet Union would renounce weapons and all countries would renounce weapons, and we would live by a more virtuous international standard, was a profoundly Wilsonian speech.

And then we get to George W. Bush, who has certainly given Wilsonian utterances. It would be hard to find a better example than his second inaugural address, which pledged to end tyranny for all time. And yet, I don't find him a sincere Wilsonian. I don't think it was a sincere use of that rhetoric. I don't think there was anything like the learned understanding of American history that Wilson always had. I don't think there was anything like the implied commitment to egalitarianism and democracy that is in every Wilson speech, and certainly not to economic egalitarianism, which Franklin Roosevelt was certainly leading up to at the end of his presidency. So, in my opinion, Bush's is a kind of convenient rhetoric because it sounds good, but used in ways that Wilson would have been extremely uncomfortable with and profoundly opposed to.

The Declaration and Liberalism

How might the liberal tradition in foreign policy be defined, as seen through the prism of Woodrow Wilson? Admiration for the full promise of the Declaration of Independence, which is a document we've had trouble living up to for our entire history. In speech after speech, Wilson talked about it. He went to Philadelphia just before leading the United States into war and called the Declaration a document leading up to war and an important statement of American foreign policy.

The Declaration implies full political participation. Of course, we failed to live up to that for many decades, and even centuries. Human rights -- the term isn't used, but it's implied. And economic empowerment is also implied in a phrase like, "the pursuit of happiness." Every time another great document has been written -- the Atlantic Charter, or the Helsinki Accords, which aren't really American documents -- the Declaration of Independence is very much in the background.

In the Declaration there is a sense that diplomacy was broken, that we needed more open dealing between nations, and that the European system was profoundly flawed. There was a system -- it was the monarchical system, and aristocratic emissaries between countries -- and Wilson, like so many Americans before him, wanted to smash that system. "Open covenants, openly arrived at" is also something with a direct link to the ideas about open diplomacy in the Declaration of Independence. (The classic study of the founders and their foreign policy is To the Farewell Address, by Felix Gilbert, an absolutely brilliant distillation of what they were trying to do in foreign policy.)

Fewer rituals of diplomacy, fewer complicated treaties with secret codicils. Willingness to intervene in a just cause, which is an important and complicated point for Democrats to assume, but, for most of this century, until after Vietnam, the Democrats were considered the party more likely to intervene than the Republicans.

Bill Clinton considered himself quite willing to intervene in a just cause, someone with pacifistic tendencies but no aversion to the use of force when the cause was morally just. And that is, of course, an extremely complicated thing to parse, but it's in the story of Woodrow Wilson. It's in the story of Franklin D. Roosevelt. It's in the story of Democratic foreign policy in the twentieth century.

A devotion to free trade, another very complicated point. Democrats have not always been the strongest advocates of free trade, but that is a principle that goes back to the founders. The American Revolution, in a way, was a fight over free trade. Freedom of the seas is a codicil of that thought.

A particular concern for Europe and its great relevance to the United States, without thinking that European rituals are desirable, but nevertheless accepting that Europe is key to the stability of the United States. At the same time, a particular concern for the welfare of small nations, which Wilson talked about over, and over, and over again, and was seen in the international arena as the defender of small nations against England, France, and Germany.

Self-determination, that very slippery concept that got him into so much trouble, and which we, ourselves, were not entirely living up to in the Wilsonian moment. The Palmer Raids, the way black people were treated, and women were treated, certainly suggests that we didn't have full self-determination. But all people ultimately do have the right to shape their foreign policy. They not only have the right to elect their leaders, but as a consequence, they have the right to shape foreign policy. That's an important thought that I think Democrats should retain.

And, finally, an important thought that's in everything Wilson said, and a most effective tool of persuasion, is this profound belief that a new time is coming. Whether you think of it as a religious new time or a political new time, the disasters of the present can be solved, and a new time is coming.

How Will Pakistani Conflict Impact the World?

South Asia has emerged as a strategically pivotal region, from the counter-terrorism and counter-insurgency campaign in Afghanistan to the emergence of India as an economic and military power. The current political crisis in Pakistan -- with President Pervez Musharraf suspending the constitution and declaring an emergency on November 3 -- threatens core interests of South Asia's major political actors, including the United States.

There are two distinct conflicts within Pakistan's polity. The first is between rebels along the Afghan border and the Pakistani state, and the second is between pro-democracy forces and Musharraf's military dictatorship. The outcomes of both struggles will affect the rest of the region, with some implications potentially being felt globally. Who is affected by Pakistan's turmoil, and why?

Pakistan on the Brink

The army has been the country's key political player since the 1950s, combining a focus on India with a domestic-political role as guardian and governor. The army is the most cohesive and well-organized institution in the country. Its involvement in political life is so embedded that a true withdrawal from politics is highly unlikely. Since Musharraf's coup in 1999, the army has further expanded its reach into society and the economy, sidelining political parties and civil society organizations.

Musharraf's decision to unseat the chief justice of the Supreme Court, Iftikhar Muhammad Chaudhry, in March 2007 outraged lawyers and pro-democracy activists, who took to the streets in protest. These protests put pressure on Musharraf to move toward some kind of democratization, with a focus on fashioning a power-sharing deal with exiled former Prime Minister Benazir Bhutto. This further encouraged pro-democracy forces and the country's judiciary to stand up to Musharraf. The specific trigger for the November 3 imposition of a state of emergency was apparently intelligence reports that the Supreme Court was going to rule that Musharraf's re-election in October was unconstitutional.

This would be enough drama for most countries. But Pakistan is also sorely pressed by the rise of radical Islamist violence and tribal revolt along its western border with Afghanistan. Pakistani rulers' decisions during the 1980s and 1990s to back militants in Afghanistan and Kashmir have boomeranged to devastating effect. The ruling establishment now faces spreading radicalization, accelerated by easy availability of weapons and a plethora of militant organizations, which are starting to seep into the country's urban core. The Pakistan army is extremely hard-pressed in these border regions, suffering high losses and shocking instances of surrender and desertion that have raised serious alarm among informed observers. Recent reports suggest that even the elite strike corps usually intended for action on the Indian border are being re-deployed into the Afghan border regions.

Pakistan is also the site of a simmering tribal revolt in the southwestern province of Balochistan, and of ethnic tensions in Karachi and the southern province of Sindh. This pair of conflicts, and Musharraf's response to them, is deeply troubling to the other major actors in the region. The international and domestic politics of South Asia's states are tightly intertwined, so spillover from Pakistan's politics cannot be easily isolated or contained. Consider, then, how the current instability in Pakistan affects its neighbors.

The Neighbors

Afghanistan. Hamid Karzai's government in Kabul has consistently sparred with Musharraf's Pakistan. The resurgent Taliban are primarily based in Afghanistan's south among the Pashtun community. The Durand Line separating Pakistan and Afghanistan artificially divides Pashtuns who can be found on both sides of the border. The Taliban have been greatly assisted by these cross-border ties. Moreover, many American and Indian officials allege that the Pakistani government did not crack down on the Taliban as hard as it should have after the overthrow of the Taliban regime. There is powerful suspicion that Pakistan's dominant security elites see the Taliban and Pashtun rebels as their key tool of influence within strategically crucial Afghanistan and thus have continued to at least tacitly support them.

Pakistan's political situation holds two severe risks for Afghanistan's current regime. The first is that the Taliban will continue to grow in strength. Taliban-linked elements have successfully imposed costs on Pakistani security forces and carved out territorial control in the hills and valleys of the Federally Administered Tribal Areas and the North-West Frontier Province. This provides a sanctuary to continue the fight against Afghan government forces, as well as U.S. and NATO forces. These counterinsurgent forces are already facing a severe challenge, and emboldened Islamist and Pashtun forces in Pakistan will not help matters.

Second, and at a more structural level, growing Pashtun assertion threatens Afghanistan's territorial cohesion. Afghanistan is a multi-ethnic state, and its current rulers draw heavily from the country's non-Pashtun north. There has been sentiment in the past for a "Pashtunistan" that would carve out a separate territory for Pashtuns out of Afghanistan and Pakistan. The precedents of East Pakistan and Balochistan suggest that action by a Punjabi-dominated Pakistani military can spur further separatism and regional resentment. If the Pakistani military's offensives in the northwest heighten separatist tendencies, this is sure to influence at least some Afghan Pashtuns toward separatism.

India. Pakistan's neighbor and arch-rival India is watching the current situation with growing alarm. The government's subdued response to emergency rule suggests that it does not want to rock the boat one way or another. The costs of sustained tension with Pakistan are a huge external check on India's continued rise, consuming money and policy-makers' attention that could be far better spent on other social, economic, and military priorities. The current crisis holds the potential to further inflame this relationship in three important ways.

First, a Pakistani state focused on fighting for domestic power may be less willing and able to control the many militant organizations that have targeted India from Pakistani bases. Organizations like Lashkar-e-Toiba and Jaish-e-Mohammed have launched dramatic terrorist attacks throughout India in recent years, while also operating in Indian-administered Kashmir. A further fracturing of the monopoly of state power may unleash a wave of violence against India. Indian economic growth hinges on relative domestic stability, which would be badly undermined by an upsurge of terrorism in key urban centers.

Second, there is always the possibility that the beleaguered Pakistani regime may try to mobilize support at home by raising tensions with India. In the current context this is unlikely, as the Pakistani military is in no position to credibly play a game of brinkmanship with an increasingly-powerful India, but the future contingency remains. An extraordinarily deep distrust of Pakistan is embedded in the worldviews of many Indian security elites, and so all Pakistani moves will be viewed with profound suspicion.

Finally, the worst-case scenario of a collapsing Pakistan would pose enormous challenges to India. The most direct result would be a massive wave of refugees, like the deluge that streamed out of East Pakistan (now Bangladesh) during Pakistani counter-insurgency operations in 1971. This social and logistical burden would be joined by uncertainty over the control of Pakistan's nuclear weapons and the political future of Pakistan's provinces and power centers. Far more would have to happen before this disastrous situation arises, but it is being openly discussed in Delhi.

Iran. Iran is affected by Pakistan's politics in two ways. The first relates to Afghanistan, where the Iranians (and Indians) are heavily involved in advancing their interests through consulates, development money, alliances, and intelligence services. The Iranians were foes of the Taliban in the 1990s, and have been backing Shia Hazaras in Afghanistan. Further destabilization of Afghanistan will result in an intensification of Iranian effort, at a time when Iran already has more international complications than it can handle. It is unlikely that further Iranian involvement in such a context would contribute to peace and stability in Afghanistan.

A second possible spillover from a Pakistan in turmoil comes from Baloch armed groups, who also straddle the Pakistan-Iran border. There have been acts of violence by Baloch separatists in eastern Iran, and so a weakening Pakistani central state may encourage Baloch regional sentiments in a multiethnic Iran.

The United States

Much of the coverage of the current crisis has focused on the U.S. reaction. This is because the U.S. has provided approximately $10 billion in aid to Pakistan since 2001 and views Pakistan as a front-line state in its battle against Al Qaeda. The U.S. also has a long history with Pakistan, most importantly as a backer of military rulers to advance American strategic interests during the Cold War. Musharraf's recent actions highlight the limits of American influence in Pakistan -- emergency rule was imposed despite direct opposition from Secretary of State Condoleezza Rice and CENTCOM head Admiral William Fallon.

The U.S. is profoundly affected by Pakistan's turmoil. First, and most obviously, it is now even more completely reliant on the cooperation of a few key security elites in Islamabad in containing Al Qaeda and the Taliban. Other centers of political power and influence are being repressed by the military regime, or have been co-opted by the military rulers. The United States has not broadened its constituency in Pakistan, and it is now too late to do so. American policy-makers will have to take what they can get in terms of Pakistani security cooperation, but given Pakistan's other strategic compulsions it is simply not clear what form such cooperation will take.

Second, this loss of influence will encourage voices in the United States calling for the introduction of special forces and airpower into Pakistan's northwest to hunt for high-value targets. The logistical challenges remain immense, but the previous counter-argument that American intervention would undermine ally Musharraf has been weakened by Musharraf's defiance of U.S. wishes.

Third, the overall thrust of American strategy in South Asia is under growing pressure. Pakistan and Afghanistan are obviously two of the key frontlines of the war against Al Qaeda. The bolstering of India as a strategic partner has involved huge effort, but both Indian and American attention may be diverted to dealing with fallout from Pakistan. This will not undermine an Indo-American alignment, but it may prevent the U.S. and India from focusing on forward thinking and bold policy initiatives.

The Future

This analysis has focused on the regional implications of a continued status quo or deterioration in Pakistan's political situation. The worst may not come, and Musharraf and Bhutto may still cobble together a transition to some form of power-sharing and elections in 2008. But states with interests in the region are actively contemplating how they would be influenced in both status quo and worst-case scenarios. For better or worse, Pakistan's politics will have a major impact on both the international politics of the region and the internal politics of its neighbors.

Immigration Reform That Just Might Work

The debate over immigration reform in America has come full circle. It began in late 2005 with an "enforcement only" bill in the House of Representatives that relied on aggressive implementation of existing law and sharp restrictions on future immigration. The most extreme legislation proposed in this vein would have made felons of undocumented immigrants and prosecuted those who provide such immigrants with aid or comfort. In essence, the proposal threw down a gauntlet to any who supported immigrant rights. While the most punitive measures of that bill were largely rejected, the parameters that it laid out represent the current position of the U.S. government. Today's policy focuses on border security, employer sanctions, and the detention and deportation of undocumented immigrants.

Efforts to provide a more comprehensive approach to immigration reform have also failed. The inability of Congress to pass legislation in either 2006 or 2007 reflects the lack of consensus among policymakers over how to resolve the broader issues. While there are major economic interests supportive of comprehensive reform, misperceptions about migration effects, cultural prejudices, and the real and perceived costs of immigration undermined support for bipartisan legislation. The result is that the Bush administration has retreated into an "enforcement first" or "enforcement only" approach that ignores the economic and labor needs of the country. This approach has not worked in the past -- and will not work in the future.

Failed Reform on Capitol Hill

The Senate took a more comprehensive approach to resolving the nation's immigration woes in 2006 and 2007. The Senate approach included tough enforcement measures, but also offered a path to legalization for the undocumented. It also provided a temporary worker program with labor and wage protections, and increased legal channels for permanent immigration to the United States. The premise of the Senate approach was that each piece of the reform package was necessary to address a complex set of issues holistically. Nonetheless, opponents labeled these measures "amnesty."

The failure of Congress to pass a comprehensive reform package reflects a set of competing interests that proved impossible to reconcile. Advocates for comprehensive reform -- including business leaders in a number of key industries, pro-immigration organizations, and even the White House -- were defeated by a vocal minority that played upon fears that immigrants were displacing native-born workers, draining social programs and overtaking American communities. The majority of Americans believe the immigration system is broken and must be fixed in a manner that allows undocumented immigrants to legalize their status, but their voices were muted in this debate.

What do the debates of 2005-07 portend for the coming years? With a bad problem only getting worse, will the next several years increase pressure on powerful interests, especially business, to convince cultural conservatives that immigration reform is critical to the nation's future economic health? Can the border be secured and the law enforced in a manner that does not trample over individual rights? How can costs currently absorbed at the state level, such as for emergency medical care of uninsured immigrants, be alleviated so that state coffers are not drained as a result of federal inaction?

Economic Forces

These questions will persist so long as the underlying economic forces that spawned them remain. The free trade policies of the past thirty years and the opening of the American economy to imported goods have contributed greatly to a process of de-industrialization. Technological developments and a broad expansion of the service sector have also changed the face of many industries. These elements, combined with an aging and better-educated American workforce, left many U.S. industries without a steady supply of low-wage, low-skill American workers. American companies began to look elsewhere.

The North American Free Trade Agreement (NAFTA) was expected to facilitate trade in goods but not labor. While NAFTA did not address migration directly, many supporters believed that it would create jobs in Mexico, reducing the flow of undocumented immigrants from Mexico to the United States such that, as then-Mexican President Carlos Salinas said, Mexico would export goods, not people. From 1994 to 2004, the decade after NAFTA came into force, however, unauthorized immigration from Mexico to the United States increased. The free trade agreement, combined with a financial crisis and stagnant job growth in Mexico, created a surge in Mexico's unemployed workforce. At the same time, a strong U.S. economy in the late 1990s fueled migration to the north, much of it outside legal channels. Low-wage jobs in the United States serve as a magnet drawing labor northward. Reports by the Bureau of Labor Statistics show that employment in industries requiring low levels of formal education -- such as food service, hospitality, and construction -- will continue to grow.

Such job growth occurs as U.S. citizens increasingly decline to take the toughest and most dangerous low-skilled jobs. As President Bush stated in 2004, "Some of the jobs being generated in America's growing economy are jobs American citizens are not filling." U.S. Commerce Secretary Carlos Gutierrez called the worker shortage acute for low-skilled jobs, warning of a "very detrimental impact to the economy" if this shortage is left uncorrected.

In addition to meeting the demand for labor, immigration from Latin America -- both lawful and unlawful -- has contributed to the boom in consumer spending. A University of Georgia study put Hispanic purchasing power at $798 billion in 2006 and predicted that this figure would reach $1.2 trillion by 2011. The U.S. Census Bureau found that 1.6 million Hispanic-owned firms provided jobs to 1.5 million employees in 2002. The same firms had receipts of $222 billion and generated payroll of $36.7 billion. Scholars also argue that certain businesses would not exist or could not expand without immigrant labor, much of it undocumented.

Experts debate the impact of immigrant labor on the wages of less-skilled Americans who are in fact competing for the same jobs. Recent studies found that the influx of Mexican workers negatively affects the wages of less-educated native-born workers, but simultaneously improves the wages of college graduates in the United States. Other scholars found no discernible impact on wages that could be attributed to immigrant labor in many urban areas. Where these wage variances exist, they are apt to hurt the least-educated American workers, including the rural poor and minorities.

The Anti-Reform Movement
Anti-reform sentiment seems to be driven by the same type of fear and isolationism that has plagued American immigration policy for centuries. Germans were derided by Benjamin Franklin in the 1750s for failing to learn proper English, swarming across our shores, and potentially destabilizing the government. Chinese were recruited for their labor but prohibited from owning property; Africans were forcibly brought to the United States as property.

Anti-immigrant sentiments are also fueled by a combination of fears and prejudices, exacerbated by the attacks of Sept. 11, 2001, and loudly propounded by certain politicians, commentators and cultural conservatives who fear that immigration is changing the complexion of America. Of course, demographics are changing the face of America overall, with the percentage of Hispanic Americans growing rapidly. Immigrants, whether documented or undocumented, tend to put down roots where jobs are available and where familiar communities are already established, with many now settling in suburbs and rural areas that traditionally had smaller or negligible immigrant populations. Anti-immigrant activists argue that these populations drain state resources for health and education and complain that the federal government does not adequately reimburse state programs.

The argument falls flat in relation to public education, which is funded through property taxes that are ultimately paid by homeowners and renters, regardless of immigration status. Critics have a stronger argument in relation to health care. However, state health care systems generally do not track the immigration status of patients and therefore no one knows what the true burden is. The high cost of emergency care for the uninsured is part of a larger problem faced throughout America for which undocumented immigrants cannot alone be blamed.

Prejudice also plays a role in the anti-immigrant backlash, driven by unfounded fears of terrorist attacks or crime waves. In fact, "incarceration rates are lowest among immigrant young men, even among the least educated and the least acculturated among them," according to a study by the Migration Policy Institute. Federal records analyzed by Syracuse University show that most arrests federal prosecutors have characterized as terrorism-related in fact involved minor, non-violent offenses connected to immigration status violations. The analysis found that less than 0.01 percent of arrests of non-citizens by Homeland Security agents were terrorist related.

What Does the Future Hold?
The prospects for reform are now dim. Analysts predict it will be eight to 10 years before Congress gains the courage to address the issue in a comprehensive manner again. Meanwhile, targeted bills to bolster border security, increase detention space, and curtail judicial review of deportation orders are expected to proliferate.

Must enforcement come first? According to a 2005 study by the Migration Policy Institute, "overall spending on enforcement activities has ballooned ... with appropriations growing from $1 billion to $4.9 billion between fiscal years 1985 and 2002 and staffing levels increasing greatly." For several of those years, as spending spiked, so did the number of undocumented immigrants entering the United States. Scholars argue that border enforcement has a minimal deterrent impact on illegal immigration into the United States. Rather, as discussed earlier, migration is spurred largely by economics. There is no question, however, that border fences and heightened security in populated areas have pushed illegal crossings into desolate desert areas, making the journey more dangerous. As a result of the remote crossings, increased danger and the associated higher cost of such travel, seasonal migration flows have decreased and "rates of return migration have plummeted." Rather than decrease illegal entries, border fences create incentives for the undocumented to choose the risks attendant to undocumented status over the physical danger of seasonal border crossings.

Meanwhile, the aggressive tactics of U.S. Immigration and Customs Enforcement (ICE) have not demonstrably improved national security. High-profile raids elicited cheers from restrictionists but further damaged the image of ICE with immigrant groups. They claimed that families were cruelly and unnecessarily separated and that some of those arrested were denied access to legal counsel. The August 2007 announcement of a crackdown on employers of undocumented immigrants is the latest step in fulfilling an enforcement agenda that offers little hope of repairing the broken system, and may serve to harm the economy. Employers in low-skilled industries with high rates of undocumented workers, such as agriculture, are likely to be targeted in spite of the fact that they face significant hurdles in locating and hiring native-born workers or authorized immigrants.

Immigration advocates on both sides are mobilized to make their cases in Washington and to the American public. Nonetheless, positive reform will not be achieved until powerful business interests argue persuasively that the need for workers in key industries outweighs the arguments of the cultural conservatives that immigrants are harmful to the nation's economic growth and social fabric.

A comprehensive approach that creates legal avenues for immigrants to live and work in the United States combined with tough but humane border security and law enforcement -- including employer sanctions for bad actors who continue to skirt the law or abuse workers -- is the most viable solution for security and economic growth. It is also a solution that honors the oft-stated, if not always fulfilled, vision of America as a melting pot that welcomes and protects immigrants.

Does Globalization Bring War or Peace?

Do high levels of international trade lead to peace? Norman Angell authored the best-selling book on international politics in history, arguing that economic interdependence between Germany and England made any war between the two unthinkable -- an illusion. His book, The Great Illusion, was translated into 17 languages and sold one million copies; Angell himself won the Nobel Peace Prize. Unfortunately, within a few years of publication, Britain and Germany eagerly threw themselves into the abyss of the First World War.

The analytic literature on the Commercial Peace is much less robust than scholarship on the Democratic Peace, the latter positing the improbability of war between democracies. The Commercial Peace literature displays less consistency and theoretical rigor, with precise causes largely untested. Statistical analyses of trade relationships generally find that trade is conducive to peace; numerous case studies, however, find that international trade either played no part in particular leaders' decisions about war or prompted them to escalate rather than risk dependence on others.

Nonetheless, some patterns emerge. Trade highly concentrated with a single partner correlates with conflict, as does a marked difference in states' respective dependence. At the same time, however, high levels of trade with the aggregate international market correlate with cooperation. The nature of the traded goods matters -- trade in commodities with substantial strategic applications (e.g., oil or high-tech capital equipment) is most conducive to conflict.

Most important, high levels of economic exchange act as an accelerant: extensive trade enhances either cooperation or conflict. The implication is that specific outcomes are contingent on economic interdependence's interaction with some domestic institutional factor: states' strategic response to global market forces will vary according to their internal political-societal composition.

Economic Sectors and Foreign Policy
A growing body of research indicates that the domestic institutions and dominant sectoral coalitions of the trading nations determine the effect of economic interdependence on states' foreign policy. Put simply, international trade has distributional consequences, producing relative winners and losers in each society and thereby shaping these groups' foreign policy preferences. When constituencies advantaged by global markets dominate the political system -- particularly when the median voter is both politically empowered and gains from trade -- national policy will favor conciliation and multilateral cooperation.

On the other hand, when groups uncompetitive in global exchanges have the power to turn their sectoral preferences into the "national interest," the state will likely pursue a foreign policy of confrontation and the unilateral quest for advantage. Imperial Japan, for example, actually had a higher level of economic interdependence than did its 1920s democratic predecessor, but nonetheless embarked on aggressive imperialism.

Two other sectoral characteristics of the dominant political coalition can determine state response to economic interdependence. Sectors have different exposure to parts of the global economy: some sectors' major markets are the core countries of the world economic system (the wealthiest and most powerful states); others, however, are linked tightly to the global economic periphery (the poorer, less stable states); others still depend on the domestic market and have no interest in paying for active foreign policies of any type. Sectors reliant on the core will favor cooperation with other Great Powers to ensure continued access to these rich markets. Those tied to fixed investments or key markets in the roiling periphery will favor aggressive policies to project state power into these zones, creating spheres of influence.

Finally, sectors differ in their benefit from public expenditures on military power: some (the classic "military-industrial complex") can expect lucrative long-term contracts, while others can only expect to foot the fiscal bill.

At any given level of economic interdependence, a state dominated by political affiliates of globally uncompetitive, periphery-linked, security-spending advantaged sectors will pursue a more expansionist and confrontational policy than a state led by actors from globally competitive sectors whose markets are internal or in the core and that make minimal gains from defense spending. Wilhelmine Germany embodied the first type of state due to its notorious coalition of "Iron and Rye" -- the dual dominance of the corporate chieftains of heavy industry and the agrarian Prussian officer-aristocrats. A striking example of the second type of state, led by a political coalition of finance and export-oriented industry, is 1920s Japan, which embraced conciliatory multilateralism. When these sectoral differences coincide with partisan cleavages, struggles over foreign policy can hinge on fundamental strategy, as in the 1930s debate in the United States over isolationism versus engagement.

All else being equal, cooperative and multilateral security policies will likely encourage peace, while confrontational and unilateral policies are more likely to lead to conflict. Beyond this, globalization influences the ways these policies may interact in specific instances.

Many hope trade will constrain or perhaps pacify a rising China, a resurgent Russia, and a proliferation-minded Iran, as it well may. Nonetheless, any prudent analysis must incorporate caveats drawn from each state's particular political economy of security policy. In non-democratic states, however important global markets may be to the economy in aggregate, elites will be most sensitive to the sectoral interests of their specific power base. This mismatch can cause systematic distortions in their ability to interpret other states' strategic signals correctly when genuine conflicts of interest emerge with a nation that is more domestically constrained.

Leadership elites drawn from domestic-oriented, uncompetitive, or non-tradable constituencies will tend to discount deterrent signals sent by trading partners whose own domestic institutions favor commerce-oriented interests, believing such interests make partners less likely to fulfill their threats. For example, one reason the BJP government of India decided to achieve an open nuclear weapons capability was that its small-business, domestic-oriented core constituency was both less vulnerable to trade sanctions and less willing to believe that the U.S. would either impose or long sustain such sanctions, given America's own increased economic interests in India.

Sometimes, deterrent signals may not be sent at all, since one nation's governing coalition may include commerce-dependent groups whose interests prevent state leaders from actually undertaking necessary balancing responses or issuing potent signals of resolve in the first place; the result can be fatally muddled strategy and even war -- as witness the series of weak attempts before the First World War by finance-dominated Britain to deter "Iron and Rye"-dominated Germany.

The emergence of truly global markets makes it all the less plausible under most circumstances that a revisionist state will be unable to find some alternative source of resources or outlet for its goods. Ironically, the more the international economy resembles a true global marketplace rather than an oligopolistic economic forum, the less likely it would appear that aggressors must inevitably suffer lasting retaliatory cut-offs in trade. There will always be someone else with the capability to buy and sell.

Peaceful Relations in a Globalized World
American policymakers should beware claims of globalization's axiomatic pacifying effects. Trade creates vested interests in peace, but these interests affect policy only to the extent they wield political clout. In many of the states whose behavior we most wish to alter, such sectors -- internationalist, export-oriented, reliant on global markets -- lack a privileged place at the political table. Until and unless these groups gain a greater voice within their own political system, attempts to rely on the presumed constraining effects of global trade carry substantially greater risk than commonly thought.

A few examples tell much. Quasi-democratic Russia is a state whose principal exposure to global markets lies in oil, a commodity whose considerable strategic coercive power the Putin regime freely invokes. The oil sector has effectively merged with the state, making Russia's deepening ties to the global economy a would-be weapon rather than an avenue of restraint. Russian economic liberalization without political liberalization is unlikely to pay the strong cooperative dividends many expect.

China will prove perhaps the ultimate test of the Pax Mercatoria. The increasing international Chinese presence in the oil and raw materials extraction sectors would seem to bode ill, given such sectors' consistent history elsewhere of urging state use of threats and force to secure these interests. Much will come down to the relative political influence of export-oriented sectors heavily reliant on foreign direct investment and easy access to the vast Western market versus the political power of their sectoral opposites: uncompetitive state-owned enterprises, energy and mineral complexes with important holdings in the global periphery, and a Chinese military that increasingly has become a de facto multi-sectoral economic-industrial conglomerate. Actions to bolster the former groups at the expense of the latter would be effort well spent.

At home, as even advanced sectors feel the competitive pressures of globalization, public support for internationalism and global engagement will face severe challenges. As more sectors undergo structural transformation, the natural coalitional constituency for committed global activist policy will erode; containing the gathering backlash will require considerable leadership.

Trade can indeed be a palliative; too often, however, we seem to think of economic interdependence as a panacea; the danger is that in particular instances it may prove no more than a placebo.

Is the Foreign Policy Process Working?

For decades, political analysts have dissected the mechanisms in the U.S. government and other institutions to describe how foreign policy is made. The question seems to arise with international crises, and those are upon us again: the Iraq and Afghanistan wars, the confrontation with Iran, HIV/AIDS, and the pressures of climate change, among other issues, underscore the point. With the U.S. government split between parties, fractiousness is in full view.

With troubles for the U.S. global position mounting, it is easy to say that the foreign policy process is not working well. But what are the sources of trouble, and how readily can they be fixed?

This is not the first doubtful moment for the machinery of foreign policy making. At the time of the Vietnam War, the criticism from the public was more deafening than today's, and it took Congress until 1971 to explore, via the Fulbright hearings, the course of the war. That same year, Daniel Ellsberg leaked the Pentagon Papers, seeming to confirm that the apparatus was dysfunctional. Later that decade, hearings conducted by Sen. Frank Church uncovered covert operations, revealing broad illegality. The Iran-contra affair, the nuclear-weapons and "star wars" buildup of the late 1970s and 1980s, and other controversial episodes earned broad scrutiny, typically spurred by public or media activism followed by congressional probes.

We have, in short, been down this road before. The question is what can better be done to make the process work more satisfactorily.

The Current Morass
What is unusual today is that the Iraq war became unpopular rather quickly, with little leadership from the Democrats or strong oppositional voices in the news media or civil society. From support above 70 percent in March 2003, for example, by February 2005 the public was evenly split on the decision to invade Iraq, and support has dwindled since. This has had an impact on accountability: the public's quick disapproval virtually demanded new answers, but Congress, under Republicans until this year, exercised little oversight, and Democrats were unwilling to challenge Bush until the midterm election season in 2006. For the first three years of the war, then, the public's strong skepticism or disapproval was ignored by the workings of government.

Facing growing public unrest and political paralysis within the government, President Bush felt compelled to permit a "fresh look" at the war after a Republican congressman from Virginia, Frank Wolf, proposed such a review upon returning from Iraq in late 2005. The White House was initially opposed, but Secretary of State Condoleezza Rice prevailed and Congress quickly appropriated the money. Former congressman Lee Hamilton and former secretary of state James Baker headed the panel, the Iraq Study Group (ISG).

It is relatively rare for a foreign policy issue that is current, unresolved, and extremely controversial to receive its most formal review and recommendations from a non-governmental body. Apart from the co-chairs, the ISG was composed of members with little foreign policy experience; its forty or so experts were well-versed but drawn from the foreign policy establishment; and its work was done in secret. It withheld its policy recommendations until after the 2006 midterm elections, and the administration immediately undermined its conclusions -- essentially declaring it would not heed such advice -- although in practice it gradually adopted some of the group's views. Altogether, then, the ISG is hardly a model for exploring options.

That it was freighted with responsibilities difficult to deliver on is less a comment on the ISG's competence than on the deeper ailments of the system that produced the Iraq catastrophe and allowed it to fester for years. Now in charge in Congress, the Democrats have not won many points in their oversight functions, either, fidgeting over withdrawal deadlines and the level of coercive language they can use, and failing to convince enough Republicans to come along. Meanwhile, the enormous human toll in Iraq -- one-half of the population in "absolute poverty," high child malnourishment, 70 percent without clean water, and so on -- goes practically unnoticed. So the failure of accountability persists in both branches.

Four Guideposts
The "what went wrong?" question is not merely a matter of competence in foreign policy implementation, but indicative of more fundamental issues. At least four are visible: grand strategy; democratic principles; consultation with allies, adversaries, and international organizations; and matching resources to goals.

Strategy. The "preventive war" strategy underpinning the Iraq invasion was partially a departure from previous U.S. strategy, which had relied mainly on deterrence -- particularly of the use of nuclear weapons -- and diplomacy. But a broader strategy was also at work in the invasion and other actions in the region: the attempt to transform the authoritarian political structures besetting several Arab states (and Iran) in one swiftly delivered blow and subsequent efforts at "coercive democratization." This broad goal, articulated in the 2002 National Security Strategy, was born of the shock of the 9/11 attacks and a pre-existing desire by many in the Bush administration for a much more assertive military posture in the region and around the world.

But the strategy was hardly debated in foreign policy circles or Congress, much less among the broader public, before it was embedded as the national strategy. Despite the abject failures in Iraq -- no WMDs found, no transformation of the region toward democracy and free markets, and all at enormous cost -- it remains official doctrine, and little discussed. While presidential doctrines may be the Washington equivalent of New Year's resolutions, the nation -- led by political leaders, intellectuals, and civil society -- needs to take this one more seriously.

Democracy. The Bush administration has formed and conducted much of its foreign policy in secret, anathema to democratic principles, and has avoided congressional involvement, even though the Constitution grants significant power to Congress in global affairs. On both counts, this behavior is stoutly anti-democratic.

Matters of secrecy are not merely anti-democratic in a formal sense; the practice has powerful consequences. As the Commission created to explore government secrecy in the mid-1990s put it, "secrecy has the potential to undermine well-informed judgment by limiting the opportunity for input, review, and criticism, thus allowing individuals and groups to avoid the type of scrutiny that might challenge long-accepted beliefs and ways of thinking."

That the Bush White House is resolutely closed to scrutiny is well established. Its secrecy about the reasons for going to war with Iraq, particularly the virtually nonexistent intelligence regarding WMDs, is now widely accepted as a colossal blunder. Secrecy is sometimes necessary, as all acknowledge, but the attempts at balance begun in the post-Cold War era have been set back drastically. And despite the foreign policy blunders, the current president's penchant for secrecy has not subsided, and Congress is not challenging that, either.

The role of Congress is always in play during foreign policy debacles. "War nourishes the presidency," Arthur Schlesinger, Jr., once noted, and presidential powers in foreign policy tend to be cumulative, rather than episodic. Scholars generally agree that Congress and the president share foreign policy power equally, though the lack of precision in the delegation of authority is an "invitation to struggle." And struggle there has been since 9/11, particularly in confrontations over funding for the Iraq war, which implicate the main authority Congress clearly possesses -- the power of the purse.

Here, the current Congress -- in contrast to the rubber-stampers of 2003-2006 -- has made minor inroads, at least forcing a debate about the timing of withdrawal, but the spending is approved and indeed the record-sized military budget overall is sailing through Congress with few visible objections. In past episodes in Southeast Asia and Central America, funding cutoffs or restrictions were the preferred method of exercising congressional authority, and were sometimes circumvented illegally.

Even the relatively mild efforts at oversight, however, have been met by the administration with charges that oversight "emboldens the enemy." This tendentious language undermines cooperation and intensifies the struggle between the two branches, hindering effective dialogue, action, and accountability.

Consultation. The lack of consultation is not limited to Congress. The war in Iraq, in contrast to the war in Afghanistan, has been conducted without heed to multilateral institutions, including international law, or to longstanding allies, apart from the U.K. Rice's advice to Bush to "punish France, ignore Germany and forgive Russia" for their opposition to the Iraq war is emblematic of that attitude. Collective security decision-making is bound to be more cumbersome and cautious than the decision making of individual states, but that can be an advantage in situations that are not urgent.

Likewise, addressing foreign policy crises through multilateral institutions like the UN or NATO provides other benefits: legitimacy (and legality) of action, cost and burden sharing, better intelligence, and international (and cross-cultural) dialogue, to name the most obvious. U.S. presidents in recent years have generated misleading expectations about the UN in particular by focusing on the supposed constraints of international institutions. In trade and other fields, however, multilateralism is welcome, because American interests are served and indeed preeminent. At least prospectively, the benefits of multilateralism should accrue to both economics and security.

Regional diplomacy is also a matter of consultation; it was conducted only sporadically and bilaterally until this spring, when very brief meetings of regional stakeholders in the Persian Gulf were convened. (A pivotal recommendation of the ISG, regional diplomacy remains meager and fraught with additional and divisive issues, such as Iran's nuclear program.) This lack of consultation and negotiation is chronically problematic for U.S. foreign affairs.

Resources. As is widely noted, the wars in Afghanistan and Iraq, among other foreign policy priorities, are not supported by sufficient levels of resources. The wars, for example, have in effect been financed by deficit spending, and are often presented as supplemental budgets, which are less transparent and less subject to review than annual requests. Undermanning is a related problem: the near-universal conclusion that too few U.S. troops have been involved in Iraq from the beginning is in part a resource issue.

Prominent among the other areas of foreign policy implementation where resources did not match objectives is President Bush's HIV/AIDS initiative. Congress appropriated more money for prevention and treatment than is being spent. A number of critics also point to the homeland security effort, a linchpin of the global war on terror, as clearly demonstrating a lack of adequate funds to realize its stated intentions.

The Bush administration is not the first presidency to set goals it could not achieve with the resources it was willing to mobilize. In combination with its strategic ambition, resistance to congressional involvement, opacity, and unilateralism, however, the failure to match resources to objectives is all the more disabling.

Examining the Process
While the policies themselves deserve more exploration by scholars, journalists, and policy professionals, as well as by Congress, the process of policy making and implementation should not be ignored. As a general rule, the right has favored more executive power and the left more congressional input. What are the relative merits and drawbacks of these two preferences? Can new mechanisms of accountability -- paying for wars with a special tax, for example -- proceed without excessively boxing in presidential authority? How can transparency in intelligence analysis and budgeting be facilitated? Can we have a national discussion about the U.S. role in the world -- for example, our relationship to multilateral institutions -- that is encouraged by political leaders?

Grappling in a sustained, sophisticated, and non-partisan way with the foreign policy process is long overdue. Iraq in particular demonstrates how badly broken the process is, a canary in the coal mine for U.S. globalism in the years to come.

Can Scientific Codes of Conduct Deter Bioweapons?

At least since the First World War, when the German army sabotaged the Allies' pack animals with anthrax and glanders, worldwide concern about biological weapons has focused on how to improve legal restraints against biological weapons (BW). Over these same years, the major powers have vacillated in their willingness to promote international treaties and laws against BW programs. At the end of the Cold War, hopes were high for a global consensus to strengthen the 1972 Biological and Toxin Weapons Convention (BTWC) by creating a standing organization comparable to that of the 1993 Chemical Weapons Convention and expanding the treaty's mandate to ensure compliance. Instead, in the name of national security, the United States has recently promoted an emphasis on voluntary measures. One of these, the international adoption of biosecurity codes of conduct, puts the burden on elite scientists to solve a problem of weapons proliferation that can be better addressed by effective legal restraints.

The American promotion of biosecurity codes of conduct has its origins in a diplomatic impasse. In July 2001 in Geneva, then Under Secretary of State John Bolton announced U.S. withdrawal from long-term negotiations to strengthen the BTWC. For seven often difficult years the Ad Hoc Group (AHG) of States Parties to the treaty had been hammering out a protocol to improve verification and treaty compliance. The 1925 Geneva Protocol forbids the use of biological and chemical weapons; the BTWC bans all aspects of state programs, including the development, production, trade, and stockpiling of germ weapons or disease agents. During the Cold War, however, the BTWC's lack of strong compliance measures forced a precarious reliance on trust without verification. In 1992, revelations about biological weapons in the Soviet Union and in Iraq underscored the need for change. In addition, the end of the Cold War prompted new optimism about international arms control. For biological weapons, the optimism proved short-lived.

The July 2001 withdrawal of the United States from AHG negotiations was followed in December 2001 by American pressure to disband the group entirely. Instead, a compromise was reached. For the next five years, until the BTWC's Sixth Review Conference in 2006, AHG discussions of biosecurity measures would focus on voluntary options. One of these, the development of codes of conduct to prevent the misuse of biomedical research, depended directly on input from scientific academies worldwide.

Encouraged by the U.S. Department of State and the States Parties to the BTWC, the InterAcademy Panel (IAP) on International Issues, a global network of science academies, created a biosecurity working group, consisting of members from the United States, the United Kingdom, the Netherlands, Italy, China, Cuba, and Nigeria. In December 2005, the IAP published its statement of principles regarding scientific codes of conduct. Endorsed by 68 national science academies, these principles exhort scientists to foresee and prevent the harmful consequences of their research, meet required laboratory safety standards, educate themselves, their students and the public about the BTWC and relevant domestic law, and inform authorities of any violations they might witness.

Codes of conduct that address biosecurity can be an important step toward raising general consciousness among biomedical researchers. According to preliminary inquiries, very few Western microbiologists have paid attention to the potential for harm in their work. In this, they lag behind U.S. politicians who, during the 1990s, successfully defined the threat of bioterrorism as a new policy imperative and channeled many millions of biodefense dollars to federal agencies, primarily the Department of Defense.

After the 9/11 attacks and the mysterious anthrax letters that soon followed, there was an enormous growth in American biodefense -- to over $80 billion in open source funding by 2006. This project broadened to include the National Institutes of Health, which, with the creation of National Centers of Excellence at major medical centers and new high-containment laboratories for select agent research, has put microbiology at the center of an unprecedented national security initiative.

Politics, Ethics, and Science
Behavioral guidelines raise fundamental questions about individual conscience versus the impact of the social context on moral choice. Most contemporary microbiologists, although they may feel autonomous in their work, remain susceptible to larger institutional and political pressures. Whether in academic medical centers, pharmaceutical companies, or government facilities, they work in corporately organized settings where norms, professional responsibilities, and missions are bureaucratically defined. In addition to those pressures, these scientific environments react significantly to national norms concerning transparency and public accountability. Their common characteristic is a reliance on scientific methods with no necessary moral component, although critical scientific inquiry might conflict with political strictures.

The capacity of scientists to set aside moral scruples is abundantly illustrated in the history of biological weapons in the last century, when tens of thousands of microbiologists were employed in secret state programs, in defiance of international norms and laws protecting civilians in war. One major power after another -- France, Imperial Japan, the United Kingdom, the United States, and the USSR -- pursued biological weapons for strategic use. Very few of these BW scientists ever recanted their dedication to helping infect masses of civilians with anthrax, tularemia, plague, smallpox and other diseases. None risked the onerous whistle-blower role.

How does one reconcile belief in the moral authority of biomedical scientists, with their knowledge to save lives and prevent suffering, with this dark history? One explanation lies in the power of the closed scientific enclave in weapons research to normalize otherwise conflicting values. In each of the state biological weapons programs, scientists worked in communities isolated from the wider world and sheltered from criticism or controversy. In times of war, they identified as loyal patriots and in times of peace they identified as dedicated government employees.

The 1934-1945 Japanese BW program in occupied Manchuria created an extreme version of the secret scientific enclave. Its main center, Unit 731 near Harbin, was for nearly ten years a garrison town, within which scientists from the best Japanese medical schools lived comfortably in close proximity to their laboratories and to prisons that were a continual source of captive Chinese research subjects. Starting in 1939, these scientists began orchestrating the first modern use of germ weapons in war, which, in the summer of 1942, culminated in lethal disease attacks on dozens of Chinese villages and towns. Decades later, in public confessions, some of them described their blind commitment to serving the emperor and revealed they had "no feeling of apology or of doing anything bad," even when performing human vivisection.

Although the other state BW programs stopped short of war crimes, their scientists had to rationalize their commitment to the goal of mass germ attacks. In the 1920s, the French military used suspicions of German intent to conduct germ warfare to justify their secret BW research. In the Second World War, leaders of the British biological weapons program were dedicated to total war doctrine that made it essential to target enemy civilians in urban and industrial areas. This same doctrine underlies the U.S. development of biological and nuclear weapons during and following the war, and also shaped the later Soviet program.

Biological weapons scientists in secret programs sometimes cognitively divorced their scientific objectives from the broader military mission of mass killing. One example comes from the memoirs of a former Soviet civilian microbiologist, Igor Domarovskij, who worked in the closed city of Obolensk. When his development of a more virulent strain of tularemia was disrupted, he blamed bad management and complained bitterly that his "efforts went for nothing."

The Challenge Today
The twenty-first century thus far appears to offer fewer incentives or opportunities than the last for covert, malevolent exploitation of the life sciences. Wars between major industrial states have ceased, totalitarian regimes have either collapsed or undergone radical transformations, and globalization has increased international communication. Throughout history, though, political entities -- whether tribes, kingdoms, or nation-states -- have consistently sought new, superior weapons. Sooner or later, the allure of biotechnological advances will inspire visions of military advantage that could, as in the past, be secretly pursued. We can only guess how the international transfer of biotechnology will interact with the dynamics of economic growth and political change. What is certain is that, as in the past, the participation of capable scientists is essential to any programmatic degradation of the life sciences -- or their protection.

At first glance, the InterAcademy Panel's third recommendation -- that biomedical scientists should spread information about international laws and policies against biological weapons -- appears unrealistic. The institutional rewards for political action, compared with those for scientific discovery, are practically zero. Yet such engagement is crucial. Biomedical scientists in influential positions are best situated to guard the humanitarian goals of their enterprise, or risk the imposition of other values. In the 1980s, German biologist Benno Müller-Hill, having written about Nazi scientists, was criticized for not characterizing the infamous death camp physician Josef Mengele as a "monster." Müller-Hill's reply was, "I said that Mengele learned nothing but science from his teachers and that his teachers never dared to think about reality. I said that science without justice and equal rights led to Auschwitz."

The issue of codes of conduct relating to biosecurity has put scientists on the alert to a new category of professional responsibilities. But the problem of biological weapons is too important and complex to leave to voluntary measures alone. The best hope for protection against biological weapons lies in the range of legal restraints that have been gradually building over the last several decades.

Unfortunately, these restraints are by no means as strong or comprehensive as they should be. Many nations have still failed to implement the domestic legislation required by the BTWC. No international treaty yet criminalizes individual complicity in developing, producing, possessing or using biological or chemical weapons. The International Criminal Court in its 1998 statute makes no specific reference to biological weapons, only to "employing poison or poisoned weapons." Meanwhile, the United States is stuck in the 2001 BTWC diplomatic impasse. The 1993 Chemical Weapons Convention's Organisation for the Prohibition of Chemical Weapons and the International Atomic Energy Agency have the resources to aggressively promote arms control for chemical and nuclear weapons, respectively, while the 1972 Biological and Toxin Weapons Convention remains unnecessarily frozen in Cold War limitations.

In other vital policy areas, the Bush administration's retreat from international leadership and its misguided reliance on unilateralism and secrecy have been recognized as faulty and even disastrous approaches to world politics. The time is right for American biomedical scientists to use their authority to criticize these same approaches to the problem of biological weapons.

How the International Community Can Help Sudan

There is a tendency in the outside world to see the tragedy in the Darfur region of the Sudan in isolation from the regional conflicts that have been proliferating in the country for a half century. These conflicts reflect an acute crisis of national identity that is both a cause of genocidal wars and a factor in the state's indifference to the resulting humanitarian consequences. This explains the Sudanese government's resistance to international provision of protection and assistance to the affected populations.

The conflicts in the Sudan indicate a nation in painful search of itself, striving to be free from historical discrimination based on race, ethnicity, religion, and culture. It is, therefore, necessary to combine a suitable humanitarian response with solutions that go to the roots of the national identity crisis and address its stratifying implications.

The history of conflict

Initially, conflict dichotomized the country into the Arab-Muslim North, comprising two-thirds of the country in land and population, and the African South, comprising the remaining third, where people largely adhere to traditional African beliefs but have been increasingly converting to Christianity since colonial times. However, this dichotomy is an oversimplification, for the majority in the North are non-Arab, although Muslims. Even the so-called Arabs are in fact a hybrid African-Arab race who, through assimilationist opportunities, were encouraged to pass as Arabs.

The normative framework of assimilation in the North dictated that if one became a Muslim, was Arabic speaking, culturally Arabized, and could claim a genealogical link to an Arab ancestry, one was elevated to a status of respectability and dignity. In sharp contrast, if one were black, one was labeled as a "heathen," and cast into the denigrated category of slaves or enslavables. Islam and Arabism, therefore, allowed people to pass as Arabs, with marginal regard to the color of the skin. However, skin color remained important, for one must not be too dark, as black was considered the color of slaves, nor too light, as that indicated connection with the European infidels, or the Hallab, a gypsy-type racial category. Even the color of the white Arabs was considered undesirable. One had to be the right color of brown to join the honored class. The standard color of the Sudanese "Arabs" is therefore akhdar, which translates to "green," actually the brown color that is representative of the northern hybrid race. As the South was the hunting ground for slaves, this assimilation process was confined to the North and southern identity remained one of resistance.

The North-South dichotomy was reinforced by all the regimes that ruled the country. The Turko-Egyptian Administration (1821-1885) was the first to create a semblance of a state, although it could not fully control the South. The Mahdist revolution that overthrew the Turko-Egyptian rule in 1885 established a theocratic Muslim state, but still could not subdue the South. The Anglo-Egyptian conquest in 1898 established the Condominium Administration in which the British, the dominant partner, administered the country as two separate entities, encouraged Arabism and Islam in the North, and isolated the South, leaving it to develop along traditional African lines. While the North developed economically, politically, and socially, the South was neglected, except for rudimentary educational and health services provided by Christian missionaries. As the Sudan approached independence, because of pressure from the North and Egypt, the British suddenly reversed the policy of separate development in favor of a unitary state, with centralized administration and no safeguards for the vulnerable people of the South.

It must be emphasized that what generates conflict is not the mere differences of identities, but the implications of those differences in the sharing of power, wealth, social services, employment and development opportunities. In virtually all of these areas, the South was totally neglected. Although the South is the richest in natural resources, abundant arable land, water supply, livestock, timber, and minerals, because these resources were not developed and the South remained in a state of inertia, the British felt that it was not viable as an independent country, and would remain dependent on the North. Implicit in that dependency was to be northern domination in which the "Arabs" replaced the British in a system of internal colonialism, which the South resisted violently.

Southern rebellion began in August 1955, four months before independence on January 1, 1956, as a result of fears that independence would not only result in northern domination, but could also mark a return to the Arab enslavement of the Africans. This triggered a secessionist war that would last for seventeen years, kill more than one million people, and force another one million into refuge in neighboring countries. The war was ended by the Addis Ababa Agreement of 1972, which gave the South regional autonomy. The agreement also gave the Ngok Dinka of Abyei, annexed to the administration of the North in 1905, the right to decide by referendum to return to the South, although that provision was never implemented. In 1983, the war resumed because of the unilateral abrogation of the Addis Ababa Agreement by the central government. Unlike its secessionist predecessors, the Sudan People's Liberation Movement and Army (SPLM/A) recast the objective of the war as the liberation of the whole country and the creation of a New Sudan, in which there would be no discrimination on the grounds of race, ethnicity, religion, culture or gender.

This recasting of the war objectives appealed to the non-Arab regions of the North, which began to see themselves as Africans and became more conscious of their own marginalization by the Arab-dominated center. They even saw themselves as worse off than the South, where decades of liberation struggle had gained some concessions from the Khartoum government. The first to join the SPLM/A in the mid-1980s were rebels from the Nuba Mountains of Southern Kordofan and the Ingassana (or Funj) of southern Blue Nile. In 1991, non-Arab groups from Darfur, with the support of the SPLM/A, staged a rebellion that was ruthlessly crushed. Rebellion in Darfur resumed in 2003, led by two groups, the Sudan Liberation Movement and Army (SLM/A) and the Justice and Equality Movement (JEM). The Beja of eastern Sudan also engaged in a low-level rebellion whose objectives have much in common with those of the other regional rebel groups. Even the people of Nubia, to the far north, are reviving their distinctive identity and pride in their Nubian civilization. One word is often given by these regions to explain the root cause of their rebellions -- marginalization -- the denial of civil, political, economic, social, and cultural rights of citizenship.

Ironically, while the South was initially deemed too poor to be viable as an independent country, the discovery of large oil reserves in the South has now shifted the argument to the North needing the South. The wealth of the South itself became a source of conflict. The mammoth Jonglei Canal, which aimed at retrieving the waters of the swampy Sudd region of the South and channeling them into irrigation schemes in the North and Egypt, provoked a violent reaction from southerners that interrupted work on the canal. Likewise, when commercial oil reserves were discovered in the South, the plan to pump the crude to the North for refining and export, generating revenue for the central government, provoked a hostile reaction in the South that forced Standard Oil of California to abandon work in the Sudan. Sanctions against the Sudan because of involvement in international terrorism kept Western companies out of oil exploration in the country, and international pressure forced Talisman of Canada to sell its concessions. Companies from China, Malaysia, India, and other Asian countries stepped in and intensified oil production, with little or no concern for the environment or the rights of the local population in the oil-rich areas, where people were massively displaced to clear the fields for production. Oil production literally fueled the conflict and raised the stakes very high. A joke has a northerner saying, "We fought the South for all these years because of their mango trees. Now that oil has been found in their land, how can we leave them in peace?"

Current agreements

After a long peace process that was initiated in 1993 by the sub-regional organization, the Inter-Governmental Authority for Development (IGAD), and reinvigorated in 2002 by the international community -- in particular, the United States -- the government of the Sudan and the SPLM/A concluded the Comprehensive Peace Agreement (CPA) on January 9, 2005, ending a devastating conflict that had killed more than two million people, displaced more than four million internally, and forced one-half million into refuge. The CPA gives the people of the South the right to govern themselves during a six-year interim period and to decide, by a referendum to be held after that period, whether to remain in a united Sudan or become an independent state.

The CPA also stipulates that efforts be exerted during the interim period to make unity attractive to the South. During the interim period, the South is to share in the Government of National Unity (GNU) in which the president of the South will also be the first vice president of all of the Sudan, and the SPLM will have proportional representation in all branches of the government. The South is to retain its own army, the SPLA, and Joint Integrated Units are to be formed as the nucleus of the national army, should the South opt for unity. Wealth-sharing arrangements give the North 50 percent of the revenue from the oil in the South and other southern resources. The Central Bank will have two branches, a northern branch, which will follow the Islamic banking system, and a southern branch, which will follow a conventional banking system. The guiding principle is "One Sudan, Two Systems," a concept that was developed by a task force at the Washington-based Center for Strategic and International Studies.

The CPA gives the people of Abyei the right to decide by a referendum to be held simultaneously with the southern referendum whether to join the South or remain in the North in a special autonomous administrative status under the North-South collegiate presidency. The agreement also gives the people of Southern Kordofan and southern Blue Nile a significant measure of autonomy and the right to have their views sought on their system of governance through popular consultation, a form of internal self-determination.

In Darfur, the Darfur Peace Agreement (DPA) was reached in May 2006 between the Sudan government and a faction of the SLM, which also offers the people of Darfur a measure of autonomy. The DPA has, however, been rejected by JEM and the SLM/A as well as by the overwhelming majority of the people of Darfur. As a result, war continues unabated in Darfur.

In the eastern region, with the mediation of Eritrea, the Eastern Sudan Peace Agreement (ESPA) was reached in November 2006. It is likely, however, that the ESPA will run into the same contradictions that confronted the DPA, and that hostilities will resume.

While the CPA has stopped hostilities in the South and the border regions of Abyei, Nuba Mountains, and southern Blue Nile, and the DPA and the ESPA aim at doing the same in Darfur, these agreements have not effectively addressed the national identity crisis and the marginalization of the non-Arab regions. It is also becoming obvious that their implementation is seriously flawed in both pace and content. What appears to be emerging is a pattern of containment in which the National Congress Party (NCP) continues to dominate the GNU and pursue its Arab-Islamic agenda, with minimum concessions for containment purposes.

The process of transformation

Sudan's crisis of national identity is reflected in two major distortions that require a correction of the system of governance. One is the racial and cultural self-perception of the dominant ethnic groups as Arabs, despite the obvious indicators of racial admixture. The other is the imposition of Arab-Islamic identity as the framework for national identity, which is inherently stratifying and discriminatory. The vision of the New Sudan postulated by the SPLM/A aims at transforming the country to be rid of any discrimination based on race, ethnicity, religion, culture and gender. This has widely been misconstrued as turning the tables to make Sudan an African country and reverse the discrimination to be against the Arabs, an allegation which the SPLM/A vehemently denies. For the average southerner, however, northerners are so entrenched in their Arabism that even the objective of promoting equality seems farfetched.

So, lofty as the vision of the New Sudan is, it was viewed with suspicion and even hostility by both sides. Southerners generally saw it as utopian and undesirable, insofar as it was premised on maintaining the unity of the country. They preferred to see the country as divided between North and South, Arab and African, Islamic and Christian, and demanded the right of self-determination with a predominant view towards secession. To them, the vision of a united New Sudan was that of the SPLM/A leader, Dr. John Garang de Mabior, and, at best, was a clever ploy to counter the anti-separatist biases in Africa and the international community, and perhaps gain allies in the North.

Northern reaction ranged from dismissing the vision as unrealistic, to resenting it as an insult to the Arab-Islamic image of the country, to confronting it as a threat to the establishment. With time, southerners began to appreciate the vision as it brought tangible benefits to their struggle. Northerners, especially from the marginalized non-Arab regions, became even more inspired than southerners by Garang's vision. However, his untimely death in a helicopter crash in the summer of 2005, only three weeks after being sworn in as first vice president of the country and president of the government of southern Sudan, was a major setback for the pursuit of the vision.

Nonetheless, through his powerful vision, Garang set in motion a quest for the transformation of the country that seems irreversible. It is obvious that the division of the country into an Arab-Islamic North and an African-Christian and animist South is being seriously challenged. The perception of the hybrid African-Arab race and culture in the North as Arab is also being scrutinized by some, including many "Arabs." Even with the prospects of southern secession, the vision of a New Sudan is not likely to fade away. An independent South would remain sympathetic to the plight of the black Africans in the North, who are almost certain to continue their struggle against Arab domination and would continue to seek the support of the South. The question remains whether this will be done through a peaceful democratic process or through armed struggle.

The CPA was the outcome of regional and international solidarity in peacemaking. Beyond the South, Darfur at first seemed to offer an opportunity for the African Union (AU) to demonstrate its ability to transcend the traditional restrictions of sovereignty, narrowly interpreted as a barricade against international involvement, and to see it instead as a positive concept of state responsibility to protect and assist its citizens in cooperation with the international community. If the AU fails in this apportionment of the responsibility to protect, it will be a major setback for the nascent organization, which promised significant progress beyond the constraints of its predecessor, the Organization of African Unity. The current formula of a hybrid force of AU plus, with more plusses from the international community as needed, may be not only the practical, but perhaps also the desirable way forward.

Beyond responding to the immediate humanitarian needs in Darfur, the international community should make a concerted effort to help the Sudanese develop a framework of transformation based on the acceptance of pluralism and full equality of citizenship. Just as peace came to the South, and will most likely be realized in the other regions, through the active involvement of regional and international actors, the transformation of the country into a New Sudan of full equality will also require the support of the international community. However, in the end, the destiny of the nation is in the hands of the Sudanese themselves. Will the country transform itself constructively towards the envisioned New Sudan or will it fragment and disintegrate into a failed state? This may sound like a rhetorical question insofar as the desirable answer is obvious, but what should be is not necessarily what will be. The question about the future of the Sudan therefore remains pertinent.
