The Bush administration’s assertion of a right to flex its offensive military muscle against so-called rogue states via pre-emptive force and preventive war is both a political response to the terrorist attacks of September 11, 2001, and consistent with the history of active U.S. interventionism. But while offensive force is certainly not a new development or concept, its expression in the form of a very public national security doctrine, together with the President’s claim of a moral right to preempt or prevent threats, represents a highly expansive interpretation of that history. Since the early years of the Republic, there has been a crusading missionary component to America’s existence: namely, to tell the world about the success of the American experiment and to extend to others the benefit of its wisdom. For Thomas Paine, America was “an asylum for mankind,” and John Winthrop once spoke of a shining “city on a hill,” words which later formed the basis of Manifest Destiny and American exceptionalism.1
What follows is an examination of the Bush Doctrine and an assessment of its consistency with the history of American foreign policy. This article will analyze the specific tactics associated with offensive force and illustrate why the strategic national security doctrine of striking first places a premium on targeting so-called rogue states that pursue weapons of mass destruction and align themselves with terrorist groups. The examination will also focus on the political foundation of President Bush’s interpretation of the doctrine of the right of self-defense. Afterward, the article will engage in an historical analysis of American interventionism to establish that the Bush Doctrine of striking first is not a major transformation in U.S. national security. Then, the essay will dissect the parameters of the Bush administration’s notion of democratic idealism and will conclude by highlighting the costs associated with exercising offensive military force.
Iraq and the Targeting of Rogue States
The 9/11 terrorist attacks provided the United States with the political opportunity to transform its strategic national security doctrine away from what some perceived as Cold War-era containment and deterrence toward a strategy that emphasizes offensive warfare against so-called rogue states or dangerous nations and regimes. For one observer, 9/11 represented a “long-standing call for the U.S. to develop a comprehensive strategy that finally spoke to the challenges of the Post-Cold War era.”2 National Security Adviser Condoleezza Rice described the political opportunities for strategic alteration of U.S. national security policy by comparing 9/11 to the immediate post-W.W.II period that provided fertile ground for the assertion of the Truman Doctrine:
President Bush decided almost immediately to build his doctrine on a foundation that included the targeting of states. In a speech on September 12, the president stated “we will make no distinction between the terrorists who committed these acts and those who harbor them.”4 Bush also added that the attacks “were more than acts of terror, they were acts of war.” Thus, unlike any other attack on American targets by terrorists, 9/11 would not be interpreted by Bush as merely an isolated crime, but as the beginning of a war against states and non-states alike.
But how should the United States respond, with what means, and, most importantly, where? Some in the administration saw 9/11 as an opportunity to attack Iraq, but others believed the President had to target only those terrorists and states directly implicated in the attacks on America, namely Al-Qaeda and the Taliban in Afghanistan. Leading the charge to influence the president to target nation states was Deputy Secretary of Defense Paul Wolfowitz. He stated:
The attacks of 9/11 allowed the Bush administration to expand the U.S. military response to include terrorists and state sponsors of terrorism in a world that for Bush was now clearly defined. In other words, the president retained the option of overthrowing states that supported terrorism or pursued weapons of mass destruction that could be supplied to terrorists. In his speech to Congress on September 20, 2001, he contended:
Bush also expanded the war on terrorism to include pre-emptive and preventive action against states that both sponsor terrorism and/or pursue WMD. In his 2002 State of the Union, Bush stated, “First, we will shut down terrorist camps, disrupt terrorist plans and bring terrorists to justice. And second, we must prevent the terrorists and regimes who seek chemical, biological or nuclear weapons from threatening the United States and the world.”7 Bush also linked North Korea, Iran, and Iraq as states that sponsor terrorism and pursue weapons of mass destruction (WMD) and hinted at U.S. action against Iraq.
Bush also made a case that all three states served as imminent threats to the United States:
Bush made his most forceful public case for striking first on June 1, 2002, in his graduation speech to army cadets at West Point. He stated that “our security will require all Americans to be forward-looking and resolute, to be ready for preemptive action when necessary…”10
The publication of the 2002 National Security Strategy (N.S.S.) document cemented these ideas into a formal presidential doctrine. It states, “today, our enemies will use weapons of mass destruction as weapons of choice… We cannot let our enemies strike first.” It also goes on to justify the need for pre-emption based on the concept of self-protection:
The N.S.S. also asserts the power to engage in preventive war:
Furthermore, it posits U.S. global primacy by stating that “our forces will be strong enough to dissuade potential adversaries from pursuing a military build-up in hopes of surpassing, or equaling, the power of the United States.”13
Bush’s embrace of preemption and preventive war against terrorists quickly became known as the “first strike doctrine.” Preemptive military force, or preemption, involves striking first at an imminent and ominous threat in the belief that an attack is about to occur. Preventive war is the use of force against non-imminent threats in the hope of forestalling future attacks. This highly controversial method of using force dismisses the utility of deterrence and containment, is based almost exclusively on unilateralism, and places considerable faith in predicting the future intentions of states and non-states. Both preemption and preventive war are premised on the belief that terrorists who combine suicidal attacks with other deadly tactics, and states that support terrorism and pursue WMD, cannot be contained and deterred.
Right of Self-Defense
The political premise of Bush’s doctrine of striking first rests on the traditional right of self-defense. Self-defense as a politically legitimating factor on the road to war has a long history in international law, dating to Hugo Grotius in 1625.14 Historically, the legitimate claim of self-defense included the right to preemptive use of force. This “inherent moral right” was not absolute, however. In 1842, U.S. Secretary of State Daniel Webster, in correspondence with England over the 1837 Caroline case, helped to clarify the conditions under which America could exercise the right to preempt an attack with military force in self-defense. Preemptive military force had clearly defined moral limitations and could only be justified in cases in which “the necessity of that self-defense is instant, overwhelming, and leaving no choice of means and no moment for deliberation [the act of self-defense must also involve] nothing unreasonable or excessive.”15
Chapter VII, Article 51 of the United Nations Charter preserves for member states the right of self-defense: “Nothing in the present Charter shall impair the inherent right of individual or collective self-defense if an armed attack occurs against a Member of the United Nations, until the Security Council has taken measures necessary to maintain international peace and security.”16 Like any legal text, the exact meaning and scope of what constitutes legitimate self-defense are open and have been subject to debate. However, the intent seems clear: resorting to self-defense is legitimate only in cases of real, looming, and imminent attack. In essence, distinctions between unilateralism and multilateralism, states and non-states, and deterrence, preemption, and preventive war are fundamental. A denial of these distinctions is politically and morally unacceptable.
President Bush’s strategic first strike doctrine spells out in detail the case for striking first for the sake of self-defense. According to his National Security Strategy:
Condoleezza Rice has elaborated on this by claiming:
The Bush administration’s version of self-defense and its arguments in favor of a first strike doctrine rest on the larger view that warfare has been transformed by terrorism in general and September 11th in particular. As Colin Powell argues, “It’s a different world…. it’s a new kind of threat.”19 And in several important respects, war has changed along the lines the administration suggests, although that transformation is nothing new, as terrorism has been an ongoing threat to the United States for decades. Prevailing wisdom suggests that nontraditional enemies, namely terrorist groups and rogue states, are prepared to wage modern warfare by concealing their movements, weapons, and intentions; by attacking civilians, military personnel, technologies, and infrastructure; and by shirking their commitments to international law. Provoked by the 9/11 terrorist attacks, American policymakers are motivated by the fear of nuclear, chemical, and biological weapons falling into the hands of terrorists and irresponsible, brutal tyrants. President Bush contends in his 2002 National Security Strategy that Americans face enemies who “reject basic human values and hate the United States and everything for which it stands.”20 Although vulnerability can be reduced, complete immunity is impossible to achieve.
As the argument goes, America’s vulnerability to terrorism and its fear of WMD proliferation have legitimized the Bush administration’s assumption of a more offensive military posture grounded on self-defense. But it is the character of potential threats from states that becomes extremely important in evaluating the legitimacy of Bush’s first strike doctrine. Thus, the assertion is that 9/11 demonstrates that the United States will continue to be confronted by rogue states opposing America’s goal of preventing the proliferation of WMD in the hands of dangerous leaders, such as Saddam Hussein and Kim Jong Il, or their spread to non-state terrorists. Indeed, these twin threats have contributed to a political environment that has allowed Bush to significantly alter America’s strategic national security doctrine. According to Yale historian John Lewis Gaddis:
The 9/11 attack may have justified America’s right to use force as a defensive retaliatory response against states such as the Taliban that sponsored Al-Qaeda. The Bush administration, however, makes a questionable and controversial moral jump when it assumes that “rogue states” not directly affiliated with terrorist activity or the 9/11 attacks desire to harm the United States with WMD and that they pose a military threat. The President sees no essential difference between “rogue states” and terrorists, and he erases the difference between terrorists and those states in which they reside: “We make no distinction between terrorists and those who knowingly harbor or provide aid to them.”22
There is also an important distinction between the intentions and the capabilities of so-called rogue states. A first-strike doctrine sees a major national security threat in the accelerating “proliferation” of destructive capacities, which it suggests is due to the rising technical capacities of states beyond the traditional great powers. Importantly, the doctrine makes no distinction between the acquisition of what it terms “the world’s most destructive technologies” and the intention to turn those technologies into usable weapons.23
The Long Tradition of U.S. Interventionism
Since the early days of the American Republic, presidents have embodied a nationalist and idealist yearning that has transformed U.S. foreign policy into a moral crusade or visionary quest for spreading American values throughout the world via armed intervention. For example, a 1975 congressional study makes clear the long history of U.S. intervention around the globe since 1798. Prior to its entry into World War II, America had intervened 163 times in foreign nations with its armed forces, and it has averaged one armed intervention per year since the end of that war.24 While the scope of armed intervention has historically focused on Latin America, the Caribbean, and East Asia, the nation has recently used force in Somalia, Liberia, the former Yugoslavia, Afghanistan, and Iraq.
In the very early years of the Republic, America made it clear that it intended to support peoples who wanted independence, but would only serve as an exemplar in doing so. Americans sought to safeguard their daring new adventure in government by shunning all foreign quarrels and overseas obligations. As the eighteenth century ended, President George Washington admonished his countrymen to “steer clear of permanent alliances,” and Thomas Jefferson attempted to strike a delicate balance between trade and national security by warning “Peace, commerce, and honest friendship with all nations, entangling alliances with none.”25 On July 4, 1821, Secretary of State John Quincy Adams even enshrined America’s aversion to imperialism with the following precept:
The words of America’s early leaders did not prevent the United States from using offensive military force to expand westward in violation of treaties with Native Americans, to extend northward into Canada in search of more territory, and to quell Barbary pirate attacks on U.S. commercial interests in North Africa. According to Adams’s yardstick, only a direct threat to national survival could justify America’s entry into foreign wars; that is, non-interventionism would be the benchmark of American foreign policy. But Adams was directing his words toward Europe, in particular Great Britain and France.
Non-interventionism and restraint were certainly not the case within the Western Hemisphere, as America believed its moral concerns extended to what it perceived as its own geographic sphere of influence. Just two years after Adams’s speech, in 1823, President James Monroe altered America’s approach to the world by enunciating his famous Monroe Doctrine to Congress, which, ironically, Adams drafted. The doctrine combined the ingredients of moralism with American self-interest, a mix that has often perplexed and bemused others, by declaring that the United States would regard any attempt by European powers “to extend their systems to any portion of this hemisphere as dangerous to our peace and security.” The second part of the doctrine pledged that, in exchange, the nation vowed “never to entangle ourselves in the broils of Europe.”27
For all the considerable virtue attached to such a pronouncement—idealism and self-interest—Monroe never consulted the Latin American countries covered under his declaration, thereby triggering an everlasting resentment and anger in the region. Of course, the United Kingdom gave America enough leeway to utter such a bold decree, since British control of the Atlantic sea lanes spared America from having to build up its own navy more rapidly. Although the United States enforced the Monroe Doctrine inconsistently, according to whatever interpretation of the national interest suited it at the time, it was invoked frequently by Monroe’s successors. From 1806 until the present, the United States has intervened with its armed forces more than 100 times in Latin America. These interventions included brief military forays against coastal areas, armed attacks on pirates, efforts to secure the construction of the Panama Canal, violent overthrows of anti-U.S. governments, forcible takeovers of property for repayment of debts, arrests of renegades, and sometimes direct assistance to get rid of dictatorships. There was always a publicly stated moral imperative behind each action, ranging from forestalling chaos, to saving the lives and property of Americans, to extending democracy abroad. Lurking in the shadows were American economic interests.
The greatest period of U.S. dominance in the region started with President Theodore Roosevelt and ended with President Franklin Roosevelt. American soldiers occupied five countries for decades: Cuba, the Dominican Republic, Panama, Haiti, and Nicaragua. The rationale was to uphold U.S. values in unsettled situations south of the border; President Theodore Roosevelt even formulated a corollary to the Monroe Doctrine, arrogating to Washington the right of “preventive enforcement” to intervene against governments that failed to pay their debts.28 However, each occupation turned out to be a debacle of one sort or another, and the moral arguments soon appeared shabby and thin. Since the American public eventually saw no moral basis for these incursions, it withdrew its mandate, leading President Franklin Roosevelt to withdraw the remaining U.S. forces in the region as a gesture from the good neighbor to the north. The United States, however, never fully embraced interdependence until corporate desires drove it to enter the North American Free Trade Agreement (N.A.F.T.A.) in 1993.
President Woodrow Wilson broke with the notion of non-entanglement in European affairs by sending American soldiers into World War I in 1917. Then, at the Versailles peace negotiations, Wilson took a major step toward an unprecedented global commitment by advancing the concept of the League of Nations in order to promote the morally broader ideas of freedom and self-determination in America’s revolutionary heritage. As he said with a very American sense of certainty: “We have come to redeem the world by giving it liberty and justice.”29 Although Wilson failed to win the political argument for U.S. League membership in the U.S. Senate, he made a strong moral case for internationalism. As summed up by Robert Nisbet:
The sheer mayhem and ruination of World War II swept away all of these reservations. America realized that it was going to have to become a permanent, participating member of the post-war world order because its very survival was at stake in a new world soon characterized by the presence of nuclear weapons. In the ensuing Cold War, the United States badly needed friends and allies, both democratic and undemocratic. President Roosevelt proposed that the nation back the United Nations Charter, drafted by his own State Department. The UN would be unlike the League: it was designed to be uniquely configured to America’s national interests as well as its idealism. This body struck a balance between America’s moral foundations and the growing extent of its global power, giving veto power in the Security Council to the five most powerful states of the time and allowing other member nations to debate their views in a General Assembly. The UN would provide the cover America needed to advance its Cold War interests vis-a-vis its nemesis, the U.S.S.R.
President Truman rallied the country around the Truman Doctrine, the Marshall Plan, deterrence, and containment, all through his pleas to deter the spread of Soviet global power and communism. Such a sweeping appeal touched both the moral and security preoccupations of Americans, who were held together by a fierce sense of anti-communism. The strategy worked for decades. Presidents Eisenhower, Kennedy, and Johnson enforced the Truman Doctrine as a way of halting the Soviets and protecting American business interests within the structural parameters of the Bretton Woods international monetary system, which fixed exchange rates to the U.S. dollar. However, both Johnson and Nixon tilted too far toward national security arguments alone, eventually losing their moral standing on the killing fields of Southeast Asia.
America’s long history of at times embracing dictatorships in defense of U.S. interests eroded the moral basis of public support. America’s unconditional support for the dictatorial regimes of the day—the Shah in Iran, Saddam Hussein in Iraq, Ferdinand Marcos in the Philippines, Manuel Noriega in Panama, and General Augusto Pinochet in Chile (installed after the overthrow of the democratically elected Salvador Allende)—demonstrated that America would set aside its moral principles in order to contain Soviet power and Communism. President Carter elevated human rights to a serious concern in his foreign policy, but he applied the policy in such a haphazard and unconvincing way that he could neither find moral support at home nor address America’s national interests abroad. President Reagan appealed to America’s moral soul by calling the U.S.S.R. an “Evil Empire” and demanding that Gorbachev “tear down this wall,” but he stumbled over the Iran-Contra scandal, his mammoth deficits, and his support for right-wing death squads in Central America.
Since the collapse of the Soviet state, international relations scholars have debated what they see as a U.S. foreign policy fluctuating between multilateralism and unilateralism.31 Proponents of multilateralism recommend greater U.S. reliance on international laws, institutions, and organizations to manage global issues; they prefer containment strategies as the most effective means to promote international cooperation.32 Advocates of unilateralism maintain that America should use its power to reshape the global system in accordance with its interests.33 In the absence of the Soviet Union and Communist governments in Eastern Europe, U.S. foreign policy meandered under post-Cold War presidents George H.W. Bush and Bill Clinton. As observed by National Security Adviser Rice:
President George Bush Sr. seemingly fought the drug war by eventually ousting the longtime American-backed Noriega and restoring democracy in Panama, but without first seeking the consent of the UN Security Council. He then secured the backing of the Security Council for his assault on Saddam Hussein in Kuwait, which instantly gave him moral authority, though some interpreted it as an attempt at rescuing oil companies under the guise of protecting a defenseless country against an aggressive one. He also had morality on his side when he sent troops to Somalia to prevent a widespread humanitarian crisis, but his efforts ended in disaster and withdrawal under Clinton.
This is why President Clinton sidestepped the issue of sending U.S. troops to Rwanda. Although he was able temporarily to build up his moral credentials when he ordered American troops to Haiti in 1994, he had difficulty sustaining public sympathy for his economic interests-based policy of engagement with China. Clearly, the twin issues of human rights violations and alleged Chinese financial support for his 1996 re-election eroded his ethical footing. Even more, with the end of the Cold War, Clinton faced a public that felt that, since the Cold War had been won at the global level, Washington should look more inward.
The threat of global governance, blue-helmeted peace-keepers, multilateralism, and international rules and treaties featured prominently in the conservative agendas in Congress. Deprived of anti-Communism as the belief holding disparate forces together, populist conservatives found that attacks on the UN and international organizations resonated with a more economically and culturally insecure America. Rejecting as liberal hogwash the “assertive multilateralism” of Clinton’s Secretary of State Madeleine Albright, the Republican Congress appealed to America’s culture of individualism, making simultaneous cases against the domestic welfare state and Cold War-era multilateralism in foreign policy. In an unprecedented speech by a U.S. legislator before the UN, Senator Jesse Helms, Chair of the Senate Foreign Relations Committee, threw down a clear challenge:
Clinton’s foreign policy measures were tempered by such popular legislative sentiments. Evidence of this ambivalent multilateralism is plentiful. For example, between 1993 and 1995, U.S. foreign policy toward the war in Bosnia-Herzegovina was driven by the view that the United States would intervene only if the fighting expanded beyond Bosnia and Croatia. The war ended only when Clinton unilaterally forced the warring sides to the peace table at Dayton, Ohio, in 1995, allowing a diplomat from the United Kingdom only a minor voice representing the European Union (EU). The NATO air campaign against Serbian forces in Kosovo eventually ended the fighting there, and it largely fulfilled U.S. unilateral interests because the bombing was carried out mostly by American forces under the guise of NATO. Military operations were given tacit support from Russia in exchange for U.S. economic inducements.36 Also, over the objections of the EU and Canada, Clinton continued implementing U.S. sanctions against European and Canadian businesses operating in Iran, Libya, and Cuba. Further, Clinton’s inability in October 1999 to persuade the U.S. Senate to supply advice and consent on the Comprehensive Nuclear Test Ban Treaty projected an image of the United States as unwilling to engage with other nations. Clinton also began testing a national missile defense system, one that would eventually contribute to George W. Bush’s decision to withdraw from the Anti-Ballistic Missile Treaty of 1972.
After 9/11, President George Bush the younger and his national security team began chipping away at a host of treaties and conventions they believed constrained U.S. power.37 The Bush administration’s strong ambivalence toward multilateralism deprived international institutions of the powers necessary to respond to nontraditional security issues such as conflicts over natural resources, public health and infectious diseases, international crime, and environmental degradation. Secretary of Defense Donald Rumsfeld, Deputy Secretary of Defense Paul Wolfowitz, and Vice-President Richard Cheney held—and hold—traditional neoconservative views of security that leave little or no room for consideration of proposals for new forms of global governance to address non-security issues and non-traditional, yet very real, threats. For example, Bush’s decision to renounce the Clinton administration’s support for the International Criminal Court is reflective of an anti-multilateral streak. Arms Control Undersecretary John Bolton’s statement that signing the letter renouncing the Rome Statute “was the happiest moment of my government service” evidenced a unilateralist campaign.38 Similarly, Bush’s opposition to the Kyoto Protocol on climate change is well known, as is his desire to undermine efforts to establish international norms on fossil fuels. Moreover, the power of the U.S. oil industry in shaping foreign policy was also exposed in a leaked memo from Exxon-Mobil that had previously asked the White House: “Can Watson be replaced now at the request of the US?”39
Such details underscore fundamental shifts in the policy discourse of the Bush presidency. For the Bush foreign policy team, at stake is the ability of the United States to flex its global might. To make the twenty-first century a “New American Century,” neoconservatives who gained the upper hand in the administration have pushed for a fundamental reordering of U.S. global engagement with military strategies designed to fill the Post-Cold War void with American precepts. For Bush, strategies of realism and liberal internationalism that worked in tandem to promote American global power during the Cold War are outdated in today’s world, a world in which the United States is no longer constrained by another superpower. Realism—with its attendant balance-of-power politics, coalitions and alliances, deterrence, and containment—is no longer applicable in what Bush and most of his team view as a unipolar world characterized by major power imbalances between America and the rest of the world. Likewise, Wilsonian strategies of enlightened self-interest designed to build economic and political alliances under U.S. global leadership have been deemed, for the most part, unnecessary and out of touch with today’s global power structure. So, too, are liberal geopolitical strategies such as the democracy-centered policies and humanitarian interventionism of Bush Sr. and Clinton that stressed a new world order of inclusion and rules-based processes.
Whether one describes the current structure of the international system as unipolar, bipolar, or multipolar, it is indisputable that the United States possesses significant military, political, economic, and cultural influence in the world. Paradoxically, the attacks of 9/11 and their aftermath have given the nation a renewed sense of vulnerability. Yet the very fact of being a target has at the same time affirmed U.S. global dominance. The combination of U.S. vulnerability and American global primacy has reinforced anti-multilateral tendencies in American foreign policy.
Some may regard the U.S. invasion of Iraq in March 2003 as reflective of morally problematic elements in this trend against multilateralism:
Critics of the neoconservative assault on multilateralism argue that American interests and national security have been morally undermined by the flexing of offensive military force in Iraq. The thickening web of multilateral regimes and treaties is regarded, as one astute observer of multilateralism noted, as Lilliputian attempts to tie down Gulliver.40 These assaults on internationalist cooperation may bring about the disintegration of the post-World War II moral and political framework of multilateralism, thrusting global affairs into a Hobbesian world where unrestrained power prevails.41
Clearly, there have been strong moralizing elements in the tradition of U.S. intervention that still hold true today. According to Nisbet,
But such words should be tempered by a realism that suggests America has often placed its political interests above its moral objectives, as evidenced by its past embrace of dictators and its support for repressive regimes. According to Walter Russell Mead, “Wilsonians found, to their great chagrin and surprise, that popular enthusiasm for military intervention can be limited. Revulsion against atrocities does not quickly or universally translate into the political will to put American forces in harm’s way.”43 We should therefore be mindful of the words of Reinhold Niebuhr, who observed, “Moral pretensions and political parochialism are the two weaknesses of the life of a messianic nation.”44
George W. Bush and Democratic Idealism
The assertion of a first strike strategic national security doctrine and the long history of U.S. interventionism have invariably been couched in terms of a democratic idealism that has, in varying forms, been on the rise since Woodrow Wilson. George W. Bush’s 2002 National Security Strategy taps America’s deep moral roots and its sense of mission. Instead of promoting Wilsonian liberal values, however, those driving U.S. foreign policy today are more comfortable with stark moral contrasts, linking America’s post-9/11 mission to an apocalyptic conflict between good and evil.
Bush’s moral simplicity has helped him ease the American transition from the targeted war on international terrorist networks to the much broader confrontation with what he calls the “axis of evil” and other so-called “evildoers.” The grand moral scale of Bush’s approach to U.S. foreign policy has been driven by the goal of conquering evil and a dismissal of concerns about the means employed. Bush has nonetheless remained consistent with the moral irony of U.S. foreign policy. For example, his spoken vision of the world in terms of “good and evil” and “us and them” sits uneasily alongside his administration’s continued support for repressive regimes in Pakistan, Saudi Arabia, Israel, and China, and its rejection of human rights conditions on aid.
The moral convictions and contradictions of the Bush administration are consistent with the idealistic image of America as a shining “city upon a hill,” an image seen in stark contrast with the evil forces it is attempting to eradicate. Tom Barry emphasizes, “Over the past five centuries, American society has continued to believe in its own moral transcendence, but the city on the hill has experienced major urban renewal.”45 At the onset of the Cold War, the moral values of the city were commonly regarded as Western principles. The collapse of the USSR led many to believe in the perfection of the American democratic ideal. For Barry, “neoconservative ‘end of history’ and ‘clash of civilizations’ interpretations of history fortified the American conviction that its Judeo-Christian transatlantic culture constituted the epitome of civilization.” He suggests that those who dissent from this ideal, namely Western European leaders, are regarded as “moral relativists, political opportunists, and weak-kneed partners afraid to speak evil’s name.”46
The end of the Cold War left U.S. foreign policy without a defining legacy but began a process toward building a political environment in which moral simplicity and absolutism would flourish under George W. Bush. In the absence of a domestic anticommunist core, no political sector—left, centrist, or right—could persuasively occupy the moral high ground and articulate a new vision for U.S. global engagement. The “New World Order” of the Bush Sr. administration was met with derision from the right, as were the “assertive multilateralism” and revived liberal internationalist policies of the Clinton administration. The left focused almost exclusively on backlash politics opposing the new liberal-conservative consensus on free trade, while alternately supporting and critiquing the liberal-centrist consensus around humanitarian interventionism. The right, also focused largely on backlash politics against the perceived liberalism of the Clinton presidency and largely bereft of its anticommunist core, abandoned the Cold War-style multilateralism of the “West against the Rest” and embraced a new strategy of “the U.S. against the Rest.”
This coherent yet simplistic moral vision unified traditionalist concerns of social conservatives, military-industrial complex advocates, and unilateralists bent on declaring American global supremacy. This troika, today known as neoconservative foreign policymakers and thinkers, forms the core of Bush’s power base. Dismissive of arguments about new transnational threats to global stability (climate change, resource scarcity conflicts, infectious disease), the new vision was at once simple and grandiose: simple in that U.S. foreign policy should not get bogged down in conflicts and humanitarian crises that have no direct bearing on U.S. national interests, and grandiose in that policymakers should assert global hegemony. In the minds of neoconservatives, this agenda was morally legitimized and politically justified virtually overnight on September 11, 2001.
So what is really new about Bush? After all, the United States has a long history of throwing its weight around, intervening militarily, at times sidelining the U.N., at other times allying itself with dictators and human rights abusers, and always asserting for itself the high ground of morality and the blessing of the Almighty. According to Howard Fineman of Newsweek, “Every president invokes God and asks his blessing. Every president promises, though not always in so many words, to lead according to moral principles rooted in biblical tradition. The English writer G.K. Chesterton called America ‘a nation with the soul of a church,’ and every president, at times, is the pastor in the bully pulpit.” But Fineman describes Bush as occupying a unique position in this tradition: “it has taken a war, and the prospect of more, to highlight a central fact—this president and this presidency is the most resolutely ‘faith-based’ in modern times, an enterprise founded, supported, and guided by trust in the temporal and spiritual power of God.” Bush’s personal friend and Secretary of Commerce Donald Evans contends that moral vision “gives him a desire to serve others and a very clear sense of what is good and what is evil.”47
Bush’s new first strike strategy against “evildoers” incorporates many of the operative and cultural features of American exceptionalism and interventionism. However, it has largely dropped the notion that U.S. leadership should operate within a framework of rules, norms, and institutions designed to benefit all nations. As a result, Washington’s global leadership is increasingly seen as less benign. Under Bush, the salient features of the nation’s global engagement are its aggressive anti-multilateralism, renewed militarism, and disdain for diplomacy, along with a strict moral interpretation of right and wrong, good and evil, and a with-us or against-us stance. Underlying and fortifying these currents is the language of anti-terrorism, which has replaced anti-Communism as the organizing and unifying principle in post-9/11 American politics.
Thus, the emerging Bush strategy is an agenda of preemption and preventive war distinguished by a “moral simplicity” that, according to Bush, justifies America’s “endless” war against “evildoers.” Bush’s moral simplicity and warning that you are “either with us or with the terrorists” reflect a one-dimensional approach to foreign policy. The president’s morally simplistic worldview also relegates realism to the backburner. For example, in a key foreign policy speech at West Point in June 2002, Bush outlined a supremacist or neo-imperial agenda of international security. Not only would the U.S. no longer count on coalitions of great powers to guarantee collective security, it also would prevent the rise of any potential global rival by keeping U.S. “military strengths beyond challenge.” As for the “moral clarity” that Bush’s supporters say he uses to interpret world politics, we should welcome the valuable words of Bryan Hehir, former head of the Harvard Divinity School:
The first strike doctrine claims that policy change was inevitable after years of continuity and relative policy tinkering during the Cold War. But how much is the United States willing to operate alone in the area of national security, how much change will Bush promote in the domestic political processes of other states, and with what means? The president’s desire to make his doctrine a highly public declaration and his claim that the war on terrorism is just render these questions legitimate. America’s invasion and subsequent occupation of Iraq certainly demonstrates that the United States can redraw regional maps and replace governments by force over the objections of most of the world.
However, the American occupation of Iraq represents one of the most highly controversial deployments of American power since World War II. Hardly anyone in Congress or in the public debated the overall diplomatic consequences of the relatively unilateral invasion that was opposed by many of America’s traditional allies and the United Nations. Fewer raised the most fundamental point: A global strategy based on the first strike doctrine could mean the beginning of the end of the very same international institutions, laws, and norms the United States built and strengthened for more than half a century.
Through the Bush Doctrine, America may have expended too much of its political capital, influence, and treasure in order to demonstrate to the world that it is the preeminent superpower. The costs of the U.S. invasion and occupation of Iraq could weaken its efforts to destroy the global Al-Qaeda network and its remnants in Taliban strongholds in present-day Afghanistan. Certainly, as events unfold, America’s struggle to deal with the insurgency and to prevent the influx of foreign fighters into the country reflects a relative hubris on the part of the policymakers who most heavily pushed for the application of the Bush Doctrine in Iraq. As a result, the future application of the Bush Doctrine and the political usefulness of diplomacy, especially with regional and international organizations, appear in doubt, since the president has publicly tied the future success of his strategic policy of offensive force to American success in Iraq. Even more, in the absence of finding Hussein’s alleged WMD, the doctrine’s effectiveness as a tool of counter-proliferation in the war on terrorism is tenuous and ambiguous at best.
What is at stake in the eroding situation in Iraq is nothing less than a fundamental shift in America’s moral and political leadership of the post-9/11 world. Rather than continuing to serve as a first among equals, the United States appears to be taking international law into its own hands, creating new rules and standards of military engagement in the absence of multilateral consent. Opposition from the United Nations and Western Europe indicates a credible global fear of a world in which no state is powerful enough to counter American power, a counterweight role the Soviet Union played during the Cold War era.
In an international environment in which there is no counterweight to American global power (unlike during the Cold War), several problems remain. While the Bush administration is correct in demanding that the threat of non-state terrorism requires a more forward-leaning response, the invasion of Iraq was a different matter altogether. Might not the United States have been able to successfully contain Hussein, deter him, and bring him down the way America in part forced a peaceful dissolution of the USSR, that is, by applying a combination of economic, diplomatic, political, military, and moral pressure on Communism?
While Bush’s arguments in favor of using offensive force against Iraq were powerful, were they also persuasive? The move against Iraq raises fundamentally different questions from those posed by the American and United Nations response to Iraq’s invasion of Kuwait in 1991 and to the 2001 U.S. invasion of Afghanistan in response to the 9/11 attacks. In those cases, the use of force was consistent with just and ethical doctrines of self-defense and met with support around the world. By contrast, what we are concerned with now is whether the overthrow and capture of Saddam Hussein by offensive force was outside the framework of legitimate international security norms and laws.
Taken together, the broader structure of international law and specific notions of power and morality provide us with a lens through which we can assess whether or not offensive warfare is ethical, moral, and just. In essence, international law contains a doctrine of legitimate “anticipatory self-defense” grounded in multilateralism and legitimate global institutions; realism respects a nation’s security dilemma and its narrowly defined self-interests; and just war theory holds that threats must be clearly imminent. For some, the first strike doctrine lies well beyond the legitimate bounds of anticipatory self-defense; they contend that the U.S. invaded Iraq simply to establish a precedent to expand international law to encompass preventive war. Others believe the 9/11 attacks have allowed the United States to define its interests more broadly; that is, an expanded interpretation of self-defense was the only requirement to justify a first strike against Iraq and future offensive wars against both states and non-state actors.
In sum, one can understand why any administration would favor preemption and why some would be attracted to preventive war if they thought these offensive measures would guarantee invulnerability. But this psychological reassurance is at best illusory and the effort to attain it may be counterproductive.
1. See Thomas Paine, Common Sense, ed. Isaac Kramnick (New York: Penguin Books, 1983), 31. Winthrop is quoted in Elizabeth Connelly and Arthur M. Schlesinger Jr., John Winthrop: Politician and Statesman (Chelsea House Publishing, 2000).
The author is assistant professor of political science at the University of Central Florida in Orlando, Florida. Professor Dolan’s research focuses on American foreign policy, national security policy, and the American presidency. His co-edited volume “Striking First” (Palgrave-Macmillan) will appear later this year, and his book-length manuscript “President Bush and the Morality of Offensive War” is scheduled for publication by Ashgate Press in 2005.