The Problem of Evoking the Past to Justify Policy

by Michael W. Santos

History teaches many lessons. One of these lessons may be not to rely on our perceptions of history as guarantees of the future course of events but rather to look to the underlying logic of realities. –Ed.

As the Cold War was ending, Francis Fukuyama argued that trends seemed to suggest that “something very fundamental has happened in world history.” In particular, “the triumph of the West, of the Western idea,” was evident everywhere. According to Fukuyama, with the passing of Marxism-Leninism and the growing “‘Common Marketization’ of international relations,” the beginning of the end of history was at hand. Although international conflict would not disappear completely because the world would for a time be divided between those in the historical and the post-historical phases of development, major conflicts between large states were “passing from the scene.”1 In retrospect, it may be said of Fukuyama’s assertion about history what Mark Twain once said about reports of his death: that they were “greatly exaggerated.”

Still, as events surrounding the end of the Cold War and its aftermath unfolded, it was easy to believe that Fukuyama might be right. After all, not only did the Berlin Wall fall and the Soviet Union collapse, but also students had taken to the streets in China demanding democratic reforms. Even though the protests at Tiananmen Square had been brutally suppressed, they seemed to be a sign that a new generation of Chinese citizens was accepting the Western idea.

Economically, globalization and the creation of regional markets seemed to break down national barriers and generate unprecedented change. “Common Marketization” led to the development of Mercosur in South America in 1991, the establishment of the European Union in 1993, and the implementation of the North American Free Trade Agreement in 1994. The unprecedented international cooperation that allowed President George H.W. Bush to build a coalition against Saddam Hussein during the first Persian Gulf War led him to assert that the planet was on the brink of a new world order, which he described as “an era in which the nations of the world, east and west, north and south, can prosper and live in harmony.”2

Fukuyama’s view of history had a powerful effect on the development of American foreign policy. Fellow neoconservatives William Kristol and Robert Kagan established the Project for the New American Century in 1997, which, according to its website, is “dedicated to a few fundamental propositions: that American leadership is good both for America and for the world; and that such leadership requires military strength, diplomatic energy and commitment to moral principle.”3 As a member of that group, Fukuyama signed a letter recommending that President Bill Clinton commit himself to a strategy aimed “at the removal of Saddam Hussein’s regime from power.” The logic behind such a policy was simple: “‘containment’ of Saddam Hussein has been steadily eroding over the past several months. As recent events have demonstrated, we can no longer depend on our partners in the Gulf War coalition to continue to uphold the sanctions or to punish Saddam when he blocks or evades UN inspections. Our ability to ensure that Saddam Hussein is not producing weapons of mass destruction, therefore, has substantially diminished.”4

Such calls for American unilateralism would find a more sympathetic audience with President George W. Bush. Nine days after 9/11, members of the Project wrote to Bush lauding his “commitment to ‘lead the world to victory’” and affirming Secretary of State Colin Powell’s assertions that the United States needed to “go after terrorism wherever we find it in the world” as well as target those “other groups out there that mean us no good.”5 The assumptions undergirding neoconservatism allowed the Bush administration to doggedly pursue a policy of escalation in Iraq, convinced that history was on the side of U.S. action, or, if one wishes to be cynical, hoping to convince Americans and the world that it was. As Bush put it in his “Axis of Evil” speech, “History has called America and our allies to action, and it is both our responsibility and our privilege to fight freedom’s fight.”6

Interestingly, such views are not limited to neoconservatives. At a press conference on March 18, 2011, President Barack Obama told reporters, “I don’t think anybody disputes that Gaddafi has more firepower than the opposition. I believe that Gaddafi is on the wrong side of history. I believe that the Libyan people are anxious for freedom and the removal of somebody who has suppressed them for decades now. We are going to be in contact with the opposition, as well as in consultation with the international community, to try to achieve the goal of Mr. Gaddafi being removed from power.”7

Whether justifying U.S. unilateralism under the Bush administration or international intervention under Obama’s watch, such assumptions about history ignore two fundamental truths. The first is that history does not take sides except for those assigned to it by individuals seeking to use it to make their case for or against a particular point of view. Indeed, one need only consider how most policy is developed to understand that although history can teach us many things, it is not purposive in the sense of moving towards a preordained end.

The realities of the world as we experience it have less to do with clearly predictable historical patterns and rational decision-making than with what economist Herbert Simon called “satisficing.” According to Simon, people tend to make decisions in light of short-term realities, relying on satisficing strategies. That is, they usually cannot foresee outcomes or accurately assess probabilities, rarely can or do evaluate all possible situations with precision, and suffer from fallible memories. Therefore, they tend to make the best decision possible under the circumstances, usually choosing an option that seems to address a problem well enough, rather than seeking out the perfect and most rational choice. Sometimes that yields positive results, sometimes negative, but very seldom does it lead to optimal outcomes.8
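
To make the distinction concrete, the sketch below is a purely illustrative toy example, not drawn from Simon’s own work: the policy options, payoff numbers, aspiration level, and function names are all hypothetical. It contrasts an optimizer, which surveys every option before choosing, with a satisficer, which settles for the first option that clears its aspiration level.

```python
# Illustrative sketch only: a toy contrast between exhaustive optimizing and
# Simon-style satisficing. The policy options, payoff estimates, and the
# aspiration threshold below are hypothetical inventions for this example.

def optimize(options, value):
    """Survey every option and return the single best one."""
    return max(options, key=value)

def satisfice(options, value, aspiration):
    """Return the first option that is 'good enough' (meets the aspiration level)."""
    for option in options:
        if value(option) >= aspiration:
            return option
    # If nothing clears the bar, fall back to the best of what was seen.
    return max(options, key=value)

if __name__ == "__main__":
    policies = ["sanctions", "airstrikes", "negotiation", "containment"]
    payoff = {"sanctions": 0.4, "airstrikes": 0.5,
              "negotiation": 0.7, "containment": 0.6}.get

    print(optimize(policies, payoff))                   # negotiation (the true optimum)
    print(satisfice(policies, payoff, aspiration=0.5))  # airstrikes (first acceptable option)
```

The satisficer stops searching as soon as something looks acceptable, which may or may not coincide with what an omniscient optimizer would choose; that gap is Simon’s point.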

Dean Acheson described the process perfectly in his memoirs. Reflecting on his years in the State Department between 1941 and 1952, he observed that that period “was one of great obscurity to those who lived through it. Not only was the future clouded, a common enough situation, but also the present was equally clouded. We all had far more than the familiar difficulty of determining the capabilities and intentions of those who inhabit the planet with us. The significance of events was shrouded in ambiguity. We groped for interpretations of them, sometimes reversed lines of action, based on earlier views, and hesitated long before grasping what now seems obvious.”9 Such realities were no different for decision makers before Acheson or for those since.

The second problem with Fukuyama’s, Bush’s, or Obama’s historical logic is that it is premised on a view of history that, although long used by Americans to justify their actions in the world, reflects an oversimplified understanding of historical events. Still, it is easy to grasp its appeal, since at first blush there is so much circumstantial evidence to recommend it. The inevitable triumph of democracy, capitalism, and Enlightenment logic has seemed foreordained to many Americans because it provides an easy explanation for how a loose collection of colonies defeated the most powerful empire in the world in the eighteenth century, how it subsequently expanded across the continent in about 100 years, or how it became a global power. Over the last twenty or so years, the march of history has seemed to confirm these fundamental assumptions, first with the apparent triumph of democracy and capitalism following the Cold War, and now with the democratic revolutions that have rocked the old order in the Middle East.

What is easily forgotten in such a formulaic approach to history is that none of these events was preordained. Moreover, the “spin” put on them had a uniquely American, and in many cases self-serving, purpose. Consider that as the United States expanded west in the nineteenth century, bringing it into conflict with Mexico and Native Americans, it reassured itself that its actions were in keeping with the values of democracy and the will of God and of history. As newspaper columnist John L. O’Sullivan outlined the logic of Manifest Destiny, “[W]e are the nation of progress, of individual freedom, of universal enfranchisement. Equality of rights is the cynosure of our union of States, the grand exemplar of the correlative equality of individuals… We must [move] onward to the fulfilment [sic] of our mission… This is our high destiny, and in nature’s eternal, inevitable decree of cause and effect we must accomplish it. All this will be our future history, to establish on earth the moral dignity and salvation of man…”10 Although such thinking might have made sense to Americans, Mexicans and Native Americans saw the historical process that led to the United States overspreading the continent much differently.

Following the Spanish-American War in 1898, the United States acquired an empire that included Guam, the Philippines, and Puerto Rico, using much the same logic. Over the course of the next decade and a half, it involved itself in China, Latin America, and the Philippines, ostensibly in the name of democracy but more often than not in the service of empire and of economic and national self-interest. For those like Senator Albert J. Beveridge of Indiana, there was nothing inconsistent about promoting freedom and establishing empire. As he explained America’s war with the Philippines in 1900, “The Philippines are ours forever . . . And just beyond the Philippines are China’s illimitable markets. We will not retreat from either. We will not repudiate our duty in the archipelago. We will not abandon our opportunity in the Orient. We will not renounce our part in the mission of our race, trustee, under God, of the civilization of the world… He has marked us as His chosen people, henceforth to lead in the regeneration of the world… It is God’s great purpose made manifest in the instincts of our race, whose present phase is our personal profit, but whose far-off end is redemption of the world and the Christianization of mankind…”11

A fundamental assumption that what was good for America was good for the world undergirded such thinking, just as clearly as it does that of neoconservatives and the Project for the New American Century. Moreover, the rhetoric that the United States is on the side of history is equally present in the words of O’Sullivan, Beveridge, Bush, and Obama — though the latter’s verbiage is not quite so brazen. Thus, despite the rhetoric of change that dominated the 2008 presidential campaign, little apparently has changed.

Indeed, when one takes a long-term view, one sees that Obama’s rationale for acting in Libya is part of a tradition that includes the globalization of American ideals embodied in Woodrow Wilson’s Fourteen Points, the principles of the Atlantic Charter, the logic of the Truman Doctrine, the ideals of John Fitzgerald Kennedy, Lyndon Johnson’s justification for committing American troops to Vietnam, and Ronald Reagan’s hard line against Communism. All are grounded in a commitment to freedom and, as such, to the nation’s self-proclaimed sense of mission dating back at least to John L. O’Sullivan. The problem is not that the United States has seen itself in this way almost since its inception; it is that beginning in the twentieth century, America became what Walter McDougall called a “crusader state.” Wrapping its policy in moralism, democracy, and Western rationalism and justifying those ideals with a sense of historical inevitability, it set about conscientiously or at least rhetorically—depending on who was in charge—to provide salvation to a world ravaged by revolution and war. This became especially true after 1945.12

The parallels between the Cold War and the War on Terror are particularly instructive. During the former, competition between the United States and the Soviet Union for influence in the developing world led to wars of national liberation, tribalism, and other localized issues being played out in the context of superpower relations and the rhetoric of ideology, often to the detriment of all involved. A misreading of history at least in part exacerbated the problem. Ignoring the cultural and historical context shaping struggles in developing nations, both the United States and the Soviet Union operated on a fundamental assumption that history was on their side, though they also believed that it needed their help. For Russians, Marxism-Leninism spoke to the inevitable triumph of Communism as dialectical materialism took its logical course. For Americans, a Fukuyama-like confidence in a democratic dialectic shaped the underlying logic for their use of power.

In the era of terrorism, the situation is in many ways the same. Neoconservative logic oversimplified a multi-faceted threat whose historical dimensions were far more complex than a view of the world as black and white, good and evil, would allow. As it did during the Cold War, such thinking inevitably bred disillusionment and frustration. When George W. Bush declared “Mission Accomplished” aboard the aircraft carrier USS Abraham Lincoln on May 1, 2003, about a month and a half after the beginning of the Iraq War, his claims soon rang hollow in light of the protracted conflict and the fact that events in Iraq and Afghanistan appeared to be doing nothing to determine the ultimate outcome of the War on Terror.

A change of leadership, however, has done little to provide any further clarity on how to proceed. According to Bob Woodward’s book Obama’s Wars, strategy meetings about the war in Afghanistan often resembled a three-ring circus. Although the Pentagon and Secretary of State Hillary Clinton wanted 40,000 troops sent to Afghanistan, Vice President Joe Biden advocated rooting out Al Qaeda from along the Pakistan border using Predator drone strikes and Special Forces. Obama, meanwhile, often found himself frustrated by the conflicting recommendations. The divisions went public when General Stanley McChrystal, the Commander of the International Security Assistance Force (ISAF) and of U.S. Forces in Afghanistan, resigned on June 23, 2010, after a blunt interview with Rolling Stone magazine.13

Divisions like these are par for the course and fuel contradictory messages. Indeed, President Obama finds himself in a situation much like the one Richard Nixon faced in Southeast Asia when he tried to Vietnamize the war there. Nixon announced troop withdrawals in June 1969 only to escalate the war with incursions into Cambodia in 1970. For his part, Obama, despite promises of troop withdrawals, had to defend a surge in Afghanistan as he prepared to receive his Nobel Peace Prize. With the July 2011 date to begin bringing the American military home from Afghanistan growing near, the President has been quoted as saying that the United States will not simply be “switching off the lights.” For some experts, given such public pronouncements, the conditions in Afghanistan, and the ISAF’s mission, it is probable that there will still be more than 50,000 U.S. troops in Afghanistan when Obama runs for re-election in 2012.14

As for Iraq, despite plans to pull the last American soldiers out by December 2011, Defense Secretary Robert Gates has indicated that, following a meeting with Iraqi President Jalal Talabani, he believes the U.S. should keep several thousand troops in the country after the deadline to maintain the peace that has been won. The fear is that a complete American pullout would create a security vacuum, offering an opportunity for power grabs by antagonists in an unresolved and simmering Arab-Kurd dispute, a weakened but still active Al Qaeda, or an adventurous Iran.15 Given what happened two years after the signing of the Paris Peace Accords ending the Vietnam War, such precautions are understandable.

What is less understandable, however, is the logic guiding the Obama administration in Libya. Given his experiences with Afghanistan and Iraq, one would assume that the President would be cautious about justifying policy decisions with sweeping statements that imply historical inevitability, even if only for rhetorical effect. Indeed, one could point to the apparent ineffectiveness of the no-fly zone and to British and French complaints about America taking a backseat in the operation as only the most recent examples of satisficing defining the dynamics of decision-making. Although President Obama has found it handy to invoke the logic that history is on our side, he has been astute enough to understand that committing too many U.S. assets to helping the process along, especially when they are stretched elsewhere in unpopular causes, would be political suicide. Yet, by invoking history as a rationale for action in the first place, he has engaged in the same sort of fallacious thinking that muddied the waters for earlier administrations. If history teaches us anything, it is that invoking it as a rationalization for policy tends to engender confusion or worse from constituents at home and from allies and enemies abroad.

Still, it is difficult for American policy makers to avoid the temptation to use history to justify policy, at least without changing the United States’ political character and national culture, which have been shaped by popular historical interpretations about what has made the United States unique. America’s self-identity is so tied up with the ideals of freedom and democracy that not to pay at least lip service to them is nearly impossible. Thus Democrat and Republican, conservative and liberal, all acknowledge the power of democracy to transform the world.

Indeed, it is easy to point to the events that were transforming the world when Fukuyama wrote, that appear to be driving change in the Middle East today, or that have consistently informed the U.S. agenda since Woodrow Wilson, and to see clear patterns in which the “Western idea” is working itself out. Such thinking, however, ignores the fact that world history is not American history. Different societies have been shaped by their own pasts in ways that have little to do with the forces that informed the development of the United States. To ignore that fact—either with comments about the end of history or that certain people are on “the wrong side of history”— is to ignore things as they are.

The logic is clear. When high expectations turn to angst and then frustration, the temptation to help history along is often too great. This has led to interventionism in places like Vietnam, recommendations by the Project for the New American Century to topple Saddam Hussein, and statements by President Obama justifying the removal of Muammar Gaddafi. One of the most important lessons from history, however, is that global unrest is a function of historical events in other parts of the world that have little to do with the inevitable forward march of Western ideas. End.

Notes

[1] Francis Fukuyama, “The End of History?” The National Interest, Summer 1989. Francis Fukuyama, The End of History and the Last Man (New York: The Free Press, 1992).
[2] Address Before a Joint Session of the Congress on the Persian Gulf Crisis and the Federal Budget Deficit, 11 September 1990, George Bush Presidential Library and Museum, College Station, TX, http://bushlibrary.tamu.edu/research/public_papers.php?id=2217&year=1990&month=9
[3] Project for the New American Century, http://www.newamericancentury.org/
[4] Letter to President Clinton on Iraq, 26 January 1998, Project for the New American Century, http://www.newamericancentury.org/iraqclintonletter.htm
[5] Letter to President Bush on the War on Terror, 20 September 2001, Project for the New American Century, http://www.newamericancentury.org/Bushletter.htm
[6] The President’s State of the Union Address, 29 January 2002, George W. Bush White House Archives, http://georgewbush-whitehouse.archives.gov/news/releases/2002/01/20020129-11.html
[7] Washington Post, 18 March 2011.
[8] Herbert A. Simon, Administrative Behavior: A Study of Decision-Making Processes in Administrative Organizations 4th edition (New York: The Free Press, 1997).
[9] Dean Acheson, Present at the Creation: My Years in the State Department (New York: Doubleday and Company, 1969), 3-4.
[10] John L. O’Sullivan, “The Great Nation of Futurity,” The United States Democratic Review, VI (1839), 426-30.
[11] Senator Albert Beveridge, “In Support of an American Empire,” 9 January 1900, Congressional Record, Fifty-Sixth Congress, First Session, XXXIII (Washington: United States Government Printing Office, 1900), 704-11.
[12] Walter A. McDougall, Promised Land, Crusader State: America’s Encounter with the World Since 1776 (New York: Houghton Mifflin Company, 1997).
[13] Bob Woodward, Obama’s Wars (New York: Simon and Schuster, 2010).
[14] Michael O’Hanlon, “Staying Power: The U.S. Mission in Afghanistan Beyond 2011,” Foreign Affairs, September/October 2010.
[15] Washington Post, 10 April 2011.

Michael W. Santos

Michael W. Santos earned his doctorate from Carnegie-Mellon University in Pittsburgh. A professor of history at Lynchburg College in Virginia, he teaches American history. He was the Founding Director of both the Lynchburg College Symposium Readings Program and the Center for the History and Culture of Central Virginia. Dr. Santos edited several source books that examined the impact of national and international events on Central Virginia and wrote the history of Lynchburg College for its centennial. Interested in social as well as diplomatic history, he has published in labor and maritime history, including the book Caught in Irons: North Atlantic Fishermen in the Last Days of Sail. He is currently working on two books: Nukes, Spooks, and Kooks: United States Foreign Policy Since 1945 and The Elephant, the Mouse, and the Lion: U.S.-Canadian-British Relations during the Vietnam War.
