
by Raymond Smith

Editor’s note: Dr. Smith’s 2011 book, The Craft of Political Analysis for Diplomats, offered suggestions for doing political analysis better, from the viewpoint of a foreign service officer who had spent most of his diplomatic career practicing the craft. In this follow-on piece, Smith expands on his original discussion with thoughts on the cardinal sins of political analysis.

The Essential Tools

An embassy political officer is responsible for analyzing a host country’s domestic and foreign policies, with a view toward aiding policy-makers and advancing U.S. interests. This requires understanding the local political situation and how it may affect U.S. objectives in the country.

Good analysis has to consider technique, presentation and the audience. Unlike description, it requires context, comparison and/or conceptualization. Masterful writing cannot make vapid analysis better, and clunky writing obscures quality analysis. A well-written, quality analysis must also be brief and include a careful summary. Policy-makers do not have time to read a forty-page, reputation-enhancing article in a prestigious publication. In reality, the summary is probably all the policy-maker will read of even a concise analysis; it is your best chance to get the most important points across. Finally, the analysis must be directed toward the audience you want to reach, which may require choosing between a broader audience and a more influential one.

The Cardinal Sins

WikiLeaks had the unintended consequence of allowing the interested public to see a lot of outstanding American diplomatic political analysis. During my State Department career, I also observed some common and sometimes consequential analytical mistakes. My list of the important ones includes making predictions, mistakes in assessing risk, straight-line thinking, adjusting analysis to suit superiors, rationalizing policy decisions, asking the wrong questions, failing to consider contingent outcomes, and thinking like an American.

Making Predictions

Prediction has no place in political analysis. Leave that to the soothsayers. Analysis deals with probabilities. Presenting nuanced analysis to government figures can, however, be difficult. They have to make decisions. They are looking for some certainty and are suspicious of tentativeness. What the analyst sees as nuanced, the policy-maker may see as wishy-washy.

Intelligence analysts often try to deal with this by using terms such as “high probability” or “low probability.” My sense is that for practical purposes policy-makers tend to treat the former as “this is what is going to happen” and the latter as “this is not going to happen.” My preference is to offer a number, or a range of numbers, in assessing outcomes. While this approach may suggest greater precision than exists, it has the virtue of making it more difficult to treat low probability as no probability.

Mistakes in Assessing Risk

There are many kinds of risk. For the analyst, the most pernicious may be the failure to ask “What if I am wrong?” You could try to fix this problem simply by including in each analysis a final paragraph assessing the consequences of being wrong. But the unspoken assumptions that inform an analysis are difficult to unearth, and such a concluding section may become a rote exercise rather than a careful examination of assumptions. A more elaborate variant has been to set up “Red Teams” to challenge important analytical efforts. However, I am not personally aware of occasions when a Red Team analysis has been chosen to guide policy-making rather than leading to modifications around the edges of the original “Blue Team” analysis.

Low probability, high consequence outcomes are typical kick-the-can-down-the-road events, garnering insufficient or misplaced attention and resources. Monty Python was right: “Nobody expects the Spanish Inquisition!” Potential contemporary Spanish Inquisition-type events include a species-ending asteroid strike, large-scale nuclear war, a coronavirus pandemic, and global warming. Past examples in the international sphere include a large-scale terrorist attack on the United States (before 9/11), the collapse of the Soviet Union, and the Arab Spring. The challenge for the analyst is to correctly convey probabilities and consequences. For the policy-maker, it is how to direct finite resources. For example, what level of intellectual and monetary capital should be applied to a threat like an asteroid strike that is existential, but may never materialize?

Closer to home, our political leadership knows that a large-scale nuclear war would lead to societal extinction, if not species extinction. We would literally be bombed back into the stone age. Enormous intellectual and financial capital has been devoted to avoiding nuclear war by enhancing deterrence. Deterrence seeks to lower risk by raising consequences. Much less effort has gone into lowering both risk and consequences. Proponents of deterrence argue that it has kept the nuclear peace. And so it has, and will, unless it doesn’t, in which case we will find out that we understated the risk and did too little to mitigate the consequences.

Getting this right is not easy. Political analysis actually is more difficult than rocket science, which relies on well-established physical laws and extensive experimentation and testing. Nevertheless, we political analysts could still learn something from rocket scientists about the consequences of failing to identify weak links.

Straight-Line Thinking

Predicting the future from the past is intuitive, seductive, simple and lazy. It is also frequently right. But when it is wrong, it is very wrong. Great changes, what Thomas Kuhn in The Structure of Scientific Revolutions calls paradigm shifts (and others call tipping points), occur rapidly and, often, unexpectedly. Afterward, analysts, seeking to make sense of what has happened, trace the origins and declare the revolution inevitable, thus reinforcing another round of straight-line thinking. Straight-line thinking often leads to a failure to assess risk and consequences properly, and it frequently leads to overestimating the longevity of authoritarian regimes, mistaking rigidity for strength. Contemporary examples include the communist regimes of eastern Europe, the Soviet Union, and Tunisia, Libya and Egypt during the Arab Spring.

Adjusting Analysis to Manage Superiors

Gorbachev announced his resignation on television on December 25, 1991; by the end of the year the USSR had collapsed.

Advisors try to avoid upsetting political leaders, who might then start to make decisions on their own. This is doubly so if a policy the advisors recommended is challenged. In 1990, I drafted a cable in Moscow raising the possibility of the collapse of the Soviet Union and suggesting steps to prepare for that eventuality. President Bush asked his senior staff for views on my analysis. The response was drafted by Condoleezza Rice, approved by her boss at the NSC, and cleared by pretty much the entire foreign policy apparatus. It essentially told the President that things were well in hand and that our current policy had everything covered. They were not and it did not.

Rationalizing Policy Decisions

Analysis suffers when bureaucrats are attempting to manage their bosses, as in the above example. It can also suffer when the reverse occurs: when the political leadership has decided what it wants to do and seeks analysis that supports the decision. Few of us are unaware of the pressures brought on the intelligence community to support the 2003 decision to invade Iraq. During the 1990s debate within the government and the foreign policy community over NATO expansion, the arguments for expansion varied and expanded over time to address criticisms of the policy. A carefully considered and persuasive policy does not require shape-shifting justifications of its merit. This analytical sin is a variant of believing your own press releases. Press releases are propaganda tools, not analytical ones. Propaganda should support analysis, not substitute for it.

Asking the Wrong Questions

If analysts are not asked the right questions, they are not likely to come up with the right answers. The wrong question (“Will the Soviet Union collapse?”) asks for a prediction. The right question (“Is the possibility of Soviet collapse high enough that we should start to think about how it could affect us?”) asks for an analysis of probabilities and consequences. We asked the right question at the U.S. embassy in Moscow in 1990. The intelligence community in Washington responded by answering the wrong question. The wrong question: Should we intervene in Libya to overthrow Qaddafi? The right question: Do we have the ability, resources and willingness to produce a better outcome if Qaddafi is overthrown?

Failing to Consider Contingent Outcomes

Former Secretary of State Hillary Clinton reportedly remarked, while considering U.S. policy toward Libya during the Arab Spring, that she would rather be caught doing something. Fairly early in my foreign service career, I served as chairman of the Open Forum Panel, which was broadly mandated to challenge established foreign policy thinking. After a few months on the job, my colleagues in policy planning gave me a plaque that said: “Don’t just do something. Stand there.” I took it as a compliment, since both they and I knew I had no intention of just standing there. Later, however, I wished I could have hung that plaque on Secretary Clinton’s wall and persuaded her, instead of doing something, to just stand there and ask questions like: What will happen in Libya if Qaddafi is overthrown? Or in Syria if Assad is not?

I was desk officer for Sudan when Black September terrorists assassinated our ambassador and deputy chief of mission. During the eighteen months the Sudanese government held the terrorists without trial, I unsuccessfully urged my superiors to develop plans to deal with a Sudanese decision to release them. Ultimately, they were tried, convicted and released on the same day to the Egyptian government, which purportedly put them under house arrest. Since we had no approved policy to deal with a release contingency, relations with Sudan went into limbo for months while Secretary of State Kissinger decided what to do.

Thinking Like an American

The 2003 Iraq invasion is a veritable petri dish for the study of analytical sins. Political officers serve abroad so that they can inform their analyses with local political and cultural knowledge and bring that knowledge back to their Washington assignments. Localitis may sometimes ensue, but the greater danger in Washington is seeing the world through an American prism. Among the political leadership, this at times amounts to an ideological block against contrary information. The top levels of the administration in Washington in 2003 appear to have genuinely believed that the only thing standing in the way of an outbreak of democracy in Iraq was Saddam Hussein and his Baathist cronies.

Secretary Powell, speaking to the U.N. Security Council in February 2003, alleged that Iraq was hiding weapons of mass destruction from inspectors.

Even if our fellow analysts do not wear ideological blinders, they may wear conceptual ones. Prior to the 2003 Iraq invasion, arms experts in the intelligence community (who were not experts on Iraq) made the unspoken assumption that Saddam Hussein’s behavior could be explained only by the existence of a WMD program. Their understanding of the technical evidence before them flowed from this basic assumption and led to a nearly unanimous conclusion (one State Department office demurred on nuclear weapons) that Iraq did have a WMD program. They showed Secretary of State Powell photos of vehicles and other equipment that in their view could be used only in such a program. Powell used the photos as a basis for his speech to the U.N. seeking support for the planned invasion. No contemporary WMD program was ever found. Why was the analysis so wrong? It did not take into account the fact that Saddam Hussein had cultural and local political reasons for acting as he did that transcended American logic. This was less the fault of the weapons analysts than of their superiors, who had not considered, or perhaps did not wish to consider, that local knowledge as well as technical expertise was needed on this critical question.

Political Analysis is a Craft

Like other crafts, political analysis mixes art and science in varying degrees. You hone your craft by practicing it and by checking your results against real-world outcomes. Two analysts, each practicing the craft diligently, will sometimes come to different conclusions. Readers of this article will no doubt have different judgments on aspects of it. I invite you to take what you can use and leave the rest, with the final thought that analysis of the craft will in the end yield better analyses of the subject matter.


Raymond Smith

Dr. Smith retired from the Foreign Service at the rank of Minister Counselor. His assignments included serving as director of the office of the former Soviet Union and eastern Europe in the Bureau of Intelligence and Research at State, and as minister counselor for political affairs at the U.S. embassy in Moscow. In addition to The Craft of Political Analysis for Diplomats, he is the author of Negotiating with the Soviets and numerous articles on international relations.

 
