
by Karl Stoltz
The Tools of Information Manipulation
You are probably reading this article on your mobile phone or your laptop, or maybe your office or home PC. In fact, that is probably how you get most of your information today, not on pieces of paper delivered to your door or in your mail, or even via AM or FM radio or on network television. And that change in how we receive information is increasingly true all around the world.
That convenient all-purpose device you’re staring at also allows our adversaries to attack us from within. It has become one of the greatest threats to our long-term national security. Over the past decade, adversaries like Russia, China, and Iran have mastered the art of burrowing into our consciousness surreptitiously, planting falsehoods, disrupting elections, and undermining the fundamental institutions that safeguard democracy.
These hybrid attacks were once all labeled “disinformation.” Today, they are more precisely defined as Foreign Information Manipulation and Interference (FIMI). FIMI affects us like Rohypnol and other date rape drugs: at first, we are often not aware that we are ingesting it. We are addicted to our favorite social media sites, so we refuse to believe that our phones and PCs could be used against us. But information manipulators are master bartenders, able to mix a twist of truth into heavily distilled fictions. They mimic reputable sites in such a way that many consume these distortions without realizing they are swallowing them. And, like alcohol, manipulated information amplifies our emotions. It turns minor political differences into bitter arguments, it undermines confidence in those we should trust, and it destroys longstanding relationships.
The purveyors of disinformation and distortion know their craft, and they know that the best way to make information stick is not with cold hard facts, but with emotive conspiratorial suggestions, ideally spiked with a sense of extreme urgency. They also know that they must hide the source of their poisonous whispers, because if consumers knew that they originated from the Kremlin, from the CCP in Beijing, or from the Iranian Revolutionary Guard, many would be less likely to trust the information. They go to great lengths to conceal their identities.
While recent administrations of both US parties refer to America’s “strategic competition” with Russia, the PRC, and Iran, a “competition” requires that both sides play by the same rules. Our adversaries do not play by the rules. They seek competitive advantage over us in the global space, and the future of warfare in this era of drones, artificial intelligence, and constant technological innovation will not be a conventional conflict of tanks, warships, and fighter jets. It will be a hybrid war, fought as much in cyberspace as on battlefields. Their irregular warfare campaigns directed at us and our allies are already underway today, and we carry the Trojan Horses that enable those attacks in our pockets and purses.

Why We Accept Falsehoods as Facts
In ancient days, information told us whether to flee or fight, where to find shelter and sustenance, whom in the clan to trust, and whom to obey. We still carry those primal information-processing genes within us today as primal instincts and fears. The first time you learn about something, it tends to stay with you, especially if it evokes a strong emotion. It takes an enormous amount of contrary information to cause you to question your initial judgments. Those who peddle disinformation know this, and they invoke strong emotions such as fear, anger, and jealousy to help their propaganda go viral.
As anthropologist Bob Deutch notes, fear and reason operate in different systems in the brain, and “fear reduces cognitive capabilities.” Sociologist Dan Gardner, in Risk: The Science and Politics of Fear, and Robert M. Smith, in Primal Fear: Tribalism, Empathy, and the Way Forward, outline the irrational core fears that most people share:
- Catastrophic Potential: fear of mass-casualty events rather than threats that cause smaller numbers of fatalities; this is why people are more frightened of airplane crashes than of the far more dangerous public highways.
- Unfamiliarity: why you fear unknown streets at night, but not those in your own neighborhood.
- Lack of Understanding: if you have been taught how something works, you are less hesitant to use it than if its operation is a total mystery.
- Loss of Trust: when something you consider reliable does not do what you expect, your sense of risk rises.
- Generational Risk: anything that threatens children or future generations evokes a strong emotional reaction.
- Outsiders: we innately fear those who do not share our values.
- Disruption: anything that unbalances existing social cohesion.
- Conspiracies: “everyone else knows what is going on but me; what are they hiding?”
If we put all these fears together, we find a common theme that can be seen in almost all FIMI efforts: (1) a sense of urgency; (2) an appeal to one or more of your strongest emotions; (3) the exaggeration of irrational fears; (4) the perception of a serious threat to your group (your ethnicity, religion, gender, race, or other social group); and (5) a conspiratorial whisper that “THEY don’t want you to know about this information, but we are sharing it with you anyway; please retweet it or share it with other friends.”
What makes this emotional manipulation even more effective is that today, in the 21st century, we have lost many of the religious, ideological, and hierarchical filters that once managed the flow of information to the average citizen. Today, anyone with an electronic device can gain access to all the information in the world, and with the rapid rise of AI, it often now arrives already translated into our preferred language, sometimes untouched by human hands. We have all become our own editors and station managers. Some manage that task better than others.
The Scale and Objectives of Adversarial Efforts
While US and Western diplomats still strive to convey truthful information and build trust, our adversaries do not have this problem. They are less interested in developing long-term mutual relationships, and more interested in short-term gains at the expense of our interests. America has been slow to recognize the danger of this new form of FIMI-based hybrid warfare. In part, this is because we continue to rely on democratic institutions, including the media, to inform us and because we tend to believe information to be true until proven otherwise. In authoritarian nations, the reverse is true: their publics tend to first disbelieve whatever they see. They know that their media are centrally controlled, heavily censored, and serve simply as a propaganda arm of the ruling elites. Thus, it is easier for authoritarian governments to adopt Orwellian standards of doublespeak and manipulation when sending information abroad, and they have plenty of prior practice in deluding their own populations.

The Kremlin spends an estimated $4 billion annually on FIMI. Russia’s goal is not to build greater global admiration of its nation, but simply to create more social disharmony within Western countries, to undermine public faith in their democratic institutions, and to stir political division and chaos. Why? Because doing so works in Russia’s global interests. It divides the U.S. from its allies, it distracts nations that might otherwise be unified in opposing Russia’s invasion of Ukraine, and it amplifies the voices of extremists everywhere.

The PRC is more subtle, but it spends even more — an estimated $6 billion per year — to advance “digital authoritarianism,” or the use of digital infrastructure to repress freedom of expression, censor independent voices, promote officially sanctioned disinformation, and deny human rights. Beijing has taught its information control tactics to would-be authoritarians in Africa, Asia, and Latin America, and as others emulate the PRC’s rules of information control, each nation’s information ecosystem becomes more receptive to future propaganda, disinformation, and censorship requests from Beijing.

The Islamic Republic of Iran spends less than $1 billion on its own FIMI efforts and focuses primarily on the Middle East and North Africa. It prefers to work through proxies, including terrorist organizations such as Hamas, Hezbollah, and the Houthis in Yemen. Iran shares online tools to help them raise funds, recruit new members, appeal to disaffected local audiences, and mischaracterize their brutal attacks on civilians as holy war victories. Iran’s long arm of repression also reaches around the world to intimidate and even murder journalists and others who oppose its authoritarian regime.
Countering These Efforts
So how can we respond? My three years at the State Department’s Global Engagement Center (the GEC) gave me insights into four areas where future U.S. diplomatic efforts would be well advised to concentrate:
First, we should keep pressure on our adversaries by exposing and disrupting their networks. We should not waste our time going after each lie (the “whack-a-mole” strategy), but instead educate the public about the clandestine networks funneling false information to them. If we show the history of individuals or organizations spreading falsehoods, we can throw sunlight onto their methods and undermine their ability to spread further half-truths. In the GEC, we called some of these efforts pre-bunking (exposing a FIMI campaign before it starts). This can be a very effective deterrent (“Russia is going to lie to you soon about X”), and often causes the adversary to drop the FIMI topic altogether, but it must be used sparingly, usually when there is a need to prevent loss of life.
Second, we must work to build interagency and international coalitions, and to help likeminded nations develop their own capacities to detect and counter FIMI. There is both strength and extra credibility in a united front. FIMI is a multi-dimensional challenge. Governments that do not share information and coordinate regularly among their homeland security, defense, intelligence, and foreign affairs agencies, and nations that do not share FIMI information with their allies, tend to duplicate efforts, flail inconclusively, and win minor battles but lose the info war. Many nations now recognize the threat that FIMI poses to their own national security and have strengthened their multilateral engagement on this issue. This makes it harder for malign actors to find the shadows around the world that they need to hide in to manipulate the truth.
Third, we have to make technological progress our ally before our adversaries make it an even greater threat. Artificial Intelligence (AI) accelerates the speed with which our adversaries can spread FIMI and enhances their ability to deceive us with ever more convincing deepfakes, but we can also use AI and other new technologies as tools to detect disinformation in all languages rapidly, and to trace its origins more efficiently than ever before. We need to master all it offers.
Finally, we need to continue to deliver training and assistance to target audiences. Educational organizations and independent media outlets are ideal partners for this effort. USAID and the GEC helped develop free online games (https://harmonysquare.game and https://catpark.game) that taught people how easy it is to sow discord through manipulated information, and how to raise one’s personal radar to detect it. We cannot retreat from these efforts, despite the current pressure to cut personnel and funding, or we risk losing the info war.
You, the reader of this article, can help in this cause, too. When you receive a message (via email, social media, or other sources) that calls for immediate action, or cleverly plays on a core emotion, or hints that you are gaining exclusive information that others “don’t want you to see,” become a skeptic. Vet the background of the person or organization that sent it, ask yourself why the matter is so urgent or so emotive, and spend a few extra minutes using Wikipedia or a search engine to verify the source. It may help you avoid becoming a victim of misinformation, a disinformation plot, a phishing attack, or a criminal scam.
The battlefront of the insurgent information warfare campaigns being waged against us today is in our own hands. We have met the enemy, and he is us. The sooner we recognize that fact and bolster our defenses, the sooner we will deter our adversaries from attacking us this way, and we will see the positive results in more rational domestic debates, greater international cooperation, a strengthened global democracy, and an increased cost to the adversaries who seek to use FIMI to turn us on one another and not against them, the real threat.
Karl Stoltz is Foreign Policy Advisor for Deft 9 Solutions, Inc., a D.C.-based contractor focused on strategic communications and countering FIMI. He served for 38 ½ years in the US Foreign Service, including twice as deputy chief of mission (Rangoon and Copenhagen), three times as minister-counselor for public affairs (Moscow, Pretoria, Kuala Lumpur), and twice as an office director in the State Department. He speaks Russian, Indonesian, and Malaysian, and earned his university degrees from the University of Virginia.
(The opinions and characterizations in this piece are those of the author and do not necessarily represent those of the US government.)