No matter where you live, if you are on the Internet you are in a war zone. You may not feel the terror of bombs dropping around you, nor the horrors of physical violence, but there is war nonetheless. For as long as humans have fought, military tactics have always involved propaganda, information manipulation, and deception. Today, digital communication technologies have changed the landscape of what the U.S. military calls “irregular warfare.” As opposed to “conventional warfare,” this kind of war is not primarily about the use of physical force, and it is not primarily about targeting an adversary’s military assets. Irregular warfare includes economic warfare (sanctions), cyber warfare (attacks within the digital domain), and political warfare (diplomacy), but it most pervasively manifests as some version of population-centric information and narrative warfare. The goal is the same as conventional war: to acquire by coercive means some political or strategic good, such as economic, geographical, or military advantage. In this kind of war, the mind of every citizen is a potential target, and tactical victories could include a delegitimized election, implosions of civic discourse, and mutual hatred sparked between fellow citizens.
Just as the destructive power of nuclear weapons forced humanity to reorient to the idea that mutually assured destruction exists at the extremes of physical violence, so advances in information warfare require us to face the same truth of inevitable self-destruction, and to mutually back away. The challenges before us are technological, psychological, and cultural. But the first step in all of this is knowing that we are caught up in a new kind of war. If we are to survive, we must all understand how this situation came about, and grasp the basic dynamics of the advancing battle fronts.
Irregular wars have arms races between combatants in much the same way as regular wars. Techniques of information warfare (such as propaganda) have been developed and expanded by scientists for decades, with much success. We argue that arms races in the context of information wars lead to the same end state as arms races in conventional warfare: mutually assured destruction (MAD). During the Cold War, when doctrines and awareness of MAD meant that nuclear weapons could not be used without risking destruction on a global scale, nations instead committed military assets and intelligence resources into irregular warfare campaigns. But it was not until the emergence of digital technologies that humanity collectively faced the reality of information weapons of mass destruction. In the context of informational warfare, mutually assured destruction is the total collapse of the epistemic commons, and the exhaustion of language as a means of cooperation for all parties on all sides of the conflict.
There has been a resurgence in the formal study and critique of propaganda because of recent escalations in irregular warfare. This article paints a picture of massive and sophisticated information warfare campaigns, including between nation states (Russia vs. U.S.), and between political parties (Democrats vs. Republicans). As a U.S. citizen living in a “swing state,” it is possible to be subject to propaganda on social media from both domestic political parties and foreign militaries—a constant battle, 24 hours a day, 365 days a year, all in high-definition in the palm of your hand. Humanity has never been subject to propaganda and information warfare of this kind or at this scale. What are the consequences of this unprecedented situation for individuals and societies?
Because of advances in irregular warfare, open societies are under siege. They are becoming increasingly fragmented and divided, while authoritarian societies are hardening into centralized information bubbles, ready to pop. Under conditions of total information warfare, democratic forms of government become impossible to operate in earnest because the people have no adequate means of making sense of the world. Without a healthy press, education, or public sphere, it is impossible for citizens to develop a realistic mutual understanding of the world. Undemocratic authoritarian states also become increasingly difficult to maintain, because the political elites themselves can too easily end up within insular echo chambers, absent of the free and novel insights needed to solve truly complex social problems.
In 2016, Russian intelligence agents hacked the email of John Podesta, the chairman of Hillary Clinton’s campaign. Following the hack, they set up a “Wikileaks-inspired” site (called DC Leaks) to disseminate the content of the emails to the American voting public and the international press. Among other revelations, the leaks gave Bernie Sanders supporters insights into their candidate’s treatment during the primary. The leaks also highlighted a host of other information operations being undertaken by Russian intelligence elsewhere in the culture. Some of the emails were woven into the “Pizzagate” narrative, which falsely claimed Democratic leaders were holding children captive in the basement of a Washington, D.C. pizzeria. This narrative was amplified by social media accounts known to operate out of the Internet Research Agency (IRA). The IRA, probably the most famous of all troll farms, has been operated by the Russian government for more than a decade. This state-administered organization employs hundreds of people who work out of a nondescript office building in the suburbs of St. Petersburg.
While it would be easy to see DC Leaks as election interference in favor of Trump, the actual state of Russian operations is much broader, undermining all sides of the political spectrum. One particular case demonstrates the relative ease with which a sock puppet, or disguised account, run by Russian intelligence can use social media to deceive protesters into taking action on the streets of an American city. In 2017, one of the most followed Black activist Twitter feeds (@Blacktivist) was discovered to be one of many fake accounts set up by Russian trolls. Alongside another account called @WokeBlack, it was engaged in organizing protests in U.S. cities. Blacktivist’s posts, often received as articulate and valid takes on the racial situation in the U.S., were retweeted millions of times. Blacktivist and WokeBlack encouraged Black voters to vote for Green candidate Jill Stein in swing states in which their votes could have put Clinton into the White House. This highlights a concerning phenomenon: citizens’ adherence to their in-group narrative is now so strong that even the revelation that they have been infiltrated by hostile state actors is not enough to close the divide in society.
One common misunderstanding of information war and propaganda is that it only involves the spreading of lies and fake news. Of course, these are key tools, but note that the emails published on DC Leaks are not simply forgeries, and the ideas of Blacktivist are not simply wrong. It is precisely that they are true that makes them powerful, and potentially more powerfully divisive. This is one possible goal of information war (especially as practiced by the Russian state): to sow seeds of internal dissension, confusion, and ultimately epistemic nihilism. As we will see, “fact-checking” and fake news debunking are only a small part of the solution, because a great deal of what should be classed as propaganda will pass the test of earnest fact-checking. We will take a detailed look at propaganda—including its various definitions and techniques—in the second article of this series.
It is estimated that during the 2016 election, Americans shared Russian propaganda on social media hundreds of millions of times. It has also been estimated that during that same time almost one-fifth of Americans’ Twitter discussions were likely to come from bots. American citizens have been subject to AI bots and sock puppets, built by their own political parties, as well as foreign militaries and hackers. All this activity has been understood as a new class of information warfare, often called computational propaganda. Various definitions are offered for this term, which provides a name for the process that leads to information weapons of mass destruction.
While the 2016 U.S. election was a watershed in computational propaganda, the same phenomenon has basically swept the planet, beginning as early as 2010. Ukraine, Estonia, China, Iran, Mexico, the UK, and the U.S. have all had major politically significant incidents of computational propaganda. Research on computational propaganda is underway at various academic centers and think tanks, including at the Oxford Internet Institute, the Stanford Internet Observatory, and the Digital Forensics Lab of the Atlantic Council. The focus has been largely on the techniques, organizations, and forensic approaches, revealing a dangerous new frontier of digitally enhanced irregular warfare. We posit that this frontier leads toward mutually assured destruction, like all frontiers of arms races in weapons technologies.
In one sense, mutually assured destruction in the context of information war is simple. It has been known from the earliest days of military strategy: you can be blinded by your own smokescreen, and even more so when your enemy is using one too. The use of powerful information manipulation tactics to coerce the enemy requires the creation of organizations that specialize in making and using such tools of war. History suggests that it can be hard to achieve trust and collaboration in governments that maintain large and complex propaganda operations. Stalin’s demise in Russia can be at least partially attributed to this lack of trust. Stalin spent his last days in a bunker, paranoid and suffering the consequences of creating an almost completely manipulated information environment. Accounts show that during the Cold War, both the CIA and KGB used deceptive techniques to convince their own government agencies of the success of their campaigns (i.e. the agencies propagandized their own colleagues to ensure continued support for their work). Societies that depend on the politicized control of information end up shrouding both political leaders and the masses in mere simulations of reality.
The idea that any group of leaders is immune to the cognitive and emotional distortions they inflict upon the masses is misleading. While a small political elite might know more than most other members of their society, they are nevertheless limited epistemically by their position as problem-solvers who are segregated from actual free and open streams of information. They cannot readily trust high-ranking officials in their own intelligence and military, who themselves are employed in the practice of information manipulation and are interested in keeping their jobs and reputations. They also cannot rely on well-educated and expert members of the general population, who in fact have been lifetime subjects of information manipulation. Nor can they rely on input from foreign nations, who are systematically trying to control what information is available to their adversaries, and how it is framed. Over time, a downward spiral of distrust and confusion degrades decision-making and problem-solving capacities until the social system collapses, as occurred eventually with the Soviet Union.
Politically motivated information asymmetries produce only short-term gains. Social systems of this kind are undone by the long-term consequences of the damage inflicted on public sensemaking. The dangers of what is possible when centralizing and politicizing the control of information have long been noted by those arguing in support of open societies. However, under the conditions created by advances in digital technologies, problems of information war have become more complex.
Political parties in the world’s most open societies have wielded unprecedented instruments of information warfare on their own people as part of election cycles—unwittingly or not. The role of Cambridge Analytica in Trump's election is now a familiar cultural touchpoint. But very little was said about the armies of hired trolls suddenly popping up to follow and support Biden and Harris, many of them outsourced to a firm in India. Chat bots and botnets were created by both parties in both 2016 and 2020. But it was Obama’s campaign in 2012 that truly broke the ice in the use of computational propaganda in the domestic sphere.
Some may argue that this is just a natural extension of the propaganda created by political parties of the past, which was always intense in the United States. This is true in the same way it is true that the atomic bomb is simply a natural extension of the weapons programs of the past. We have reached a point at which a difference in magnitude has become a difference in kind.
Historically, as weapons technologies crossed thresholds of destructiveness, military and governance responses resulted in enhanced regulations and international agreements on use. This process occurred with biological weapons and with atomic weapons. We propose that information weapons of mass destruction exist, but have not yet been recognized as such. Therefore, law-abiding political actors can use them without consequence, and without understanding the damage inflicted on the public sphere and information commons.
Every major government has its own version of Russia’s IRA. These organizations would be more accurately referred to as cyborg armies. They consist of a mixture of human brains, cultural software (memes), digital hardware, and artificial intelligence. Facilities house hundreds of individuals working in shifts 24 hours a day. One individual will run dozens or hundreds of sock puppet social media accounts. Many of these accounts will be automated using custom computer code. Teams of writers produce content for orchestrated campaigns.
These digital information warriors have real-world impacts, far beyond the arguments they troll on Twitter. As described above, there are cases in which anti-government protests within the U.S. were organized by sock puppet accounts set up by foreign governments. Mexico and the Philippines had elections fundamentally disrupted by massive operations, likely involving foreign governments’ troll farms. Many people in the U.S. do not realize that Germany also saw its parliament building, the Reichstag, stormed in 2020, months before the U.S. Capitol, likely also in large part due to escalations in domestic computational propaganda involving troll farms.
Now imagine a near future in which Virtual Reality (VR) and Augmented Reality (AR) have merged with social media platforms. This future is closer than you might think: Facebook bought Oculus VR for more than $2 billion in 2014, and the brand has become one of the major players in VR. The industry is seeking to provide a radically immersive experience that literally overlays a digital simulation across the totality of individual experience.
It will soon be possible to spend countless hours plugged into VR headsets experiencing the equivalent of a “news feed” that consumes your entire sensory experience. This is not a small screen in your hand. When a story is reported on location, you are “there” in a way you never were before. Imagine further that “deep fakes” are drawn into the mix—deep fakes are technically advanced digital forgeries, typically in the form of video footage showing someone doing or saying something they never did or said. This would mean that political messages could be delivered to you through the simulated image of your best friend or your deceased father. This would bring the micro-targeting of political messaging to a level of emotional manipulation previously unimaginable. We are entering the realm of the dystopian.
With this unprecedented level of persuasive information technology, it should be clear that if governments and media elites misuse their powers, technologies could be created that would be the equivalent of mental imprisonment. This would be digitally enabled brainwashing on a scale that could capture an entire generation of minds, especially if a popular and marketable application were developed to package its delivery. We are looking at the equivalent of having a weapon that can destroy a population—only the people will not die; they will instead surrender their “hearts and minds” to the will of those in control of the VR attention-capture technologies. It is no longer beyond reason to consider the reality of a world of automated computational brainwashing, psychologically irresistible and delivered at scale. While this scenario is still science fiction, leaders in the field of computational propaganda are already worrying about how long we have until these trends start playing out.
Society has been left teetering on the edge of mass insanity, caught up in the dynamics of MAD, reaching the final limit of military strategies of total war. How did we get to this treacherous place?
Caution: Rabbit Holes and Halls of Mirrors
One of the difficulties in researching irregular warfare is that the very texts providing information on the subject can themselves be weapons of information warfare. When you read a book by an American writing about the KGB, for example, how can you be sure it is not an artifact of the CIA? This would seem absurd were it not for the fact that the CIA is known to have published books, most of which readers would never connect to the agency. And then the question must be asked: is this text here, the one you are reading, not just some kind of propaganda?
These traps are noted by several recent scholars in the field of propaganda analysis. They demonstrate how the act of making the public aware of the quantity and effectiveness of computational propaganda only furthers the aims of some propagandists. If the goal is to make the target population confused and suspicious of all their information sources, and thus unable to effectively cooperate, then making everyone think there is propaganda everywhere would accomplish that goal. Furthermore, when the propaganda in question raises legitimate concerns or presents verifiable and damning facts (such as DC Leaks), then drawing attention to the information war furthers the purposes of those waging it.
Also, when someone “exposes Russian propaganda,” as we have done to a degree here, it is hard to say they are not taking sides in the war. Indeed, doing so appears part of an offensive or defensive maneuver. Once awareness is directed at the dynamics of information warfare, a hall of mirrors unfolds in which everything can potentially enter the vortex of critical suspicion. Caution is warranted when exploring rabbit holes about information war.
Not every person in our war-ravaged public sphere is a warrior. Education can still take place, even if a great deal of the information landscape involves the coercive, bad faith manipulation of ideas. One of the overall goals of this series of papers is to provide the tools and insights necessary to tell the difference between education and propaganda. Upon completion, the series itself should be clearly identifiable as good faith communication motivated by an interest in education, not warfare.
Steven Pinker has noted that there has not been a major war between great powers for more than half a century, and that overall, as history has unfolded, the relative scale and violence of armed conflict have subsided. He suggests this is a sign of progress toward peace, justice, and truth. In this he echoes others like Francis Fukuyama, who proposed that the West had reached the end of history, which means in one sense the end of large-scale war. This may be justifiable in the context of warfare as something involving only bombs and guns. Irregular warfare, however, has been increasing in intensity, scope, and impact. Total war between major nation states has not truly reduced; instead, it has transformed into something less physically violent and more psychologically violent.
Given humanity’s long history of war and physical conflict, it can be hard to grasp that irregular warfare is now the predominant mode of military action. Today conventional (kinetic/physical) warfare is used in support of irregular campaigns. Bombs and guns are not the main event; instead they are enveloped within a broader strategy including informational, economic, political, and psychological warfare. This reorientation began in earnest with the Cold War and over time it has only become more entrenched in military strategy.
Of course, there has also been a buildup of conventional weaponry, including nuclear weapons. But as Yuval Harari notes, it is highly improbable, almost unbelievable, that “no one has fired the big guns.” Why not? Humans have always used the most powerful means of physical violence at their disposal. Is the awareness of mutually assured destruction enough? Probably not; the bombs would likely have dropped were it not also for the transfer of military and government assets into non-kinetic warfare. Shelving nukes has not meant the end of war; it has meant the beginning of war by other means.
When the U.S. entered World War II, there was a tremendous amount of related propaganda. This included tens of thousands of pamphlets and hours of radio broadcast intended to prepare the way for the men, guns, tanks, and planes. This is how information warfare tactics had traditionally been used: as an aid to physical violence, and as a means of subverting the morale of civilian and military populations to ease the way for conquest. Historically, information and intelligence units worked to support victory won by use of force.
But when the U.S. and USSR began the Cold War, in the shadow of atomic weapons, the situation had changed fundamentally. Total war was not over; it had only become less obvious: it had turned into a war of ideas. Proxy wars, like Korea and Vietnam, were fought as part of the symbolic maneuvers of the broader total information war. With the benefit of hindsight, the Cold War is widely understood as being a culture war, which is another way of saying information war. Proxy wars were part of the manipulation of information, more than they were about the acquisition of resources through military conquest. The agreements between superpowers during the Cold War did not create peace, they only limited physical violence to make way for unlimited information, political, and psychological warfare.
Marshall McLuhan is famously quoted as saying “The Bomb is pure information.” What he meant is that nuclear weapons are only in part instruments of physical destruction. They are also creations of scientists and politicians, and so the existence of atomic bombs entails a vast network of communication and information. By that same token, the “nuclear age” also requires a vast regime of information control. Jean Baudrillard, who was influenced by McLuhan, also wrote specifically about the way nuclear weapons and energy demand new forms of information management to provide convincing narratives, creating “simulations” of risk, safety, and scientific authority. The Cold War involved very real and imminent existential risk, and also the manipulation of the public view of that risk through practices of information warfare. The example of Carl Sagan’s role in the questionable science (but brilliant propaganda) around “nuclear winter” has been discussed in detail elsewhere.
The Cold War occurred during a time in history at which developments in communications technologies made possible a new form of international “psychological warfare.” This was the preferred term used by Eisenhower to describe his approach to irregular warfare. Beginning in the late 19th century, the nature of humanity’s information ecosystem was changed forever by a succession of technologies associated with “the communications revolution”: telegraph, telephones, radio, mass-circulated periodicals, film, television, electronic mail, and the Internet. This created an operational concept of “global public opinion” that could be engaged in real time.
By the time the Cold War was underway advertising and public relations had become major industries and were woven into every dimension of communications media. A common approach to propaganda was to “camouflage” it by making it so the hand of the government or PR organ was not obvious. This means that the whole array of available media may be used, which during the Cold War involved orchestrated campaigns unfolding across print (newspapers and magazines), radio, TV, movies, events (like parades and ceremonies), high art, and academic papers and books. Most people encountering any one of these forms of media would be unlikely to consider it as part of a broader campaign of psychological warfare. The cumulative effect, however, is the creation of an environment of resonant and mutually reinforcing symbols and images that work over time to change mindsets, dispositions, and behaviors.
The content and placement of this warfare is hard to pin down. For example, both the CIA and the KGB worked in and around “the peace movement” in the U.S. and Europe. Their work often differed in outcome but employed the same fundamental tactic: to instigate and support movements and protests that demonstrated the legitimacy of one ideology (or the illegitimacy of the other). During the 1950s and 1960s the CIA was one of the main financial supporters of the National Student Association, one of the largest student organizations in the U.S., with representatives on campuses across the country. The National Student Association organized protests for free speech, desegregation, and feminism, and against war and colonialism. The goal of CIA involvement was to demonstrate that the U.S. was an open society by promoting the visibility of dissent, which made America different from the Soviet Union. In “the free world,” students were able to speak their minds, and protests were able to change public sentiment and even law. The CIA also used this operation to keep tabs on student radicals and to keep the protest movements within certain “safe limits.”
The KGB, by contrast, orchestrated international scientific cooperation in the West in support of anti-nuclear-proliferation protests. It leaked documents from the Pentagon to stoke fear of U.S. belligerence within European nations, pushing them toward peace. It released the names and addresses of hundreds of CIA agents in a book published in English called Who’s Who in CIA. The CIA responded by publishing a similar book simply called KGB. More broadly, the Soviet Union promoted ideological war through cultural diplomacy involving events of high culture, including ballet, classical music, literature, science, and other aspects of Russian culture. The aim of this propaganda was to show that the U.S. was immature, capitalistic, and superficial, and therefore unable to hold a true vision for the future of the human race.
Both nations focused on educational institutions, at home and abroad. This shift brought warfare explicitly into the domain of intergenerational transmission, which we have discussed in our article on educational crises as a primary social function. Information warfare targets the deep social structure of societies, just as the destruction of roads, crops, and military assets targets the deep physical infrastructures. Both create conditions in which normal life cannot persist. Societies are made vulnerable to disunification, mutual antagonism, and manipulation by outside actors.
It might seem that the manipulation of information and education is less dangerous than the destruction of roads and power grids. This is not the case. As shown in our work on educational crises: without coherent processes of intergenerational transmission, societies fail. One outcome of the Cold War was the creation of educational infrastructures around the world that had very specific properties, imprints left from decades-long campaigns involving the weaponization of knowledge.
When the Cold War ended what became of the vast infrastructure built to conduct psychological warfare? It was not disassembled; it was simply put to different use.
State-on-state information war is currently taking place on a large scale, but now it is a complex multi-player scenario in which digital technologies allow relatively small actors to have potentially significant impacts. A clear recent example may be observed in the case of “fake news city,” a small metropolitan area in Macedonia (now North Macedonia) that has flooded the U.S. with disinformation, making massive profits along the way. The information commons is now impacted by a wider range of threat actors, with varying objectives, targets, and interests. The average citizen must live with the psychological risks of this new kind of inter-state warfare.
Arguably the most important recent developments in irregular warfare have taken place in the political campaigns of competing parties in Western democracies, especially the U.S. The U.S.’s culture wars are irregular wars, and the country’s two political parties are waging constant irregular warfare against each other. This has resulted in the self-propagandizing of the U.S. by virtue of unregulated competition between its own political parties. Both information and legislation (and now in the context of the pandemic, biomedical regulations) are being used as part of this domestic irregular warfare. Democrats and Republicans now relate to each other the way the U.S. (as a whole) used to relate to the Soviet Union during the Cold War.
It is likely that continual escalation of civil irregular war in the U.S. has no outcome other than mutually assured destruction. At the very least, this means the dissolution of the union through conflict, and the end of the nation as an experiment in open societies. As demonstrated in a number of related Consilience Papers, society is subject to widespread bad faith communication in the media, and to an endemic pattern of political actors spreading ideas by fostering collusion between special interest groups, intellectuals, and the press. Taken together with the escalation in information warfare, the full consequences of our current situation begin to come into view.
Importantly, the war that is being waged is almost invisible. This is both because the information is “camouflaged” and because the combatants and victims include fellow citizens of the same country. The best propaganda does not look like propaganda. Those impacted by it do not believe they are being propagandized.
There are several signs to indicate when populations are being subject to information warfare. Groups tend to partition into ideological “sides,” each with its own “propaganda bubble,” based upon information from particular media sources. This ultimately results in the reduction of complex political issues to binary and polarizing propositions, both epistemically and ethically. Each side has its own information feed telling them precisely how the other side is stupid and unethical. Both sides believe the other side is propagandized and that their side is not. The opposing group is set up to be scapegoated and marginalized. Information warfare creates dangerous conditions, which are not conducive to open communication, healthy socialization, or effective cooperation. Innocent individuals are caught in the crossfire.
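The “propaganda bubble” partition described above can be made roughly measurable. The minimal sketch below (illustrative only; the group names, outlet names, and the choice of metric are our own assumptions, not drawn from any real dataset) scores how disjoint two groups’ media diets are, using one minus the Jaccard overlap of their information sources:

```python
def bubble_separation(sources_a, sources_b):
    """Toy indicator of informational partition: 1 minus the Jaccard
    overlap of the media sources two groups consume. 0.0 means
    identical diets; 1.0 means fully disjoint 'propaganda bubbles'."""
    a, b = set(sources_a), set(sources_b)
    if not (a | b):
        return 0.0  # no sources at all: treat as no separation
    return 1 - len(a & b) / len(a | b)

# Hypothetical media diets for two polarized groups.
group_red = ["channel_x", "site_y", "host_z"]
group_blue = ["paper_q", "site_y", "stream_r"]

# One shared source out of five distinct sources -> separation of 0.8.
print(bubble_separation(group_red, group_blue))
```

Real analyses of audience fragmentation use far richer data (co-following graphs, link-sharing networks), but even this toy metric captures the sign described above: as the score approaches 1.0, each side’s picture of events is assembled from entirely different raw material.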
How do we avoid becoming casualties of the information war? How do we work as peacekeepers, or field medics, or disarmament activists? Note that the question is not “How do we win the information war?” This is the critical point: there is no winning this war, in just the same way that there is no winning a nuclear war. We must agree to put aside our most powerful weapons of irregular warfare, just as we have so far put aside our most powerful weapons of physical violence.
But before these kinds of actions can be taken, populations must find ways to protect themselves and to set up the equivalent of demilitarized zones. Societies must begin to rebuild cultural areas in which education, rather than information warfare, can take place. A truth of war is that not everyone in a war zone is a warrior, and this is especially true in a population-centric war. There are bystanders and innocents, as well as medics and journalists embedded in the field. The information war seeks to recruit all onlookers into the psychological violence. It is possible to resist being drawn into the fray, but only in certain contexts and under certain conditions. It requires that people understand when they are using language as a weapon to fight a culture war and when they are using it to reach mutual understanding. It requires understanding the techniques and impacts of information war in order to design and implement defensive tactics.
The techniques of information war are increasingly digital. Artificial Intelligence-driven botnets wielded by “cyborg” info-warriors now move across social media platforms like a digital blitzkrieg. Some of our defenses must be deeply technical, including measures in digital forensics, automated “fact-checking,” and broader cyber security. With the right set of technological innovations, the informational landscape could be “demilitarized” to a certain extent. But these kinds of solutions are not enough. Without the right psychological capacities and cultural motivations, the tools designed to disarm could themselves become weapons, much as “fact-checking” websites have become today.
The impacts of information war begin as psychological and cultural, and then extend to deeper aspects of society, such as economics, public health, and infrastructure. These basic systems depend on our sound judgment and mental health, and when we are no longer able to make sense of the world together, they begin to decay. Populations targeted in information war can endure profoundly disorienting cognitive dissonance, emotional volatility, and (in many cases) tendencies towards extremism, moral righteousness, and ultimately physical violence. The impacts on the relationships, ethics, and conversations (the “spirit”) of our communities are devastating. They can unfold without awareness within the very communities being targeted, which never know that their dissolution or transformation was the result of irregular warfare. Therefore, our defenses against the encroachment of war must also be psychological and cultural. They must include media literacy, upgrades to education and related institutions, and the creation of novel public forums that enable collective sensemaking.
There are potential futures in which the technology currently being used to create information weapons of mass destruction is instead used to build the most powerful educational infrastructures humans have ever experienced. We must choose that future, for it is not the default path. There is no future for open societies otherwise.
The terminology here is shifting, and a cluster of terms has been identified by the U.S. military: psychological warfare, unconventional warfare, 4th- or 5th-generation warfare, and asymmetric warfare. See: Army Special Operations Forces Unconventional Warfare: Field Manual No. 3-05.130 (Washington, DC: Headquarters, Department of the Army, 2008). ↩
Thomas Rid covers these events in greatest detail: Active Measures: The Secret History of Disinformation and Political Warfare (New York: Farrar, Straus, and Giroux, 2020). ↩
Ibid. See also P.W. Singer and Emerson T. Brooking, LikeWar: The Weaponization of Social Media (New York: Houghton Mifflin Harcourt, 2018) and Peter Pomerantsev, This Is Not Propaganda: Adventures in the War against Reality (New York: PublicAffairs, 2019). ↩
Sam Levin, “Did Russia fake black activism on Facebook to sow division in the US?” The Guardian, September 30, 2017, https://www.theguardian.com/technology/2017/sep/30/blacktivist-facebook-account-russia-us-election. ↩
Singer and Brooking, LikeWar. See note 3 above. ↩
Yochai Benkler, Robert Faris, and Hal Roberts, Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics (New York: Oxford University Press, 2018). ↩
For more on epistemic nihilism, see “Challenges to Making Sense of the 21st Century,” The Consilience Project, March 30, 2021, https://consilienceproject.org/challenges-to-making-sense-of-the-21st-century/. ↩
See “How to Lie with Facts,” The Consilience Project, upcoming publication. ↩
Craig Timberg, “Russian propaganda may have been shared hundreds of millions of times, new research says,” The Washington Post, October 5, 2017, https://www.washingtonpost.com/news/the-switch/wp/2017/10/05/russian-propaganda-may-have-been-shared-hundreds-of-millions-of-times-new-research-says/. ↩
Samuel C. Woolley and Douglas Guilbeault, “United States: Manufacturing Consensus Online” in Computational Propaganda: Political Parties, Politicians, and Political Manipulation of Social Media, eds. Samuel C. Woolley and Philip N. Howard (New York: Oxford University Press, 2019), 185-211. ↩
Rid, Active Measures. See note 2 above. ↩
Of course, the collapse of the USSR was complex and involved many dynamics, although most accounts agree that the breakdown delegitimated Soviet propaganda and led to the loss of an overarching Soviet identity that could contain nationalistic backlash within the multinational bloc. See Alex Pravda, “The collapse of the Soviet Union: 1990-1991” in The Cambridge History of the Cold War, Vol III: Endings, eds. Melvyn P. Leffler and Odd Arne Westad (Cambridge: Cambridge University Press, 2010), 367-78. ↩
Siddharthya Roy, “Joe Biden, Kamala Harris got a big social media boost from Indian troll farms,” Newsweek, November 2, 2020, https://www.newsweek.com/joe-biden-kamala-harris-got-big-social-media-boost-indian-troll-farms-1544047. ↩
As Benkler, Faris, and Roberts observe: “The Obama 2012 campaign represented the first systematic use of big data in individualization for a campaign to target individual voters” (344-45). See note 6 above. ↩
Samantha Bradshaw, Hannah Bailey, and Philip Howard, Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation (Oxford: Computational Propaganda Research Project, Oxford Internet Institute, Oxford University, 2021), https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/127/2021/01/CyberTroop-Report-2020-v.2.pdf. ↩
See the rich descriptions offered in Pomerantsev’s This Is Not Propaganda. See note 3 above. ↩
Singer and Brooking, LikeWar. See note 3 above. ↩
Pomerantsev, This Is Not Propaganda. See note 3 above. ↩
“Germany coronavirus: Anger after attempt to storm parliament,” BBC News, August 30, 2020, https://www.bbc.com/news/world-europe-53964147. ↩
Samuel Woolley, The Reality Game: How The Next Wave of Technology Will Break the Truth and What We Can Do about It (London: Hachette, 2020). ↩
Steven Pinker, The Better Angels of Our Nature (New York: Penguin Books, 2011). ↩
Francis Fukuyama, The End of History and the Last Man (New York: Free Press, 1992). ↩
Yuval Harari, Homo Deus: A Brief History of Tomorrow (London: Vintage Books, 2015). ↩
Louis Menand, The Free World: Art and Thought in the Cold War (New York: Farrar, Straus, and Giroux, 2021). ↩
Marshall McLuhan, Understanding Me: Lectures and Interviews, eds. Stephanie McLuhan and David Staines (Cambridge, MA: MIT Press, 2015). ↩
Jean Baudrillard, Simulacra and Simulation, trans. Sheila Faria Glaser (Ann Arbor: University of Michigan Press, 1994). ↩
Rid, Active Measures. See note 2 above. ↩
Kenneth Osgood, Total Cold War: Eisenhower’s Secret Propaganda Battle at Home and Abroad (Lawrence: University Press of Kansas, 2006). ↩
The pioneer of this kind of approach is typically considered to be Edward Bernays, who had been involved in U.S. propaganda efforts since World War I. See Larry Tye, The Father of Spin: Edward Bernays and the Birth of Public Relations (New York: Crown Publishers, 1998). ↩
Menand, The Free World. See note 24 above. ↩
Rid, Active Measures. See note 2 above. ↩
Jessica C.E. Gienow-Hecht, “Culture and the Cold War in Europe” in The Cambridge History of the Cold War, Vol I: Origins, eds. Melvyn P. Leffler and Odd Arne Westad (Cambridge: Cambridge University Press, 2010). ↩
Osgood, Total Cold War. See note 28 above. ↩
See “Help Wanted: On The Nature of Educational Crises,” The Consilience Project, June 6, 2021, https://consilienceproject.org/education-crisis/. ↩
Emma Jane Kirby, “The city getting rich from fake news,” BBC News, December 5, 2016, https://www.bbc.com/news/magazine-38168281. ↩
See “How to Lie with Facts,” The Consilience Project, upcoming publication and “Endgames of Bad Faith Communication,” The Consilience Project, upcoming publication. ↩
This is a useful technical term, which is discussed more in the second paper of the series; see Benkler, Faris, and Roberts, Network Propaganda, note 6 above. ↩
Manuel DeLanda, War in the Age of Intelligent Machines (New York: Zone Books, 1991). ↩