Facts, Beliefs, and the Brain: How Propaganda, Ideology, and Donald Trump Inhabit the Group Mind
IN THIS ARTICLE
In the human experience, political ideology and propaganda have played powerful roles in forging group identity. In the evolution of the human species, beliefs have been as powerful as facts and truths. Knowledge of this research and this political reality can help us understand contemporary politics: why lies continue to shape political discourse, and why populist messages are resilient even when they are wrong.
In The Atlantic (March 11, 2017), Caitlin Flanagan wrote an essay arguing that the blizzard of satire lampooning President Donald Trump from late-night comics and cable television shows has not (so far) diminished the faith of Trump's base of followers. In the essay, titled "How Late Night Comedy Alienated Conservatives, Made Liberals Smug and Fueled the Rise of Trump," Flanagan contended that, to the contrary, the cacophony of mocking comedy has deepened polarization and hardened President Trump's support. Flanagan said,
“Though aimed at blue-state sophisticates, these shows are an unintended but powerful form of propaganda for conservatives. When Republicans see these harsh jokes—which echo down through morning news shows and the chattering day’s worth of viral clips, along with those of Jimmy Kimmel, Stephen Colbert and Seth Meyers—they don’t just see a handful of comics mocking them. They see HBO, Comedy Central, TBS, ABC, CBS and NBC. In other words they see exactly what Donald Trump has taught them: that the entire media landscape loathes them, their values, their family, and their religion. It is hardly a reach for them to further imagine that the legitimate news shows on these channels are run by similarly partisan players—nor is it at all illogical. No wonder so many of Trump’s followers are inclined to believe only the things that he or his spokespeople tell them directly—everyone else on the tube thinks they’re a bunch of trailer-park, Oxy-snorting half-wits who divide their time between retweeting Alex Jones fantasies and ironing Klan hoods,” (2017).
Flanagan has a point, and it turns out that what she has observed might be coded into the human condition. Public criticism of the president and his followers may have the effect of making them "circle their wagons" and strengthen group identity, partly due to human evolution. In the field of political propaganda, this result is akin to watching the use of anchoring myths backfire in sensational and unwanted ways.
Frustrated progressives and angry liberals have been asking themselves when the 2016 Trump voters will wake up, show some "buyer's remorse," and begin to withdraw their support for the President. Such thinking may be an unhealthy train of thought, and in any case the awakening is not likely to happen any time soon. What liberals believe is obvious to them will likely not be obvious at all to the Trump supporter sold on the vision of "Making America Great Again." That slogan means many different things to different people, which is part of its virtue as a political slogan. No one needs to articulate the vision specifically as long as the slogan binds the group together in a shared sense of identity. The voters who supported President Trump will probably stick with their "man" for a long time, and many forever.
Elizabeth Kolbert, author of the book The Sixth Extinction (2014), wrote an article for the New Yorker entitled "Why Facts Don't Change Our Minds: New Discoveries about the Human Mind Show the Limitations of Reason" (February 27, 2017). In it, Kolbert explains why it is so difficult to change the minds of the ideologically convinced. From the study of the intersection between propaganda and ideology we know that of all the forms of persuasion in rhetoric and communication, the most difficult kind is known as "response changing persuasion." Response changing persuasion involves "asking people to switch from one attitude to another…People are reluctant to change; thus to convince them to do so, the persuader has to relate the change to something in which the persuadee already believes" (Jowett and O'Donnell, pp. 38-39). Finding the right "something" is not easy to do. One of the greatest challenges in communication is to change a person's mind once it has been made up. The difficulty increases when the subject is a matter of personal belief, and the individual wants to believe in their worldview or ideology no matter what the facts actually say. Ultimately, when it comes to political ideology, changing people's patterns of belief requires skill, patience, tenacity, and luck.
Peter Neal Peregrine, a professor of anthropology at Lawrence University (writing in The Conversation, February 22, 2017), noted a distinction between two common modes through which human beings determine what we call facts. In modern times, the predominant mode of understanding "facts" has been Science. In the article, titled "Seeking Truth from Alternative Facts," Peregrine argues, for example, that the claim about the "massive and unprecedented" size of the crowd at President Trump's Inauguration was viewed as silly by most observers because, from a scientific perspective, the claim was empirically false. Science does not employ alternative facts (Ball, 2017). Science (we believe) makes judgments based on established bodies of method, theory, and logical argument. In the end, the "alternative fact" claiming the "largest audience to ever witness an Inauguration—ever" was materially false because of what was observable and measurable. The scientific perspective helps determine the "truth" (such as we may know it) by empirical observation, measurement, and methods that always maintain the prospect of falsifiability: that a theory or observation may be disproven. In his own research, Peregrine readily admits that sometimes two archaeologists can look at the same artifact and be uncertain whether it is a stone or an ancient tool. To make a determination, archaeologists apply careful rules, methods, and measurements according to their scientific discipline. In the end, the marshaling of material evidence will tip the decision on the "truth" one way or another, until it may be disproven.
Science versus Authority
In 2017, the Trump administration is often operating within a different and older tradition of marking what is true. This method, Peregrine suggests, stands in contrast to Science and is known as the argument from authority. Prior to the rise of Science in the Enlightenment and the Scientific Revolution, the authority of those with power determined the nature of Truth. In the realm of propaganda, we see the importance, under such circumstances, of the propagation of ideas, so that the facts or truths we accept are largely determined by what we believe, by faith, and by what the authorities or the "Powers that Be" tell us to believe. The rule of authority long predates the Middle Ages and runs deep in the human experience.
The Enlightenment (the 17th and 18th centuries) gave the world Science as we know it today. The scientific method was a human creation, aimed at challenging the venerable modes of judging truth, especially as they related to the natural world and the lives of people. For millennia, human beings judged between competing claims of truth based upon whatever the people in power said was true. There was no separation between facts and values; shamans, kings, emperors, prophets, or popes ordained the truth. What anyone might have seen, measured, or reasoned did not matter. [It is instructive to recall the story of Galileo and his struggle with the Roman Catholic Church authorities over his telescopic discoveries about the solar system. In the end Galileo's science did win, but not without a fight.] In the book Ignorance: How It Drives Science (2012), Stuart Firestein exposed a significant distinction between Science and Authority: the substantial role that ignorance plays in driving scientific endeavor and discovery. Knowing what you do not know, and then working through it, is how science progresses. In Firestein's view, the scientific method is only part of the story; the marshaling of data and testing that leads to facts is, by itself, simply a process that corroborates or disproves theories. Much of science is also about intuition and serendipity, and in the end, facts remain disinterested. As Firestein says, "Thoroughly conscious ignorance is…the prelude to discovery" (p. 57). Ignorance, in other words, inspires both our imaginations and our potential discoveries. In contrast, the guidance of "authority" stifles imagination in the cold hands of power and interests.
Gunther Stent's introductory essay to James Watson's The Double Helix (Watson's personal account of the discovery of the structure of DNA) illustrates the modes of scientific inquiry. Few discoveries have had more impact on civilization than the double helix. Stent wrote, "Just as the Renaissance sprang from the confrontation of the Christian West with the Muslim East, so molecular biology sprang from the confrontation of genetics with biochemistry" (p. xi). In other words, ideas and new ways of thinking are born in competition and in struggles over competing forms of truth. Science does not move in a hierarchical fashion; at its best, it is a contest of ideas. In our times, many in the West believed the battle between Science and Authority to have been largely settled. Before the Renaissance (1300-1700), the methods for determining truth rested on Authority, and those with social, political, and economic power determined the truth. The Renaissance sparked a revolution against Authority. The story of the discovery of DNA's structure, as told by Watson, shows the interplay of human intuition, personality, and empirical observation. Science evolved to explode the arguments of Authority; ultimately the truth is a matter of conscious ignorance, experimentation, measurement, and proof.
The claims often made by Donald Trump and his administration about facts and reality thus have an old-world quality. Peregrine asked a sincere question: if we believe that those with alternative facts are empowered to shape the truth based simply on their authority, are we, as a civilization, moving backward in time, past the Enlightenment and into the Middle Ages? Scientific data, no matter how carefully collected and measured, do not carry much weight against arguments based on authority. A good comparison is Evolution versus Creationism. Creationists claim the Earth and all life were created by God, and their accounts rest on authority, especially the authority of religious belief. Therefore, no matter how high the biologist may pile the scientific data concerning evolution and genetics, it is next to impossible to challenge the authority of Creationism. The beliefs of Creationists may remain impervious. What a scientific view calls a false claim can be, in the eyes of the true believer, absolutely true.
In his classic 1951 work The True Believer, Eric Hoffer explained what he called "the peculiarities common to all mass movements."
Hoffer's argument was quite simple: human imagination, and the possibility that illusion may control human perception and cognition, is a real and dynamic force in human history. Beliefs are truly the "stuff that dreams are made of," and people will die for what they believe, even if it is wrong, because they believe it to be true. Humanity may look back over its history of glories and tragedies and ask why. The answer lies partly in our brains, and it may also lie in the natural power of Fear, which generates hatred, illusion, and anxiety.
Eric Hoffer made a most compelling suggestion when he wrote: "Passionate hatred can give meaning and purpose to an empty life. Thus people haunted by the purposelessness of their lives try to find a new content not only by dedicating themselves to a holy cause but also by nursing a fanatical grievance. A mass movement offers them unlimited opportunities for both" (p. 92). Hoffer was right, and it is not simply the frailty of the human heart; it rides in our genes and our biology, too. The current populist wave of anger against the establishment in America, Britain, France, and across the West is partly inspired by a sense of grievance and fear, and in those conditions facts matter far less than what people prefer to believe.
In Ignorance: How It Drives Science (2012), Firestein observed: "Because you see, the single biggest problem with understanding the brain is having one. Not that it isn't smart enough. It isn't reliable" (p. 125). Human beings are vulnerable to fanatical grievances because of the fear of what is not, the desire to control what has not happened, or the drive for things that do not yet exist. The human brain is powerful and yet unreliable. People will act on the fear of what they do not know and cannot predict. Fear is deeply embedded in the evolutionary development of the human species, and fear helps human communities survive. But it is the easiest passion in the human heart to manipulate, and the most dangerous. Fear can bring out the very worst in human behavior and thinking. People will easily fear and hate those who seem to stand between them and what they believe they want. Human beings can fall prey to believing that something they never had has been taken away from them by some other. Such fantasies helped inspire the Holocaust. Dreams, beliefs, and, worse, lies can all become living nightmares.
The Human Brain: Powerful and Unreliable
The social scientific and biological evidence clearly suggests that human beings do not easily change their minds or beliefs once they are established. Elizabeth Kolbert (New Yorker, February 2017) cites several psychological studies, beginning with one conducted at Stanford University in 1975. Undergraduate students were given pairs of suicide notes; in each pair, one note had been written by a random individual and the other by a real victim of suicide. In the experiment, some students found they had a gift for correctly identifying the real suicide notes, while others found they were terrible at the task. Of course, all of the scores were fictions. All of the genuine notes had come from the coroner's office, and the students who had been told they were generally correct were, on average, no more successful than the students told they had guessed wrong. These deceptions were revealed in the second phase of the experiment. In that phase, another deception, the students were asked to estimate how many notes they had actually identified correctly. The students in the original high-score group believed that they had done very well, while the low-score group believed they had done worse than the other students, even though they had just been told that the earlier scores were fake and that neither group had performed better than the other. The students tended to hold on to the false results.
In her article, Kolbert discusses similar experiments from the 1970s and '80s, and the results are the same: "Even after evidence for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs" (Kolbert, p. 3). The Stanford studies became famous, and thousands of experimental studies over time confirmed similar results. As Kolbert said: "Reasonable-seeming people are often totally irrational" (p. 3). In our times, with the notion that we are navigating a post-fact, post-truth environment, this understanding is all the more significant. Even so, as with the definition of political power, the real question is why. How and why do people act this way? Political power is understood to be generated by human relationships, which are built upon perceptions about motives and resources (Burns, 1978). People assess the motives and resources of one another, and followers choose to follow leaders based on those perceptions. Propaganda seeks to shape and manipulate human perceptions. The factors most central to fomenting political power are thus naturally embedded in propaganda. The development of propaganda as a tool was certainly no accident.
A new book, The Enigma of Reason, by Hugo Mercier and Dan Sperber (2017), may hold some clues. Reason, as we understand the human trait, likely evolved among human beings and human communities living on the African savannah. In The Sixth Extinction (2014), Kolbert carefully identified the one trait that allowed humanity to work and hunt cooperatively. This trait, related to the construction of our mouths and tongues, is the ability to communicate. The primary factor in human success, and in our ability to outcompete other animals and species, is this cooperative behavior. Kolbert links this factor to the long timeline leading to a current, human-inspired "sixth extinction." On our own we would not, and will not, survive. Mercier and Sperber's research lends further support to this idea: "Humans' biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain. …'Reason is an adaptation to the hypersocial niche humans have evolved for themselves'" (Kolbert, 2017). Habits of thinking that might seem strange or weird turn out to be very clever from a "social interaction" standpoint.
To illustrate this point, Kolbert (2017) uses the concept of confirmation bias: the tendency people have to accept information that supports their preexisting beliefs and to reject new information that conflicts with those beliefs. Stanford University again provided research that repeatedly confirms the concept. A classic Stanford study conducted by C. G. Lord, Lee Ross, and Mark Lepper in 1979 dealt with capital punishment. The researchers gathered a group of students, half of whom supported the death penalty and half of whom did not. The students were shown two studies: one provided data supporting the deterrence argument, that capital punishment deters crime and murder; the other provided data calling deterrence into question. As you might guess, the students who supported capital punishment found the pro-deterrence data credible, while the students who opposed it found the anti-deterrence data credible. In fact, the students who began the study in favor of capital punishment ended it even more in favor of the death penalty than before, and those who opposed the death penalty became more fervent in their opposition (Lord, Ross, and Lepper, 1979). Why? Confirmation bias.
Mercier and Sperber's evolutionary research suggests that this "myside" bias played a role in human "hypersociability" (Kolbert, p. 5). For human individuals, free riding is frequently attractive: getting whatever we can with as little invested as possible is basically rational for the individual. The problem is that free-rider behavior in groups is a catastrophe. Because human beings must live in groups to survive, hyper-social qualities came to be selected over the expanse of our evolution, and this selection process led to the irony of confirmation, or "myside," bias (Mercier and Sperber, 2017). One would think that confirmation bias, agreeing only with what my group believes, or with what I have always believed, in the face of facts to the contrary, would be dangerous. After all, adapting in response to new data would generally be wise, unless resisting it performed some adaptive function.