On Obedience as Identity: Milgram and the Banality of Evil
Stanley Milgram's now-famous obedience to authority (OTA) experiments, conducted at Yale in the early 1960s and reported in 1963, 1965, and 1974, shocked the world and remain among the best-known experiments in social psychology.
Attempting to find scientific explanations for the Holocaust (Russell, 2011), Milgram designed the experiments to test ordinary people’s susceptibility to authority. Subjects were instructed to administer increasingly strong electric shocks, via a specially designed fake shock generator, to a victim in another room, even when the victim protested (Milgram, 1963). In his ‘baseline study’ (the first official trial) (Haslam & Reicher, 2012), 65% of participants administered what could have been lethal electric shocks (Milgram, 1963).
Milgram's obedience studies have, as a result, been taken as evidence for what Arendt (Arendt & Elon, 2006) described as the ‘banality of evil’: people's desire to be good subjects and abide by authority seems greater than their desire to do good (Haslam & Reicher, 2012). Milgram (1974) theorised an 'agentic state', arguing that humans evolved to be submissive to hierarchy (Waller, 2007). In this agentic state, we act as agents of those in authority and thus divert responsibility away from ourselves.
However, as I argue below, Milgram's studies do not necessarily support this theory. Rather, as Haslam and Reicher (2011) argue, we need to look at the background of these experiments and explain why, and under which conditions, people abide by authority.
The Milgram experiments are not evidence that humans cannot help but obey authority. At best, they are evidence that obedience is likely under certain experimental conditions.
This essay thus offers a critical view of the conclusions drawn from Milgram’s work on obedience. First, I consider how Milgram arrived at his findings and reinterpret them as conditional. I then reconceptualise obedience as dependent on identity. Finally, I consider the contribution the OTA studies have made to our understanding of the psychology of evil in the Third Reich.
Milgram's studies have been widely replicated (Burger, 2009; Dambrun & Vatiné, 2010; Mantell, 1971; Zeigler-Hill, Southard, Archer, & Donohoe, 2013), and the similarly high completion rates suggest that Milgram's design has high internal validity. Indeed, Milgram himself conducted over 20 pilot studies with a range of variations. This careful calibration, however, is also one of the main critiques of Milgram's 'baseline study': it was tuned to achieve the highest possible completion rate. Milgram aimed for eye-catching findings that were “great drama as well as great science” (Haslam & Reicher, 2011), and he sought 'to create the strongest obedience situation' (Russell, 2011).
He introduced numerous strain-resolving mechanisms (SRMs) to reduce the tension participants felt and thus increase their likelihood of completing the experiment. These included placing the experiment in a social-learning context, selecting actors carefully on the basis of their personality characteristics, creating physical distance between the subject and the victim, and using a foot-in-the-door technique of increasing the shock intensity only slightly at each step. The foot-in-the-door technique practically rendered participants unable to justify quitting the experiment (Russell, 2011). By varying these SRMs across studies, Milgram obtained very different completion rates, ranging from zero to total obedience.
Moreover, Milgram has been accused of fraudulent research practices. Indeed, his attempts to fine-tune the outcome of his experiments seem to have had little to do with scientific inquiry and much to do with blatant manipulation. As Perry (2013) claims, there are severe ethical concerns with his studies, as participants were not properly debriefed (Andersson, 2014). Milgram also altered the experimental conditions, and did so inconsistently: for instance, the experimenter did not always stop after the four verbal prods Milgram described. Moreover, according to Perry, only 50% of participants actually believed the experiment to be real, and of these, two-thirds disobeyed.
This leads us to question what the experiments actually reveal about obedience. Certainly, they are not evidence that humans cannot help but obey authority. At best, they are evidence that obedience is likely under certain experimental conditions (Haslam & Reicher, 2011): with the right characters, the same physical distance between subject and victim, the same foot-in-the-door technique, participants who believe the experiment is real, pressure from the experimenter, and so on. Even a small alteration in the experimental setup can have great effects on obedience, as Milgram himself demonstrated with his pilot studies.
Obedience, then, is not a fact; it is conditional. It is also not an on/off condition but a scale: people are obedient under certain conditions until those conditions change, and different situations will produce different levels of obedience. If, in a hypothetical non-laboratory situation, people were instructed to torture a victim whose suffering they could see (as in Nazi Germany), mere obedience to authority would be very unlikely and would be expected to decrease as the victim’s suffering increased. That is to say, Milgram’s experiments lack considerable mundane realism.
After all, the 'baseline study' cannot be seen as representative of the many variants. Across the studies as a whole, the majority of participants disobeyed (Haslam & Reicher, 2012). A meta-analysis of Milgram's studies (Packer, 2008) revealed that participants who disobeyed were most likely to do so when they first heard the 'learner's' request to be released. This may suggest that, faced with conflicting requests (experimenter vs. learner), disobedient participants weighed the learner's rights above the researcher's demands (Packer, 2008). It may further be evidence of a change in participants’ social identification (Haslam, Reicher, & Birney, 2014; Haslam, Reicher, & Smith, 2012; Haslam & Reicher, 2011, 2012).
As Haslam and Reicher argue (Haslam & Reicher, 2011, 2012; Haslam et al., 2012) and confirm in a recent study (Haslam et al., 2014), compliance depends upon the commitment of participants to the experiment and the experimenter, that is, their identification with the scientific enterprise. When this identification is discredited (by the learner's request, or by the experimenter imposing himself upon participants with a direct command, the fourth prod), participants’ social identity changes: they are led to question the righteousness of their actions and disobey. In actuality, then, the studies seem to show a conflict, a dilemma between relationships. They are less about obedience than about the power of social identity-based leadership to create committed followers (Haslam et al., 2012). Whom participants identify with may in turn depend on the physical distance to the learner, the personality characteristics of learner and experimenter, and the affiliation of the experimenter with science and a scientific institution (Haslam & Reicher, 2011).
Obedience, then, is neither innate to human beings nor inevitable. Arendt’s ‘banality of evil’ (Arendt & Elon, 2006) is not a human condition, as Milgram (1974) claimed for his ‘agentic state’, nor is it about diverting responsibility. In fact, it has nothing to do with obedience at all. The evil committed in the Third Reich was committed out of an unawareness of doing evil, because “evil [had] lost its distinctive characteristic” (Arendt & Elon, 2006, p. xiii).
The Nazi regime had established evil as good, and good as evil; it had created a new ‘righteousness’. Indeed, doing evil requires identifying with the cause of evil. It is active, not passive; intended, not helpless (Haslam et al., 2012; Haslam & Reicher, 2012). In this sense, Milgram’s experiments reveal a similar attempt to redefine right and wrong: the experimenter made subjects believe that their wrongdoing was not only for a good cause (science) but ‘essential’. Subjects needed to identify with the scientific enterprise. Yet the ‘banality of evil’ in the Third Reich was not total obedience; rather, the supported cause (Nazi ideology) and the atrocities committed were not seen as wrong, and thus required not obedience but mere conformity.
To conclude: Milgram’s ‘baseline study’ was fine-tuned to achieve eye-catching results, and those results rested on invalid research practices. The OTA studies are merely evidence that obedience is likely under those particular experimental conditions; in a different scenario, disobedience might be more likely. Rather than taking Milgram's studies as evidence of the inevitability of obedience, his work can be taken as evidence for the power of social identity-based leadership. Indeed, the OTA studies seem to be about getting people to believe in the importance of science, and the extent of obedience seems to depend on the extent of scientific justification provided (Haslam et al., 2014, 2012; Haslam & Reicher, 2011, 2012). As a result, Milgram’s studies contribute little to our understanding of the Holocaust, because the Holocaust did not require obedience but conformity.
Andersson, T. (2014). Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments by Gina Perry. International Social Science Review, 89(1), Article 16.
Arendt, H., & Elon, A. (2006). Eichmann in Jerusalem: A Report on the Banality of Evil. Penguin Group US.
Burger, J. M. (2009). Replicating Milgram: Would people still obey today? The American Psychologist, 64(1), 1–11.
Dambrun, M., & Vatiné, E. (2010). Reopening the study of extreme social behaviors: Obedience to authority within an immersive video environment. European Journal of Social Psychology, 40(5), 760–773.
Haslam, S. A., & Reicher, S. D. (2011). After shock? Towards a social identity explanation of the Milgram “obedience” studies. British Journal of Social Psychology. Retrieved from http://onlinelibrary.wiley.com/doi/10.1111/j.2044-8309.2010.02015.x/full
Haslam, S. A., & Reicher, S. D. (2012). Contesting the “nature” of conformity: What Milgram and Zimbardo’s studies really show. PLoS Biology, 10(11), e1001426.
Haslam, S. A., Reicher, S. D., & Birney, M. E. (2014). Nothing by Mere Authority: Evidence that in an Experimental Analogue of the Milgram Paradigm Participants are Motivated not by Orders but by Appeals to Science. The Journal of Social Issues, 70(3), 473–488.
Haslam, S. A., Reicher, S. D., & Smith, J. R. (2012). Working toward the experimenter: Reconceptualizing obedience within the Milgram paradigm as identification-based followership. Perspectives on Psychological Science, 7(4), 315–324.
Mantell, D. M. (1971). The Potential for Violence in Germany. The Journal of Social Issues, 27(4), 101–112.
Milgram, S. (1963). Behavioral Study of obedience. Journal of Abnormal and Social Psychology, 67(4), 371–378.
Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human Relations, 18(1), 57–76.
Milgram, S. (1974). Obedience to authority: an experimental view. Harper & Row.
Packer, D. J. (2008). Identifying Systematic Disobedience in Milgram’s Obedience Experiments: A Meta-Analytic Review. Perspectives on Psychological Science: A Journal of the Association for Psychological Science, 3(4), 301–304.
Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. The New Press.
Russell, N. J. C. (2011). Milgram’s Obedience to Authority experiments: origins and early evolution. British Journal of Social Psychology, 50(Pt 1), 140–162.
Waller, J. E. (2007). Becoming Evil: How Ordinary People Commit Genocide and Mass Killing. Oxford University Press, USA.
Zeigler-Hill, V., Southard, A. C., Archer, L. M., & Donohoe, P. L. (2013). Neuroticism and negative affect influence the reluctance to engage in destructive obedience in the Milgram paradigm. The Journal of Social Psychology, 153(2), 161–174.