Chapter 4 The Psychology of Obedience

The late Stanley Milgram is usually thought of as a social psychologist rather than a political one, not least perhaps because he spent his career in departments of psychology rather than political science; the term “political psychology,” as noted earlier, is used almost exclusively in the latter discipline. But in spite of the differing labels that practitioners of different fields attach to their work, Milgram could justifiably claim to be one of the most important political psychologists of his time. In fact, Milgram began his initial foray into the world of academia as a political science student, and remained interested in the psychological dimensions of political questions for the rest of his (tragically short) life. And although he made many contributions to our understanding of human behavior which have implications for politics, he will always be best remembered for his work on political obedience. Milgram asked: why do individuals so readily obey some “higher” authority such as the state, even when the demands of that authority come violently into conflict with the moral and ethical values most of us like to think we cherish? Through addressing this question, Milgram played an instrumental role in overturning (or at least reducing the appeal of) dispositionist accounts, particularly those which followed in the wake of the Nazi Holocaust and blamed that cataclysmic event upon the supposed “peculiarities” of the German people.

The “Authoritarian Personality”

In order to understand the true impact that Stanley Milgram’s work had on our understanding of both political obedience and the practice of genocide, we first need to understand the nature of the times in which he was working. The context for Milgram’s experiments was a then widely accepted (though always controversial) dispositionist theory known as the authoritarian personality.
In the 1940s and 1950s, social scientists from a variety of fields tried to come to grips with the horror of what had happened at concentration camps like Auschwitz and Dachau. Understandably, “why the Holocaust?” became one of the most frequently posed enquiries in social science after 1945. One popular answer to that question at the time suggested that there was something unusual or exceptional about the German people, something which made the Nazi genocide almost inevitable in retrospect. The nature of Germans themselves, and more particularly the authoritarian child-rearing practices in the homes within which they had grown up, was directly responsible for the creation of intolerant, conservative thinking, according to the authors of the book The Authoritarian Personality. First published in 1950 and authored by Theodor Adorno and his colleagues, the book argued that the roots of fascism are to be found in parental repression and authoritarianism.1 As James Waller notes, the book was heavily influenced by the Freudian psychoanalytic theories dominant at the time. It argued that:

The origins of this personality were in the innate, and socially unacceptable, drives of sex and aggression. When the restraints against the expression of these drives are unusually harsh, the individual becomes anxious, insecure, and unusually attuned to external authority sources for behavior guidance. This reverence for authority goes far beyond the normal, balanced, and realistic respect for valid authority that most of us have; it reflects an exaggerated, emotional need to submit.2

The harsh child-rearing practices of the parent also generate repressed fear and hostility, which eventually need some outlet.
This outlet comes in the form of displacement, often taking the form of hostility towards minority groups and, more generally, those who are “different.” Adorno and his colleagues also developed a scale—known commonly today simply as the “F scale”—which correlated a variety of personality traits with the susceptibility to believe antidemocratic or fascist propaganda.

Milgram’s Experiments

Stanley Milgram was very much a situationist, and as such he was suspicious of theories like the one above which attribute behavior solely to people’s dispositions. As a follower of the social psychologist Solomon Asch, he intuitively believed that if you place people in a powerful enough situation, they will go against their dispositions: their beliefs, their values, even their own eyesight. In the 1950s Asch had conducted what became a very famous series of experiments, in which he asked people to estimate the lengths of a series of simple lines.

[Figure 4.1 The cards used in Solomon Asch’s experiments on social pressure]

For instance, let’s say that you are shown three lines on the right, labeled A, B, and C, as seen in Figure 4.1, and then asked which of the three lines is closest in length to the line presented on the left. Asch found, as expected, that practically everyone gets the answer right when asked such questions. This was not an earth-shattering result, since the questions were so easy that a child of four or five could have answered them correctly. But this was just a baseline condition, in which people were asked to figure out the correct answers on their own. The real purpose of the experiment was to investigate what came next, when he placed his subjects into groups and again asked them to perform the same simple task. But there was an interesting piece of deception involved this time. In one variation, he had a “real” subject placed in a room with six others who were in effect actors pretending to be fellow subjects.
While for the sake of believability these fake subjects sometimes got the answers right, Asch rigged the experiment so that the six individuals would sometimes collectively give the same wrong answer to a question, and then another, and another, leaving the real subject with a difficult dilemma. For instance, they might claim that option B on the right was closest in length to the line on the left. Suppose that you are in this situation yourself. What would you do? Would you stand up and tell the other six “you’re all wrong, and I’m right. Can’t you see that the option you’ve selected is obviously the wrong answer?” Or would you feel embarrassed and go along with the majority, even though you know they are giving the wrong answer? Would you feel uncomfortable questioning the judgment and intelligence of six strangers? Would you start to question your own judgment and intelligence? Or would you start to think that you might well need to visit an optometrist? Asch found that the latter scenarios were by far the most common; in other words, the vast majority of subjects simply went along with the group’s faulty judgment, even though they knew (or suspected) that their assessment was simply wrong.3 Seventy-five percent of his subjects in a series of trials went against their own judgment at least once when the group collectively gave a wrong answer.

Milgram was very much interested in how social pressures like these can affect the judgment of individuals, and after a great deal of thought he came up with a highly inventive research design—like Asch’s, involving a clever piece of deception—which would gain him a measure of fame but also a reputation for controversy which would dog him for the rest of his life. He wanted to see how far people would go in following the commands of a “legitimate” authority when those commands became increasingly harsh and inhumane.
He created an experiment in which a man in a laboratory coat told subjects to administer increasingly harsh “electrical shocks” to a helpless victim.4 This was justified to the subjects as part of a supposedly scientific experiment on how people learn in response to punishment. In one classic condition, the “victim” could be heard but not seen behind a thin wall, though Milgram repeated the same experiment in a number of ways, each time varying the degree of proximity between the subject being told to administer the shocks and the “victim,” or varying some other aspect of the basic design. The subjects administered the shocks using what were supposedly higher and higher levels of electricity on a generator. In reality, the “victim” was an actor (an associate of Milgram) and was not actually receiving electrical shocks at all. The generator, too, was fake, but the experiment was set up in such a convincing way that the “teacher” (as the real subjects were termed) genuinely believed that he or she was shocking the “learner” (the actor). Prior to his experiment, Milgram conducted a poll of psychiatrists and psychologists, who predicted that less than 1 percent of subjects would go all the way on the “generator,” to the maximum charge of 450 volts.5 Amazingly, though, in the classic condition described above, 65 percent of subjects did this; in fact, they went all the way to a position labeled “danger” and then simply “XXX.” This was so despite the fact that when a certain level of shock was reached, the “victim” would cry out in pain and beg to be allowed to leave the experiment. Nor did the results change (as many people intuitively expect) when Milgram used women as subjects; average obedience remained 65 percent. This is perhaps surprising, since women could be seen either as less obedient (considered more compassionate) or more obedient (considered more passive). Interestingly, though, Milgram found that gender made very little difference, if any.
This was far from all Milgram found, however. He observed a number of interesting reactions in “obedient” subjects as they went about performing their tasks. All, with varying degrees of visibility, experienced strain and discomfort. Some laughed or cried; those who laughed, however, did so not out of sadism or cruelty but as a nervous reaction to stress, Milgram argued. The subjects also became preoccupied with narrow, technical aspects of the job at hand, and afterwards saw themselves as not responsible for their own actions. Here there are potentially interesting parallels with what happens to pilots who are asked to bomb civilian areas. In the Oscar-winning documentary film Hearts and Minds, for instance, one bomber pilot who had conducted numerous sorties in Vietnam said that he would become very preoccupied with the task itself when conducting bombing raids, and would not even think about the people he was dropping the bombs on. He relates that he felt like “an opera singer, conducting an aria.”6

Milgram also varied the form of the experiment in theoretically interesting ways. He wanted to see what effects, for instance, changing (a) the way orders were given, (b) the location of the experiment, and (c) the distance between subject and actor would have. One of the most interesting findings related to proximity or distance between the teacher and learner. As proximity between them increased, obedience decreased (although it did not disappear altogether). This was especially true when the subject and victim were placed in the same room. In the “touch–proximity” condition—in which the subjects were required to force the victim’s hand down onto the shock plate—obedience fell to just below 18 percent, and in the “proximity” scenario (where subject and victim were merely in the same room) it was only slightly higher, at 20 percent. It is noticeable, though, that even in this condition, obedience was still somewhat high.
The larger point of the experiment was simply this, however: Milgram had selected (by means of an ad in a local paper) ordinary, everyday, law-abiding members of the community of New Haven, Connecticut, obtaining a representative sample of the population across various socioeconomic, religious, and other characteristics. He had also weeded out anyone who seemed psychologically “abnormal”—especially anyone who showed overt signs of a sadistic personality—so that the actions of his subjects could not easily be attributed to their dispositions later on.7 He had then placed them in a situation in which their dispositions—especially their avowed moral or ethical beliefs—seemed to fall out of the picture. The heavy implication is that we are all capable of violating our most cherished principles and values when placed in a situation in which an authority perceived as “legitimate” urges us to obey. Approaches such as the authoritarian personality, on the other hand, are simply wrong, Milgram suggested, since they fail to take account of the ways in which social forces can be more powerful than dispositions in shaping behavior. They make the fatal error of assuming that “evil acts” must be the work of “evil people.”

The Banality of Evil

In his book Obedience to Authority: An Experimental View, Milgram draws parallels between his own work and Hannah Arendt’s analysis of Adolf Eichmann in her book Eichmann in Jerusalem.8 As a top Nazi official responsible for deporting Jews to the gas chambers during the Holocaust, but who had escaped Germany after the war, Eichmann had long been a target of Israeli intelligence. In 1960 he was discovered living under an assumed identity in Argentina, and was kidnapped by Israeli agents to face trial for his crimes. He was found guilty in Jerusalem and later executed. Arendt covered Eichmann’s trial, which was televised live in Israel, and what surprised her most was how ordinary he seemed.
But rather than the sadistic monster that most Israelis were expecting when they tuned in, they saw instead a rather dull and ordinary man standing in front of the court, a Nazi pen-pusher whose main job had been processing files and making sure that the trains deporting Jews ran on time. Arendt was strongly criticized for making this observation at the time, for reasons that are perhaps understandable, but she coined a phrase to describe Eichmann and those like him which has since become famous: “the banality of evil.” Her point was not that Eichmann should be absolved of responsibility for his actions—far from it. It was, rather, that evil is often the end result of a chain of actions for which no one individual bears sole responsibility, and that individual links in that chain can be (and frequently are) composed of the actions of what the historian Christopher Browning more recently referred to as “ordinary men.”9 Similarly, Milgram found that when responsibility for testing and punishing the “victim” was divided among a number of individuals, obedience increased still further. The potential political significance of this is evident, since political decision-making tasks of all kinds are often parceled out like this. Milgram calls this “socially organized evil,” where no one person has sole or exclusive responsibility for an act.

Considered together, the independent observations of Arendt and Milgram—the first anecdotal, the second experimental—carry a weight which many find convincing as an explanation of something which almost seems inexplicable: the systematic slaughter of the Jews in supposedly “civilized” countries at the very heart of Europe. Moreover, many of their observations make a good deal of sense when applied to both the Holocaust and more recent genocides.
In the case of Nazi Germany, it is clear that the slaughter of the Jews simply could not have been accomplished on the scale that it was had not ordinary, everyday members of German society—people who considered themselves otherwise decent, moral, and law-abiding—been willing to participate (in some cases, very directly) in a process whose objective was the extermination of other human beings whose only crime was being ethnically different from Adolf Hitler’s vision of what was “ideal.” We can also observe how thin the line is between what we conventionally call “good” and “evil”—and how easy that line is to cross—in the notorious case of the Rwandan genocide of 1994, in which neighbor killed neighbor on the basis of relatively short-lived racial “differences” which had in many ways been created by Western colonizers to suit their own purposes.

Why We Obey: The Drift Towards Dispositionism

Human beings, Milgram notes, live in hierarchical structures (family, school, college, business, military). This appears to be the result of evolutionary bias (hierarchy works), breeding a built-in potential to obey authority. Interestingly, this argument is the very antithesis of a rigid situationist approach such as S–R behaviorism, which treats the human brain as a “blank slate.” Milgram suggests that humans are born with a basic disposition to obey, an essentially dispositionist argument of the “dispositions-exist-at-birth” variety. Beyond this, however, his explanation is more situationist in nature. This evolutionary tendency, he argues, interacts with social structures and specific circumstances to produce specific cases of obedience.10 Certain factors made the subjects likely to obey before they even got to the experiment (such as the fact that we are socialized to obey “higher units” in a hierarchical structure), and these then interacted with the specific circumstances designed into the experiment to elicit obedience.
As individuals obeyed, they shifted into what Milgram calls the “agentic state”—a psychological condition in which the individuals no longer see themselves as responsible for their own actions.11

Milgram’s 35 (Or 50) Percent

A complicating factor for Milgram’s (mainly situational) paradigm is that there is evidence of cultural variation in the degree to which members of different societies obey authority. David Mantell, who repeated Milgram’s study in Munich, Germany in the early 1970s, found an obedience rate of 85 percent in the “classic” version of the experiment, a full 20 percentage points higher than the obedience rate in New Haven.12 Anecdotally, there is some interesting evidence that Rwandans may also be especially prone to obey authority. Asked why so many ordinary Rwandans in 1994 killed people who were in many cases their neighbors, Francois Xavier Nkurunziza, a lawyer from Kigali with a Hutu father and Tutsi mother, said:

Conformity is very deep, very developed. In Rwandan history, everyone obeys authority. People revere power, and there isn’t enough education. You take a poor, ignorant population, and give them arms, and say, “It’s yours. Kill.” They’ll obey. The peasants, who were paid or forced to kill, were looking up to people of higher socio-economic standing to see how to behave. So the people of influence... are often the big men in the genocide. They may think that they didn’t kill because they didn’t take life with their own hands, but the people were looking to them for their orders. And, in Rwanda, an order can be given very quietly.13

If true, this is ultimately quite compatible with Milgram’s approach. His theory of why people obey must leave some room for cultural differences in the propensity to obey, since humans are obviously socialized within different authority structures.
And in the Rwandan case, there is ample evidence that authority figures of all kinds—mayors, businessmen, even clergy—condoned or encouraged what occurred in Rwanda in 1994, the fastest genocide of the twentieth century.

More problematically, situationism arguably falls down in its inability to explain why a significant minority of individuals—fully 35 percent when the actor cannot even be heard crying out in the next room, a not insubstantial figure—refuse to obey authority when it violates conscience or values. And the figure goes up to as much as 50 percent when the actor can be heard, which leads Hibbing and his colleagues to question whether this should really have been a dispositionist analysis rather than a situationist one; if as many people disobey as obey, is the situation really driving behavior, or are differing predispositions?14 Milgram devoted less attention to the analysis of why some people disobeyed, but it is clear that for many of them their own personal experiences and values—their dispositions, in other words—mattered so much that they never felt that they “had no choice.” One of those who refused to shock the victim had been brought up in Nazi Germany (a medical technician given the name Gretchen Brandt in Milgram’s book); she clearly recognized the similarities between that very vivid series of events and what she was being asked to do. Another disobedient subject was a professor of the Old Testament, and we know that others simply refused to go along on the grounds that “this is wrong.” All of this suggests that dispositions matter for the 35 percent. The situation, moreover, was insufficiently powerful to shape the behavior of 80 percent of the subjects when they were required to shock a victim sitting directly in front of them.
Moreover, the fact that Milgram’s subjects were told that their actions would result in no damage to the health of the fake “subject” is at the very least a complicating factor, since it is plainly obvious to those who participate in genocides that they are doing real damage, of the very worst possible kind. There were many who refused to participate in the extermination of the Jews, and even a large number who actively worked against what the Nazis were doing. Oskar Schindler, the German industrialist who risked everything to protect hundreds of Jews, is perhaps the best known, but there were many others who risked even more than Schindler for complete strangers. Raoul Wallenberg and Per Anger, both Swedish diplomats, are together estimated to have saved as many as 100,000 Hungarian Jews from the gas chambers by using their diplomatic immunity to issue fake Swedish passports; German pastor Dietrich Bonhoeffer preached against the Nazi regime in his church, and was eventually executed for his “crimes”; and in the Rwandan case, the Hutu businessman Paul Rusesabagina—made famous by the film Hotel Rwanda—saved over a thousand Rwandans (most of them Tutsis) by sheltering them in his hotel and bribing local officials with whiskey, money, and other goods.15 As the authors of The Altruistic Personality suggest, it is clear that we can only explain the heroic acts of these “rescuers” by examining their dispositions.16

In addition, it is clear that Milgram’s paradigm on its own cannot fully explain all aspects of genocide, though it does illuminate many of the psychological forces which drag ordinary people along in its wake. One thing that is absent from Milgram’s experimental design but present in practically all genocides—as we shall see in Chapter 14—is the systematic dehumanization of victims.
As James Waller notes:

regarding victims as outside our universe of moral obligation and, therefore, not deserving of compassionate treatment removes normal moral restraints against aggression. The body of a dehumanized victim possesses no meaning. It is waste, and its removal is a matter of sanitation. There is no moral or empathetic context through which the perpetrator can relate to the victim.17

The dehumanization of Jews in Europe is but the most obvious form of this. Philip Gourevitch has chillingly described the ways in which Tutsis became dehumanized in the eyes of Hutus over many years prior to the Rwandan genocide of 1994. In the years before the genocide, he notes, “Tutsis were known in Rwanda as inyenzi, which means cockroaches.”18 Following a history of being discriminated against, the Hutus took power in the revolution of 1959; Tutsi guerrillas who periodically fought against the new order were the first to be described as “cockroaches.”19 The term would be invoked repeatedly on Rwandan radio after the death of Hutu President Habyarimana, as broadcasters urged Hutus to kill Tutsis. There can be few more demeaning or dehumanizing ways to consider another human being than to compare him or her to an insect.

Interestingly, the subjects in Milgram’s experiment did sometimes dehumanize the “learner” themselves—one obedient subject famously justified his actions afterwards by claiming that “he was so dumb he deserved it”—but this aspect was mostly absent from Milgram’s design. Also absent from Milgram’s experiment were the powerful emotional forces which attend genocidal acts. Apart from the obvious absence of ethnic hatred, there is no sense of humiliation on the part of those doing the “shocking” in Milgram’s laboratory.
As Adam Jones notes, “it is difficult to find a historical or contemporary case of genocide in which humiliation is not a central motivating force.”20 An obvious example is the sense of outrage which Germans felt after the imposition of the punitive Versailles Treaty in 1919. Combined with the hyperinflation of the 1920s and the Great Depression, this led many Germans to look around for a scapegoat upon whom blame for the various disasters could be heaped.21 Similarly, in Rwanda, Belgian colonizers and other Westerners had deliberately discriminated against Hutus and in favor of Tutsis, treating the latter as a privileged elite (and inevitably creating resentment amongst the former).22 In general, certain socioeconomic circumstances seem to give rise to genocide, or at least provide the enabling conditions for genocide to take place.23

While Milgram’s research convincingly illustrates the mechanisms which make it possible for normal, everyday people to commit atrocities, it could be argued that it cannot by itself serve as a fully comprehensive account of why genocide occurs. Milgram should not, of course, be held accountable for failing to reproduce all of the conditions typically associated with genocide in his laboratory—there are obvious practical and ethical limits to the things one can do in that kind of environment—and his work on obedience is hence only a starting point in our understanding of why genocides occur. On the other hand, Milgram often noted that he was able to elicit a quite extraordinary level of conformity in his subjects in the absence of any of the factors—ethnic hatred, dehumanization, humiliation, and economic distress—we have mentioned above. As Milgram put it at the end of his book:

The results, as seen and felt in the laboratory, are to this author disturbing.
They raise the possibility that human nature, or—more specifically—the kind of character produced in American democratic society, cannot be counted on to insulate its citizens from brutality and inhumane treatment at the direction of malevolent authority. A substantial proportion of people do what they are told to do, irrespective of the content of the act and without limitations of conscience, as long as they perceive that the command comes from a legitimate authority.24

Assessing Milgram’s Obedience Paradigm

As we did in the previous chapter with behaviorism, it seems appropriate to end with a look at the major strengths and weaknesses of Milgram’s approach. In Table 4.1 we summarize the main ones discussed in this chapter. While not exhaustive, they should help you decide for yourself where you stand on the utility (or otherwise) of Milgram’s experiments as an explanation for genocide and extreme political behaviors in general.

So far, we have analyzed two explanations of political behavior which emphasize the determining power of the social environment in shaping how we act: Skinner’s behaviorism and Milgram’s obedience paradigm. We began this book, the reader may recall, with a description of the Abu Ghraib scandal, which did a great deal of damage to the perceived legitimacy of America’s invasion of Iraq—and the general image of the United States—in the eyes of the world. In the next chapter, we examine another situationist perspective which may throw some light on the events at Abu Ghraib. Was the highly unethical behavior in which many of the prison guards engaged the product of mental abnormalities, the product of “a few bad apples,” as George W. Bush and other members of his administration insisted? Were their psychological dispositions to blame, in other words?
Or was their behavior encouraged by a set of situational inducements which might well have been repeated had an entirely different set of individuals played the same roles? This is the question to which we turn next.

Table 4.1 A summary of some of the arguments for and against Milgram’s obedience paradigm

For:
- Milgram convinced the vast majority of his subjects (65%) to go against their own dispositions (the power of the situation).
- He used quite minimal inducements to produce the high level of obedience observed (e.g. the “authority” was a man in a gray lab coat).
- His findings are supported by other related research in social psychology, such as that of Solomon Asch.
- His findings match the less systematic but interesting observations of others, such as Hannah Arendt.
- His finding that the level of obedience varies with proximity to the victim is borne out by the lessons of modern warfare.

Against:
- Milgram cannot explain the dispositionally driven behavior of the 35% who rebelled.
- There seem to be cultural differences in the propensity to obey, presumably related to differing dispositions between nations.
- Milgram himself offers a theory of obedience which is partly based on dispositions inherited through an evolutionary process.
- Many of the causal factors associated with genocides are absent from his experimental design.

Notes

1 Theodor Adorno, Else Frenkel-Brunswik, Daniel Levinson, and Nevitt Sanford, The Authoritarian Personality (New York: Harper, 1950).
2 James Waller, Becoming Evil: How Ordinary People Commit Genocide and Mass Killing (New York: Oxford University Press, 2002), p.77.
3 As we shall see in Chapter 6, this research also had a profound impact on another situationist, Irving Janis.
4 Stanley Milgram, Obedience to Authority: An Experimental View (New York: Harper & Row, 1974).
5 Milgram might not be that famous today if this poll had been accurate in its prediction.
Research that merely confirms the conventional wisdom rarely captures much attention!
6 Peter Davis (director), Hearts and Minds (BBS Productions, 1974).
7 Milgram, Obedience to Authority, p.15.
8 Ibid., p.5; Hannah Arendt, Eichmann in Jerusalem: A Report on the Banality of Evil (New York: Viking Press, 1963).
9 Christopher Browning, Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland (New York: HarperCollins, 1992).
10 Milgram, Obedience to Authority, pp.123–34.
11 Ibid., pp.132–34.
12 David Mantell, “The Potential for Violence in Germany,” Journal of Social Issues, 27: 101–12, 1971.
13 Quoted in Philip Gourevitch, We Wish to Inform You That Tomorrow We Will Be Killed with Our Families: Stories from Rwanda (New York: Picador, 1998), p.23.
14 John Hibbing, Kevin Smith, and John Alford, Predisposed: Liberals, Conservatives, and the Biology of Political Differences (New York: Routledge, 2014).
15 See Adam Jones, Genocide: A Comprehensive Introduction (New York: Routledge, 2006), pp.275–81.
16 Samuel Oliner and Pearl Oliner, The Altruistic Personality: Rescuers of Jews in Nazi Europe (New York: Free Press, 1988).
17 Waller, Becoming Evil, p.245.
18 Gourevitch, We Wish to Inform You That Tomorrow We Will Be Killed with Our Families, p.32.
19 Ibid., p.64.
20 Jones, Genocide, p.268.
21 Ibid., p.269.
22 Gourevitch, We Wish to Inform You That Tomorrow We Will Be Killed with Our Families, pp.47–62.
23 See Kristen Monroe, “Review Essay: The Psychology of Genocide,” Ethics & International Affairs, 9: 215–39, 1995.
24 Milgram, Obedience to Authority, p.189.

Suggested Further Reading

Hannah Arendt, Eichmann in Jerusalem: A Report on the Banality of Evil (New York: Viking Press, 1963).
Thomas Blass, The Man Who Shocked the World: The Life and Legacy of Stanley Milgram (New York: Basic Books, 2009).
Stanley Milgram, Obedience to Authority: An Experimental View (New York: Harper & Row, 1974).
Films

Obedience (1965): Released by Penn State University Press. Milgram was a pretty good amateur filmmaker, and this one shows the original experiments themselves in striking and dramatic detail that many people find both amusing and “shocking.” It can be hard to find and even harder to purchase, but it is absolutely essential viewing for all students if you can get a copy.
The Human Behavior Experiments (2006): ABC documentary which re-creates the Milgram experiment thirty years later—with some restrictions—coming to very similar results.