In the original Milgram experiments, participants acted as teachers and were told by the experimenter to shock their student (a confederate) whenever the student failed to give correct answers in a simple word-pairing test. Milgram tested the participants' obedience under multiple conditions, which ranged from requiring the teachers to physically place the confederate's hand on the shock plate for each incorrect response, to having the experimenter deliver orders by phone, to telling the teachers that the confederate had a heart condition. Combinations of confederates and experimenters were also tested.
The central finding was that, under most conditions, the majority of participants shocked the confederate up to the highest level (450 volts). This majority shrank when disobedience was modeled by others, when the authority figure was remote, or when the participant had to physically place the confederate's hand on the shock plate (the Touch-Proximity condition). These results were completely unexpected. As Blass (1991) notes in his critique of Milgram's research, "When asked to predict the outcome of the obedience experiment, neither a group of Yale students nor a group of psychologists were even remotely close to predicting the actual result" (p. 398). Students and psychologists alike were unable to entertain the possibility that a majority of participants would choose to obey an experimenter's commands, even when it meant shocking another human with 450 volts of electricity.
Milgram's experiments were replicated in several other Western nations (Blass, 1991), with similar results: a majority of participants went all the way to the highest shock level. Blass describes an even more startling 1976 variation on the original experiment (Blass, 1991, p. 400). The experimenter told one group of participants that, after a session of shocking the victim, they would trade places and become learners themselves. The possibility of retaliation did not affect obedience: 81% of the participants still shocked the learner all the way to the highest level.
Milgram's experiments, and the replications and variations that followed, show distressing results. Many theories have been proposed to explain why the rate of obedience in these experiments is so high. Milgram focused on the situation as a factor, noting, for example, that when the authority was remote or divided (experimenters arguing with each other), the rate of disobedience went up. Blass suggests that while the situation plays a role in obedience, certain personality elements also contribute to whether one obeys or disobeys authority; among these are religious belief and level of suspiciousness. As evidence for this supposition, Blass (1991) describes a 1972 dissertation outlining the effects religious belief (or the lack thereof) can have on obedience: highly religious people were less likely to question authority and were also more obedient (pp. 404-5).
Given the high percentage of people in America who profess belief in God, and the Judeo-Christian religious tradition America was founded on, the results of the Milgram study seem more rational. A religious person brought up to defer to authority is more likely to obey than an atheist. When Milgram's study was replicated in West Germany, where the population is less religious, the rate of obedience was just over fifty percent. Essentially, the people in the Milgram and Milgram-inspired studies were not being cruel because they wished to be; they were attempting to conform to the norms of Western culture.
It is this attempt at conforming to norms that produces a disturbing dualism in Westerners. They must reconcile their internal ethical sense with the ethics of Western culture, which often permit obedience to authority to supersede one's personal judgments. The complicit Germans in Nazi Germany may have considered Hitler's policies wrong, but in the Judeo-Christian context of their culture, they subordinated their opinions and obeyed. This was generally accomplished by shifting responsibility: Milgram noted that while many of the participants did not enjoy shocking the learner, they were more likely to go to the highest level when the experimenter claimed responsibility.
The Stanford prison experiment, unlike Milgram's obedience experiments, did not deal with subjects who wished to give up responsibility for their actions. It dealt with participants who wished to assume full responsibility for their actions.
The experiment divided an even number of men randomly into two groups: guards and prisoners. The prisoners endured a realistic mock arrest and were placed in a simulated prison run by the guards. Zimbardo, Haney, and Banks, the Stanford experimenters, were attempting to analyze the interactions between guards and prisoners in order to understand the brutal behaviour of guards in real prisons. They used only subjects who scored within the normal, average range on personality tests.
The guards were told that they could discipline the prisoners and assign them various jobs; no physical violence was permitted. The experimenters were amazed at how quickly the prisoners and guards adopted their assigned roles: the prisoners became passive and submissive, while the guards grew increasingly brutal. At one point a guard locked a prisoner in solitary confinement all night because he thought the experimenters were not being harsh enough with the prisoners.
In his response to a criticism of the Stanford experiment, William DeJong points out that while the participants did not agree on a specific stereotype of a prisoner, nearly all agreed that guards "would be oppressive and hostile" (DeJong, 1975, p. 1014). As in Milgram's experiments, it is cultural norms that informed the participants' behaviour. While prisoners are often portrayed in Western culture as either passive or aggressive, prison guards are generally portrayed as brutal, sadistic people. The guards in the Stanford experiment merely took this cultural cue and used it as a behaviour model in the experiment.
The study was terminated early due to intense psychosomatic illness among some of the prisoners and physical violence from the guards, even though the experimenters had clearly forbidden violence at the start of the experiment. Given that all the participants had tested as normal and showed no signs of deviance, the experimenters were surprised at how swiftly the roles of guard and prisoner were adopted. However, given Western cultural norms that reward aggression in males and provide a stereotype of a particularly aggressive male in this very situation (guarding a jail full of prisoners), the guards were actually conforming to societal norms. The prisoners, though they lacked a clear stereotype, also followed societal norms in letting the dominant males (the guards) take control of the situation.
The thread linking these two studies is social norms. In both Milgram's experiments and the Stanford experiment, the participants were doing their best to conform to social norms. The dissonance lies in the fact that these norms were presented in a context that went against the internal norms of most of the participants. However shocking these studies might seem, the fact remains that in real life people are sometimes placed in situations where their internal ethics conflict with those of society. In such cases, one must either go against social norms and follow one's internal ethics, or sublimate one's own ethics by shifting responsibility onto authority or society. Because the majority of Westerners wish to be accepted, when their norms conflict with society's they choose society's, unless they witness other members of society violating those norms; then they can redefine the social norms to reflect the existence of a social group compatible with their internal norms. This does not occur often in Western society, and so Westerners are forced into a state of ethical and moral duality, periodically choosing whether to violate dissonant social norms or to force their internal norms to conform (Levy, 1997).
It is, in fact, this duality that renders the Milgram and Stanford studies unethical. In testing how far people would push themselves to conform to social norms, the experimenters were forced to violate accepted social norms themselves. Both Milgram and Zimbardo et al. had to choose between following ethical psychological principles and not getting the results they needed, or violating some of those principles and getting controversial but important results. In both studies, participants were deceived about the amount of physical and psychological stress they would have to endure. Common sense dictates that if the participants had been fully informed, the experimenters would have had far fewer subjects to test. And though the experimenters violated social norms in misleading the participants, the extensive debriefings provided at the end of the studies served as a counterbalance; no participant suffered long-term repercussions from the studies.
Though many psychologists have argued that Milgram's obedience experiments and the Stanford prison experiment were unethical and cruel, they served a purpose that justified the temporary cruelty endured by the participants. These studies emphasized the duality of people living in a Western, Judeo-Christian culture. By subjecting participants to social norms that were highly dissonant with their internal norms, the experimenters were able to clearly show the battles normal, ordinary people must fight with themselves when their culture endorses actions and situations that they disagree with (Levy, 1997).
Carl Sagan, in his book Shadows of Forgotten Ancestors, mentions an experiment involving macaques that was similar to the Milgram experiments. Macaques were fed only if they shocked a fellow macaque; nearly ninety percent of them chose to starve rather than do so. Sagan uses this anecdote to criticize human ethical and moral standards (Sagan, 1992, pp. 117-8). His criticism, while justifiable, neglects the fact that Western human society is far more complex than macaque society. Obedience to authority, even when it involves torturing another, and willingness to act as a brutal guard may seem cruel, but they are socially sanctioned actions within the context of Western civilization. As the civilization has grown, so have its ethical and moral norms, and because of this complexity, one's internal ethics will sometimes conflict with society's. Nazi Germany is an example: the majority of Germans were conforming to social norms, even though they often disagreed with them. According to Milgram, "Even Eichmann was sickened when he toured the concentration camps" (Milgram, 1974).
This is the price one pays for living in a technologically advanced, complex society. It is possible that in less complex, non-Western societies, people have few or no societal conflicts with their internal norms. This conflict is what remains at the heart of the Milgram experiments and the Stanford prison study. In a complex society, one must occasionally do cruel things because they are social norms; one must be of two minds, both good and bad, alternating with changing cultural mores. The unasked question in these studies, the question that has not been definitively answered, is this: does one wish to exist in a society that forces one to behave evilly or cruelly on occasion in return for technological and social advancement? Sagan terms the macaque experiment a "Faustian bargain". Perhaps the term is better applied to Western civilization.
Blass, T. (1991). Understanding Behavior in the Milgram Obedience Experiment: The Role of Personality, Situations, and Their Interactions. Journal of Personality and Social Psychology, 60, 398-407.
DeJong, W. (1975). Another Look at Banuazizi and Movahedi's Analysis of the Stanford Prison Experiment. American Psychologist, 30, 1014.
Levy, A. (1997). Obedience and Individual Responsibility. A Heart of Good and Evil. Available: http://web.archive.org/web/19970704234734/http://www.sas.upenn.edu/~adlevy/evil.html [1997, December 3].
Milgram, S. (1974). The Perils of Obedience. Harper's Magazine. [Online]. Available: jmcneil.sba.muohio.edu/Private/PerilsofObedience.html [1997, November 30].
Sagan, C., and Druyan, A. (1992). Shadows of Forgotten Ancestors. New York: Random House.