On March 16, 1968, Lt. William Calley ordered his troops to open fire on the small Vietnamese village of My Lai. Over 300 unarmed men, women, and children were massacred. Calley himself herded a group of villagers into a ditch and mowed them down with machine-gun fire. At his trial, Calley testified that a superior officer had ordered him to kill everyone in the village.
In 1997, 39 members of the "Heaven's Gate" cult committed suicide. Marshall Herff Applewhite started the cult using only his charisma and simple psychological techniques. He eventually convinced his followers that the world was about to be destroyed, that an alien spacecraft was traveling behind the comet Hale-Bopp, and that by killing themselves, their spirits could be saved by boarding it.
During World War II, Japanese kamikaze pilots volunteered for suicide missions as a way of honoring their emperor, whom they saw as a god. The September 11 attacks were most likely conducted with similar motives.
From the 1940s to the 1970s, scientists working for the Atomic Energy Commission (a predecessor of the US Department of Energy) conducted numerous dangerous experiments on unconsenting schoolchildren and members of the general public. In one such experiment, the breakfast cereal served to boys at a state school was laced with radioactive tracers to test the uptake of radiation into the bones.
During the Holocaust, concentration camp guards poured Zyklon B into gas chambers, forced prisoners to live under inhuman conditions, and carried out macabre and painful medical "experiments" on children. Residents of nearby towns ignored the constant smell of burning bodies.
After World War II, psychologist Stanley Milgram was troubled by the compliance of concentration camp guards. Popular opinion at the time held that a disposition toward violence was something peculiar to the German character. Milgram designed an experiment to test his hypothesis that, under the right conditions, the majority of people would willingly commit cruelty.
The test subject is told that he will be participating in an experiment on memory. The test subject and an actor engage in a lottery to decide who will play the role of teacher and who the learner. The actor is a likeable middle-aged man who mentions he has a heart condition. The lottery is rigged so that the test subject is always the teacher. The experimenter, who has a stern appearance and is dressed in a white lab coat, leads the learner (actor) into a small room where he is connected to electrodes. The teacher (subject) is then taken into an adjacent room, where he sits at a mock-up machine with thirty switches marked from 15 to 450 volts, in 15-volt increments. The switches are also labelled with descriptions ranging from "Slight Shock" to "Danger: Severe Shock". The subject is given a sample shock of 45 volts, strengthening his belief in the authenticity of the machine.
The experimenter instructs the teacher (subject) to read a list of word pairs into a microphone, which is connected to a speaker in the other room. The learner must then recall the second word of each pair. Each time a wrong answer is given, the teacher is instructed to give a shock, starting at the lowest voltage and moving up to the next higher voltage for each subsequent wrong answer.
Conflict arises when the man receiving the shock begins to show that he is experiencing discomfort. At 75 volts, he grunts; at 120 volts, he complains loudly; at 150, he demands to be released from the experiment. As the voltage increases, he begins to complain of heart problems. At 285 volts, his response can be described only as an agonized scream. Soon thereafter, he makes no sound at all.
Sixty-five percent of test subjects continued administering shocks up to the 450-volt limit, at the urging of the experimenter, even after the learner had stopped responding completely. Not a single subject refused before reaching 300 volts.
Morris Braverman, another subject, is a thirty-nine-year-old social worker. He looks older than his years because of his bald head and serious demeanor. His brow is furrowed, as if all the world's burdens were carried on his face. He appears intelligent and concerned.
When the learner refuses to answer and the experimenter instructs Braverman to treat the absence of an answer as equivalent to a wrong answer, he takes his instruction to heart. Before administering 300 volts he asserts officiously to the victim, "Mr. Wallace, your silence has to be considered as a wrong answer." Then he administers the shock. He offers halfheartedly to change places with the learner, then asks the experimenter, "Do I have to follow these instructions literally?" He is satisfied with the experimenter's answer that he does. His very refined and authoritative manner of speaking is increasingly broken up by wheezing laughter.
- Stanley Milgram
Milgram conducted several more variations on this experiment. He found that compliance decreased significantly when the experimenter was less impressively dressed. It also decreased when the learner was physically closer to the subject,
and increased when the experimenter was closer. Compliance was greatest when the learner was not visible to the subject.
In 1971, psychologist Philip Zimbardo conducted an experiment at Stanford University to simulate the conditions in a prison. Twenty-four student volunteers were divided into "prisoners" and "guards." The experiment was planned to last two weeks, but it was terminated after only six days, when the situation began to get out of control. The guards became increasingly sadistic, and the prisoners began to show signs of extreme stress and nervous breakdown, even though they were free to leave the experiment at any time. Even ordinary Stanford students can fall prey to this behavior pattern.
The vice president of Arthur Andersen in charge of the Enron account ordered employees to shred thousands of incriminating Enron documents. These were CPAs who must have known that this was illegal, yet not a single one refused or alerted the authorities. The vice president was impressively dressed and nearby, while the victims were far away and unknown to the auditors. As in Milgram's experiment, the shredders did not feel responsible for their own actions.
When a large group of people forms around an ideology, individuals within the group come under pressure to stop thinking critically. Our instincts tell us to conform, even when the rational mind is skeptical. This results in cognitive dissonance. An individual experiencing cognitive dissonance instinctively attempts to reduce it, either by adding cognitions consistent with his instincts or by discarding inconsistent ones. People therefore tend to react emotionally against an argument that contradicts the group ideology without ever rationally considering it.
For instance, the people shredding Enron documents rationally knew that what they were doing was wrong, but were emotionally driven to do it anyway. To reduce the dissonance, they probably added consistent cognitions ("I'm only following orders"; "everyone else must know something I don't") and marginalized inconsistent ones ("It's not a big deal").
This pattern can be seen in groups of almost any type. Religions and political parties rely on it for their very existence. Until we can learn to stop allowing emotions to overpower reason, we will always be vulnerable to ourselves.
[Ed. Note - this was originally written as a school assignment. It's handed in now, so I thought the K5 audience might enjoy it. Mr. Kennedy, if you're reading this... don't flip out.]