
Would You Punish Someone with Electric Shocks If Told to Do So?

Stanley Milgram’s infamous experiments into obedience to authority gave us solid, replicable data. But how should we interpret it?
Image by Yale University Manuscripts and Archives.

The banality of evil.

They were just following orders.

People absolve themselves of any responsibility when there is someone in charge.

These ideas have permeated our thinking, in part because of the defense many Nazi officers gave during the Nuremberg trials, and in part because of Stanley Milgram’s infamous experiments into obedience to authority. Even when the victim begged from the other room not to be shocked, so many of Milgram’s subjects kept increasing the voltage all the way to the “severe shock” setting because a man in a lab coat told them it was OK.

The Milgram experiments are probably the most famous and influential studies in all of psychology. They have been discussed in the context of the law, business ethics, and Holocaust studies. They have also whipped up a frenzy of criticism and backlash over the decades since their publication, leading researchers to argue about the ethics of deception and of causing distress to research participants (distress that manifested as nervous laughter, sweating, and even uncontrollable seizures) in the name of conducting high-impact studies into big ideas.

To this day, alternative explanations to the “just following orders” theory continue to be proposed. The fact that Milgram’s seminal explorations are still being cast in a new light over half a century after they began is an important reminder that scientific data on its own is not enough.

It needs to be interpreted.

From 0% to 93% fully obeyed the experimenter

We have all heard a version of the story of Milgram and the electric shocks. In the 1960s, participants (then known by the clinical and dehumanizing moniker “subjects”) were told by an experimenter to administer progressively higher electrical shocks to another human being when presented with wrong answers, and against expectations, most of these participants—two thirds, in fact—went all the way to the point where the victim stopped complaining and was presumed dead. Except that the participants had been deceived and no electrical shocks were actually delivered. The experimenter and victim were actors.

The reality is much more complicated. There was no one study. Milgram actually conducted over twenty of them in an exploratory process, changing variables as the results came in to test different ways in which obedience might be curbed or encouraged, after having refined his approach through pilot studies. The notion that two thirds of participants delivered maximum shocks comes from the first official study Milgram published on this topic, but the rest of the experiments, written about in his 1974 book Obedience to Authority, showed a wide range of complete obedience, from 0% to 93%, depending on the specific experimental conditions Milgram was testing.

There seems to have been a natural progression for Stanley Milgram, who had just received his doctorate in psychology and been hired by Yale as an assistant professor, to study obedience to authority. He had worked with Solomon Asch, who had investigated conformity and shown that, when embedded in a team of actors in cahoots with the researcher, a naïve study participant would go along with the group even when the group’s answer was clearly wrong. The problem with Asch’s research was that it focused on eyeballing the length of lines drawn on cards. Milgram was looking for something more relevant to the real world, something having to do with destructive acts.

Stanley Milgram was also Jewish, and that provided an added incentive for him to research how people responded to authority. He was born in New York City in 1933, the year the Nazi Party came into power in Germany. Relatives of Milgram’s immediate family who had survived the concentration camps came to stay with his parents in 1946. And as a symbolic historical echo, the televised trial of Adolf Eichmann, the architect of the Holocaust, began a few months before Milgram’s first research participants came to the Yale University campus to be tested. Eichmann was hanged five days after the end of Milgram’s first study.

Many members of the Nazi leadership explained away their actions as having simply followed orders, an excuse that became known as the Nuremberg defense. Milgram’s experiments certainly seem, at first glance, to confirm the banality of evil: that the capacity to commit atrocities resides in all of us and simply awaits the right circumstances to be made manifest. The fact that Milgram’s results have been replicated time and time again, by different researchers all over the world, using both men and women, lends credence to the idea.

However, all that these replications show is that the results Milgram got were real, not that the banality of evil is necessarily true. So many of Milgram’s participants—ordinary adults from the community around Yale University—were willing to shock a stranger because he had made a mistake in a memory exercise. They were told by a man in a lab coat to please go on; that the experiment required them to continue; that it was absolutely essential to continue; that they had no other choice and had to go on. Thus, they kept flipping the switches, despite the pounding on the wall, the screams, the demands to be let out. This much is true.

But why did they do it?

A noble goal

I was recently asked by a journalist if I thought that the open data movement, which calls for the free and transparent availability of scientific data to everyone, would help solve the problem of people disbelieving scientific facts and embracing fake science and conspiracy theories. I don’t think so, because a lack of data is not the main issue here. The issue is how to interpret the data we have.

Milgram wrote a whole book about his experiments, and Yale University holds an archive of the 720 individual experimental sessions he conducted between August 1961 and May 1962, including audio recordings of those sessions. We have the data. Yet here we are, sixty years later, and psychologists still do not agree on exactly why so many of Milgram’s participants acted the way they did. There are theories, of course, but no consensus.

Milgram himself began by suspecting that his obedient subjects had chosen to cooperate with the experimenter, and that these experiments would help people make better choices under duress. He subsequently came to embrace a different explanation: submission. The participants willing to keep shocking despite the feedback they were receiving from the victim had relinquished their agency, he proposed. They were, essentially, just following orders.

There is something troubling about this explanation, especially since Milgram’s experiments are often used to help make sense of how the Nazis went about killing six million Jews, up to a quarter of a million people with disabilities, thousands of queer people, and more. There is also evidence that the Nuremberg defense of “just following orders” was actually a conspiracy: it had been discussed and adopted prior to the trial as the defendants’ official excuse. In truth, there were many examples of Nazi leaders reminding their people of the nobility of their cause.

Likewise, a team of researchers reinterpreting Milgram’s data has argued that it can be read in a similar light: his participants did not simply absolve themselves of all responsibility because someone else was in charge, but rather wanted to actively contribute to an important research project. Milgram, in fact, made sure that his participants saw his experiments as worthy and noble. It can be argued that the participants wanted to help the experimenter answer important scientific questions and were willing to keep pushing because the experimenter told them it was in pursuit of this crucial goal and that the shocks would not result in permanent tissue damage. A later analysis of the data showed that one of the major points at which participants decided to stop administering the shocks was the 150-volt mark. This was the moment in the research script when the victim would first demand to be released. If the participant felt he was helping the experimenter with an important scientific goal, this is the point where that bond would be put to the test. At least, that’s the hypothesis, one of many.

And while there may be a similarity between Milgram’s participants and the Nazis in that both believed they were doing the right thing, there are also many points of divergence. In Milgram’s experiments, the victim was not seen as subhuman and the shocks were said not to cause permanent harm. The Nazis had a long-term goal but Milgram’s participants didn’t, beyond carrying out the instructions of the experimenter. The Holocaust unfolded over years, while each research participant spent maybe an hour in the laboratory. Importantly, Nazi leaders were in authority, while Milgram’s experimenter was seen as an authority because of the scientific context. Obedience to the man in the lab coat declined drastically when he was not in the room, whereas the Nazis’ political authority had to work at a distance.

Figuring out what is really going on in Milgram-type experiments will prove challenging, however. Research ethics in the 1960s were a lot looser than they are now. Because of abuses in medical research, such as the radiation studies of the Manhattan Project and the Tuskegee syphilis experiment, American guidelines on studies using human participants came about in 1966 and were tightened further in the decades that followed. In 2009, psychology researcher Jerry Burger pointed out that Milgram’s experiments could not be fully replicated under today’s stricter ethical standards. He followed this up by detailing his own partial replication of Milgram’s work on obedience to authority.

How did he do it in an ethical landscape much more concerned with the well-being of research participants? He used an extensive procedure to screen out anyone who might react negatively during the experiment; repeatedly told his participants they could withdraw during the study and still get paid; stopped the experiment at 150 volts, since those who chose to administer this voltage in the past tended to go all the way to the end anyway; gave his participants a thorough debriefing right after the experiment was done; and ended a session at the first sign of excessive stress. In the specific Milgram study he was attempting to replicate, 82.5% of participants had been willing to go past the 150-volt mark; in Burger’s version, 70% were, a difference that was not statistically significant.
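For readers wondering how an 82.5% rate and a 70% rate can end up “not statistically different,” here is a minimal sketch of the kind of calculation involved, a Fisher’s exact test comparing two obedience rates. The group sizes of 40 participants per condition are an assumption made purely for illustration; they are not figures taken from this article.

```python
# A back-of-the-envelope check on "not statistically different":
# Fisher's exact test comparing two obedience rates (the proportion of
# participants willing to go past 150 volts). The group sizes of 40 per
# condition are an illustrative assumption, not figures from the article.
from scipy.stats import fisher_exact

n_milgram, n_burger = 40, 40               # assumed sample sizes
obeyed_milgram = round(0.825 * n_milgram)  # 82.5% -> 33 participants
obeyed_burger = round(0.70 * n_burger)     # 70%   -> 28 participants

table = [
    [obeyed_milgram, n_milgram - obeyed_milgram],  # obeyed vs. stopped
    [obeyed_burger, n_burger - obeyed_burger],
]
odds_ratio, p_value = fisher_exact(table)
print(f"p = {p_value:.2f}")  # comfortably above 0.05 with samples this small
```

With groups that small, a gap of a dozen percentage points is well within what chance alone can produce.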

Burger opened the door to continuing to investigate obedience to authority in a more humane context, but his methodology was criticized by others. The subtitle of his paper was “Would people still obey today?”, and while he provided data indicating that not much had changed since the 1960s (and, by implication, the 1930s and 40s), his critics were not convinced the comparison could bear that weight. Given how widely known Milgram’s experiments are, we can wonder whether this knowledge has made people more empathetic and less willing to blindly obey authority. Burger, however, specifically screened out anyone who had taken more than two psychology classes to avoid people who might suspect what was going on. He also screened out anyone who might become genuinely stressed out by the protocol. All of this means that the number of people who would disobey today might have been undercounted.

Over the course of half a century, Stanley Milgram’s seminal explorations of authority and obedience have been scrutinized, denounced, criticized, and replicated. The fake box his participants used to deliver fake shocks can now be seen in a museum. Audio recordings of the experimental sessions have been listened to and commented on. And yet, we still do not know precisely why so many ordinary people thought they were punishing a fellow human being for the mistakes he had made and kept going… or why so many refused to participate past a certain point. It’s probably not just a question of the situation they were in, nor only a matter of who they were as people, but rather an interaction between situation and personality.

Psychology is often referred to as a “squishy science” because human behaviour is so much more fluid and harder to predict than the behaviour of a single atom. That doesn’t mean psychology is useless. It means the data we accumulate about human behaviour requires careful interpretation. Knowing what happens is only part of the answer. We must also know why it happens.

Take-home message:
- Stanley Milgram oversaw more than 20 different sets of experiments into obedience to authority in the 1960s, in which participants thought they were administering electric shocks to a victim; depending on the study design, from 0% to 93% of participants fully obeyed to the end
- There is still no consensus on why so many participants were obedient, though theories having to do with choice, submission, and the nobility of the scientific research have been put forward
- Because of stricter ethical rules protecting research participants, it is now impossible to fully replicate what Milgram did in the 1960s, although full replications were done decades ago and partial replications have been done recently

