The Uncomfortable Truth About Mind Control
In the Sixties, a groundbreaking series of experiments found that 65 per cent of us would kill if ordered to do so.
We have vain brains; we see ourselves as better than we really are. We like to think that we exercise free will, that if we were put into a situation where we were challenged to do something we thought unacceptable, we would refuse. But if you believe that, then you are probably deluded.
I make this claim based partly on the work of the psychologist Stanley Milgram. Milgram devised and carried out ingenious experiments that exposed the frailty and self-delusion that are central to our lives. He showed how easy it is to make ordinary people do terrible things, and that “evil” often happens for the most mundane of reasons.
I first read about Milgram’s work when I was a banker in the Seventies, working in the City. I was so fascinated by his ideas that I re-trained as a doctor, with the intention of becoming a psychiatrist. Instead I became a science journalist. Recently I got the chance to make The Brain: A Secret History, a television series which reveals how much we have learnt about ourselves through the work of some of the 20th century’s most influential, and deeply flawed, psychologists.
In the course of making the series we found rare archive footage and first-hand accounts of the many inventive and sometimes sinister ways in which experimental psychology has been used to probe, tease, control and manipulate human behaviour. High on the list of psychologists I wanted to learn more about was Stanley Milgram.
The son of Jewish immigrants from Eastern Europe, Milgram struggled to understand how it was that German soldiers in the Second World War were persuaded to take part in barbaric acts, such as the Holocaust. As he once wrote: “How is it possible, I ask myself, that ordinary people who are courteous and decent in everyday life could act callously, inhumanely, without any limitations of conscience?”
Milgram was working as an assistant professor at Yale University in 1960 when he dreamt up an experiment that would try to answer that question. It was beautifully designed to reveal uncomfortable truths about human nature. Milgram described the moment he had the idea as “incandescent”.
Some claim that what Milgram did was ethically and scientifically dubious. I have always thought it was justified and hugely important, but I had never had the chance to interview any of the “volunteers” who had unwittingly taken part in his notorious experiment, to get their perspective.
Last summer, nearly 50 years after the original experiment, I finally met one of the few remaining survivors, Bill Menold. I talked to Bill in his kitchen, surrounded by his grandchildren, who were eager to hear his account.
In 1961 Bill Menold was 23 and had recently left the army. “I happened to see an ad in The New Haven Register and it said ‘memory and learning experiment’ and they were going to pay $4, and I thought, I’m going to be in New Haven that day, why not?”
He went along to a building where he met an earnest young man in a white lab coat – The Experimenter – and a middle-aged volunteer. The Experimenter told Bill that he would be the Teacher and the other volunteer would be the Learner. The Teacher’s task was to give the Learner a simple set of memory tasks, which he would then be tested on. If the Learner got an answer wrong, the Teacher had to give him an electric shock. If he continued to give wrong answers, the shocks would steadily increase.
Bill was left in a room with a microphone and a set of electrical controls. The Learner was put in another room, where Bill could hear but not see him. Then the experiment began. The Learner was a slow learner.
“Wrong – 150 volts.”
Bill sat at the desk, carrying out the task he had been asked to do. Despite the screams coming from the next room, he continued to ask questions and administer electric shocks when the Learner failed to answer correctly.
“Wrong – 195 volts.”
Even now he finds it hard to explain what went on inside him that day. “You are sitting in that chair with this stuff going on and that pressure that you were under, it’s very hard to think clearly. I’ve never had anything before or since that was like that. Where you were literally out of your mind.”
“Wrong – 350 volts.”
“I just said to myself, I’m just gonna play this out and pretty soon we’ll be out of here. I’m finishing this thing. I don’t care what happens. Once you make the decision, you’ve made your decision. I want to go home. I want to get out of here, go and get a beer somewhere and go home. You know?”
“Wrong – 450 volts.”
When I asked him if he thought he had killed the Learner, Bill replied, “Yeah. When he stopped responding.”
What Bill and the other volunteers who took part weren’t told was that the electric shocks were fake – and that both the Experimenter and the Learner were actors. The real purpose of the experiment was to see how far the volunteers would go. Stanley Milgram had asked colleagues how many people they thought would go all the way and administer a lethal 450-volt shock. Most said less than 1 per cent – and those would probably be psychopaths.
Yet Bill, like 65 per cent of the volunteers, gave an apparently lethal electric shock when told to do so.
I remember thinking, when I first read this, that such a figure was completely unbelievable. I was absolutely certain, and I’m sure everyone who read about Milgram’s work was equally certain, that I would never give a fatal electric shock to someone simply because I had been asked to do so by someone in authority. It is inconceivable that I could be manipulated in this way.
Perhaps, I thought, the volunteers had deep-down realised that this was a fake experiment, that they were just playing along. When critics put this point to Milgram he scathingly responded, “the suggestion that the subjects only feigned sweating, trembling, and stuttering to please the Experimenter is pathetically detached from reality, equivalent to the statement that haemophiliacs bleed to keep their physicians busy”.
Milgram argued that far from being in any way fake, his experiment demonstrated in a very stark way something that we all know happens, but which we can’t bring ourselves to believe. It is more comfortable to imagine that there was something uniquely evil or weak about German prison guards than to believe that most of us would behave the same way when faced by the same set of circumstances. “One of the illusions about human behaviour is that it stems from personality or character, but social psychology shows us that often human behaviour is dominated by the roles that we are asked to play.”
Bill was surprisingly sanguine about having been deceived, and very honest, particularly when he was talking about that moment when he abandoned his moral compass and handed over responsibility for his actions to the Experimenter. With the wisdom of hindsight he was able to admire the thoroughness of the experiment and the skill with which the actors had played their parts.
I think that a more legitimate criticism than “they were faking it” concerns the relevance of Milgram’s experiment to the real world. Perhaps people behaved the way they did largely because of the artificiality of the situation? In 1966, inspired by Milgram’s findings, a psychiatrist called Charles Hofling created a more realistic scenario.
He arranged for 22 nurses working in a large hospital to be rung, separately, by a man calling himself simply “Dr Smith”. Dr Smith told each of the nurses that he wanted them to give 20mg of a drug called Astroten to a patient, whom he named. Dr Smith also told the nurses that he was on his way to the hospital and would sign the necessary paperwork when he arrived.
The drug, an invention of the experimenters, had been placed in the drug cabinet several days before the telephone call with a prominent warning on its side that 10mg was the maximum safe dose. Despite this, and despite the fact that hospital protocol specifically stated that no drug should ever be administered based solely on a phone call, 21 out of the 22 nurses were preparing to give the 20mg dose when they were stopped. The nurses had bowed to the imagined authority of the “doctor”.
People obviously knew, long before Milgram and Hofling did their experiments, that humans have a tendency to follow orders blindly if those orders are presented in a plausible fashion by someone who is apparently in authority. What these experiments revealed was just how strong this “tendency” really is. Psychology, which is often criticised for discovering the bleeding obvious, had shown that it was capable of making surprising, original and disturbing contributions to our understanding of ourselves.
Some professional bodies, such as the US Army, responded to these findings by incorporating them into their training, making sure that would-be officers were aware of the pressures they might come under to follow orders they felt were unethical. Medical and nursing students are also now taught about the dangers of blindly following orders.
Others, such as the American Psychological Association, responded to criticisms of Milgram’s methods by adopting new guidelines for the treatment of volunteers in psychological experiments. In a more nebulous way, I also think Milgram contributed to the widespread questioning and suspicion of authority that was characteristic of his era, the 1960s.
Milgram’s own motivation for doing the experiments was not mistrust of authority, but the desire to understand why authority has such a hold over us. To find out more, he then took to the streets to see how people would behave in a situation where there was no obvious authority.
Milgram went with his students on to the New York subway. Their task was to approach passengers on the train and say, pleasantly: “I’d like your seat, please”. As Milgram pointed out beforehand, “If you ask a New Yorker if he would give up his seat to a man who gives no reason for asking, he would say ‘never’. But what would he really do?” The answer was that in just over half of all cases people gave up their seats when asked.
Recently I decided to repeat this experiment in a busy London shopping centre, with similar results. I was surprised by how many people complied with my completely unreasonable request, but even more surprised by how uncomfortable I found asking them to do it, something Milgram also discovered.
“I was about to say the words ‘excuse me, sir, may I have your seat,’ but I found something very interesting, there was an enormous inhibition, the words wouldn’t come out, I simply couldn’t utter them, there was this terrible restraint against saying this phrase.”
Although it was unexpected, Milgram thought that this was a hugely significant finding. He had found through his own personal experience just how important feeling socially awkward is when it comes to modifying behaviour. We don’t like breaking the social rules – whether it’s asking for somebody’s seat, or disobeying the instructions of somebody whose authority we have accepted.
In everyday situations there is an implicit set of rules about who is in charge, and violating those rules leads to feelings of embarrassment and awkwardness so intense that we prefer to accept the submissive role the occasion requires. It is a terrible indictment of human behaviour that we would rather let something terrible happen than act in a socially embarrassing manner. Yet it helps explain some of the chilling crimes you read about, in which someone is attacked, even murdered, in a public place and no one intervenes.
Now I’d like to believe that we have, as a society and because of what psychologists like Milgram have taught us, become less blind to the demands of authority. I’d like to believe that, but I don’t.
Dr Thomas Blass, Milgram’s biographer, recently asked himself that question.
“Would Milgram find less obedience if he conducted his experiments today? I doubt it. To go beyond speculation on this question, I gathered all of Milgram’s standard obedience experiments and the replications conducted by other researchers. The experiments spanned a 25-year period from 1961 to 1985.
“I did a correlational analysis relating each study’s year of publication and the amount of obedience it found. I found a zero-correlation – that is, no relationship whatsoever. In other words, on the average, the later studies found no more or less obedience than the ones conducted earlier.”
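For readers curious about what a “correlational analysis” of this kind involves, here is a minimal illustrative sketch in Python. The study years and obedience percentages below are invented placeholders, not Blass’s actual dataset; the point is simply to show the calculation he describes, pairing each study’s year of publication with the obedience rate it found and computing Pearson’s r, where a value near zero means no trend over time.

from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical replications: (year of publication, % of subjects fully obedient)
years = [1963, 1967, 1971, 1974, 1978, 1981, 1985]
obedience = [65, 61, 66, 63, 65, 62, 64]

r = pearson_r(years, obedience)
print(f"Correlation between year and obedience: r = {r:.2f}")
# An r close to 0, as Blass reports for the real studies, means later
# replications found neither more nor less obedience than earlier ones.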
There was a recent example of the continuing tendency towards blind obedience in the USA when a con man, dubbed “the modern Milgram”, made the staff of dozens of fast-food restaurants behave in an appalling fashion simply by ringing up and pretending to be a policeman.
He persuaded managers to strip-search their staff in search of stolen goods, to make them jog naked, even to strip off and appear naked in front of startled customers. One manager, who strip-searched an employee and was subsequently jailed, said, “I didn’t want to do it, but it was like he was making me”.
Milgram once wrote that we are “puppets controlled by the strings of society”. Yet it is also true that not all puppets jump when their strings are pulled. Many of the fast-food managers who were rung up by the “policeman” refused to follow his orders. In Milgram’s own experiment, although 65 per cent of the volunteers were prepared to give apparently lethal electric shocks, that still left 35 per cent who would not.
What no experimenter has yet been able to predict is which characteristics mark out those who will rebel from the rest. The only way you will ever know how you measure up is when you find yourself tested. You have a one in three chance of passing.