Psychology is perhaps most valuable when it exposes our inherent biases and what’s commonly called irrational thinking. You’d expect people to learn from the findings of surprising psychology experiments and correct their behavior accordingly. For most people, that isn’t the case. So what, in turn, are we to learn from all this? One obvious conclusion is that psychology only helps those who are ready to revise their mental paradigm, rather than remain overconfident in the face of overwhelming statistical evidence to the contrary. In other words, psychology helps those who “think slow”, a point beautifully illustrated by Daniel Kahneman in his book “Thinking, Fast and Slow”.
Psychology: it tells us that most people think they’re better than everybody else
Kahneman’s book reads like a detective story, but the plot isn’t about a crime; it’s about the characters: you, the reader. The author doesn’t speculate, but grounds his arguments in facts and data published in scientific journals to show that we aren’t the perfectly rational beings most economic theory assumes and describes us to be.
One of the leading examples is a now-famous 1975 study by social psychologists Richard Nisbett and Eugene Borgida of the University of Michigan, built around the so-called “helping experiment”. In the study, participants arrived in pairs, and each entered an opaque booth close to the other. Participants were asked to talk about their lives with the other person, whom they couldn’t see, via an intercom, taking two-minute turns to chat. The experiment turned interesting when one participant (a trained actor) feigned a seizure while the intercom was on. The actor first let out a painful groan, then asked for help, then collapsed (you could hear him hit the floor).
So, what do you think happened? Surprisingly, most of the participants (11 out of 15) didn’t rush to help, presumably because individuals feel relieved of responsibility when they know that others have heard the same plea for help. This was just a hypothesis, of course, and there are many other possible explanations, but the fact of the matter is that most participants didn’t come forward to help a man in apparent mortal peril, even though most people would agree that the situation called for actively assisting the person in distress.
But, again, do people learn anything from the psychology behind this experiment? In a follow-up, Nisbett and Borgida showed pictures and videos of various people from the opaque booths and asked participants to judge whether these people would be likely to help a stranger in distress. Half were informed of the statistics, namely that an overwhelming majority hadn’t come to anyone’s aid, while the other half was kept in the dark. You’d think those who were told about the results would make more reserved predictions, yet both groups predicted equally rosy outcomes.
A study that revisited Milgram’s famous, shocking experiment seems to follow the same pattern. The experiment is so darn unethical that most psychology professors wince at it, but it reveals so much about obedience. The 1961 experiment conducted by Stanley Milgram was very simple: participants were kindly invited to administer electric shocks to a person, from 15 up to 450 volts in 15-volt increments, whenever said person failed to provide a correct answer. Unbeknownst to them, the person receiving the shocks was an actor, and the shocks themselves weren’t real. Before the experiment began, surveys suggested that only one in 1,000 people would go all the way up to 450 volts. Guess how many people turned the knob to the absolute limit? Some 65% of participants zapped people with a painful 450 volts (it doesn’t matter that the shocks weren’t real; as far as the participants knew, they were as real as they get).
At the time, Milgram’s experiment showed us that there’s a concentration camp guard living inside each of us, ready to renounce personal responsibility in the face of authority. And we’re not talking about a life-and-death situation: participants weren’t threatened with being put against the wall had they refused to comply (they were merely commanded at each step to continue and assured that the person receiving the shocks wasn’t suffering, despite the obvious moans and screams of pain). A similarly striking obedience pattern can be observed in the Stanford prison experiment.
Psychology: useless for the mainstream public?
While Milgram’s study is highly interesting, it was Professor Michael Hobbiss from Boston School, UK, who chose to relate it to Nisbett’s and Borgida’s study. When Hobbiss asked his class, “Had you been a participant in Milgram’s experiment, would you have gone all the way to 450 volts?”, only 10% of participants answered that they would. In a second round of surveys, Hobbiss revealed the results of the experiment, in which 65% of the volunteers went to maximum voltage, and asked again: would you have gone all the way up in light of this new information?
After being apprised of Milgram’s findings, the proportion rose to somewhere between 20% and 30%. A significant improvement, yet one that still leaves 70-80% of people thinking they would be among the 35% of defiant participants. Clearly these figures don’t add up, and, like Nisbett’s and Borgida’s study, the findings paint a clear picture: when faced with a psychological problem, most people ignore additional information and carry on handling it just as they would have had they never learned about the new findings in the first place. Bottom line: we ignore the evidence of what other people normally do and assume that we are different. Most people say they’re not like most people, which of course can’t be true: it simply doesn’t add up. Keep this in mind the next time you face a psychological, moral, or ethical dilemma.
In the end, Kahneman tells us that there are two modes of thinking: a fast one, in which we dismiss new information that confounds our worldview (system 1), and a slow one, in which we reason more thoroughly (system 2). Most people think they’re using system 2, but what Kahneman shows, illustrating with various examples, is that system 1 takes the lead in most situations. With this in mind, why do professors bother teaching psychology? Well, there are still some people who choose to learn from psychological findings, and although they represent a minority, maybe these people alone are worth all the pains. Whatever the case, try to use system 2.