You're not as virtuous as you think you are - Nitin Nohria
The Washington Post
October 18, 2015 08:00 MYT
I've been teaching Stanley Milgram's electric-shock experiment to business school students for more than a decade, but "Experimenter," a movie out this week about the man behind the famous social science research, illuminates something I never really considered.
In one scene, Milgram (played by Peter Sarsgaard) explains his experiment to a class at Harvard: A subject, assigned to be the "teacher," is ordered to administer increasingly intense shocks to another study participant in the role of "learner," allegedly to illustrate how punishment affects learning and memory.
Except, unbeknownst to the subject, the shocks are fake, the other participant works for the lab and the study is about obedience to authority. More than 60 percent of subjects obeyed fully, delivering up to the strongest shock, despite cries of pain from the learner.
Those cries were pre-recorded, but the teachers' distress was real: They stuttered, groaned, trembled and dug their fingernails into their flesh even as they did what they were asked to do.
"How do you justify the deception?" one student asks.
"I like to think of it as an illusion, not deception," Milgram counters, claiming that the former has a "revelatory function."
The student doesn't buy it: "You were delivering shocks, to your subjects, psychological shocks . . . methodically for one year."
Before seeing the film, I didn't fully appreciate that parallel. In the grainy black-and-white documentary footage that the real-life Milgram produced, he remains off-camera. I'd never put much thought into the moral dilemma he faced. I'd never asked myself what I would have done in his position.
I'm fairly certain that - even in an era before institutional review boards, informed consent and mandatory debriefings - I would have determined that it's wrong to inflict that much psychological distress. But I can't be absolutely sure.
When I ask students whether, as participants, they would have had the courage to stop administering shocks, at least two-thirds raise their hands, even though only one-third of Milgram's subjects refused.
I've come to refer to this gap between how people believe they would behave and how they actually behave as "moral overconfidence." In the lab, in the classroom and beyond, we tend to be less virtuous than we think we are. And a little moral humility could benefit us all.
Moral overconfidence is on display in politics, in business, in sports - really, in all aspects of life. There are political candidates who say they won't use attack ads; then, late in the race, behind in the polls and under pressure from donors and advisers, their ads turn increasingly negative.
There are chief executives who come in promising to build a business for the long term, but who then condone questionable accounting gimmickry to satisfy short-term market demands.
There are baseball players who shun the use of steroids until they age past their peak performance and start to look for something to slow the decline.
These people may be condemned as hypocrites. But they aren't necessarily bad actors. Often, they've overestimated their inherent morality and underestimated the influence of situational factors.
Moral overconfidence is in line with what studies find to be our generally inflated view of ourselves. We rate ourselves as above-average drivers, investors and employees, even though math dictates that can't be true for all of us.
We also tend to believe we are less likely than the typical person to exhibit negative qualities and to experience negative life events: to get divorced, become depressed or have a heart attack.
In some ways, this cognitive bias is useful. We're generally better served by being overconfident and optimistic than by lacking confidence or being too pessimistic. Positive illusions have been shown to promote happiness, caring, productivity and resilience.
As psychologists Shelley Taylor and Jonathon Brown have written, "These illusions help make each individual's world a warmer and more active and beneficent place in which to live."
But overconfidence can lead us astray. We may ignore or explain away evidence that runs counter to our established view of ourselves, maintaining faith in our virtue even as our actions indicate otherwise. We may forge ahead without pausing to reflect on the ethics of our decisions.
We may be unprepared for, and ultimately overwhelmed by, the pressures of the situation. Afterward, we may offer variations on the excuse: "I was just doing what the situation demanded."
The gap between how we'd expect ourselves to behave and how we actually behave tends to be most evident in high-pressure situations, when there is some inherent ambiguity, when there are competing claims on our sense of right and wrong, and when our moral transgressions are incremental, taking us down a slippery slope.
All these factors were present in Milgram's experiment. The subjects felt the pressure of the setting, a Yale University lab, and of prompts such as "It is absolutely essential that you continue." There was ambiguity surrounding what a researcher might reasonably request and what rights research subjects should demand.
There also was a tension between the subjects' moral obligation to do no harm and their obligation to dutifully complete an experiment they had volunteered for and that might contribute to the broader advancement of science and human understanding.
Additionally, because the subjects were first asked to administer low-voltage jolts that increased slowly over time, it was tricky for them to determine exactly when they went too far and violated their moral code.
For a real-world example, consider Enron. Employees were under extraordinary pressure to present a picture of impressive earnings. Ambiguities and conflicts were built into the legal and regulatory systems they were operating in.
And so they pushed accounting rules to their limits. What began as innovative and legitimate financial engineering progressed to a corporate shell game that met the letter of the law but flouted its spirit - and ultimately led to Enron's collapse.
This is not to say that Enron's top executives - Kenneth Lay, Jeffrey Skilling and Andrew Fastow - were good people, but to emphasize how others throughout the organization got caught up in morally troublesome behavior.
We would see fewer headlines about scandal and malfeasance, and we could get our actions to better match our expectations, if we tempered our moral overconfidence with some moral humility. When we recognize that the vast majority of us overestimate our ability to do the right thing, we can take constructive steps to limit our fallibility and reduce the odds of bad behavior.
One way to instill moral humility is to reflect on cases of moral transgression. We should be cautious about labeling people as evil, sadistic or predatory. Of course, bad people who deliberately do bad things are out there. But we should be attuned to how situational factors affect generally good people who want to do the right thing.
Research shows that when we are under extreme time pressure, we are more likely to behave unethically. When we operate in isolation, we are more likely to break rules. When incentives are very steep (we get a big reward if we reach a goal, but much less if we don't), we are more likely to try to achieve them by hook or by crook.
I teach a case about an incentive program that Sears Auto Centers had in the 1990s. The company began offering mechanics and managers big payments if they met certain monthly goals - for instance, by doing a certain number of brake jobs.
To make their numbers, managers and mechanics began diagnosing problems where none existed and making unnecessary repairs. At first, employees did this sporadically and only when it was absolutely necessary to make quota, but soon they were doing unneeded brake jobs on many cars. They may not have set out to cheat customers, but that's what they ended up doing.
Along with studying moral transgression, we should celebrate people who do the right thing when pressured to do wrong.
These would include whistleblowers such as Jeffrey Wigand of the tobacco industry and Sherron Watkins of Enron. But we also can look to the civil rights movement, the feminist movement and the gay rights movement, among others, to find people who used their ingenuity and took great risks to defy conventions or authorities they considered unjust.
I teach another case study in which a senior banker asks an associate to present data to a client that makes the expected returns of a transaction look more attractive than they actually are.
When I ask students how they would respond, most say they'd initiate a conversation with the boss in which they gently push back. I then role-play as a busy banker who's on the phone, annoyed at the associate showing up to talk about this issue again: "Why are you back here? Haven't you done it yet? I'll take responsibility - just do as I instructed."
Asked what they'd do next, the students generally fall into two groups: Most say they'd cave and go along with the instructions, and some say they would resign. What's interesting is how they stake out the two extreme positions; very few have the imagination to find a middle ground, such as talking to peers or to senior employees beyond the boss, or seeking out an ombudsman.
The aim in teaching this case is to help students see ways to behave more resourcefully and imaginatively in the face of pressure, and to adopt a wider perspective that offers alternative solutions.
Leaders have an additional duty to reduce the incentives and pressures in their organizations that are likely to encourage moral transgressions - and to clear a path for employees to report behavior that steps over boundaries.
Some professions have had success implementing formal codes of conduct. Doctors look to the Hippocratic Oath as a simple guide to right and wrong, and police read suspects the Miranda rights to limit their own sense of power and make clear that arrestees should not feel powerless. Unfortunately, in business, such scripts remain less developed. Vision and values statements are common. But Enron's certainly didn't stand in the way of wrongdoing.
In the absence of effective moral codes, or in combination with them, a culture of "psychological safety" can help people find moral courage. The concept, pioneered by my Harvard colleague Amy Edmondson, is that organizations should encourage employees to take risks - report mistakes, ask questions, pitch proposals - without the fear that they will be blamed or criticized.
Edmondson found that high-performing medical teams had higher rates of reported errors than teams with lower performance. The teams comfortable reporting errors were able to collectively learn from their mistakes. Similarly, places where employees are comfortable objecting to moral pressure or flagging moral transgression have the best chance of correcting course.
Stanley Milgram defended his study and its methodology in public without hesitation. In his notebooks, however, there's a hint of moral wavering: "If we fail to intervene, although we know a man is being made upset, why separate these actions of ours from those of the subject, who feels he is causing discomfort to another."
Milgram apparently overrode that instinct. But we shouldn't. If we all maintain a healthy dose of moral humility, we can avoid the blind obedience of the subjects in his experiment, as well as the harm we can unwittingly cause when in positions of authority.
* Nitin Nohria is the dean of Harvard Business School.
** Views expressed here are strictly the author's and do not necessarily reflect Astro AWANI's.