If a system of death camps were set up in the United States of the sort we had seen in Nazi Germany, one would be able to find sufficient personnel for those camps in any medium-sized American town. -Stanley Milgram
For those who may be unfamiliar, some quick background:
It wasn’t until the early sixties that Adolf Eichmann (one of the key architects of the Holocaust, and for whom the phrase “the banality of evil” was coined) was finally brought to trial. His primary defense was the one commonly used during the earlier Nuremberg trials: he was “just following orders.” To answer whether such a defense could even conceivably excuse an individual who participated in such atrocities, psychologist Stanley Milgram conducted a series of psychological studies on obedience to authority. He found that roughly two-thirds of people were willing to administer fatal shocks upon the command of an authority figure.
Let that sink in a second.
In the early seventies, Philip Zimbardo (another psychologist) conducted the famous Stanford Prison Experiment. Basically, he recruited a bunch of students, split them into groups of prisoners and guards, and then placed them into an immersive prison simulation. Within days, ‘prisoners’ were experiencing mental breakdowns and ‘guards’ were engaging in abuse. The experiment, scheduled to last two weeks, was aborted after six days.
In his excellent book The Lucifer Effect, Zimbardo reflects on the enormous power that social dynamics and situational forces exert on individuals. Given the right circumstances, we are all capable of acting heinously. But what are those circumstances? Zimbardo offers us ten, as lessons learned from Milgram. These are the steps that turn ‘good’ people ‘evil’ (evil traps):
- Establish contractual obligation.
- Give people meaningful roles such as ‘prisoner’ or ‘guard’.
- Enforce adherence to ‘the rules’ – arbitrary or changing, they must be followed!
- Euphemize (it’s not murdering a civilian, it’s “droning that guy”).
- Promote the diffusion of responsibility (“I just work here…”).
- Start small (the first shock in Milgram’s experiment was only 15 volts).
- Take small incremental steps (Think of the boiling frog).
- Have the authority start out ‘just’, and only gradually change to ‘unjust’.
- Make it hard to exit; allow the voicing of dissent, but insist on behavioral compliance.
- Offer the “big lie” or an ideology to justify the use of any means.
Take a second and try to see these at work around you. Most organizations set up many of these conditions. It’s really in #8 that the slide towards ‘evil’ happens; without it, we’re left with basically every bureaucracy on the face of the planet. But if we view #8 more as a decoupling from principle than as a slide towards the unjust, we can see even it at work in many bureaucracies. In this way, it is really just an instance of the organization’s leadership falling victim to #3. Absent some external correction mechanism, this situation leads to a sort of moral domain shear (MDS): the organization develops its own moral world, replete with an internal logic but disconnected from the surrounding world.
Here’s a quick example of what I mean. How is it that a soldier can go to war, kill people, and remain morally intact? Well, the soldier is no longer in the usual moral world when they go to war. They have left for a moral domain where killing is acceptable, as long as the rules are followed. But this is the same process at work, whether the result is judged good or bad.
Steve Biko, founder of the Black Consciousness Movement under apartheid rule, taught that all whites in South Africa shared in the responsibility for apartheid – not because they were white, per se, but because they participated in and benefited from the system. They were effectively members of an organization suffering from MDS, and he would not let them hide behind diffused responsibility (#5). Nor did the West allow Eichmann or the many other Nazis to shirk their responsibility.
When our governments (powerful organizations) take action, we the citizens (participants, not subjects) are responsible for it. Not solely responsible, but each of us owns a piece of it. Our governments act in our name. We, the people.
If MDS is practically a feature of organizations and can potentially ‘corrupt’ any of us, but we still retain responsibility for the group and moral culpability for ourselves, what can we do? What if the extent of our political voice is to choose the lesser of two evils?
Remember last week’s lesson: there is no box. If a system produces only unsatisfactory outcomes, maybe it’s time to realize that the system is the problem.
I, for one, wouldn’t want to put Milgram to the test on that opening quote. Maybe it’s time to choose a different system. One that doesn’t produce concentration camps.