ABOUT THE SPEAKER
Philip Zimbardo - Psychologist
Philip Zimbardo was the leader of the notorious 1971 Stanford Prison Experiment -- and an expert witness at Abu Ghraib. His book The Lucifer Effect explores the nature of evil; now, in his new work, he studies the nature of heroism.

Why you should listen

Philip Zimbardo knows what evil looks like. After serving as an expert witness during the Abu Ghraib trials, he wrote The Lucifer Effect: Understanding How Good People Turn Evil. From Nazi comic books to the tactics of used-car salesmen, he explores a wealth of sources in trying to explain the psychology of evil.

A past president of the American Psychological Association and a professor emeritus at Stanford, Zimbardo retired in 2008 from lecturing, after 50 years of teaching his legendary introductory course in psychology. In addition to his work on evil and heroism, Zimbardo recently published The Time Paradox, exploring different cultural and personal perspectives on time.

Still well-known for his controversial Stanford Prison Experiment, Zimbardo in his new research looks at the psychology of heroism. He asks, "What pushes some people to become perpetrators of evil, while others act heroically on behalf of those in need?"

TED2008

Philip Zimbardo: The psychology of evil

7,078,283 views

Philip Zimbardo knows how easy it is for nice people to turn bad. In this talk, he shares insights and graphic unseen photos from the Abu Ghraib trials. Then he talks about the flip side: how easy it is to be a hero, and how we can rise to the challenge.


00:13
Philosophers, dramatists, theologians have grappled with this question for centuries: what makes people go wrong? Interestingly, I asked this question when I was a little kid. When I was a kid growing up in the South Bronx, inner-city ghetto in New York, I was surrounded by evil, as all kids are who grew up in an inner city. And I had friends who were really good kids, who lived out the Dr. Jekyll and Mr. Hyde scenario -- Robert Louis Stevenson. That is, they took drugs, got in trouble, went to jail. Some got killed, and some did it without drug assistance. So when I read Robert Louis Stevenson, that wasn't fiction. The only question is, what was in the juice? And more importantly, that line between good and evil -- which privileged people like to think is fixed and impermeable, with them on the good side, and the others on the bad side -- I knew that line was movable, and it was permeable. Good people could be seduced across that line, and under good and some rare circumstances, bad kids could recover with help, with reform, with rehabilitation.
01:12
So I want to begin with this wonderful illusion by [Dutch] artist M.C. Escher. If you look at it and focus on the white, what you see is a world full of angels. But let's look more deeply, and as we do, what appears is the demons, the devils in the world. And that tells us several things. One, the world is, was, will always be filled with good and evil, because good and evil is the yin and yang of the human condition. It tells me something else. If you remember, God's favorite angel was Lucifer. Apparently, Lucifer means "the light." It also means "the morning star," in some scripture. And apparently, he disobeyed God, and that's the ultimate disobedience to authority. And when he did, Michael, the archangel, was sent to kick him out of heaven along with the other fallen angels. And so Lucifer descends into hell, becomes Satan, becomes the devil, and the force of evil in the universe begins. Paradoxically, it was God who created hell as a place to store evil. He didn't do a good job of keeping it there though.
02:10
So, this arc of the cosmic transformation of God's favorite angel into the Devil, for me, sets the context for understanding human beings who are transformed from good, ordinary people into perpetrators of evil. So the Lucifer effect, although it focuses on the negatives -- the negatives that people can become, not the negatives that people are -- leads me to a psychological definition: evil is the exercise of power. And that's the key: it's about power. To intentionally harm people psychologically, to hurt people physically, to destroy people mortally, or ideas, and to commit crimes against humanity. If you Google "evil," a word that should surely have withered by now, you come up with 136 million hits in a third of a second.
02:58
A few years ago -- I am sure all of you were shocked, as I was, with the revelation of American soldiers abusing prisoners in a strange place in a controversial war: Abu Ghraib in Iraq. And these were men and women who were putting prisoners through unbelievable humiliation. I was shocked, but I wasn't surprised, because I had seen those same visual parallels when I was the prison superintendent of the Stanford Prison Study. Immediately the Bush administration military said ... what? What all administrations say when there's a scandal: "Don't blame us. It's not the system. It's the few bad apples, the few rogue soldiers." My hypothesis is, American soldiers are good, usually. Maybe it was the barrel that was bad. But how am I going to -- how am I going to deal with that hypothesis? I became an expert witness for one of the guards, Sergeant Chip Frederick, and in that position, I had access to the dozen investigative reports. I had access to him. I could study him, have him come to my home, get to know him, do psychological analysis to see, was he a good apple or bad apple. And thirdly, I had access to all of the 1,000 pictures that these soldiers took. These pictures are of a violent or sexual nature. All of them come from the cameras of American soldiers. Because everybody has a digital camera or cell phone camera, they took pictures of everything. More than 1,000. And what I've done is I organized them into various categories. But these are by United States military police, army reservists. They are not soldiers prepared for this mission at all. And it all happened in a single place, Tier 1-A, on the night shift. Why? Tier 1-A was the center for military intelligence. It was the interrogation hold. The CIA was there. Interrogators from Titan Corporation, all there, and they're getting no information about the insurgency. So they're going to put pressure on these soldiers, military police, to cross the line, give them permission to break the will of the enemy, to prepare them for interrogation, to soften them up, to take the gloves off. Those are the euphemisms, and this is how it was interpreted.
05:00
Let's go down to that dungeon.

(Camera shutter)

(Thuds)

(Camera shutter)

(Thuds)

(Breathing)

(Bells)
06:49
So, pretty horrific. That's one of the visual illustrations of evil. And it should not have escaped you that the reason I paired the prisoner with his arms out with Leonardo da Vinci's ode to humanity is that that prisoner was mentally ill. That prisoner covered himself with shit every day, and they used to have to roll him in dirt so he wouldn't stink. But the guards ended up calling him "Shit Boy." What was he doing in that prison rather than in some mental institution? In any event, here's former Secretary of Defense Rumsfeld. He comes down and says, "I want to know, who is responsible? Who are the bad apples?" Well, that's a bad question. You have to reframe it and ask, "What is responsible?" Because "what" could be the who of people, but it could also be the what of the situation, and obviously that's wrongheaded.
07:34
So how do psychologists go about understanding such transformations of human character, if you believe that they were good soldiers before they went down to that dungeon? There are three ways. The main way is -- it's called dispositional. We look at what's inside of the person, the bad apples. This is the foundation of all of social science, the foundation of religion, the foundation of war. Social psychologists like me come along and say, "Yeah, people are the actors on the stage, but you'll have to be aware of what that situation is. Who are the cast of characters? What's the costume? Is there a stage director?" And so we're interested in, what are the external factors around the individual -- the bad barrel? And social scientists stop there, and they miss the big point that I discovered when I became an expert witness for Abu Ghraib. The power is in the system. The system creates the situation that corrupts the individuals, and the system is the legal, political, economic, cultural background. And this is where the power is of the bad-barrel makers. So if you want to change a person, you've got to change the situation. If you want to change the situation, you've got to know where the power is, in the system. So the Lucifer effect involves understanding human character transformations with these three factors. And it's a dynamic interplay. What do the people bring into the situation? What does the situation bring out of them? And what is the system that creates and maintains that situation? So my book, "The Lucifer Effect," recently published, is about, how do you understand how good people turn evil? And it has a lot of detail about what I'm going to talk about today. So Dr. Z's "Lucifer Effect," although it focuses on evil, really is a celebration of the human mind's infinite capacity to make any of us kind or cruel, caring or indifferent, creative or destructive, and it makes some of us villains. And the good news story that I'm going to hopefully come to at the end is that it makes some of us heroes.
09:20
This is a wonderful cartoon in the New Yorker, which really summarizes my whole talk: "I'm neither a good cop nor a bad cop, Jerome. Like yourself, I'm a complex amalgam of positive and negative personality traits that emerge or not, depending on the circumstances."

(Laughter)
09:37
There's a study some of you think you know about, but very few people have ever read the story. You watched the movie. This is Stanley Milgram, little Jewish kid from the Bronx, and he asked the question, "Could the Holocaust happen here, now?" People say, "No, that's Nazi Germany, that's Hitler, you know, that's 1939." He said, "Yeah, but suppose Hitler asked you, 'Would you electrocute a stranger?' 'No way, not me, I'm a good person.'" He said, "Why don't we put you in a situation and give you a chance to see what you would do?" And so what he did was he tested 1,000 ordinary people: 500 in New Haven, Connecticut, 500 in Bridgeport. And the ad said, "Psychologists want to understand memory. We want to improve people's memory, because memory is the key to success." OK? "We're going to give you five bucks -- four dollars for your time." And it said, "We don't want college students. We want men between 20 and 50." In the later studies, they ran women. Ordinary people: barbers, clerks, white-collar people. So, you go down, and one of you is going to be a learner, and one of you is going to be a teacher. The learner's a genial, middle-aged guy. He gets tied up to the shock apparatus in another room. The learner could be middle-aged, could be as young as 20. And one of you is told by the authority, the guy in the lab coat, "Your job as teacher is to give this guy material to learn. Gets it right, reward him. Gets it wrong, you press a button on the shock box. The first button is 15 volts. He doesn't even feel it." That's the key. All evil starts with 15 volts. And then the next step is another 15 volts. The problem is, at the end of the line, it's 450 volts. And as you go along, the guy is screaming, "I've got a heart condition! I'm out of here!" You're a good person. You complain. "Sir, who's going to be responsible if something happens to him?" The experimenter says, "Don't worry, I will be responsible. Continue, teacher." And the question is, who would go all the way to 450 volts? You should notice here, when it gets up to 375, it says, "Danger. Severe Shock." When it gets up to here, there's "XXX" -- the pornography of power.

(Laughter)

So Milgram asks 40 psychiatrists, "What percent of American citizens would go to the end?" They said only one percent. Because that's sadistic behavior, and we know, psychiatry knows, only one percent of Americans are sadistic. OK. Here's the data. They could not be more wrong. Two thirds go all the way to 450 volts. This was just one study. Milgram did more than 16 studies. And look at this. In study 16, where you see somebody like you go all the way, 90 percent go all the way. In study five, if you see people rebel, 90 percent rebel. What about women? Study 13 -- no different than men. So Milgram is quantifying evil as the willingness of people to blindly obey authority, to go all the way to 450 volts. And it's like a dial on human nature. A dial in a sense that you can make almost everybody totally obedient, down to the majority, down to none.
12:25
So what are the external parallels? For all research is artificial. What's the validity in the real world? 912 American citizens committed suicide or were murdered by family and friends in the Guyana jungle in 1978, because they were blindly obedient to this guy, their pastor -- not their priest -- their pastor, Reverend Jim Jones. He persuaded them to commit mass suicide. And so, he's the modern Lucifer effect, a man of God who becomes the Angel of Death. Milgram's study is all about individual authority to control people. Most of the time, we are in institutions, so the Stanford Prison Study is a study of the power of institutions to influence individual behavior. Interestingly, Stanley Milgram and I were in the same high school class in James Monroe in the Bronx, 1954.
13:13
So this study, which I did with my graduate students, especially Craig Haney -- we also began work with an ad. We didn't have money, so we had a cheap, little ad, but we wanted college students for a study of prison life. 75 people volunteered, took personality tests. We did interviews. Picked two dozen: the most normal, the most healthy. Randomly assigned them to be prisoner and guard. So on day one, we knew we had good apples. I'm going to put them in a bad situation. And secondly, we know there's no difference between the boys who are going to be guards and the boys who are going to be prisoners. The kids who were going to be prisoners, we said, "Wait at home in the dormitories. The study will begin Sunday." We didn't tell them that the city police were going to come and do realistic arrests.
14:22
(Video) Student: A police car pulls up in front, and a cop comes to the front door, and knocks, and says he's looking for me. So they, right there, you know, they took me out the door, they put my hands against the car. It was a real cop car, it was a real policeman, and there were real neighbors in the street, who didn't know that this was an experiment. And there was cameras all around and neighbors all around. They put me in the car, then they drove me around Palo Alto. They took me to the police station, the basement of the police station. Then they put me in a cell. I was the first one to be picked up, so they put me in a cell, which was just like a room with a door with bars on it. You could tell it wasn't a real jail. They locked me in there, in this degrading little outfit. They were taking this experiment too seriously.
15:21
Philip Zimbardo: Here are the prisoners who are going to be dehumanized. They're going to become numbers. Here are the guards with the symbols of power and anonymity. Guards get prisoners to clean the toilet bowls out with their bare hands, to do other humiliating tasks. They strip them naked. They sexually taunt them. They begin to do degrading activities, like having them simulate sodomy. You saw simulating fellatio in soldiers in Abu Ghraib. My guards did it in five days. The stress reaction was so extreme that normal kids we picked because they were healthy had breakdowns within 36 hours. The study ended after six days, because it was out of control. Five kids had emotional breakdowns.
15:58
Does it make a difference if warriors go to battle changing their appearance or not? Does it make a difference if they're anonymous, in how they treat their victims? We know in some cultures, they go to war, they don't change their appearance. In other cultures, they paint themselves like "Lord of the Flies." In some, they wear masks. In many, soldiers are anonymous in uniform. So this anthropologist, John Watson, found 23 cultures that had two bits of data. Do they change their appearance? 15. Do they kill, torture, mutilate? 13. If they don't change their appearance, only one of eight kills, tortures or mutilates. The key is in the red zone. If they change their appearance, 12 of 13 -- that's 90 percent -- kill, torture, mutilate. And that's the power of anonymity.
16:36
So what are the seven social processes that grease the slippery slope of evil? Mindlessly taking the first small step. Dehumanization of others. De-individuation of self. Diffusion of personal responsibility. Blind obedience to authority. Uncritical conformity to group norms. Passive tolerance of evil through inaction or indifference. And it happens when you're in a new or unfamiliar situation. Your habitual response patterns don't work. Your personality and morality are disengaged. "Nothing is easier than to denounce the evildoer; nothing more difficult than understanding him," Dostoevsky tells us. Understanding is not excusing. Psychology is not excuse-iology.
17:12
So social and psychological research reveals how ordinary, good people can be transformed without the drugs. You don't need it. You just need the social-psychological processes. Real-world parallels? Compare this with this. James Schlesinger -- and I'm going to have to end with this -- says, "Psychologists have attempted to understand how and why individuals and groups who usually act humanely can sometimes act otherwise in certain circumstances." That's the Lucifer effect. And he goes on to say, "The landmark Stanford study provides a cautionary tale for all military operations." If you give people power without oversight, it's a prescription for abuse. They knew that, and let that happen. So another report, an investigative report by General Fay, says the system is guilty. And in this report, he says it was the environment that created Abu Ghraib, by leadership failures that contributed to the occurrence of such abuse, and the fact that it remained undiscovered by higher authorities for a long period of time. Those abuses went on for three months. Who was watching the store? The answer is nobody, and, I think, nobody on purpose. He gave the guards permission to do those things, and they knew nobody was ever going to come down to that dungeon. So you need a paradigm shift in all of these areas. The shift is away from the medical model that focuses only on the individual. The shift is toward a public health model that recognizes situational and systemic vectors of disease. Bullying is a disease. Prejudice is a disease. Violence is a disease. And since the Inquisition, we've been dealing with problems at the individual level. And you know what? It doesn't work. Aleksandr Solzhenitsyn says, "The line between good and evil cuts through the heart of every human being." That means that line is not out there. That's a decision that you have to make. That's a personal thing.
18:50
So I want to end very quickly on a positive note: heroism as the antidote to evil, by promoting the heroic imagination, especially in our kids, in our educational system. We want kids to think, "I'm the hero in waiting, waiting for the right situation to come along, and I will act heroically." My whole life is now going to focus away from evil -- that I've been in since I was a kid -- to understanding heroes. The banality of heroism is, it's ordinary people who do heroic deeds. It's the counterpoint to Hannah Arendt's "banality of evil." Our traditional societal heroes are wrong, because they are the exceptions. They organize their whole life around this. That's why we know their names. And our kids' heroes are also wrong models for them, because they have supernatural talents. We want our kids to realize most heroes are everyday people, and the heroic act is unusual. This is Joe Darby. He was the one that stopped those abuses you saw, because when he saw those images, he turned them over to a senior investigating officer. He was a low-level private, and that stopped it. Was he a hero? No. They had to put him in hiding, because people wanted to kill him, and then his mother and his wife. For three years, they were in hiding. This is the woman who stopped the Stanford Prison Study. When I said it got out of control, I was the prison superintendent. I didn't know it was out of control. I was totally indifferent. She came down, saw that madhouse and said, "You know what, it's terrible what you're doing to those boys. They're not prisoners, they're not guards, they're boys, and you are responsible." And I ended the study the next day. The good news is I married her the next year.

(Laughter)

(Applause)
20:25
I just came to my senses, obviously. So situations have the power to do, through -- but the point is, this is the same situation that can inflame the hostile imagination in some of us, that makes us perpetrators of evil, can inspire the heroic imagination in others. It's the same situation. And you're on one side or the other. Most people are guilty of the evil of inaction, because your mother said, "Don't get involved. Mind your own business." And you have to say, "Mama, humanity is my business." So the psychology of heroism is -- we're going to end in a moment -- how do we encourage children in new hero courses, that I'm working with Matt Langdon -- he has a hero workshop -- to develop this heroic imagination, this self-labeling, "I am a hero in waiting," and teach them skills. To be a hero, you have to learn to be a deviant, because you're always going against the conformity of the group. Heroes are ordinary people whose social actions are extraordinary. Who act. The key to heroism is two things. A: you've got to act when other people are passive. B: you have to act socio-centrically, not egocentrically. And I want to end with the story that some of you know, about Wesley Autrey, New York subway hero. Fifty-year-old African-American construction worker. He's standing on a subway in New York. A white guy falls on the tracks. The subway train is coming. There's 75 people there. You know what? They freeze. He's got a reason not to get involved. He's black, the guy's white, and he's got two little kids. Instead, he gives his kids to a stranger, jumps on the tracks, puts the guy between the tracks, lies on him, the subway goes over him. Wesley and the guy -- 20 and a half inches height. The train clearance is 21 inches. A half an inch would have taken his head off. And he said, "I did what anyone could do," no big deal to jump on the tracks. And the moral imperative is "I did what everyone should do." And so one day, you will be in a new situation. Take path one, you're going to be a perpetrator of evil. Evil, meaning you're going to be Arthur Andersen. You're going to cheat, or you're going to allow bullying. Path two, you become guilty of the evil of passive inaction. Path three, you become a hero. The point is, are we ready to take the path to celebrating ordinary heroes, waiting for the right situation to come along to put heroic imagination into action? Because it may only happen once in your life, and when you pass it by, you'll always know, I could have been a hero and I let it pass me by. So the point is thinking it and then doing it. So I want to thank you. Thank you. Thank you. Let's oppose the power of evil systems at home and abroad, and let's focus on the positive. Advocate for respect of personal dignity, for justice and peace, which sadly our administration has not been doing. Thanks so much.

(Applause)
