ABOUT THE SPEAKER
Edward Tenner - Historian of technology and culture
Edward Tenner is an independent writer, speaker, and editor analyzing the cultural aspects of technological change.

Why you should listen

Edward Tenner is an independent writer and speaker on the history of technology and the unintended consequences of innovation. He writes for The Atlantic on history and current events, and was a founding advisor of Smithsonian's Lemelson Center, where he remains a senior research associate. He was executive editor for physical science and history at Princeton University Press, has been a visiting lecturer at Princeton, and has held visiting research positions at the Institute for Advanced Study, the Woodrow Wilson International Center for Scholars, and the Princeton Center for Information Technology Policy. He is now a visiting scholar in the Rutgers School of Communication and Information and an affiliate of the Center for Arts and Cultural Policy of Princeton's Woodrow Wilson School.

Edward Tenner: Unintended consequences
TED2011 · 873,460 views

Every new invention changes the world -- in ways both intentional and unexpected. Historian Edward Tenner tells stories that illustrate the under-appreciated gap between our ability to innovate and our ability to foresee the consequences.

00:15
I didn't always love unintended consequences, but I've really learned to appreciate them. I've learned that they're really the essence of what makes for progress, even when they seem to be terrible. And I'd like to review just how unintended consequences play the part that they do.

00:35
Let's go to 40,000 years before the present, to the time of the cultural explosion, when music, art, technology, so many of the things that we're enjoying today, so many of the things that are being demonstrated at TED were born. And the anthropologist Randall White has made a very interesting observation: that if our ancestors 40,000 years ago had been able to see what they had done, they wouldn't have really understood it. They were responding to immediate concerns. They were making it possible for us to do what they do, and yet, they didn't really understand how they did it.

01:26
Now let's advance to 10,000 years before the present. And this is when it really gets interesting. What about the domestication of grains? What about the origins of agriculture? What would our ancestors 10,000 years ago have said if they really had technology assessment? And I could just imagine the committees reporting back to them on where agriculture was going to take humanity, at least in the next few hundred years. It was really bad news. First of all, worse nutrition, maybe shorter life spans. It was simply awful for women. The skeletal remains from that period have shown that they were grinding grain morning, noon and night. And politically, it was awful. It was the beginning of a much higher degree of inequality among people. If there had been rational technology assessment then, I think they very well might have said, "Let's call the whole thing off."

02:28
Even now, our choices are having unintended effects. Historically, for example, chopsticks -- according to one Japanese anthropologist who wrote a dissertation about it at the University of Michigan -- resulted in long-term changes in the dentition, in the teeth, of the Japanese public. And we are also changing our teeth right now. There is evidence that the human mouth and teeth are growing smaller all the time. That's not necessarily a bad unintended consequence. But I think from the point of view of a Neanderthal, there would have been a lot of disapproval of the wimpish choppers that we now have. So these things are kind of relative to where you or your ancestors happen to stand.

03:14
In the ancient world there was a lot of respect for unintended consequences, and there was a very healthy sense of caution, reflected in the Tree of Knowledge, in Pandora's Box, and especially in the myth of Prometheus that's been so important in recent metaphors about technology. And that's all very true. The physicians of the ancient world -- especially the Egyptians, who started medicine as we know it -- were very conscious of what they could and couldn't treat. And the translations of the surviving texts say, "This I will not treat. This I cannot treat." They were very conscious. So were the followers of Hippocrates. The Hippocratic manuscripts also -- repeatedly, according to recent studies -- show how important it is not to do harm.

04:04
More recently, Harvey Cushing, who really developed neurosurgery as we know it, who changed it from a field of medicine that had a majority of deaths resulting from surgery to one in which there was a hopeful outlook, he was very conscious that he was not always going to do the right thing. But he did his best, and he kept meticulous records that let him transform that branch of medicine.

04:32
Now if we look forward a bit to the 19th century, we find a new style of technology. What we find is no longer simple tools, but systems. We find more and more complex arrangements of machines that make it harder and harder to diagnose what's going on. And the first people who saw that were the telegraphers of the mid-19th century, who were the original hackers. Thomas Edison would have been very, very comfortable in the atmosphere of a software firm today. And these hackers had a word for those mysterious bugs in telegraph systems that they called bugs. That was the origin of the word "bug."

05:19
This consciousness, though, was a little slow to seep through the general population, even people who were very, very well informed. Samuel Clemens, Mark Twain, was a big investor in the most complex machine of all times -- at least until 1918 -- registered with the U.S. Patent Office. That was the Paige typesetter. The Paige typesetter had 18,000 parts. The patent had 64 pages of text and 271 figures. It was such a beautiful machine because it did everything that a human being did in setting type -- including returning the type to its place, which was a very difficult thing. And Mark Twain, who knew all about typesetting, really was smitten by this machine. Unfortunately, he was smitten in more ways than one, because it made him bankrupt, and he had to tour the world speaking to recoup his money. And this was an important thing about 19th century technology, that all these relationships among parts could make the most brilliant idea fall apart, even when judged by the most expert people.

06:29
Now there is something else, though, in the early 20th century that made things even more complicated. And that was that safety technology itself could be a source of danger. The lesson of the Titanic, for a lot of the contemporaries, was that you must have enough lifeboats for everyone on the ship. And this was the result of the tragic loss of lives of people who could not get into them. However, there was another case, the Eastland, a ship that capsized in Chicago Harbor in 1915, and it killed 841 people -- that was 14 more than the passenger toll of the Titanic. The reason for it, in part, was the extra lifeboats that were added that made this already unstable ship even more unstable. And that again proves that when you're talking about unintended consequences, it's not that easy to know the right lessons to draw. It's really a question of the system, how the ship was loaded, the ballast and many other things.

07:35
So the 20th century, then, saw how much more complex reality was, but it also saw a positive side. It saw that invention could actually benefit from emergencies. It could benefit from tragedies. And my favorite example of that -- which is not really widely known as a technological miracle, but it may be one of the greatest of all times -- was the scaling up of penicillin in the Second World War. Penicillin was discovered in 1928, but even by 1940, no commercially and medically useful quantities of it were being produced. A number of pharmaceutical companies were working on it. They were working on it independently, and they weren't getting anywhere. And the Government Research Bureau brought representatives together and told them that this is something that has to be done. And not only did they do it, but within two years, they scaled up penicillin from preparation in one-liter flasks to 10,000-gallon vats. That was how quickly penicillin was produced and became one of the greatest medical advances of all time.

08:52
In the Second World War, too, the existence of solar radiation was demonstrated by studies of interference that was detected by the radar stations of Great Britain. So there were benefits in calamities -- benefits to pure science, as well as to applied science and medicine.

09:15
Now when we come to the period after the Second World War, unintended consequences get even more interesting. And my favorite example of that occurred beginning in 1976, when it was discovered that the bacteria causing Legionnaires' disease had always been present in natural waters, but it was the precise temperature of the water in heating, ventilating and air conditioning systems that raised the right temperature for the maximum reproduction of Legionella bacillus. Well, technology to the rescue. So chemists got to work, and they developed a bactericide that became widely used in those systems.

10:00
But something else happened in the early 1980s, and that was that there was a mysterious epidemic of failures of tape drives all over the United States. And IBM, which made them, just didn't know what to do. They commissioned a group of their best scientists to investigate, and what they found was that all these tape drives were located near ventilation ducts. What happened was the bactericide was formulated with minute traces of tin. And these tin particles were deposited on the tape heads and were crashing the tape heads. So they reformulated the bactericide. But what's interesting to me is that this was the first case of a mechanical device suffering, at least indirectly, from a human disease. So it shows that we're really all in this together.

(Laughter)

10:57
In fact, it also shows something interesting, that although our capabilities and technology have been expanding geometrically, unfortunately, our ability to model their long-term behavior, which has also been increasing, has been increasing only arithmetically. So one of the characteristic problems of our time is how to close this gap between capabilities and foresight.

11:21
One other very positive consequence of 20th century technology, though, was the way in which other kinds of calamities could lead to positive advances. There are two historians of business at the University of Maryland, Brent Goldfarb and David Kirsch, who have done some extremely interesting work, much of it still unpublished, on the history of major innovations. They have combined the list of major innovations, and they've discovered that the greatest number, the greatest decade, for fundamental innovations, as reflected in all of the lists that others have made -- a number of lists that they have merged -- was the Great Depression. And nobody knows just why this was so, but one story can reflect something of it. It was the origin of the Xerox copier, which celebrated its 50th anniversary last year. And Chester Carlson, the inventor, was a patent attorney. He really was not intending to work in patent research, but he couldn't really find an alternative technical job. So this was the best job he could get.

12:38
He was upset by the low quality and high cost of existing patent reproductions, and so he started to develop a system of dry photocopying, which he patented in the late 1930s -- and which became the first dry photocopier that was commercially practical in 1960. So we see that sometimes, as a result of these dislocations, as a result of people leaving their original intended career and going into something else where their creativity could make a difference, that depressions and all kinds of other unfortunate events can have a paradoxically stimulating effect on creativity.

13:25
What does this mean? It means, I think, that we're living in a time of unexpected possibilities. Think of the financial world, for example. The mentor of Warren Buffett, Benjamin Graham, developed his system of value investing as a result of his own losses in the 1929 crash. And he published that book in the early 1930s, and the book still exists in further editions and is still a fundamental textbook. So many important creative things can happen when people learn from disasters.

14:02
Now think of the large and small plagues that we have now -- bed bugs, killer bees, spam -- and it's very possible that the solutions to those will really extend well beyond the immediate question. If we think, for example, of Louis Pasteur, who in the 1860s was asked to study the diseases of silkworms for the silk industry, and his discoveries were really the beginning of the germ theory of disease. So very often, some kind of disaster -- sometimes the consequence, for example, of over-cultivation of silkworms, which was a problem in Europe at the time -- can be the key to something much bigger.

14:46
So this means that we need to take a different view of unintended consequences. We need to take a really positive view. We need to see what they can do for us. We need to learn from those figures that I mentioned. We need to learn, for example, from Dr. Cushing, who killed patients in the course of his early operations. He had to have some errors. He had to have some mistakes. And he learned meticulously from his mistakes. And as a result, when we say, "This isn't brain surgery," that pays tribute to how difficult it was for anyone to learn from their mistakes in a field of medicine that was considered so discouraging in its prospects. And we can also remember how the pharmaceutical companies were willing to pool their knowledge, to share their knowledge, in the face of an emergency, which they hadn't really been for years and years. They might have been able to do it earlier.

15:47
The message, then, for me, about unintended consequences is chaos happens; let's make better use of it. Thank you very much.

(Applause)
