ABOUT THE SPEAKER
James Surowiecki - Finance journalist
James Surowiecki argues that people, when we act en masse, are smarter than we think. He's the author of The Wisdom of Crowds and writes about finance for the New Yorker.

Why you should listen

James Surowiecki has been reporting on finance and human behavior since the days of the Motley Fool on AOL. He's had a ringside seat for some roller-coaster times in business, but always, his focus has been on regular people -- not titans of Wall Street, but you and me. In 2004, he published The Wisdom of Crowds, an exploration of the hive mind as it plays out in business and in other arenas of life.

Surowiecki is the finance writer for the New Yorker, and writes a lively and funny (and lately, indispensable) blog for newyorker.com, The Balance Sheet. His other books include the wonderfully titled but sadly quite relevant Best Business Crime Writing of 2003.

TED2005

James Surowiecki: The power and the danger of online crowds

510,081 views

James Surowiecki pinpoints the moment when social media became an equal player in the world of news-gathering: the 2004 Asian tsunami, when amateur video, blogs, IMs and texts carried the news -- and preserved moving personal stories from the tragedy.

00:19
This was in an area called Wellawatta, a prime residential area in Colombo. We stood on the railroad tracks that ran between my friend's house and the beach. The tracks are elevated about eight feet from the waterline normally, but at that point the water had receded to a level three or four feet below normal. I'd never seen the reef here before. There were fish caught in rock pools left behind by the receding water.

00:43
Some children jumped down and ran to the rock pools with bags. They were trying to catch fish. No one realized that this was a very bad idea. The people on the tracks just continued to watch them. I turned around to check on my friend's house. Then someone on the tracks screamed. Before I could turn around, everyone on the tracks was screaming and running. The water had started coming back. It was foaming over the reef.

01:08
The children managed to run back onto the tracks. No one was lost there. But the water continued to climb. In about two minutes, it had reached the level of the railroad tracks and was coming over it. We had run about 100 meters by this time. It continued to rise.

01:25
I saw an old man standing at his gate, knee-deep in water, refusing to move. He said he'd lived his whole life there by the beach, and that he would rather die there than run. A boy broke away from his mother to run back into his house to get his dog, who was apparently afraid. An old lady, crying, was carried out of her house and up the road by her son.

01:47
The slum built on the railroad reservation between the sea and the railroad tracks was completely swept away. Since this was a high-risk location, the police had warned the residents, and no one was there when the water rose. But they had not had any time to evacuate any belongings. For hours afterwards, the sea was strewn with bits of wood for miles around -- all of this was from the houses in the slum. When the waters subsided, it was as if it had never existed.
02:16
This may seem hard to believe -- unless you've been reading lots and lots of news reports -- but in many places, after the tsunami, villagers were still terrified. When what was a tranquil sea swallows up people, homes and long-tail boats -- mercilessly, without warning -- and no one can tell you anything reliable about whether another one is coming, I'm not sure you'd want to calm down either.

02:36
One of the scariest things about the tsunami that I've not seen mentioned is the complete lack of information. This may seem minor, but it is terrifying to hear rumor after rumor after rumor that another tidal wave, bigger than the last, will be coming at exactly 1 p.m., or perhaps tonight, or perhaps ...

02:53
You don't even know if it is safe to go back down to the water, to catch a boat to the hospital. We think that Phi Phi hospital was destroyed. We think this boat is going to Phuket hospital, but if it's too dangerous to land at its pier, then perhaps it will go to Krabi instead, which is more protected. We don't think another wave is coming right away.

03:12
At the Phi Phi Hill Resort, I was tucked into the corner furthest away from the television, but I strained to listen for information. They reported that there was an 8.5 magnitude earthquake in Sumatra, which triggered the massive tsunami. Having this news was comforting in some small way to understand what had just happened to us. However, the report focused on what had already occurred and offered no information on what to expect now. In general, everything was merely hearsay and rumor, and not a single person I spoke to for over 36 hours knew anything with any certainty.
03:43
Those were two accounts of the Asian tsunami from two Internet blogs that essentially sprang up after it occurred. I'm now going to show you two video segments from the tsunami that also were shown on blogs. I should warn you, they're pretty powerful. One from Thailand, and the second one from Phuket as well.

04:05
(Screaming)

04:19
Voice 1: It's coming in. It's coming again.
Voice 2: It's coming again?
Voice 1: Yeah. It's coming again.
Voice 2: Come get inside here.
Voice 1: It's coming again. Voice 2: New wave?
Voice 1: It's coming again. New wave!
[Unclear]

04:44
(Screaming)

05:05
They called me out here.
05:21
James Surowiecki: Phew. Those were both on this site: waveofdestruction.org. In the world of blogs, there's going to be before the tsunami and after the tsunami, because one of the things that happened in the wake of the tsunami was that, although initially -- that is, in that first day -- there was actually a kind of dearth of live reporting, there was a dearth of live video, and some people complained about this. They said, "The blogsters let us down." What became very clear was that, within a few days, the outpouring of information was immense, and we got a complete and powerful picture of what had happened in a way that we never had been able to get before.

06:01
And what you had was a group of essentially unorganized, unconnected writers, video bloggers, etc., who were able to come up with a collective portrait of a disaster that gave us a much better sense of what it was like to actually be there than the mainstream media could give us. And so in some ways the tsunami can be seen as a sort of seminal moment, a moment in which the blogosphere came, to a certain degree, of age.
06:28
Now, I'm going to move from this kind of -- the sublime in the traditional sense of the word, that is to say, awe-inspiring, terrifying -- to the somewhat more mundane. Because when we think about blogs, I think for most of us who are concerned about them, we're primarily concerned with things like politics, technology, etc. And I want to ask three questions in this talk, in the 10 minutes that remain, about the blogosphere.
06:53
The first one is: What does it tell us about our ideas, about what motivates people to do things? The second is: Do blogs genuinely have the possibility of accessing a kind of collective intelligence that has previously remained, for the most part, untapped? And then the third part is: What are the potential problems, or the dark side, of blogs as we know them?
07:17
OK, the first question: What do they tell us about why people do things? One of the fascinating things about the blogosphere specifically, and, of course, the Internet more generally -- and it's going to seem like a very obvious point, but I think it is an important one to think about -- is that the people who are generating these enormous reams of content every day, who are spending enormous amounts of time organizing, linking, commenting on the substance of the Internet, are doing so primarily for free. They are not getting paid for it in any way other than in the attention and, to some extent, the reputational capital that they gain from doing a good job.
07:55
And this is -- at least, to a traditional economist -- somewhat remarkable, because the traditional account of economic man would say that, basically, you do things for a concrete reward, primarily financial. But instead, what we're finding on the Internet -- and one of the great geniuses of it -- is that people have found a way to work together without any money involved at all. They have come up with, in a sense, a different method for organizing activity.
08:24
The Yale Law professor Yochai Benkler, in an essay called "Coase's Penguin," talks about this open-source model, which we're familiar with from Linux, as being potentially applicable in a whole host of situations. And, you know, if you think about this with the tsunami, what you have is essentially a kind of an army of local journalists, who are producing enormous amounts of material for no reason other than to tell their stories. That's a very powerful idea, and it's a very powerful reality. And it's one that offers really interesting possibilities for organizing a whole host of activities down the road.
08:59
So, I think the first thing that the blogosphere tells us is that we need to expand our idea of what counts as rational, and we need to expand our simple equation of value equals money -- the idea that you have to pay for it to be good -- because in fact you can end up with collectively really brilliant products without any money at all changing hands.
09:17
There are a few bloggers -- somewhere maybe around 20, now -- who do, in fact, make some kind of money, and a few who are actually trying to make a full-time living out of it, but the vast majority of them are doing it because they love it or they love the attention, or whatever it is. So, Howard Rheingold has written a lot about this and, I think, is writing about this more, but this notion of voluntary cooperation is an incredibly powerful one, and one worth thinking about.
09:40
The second question is: What does the blogosphere actually do for us, in terms of accessing collective intelligence? You know, as Chris mentioned, I wrote a book called "The Wisdom of Crowds." And the premise of "The Wisdom of Crowds" is that, under the right conditions, groups can be remarkably intelligent. And they can actually often be smarter than even the smartest person within them.
10:02
The simplest example of this is if you ask a group of people to do something like guess how many jellybeans are in a jar. If I had a jar of jellybeans and I asked you all to guess how many jellybeans were in that jar, your average guess would be remarkably good. It would probably be within three to five percent of the number of beans in the jar, and it would be better than 90 to 95 percent of you. There may be one or two of you who are brilliant jellybean guessers, but for the most part the group's guess would be better than just about all of you.
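A minimal simulation sketch of the jellybean example (not from the talk; the bean count, audience size, and noise model below are illustrative assumptions):

```python
import random

# Toy illustration of the jar-guessing example: every individual guess is
# noisy, yet the average of the guesses lands close to the true count.
random.seed(42)

TRUE_COUNT = 850   # assumed number of beans in the jar
N_GUESSERS = 200   # assumed audience size

# Each person guesses the true count distorted by a large random error.
guesses = [TRUE_COUNT * random.uniform(0.5, 1.5) for _ in range(N_GUESSERS)]

crowd_estimate = sum(guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_COUNT) / TRUE_COUNT

# Count how many individuals came closer to the truth than the crowd average.
better_individuals = sum(
    1 for g in guesses
    if abs(g - TRUE_COUNT) < abs(crowd_estimate - TRUE_COUNT)
)

print(f"crowd estimate: {crowd_estimate:.0f} (off by {crowd_error:.1%})")
print(f"individuals beating the crowd average: {better_individuals} of {N_GUESSERS}")
```

Run with different seeds and the crowd average typically sits within a few percent of the true count, while only a handful of individual guesses do better -- the pattern the talk describes.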
10:34
And what's fascinating is that you can see this phenomenon at work in many more complicated situations. For instance, if you look at the odds on horses at a racetrack, they predict almost perfectly how likely a horse is to win. In a sense, the group of bettors at the racetrack is forecasting the future, in probabilistic terms.
10:55
You know, if you think about something like Google, which essentially is relying on the collective intelligence of the Web to seek out those sites that have the most valuable information -- we know that Google does an exceptionally good job of doing that, and it does that because, collectively, this disorganized thing we call the "World Wide Web" actually has a remarkable order, or a remarkable intelligence in it.
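The talk doesn't name an algorithm, but link analysis of the kind PageRank formalizes is the standard way to mine that order out of the Web's link structure. A minimal power-iteration sketch on a made-up four-page graph (the graph, damping factor, and iteration count are assumptions for illustration):

```python
# PageRank-style power iteration on a tiny, made-up link graph.
# Pages that the rest of the graph points to (directly or indirectly)
# end up with the highest scores -- order extracted purely from links.
links = {
    "a": ["b", "c"],   # page "a" links to pages "b" and "c"
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
pages = list(links)
damping = 0.85                                   # assumed damping factor
rank = {p: 1.0 / len(pages) for p in pages}      # start with equal scores

for _ in range(50):                              # iterate until scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:                  # each page passes its score
            new_rank[target] += share            # along its outgoing links
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))
```

In this toy graph, page "c" ends up ranked highest simply because the most other pages point to it, directly or through chains of links -- no editor ever had to declare it valuable.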
11:18
And this, I think, is one of the real promises of the blogosphere. Dan Gillmor -- whose book "We the Media" is included in the gift pack -- has made this point by saying that, as a writer, he's recognized that his readers know more than he does. And this is a very challenging idea. It's a very challenging idea to mainstream media. It's a very challenging idea to anyone who has invested an enormous amount of time and expertise, and who has a lot of energy invested in the notion that he or she knows better than everyone else.
11:49
But what the blogosphere offers is the possibility of getting at the kind of collective, distributed intelligence that is out there, and that we know is available to us if we can just figure out a way of accessing it. Each blog post, each blog commentary may not, in and of itself, be exactly what we're looking for, but collectively the judgment of those people posting, those people linking, more often than not is going to give you a very interesting and enormously valuable picture of what's going on.
12:22
So, that's the positive side of it. That's the positive side of what is sometimes called participatory journalism or citizen journalism, etc. -- that, in fact, we are giving a voice to people who have never been able to talk before, and we're able to access information that has always been there but has essentially gone untapped.
12:43
But there is a dark side to this, and that's what I want to spend the last part of my talk on. One of the things that happens if you spend a lot of time on the Internet, and you spend a lot of time thinking about the Internet, is that it is very easy to fall in love with the Internet. It is very easy to fall in love with the decentralized, bottom-up structure of the Internet. It is very easy to think that networks are necessarily good things -- that being linked from one place to another, that being tightly linked in a group, is a very good thing. And much of the time it is.
13:15
But there's also a downside to this -- a kind of dark side, in fact -- and that is that the more tightly linked we've become to each other, the harder it is for each of us to remain independent. One of the fundamental characteristics of a network is that, once you are linked in the network, the network starts to shape your views and starts to shape your interactions with everybody else. That's one of the things that defines what a network is. A network is not just the product of its component parts. It is something more than that. It is, as Steven Johnson has talked about, an emergent phenomenon.
13:51
Now, this has all these benefits: it's very beneficial in terms of the efficiency of communicating information; it gives you access to a whole host of people; it allows people to coordinate their activities in very good ways. But the problem is that groups are only smart when the people in them are as independent as possible. This is the paradox of the wisdom of crowds, or the paradox of collective intelligence: what it requires is actually a form of independent thinking. And networks make it harder for people to do that, because they drive attention to the things that the network values.
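A toy sketch of why independence matters (my illustration, not the speaker's; the numbers and the "herding" weight are assumptions). When errors are independent they largely cancel in the average; when everyone partly copies the same prominent but wrong signal, the shared error never cancels:

```python
import random

# Compare a crowd of independent guessers with a crowd that partly copies
# a shared public signal (say, whatever the loudest site is claiming).
random.seed(0)

TRUE_VALUE = 100.0
N_PEOPLE = 500
PUBLIC_SIGNAL = 130.0   # a prominent but mistaken estimate everyone can see

def crowd_error(herding_weight: float) -> float:
    """Error of the crowd's mean guess when each person mixes private
    information with the shared public signal by the given weight."""
    guesses = []
    for _ in range(N_PEOPLE):
        private = TRUE_VALUE + random.gauss(0, 20)   # independent private noise
        guesses.append((1 - herding_weight) * private + herding_weight * PUBLIC_SIGNAL)
    return abs(sum(guesses) / N_PEOPLE - TRUE_VALUE)

print("independent crowd, error:", round(crowd_error(0.0), 1))
print("herding crowd, error:    ", round(crowd_error(0.7), 1))
```

With independent guesses the private errors average out; once most of each guess is the shared signal, the crowd mean is dragged toward the signal's mistake -- one way to read the claim that the network shapes your views at the cost of collective accuracy.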
14:28
So, one of the phenomena that's very clear in the blogosphere is that once a meme, once an idea gets going, it is very easy for people to just sort of pile on, because other people have, say, a link. People have linked to it, and so other people in turn link to it, etc., etc. And that phenomenon of piling on the existing links is one that is characteristic of the blogosphere, particularly of the political blogosphere, and it is one that essentially throws off this beautiful, decentralized, bottom-up intelligence that blogs can manifest in the right conditions.
15:06
The metaphor that I like to use is the metaphor of the circular mill. A lot of people talk about ants. You know, this is a conference inspired by nature. When we talk about bottom-up, decentralized phenomena, the ant colony is the classic metaphor, because no individual ant knows what it's doing, but collectively ants are able to reach incredibly intelligent decisions. They're able to guide their traffic with remarkable speed.
15:36
So, the ant colony is a great model: you have all these little parts that collectively add up to a great thing. But we know that occasionally ants go astray, and what happens is that, if army ants are wandering around and they get lost, they start to follow a simple rule -- just do what the ant in front of you does. And what happens is that the ants eventually end up in a circle. And there's this famous example of one that was 1,200 feet long and lasted for two days, and the ants just kept marching around and around in a circle until they died.
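A toy model of that rule (my illustration; the track and column size are made up). Each ant's only move is to step to where the ant ahead of it just was; once the leader ends up following the rear ant, the same states come around again and again, because nothing in the rule can break the loop:

```python
# "Do what the ant in front of you does": once the column closes on itself,
# the positions simply rotate and the same state eventually repeats.
track = [f"spot{i}" for i in range(8)]   # a closed loop of marching spots
positions = list(track)                  # ant i stands at positions[i]; ant 0 leads

seen_states = set()
for step in range(100):
    state = tuple(positions)
    if state in seen_states:
        print(f"state repeated at step {step}: the mill just cycles from here on")
        break
    seen_states.add(state)
    # The leader now follows the rear ant; everyone else follows the ant ahead.
    positions = [positions[-1]] + positions[:-1]
```

The local rule never produces new information and has no exit condition, which is the point of the circular-mill warning.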
16:10
And that, I think, is the sort of thing to watch out for. That's the thing we have to fear: that we're just going to keep marching around and around until we die.
16:19
Now, I want to connect this back, though, to the tsunami, because one of the great things about the tsunami -- in terms of the blogosphere's coverage, not in terms of the tsunami itself -- is that it really did represent a genuine bottom-up phenomenon. You saw sites that had never existed before getting huge amounts of traffic. You saw people being able to offer up their independent points of view in a way that they hadn't before. There, you really did see the intelligence of the Web manifest itself.
16:46
So, that's the upside. The circular mill is the downside. And I think that the former is what we really need to strive for. Thank you very much.

(Applause)
