ABOUT THE SPEAKER
Andreas Ekström - Author and journalist
Andreas Ekström describes the power structures of the digital revolution.

Why you should listen

Andreas Ekström is a staff writer at Sydsvenskan, a daily morning paper in Malmö, Sweden.

His passion is educating for digital equality, and he has a vision for a world in which we share the wealth -- not only financially, but also in terms of knowledge and power. Andreas is the author of six books, a columnist and a commentator, and he often lectures and leads seminars on the digital revolution.

TEDxOslo

Andreas Ekström: The moral bias behind your search results

1,550,129 views

Search engines have become our most trusted sources of information and arbiters of truth. But can we ever get an unbiased search result? Swedish author and journalist Andreas Ekström argues that such a thing is a philosophical impossibility. In this thoughtful talk, he calls on us to strengthen the bonds between technology and the humanities, and he reminds us that behind every algorithm is a set of personal beliefs that no code can ever completely eradicate.


00:13 So whenever I visit a school and talk to students,
00:15 I always ask them the same thing:
00:18 Why do you Google?
00:20 Why is Google the search engine of choice for you?
00:24 Strangely enough, I always get the same three answers.
00:27 One, "Because it works,"
00:29 which is a great answer; that's why I Google, too.
00:32 Two, somebody will say,
00:34 "I really don't know of any alternatives."
00:37 It's not an equally great answer and my reply to that is usually,
00:40 "Try to Google the word 'search engine,'
00:42 you may find a couple of interesting alternatives."
00:45 And last but not least, thirdly,
00:47 inevitably, one student will raise her or his hand and say,
00:50 "With Google, I'm certain to always get the best, unbiased search result."
00:57 Certain to always get the best, unbiased search result.
01:05 Now, as a man of the humanities,
01:07 albeit a digital humanities man,
01:09 that just makes my skin curl,
01:11 even if I, too, realize that that trust, that idea of the unbiased search result
01:16 is a cornerstone in our collective love for and appreciation of Google.
01:20 I will show you why that, philosophically, is almost an impossibility.
01:24 But let me first elaborate, just a little bit, on a basic principle
01:28 behind each search query that we sometimes seem to forget.
01:31 So whenever you set out to Google something,
01:33 start by asking yourself this: "Am I looking for an isolated fact?"
01:38 What is the capital of France?
01:41 What are the building blocks of a water molecule?
01:43 Great -- Google away.
01:46 There's not a group of scientists who are this close to proving
01:49 that it's actually London and H3O.
01:51 You don't see a big conspiracy among those things.
01:53 We agree, on a global scale,
01:55 what the answers are to these isolated facts.
01:58 But if you complicate your question just a little bit and ask something like,
02:03 "Why is there an Israeli-Palestinian conflict?"
02:06 You're not exactly looking for a singular fact anymore,
02:09 you're looking for knowledge,
02:11 which is something way more complicated and delicate.
02:14 And to get to knowledge,
02:16 you have to bring 10 or 20 or 100 facts to the table
02:19 and acknowledge them and say, "Yes, these are all true."
02:22 But because of who I am,
02:23 young or old, black or white, gay or straight,
02:26 I will value them differently.
02:27 And I will say, "Yes, this is true,
02:29 but this is more important to me than that."
02:31 And this is where it becomes interesting,
02:33 because this is where we become human.
02:35 This is when we start to argue, to form society.
02:38 And to really get somewhere, we need to filter all our facts here,
02:42 through friends and neighbors and parents and children
02:44 and coworkers and newspapers and magazines,
02:46 to finally be grounded in real knowledge,
02:50 which is something that a search engine is a poor help to achieve.
02:55 So, I promised you an example just to show you why it's so hard
03:01 to get to the point of true, clean, objective knowledge --
03:05 as food for thought.
03:06 I will conduct a couple of simple queries, search queries.
03:10 We'll start with "Michelle Obama,"
03:14 the First Lady of the United States.
03:16 And we'll click for pictures.
03:19 It works really well, as you can see.
03:21 It's a perfect search result, more or less.
03:24 It's just her in the picture, not even the President.
03:27 How does this work?
03:29 Quite simple.
03:31 Google uses a lot of smartness to achieve this, but quite simply,
03:34 they look at two things more than anything.
03:36 First, what does it say in the caption under the picture on each website?
03:41 Does it say "Michelle Obama" under the picture?
03:43 Pretty good indication it's actually her on there.
03:46 Second, Google looks at the picture file,
03:48 the name of the file as such uploaded to the website.
03:51 Again, is it called "MichelleObama.jpeg"?
03:54 Pretty good indication it's not Clint Eastwood in the picture.
03:57 So, you've got those two and you get a search result like this -- almost.
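
The two signals described above, the caption under the picture and the name of the picture file, are simple enough to sketch in code. The toy Python ranker below is a minimal illustration of that idea only, not Google's actual algorithm; the record fields, the equal weighting of the two signals, and all example values are assumptions made for the sketch.

    import os
    from dataclasses import dataclass

    @dataclass
    class ImageRecord:
        url: str        # page the image was found on
        filename: str   # name of the uploaded file, e.g. "MichelleObama.jpeg"
        caption: str    # text under the picture on that page

    def relevance(query: str, image: ImageRecord) -> int:
        """Score an image against a query using the two signals from the talk."""
        score = 0
        # Signal 1: does the caption under the picture name the query?
        if query.lower() in image.caption.lower():
            score += 1
        # Signal 2: is the file itself named after the query?
        stem, _ = os.path.splitext(image.filename)
        if query.lower().replace(" ", "") in stem.lower():
            score += 1
        return score

    def image_search(query: str, index: list[ImageRecord]) -> list[ImageRecord]:
        """Return the indexed images, best match first."""
        return sorted(index, key=lambda img: relevance(query, img), reverse=True)

Note that both signals are controlled entirely by whoever publishes the page, which is what makes the manipulation described next possible.
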
04:02 Now, in 2009, Michelle Obama was the victim of a racist campaign,
04:08 where people set out to insult her through her search results.
04:13 There was a picture distributed widely over the Internet
04:16 where her face was distorted to look like a monkey.
04:18 And that picture was published all over.
04:22 And people published it very, very purposefully,
04:25 to get it up there in the search results.
04:27 They made sure to write "Michelle Obama" in the caption
04:30 and they made sure to upload the picture as "MichelleObama.jpeg," or the like.
04:34 You get why -- to manipulate the search result.
04:37 And it worked, too.
04:38 So when you picture-Googled for "Michelle Obama" in 2009,
04:41 that distorted monkey picture showed up among the first results.
04:44 Now, the results are self-cleansing,
04:48 and that's sort of the beauty of it,
04:50 because Google measures relevance every hour, every day.
04:53 However, Google didn't settle for that this time,
04:56 they just thought, "That's racist and it's a bad search result
04:59 and we're going to go back and clean that up manually.
05:02 We are going to write some code and fix it,"
05:05 which they did.
05:07 And I don't think anyone in this room thinks that was a bad idea.
05:11 Me neither.
05:14 But then, a couple of years go by,
05:17 and the world's most-Googled Anders,
05:20 Anders Behring Breivik,
05:23 did what he did.
05:24 This is July 22 in 2011,
05:26 and a terrible day in Norwegian history.
05:29 This man, a terrorist, blew up a couple of government buildings
05:33 walking distance from where we are right now in Oslo, Norway
05:36 and then he traveled to the island of Utøya
05:38 and shot and killed a group of kids.
05:41 Almost 80 people died that day.
05:44 And a lot of people would describe this act of terror as two steps,
05:48 that he did two things: he blew up the buildings and he shot those kids.
05:52 It's not true.
05:54 It was three steps.
05:56 He blew up those buildings, he shot those kids,
05:58 and he sat down and waited for the world to Google him.
06:03 And he prepared all three steps equally well.
06:06 And if there was somebody who immediately understood this,
06:09 it was a Swedish web developer,
06:10 a search engine optimization expert in Stockholm, named Nikke Lindqvist.
06:14 He's also a very political guy
06:16 and he was right out there in social media, on his blog and Facebook.
06:19 And he told everybody,
06:20 "If there's something that this guy wants right now,
06:23 it's to control the image of himself.
06:26 Let's see if we can distort that.
06:29 Let's see if we, in the civilized world, can protest against what he did
06:33 through insulting him in his search results."
06:36 And how?
06:38 He told all of his readers the following,
06:40 "Go out there on the Internet,
06:42 find pictures of dog poop on sidewalks --
06:46 find pictures of dog poop on sidewalks --
06:48 publish them in your feeds, on your websites, on your blogs.
06:52 Make sure to write the terrorist's name in the caption,
06:55 make sure to name the picture file "Breivik.jpeg."
06:59 Let's teach Google that that's the face of the terrorist."
07:05 And it worked.
07:07 Two years after that campaign against Michelle Obama,
07:10 this manipulation campaign against Anders Behring Breivik worked.
07:14 If you picture-Googled for him weeks after the July 22 events from Sweden,
07:18 you'd see that picture of dog poop high up in the search results,
07:22 as a little protest.
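
Measured against the toy ranker sketched earlier, the campaign's recipe amounts to maximizing both signals at once. A hypothetical run, with invented records purely for illustration:

    # Reuses ImageRecord, relevance and image_search from the earlier sketch.
    index = [
        ImageRecord("https://example.org/news-article", "IMG_4711.jpeg",
                    "Police handout photo of the perpetrator"),
        ImageRecord("https://example.org/protest-blog", "Breivik.jpeg",
                    "Breivik"),  # a dog-poop picture, labeled per the campaign
    ]

    for image in image_search("Breivik", index):
        print(relevance("Breivik", image), image.url)

    # The deliberately mislabeled protest picture matches both signals and
    # scores 2, outranking the genuine news photo, which scores 0.
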
07:25 Strangely enough, Google didn't intervene this time.
07:30 They did not step in and manually clean those search results up.
07:35 So the million-dollar question,
07:37 is there anything different between these two happenings here?
07:41 Is there anything different between what happened to Michelle Obama
07:44 and what happened to Anders Behring Breivik?
07:46 Of course not.
07:48 It's the exact same thing,
07:50 yet Google intervened in one case and not in the other.
07:53 Why?
07:55 Because Michelle Obama is an honorable person, that's why,
07:58 and Anders Behring Breivik is a despicable person.
08:02 See what happens there?
08:03 An evaluation of a person takes place
08:06 and there's only one power-player in the world
08:10 with the authority to say who's who.
08:13 "We like you, we dislike you.
08:15 We believe in you, we don't believe in you.
08:17 You're right, you're wrong. You're true, you're false.
08:20 You're Obama, and you're Breivik."
08:22 That's power if I ever saw it.
08:27 So I'm asking you to remember that behind every algorithm
08:30 is always a person,
08:32 a person with a set of personal beliefs
08:35 that no code can ever completely eradicate.
08:37 And my message goes out not only to Google,
08:40 but to all believers in the faith of code around the world.
08:43 You need to identify your own personal bias.
08:46 You need to understand that you are human
08:48 and take responsibility accordingly.
08:51 And I say this because I believe we've reached a point in time
08:54 when it's absolutely imperative
08:56 that we tie those bonds together again, tighter:
08:59 the humanities and the technology.
09:02 Tighter than ever.
09:04 And, if nothing else, to remind us that that wonderfully seductive idea
09:07 of the unbiased, clean search result
09:10 is, and is likely to remain, a myth.
09:13 Thank you for your time.
09:15 (Applause)
