ABOUT THE SPEAKER
James Bridle - Artist, writer
Working across technologies and disciplines, James Bridle examines technology, knowledge and the end of the future.

Why you should listen

James Bridle is an artist and writer working across technologies and disciplines. His artworks and installations have been exhibited in Europe, North and South America, Asia and Australia, and have been viewed by hundreds of thousands of visitors online. He has been commissioned by organizations including the Victoria & Albert Museum, the Barbican, Artangel, the Oslo Architecture Triennale and the Istanbul Design Biennial, and he has been honored by Ars Electronica, the Japan Media Arts Festival and the Design Museum, London. His writing on literature, culture and networks has appeared in magazines and newspapers including Frieze, Wired, Domus, Cabinet, The Atlantic, the New Statesman and many others, and he has written a regular column for The Observer.

New Dark Age, Bridle's book about technology, knowledge and the end of the future is forthcoming from Verso (UK & US) in 2018. He lectures regularly on radio, at conferences, universities and other events including SXSW, Lift, the Global Art Forum and Re:Publica. He has been a resident at Lighthouse, Brighton, the White Building, London and Eyebeam, New York, and an adjunct professor on the interactive telecommunications program at New York University.

TED2018

James Bridle: The nightmare videos of children's YouTube -- and what's wrong with the internet today


5,404,429 views

Writer and artist James Bridle uncovers a dark, strange corner of the internet, where people of unknown identity on YouTube make brain-hacking videos for children to watch in exchange for advertising revenue. From "surprise egg" unboxings and the "Finger Family Song" to algorithmic mashups of familiar cartoon characters in violent situations, these videos exploit and terrify young minds, and they reveal something about where our data-driven world is headed. "We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them," Bridle says.


00:12
I'm James. I'm a writer and artist, and I make work about technology. I do things like draw life-size outlines of military drones in city streets around the world, so that people can start to think and get their heads around these really quite hard-to-see and hard-to-think-about technologies. I make things like neural networks that predict the results of elections based on weather reports, because I'm intrigued about what the actual possibilities of these weird new technologies are. Last year, I built my own self-driving car. But because I don't really trust technology, I also designed a trap for it.

(Laughter)

00:51
And I do these things mostly because I find them completely fascinating, but also because I think when we talk about technology, we're largely talking about ourselves and the way that we understand the world. So here's a story about technology.

01:07
This is a "surprise egg" video. It's basically a video of someone opening up loads of chocolate eggs and showing the toys inside to the viewer. That's it. That's all it does for seven long minutes. And I want you to notice two things about this. First of all, this video has 30 million views.

(Laughter)

01:28
And the other thing is, it comes from a channel that has 6.3 million subscribers, that has a total of eight billion views, and it's all just more videos like this -- 30 million people watching a guy opening up these eggs. It sounds pretty weird, but if you search for "surprise eggs" on YouTube, it'll tell you there's 10 million of these videos, and I think that's an undercount. I think there's way, way more of these. If you keep searching, they're endless. There's millions and millions of these videos in increasingly baroque combinations of brands and materials, and there's more and more of them being uploaded every single day.

02:07
Like, this is a strange world. Right? But the thing is, it's not adults who are watching these videos. It's kids, small children. These videos are like crack for little kids. There's something about the repetition, the constant little dopamine hit of the reveal, that completely hooks them in. And little kids watch these videos over and over and over again, and they do it for hours and hours and hours. And if you try and take the screen away from them, they'll scream and scream and scream. If you don't believe me -- and I've already seen people in the audience nodding -- if you don't believe me, find someone with small children and ask them, and they'll know about the surprise egg videos.

02:47
So this is where we start. It's 2018, and someone, or lots of people, are using the same mechanism that, like, Facebook and Instagram are using to get you to keep checking that app, and they're using it on YouTube to hack the brains of very small children in return for advertising revenue.

03:06
At least, I hope that's what they're doing. I hope that's what they're doing it for, because there's easier ways of making ad revenue on YouTube. You can just make stuff up or steal stuff. So if you search for really popular kids' cartoons like "Peppa Pig" or "Paw Patrol," you'll find there's millions and millions of these online as well. Of course, most of them aren't posted by the original content creators. They come from loads and loads of different random accounts, and it's impossible to know who's posting them or what their motives might be.

03:36
Does that sound kind of familiar? Because it's exactly the same mechanism that's happening across most of our digital services, where it's impossible to know where this information is coming from. It's basically fake news for kids, and we're training them from birth to click on the very first link that comes along, regardless of what the source is. That doesn't seem like a terribly good idea.

03:58
Here's another thing that's really big on kids' YouTube. This is called the "Finger Family Song." I just heard someone groan in the audience. This is the "Finger Family Song." This is the very first one I could find. It's from 2007, and it only has 200,000 views, which is, like, nothing in this game. But it has this insanely earwormy tune, which I'm not going to play to you, because it will sear itself into your brain in the same way that it seared itself into mine, and I'm not going to do that to you. But like the surprise eggs, it's got inside kids' heads and addicted them to it.

04:29
So within a few years, these finger family videos start appearing everywhere, and you get versions in different languages with popular kids' cartoons using food or, frankly, using whatever kind of animation elements you seem to have lying around. And once again, there are millions and millions and millions of these videos available online in all of these kind of insane combinations. And the more time you start to spend with them, the crazier and crazier you start to feel that you might be.

04:57
And that's where I kind of launched into this, that feeling of deep strangeness and deep lack of understanding of how this thing was constructed that seems to be presented around me. Because it's impossible to know where these things are coming from. Like, who is making them? Some of them appear to be made of teams of professional animators. Some of them are just randomly assembled by software. Some of them are quite wholesome-looking young kids' entertainers. And some of them are from people who really clearly shouldn't be around children at all.

(Laughter)

05:30
And once again, this impossibility of figuring out who's making this stuff -- like, is this a bot? Is this a person? Is this a troll? What does it mean that we can't tell the difference between these things anymore? And again, doesn't that uncertainty feel kind of familiar right now?

05:50
So the main way people get views on their videos -- and remember, views mean money -- is that they stuff the titles of these videos with these popular terms. So you take, like, "surprise eggs" and then you add "Paw Patrol," "Easter egg," or whatever these things are, all of these words from other popular videos into your title, until you end up with this kind of meaningless mash of language that doesn't make sense to humans at all. Because of course it's only really tiny kids who are watching your video, and what the hell do they know? Your real audience for this stuff is software. It's the algorithms. It's the software that YouTube uses to select which videos are like other videos, to make them popular, to make them recommended. And that's why you end up with this kind of completely meaningless mash, both of title and of content.

06:35
But the thing is, you have to remember, there really are still people within this algorithmically optimized system, people who are kind of increasingly forced to act out these increasingly bizarre combinations of words, like a desperate improvisation artist responding to the combined screams of a million toddlers at once.

06:57
There are real people trapped within these systems, and that's the other deeply strange thing about this algorithmically driven culture, because even if you're human, you have to end up behaving like a machine just to survive. And also, on the other side of the screen, there still are these little kids watching this stuff, stuck, their full attention grabbed by these weird mechanisms. And most of these kids are too small to even use a website. They're just kind of hammering on the screen with their little hands. And so there's autoplay, where it just keeps playing these videos over and over and over in a loop, endlessly for hours and hours at a time. And there's so much weirdness in the system now that autoplay takes you to some pretty strange places. This is how, within a dozen steps, you can go from a cute video of a counting train to masturbating Mickey Mouse.

07:46
Yeah. I'm sorry about that. This does get worse. This is what happens when all of these different keywords, all these different pieces of attention, this desperate generation of content, all comes together into a single place.

08:03
This is where all those deeply weird keywords come home to roost. You cross-breed the finger family video with some live-action superhero stuff, you add in some weird, trollish in-jokes or something, and suddenly, you come to a very weird place indeed. The stuff that tends to upset parents is the stuff that has kind of violent or sexual content, right? Children's cartoons getting assaulted, getting killed, weird pranks that actually genuinely terrify children. What you have is software pulling in all of these different influences to automatically generate kids' worst nightmares. And this stuff really, really does affect small children. Parents report their children being traumatized, becoming afraid of the dark, becoming afraid of their favorite cartoon characters. If you take one thing away from this, it's that if you have small children, keep them the hell away from YouTube.

(Applause)

09:02
But the other thing, the thing that really gets to me about this, is that I'm not sure we even really understand how we got to this point. We've taken all of this influence, all of these things, and munged them together in a way that no one really intended. And yet, this is also the way that we're building the entire world. We're taking all of this data, a lot of it bad data, a lot of historical data full of prejudice, full of all of our worst impulses of history, and we're building that into huge data sets and then we're automating it. And we're munging it together into things like credit reports, into insurance premiums, into things like predictive policing systems, into sentencing guidelines. This is the way we're actually constructing the world today out of this data. And I don't know what's worse, that we built a system that seems to be entirely optimized for the absolute worst aspects of human behavior, or that we seem to have done it by accident, without even realizing that we were doing it, because we didn't really understand the systems that we were building, and we didn't really understand how to do anything differently with it.

10:06
There's a couple of things I think that really seem to be driving this most fully on YouTube, and the first of those is advertising, which is the monetization of attention without any real other variables at work, any care for the people who are actually developing this content, the centralization of the power, the separation of those things. And I think however you feel about the use of advertising to kind of support stuff, the sight of grown men in diapers rolling around in the sand in the hope that an algorithm that they don't really understand will give them money for it suggests that this probably isn't the thing that we should be basing our society and culture upon, and the way in which we should be funding it.

10:45
And the other thing that's kind of the major driver of this is automation, which is the deployment of all of this technology as soon as it arrives, without any kind of oversight, and then once it's out there, kind of throwing up our hands and going, "Hey, it's not us, it's the technology." Like, "We're not involved in it." That's not really good enough, because this stuff isn't just algorithmically governed, it's also algorithmically policed. When YouTube first started to pay attention to this, the first thing they said they'd do about it was that they'd deploy better machine learning algorithms to moderate the content. Well, machine learning, as any expert in it will tell you, is basically what we've started to call software that we don't really understand how it works. And I think we have enough of that already.

11:29
We shouldn't be leaving this stuff up to AI to decide what's appropriate or not, because we know what happens. It'll start censoring other things. It'll start censoring queer content. It'll start censoring legitimate public speech. What's allowed in these discourses, it shouldn't be something that's left up to unaccountable systems. It's part of a discussion all of us should be having. But I'd leave a reminder that the alternative isn't very pleasant, either.

11:52
YouTube also announced recently that they're going to release a version of their kids' app that would be entirely moderated by humans. Facebook -- Zuckerberg said much the same thing at Congress, when pressed about how they were going to moderate their stuff. He said they'd have humans doing it. And what that really means is, instead of having toddlers being the first person to see this stuff, you're going to have underpaid, precarious contract workers without proper mental health support being damaged by it as well.

(Laughter)

12:20
And I think we can all do quite a lot better than that.

(Applause)

12:26
The thought, I think, that brings those two things together, really, for me, is agency. It's like, how much do we really understand -- by agency, I mean: how we know how to act in our own best interests. Which -- it's almost impossible to do in these systems that we don't really fully understand. Inequality of power always leads to violence. And we can see inside these systems that inequality of understanding does the same thing. If there's one thing that we can do to start to improve these systems, it's to make them more legible to the people who use them, so that all of us have a common understanding of what's actually going on here.

13:03
The thing, though, I think most about these systems is that this isn't, as I hope I've explained, really about YouTube. It's about everything. These issues of accountability and agency, of opacity and complexity, of the violence and exploitation that inherently results from the concentration of power in a few hands -- these are much, much larger issues. And they're issues not just of YouTube and not just of technology in general, and they're not even new. They've been with us for ages. But we finally built this system, this global system, the internet, that's actually showing them to us in this extraordinary way, making them undeniable.

13:41
Technology has this extraordinary capacity to both instantiate and continue all of our most extraordinary, often hidden desires and biases and encoding them into the world, but it also writes them down so that we can see them, so that we can't pretend they don't exist anymore. We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them. Thank you very much.

(Applause)

14:21
Thank you.

(Applause)

14:28
Helen Walters: James, thank you for coming and giving us that talk. So it's interesting: when you think about the films where the robotic overlords take over, it's all a bit more glamorous than what you're describing. But I wonder -- in those films, you have the resistance mounting. Is there a resistance mounting towards this stuff? Do you see any positive signs, green shoots of resistance?

14:52
James Bridle: I don't know about direct resistance, because I think this stuff is super long-term. I think it's baked into culture in really deep ways. A friend of mine, Eleanor Saitta, always says that any technological problems of sufficient scale and scope are political problems first of all. So all of these things we're working to address within this are not going to be addressed just by building the technology better, but actually by changing the society that's producing these technologies. So no, right now, I think we've got a hell of a long way to go. But as I said, I think by unpacking them, by explaining them, by talking about them super honestly, we can actually start to at least begin that process.

15:27
HW: And so when you talk about legibility and digital literacy, I find it difficult to imagine that we need to place the burden of digital literacy on users themselves. But whose responsibility is education in this new world?

15:41
JB: Again, I think this responsibility is kind of up to all of us, that everything we do, everything we build, everything we make, needs to be made in a consensual discussion with everyone who's avoiding it; that we're not building systems intended to trick and surprise people into doing the right thing, but that they're actually involved in every step in educating them, because each of these systems is educational. That's what I'm hopeful about, about even this really grim stuff, that if you can take it and look at it properly, it's actually in itself a piece of education that allows you to start seeing how complex systems come together and work and maybe be able to apply that knowledge elsewhere in the world.

16:20
HW: James, it's such an important discussion, and I know many people here are really open and prepared to have it, so thanks for starting off our morning.

16:27
JB: Thanks very much. Cheers.

(Applause)
