ABOUT THE SPEAKER
James Bridle - Artist, writer
Working across technologies and disciplines, James Bridle examines technology, knowledge and the end of the future.

Why you should listen

James Bridle is an artist and writer working across technologies and disciplines. His artworks and installations have been exhibited in Europe, North and South America, Asia and Australia, and have been viewed by hundreds of thousands of visitors online. He has been commissioned by organizations including the Victoria & Albert Museum, the Barbican, Artangel, the Oslo Architecture Triennale and the Istanbul Design Biennial, and he has been honored by Ars Electronica, the Japan Media Arts Festival and the Design Museum, London. His writing on literature, culture and networks has appeared in magazines and newspapers including Frieze, Wired, Domus, Cabinet, The Atlantic, the New Statesman and many others, and he has written a regular column for The Observer.

New Dark Age, Bridle's book about technology, knowledge and the end of the future is forthcoming from Verso (UK & US) in 2018. He lectures regularly on radio, at conferences, universities and other events including SXSW, Lift, the Global Art Forum and Re:Publica. He has been a resident at Lighthouse, Brighton, the White Building, London and Eyebeam, New York, and an adjunct professor on the interactive telecommunications program at New York University.

More profile about the speaker
James Bridle | Speaker | TED.com
TED2018

James Bridle: The nightmare videos of children's YouTube -- and what's wrong with the internet today


Filmed:
5,404,429 views

Writer and artist James Bridle reveals a dark, strange corner of the internet, where unknown people and groups on YouTube churn out attention-grabbing, brain-burrowing videos for children in order to collect advertising revenue. From the hidden dangers of "surprise egg" reveals and the "Finger Family Song" to familiar cartoon characters appearing again and again in violent scenes, Bridle warns that these videos are eroding young minds, and that they show us where our increasingly data-driven world is heading. "We need to stop thinking about technology as a solution to all of our problems," he says. "We need to think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them."

00:12
I'm James. I'm a writer and artist, and I make work about technology. I do things like draw life-size outlines of military drones in city streets around the world, so that people can start to think and get their heads around these really quite hard-to-see and hard-to-think-about technologies.

00:31
I make things like neural networks that predict the results of elections based on weather reports, because I'm intrigued about what the actual possibilities of these weird new technologies are.

00:43
Last year, I built my own self-driving car. But because I don't really trust technology, I also designed a trap for it.

00:50
(Laughter)

00:51
And I do these things mostly because I find them completely fascinating, but also because I think when we talk about technology, we're largely talking about ourselves and the way that we understand the world. So here's a story about technology.

01:07
This is a "surprise egg" video. It's basically a video of someone opening up loads of chocolate eggs and showing the toys inside to the viewer. That's it. That's all it does for seven long minutes. And I want you to notice two things about this. First of all, this video has 30 million views.

01:26
(Laughter)

01:28
And the other thing is, it comes from a channel that has 6.3 million subscribers, that has a total of eight billion views, and it's all just more videos like this -- 30 million people watching a guy opening up these eggs. It sounds pretty weird, but if you search for "surprise eggs" on YouTube, it'll tell you there's 10 million of these videos, and I think that's an undercount. I think there's way, way more of these. If you keep searching, they're endless. There's millions and millions of these videos in increasingly baroque combinations of brands and materials, and there's more and more of them being uploaded every single day. Like, this is a strange world. Right?

02:11
But the thing is, it's not adults who are watching these videos. It's kids, small children. These videos are like crack for little kids. There's something about the repetition, the constant little dopamine hit of the reveal, that completely hooks them in. And little kids watch these videos over and over and over again, and they do it for hours and hours and hours. And if you try and take the screen away from them, they'll scream and scream and scream. If you don't believe me -- and I've already seen people in the audience nodding -- if you don't believe me, find someone with small children and ask them, and they'll know about the surprise egg videos.

02:47
So this is where we start. It's 2018, and someone, or lots of people, are using the same mechanism that, like, Facebook and Instagram are using to get you to keep checking that app, and they're using it on YouTube to hack the brains of very small children in return for advertising revenue.

03:06
At least, I hope that's what they're doing. I hope that's what they're doing it for, because there's easier ways of making ad revenue on YouTube. You can just make stuff up or steal stuff. So if you search for really popular kids' cartoons like "Peppa Pig" or "Paw Patrol," you'll find there's millions and millions of these online as well. Of course, most of them aren't posted by the original content creators. They come from loads and loads of different random accounts, and it's impossible to know who's posting them or what their motives might be.

03:36
Does that sound kind of familiar? Because it's exactly the same mechanism that's happening across most of our digital services, where it's impossible to know where this information is coming from. It's basically fake news for kids, and we're training them from birth to click on the very first link that comes along, regardless of what the source is. That doesn't seem like a terribly good idea.

03:58
Here's another thing that's really big on kids' YouTube. This is called the "Finger Family Song." I just heard someone groan in the audience. This is the "Finger Family Song." This is the very first one I could find. It's from 2007, and it only has 200,000 views, which is, like, nothing in this game. But it has this insanely earwormy tune, which I'm not going to play to you, because it will sear itself into your brain in the same way that it seared itself into mine, and I'm not going to do that to you. But like the surprise eggs, it's got inside kids' heads and addicted them to it.

04:29
So within a few years, these finger family videos start appearing everywhere, and you get versions in different languages with popular kids' cartoons using food or, frankly, using whatever kind of animation elements you seem to have lying around. And once again, there are millions and millions and millions of these videos available online in all of these kind of insane combinations. And the more time you start to spend with them, the crazier and crazier you start to feel that you might be.

04:57
And that's where I kind of launched into this, that feeling of deep strangeness and deep lack of understanding of how this thing was constructed that seems to be presented around me. Because it's impossible to know where these things are coming from. Like, who is making them? Some of them appear to be made by teams of professional animators. Some of them are just randomly assembled by software. Some of them are quite wholesome-looking young kids' entertainers. And some of them are from people who really clearly shouldn't be around children at all.

05:28
(Laughter)

05:30
And once again, this impossibility of figuring out who's making this stuff -- like, is this a bot? Is this a person? Is this a troll? What does it mean that we can't tell the difference between these things anymore? And again, doesn't that uncertainty feel kind of familiar right now?

05:50
So the main way people get views on their videos -- and remember, views mean money -- is that they stuff the titles of these videos with these popular terms. So you take, like, "surprise eggs" and then you add "Paw Patrol," "Easter egg," or whatever these things are, all of these words from other popular videos into your title, until you end up with this kind of meaningless mash of language that doesn't make sense to humans at all.
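To make that mechanism concrete, here is a minimal sketch of what a title-stuffing script might look like. It is purely illustrative: the term list, the function name and the random sampling are assumptions for the example, not anyone's actual pipeline.

```python
import random

# Hypothetical terms scraped from currently popular kids' videos.
TRENDING_TERMS = [
    "Surprise Eggs", "Paw Patrol", "Easter Egg", "Finger Family",
    "Learn Colors", "Peppa Pig", "Superheroes", "Play Doh",
]

def mash_title(n_terms: int = 6) -> str:
    """Stuff a title with popular search terms, with no regard for
    whether the result means anything to a human reader."""
    return " ".join(random.sample(TRENDING_TERMS, n_terms))

print(mash_title())
# e.g. "Paw Patrol Learn Colors Surprise Eggs Finger Family Play Doh Easter Egg"
```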
06:12
Because of course it's only really tiny kids who are watching your video, and what the hell do they know? Your real audience for this stuff is software. It's the algorithms. It's the software that YouTube uses to select which videos are like other videos, to make them popular, to make them recommended.
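YouTube's real ranking system is proprietary and far more complicated, but as a toy illustration of "videos that are like other videos," here is a sketch that scores similarity as word overlap between titles (Jaccard similarity). Under that assumed metric, keyword-stuffed titles naturally cluster together and recommend one another.

```python
def jaccard(a: str, b: str) -> float:
    """Similarity of two titles, measured as overlap of their word sets."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def recommend(just_watched: str, catalog: list[str]) -> str:
    # The more shared keywords, the more "alike" two videos look,
    # so stuffed titles reinforce each other.
    return max(catalog, key=lambda title: jaccard(just_watched, title))

catalog = [
    "Surprise Eggs Paw Patrol Learn Colors",
    "Finger Family Song Nursery Rhymes",
    "Paw Patrol Surprise Eggs Play Doh",
]
print(recommend("Surprise Eggs Learn Colors", catalog))
# -> "Surprise Eggs Paw Patrol Learn Colors"
```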
06:29
And that's why you end up with this kind of completely meaningless mash, both of title and of content.

06:35
But the thing is, you have to remember, there really are still people within this algorithmically optimized system, people who are kind of increasingly forced to act out these increasingly bizarre combinations of words, like a desperate improvisation artist responding to the combined screams of a million toddlers at once.

06:57
There are real people trapped within these systems, and that's the other deeply strange thing about this algorithmically driven culture, because even if you're human, you have to end up behaving like a machine just to survive.

07:09
And also, on the other side of the screen, there still are these little kids watching this stuff, stuck, their full attention grabbed by these weird mechanisms. And most of these kids are too small to even use a website. They're just kind of hammering on the screen with their little hands. And so there's autoplay, where it just keeps playing these videos over and over and over in a loop, endlessly for hours and hours at a time. And there's so much weirdness in the system now that autoplay takes you to some pretty strange places. This is how, within a dozen steps, you can go from a cute video of a counting train to masturbating Mickey Mouse.
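That "dozen steps" drift is what repeated nearest-neighbour hops produce: each autoplay pick is locally similar to the last video, but nothing constrains where the chain ends up globally. A sketch under the same assumptions, reusing the hypothetical recommend() above:

```python
def autoplay_chain(start: str, catalog: list[str], steps: int = 12) -> list[str]:
    """Follow autoplay for a fixed number of hops. Every individual hop
    looks plausible; the endpoint can still be very far from the start."""
    chain = [start]
    for _ in range(steps):
        candidates = [t for t in catalog if t not in chain]
        if not candidates:
            break
        chain.append(recommend(chain[-1], candidates))
    return chain
```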
07:46
Yeah. I'm sorry about that. This does get worse.

07:50
This is what happens when all of these different keywords, all these different pieces of attention, this desperate generation of content, all comes together into a single place.

08:03
This is where all those deeply weird keywords come home to roost. You cross-breed the finger family video with some live-action superhero stuff, you add in some weird, trollish in-jokes or something, and suddenly, you come to a very weird place indeed. The stuff that tends to upset parents is the stuff that has kind of violent or sexual content, right? Children's cartoons getting assaulted, getting killed, weird pranks that actually genuinely terrify children. What you have is software pulling in all of these different influences to automatically generate kids' worst nightmares.

08:39
And this stuff really, really does affect small children. Parents report their children being traumatized, becoming afraid of the dark, becoming afraid of their favorite cartoon characters. If you take one thing away from this, it's that if you have small children, keep them the hell away from YouTube.

08:56
(Applause)

09:02
But the other thing, the thing that really gets to me about this, is that I'm not sure we even really understand how we got to this point. We've taken all of this influence, all of these things, and munged them together in a way that no one really intended. And yet, this is also the way that we're building the entire world. We're taking all of this data, a lot of it bad data, a lot of historical data full of prejudice, full of all of our worst impulses of history, and we're building that into huge data sets and then we're automating it. And we're munging it together into things like credit reports, into insurance premiums, into things like predictive policing systems, into sentencing guidelines. This is the way we're actually constructing the world today out of this data.

09:46
And I don't know what's worse, that we built a system that seems to be entirely optimized for the absolute worst aspects of human behavior, or that we seem to have done it by accident, without even realizing that we were doing it, because we didn't really understand the systems that we were building, and we didn't really understand how to do anything differently with it.

10:06
There's a couple of things I think that really seem to be driving this most fully on YouTube, and the first of those is advertising, which is the monetization of attention without any real other variables at work, any care for the people who are actually developing this content, the centralization of the power, the separation of those things. And I think however you feel about the use of advertising to kind of support stuff, the sight of grown men in diapers rolling around in the sand in the hope that an algorithm that they don't really understand will give them money for it suggests that this probably isn't the thing that we should be basing our society and culture upon, and the way in which we should be funding it.

10:45
And the other thing that's kind of the major driver of this is automation, which is the deployment of all of this technology as soon as it arrives, without any kind of oversight, and then once it's out there, kind of throwing up our hands and going, "Hey, it's not us, it's the technology." Like, "We're not involved in it." That's not really good enough, because this stuff isn't just algorithmically governed, it's also algorithmically policed.

11:07
When YouTube first started to pay attention to this, the first thing they said they'd do about it was that they'd deploy better machine learning algorithms to moderate the content. Well, machine learning, as any expert in it will tell you, is basically what we've started to call software that we don't really understand how it works.
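To give a hedged sense of why that opacity is inherent and not just corporate secrecy: even a tiny text classifier reduces to a table of learned weights, with no human-readable rule behind any individual decision. The training examples below are invented for illustration; real moderation systems train on millions of labeled items.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented toy data: 1 = flag for human review, 0 = leave up.
titles = [
    "Finger Family Learn Colors Fun",
    "Peppa Pig Happy Birthday Party",
    "Peppa Pig Scary Injection Doctor",
    "Superhero Violent Prank Gone Wrong",
]
labels = [0, 0, 1, 1]

vec = CountVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(titles), labels)

# The entire "explanation" of any decision is this vector of weights:
print(dict(zip(vec.get_feature_names_out(), clf.coef_[0].round(2))))
```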
11:25
And I think we have enough of that already. We shouldn't be leaving this stuff up to AI to decide what's appropriate or not, because we know what happens. It'll start censoring other things. It'll start censoring queer content. It'll start censoring legitimate public speech. What's allowed in these discourses, it shouldn't be something that's left up to unaccountable systems. It's part of a discussion all of us should be having.

11:48
But I'd leave a reminder that the alternative isn't very pleasant, either. YouTube also announced recently that they're going to release a version of their kids' app that would be entirely moderated by humans. Facebook -- Zuckerberg said much the same thing at Congress, when pressed about how they were going to moderate their stuff. He said they'd have humans doing it. And what that really means is, instead of having toddlers being the first person to see this stuff, you're going to have underpaid, precarious contract workers without proper mental health support being damaged by it as well.

12:19
(Laughter)

12:20
And I think we can all do quite a lot better than that.

12:22
(Applause)

12:26
The thought, I think, that brings those two things together, really, for me, is agency. It's like, how much do we really understand -- by agency, I mean: how we know how to act in our own best interests. Which -- it's almost impossible to do in these systems that we don't really fully understand. Inequality of power always leads to violence. And we can see inside these systems that inequality of understanding does the same thing. If there's one thing that we can do to start to improve these systems, it's to make them more legible to the people who use them, so that all of us have a common understanding of what's actually going on here.

13:03
The thing, though, I think most about these systems is that this isn't, as I hope I've explained, really about YouTube. It's about everything. These issues of accountability and agency, of opacity and complexity, of the violence and exploitation that inherently results from the concentration of power in a few hands -- these are much, much larger issues. And they're issues not just of YouTube and not just of technology in general, and they're not even new. They've been with us for ages. But we finally built this system, this global system, the internet, that's actually showing them to us in this extraordinary way, making them undeniable.

13:41
Technology has this extraordinary capacity to both instantiate and continue all of our most extraordinary, often hidden desires and biases and encoding them into the world, but it also writes them down so that we can see them, so that we can't pretend they don't exist anymore. We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them. Thank you very much.

14:15
(Applause)

14:21
Thank you.

14:22
(Applause)

14:28
Helen Walters: James, thank you for coming and giving us that talk. So it's interesting: when you think about the films where the robotic overlords take over, it's all a bit more glamorous than what you're describing. But I wonder -- in those films, you have the resistance mounting. Is there a resistance mounting towards this stuff? Do you see any positive signs, green shoots of resistance?

14:52
James Bridle: I don't know about direct resistance, because I think this stuff is super long-term. I think it's baked into culture in really deep ways. A friend of mine, Eleanor Saitta, always says that any technological problems of sufficient scale and scope are political problems first of all. So all of these things we're working to address within this are not going to be addressed just by building the technology better, but actually by changing the society that's producing these technologies. So no, right now, I think we've got a hell of a long way to go. But as I said, I think by unpacking them, by explaining them, by talking about them super honestly, we can actually start to at least begin that process.

15:27
HW: And so when you talk about legibility and digital literacy, I find it difficult to imagine that we need to place the burden of digital literacy on users themselves. But whose responsibility is education in this new world?

15:41
JB: Again, I think this responsibility is kind of up to all of us, that everything we do, everything we build, everything we make, needs to be made in a consensual discussion with everyone who's using it; that we're not building systems intended to trick and surprise people into doing the right thing, but that they're actually involved in every step in educating them, because each of these systems is educational. That's what I'm hopeful about, about even this really grim stuff, that if you can take it and look at it properly, it's actually in itself a piece of education that allows you to start seeing how complex systems come together and work and maybe be able to apply that knowledge elsewhere in the world.

16:20
HW: James, it's such an important discussion, and I know many people here are really open and prepared to have it, so thanks for starting off our morning.

16:27
JB: Thanks very much. Cheers.

16:29
(Applause)