ABOUT THE SPEAKER
Jaron Lanier - Scientist, musician, visual artist, author
Jaron Lanier is a scientist, musician and writer best known for his work in virtual reality and his advocacy of humanism and sustainable economics in a digital context.

Why you should listen

Jaron Lanier is interested in the idea that virtual reality might help us notice the magic of ordinary reality and the idea that paying people for the data that is now taken from them might be the best path to a sustainable, dignified future economy.

Lanier's 1980s start-up created the first commercial VR products and introduced avatars, multi-person virtual world experiences and prototypes of major VR applications such as surgical simulation. Lanier coined or popularized the terms "virtual reality" and "mixed reality." He is the author of Dawn of the New Everything: Encounters with Reality and Virtual Reality, and both of his previous books, Who Owns the Future? and You Are Not a Gadget, are international bestsellers. His most recent book (May 29, 2018) is Ten Arguments for Deleting Your Social Media Accounts Right Now.

Lanier was a mainstay of the earliest TED conferences; he still thinks of TED as a charming little gathering.

TED2018

Jaron Lanier: How we need to remake the internet


Filmed:
3,029,544 views

In the early days of digital culture, Jaron Lanier helped craft a vision for the internet -- one in which everyone could take part and humanity could share its knowledge. Even so, that vision was derailed by a dark side: our lives are now controlled by personal devices that monitor our data and feed us stimuli. (Sound familiar?) In this visionary talk, Lanier asks us to rethink the "globally tragic, astoundingly ridiculous" mistake that companies like Google and Facebook built digital culture on, and discusses how we can undo it. "We cannot have a society in which, if two people wish to communicate, the only way that can happen is if it's financed by a third person who wishes to manipulate them," he says.


00:12
Back in the 1980s, actually, I gave my first talk at TED, and I brought some of the very, very first public demonstrations of virtual reality ever to the TED stage. And at that time, we knew that we were facing a knife-edge future where the technology we needed, the technology we loved, could also be our undoing. We knew that if we thought of our technology as a means to ever more power, if it was just a power trip, we'd eventually destroy ourselves. That's what happens when you're on a power trip and nothing else.

00:59
So the idealism of digital culture back then was all about starting with that recognition of the possible darkness and trying to imagine a way to transcend it with beauty and creativity. I always used to end my early TED Talks with a rather horrifying line, which is, "We have a challenge. We have to create a culture around technology that is so beautiful, so meaningful, so deep, so endlessly creative, so filled with infinite potential that it draws us away from committing mass suicide." So we talked about extinction as being one and the same as the need to create an alluring, infinitely creative future.

01:59
And I still believe that that alternative of creativity as an alternative to death is very real and true, maybe the most true thing there is. In the case of virtual reality -- well, the way I used to talk about it is that it would be something like what happened when people discovered language. With language came new adventures, new depth, new meaning, new ways to connect, new ways to coordinate, new ways to imagine, new ways to raise children, and I imagined, with virtual reality, we'd have this new thing that would be like a conversation but also like waking-state intentional dreaming. We called it post-symbolic communication, because it would be like just directly making the thing you experienced instead of indirectly making symbols to refer to things.

02:53
It was a beautiful vision, and it's one I still believe in, and yet, haunting that beautiful vision was the dark side of how it could also turn out. And I suppose I could mention from one of the very earliest computer scientists, whose name was Norbert Wiener, and he wrote a book back in the '50s, from before I was even born, called "The Human Use of Human Beings." And in the book, he described the potential to create a computer system that would be gathering data from people and providing feedback to those people in real time in order to put them kind of partially, statistically, in a Skinner box, in a behaviorist system, and he has this amazing line where he says, one could imagine, as a thought experiment -- and I'm paraphrasing, this isn't a quote -- one could imagine a global computer system where everybody has devices on them all the time, and the devices are giving them feedback based on what they did, and the whole population is subject to a degree of behavior modification. And such a society would be insane, could not survive, could not face its problems. And then he says, but this is only a thought experiment, and such a future is technologically infeasible.

04:18
(Laughter)

04:19
And yet, of course, it's what we have created, and it's what we must undo if we are to survive. So --

04:28
(Applause)

04:32
I believe that we made a very particular mistake, and it happened early on, and by understanding the mistake we made, we can undo it. It happened in the '90s, and going into the turn of the century, and here's what happened. Early digital culture, and indeed, digital culture to this day, had a sense of, I would say, lefty, socialist mission about it, that unlike other things that have been done, like the invention of books, everything on the internet must be purely public, must be available for free, because if even one person cannot afford it, then that would create this terrible inequity. Now of course, there's other ways to deal with that. If books cost money, you can have public libraries. And so forth. But we were thinking, no, no, no, this is an exception. This must be pure public commons, that's what we want. And so that spirit lives on. You can experience it in designs like the Wikipedia, for instance, many others.

05:43
But at the same time, we also believed, with equal fervor, in this other thing that was completely incompatible, which is we loved our tech entrepreneurs. We loved Steve Jobs; we loved this Nietzschean myth of the techie who could dent the universe. Right? And that mythical power still has a hold on us, as well.

06:10
So you have these two different passions, for making everything free and for the almost supernatural power of the tech entrepreneur. How do you celebrate entrepreneurship when everything's free? Well, there was only one solution back then, which was the advertising model. And so therefore, Google was born free, with ads, Facebook was born free, with ads. Now in the beginning, it was cute, like with the very earliest Google.

06:45
(Laughter)

06:46
The ads really were kind of ads. They would be, like, your local dentist or something. But there's a thing called Moore's law that makes the computers more and more efficient and cheaper. Their algorithms get better. We actually have universities where people study them, and they get better and better. And the customers and other entities who use these systems just got more and more experienced and got cleverer and cleverer. And what started out as advertising really can't be called advertising anymore. It turned into behavior modification, just as Norbert Wiener had worried it might. And so I can't call these things social networks anymore. I call them behavior modification empires.

07:32
(Applause)

07:34
And I refuse to vilify the individuals. I have dear friends at these companies, sold a company to Google, even though I think it's one of these empires. I don't think this is a matter of bad people who've done a bad thing. I think this is a matter of a globally tragic, astoundingly ridiculous mistake, rather than a wave of evil.

08:04
Let me give you just another layer of detail into how this particular mistake functions. So with behaviorism, you give the creature, whether it's a rat or a dog or a person, little treats and sometimes little punishments as feedback to what they do. So if you have an animal in a cage, it might be candy and electric shocks. But if you have a smartphone, it's not those things, it's symbolic punishment and reward. Pavlov, one of the early behaviorists, demonstrated the famous principle. You could train a dog to salivate just with the bell, just with the symbol. So on social networks, social punishment and social reward function as the punishment and reward. And we all know the feeling of these things. You get this little thrill -- "Somebody liked my stuff and it's being repeated." Or the punishment: "Oh my God, they don't like me, maybe somebody else is more popular, oh my God." So you have those two very common feelings, and they're doled out in such a way that you get caught in this loop. As has been publicly acknowledged by many of the founders of the system, everybody knew this is what was going on.

09:19
But here's the thing: traditionally, in the academic study of the methods of behaviorism, there have been comparisons of positive and negative stimuli. In this setting, a commercial setting, there's a new kind of difference that has kind of evaded the academic world for a while, and that difference is that whether positive stimuli are more effective than negative ones in different circumstances, the negative ones are cheaper. They're the bargain stimuli. So what I mean by that is it's much easier to lose trust than to build trust. It takes a long time to build love. It takes a short time to ruin love.

10:05
Now the customers of these behavior modification empires are on a very fast loop. They're almost like high-frequency traders. They're getting feedbacks from their spends or whatever their activities are if they're not spending, and they see what's working, and then they do more of that. And so they're getting the quick feedback, which means they're responding more to the negative emotions, because those are the ones that rise faster, right? And so therefore, even well-intentioned players who think all they're doing is advertising toothpaste end up advancing the cause of the negative people, the negative emotions, the cranks, the paranoids, the cynics, the nihilists. Those are the ones who get amplified by the system. And you can't pay one of these companies to make the world suddenly nice and improve democracy nearly as easily as you can pay to ruin those things. And so this is the dilemma we've gotten ourselves into.

11:05
The alternative is to turn back the clock, with great difficulty, and remake that decision. Remaking it would mean two things. It would mean first that many people, those who could afford to, would actually pay for these things. You'd pay for search, you'd pay for social networking. How would you pay? Maybe with a subscription fee, maybe with micro-payments as you use them. There's a lot of options. If some of you are recoiling, and you're thinking, "Oh my God, I would never pay for these things. How could you ever get anyone to pay?" I want to remind you of something that just happened. Around this same time that companies like Google and Facebook were formulating their free idea, a lot of cyber culture also believed that in the future, televisions and movies would be created in the same way, kind of like the Wikipedia. But then, companies like Netflix, Amazon, HBO, said, "Actually, you know, subscribe. We'll give you great TV." And it worked! We now are in this period called "peak TV," right? So sometimes when you pay for stuff, things get better. We can imagine a hypothetical --

12:25
(Applause)

12:29
We can imagine a hypothetical world of "peak social media." What would that be like? It would mean when you get on, you can get really useful, authoritative medical advice instead of cranks. It could mean when you want to get factual information, there's not a bunch of weird, paranoid conspiracy theories. We can imagine this wonderful other possibility. Ah. I dream of it. I believe it's possible. I'm certain it's possible. And I'm certain that the companies, the Googles and the Facebooks, would actually do better in this world. I don't believe we need to punish Silicon Valley. We just need to remake the decision.

13:12
Of the big tech companies, it's really only two that depend on behavior modification and spying as their business plan. It's Google and Facebook.

13:23
(Laughter)

13:24
And I love you guys. Really, I do. Like, the people are fantastic. I want to point out, if I may, if you look at Google, they can propagate cost centers endlessly with all of these companies, but they cannot propagate profit centers. They cannot diversify, because they're hooked. They're hooked on this model, just like their own users. They're in the same trap as their users, and you can't run a big corporation that way. So this is ultimately totally in the benefit of the shareholders and other stakeholders of these companies. It's a win-win solution. It'll just take some time to figure it out. A lot of details to work out, totally doable.

14:07
(Laughter)

14:10
I don't believe our species can survive unless we fix this. We cannot have a society in which, if two people wish to communicate, the only way that can happen is if it's financed by a third person who wishes to manipulate them.

14:25
(Applause)

14:35
(Applause ends)

14:36
In the meantime, if the companies won't change, delete your accounts, OK?

14:41
(Laughter)

14:42
(Applause)

14:43
That's enough for now. Thank you so much.

14:46
(Applause)
