ABOUT THE SPEAKER
Jaron Lanier - Scientist, musician, visual artist, author
Jaron Lanier is a scientist, musician and writer best known for his work in virtual reality and his advocacy of humanism and sustainable economics in a digital context.

Why you should listen

Jaron Lanier is interested in the idea that virtual reality might help us notice the magic of ordinary reality and the idea that paying people for the data that is now taken from them might be the best path to a sustainable, dignified future economy.

Lanier's 1980s start-up created the first commercial VR products and introduced avatars, multi-person virtual world experiences and prototypes of major VR applications such as surgical simulation. Lanier coined or popularized the terms "virtual reality" and "mixed reality." His book Dawn of the New Everything: Encounters with Reality and Virtual Reality was followed by his most recent, Ten Arguments for Deleting Your Social Media Accounts Right Now (May 29, 2018). His earlier books, Who Owns the Future? and You Are Not a Gadget, are international bestsellers.

Lanier was a mainstay of the earliest TED conferences; he still thinks of TED as a charming little gathering.

TED2018

Jaron Lanier: How we need to remake the internet

1,335,675 views

In the early days of digital culture, Jaron Lanier helped craft a vision for the internet as public commons where humanity could share its knowledge -- but even then, this vision was haunted by the dark side of how it could turn out: with personal devices that control our lives, monitor our data and feed us stimuli. (Sound familiar?) In this visionary talk, Lanier reflects on a "globally tragic, astoundingly ridiculous mistake" companies like Google and Facebook made at the foundation of digital culture -- and how we can undo it. "We cannot have a society in which, if two people wish to communicate, the only way that can happen is if it's financed by a third person who wishes to manipulate them," he says.


00:12
Back in the 1980s, actually,
I gave my first talk at TED,
00:16
and I brought some of the very,
very first public demonstrations
00:21
of virtual reality ever to the TED stage.
00:26
And at that time, we knew
that we were facing a knife-edge future
00:33
where the technology we needed,
00:38
the technology we loved,
00:40
could also be our undoing.
00:43
We knew that if we thought
of our technology
00:47
as a means to ever more power,
00:50
if it was just a power trip,
we'd eventually destroy ourselves.
00:54
That's what happens
00:55
when you're on a power trip
and nothing else.
00:59
So the idealism
01:02
of digital culture back then
01:07
was all about starting with
that recognition of the possible darkness
01:12
and trying to imagine
a way to transcend it
01:15
with beauty and creativity.
01:19
I always used to end my early TED Talks
with a rather horrifying line, which is,
01:26
"We have a challenge.
01:30
We have to create
a culture around technology
01:34
that is so beautiful, so meaningful,
01:38
so deep, so endlessly creative,
01:40
so filled with infinite potential
01:44
that it draws us away
from committing mass suicide."
01:48
So we talked about extinction
as being one and the same
01:54
as the need to create an alluring,
infinitely creative future.
01:59
And I still believe
that that alternative of creativity
02:05
as an alternative to death
02:07
is very real and true,
02:09
maybe the most true thing there is.
02:11
In the case of virtual reality --
02:13
well, the way I used to talk about it
02:16
is that it would be something like
02:18
what happened when people
discovered language.
02:21
With language came new adventures,
new depth, new meaning,
02:26
new ways to connect,
new ways to coordinate,
02:28
new ways to imagine,
new ways to raise children,
02:32
and I imagined, with virtual reality,
we'd have this new thing
02:36
that would be like a conversation
02:38
but also like waking-state
intentional dreaming.
02:41
We called it post-symbolic communication,
02:44
because it would be like just directly
making the thing you experienced
02:49
instead of indirectly
making symbols to refer to things.
02:53
It was a beautiful vision,
and it's one I still believe in,
02:57
and yet, haunting that beautiful vision
03:01
was the dark side
of how it could also turn out.
03:04
And I suppose I could mention
03:09
one of the very earliest
computer scientists,
03:12
whose name was Norbert Wiener,
03:14
and he wrote a book back in the '50s,
from before I was even born,
03:18
called "The Human Use of Human Beings."
03:21
And in the book,
he described the potential
03:25
to create a computer system
that would be gathering data from people
03:32
and providing feedback
to those people in real time
03:35
in order to put them kind of partially,
statistically, in a Skinner box,
03:40
in a behaviorist system,
03:43
and he has this amazing line
where he says,
03:45
one could imagine,
as a thought experiment --
03:48
and I'm paraphrasing,
this isn't a quote --
03:51
one could imagine a global computer system
03:54
where everybody has devices
on them all the time,
03:57
and the devices are giving them
feedback based on what they did,
04:00
and the whole population
04:02
is subject to a degree
of behavior modification.
04:05
And such a society would be insane,
04:09
could not survive,
could not face its problems.
04:12
And then he says, but this is
only a thought experiment,
04:15
and such a future
is technologically infeasible.
04:18
(Laughter)
04:19
And yet, of course,
it's what we have created,
04:22
and it's what we must undo
if we are to survive.
04:27
So --
04:28
(Applause)
04:32
I believe that we made
a very particular mistake,
04:38
and it happened early on,
04:40
and by understanding the mistake we made,
04:42
we can undo it.
04:44
It happened in the '90s,
04:47
and going into the turn of the century,
04:50
and here's what happened.
04:53
Early digital culture,
04:54
and indeed, digital culture to this day,
04:59
had a sense of, I would say,
lefty, socialist mission about it,
05:05
that unlike other things
that have been done,
05:08
like the invention of books,
05:09
everything on the internet
must be purely public,
05:13
must be available for free,
05:15
because if even one person
cannot afford it,
05:18
then that would create
this terrible inequity.
05:21
Now of course, there's other ways
to deal with that.
05:24
If books cost money,
you can have public libraries.
05:27
And so forth.
05:28
But we were thinking, no, no, no,
this is an exception.
05:31
This must be pure public commons,
that's what we want.
05:35
And so that spirit lives on.
05:38
You can experience it in designs
like the Wikipedia, for instance,
05:42
many others.
05:43
But at the same time,
05:45
we also believed, with equal fervor,
05:48
in this other thing
that was completely incompatible,
05:52
which is we loved our tech entrepreneurs.
05:55
We loved Steve Jobs;
we loved this Nietzschean myth
05:59
of the techie who could dent the universe.
06:03
Right?
06:04
And that mythical power
still has a hold on us, as well.
06:10
So you have these two different passions,
06:14
for making everything free
06:16
and for the almost supernatural
power of the tech entrepreneur.
06:21
How do you celebrate entrepreneurship
when everything's free?
06:26
Well, there was only
one solution back then,
06:29
which was the advertising model.
06:31
And so therefore, Google
was born free, with ads,
06:35
Facebook was born free, with ads.
06:39
Now in the beginning, it was cute,
06:43
like with the very earliest Google.
06:45
(Laughter)
06:46
The ads really were kind of ads.
06:49
They would be, like,
your local dentist or something.
06:51
But there's this thing called Moore's law
06:53
that makes the computers
more and more efficient and cheaper.
06:57
Their algorithms get better.
06:58
We actually have universities
where people study them,
07:01
and they get better and better.
07:03
And the customers and other entities
who use these systems
07:07
just got more and more experienced
and got cleverer and cleverer.
07:11
And what started out as advertising
07:14
really can't be called
advertising anymore.
07:16
It turned into behavior modification,
07:19
just as Norbert Wiener
had worried it might.
07:24
And so I can't call these things
social networks anymore.
07:28
I call them behavior modification empires.
07:32
(Applause)
07:34
And I refuse to vilify the individuals.
07:39
I have dear friends at these companies,
07:41
sold a company to Google, even though
I think it's one of these empires.
07:46
I don't think this is a matter
of bad people who've done a bad thing.
07:51
I think this is a matter
of a globally tragic,
07:55
astoundingly ridiculous mistake,
08:00
rather than a wave of evil.
08:04
Let me give you
just another layer of detail
08:07
into how this particular
mistake functions.
08:11
So with behaviorism,
08:14
you give the creature,
whether it's a rat or a dog or a person,
08:19
little treats and sometimes
little punishments
08:22
as feedback to what they do.
08:24
So if you have an animal in a cage,
it might be candy and electric shocks.
08:30
But if you have a smartphone,
08:33
it's not those things,
it's symbolic punishment and reward.
08:40
Pavlov, one of the early behaviorists,
08:42
demonstrated the famous principle.
08:45
You could train a dog to salivate
just with the bell, just with the symbol.
08:49
So on social networks,
08:51
social punishment and social reward
function as the punishment and reward.
08:56
And we all know
the feeling of these things.
08:58
You get this little thrill --
08:59
"Somebody liked my stuff
and it's being repeated."
09:02
Or the punishment:
"Oh my God, they don't like me,
09:04
maybe somebody else
is more popular, oh my God."
09:06
So you have those two
very common feelings,
09:09
and they're doled out in such a way
that you get caught in this loop.
09:12
As has been publicly acknowledged
by many of the founders of the system,
09:16
everybody knew this is what was going on.
09:19
But here's the thing:
09:21
traditionally, in the academic study
of the methods of behaviorism,
09:26
there have been comparisons
of positive and negative stimuli.
09:32
In this setting, a commercial setting,
09:34
there's a new kind of difference
09:36
that has kind of evaded
the academic world for a while,
09:39
and that difference
is that whether or not positive stimuli
09:43
are more effective than negative ones
in different circumstances,
09:46
the negative ones are cheaper.
09:48
They're the bargain stimuli.
09:50
So what I mean by that is it's much easier
09:56
to lose trust than to build trust.
09:59
It takes a long time to build love.
10:02
It takes a short time to ruin love.
10:05
Now the customers of these
behavior modification empires
10:10
are on a very fast loop.
10:11
They're almost like
high-frequency traders.
10:13
They're getting feedback
from their spending
10:15
or whatever their activities are
if they're not spending,
10:18
and they see what's working,
and then they do more of that.
10:21
And so they're getting the quick feedback,
10:23
which means they're responding
more to the negative emotions,
10:26
because those are the ones
that rise faster, right?
10:30
And so therefore,
even well-intentioned players
10:34
who think all they're doing
is advertising toothpaste
10:37
end up advancing the cause
of the negative people,
10:40
the negative emotions, the cranks,
10:42
the paranoids,
10:44
the cynics, the nihilists.
10:47
Those are the ones who get
amplified by the system.
10:50
And you can't pay one of these companies
to make the world suddenly nice
10:56
and improve democracy
10:57
nearly as easily as you can pay
to ruin those things.
11:01
And so this is the dilemma
we've gotten ourselves into.
11:05
The alternative is to turn back the clock,
with great difficulty,
11:11
and remake that decision.
11:13
Remaking it would mean two things.
11:18
It would mean first that many people,
those who could afford to,
11:21
would actually pay for these things.
11:24
You'd pay for search,
you'd pay for social networking.
11:28
How would you pay?
Maybe with a subscription fee,
11:32
maybe with micro-payments as you use them.
11:34
There's a lot of options.
11:36
If some of you are recoiling,
and you're thinking,
11:39
"Oh my God, I would never pay
for these things.
11:41
How could you ever get anyone to pay?"
11:43
I want to remind you
of something that just happened.
11:46
Around this same time
11:48
that companies like Google and Facebook
were formulating their free idea,
11:54
a lot of cyber culture
also believed that in the future,
11:59
televisions and movies
would be created in the same way,
12:02
kind of like the Wikipedia.
12:04
But then, companies
like Netflix, Amazon, HBO,
12:09
said, "Actually, you know, subscribe.
We'll give you great TV."
12:13
And it worked!
12:14
We now are in this period
called "peak TV," right?
12:18
So sometimes when you pay for stuff,
things get better.
12:22
We can imagine a hypothetical --
12:25
(Applause)
12:29
We can imagine a hypothetical world
of "peak social media."
12:33
What would that be like?
12:34
It would mean when you get on,
you can get really useful,
12:37
authoritative medical advice
instead of cranks.
12:41
It could mean when you want
to get factual information,
12:44
there's not a bunch of weird,
paranoid conspiracy theories.
12:47
We can imagine this wonderful
other possibility.
12:52
Ah.
12:53
I dream of it. I believe it's possible.
12:55
I'm certain it's possible.
12:58
And I'm certain that the companies,
the Googles and the Facebooks,
13:03
would actually do better in this world.
13:05
I don't believe we need
to punish Silicon Valley.
13:09
We just need to remake the decision.
13:12
Of the big tech companies,
13:14
it's really only two that depend
on behavior modification and spying
13:20
as their business plan.
13:21
It's Google and Facebook.
13:23
(Laughter)
13:24
And I love you guys.
13:26
Really, I do. Like, the people
are fantastic.
13:30
I want to point out, if I may,
13:33
if you look at Google,
13:34
they can propagate cost centers
endlessly with all of these companies,
13:39
but they cannot propagate profit centers.
13:41
They cannot diversify,
because they're hooked.
13:45
They're hooked on this model,
just like their own users.
13:47
They're in the same trap as their users,
13:50
and you can't run
a big corporation that way.
13:52
So this is ultimately totally
to the benefit of the shareholders
13:56
and other stakeholders of these companies.
13:58
It's a win-win solution.
14:01
It'll just take some time
to figure it out.
14:03
A lot of details to work out,
14:05
totally doable.
14:07
(Laughter)
14:10
I don't believe our species
can survive unless we fix this.
14:14
We cannot have a society
14:16
in which, if two people
wish to communicate,
14:19
the only way that can happen
is if it's financed by a third person
14:22
who wishes to manipulate them.
14:25
(Applause)
14:35
(Applause ends)
14:36
In the meantime,
if the companies won't change,
14:39
delete your accounts, OK?
14:41
(Laughter)
14:42
(Applause)
14:43
That's enough for now.
14:45
Thank you so much.
14:46
(Applause)
