
TED2018

Jaron Lanier: How we need to remake the internet

947,604 views

In the early days of digital culture, Jaron Lanier helped craft a vision for the internet as public commons where humanity could share its knowledge -- but even then, this vision was haunted by the dark side of how it could turn out: with personal devices that control our lives, monitor our data and feed us stimuli. (Sound familiar?) In this visionary talk, Lanier reflects on a "globally tragic, astoundingly ridiculous mistake" companies like Google and Facebook made at the foundation of digital culture -- and how we can undo it. "We cannot have a society in which, if two people wish to communicate, the only way that can happen is if it's financed by a third person who wishes to manipulate them," he says.

- Scientist, musician, visual artist, author
Jaron Lanier is a scientist, musician and writer best known for his work in virtual reality and his advocacy of humanism and sustainable economics in a digital context.

Back in the 1980s, actually,
I gave my first talk at TED,
00:12
and I brought some of the very,
very first public demonstrations
00:16
of virtual reality ever to the TED stage.
00:21
And at that time, we knew
that we were facing a knife-edge future
00:26
where the technology we needed,
00:33
the technology we loved,
00:38
could also be our undoing.
00:40
We knew that if we thought
of our technology
00:43
as a means to ever more power,
00:47
if it was just a power trip,
we'd eventually destroy ourselves.
00:50
That's what happens
00:54
when you're on a power trip
and nothing else.
00:55
So the idealism
00:59
of digital culture back then
01:02
was all about starting with
that recognition of the possible darkness
01:07
and trying to imagine
a way to transcend it
01:12
with beauty and creativity.
01:15
I always used to end my early TED Talks
with a rather horrifying line, which is,
01:19
"We have a challenge.
01:26
We have to create
a culture around technology
01:30
that is so beautiful, so meaningful,
01:34
so deep, so endlessly creative,
01:38
so filled with infinite potential
01:40
that it draws us away
from committing mass suicide."
01:44
So we talked about extinction
as being one and the same
01:48
as the need to create an alluring,
infinitely creative future.
01:54
And I still believe
that that alternative of creativity
01:59
as an alternative to death
02:05
is very real and true,
02:07
maybe the most true thing there is.
02:09
In the case of virtual reality --
02:11
well, the way I used to talk about it
02:13
is that it would be something like
02:16
what happened when people
discovered language.
02:18
With language came new adventures,
new depth, new meaning,
02:21
new ways to connect,
new ways to coordinate,
02:26
new ways to imagine,
new ways to raise children,
02:28
and I imagined, with virtual reality,
we'd have this new thing
02:32
that would be like a conversation
02:36
but also like waking-state
intentional dreaming.
02:38
We called it post-symbolic communication,
02:41
because it would be like just directly
making the thing you experienced
02:44
instead of indirectly
making symbols to refer to things.
02:49
It was a beautiful vision,
and it's one I still believe in,
02:53
and yet, haunting that beautiful vision
02:57
was the dark side
of how it could also turn out.
03:01
And I suppose I could mention
03:04
one of the very earliest
computer scientists,
03:09
whose name was Norbert Wiener,
03:12
and he wrote a book back in the '50s,
from before I was even born,
03:14
called "The Human Use of Human Beings."
03:18
And in the book,
he described the potential
03:21
to create a computer system
that would be gathering data from people
03:25
and providing feedback
to those people in real time
03:32
in order to put them kind of partially,
statistically, in a Skinner box,
03:35
in a behaviorist system,
03:40
and he has this amazing line
where he says,
03:43
one could imagine,
as a thought experiment --
03:45
and I'm paraphrasing,
this isn't a quote --
03:48
one could imagine a global computer system
03:51
where everybody has devices
on them all the time,
03:54
and the devices are giving them
feedback based on what they did,
03:57
and the whole population
04:00
is subject to a degree
of behavior modification.
04:02
And such a society would be insane,
04:05
could not survive,
could not face its problems.
04:09
And then he says, but this is
only a thought experiment,
04:12
and such a future
is technologically infeasible.
04:15
(Laughter)
04:18
And yet, of course,
it's what we have created,
04:19
and it's what we must undo
if we are to survive.
04:22
So --
04:27
(Applause)
04:28
I believe that we made
a very particular mistake,
04:32
and it happened early on,
04:38
and by understanding the mistake we made,
04:40
we can undo it.
04:42
It happened in the '90s,
04:44
and going into the turn of the century,
04:47
and here's what happened.
04:50
Early digital culture,
04:53
and indeed, digital culture to this day,
04:54
had a sense of, I would say,
lefty, socialist mission about it,
04:59
that unlike other things
that have been done,
05:05
like the invention of books,
05:08
everything on the internet
must be purely public,
05:09
must be available for free,
05:13
because if even one person
cannot afford it,
05:15
then that would create
this terrible inequity.
05:18
Now of course, there's other ways
to deal with that.
05:21
If books cost money,
you can have public libraries.
05:24
And so forth.
05:27
But we were thinking, no, no, no,
this is an exception.
05:28
This must be pure public commons,
that's what we want.
05:31
And so that spirit lives on.
05:35
You can experience it in designs
like the Wikipedia, for instance,
05:38
many others.
05:42
But at the same time,
05:43
we also believed, with equal fervor,
05:45
in this other thing
that was completely incompatible,
05:48
which is we loved our tech entrepreneurs.
05:52
We loved Steve Jobs;
we loved this Nietzschean myth
05:55
of the techie who could dent the universe.
05:59
Right?
06:03
And that mythical power
still has a hold on us, as well.
06:04
So you have these two different passions,
06:10
for making everything free
06:14
and for the almost supernatural
power of the tech entrepreneur.
06:16
How do you celebrate entrepreneurship
when everything's free?
06:21
Well, there was only
one solution back then,
06:26
which was the advertising model.
06:29
And so therefore, Google
was born free, with ads,
06:31
Facebook was born free, with ads.
06:35
Now in the beginning, it was cute,
06:39
like with the very earliest Google.
06:43
(Laughter)
06:45
The ads really were kind of ads.
06:46
They would be, like,
your local dentist or something.
06:49
But there's a thing called Moore's law
06:51
that makes the computers
more and more efficient and cheaper.
06:53
Their algorithms get better.
06:57
We actually have universities
where people study them,
06:58
and they get better and better.
07:01
And the customers and other entities
who use these systems
07:03
just got more and more experienced
and got cleverer and cleverer.
07:07
And what started out as advertising
07:11
really can't be called
advertising anymore.
07:14
It turned into behavior modification,
07:16
just as Norbert Wiener
had worried it might.
07:19
And so I can't call these things
social networks anymore.
07:24
I call them behavior modification empires.
07:28
(Applause)
07:32
And I refuse to vilify the individuals.
07:34
I have dear friends at these companies,
07:39
I even sold a company to Google, even though
I think it's one of these empires.
07:41
I don't think this is a matter
of bad people who've done a bad thing.
07:46
I think this is a matter
of a globally tragic,
07:51
astoundingly ridiculous mistake,
07:55
rather than a wave of evil.
08:00
Let me give you
just another layer of detail
08:04
into how this particular
mistake functions.
08:07
So with behaviorism,
08:11
you give the creature,
whether it's a rat or a dog or a person,
08:14
little treats and sometimes
little punishments
08:19
as feedback to what they do.
08:22
So if you have an animal in a cage,
it might be candy and electric shocks.
08:24
But if you have a smartphone,
08:30
it's not those things,
it's symbolic punishment and reward.
08:33
Pavlov, one of the early behaviorists,
08:40
demonstrated the famous principle.
08:42
You could train a dog to salivate
just with the bell, just with the symbol.
08:45
So on social networks,
08:49
social punishment and social reward
function as the punishment and reward.
08:51
And we all know
the feeling of these things.
08:56
You get this little thrill --
08:58
"Somebody liked my stuff
and it's being repeated."
08:59
Or the punishment:
"Oh my God, they don't like me,
09:02
maybe somebody else
is more popular, oh my God."
09:04
So you have those two
very common feelings,
09:06
and they're doled out in such a way
that you get caught in this loop.
09:09
As has been publicly acknowledged
by many of the founders of the system,
09:12
everybody knew this is what was going on.
09:16
But here's the thing:
09:19
traditionally, in the academic study
of the methods of behaviorism,
09:21
there have been comparisons
of positive and negative stimuli.
09:26
In this setting, a commercial setting,
09:32
there's a new kind of difference
09:34
that has kind of evaded
the academic world for a while,
09:36
and that difference
is that, whether or not positive stimuli
09:39
are more effective than negative ones
in different circumstances,
09:43
the negative ones are cheaper.
09:46
They're the bargain stimuli.
09:48
So what I mean by that is it's much easier
09:50
to lose trust than to build trust.
09:56
It takes a long time to build love.
09:59
It takes a short time to ruin love.
10:02
Now the customers of these
behavior modification empires
10:05
are on a very fast loop.
10:10
They're almost like
high-frequency traders.
10:11
They're getting feedback
from their spending
10:13
or whatever their activities are
if they're not spending,
10:15
and they see what's working,
and then they do more of that.
10:18
And so they're getting the quick feedback,
10:21
which means they're responding
more to the negative emotions,
10:23
because those are the ones
that rise faster, right?
10:26
And so therefore,
even well-intentioned players
10:30
who think all they're doing
is advertising toothpaste
10:34
end up advancing the cause
of the negative people,
10:37
the negative emotions, the cranks,
10:40
the paranoids,
10:42
the cynics, the nihilists.
10:44
Those are the ones who get
amplified by the system.
10:47
And you can't pay one of these companies
to make the world suddenly nice
10:50
and improve democracy
10:56
nearly as easily as you can pay
to ruin those things.
10:57
And so this is the dilemma
we've gotten ourselves into.
11:01
The alternative is to turn back the clock,
with great difficulty,
11:05
and remake that decision.
11:11
Remaking it would mean two things.
11:13
It would mean first that many people,
those who could afford to,
11:18
would actually pay for these things.
11:21
You'd pay for search,
you'd pay for social networking.
11:24
How would you pay?
Maybe with a subscription fee,
11:28
maybe with micro-payments as you use them.
11:32
There's a lot of options.
11:34
If some of you are recoiling,
and you're thinking,
11:36
"Oh my God, I would never pay
for these things.
11:39
How could you ever get anyone to pay?"
11:41
I want to remind you
of something that just happened.
11:43
Around this same time
11:46
that companies like Google and Facebook
were formulating their free idea,
11:48
a lot of cyber culture
also believed that in the future,
11:54
televisions and movies
would be created in the same way,
11:59
kind of like the Wikipedia.
12:02
But then, companies
like Netflix, Amazon, HBO,
12:04
said, "Actually, you know, subscribe.
We'll give you great TV."
12:09
And it worked!
12:13
We now are in this period
called "peak TV," right?
12:14
So sometimes when you pay for stuff,
things get better.
12:18
We can imagine a hypothetical --
12:22
(Applause)
12:25
We can imagine a hypothetical world
of "peak social media."
12:29
What would that be like?
12:33
It would mean when you get on,
you can get really useful,
12:34
authoritative medical advice
instead of cranks.
12:37
It could mean when you want
to get factual information,
12:41
there's not a bunch of weird,
paranoid conspiracy theories.
12:44
We can imagine this wonderful
other possibility.
12:47
Ah.
12:52
I dream of it. I believe it's possible.
12:53
I'm certain it's possible.
12:55
And I'm certain that the companies,
the Googles and the Facebooks,
12:58
would actually do better in this world.
13:03
I don't believe we need
to punish Silicon Valley.
13:05
We just need to remake the decision.
13:09
Of the big tech companies,
13:12
it's really only two that depend
on behavior modification and spying
13:14
as their business plan.
13:20
It's Google and Facebook.
13:21
(Laughter)
13:23
And I love you guys.
13:24
Really, I do. Like, the people
are fantastic.
13:26
I want to point out, if I may,
13:30
if you look at Google,
13:33
they can propagate cost centers
endlessly with all of these companies,
13:34
but they cannot propagate profit centers.
13:39
They cannot diversify,
because they're hooked.
13:41
They're hooked on this model,
just like their own users.
13:45
They're in the same trap as their users,
13:47
and you can't run
a big corporation that way.
13:50
So this is ultimately totally
in the benefit of the shareholders
13:52
and other stakeholders of these companies.
13:56
It's a win-win solution.
13:58
It'll just take some time
to figure it out.
14:01
A lot of details to work out,
14:03
totally doable.
14:05
(Laughter)
14:07
I don't believe our species
can survive unless we fix this.
14:10
We cannot have a society
14:14
in which, if two people
wish to communicate,
14:16
the only way that can happen
is if it's financed by a third person
14:19
who wishes to manipulate them.
14:22
(Applause)
14:25
(Applause ends)
14:35
In the meantime,
if the companies won't change,
14:36
delete your accounts, OK?
14:39
(Laughter)
14:41
(Applause)
14:42
That's enough for now.
14:43
Thank you so much.
14:45
(Applause)
14:46


About the speaker:

Jaron Lanier - Scientist, musician, visual artist, author

Why you should listen

Jaron Lanier is interested in the idea that virtual reality might help us notice the magic of ordinary reality and the idea that paying people for the data that is now taken from them might be the best path to a sustainable, dignified future economy.

Lanier's 1980s start-up created the first commercial VR products and introduced avatars, multi-person virtual world experiences and prototypes of major VR applications such as surgical simulation. Lanier coined or popularized the terms "virtual reality" and "mixed reality." He recently released a new book, Dawn of the New Everything: Encounters with Reality and Virtual Reality. Both of his previous books, Who Owns the Future? and You Are Not a Gadget are international bestsellers. His most recent book (May 29, 2018) is entitled Ten Arguments for Deleting Your Social Media Accounts Right Now.

Lanier was a mainstay of the earliest TED conferences; he still thinks of TED as a charming little gathering.
