ABOUT THE SPEAKER
César Hidalgo - Physicist
César Hidalgo studies how teams, cities and nations learn.

Why you should listen

César A. Hidalgo leads the Collective Learning group at The MIT Media Lab and is an associate professor of media arts and sciences at MIT. Hidalgo's work focuses on understanding how teams, organizations, cities and nations learn. At the Collective Learning group, Hidalgo develops software tools to facilitate learning in organizations. His academic publications have been cited more than 10,000 times, and his online systems, including the Observatory of Economic Complexity and DataUSA, have received more than 100 million views and numerous awards.

Hidalgo's latest book, Why Information Grows (2015), has been translated into 10+ languages. He is also the co-author of The Atlas of Economic Complexity (2014) and a co-founder of Datawheel LLC, a company that has professionalized the creation of large data visualization engines.

TED2018

César Hidalgo: A bold idea to replace politicians

2,044,328 views

César Hidalgo has a radical suggestion for fixing our broken political system: automate it! In this provocative talk, he outlines a bold idea to bypass politicians by empowering citizens to create personalized AI representatives that participate directly in democratic decisions. Explore a new way to make collective decisions and expand your understanding of democracy.


00:13 Is it just me, or are there other people here that are a little bit disappointed with democracy?

00:20 (Applause)
00:24 So let's look at a few numbers. If we look across the world, the median turnout in presidential elections over the last 30 years has been just 67 percent. Now, if we go to Europe and we look at people that participated in EU parliamentary elections, the median turnout in those elections is just 42 percent. Now let's go to New York, and let's see how many people voted in the last election for mayor. We will find that only 24 percent of people showed up to vote.
01:01 What that means is that, if "Friends" was still running, Joey and maybe Phoebe would have shown up to vote.

01:07 (Laughter)
01:09 And you cannot blame them, because people are tired of politicians. And people are tired of other people using the data that they have generated to communicate with their friends and family, to target political propaganda at them. But the thing about this is that this is not new. Nowadays, people use likes to target propaganda at you before they use your zip code or your gender or your age, because the idea of targeting people with propaganda for political purposes is as old as politics.
01:37 And the reason why that idea is there is because democracy has a basic vulnerability. This is the idea of a representative. In principle, democracy is the ability of people to exert power. But in practice, we have to delegate that power to a representative that can exert that power for us. That representative is a bottleneck, or a weak spot. It is the place that you want to target if you want to attack democracy, because you can capture democracy by either capturing that representative or capturing the way that people choose it.
02:10 So the big question is: Is this the end of history? Is this the best that we can do or, actually, are there alternatives?
02:22 Some people have been thinking about alternatives, and one of the ideas that is out there is the idea of direct democracy. This is the idea of bypassing politicians completely and having people vote directly on issues, having people vote directly on bills. But this idea is naive, because there are too many things that we would need to choose.
02:40 If you look at the 114th US Congress, you will see that the House of Representatives considered more than 6,000 bills, the Senate considered more than 3,000 bills and they approved more than 300 laws. Those would be many decisions that each person would have to make every week, on topics that they know little about. So there's a big cognitive bandwidth problem if we're going to try to think about direct democracy as a viable alternative.
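To put a number on that claim: a Congress sits for two years, so the talk's bill counts alone imply dozens of yes-or-no decisions per citizen per week. A back-of-the-envelope sketch in Python, where the bill counts come from the talk but the 104-week session and the assumption that every bill reaches a vote are mine:

```python
# Rough scale of direct democracy's cognitive load. Bill counts are
# from the talk (114th Congress); the 104-week session is assumed.
house_bills = 6000
senate_bills = 3000
weeks_per_congress = 2 * 52

decisions_per_week = (house_bills + senate_bills) / weeks_per_congress
print(f"~{decisions_per_week:.0f} decisions per week per citizen")  # ~87
```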
03:08 So some people think about the idea of liquid democracy, or fluid democracy, which is the idea that you endorse your political power to someone, who can endorse it to someone else, and, eventually, you create a large follower network in which, at the end, there's a few people that are making decisions on behalf of all of their followers and their followers.
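The follower networks he describes are easy to make concrete. Here is a minimal sketch of liquid democracy's delegation chains, using a hypothetical toy electorate (a real system would also have to handle delegation cycles, which this sketch ignores):

```python
# Minimal sketch of liquid democracy: each voter either votes
# directly (None) or endorses their power to someone else.
# Toy, hypothetical electorate; delegation cycles are not handled.
delegations = {
    "ana": "bo",     # ana endorses bo
    "bo": "carla",   # bo endorses carla
    "carla": None,   # carla votes directly
    "dev": "carla",
    "eli": None,
}

def final_delegate(voter: str) -> str:
    # Follow the chain of endorsements until someone votes directly.
    while delegations[voter] is not None:
        voter = delegations[voter]
    return voter

weights: dict[str, int] = {}
for voter in delegations:
    rep = final_delegate(voter)
    weights[rep] = weights.get(rep, 0) + 1

print(weights)  # {'carla': 4, 'eli': 1}: a few people decide for everyone
```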
03:28 But this idea also doesn't solve the problem of the cognitive bandwidth and, to be honest, it's also quite similar to the idea of having a representative.
03:36 So what I'm going to do today is I'm going to be a little bit provocative, and I'm going to ask you, well: What if, instead of trying to bypass politicians, we tried to automate them?
03:57 The idea of automation is not new. It started more than 300 years ago, when French weavers decided to automate the loom. The winner of that industrial war was Joseph-Marie Jacquard. He was a French weaver and merchant who married the loom with the steam engine to create autonomous looms. And in those autonomous looms, he gained control. He could now make fabrics that were more complex and more sophisticated than the ones that could be made by hand. But also, by winning that industrial war, he laid out what has become the blueprint of automation.
04:34 The way that we have automated things for the last 300 years has always been the same: we first identify a need, then we create a tool to satisfy that need, like the loom, in this case, and then we study how people use that tool to automate that user. That's how we came from the mechanical loom to the autonomous loom, and that took us a thousand years. Now, it's taken us only a hundred years to use the same script to automate the car.
05:05 But the thing is that, this time around, automation is kind of for real. This is a video that a colleague of mine from Toshiba shared with me that shows the factory that manufactures solid state drives. The entire factory is a robot. There are no humans in that factory. And the robots are soon to leave the factories and become part of our world, become part of our workforce.
05:27 So what I do in my day job is actually create tools that integrate data for entire countries, so that we can ultimately have the foundations that we need for a future in which we need to also manage those machines. But today, I'm not here to talk to you about these tools that integrate data for countries. I'm here to talk to you about another idea that might help us think about how to use artificial intelligence in democracy.
05:53 Because the tools that I build are designed for executive decisions. These are decisions that can be cast in some sort of term of objectivity -- public investment decisions. But there are decisions that are legislative, and these decisions that are legislative require communication among people that have different points of view, require participation, require debate, require deliberation.
06:18 And for a long time, we have thought that, well, what we need to improve democracy is actually more communication. So all of the technologies that we have advanced in the context of democracy, whether they are newspapers or whether it is social media, have tried to provide us with more communication. But we've been down that rabbit hole, and we know that's not what's going to solve the problem. Because it's not a communication problem, it's a cognitive bandwidth problem. So if the problem is one of cognitive bandwidth, well, adding more communication to people is not going to be what's going to solve it. What we are going to need instead is to have other technologies that help us deal with some of the communication that we are overloaded with.
06:58 Think of, like, a little avatar, a software agent, a digital Jiminy Cricket --

07:03 (Laughter)

07:05 that basically is able to answer things on your behalf.
07:09 And if we had that technology, we would be able to offload some of the communication and help, maybe, make better decisions or decisions at a larger scale. And the thing is that the idea of software agents is also not new. We already use them all the time. We use software agents to choose the way that we're going to drive to a certain location, the music that we're going to listen to or to get suggestions for the next books that we should read.
07:37 So there is an obvious idea in the 21st century, one as obvious as the idea of putting together a steam engine with a loom was at the time of Jacquard. And that idea is combining direct democracy with software agents.
07:54 Imagine, for a second, a world in which, instead of having a representative that represents you and millions of other people, you can have a representative that represents only you, with your nuanced political views -- that weird combination of libertarian and liberal, and maybe a little bit conservative on some issues and maybe very progressive on others. Politicians nowadays are packages, and they're full of compromises. But you might have someone that can represent only you, if you are willing to give up the idea that that representative is a human.
08:27 If that representative is a software agent, we could have a senate that has as many senators as we have citizens. And those senators are going to be able to read every bill and they're going to be able to vote on each one of them. So there's an obvious idea that maybe we want to consider.
08:42 But I understand that in this day and age, this idea might be quite scary. In fact, thinking of a robot coming from the future to help us run our governments sounds terrifying.

08:56 But we've been there before.

08:57 (Laughter)

08:59 And actually he was quite a nice guy.
09:03 So what would the Jacquard loom version of this idea look like? It would be a very simple system.
09:12 Imagine a system where you log in and you create your avatar, and then you start training your avatar. So you can provide your avatar with your reading habits, or connect it to your social media, or you can connect it to other data, for example by taking psychological tests. And the nice thing about this is that there's no deception. You are not providing data to communicate with your friends and family that then gets used in a political system. You are providing data to a system that is designed to be used to make political decisions on your behalf.
09:43 Then you take that data and you choose a training algorithm, because it's an open marketplace in which different people can submit different algorithms to predict how you're going to vote, based on the data you have provided. And the system is open, so nobody controls the algorithms; there are algorithms that become more popular and others that become less popular. Eventually, you can audit the system. You can see how your avatar is working. If you like it, you can leave it on autopilot. If you want to be a little more controlling, you can actually choose that it asks you every time it's going to make a decision, or you can be anywhere in between.
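As a thought experiment, the skeleton of such an avatar fits in a few lines. Here is a minimal sketch assuming scikit-learn, where the toy features, the logistic-regression choice and the confidence threshold are all illustrative assumptions, not anything the talk specifies:

```python
# Minimal sketch of a political avatar. Toy features and the model
# choice are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: each row scores a past bill on a few issue
# dimensions (e.g. spending, civil liberties, environment), and
# your_votes records how you actually voted on it.
past_bills = np.array([[0.9, 0.1, 0.2],
                       [0.2, 0.8, 0.9],
                       [0.1, 0.9, 0.8],
                       [0.8, 0.2, 0.1]])
your_votes = np.array([0, 1, 1, 0])  # 1 = yes, 0 = no

# The talk's "open marketplace" step: any model with the same
# fit/predict_proba interface could be swapped in here.
avatar = LogisticRegression().fit(past_bills, your_votes)

def decide(bill: np.ndarray, autopilot_threshold: float = 0.8) -> str:
    """Vote automatically only when confident; otherwise hand the
    decision back to the human ("anywhere in between")."""
    p_yes = avatar.predict_proba(bill.reshape(1, -1))[0, 1]
    if p_yes >= autopilot_threshold:
        return "vote yes"
    if p_yes <= 1 - autopilot_threshold:
        return "vote no"
    return "ask the human"

print(decide(np.array([0.15, 0.85, 0.9])))  # likely "vote yes"
```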
10:17 One of the reasons why we use democracy so little may be because democracy has a very bad user interface. And if we improve the user interface of democracy, we might be able to use it more.
10:28 Of course, there are a lot of questions that you might have. Well, how do you train these avatars? How do you keep the data secure? How do you keep the systems distributed and auditable? How about my grandmother, who's 80 years old and doesn't know how to use the internet? Trust me, I've heard them all.
10:46 So when you think about an idea like this, you have to beware of pessimists, because they are known to have a problem for every solution.

10:55 (Laughter)
10:57 So I want to invite you to think about the bigger ideas. The questions I just showed you are little ideas, because they are questions about how this would not work. The big ideas are ideas of: What else can you do with this if this happened to work? And one of those ideas is, well, who writes the laws?
11:17 In the beginning, we could have the avatars that we already have voting on laws that are written by the senators or politicians that we already have. But if this were to work, you could write an algorithm that could try to write a law that would get a certain percentage of approval, and you could reverse the process.
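That reversed process can at least be stated as a simple search loop: propose candidate drafts, poll the avatars, keep whichever draft wins the widest approval. A hypothetical sketch, in which the stand-in avatars and the one-dimensional "drafts" are toy assumptions and generating meaningful legal text is the part being waved away:

```python
# Sketch of "reversing the process": search for a draft law that
# maximizes approval among the avatars. Avatars and drafts are
# hypothetical stand-ins, not a real design.
import random

avatars = [random.uniform(0, 1) for _ in range(1000)]  # toy electorate

def avatar_votes_yes(avatar_position: float, draft: float) -> bool:
    # Stand-in avatar: approves drafts close to its own position.
    return abs(avatar_position - draft) < 0.3

def approval(draft: float) -> float:
    return sum(avatar_votes_yes(a, draft) for a in avatars) / len(avatars)

# Propose random drafts and keep the one with the widest approval.
best_draft = max((random.uniform(0, 1) for _ in range(200)), key=approval)
print(f"best draft {best_draft:.2f} wins {approval(best_draft):.0%} approval")
```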
11:38 Now, you might think that this idea is ludicrous and we should not do it, but you cannot deny that it's an idea that is only possible in a world in which direct democracy and software agents are a viable form of participation.
11:52 So how do we start the revolution?
11:56 We don't start this revolution with picket fences or protests or by demanding that our current politicians be changed into robots. That's not going to work. This is much simpler, much slower and much more humble.
12:09 We start this revolution by creating simple systems like this in grad schools, in libraries, in nonprofits. And we try to figure out all of those little questions and those little problems that we're going to have to figure out to make this idea something viable, to make this idea something that we can trust.
12:26 And as we create those systems that have a hundred people, a thousand people, a hundred thousand people voting in ways that are not politically binding, we're going to develop trust in this idea, the world is going to change, and those that are as little as my daughter is right now are going to grow up. And by the time my daughter is my age, maybe this idea, that I know today is very crazy, might not be crazy to her and to her friends.
12:53 And at that point, we will be at the end of our history, but they will be at the beginning of theirs.

13:01 Thank you.

13:02 (Applause)
