ABOUT THE SPEAKER
Daniel Suarez - Sci-fi author
Daniel Suarez concocts thrilling reads from terrifying (and not-so-farfetched) near-future scenarios.

Why you should listen

While working as a software developer, Daniel Suarez self-published Daemon, a cyber-thriller depicting a future where society is radically reshaped by disruptive technologies. It struck a chord -- and so did the sequel, Freedom (TM) -- rocketing Suarez into the pantheon of sci-fi prophets.

In his 2012 novel Kill Decision, Suarez digs into the consequences of technology that’s here to stay: autonomous bots and drones programmed to be lethal. Suarez argues that as we cede more control to software, we gamble with the very essence of democracy itself. How can we establish sane guidelines for technology that could easily outstrip our control?

More profile about the speaker
Daniel Suarez | Speaker | TED.com
TEDGlobal 2013

Daniel Suarez: The kill decision shouldn't belong to a robot

1,946,706 views

As a novelist, Daniel Suarez spins dystopian tales of the future. But on the TEDGlobal stage, he talks us through a real-life scenario we all need to know more about: the rise of autonomous robotic weapons of war. Advanced drones, automated weapons and AI-powered intelligence-gathering tools, he suggests, could take the decision to make war out of the hands of humans.


00:12
I write fiction: sci-fi thrillers,
00:15
so if I say "killer robots," you'd probably think something like this. But I'm actually not here to talk about fiction. I'm here to talk about very real killer robots, autonomous combat drones.

00:29
Now, I'm not referring to Predator and Reaper drones, which have a human making targeting decisions. I'm talking about fully autonomous robotic weapons that make lethal decisions about human beings all on their own. There's actually a technical term for this: lethal autonomy.

00:48
Now, lethally autonomous killer robots would take many forms -- flying, driving, or just lying in wait. And actually, they're very quickly becoming a reality. These are two automatic sniper stations currently deployed in the DMZ between North and South Korea. Both of these machines are capable of automatically identifying a human target and firing on it, the one on the left at a distance of over a kilometer.

01:16
Now, in both cases, there's still a human in the loop to make that lethal firing decision, but it's not a technological requirement. It's a choice. And it's that choice that I want to focus on, because as we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war, but also changing our social landscape entirely, far from the battlefield. That's because the way humans resolve conflict shapes our social landscape.

01:52
And this has always been the case, throughout history. For example, these were state-of-the-art weapons systems in 1400 A.D. Now they were both very expensive to build and maintain, but with these you could dominate the populace, and the distribution of political power in feudal society reflected that. Power was focused at the very top.

02:12
And what changed? Technological innovation. Gunpowder, cannon. And pretty soon, armor and castles were obsolete, and it mattered less who you brought to the battlefield versus how many people you brought to the battlefield. And as armies grew in size, the nation-state arose as a political and logistical requirement of defense. And as leaders had to rely on more of their populace, they began to share power. Representative government began to form. So again, the tools we use to resolve conflict shape our social landscape.

02:48
Autonomous robotic weapons are such a tool, except that, by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend toward democracy.

03:09
Now, I think, knowing this, we can take decisive steps to preserve our democratic institutions, to do what humans do best, which is adapt. But time is a factor. Seventy nations are developing remotely-piloted combat drones of their own, and as you'll see, remotely-piloted combat drones are the precursors to autonomous robotic weapons.

03:33
That's because once you've deployed remotely-piloted drones, there are three powerful factors pushing decision-making away from humans and on to the weapon platform itself.

03:44
The first of these is the deluge of video that drones produce. For example, in 2004, the U.S. drone fleet produced a grand total of 71 hours of video surveillance for analysis. By 2011, this had gone up to 300,000 hours, outstripping human ability to review it all, but even that number is about to go up drastically. The Pentagon's Gorgon Stare and Argus programs will put up to 65 independently operated camera eyes on each drone platform, and this would vastly outstrip human ability to review it. And that means visual intelligence software will need to scan it for items of interest. And that means very soon drones will tell humans what to look at, not the other way around.

04:33
But there's a second powerful incentive pushing decision-making away from humans and onto machines, and that's electromagnetic jamming, severing the connection between the drone and its operator. Now we saw an example of this in 2011 when an American RQ-170 Sentinel drone got a bit confused over Iran due to a GPS spoofing attack, but any remotely-piloted drone is susceptible to this type of attack, and that means drones will have to shoulder more decision-making. They'll know their mission objective, and they'll react to new circumstances without human guidance. They'll ignore external radio signals and send very few of their own.

05:21
Which brings us to, really, the third and most powerful incentive pushing decision-making away from humans and onto weapons: plausible deniability. Now we live in a global economy. High-tech manufacturing is occurring on most continents. Cyber espionage is spiriting away advanced designs to parts unknown,
05:45
and in that environment, it is very likely that a successful drone design will be knocked off in contract factories and proliferate on the gray market.
05:54
And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent that weapon.

06:04
This raises the very real possibility of anonymous war. This could tilt the geopolitical balance on its head, make it very difficult for a nation to turn its firepower against an attacker, and that could shift the balance in the 21st century away from defense and toward offense. It could make military action a viable option not just for small nations, but criminal organizations, private enterprise, even powerful individuals. It could create a landscape of rival warlords undermining rule of law and civil society.

06:40
Now if responsibility and transparency are two of the cornerstones of representative government, autonomous robotic weapons could undermine both.

06:50
Now you might be thinking that citizens of high-tech nations would have the advantage in any robotic war, that citizens of those nations would be less vulnerable, particularly against developing nations. But I think the truth is the exact opposite. I think citizens of high-tech societies are more vulnerable to robotic weapons, and the reason can be summed up in one word: data.

07:19
Data powers high-tech societies. Cell phone geolocation, telecom metadata, social media, email, text, financial transaction data, transportation data, it's a wealth of real-time data on the movements and social interactions of people. In short, we are more visible to machines than any people in history, and this perfectly suits the targeting needs of autonomous weapons.

07:47
What you're looking at here is a link analysis map of a social group. Lines indicate social connectedness between individuals. And these types of maps can be automatically generated based on the data trail modern people leave behind. Now it's typically used to market goods and services to targeted demographics, but it's a dual-use technology, because targeting is used in another context.

08:14
Notice that certain individuals are highlighted. These are the hubs of social networks. These are organizers, opinion-makers, leaders, and these people also can be automatically identified from their communication patterns. Now, if you're a marketer, you might then target them with product samples, try to spread your brand through their social group. But if you're a repressive government searching for political enemies, you might instead remove them, eliminate them, disrupt their social group, and those who remain behind lose social cohesion and organization.
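The hub identification Suarez describes can be illustrated in a few lines of graph code. This is a minimal sketch, not any real link-analysis product: the people and call records are invented, and it uses simple degree centrality (how many distinct contacts each person has), where real tools draw on far richer centrality measures.

```python
from collections import Counter

# Hypothetical communication records: (caller, callee) pairs.
# In the scenario Suarez describes, these would come from telecom metadata.
contacts = [
    ("ana", "bo"), ("ana", "cy"), ("ana", "di"), ("ana", "ed"),
    ("bo", "cy"), ("di", "ed"), ("fay", "ana"), ("fay", "gil"),
]

# Degree centrality: count how many connections each person appears in.
degree = Counter()
for a, b in contacts:
    degree[a] += 1
    degree[b] += 1

# The "hubs" are simply the best-connected individuals.
hubs = [person for person, _ in degree.most_common(2)]
print(hubs)  # "ana" tops the list in this toy network
```

The unsettling point of the talk is precisely that nothing more sophisticated than this counting is needed to surface the organizers of a group from its data trail.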
08:51
Now in a world of cheap, proliferating robotic weapons, borders would offer very little protection to critics of distant governments or trans-national criminal organizations. Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass. And ideas achieving critical mass is what political activism in popular government is all about. Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests. And this would put a chill on free speech and popular political action, the very heart of democracy.

09:36
And this is why we need an international treaty on robotic weapons, and in particular a global ban on the development and deployment of killer robots. Now we already have international treaties on nuclear and biological weapons, and, while imperfect, these have largely worked. But robotic weapons might be every bit as dangerous, because they will almost certainly be used, and they would also be corrosive to our democratic institutions.

10:07
Now in November 2012 the U.S. Department of Defense issued a directive requiring a human being be present in all lethal decisions.
10:18
This effectively, if temporarily, banned autonomous weapons in the U.S. military, but that directive needs to be made permanent.
10:26
And it could set the stage for global action. Because we need an international legal framework for robotic weapons. And we need it now, before there's a devastating attack or a terrorist incident that causes nations of the world to rush to adopt these weapons before thinking through the consequences. Autonomous robotic weapons concentrate too much power in too few hands, and they would imperil democracy itself.

10:58
Now, don't get me wrong, I think there are tons of great uses for unarmed civilian drones: environmental monitoring, search and rescue, logistics. If we have an international treaty on robotic weapons, how do we gain the benefits of autonomous drones and vehicles while still protecting ourselves against illegal robotic weapons? I think the secret will be transparency. No robot should have an expectation of privacy in a public place.

11:31
(Applause)

11:36
Each robot and drone should have a cryptographically signed I.D. burned in at the factory that can be used to track its movement through public spaces. We have license plates on cars, tail numbers on aircraft. This is no different.
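The signed-ID idea can be sketched in a few lines. This is a toy illustration, not a proposal from the talk's text: the serial numbers are invented, and it uses an HMAC tag from Python's standard library as a stand-in for a true signature; a real scheme would burn in an asymmetric signature (e.g. Ed25519) so that verifiers never need to hold the signing secret.

```python
import hmac, hashlib, secrets

# Hypothetical factory signing key. Kept secret by the manufacturer;
# in a real asymmetric scheme only a public key would be distributed.
FACTORY_KEY = secrets.token_bytes(32)

def burn_in_id(serial: str) -> tuple[str, str]:
    """Sign a drone's serial number at the factory."""
    tag = hmac.new(FACTORY_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return serial, tag

def verify_id(serial: str, tag: str) -> bool:
    """Check a broadcast drone ID against the factory key."""
    expected = hmac.new(FACTORY_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

serial, tag = burn_in_id("DRONE-00417")   # invented serial for illustration
print(verify_id(serial, tag))             # True: a genuine ID verifies
print(verify_id("DRONE-99999", tag))      # False: a spoofed serial does not
```

The point of such an ID is the one the talk makes: like a license plate or tail number, it lets observers attribute a machine in public space to a registered owner.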
11:49
And every citizen should be able to download an app that shows the population of drones and autonomous vehicles moving through public spaces around them, both right now and historically.

12:00
And civic leaders should deploy sensors and civic drones to detect rogue drones, and instead of sending killer drones of their own up to shoot them down, they should alert humans to their presence.
12:12
And in certain very high-security areas, perhaps civic drones would snare them and drag them off to a bomb disposal facility. But notice, this is more an immune system than a weapons system. It would allow us to avail ourselves of the use of autonomous vehicles and drones while still preserving our open, civil society.

12:32
We must ban the deployment and development of killer robots. Let's not succumb to the temptation to automate war. Autocratic governments and criminal organizations undoubtedly will, but let's not join them. Autonomous robotic weapons would concentrate too much power in too few unseen hands, and that would be corrosive to representative government. Let's make sure, for democracies at least, killer robots remain fiction.

13:03
Thank you.

13:04
(Applause)

13:09
Thank you. (Applause)
Translated by Joseph Geni
Reviewed by Morton Bast
