ABOUT THE SPEAKER
Kate Darling - Robot ethicist
Kate Darling investigates the relationship between humans and robots -- now, and in the future.

Why you should listen

Robots are taking increasingly bigger roles in life and business -- moving well beyond manufacturing and into transportation, education, medicine, care for the elderly, our offices and our homes. But ethics and law haven't caught up. Kate Darling, a pioneer in these fields, is helping quicken the pace.

A leading expert in robot ethics, Darling is a researcher at the Massachusetts Institute of Technology (MIT) Media Lab where she investigates social robotics and conducts experimental studies on human-robot interaction. Darling explores the emotional connection between people and life-like inventions, seeking to influence technology design and policy direction. Her writing and research anticipate difficult questions that lawmakers, engineers and the wider public must address as human-robot relationships evolve in the coming decades.

While her background is in intellectual property and law and economics, Darling's passion for technology and robots led her to interdisciplinary fields. She's a former Fellow and current Affiliate at the Harvard Berkman Klein Center for Internet & Society and a former fellow at the Yale Information Society Project, and she's an affiliate at the Institute for Ethics and Emerging Technologies. 
Her work has been featured in Vogue, The New Yorker, The Guardian, BBC, NPR, PBS, The Boston Globe, Forbes, CBC, WIRED, Boston Magazine, The Atlantic, Slate, Die Zeit, The Japan Times and more. She is a contributing writer to Robohub and IEEE Spectrum and speaks and holds workshops covering some of the more interesting developments in the world of robotics, and where we might find ourselves in the future.

Darling graduated from law school with honors and holds a doctorate of sciences from the Swiss Federal Institute of Technology (ETH Zurich) and an honorary doctorate of sciences from Middlebury College. In 2017, the American Bar Association honored her legal work with the Mark T. Banner award in Intellectual Property. She is the caretaker for several domestic robots, including her Pleos, Yochai, Peter, and Mr. Spaghetti. She tweets as @grok_ about eating Cheerios for dinner.

TED Salon Samsung

Kate Darling: Why we have an emotional connection to robots

Filmed:
2,403,010 views

We're far from developing robots that feel emotions, but we already have feelings towards them, says robot ethicist Kate Darling, and an instinct like that can have consequences. Learn more about how we're biologically hardwired to project intent and life onto machines -- and how it might help us better understand ourselves.

00:13
There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one had really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry. And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does."

00:55
So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.

01:18
And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down. (Laughter)

01:35
But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot?

01:56
And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.

02:19
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield.

02:54
Now, what would cause a hardened military officer and someone like myself to have this response to robots? Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us. So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner. (Laughter) It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch. (Laughter)

03:54
And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts. (Laughter) So from this, and from many other studies, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.

04:33
Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals. Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given them a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.

05:21
And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous.

05:58
But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.

06:25
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.

07:15
Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases? (Laughter) Because robots plus capitalism equals questions around consumer protection and privacy.

07:42
And those aren't the only reasons that our behavior around these machines could matter. A few years after that initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots. (Laughter)

08:16
And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot." (Laughter) And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot. (Laughter)

09:03
So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.

09:33
Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots.

09:45
But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?" Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog? And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles? We don't know ...

10:42
But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us. And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.

11:14
Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.

11:45
Thank you.

11:46
(Applause)

Translated by Joseph Geni
Reviewed by Krystian Aparta
