ABOUT THE SPEAKER
Kate Darling - Robot ethicist
Kate Darling investigates the relationship between humans and robots -- now, and in the future.

Why you should listen

Robots are taking increasingly bigger roles in life and business -- moving well beyond manufacturing and into transportation, education, medicine, care for the elderly, our offices and our homes. But ethics and law haven't caught up. Kate Darling, a pioneer in these fields, is helping quicken the pace.

A leading expert in robot ethics, Darling is a researcher at the Massachusetts Institute of Technology (MIT) Media Lab where she investigates social robotics and conducts experimental studies on human-robot interaction. Darling explores the emotional connection between people and life-like inventions, seeking to influence technology design and policy direction. Her writing and research anticipate difficult questions that lawmakers, engineers and the wider public must address as human-robot relationships evolve in the coming decades.

While her background is in intellectual property and law and economics, Darling's passion for technology and robots led her to interdisciplinary fields. She's a former fellow and current affiliate at the Harvard Berkman Klein Center for Internet & Society, a former fellow at the Yale Information Society Project, and an affiliate at the Institute for Ethics and Emerging Technologies.
Her work has been featured in Vogue, The New Yorker, The Guardian, BBC, NPR, PBS, The Boston Globe, Forbes, CBC, WIRED, Boston Magazine, The Atlantic, Slate, Die Zeit, The Japan Times and more. She is a contributing writer to Robohub and IEEE Spectrum and speaks and holds workshops covering some of the more interesting developments in the world of robotics, and where we might find ourselves in the future.

Darling graduated from law school with honors and holds a doctorate of sciences from the Swiss Federal Institute of Technology (ETH Zurich) and an honorary doctorate of sciences from Middlebury College. In 2017, the American Bar Association honored her legal work with the Mark T. Banner Award in Intellectual Property. She is the caretaker for several domestic robots, including her Pleos: Yochai, Peter and Mr. Spaghetti. She tweets as @grok_ about eating Cheerios for dinner.

TED Salon Samsung

Kate Darling: Why we have an emotional connection to robots

Filmed:
2,403,010 views

Robot ethicist Kate Darling points out that we are far from developing robots that can feel, but we already have feelings toward robots, and that instinct can have consequences. In this talk, learn how we are biologically hardwired to project intent and life onto machines -- and how robots might help us better understand ourselves.

00:13
There was a day, about 10 years ago,
00:15
when I asked a friend to hold a baby dinosaur robot upside down.
00:21
It was this toy called a Pleo that I had ordered,
00:25
and I was really excited about it because I've always loved robots.
00:29
And this one has really cool technical features.
00:32
It had motors and touch sensors
00:34
and it had an infrared camera.
00:36
And one of the things it had was a tilt sensor,
00:39
so it knew what direction it was facing.
00:42
And when you held it upside down,
00:44
it would start to cry.
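[What Darling describes here is, mechanically, a very simple sensor-to-behavior loop: a tilt sensor reports the robot's orientation, and a distress routine fires once the reading crosses an upside-down threshold. Below is a minimal illustrative sketch in Python; it is not Pleo's actual firmware, and the sensor function, threshold value, and sound routines are assumptions made purely for illustration.]

# Illustrative sketch of a tilt-triggered "cry" response (assumed API, not Pleo firmware).
import time

UPSIDE_DOWN_THRESHOLD = 120  # pitch in degrees treated as "held upside down" (assumed value)

def read_pitch_degrees():
    """Stand-in for a real tilt-sensor read: 0 = upright, 180 = fully inverted."""
    return 150.0  # real hardware would supply this reading

def play_cry():
    print("robot: *whimpers and cries*")

def play_idle_chirp():
    print("robot: *content chirp*")

for _ in range(10):  # poll the sensor a few times
    if read_pitch_degrees() > UPSIDE_DOWN_THRESHOLD:
        play_cry()  # the distress behavior described in the talk
    else:
        play_idle_chirp()  # normal behavior when upright
    time.sleep(0.5)

[Even a loop this simple is enough to produce the reaction the rest of the talk explores.]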
00:46
And I thought this was super cool, so I was showing it off to my friend,
00:50
and I said, "Oh, hold it up by the tail. See what it does."
00:55
So we're watching the theatrics of this robot
00:58
struggle and cry out.
01:02
And after a few seconds,
01:04
it starts to bother me a little,
01:07
and I said, "OK, that's enough now.
01:11
Let's put him back down."
01:14
And then I pet the robot to make it stop crying.
01:18
And that was kind of a weird experience for me.
01:22
For one thing, I wasn't the most maternal person at the time.
01:26
Although since then I've become a mother, nine months ago,
01:29
and I've learned that babies also squirm when you hold them upside down.
01:32
(Laughter)
01:35
But my response to this robot was also interesting
01:37
because I knew exactly how this machine worked,
01:41
and yet I still felt compelled to be kind to it.
01:46
And that observation sparked a curiosity
01:49
that I've spent the past decade pursuing.
01:52
Why did I comfort this robot?
01:56
And one of the things I discovered was that my treatment of this machine
01:59
was more than just an awkward moment in my living room,
02:03
that in a world where we're increasingly integrating robots into our lives,
02:09
an instinct like that might actually have consequences,
02:13
because the first thing that I discovered is that it's not just me.
02:19
In 2007, the Washington Post reported that the United States military
02:24
was testing this robot that defused land mines.
02:27
And the way it worked was it was shaped like a stick insect
02:30
and it would walk around a minefield on its legs,
02:32
and every time it stepped on a mine, one of the legs would blow up,
02:36
and it would continue on the other legs to blow up more mines.
02:39
And the colonel who was in charge of this testing exercise
02:43
ends up calling it off,
02:45
because, he says, it's too inhumane
02:47
to watch this damaged robot drag itself along the minefield.
02:54
Now, what would cause a hardened military officer
02:58
and someone like myself
03:00
to have this response to robots?
03:03
Well, of course, we're primed by science fiction and pop culture
03:06
to really want to personify these things,
03:09
but it goes a little bit deeper than that.
03:12
It turns out that we're biologically hardwired to project intent and life
03:17
onto any movement in our physical space that seems autonomous to us.
03:23
So people will treat all sorts of robots like they're alive.
03:26
These bomb-disposal units get names.
03:29
They get medals of honor.
03:31
They've had funerals for them with gun salutes.
03:34
And research shows that we do this even with very simple household robots,
03:38
like the Roomba vacuum cleaner.
03:40
(Laughter)
03:41
It's just a disc that roams around your floor to clean it,
03:44
but just the fact it's moving around on its own
03:47
will cause people to name the Roomba
03:49
and feel bad for the Roomba when it gets stuck under the couch.
03:52
(Laughter)
03:54
And we can design robots specifically to evoke this response,
03:57
using eyes and faces or movements
04:01
that people automatically, subconsciously associate
04:04
with states of mind.
04:06
And there's an entire body of research called human-robot interaction
04:09
that really shows how well this works.
04:11
So for example, researchers at Stanford University found out
04:14
that it makes people really uncomfortable
04:16
when you ask them to touch a robot's private parts.
04:19
(Laughter)
04:21
So from this, but from many other studies,
04:23
we know, we know that people respond to the cues given to them
04:27
by these lifelike machines,
04:29
even if they know that they're not real.
04:33
Now, we're headed towards a world where robots are everywhere.
04:37
Robotic technology is moving out from behind factory walls.
04:40
It's entering workplaces, households.
04:43
And as these machines that can sense and make autonomous decisions and learn
04:50
enter into these shared spaces,
04:52
I think that maybe the best analogy we have for this
04:55
is our relationship with animals.
04:57
Thousands of years ago, we started to domesticate animals,
05:01
and we trained them for work and weaponry and companionship.
05:05
And throughout history, we've treated some animals like tools or like products,
05:10
and other animals, we've treated with kindness
05:12
and we've given a place in society as our companions.
05:15
I think it's plausible we might start to integrate robots in similar ways.
05:21
And sure, animals are alive.
05:24
Robots are not.
05:27
And I can tell you, from working with roboticists,
05:30
that we're pretty far away from developing robots that can feel anything.
05:35
But we feel for them,
05:37
and that matters,
05:39
because if we're trying to integrate robots into these shared spaces,
05:42
we need to understand that people will treat them differently than other devices,
05:47
and that in some cases,
05:49
for example, the case of a soldier who becomes emotionally attached
05:52
to the robot that they work with,
05:54
that can be anything from inefficient to dangerous.
05:58
But in other cases, it can actually be useful
06:00
to foster this emotional connection to robots.
06:04
We're already seeing some great use cases,
06:06
for example, robots working with autistic children
06:08
to engage them in ways that we haven't seen previously,
06:12
or robots working with teachers to engage kids in learning with new results.
06:17
And it's not just for kids.
06:19
Early studies show that robots can help doctors and patients
06:22
in health care settings.
06:25
This is the PARO baby seal robot.
06:27
It's used in nursing homes and with dementia patients.
06:30
It's been around for a while.
06:32
And I remember, years ago, being at a party
06:35
and telling someone about this robot,
06:38
and her response was,
06:40
"Oh my gosh.
06:42
That's horrible.
06:45
I can't believe we're giving people robots instead of human care."
06:50
And this is a really common response,
06:52
and I think it's absolutely correct,
06:54
because that would be terrible.
06:57
But in this case, it's not what this robot replaces.
07:00
What this robot replaces is animal therapy
07:04
in contexts where we can't use real animals
07:07
but we can use robots,
07:08
because people will consistently treat them more like an animal than a device.
07:15
Acknowledging this emotional connection to robots
07:17
can also help us anticipate challenges
07:19
as these devices move into more intimate areas of people's lives.
07:24
For example, is it OK if your child's teddy bear robot
07:27
records private conversations?
07:29
Is it OK if your sex robot has compelling in-app purchases?
07:33
(Laughter)
07:35
Because robots plus capitalism
07:37
equals questions around consumer protection and privacy.
07:42
And those aren't the only reasons
07:44
that our behavior around these machines could matter.
07:48
A few years after that first initial experience I had
07:52
with this baby dinosaur robot,
07:54
I did a workshop with my friend Hannes Gassert.
07:56
And we took five of these baby dinosaur robots
07:59
and we gave them to five teams of people.
08:02
And we had them name them
08:04
and play with them and interact with them for about an hour.
08:08
And then we unveiled a hammer and a hatchet
08:10
and we told them to torture and kill the robots.
08:13
(Laughter)
08:16
And this turned out to be a little more dramatic
08:19
than we expected it to be,
08:20
because none of the participants would even so much as strike
08:23
these baby dinosaur robots,
08:24
so we had to improvise a little, and at some point, we said,
08:30
"OK, you can save your team's robot if you destroy another team's robot."
08:34
(Laughter)
08:36
And even that didn't work. They couldn't do it.
08:39
So finally, we said,
08:40
"We're going to destroy all of the robots
08:42
unless someone takes a hatchet to one of them."
08:45
And this guy stood up, and he took the hatchet,
08:49
and the whole room winced as he brought the hatchet down
08:51
on the robot's neck,
08:53
and there was this half-joking, half-serious moment of silence in the room
09:00
for this fallen robot.
09:01
(Laughter)
09:03
So that was a really interesting experience.
09:06
Now, it wasn't a controlled study, obviously,
09:09
but it did lead to some later research that I did at MIT
09:12
with Palash Nandy and Cynthia Breazeal,
09:14
where we had people come into the lab and smash these HEXBUGs
09:18
that move around in a really lifelike way, like insects.
09:21
So instead of choosing something cute that people are drawn to,
09:24
we chose something more basic,
09:26
and what we found was that high-empathy people
09:30
would hesitate more to hit the HEXBUGs.
09:33
Now this is just a little study,
09:35
but it's part of a larger body of research
09:37
that is starting to indicate that there may be a connection
09:40
between people's tendencies for empathy
09:42
and their behavior around robots.
09:45
But my question for the coming era of human-robot interaction
09:49
is not: "Do we empathize with robots?"
09:53
It's: "Can robots change people's empathy?"
09:57
Is there reason to, for example,
09:59
prevent your child from kicking a robotic dog,
10:03
not just out of respect for property,
10:06
but because the child might be more likely to kick a real dog?
10:10
And again, it's not just kids.
10:13
This is the violent video games question, but it's on a completely new level
10:17
because of this visceral physicality that we respond more intensely to
10:22
than to images on a screen.
10:25
When we behave violently towards robots,
10:28
specifically robots that are designed to mimic life,
10:31
is that a healthy outlet for violent behavior
10:35
or is that training our cruelty muscles?
10:39
We don't know ...
10:42
But the answer to this question has the potential to impact human behavior,
10:46
it has the potential to impact social norms,
10:49
it has the potential to inspire rules around what we can and can't do
10:53
with certain robots,
10:54
similar to our animal cruelty laws.
10:57
Because even if robots can't feel,
11:00
our behavior towards them might matter for us.
11:04
And regardless of whether we end up changing our rules,
11:08
robots might be able to help us come to a new understanding of ourselves.
11:14
Most of what I've learned over the past 10 years
11:16
has not been about technology at all.
11:18
It's been about human psychology
11:21
and empathy and how we relate to others.
11:25
Because when a child is kind to a Roomba,
11:29
when a soldier tries to save a robot on the battlefield,
11:33
or when a group of people refuses to harm a robotic baby dinosaur,
11:38
those robots aren't just motors and gears and algorithms.
11:42
They're reflections of our own humanity.
11:45
Thank you.
11:46
(Applause)