ABOUT THE SPEAKER
Kate Darling - Robot ethicist
Kate Darling investigates the relationship between humans and robots -- now, and in the future.

Why you should listen

Robots are taking increasingly bigger roles in life and business -- moving well beyond manufacturing and into transportation, education, medicine, care for the elderly, our offices and our homes. But ethics and law haven't caught up. Kate Darling, a pioneer in these fields, is helping quicken the pace.

A leading expert in robot ethics, Darling is a researcher at the Massachusetts Institute of Technology (MIT) Media Lab where she investigates social robotics and conducts experimental studies on human-robot interaction. Darling explores the emotional connection between people and life-like inventions, seeking to influence technology design and policy direction. Her writing and research anticipate difficult questions that lawmakers, engineers and the wider public must address as human-robot relationships evolve in the coming decades.

While her background is in intellectual property and law and economics, Darling's passion for technology and robots led her to interdisciplinary fields. She's a former Fellow and current Affiliate at the Harvard Berkman Klein Center for Internet & Society and a former fellow at the Yale Information Society Project, and she's an affiliate at the Institute for Ethics and Emerging Technologies. 
Her work has been featured in Vogue, The New Yorker, The Guardian, BBC, NPR, PBS, The Boston Globe, Forbes, CBC, WIRED, Boston Magazine, The Atlantic, Slate, Die Zeit, The Japan Times and more. She is a contributing writer to Robohub and IEEE Spectrum and speaks and holds workshops covering some of the more interesting developments in the world of robotics, and where we might find ourselves in the future.

Darling graduated from law school with honors and holds a doctorate of sciences from the Swiss Federal Institute of Technology (ETH Zurich) and an honorary doctorate of sciences from Middlebury College. In 2017, the American Bar Association honored her legal work with the Mark T. Banner award in Intellectual Property. She is the caretaker for several domestic robots, including her Pleos, Yochai, Peter, and Mr. Spaghetti. She tweets as @grok_ about eating Cheerios for dinner.

TED Salon Samsung

Kate Darling: Why we have an emotional connection to robots


2,403,010 views

We're still far from developing robots that can feel emotions, but we already have feelings for them, says robot ethicist Kate Darling, and an instinct like that can have consequences. Learn more about how our innate nature leads us to project intent and life onto machines -- and how it might help us better understand ourselves.

00:13
There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one has really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry. And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does."

00:55
So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.

01:18
And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down. (Laughter)
01:35
But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot?

01:56
And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.
02:19
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield.

02:54
Now, what would cause a hardened military officer and someone like myself to have this response to robots?
03:03
Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us.

03:23
So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner. (Laughter) It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch. (Laughter)
03:54
And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts. (Laughter) So from this, but from many other studies, we know, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.
04:33
Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals. Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.

05:21
And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything.
05:35
But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous.

05:58
But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.
06:25
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care."

06:50
And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.
07:15
Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases? (Laughter) Because robots plus capitalism equals questions around consumer protection and privacy.

07:42
And those aren't the only reasons that our behavior around these machines could matter.
07:48
A few years after that first initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots. (Laughter)
08:16
And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot." (Laughter) And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot. (Laughter)
09:03
So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.

09:33
Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots.
09:45
But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?" Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog?

10:10
And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles?
10:39
We don't know ...

10:42
But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us. And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.
11:14
Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.

11:45
Thank you.

11:46
(Applause)
Translated by Wilde Luo
Reviewed by Nan-Kun Wu
