ABOUT THE SPEAKER
Leila Takayama - Social scientist
Leila Takayama conducts research on human-robot interaction.

Why you should listen

Leila Takayama is an acting associate professor of Psychology at the University of California, Santa Cruz, where she founded and leads the Re-Embodied Cognition Lab. Her lab examines how people make sense of, interact with, and relate to new technologies. Prior to academia, she was a researcher at GoogleX and Willow Garage, where she developed a taste for working alongside engineers, designers, animators, and more. Her interdisciplinary research continues in her current work on what happens when people interact with robots and through robots.

Takayama is a World Economic Forum Global Futures Council Member and Young Global Leader. In 2015, she was presented with the IEEE Robotics & Automation Society Early Career Award. In 2012, she was named a TR35 winner and one of the 100 most creative people in business by Fast Company. She completed her PhD in Communication at Stanford University in 2008, advised by Professor Clifford Nass. She also holds a PhD minor in Psychology from Stanford, a master's degree in Communication from Stanford, and bachelor of arts degrees in Psychology and Cognitive Science from UC Berkeley (2003). During her graduate studies, she was a research assistant in the User Interface Research (UIR) group at Palo Alto Research Center (PARC).

Photo: Melissa DeWitt

TEDxPaloAlto

Leila Takayama: What's it like to be a robot?


1,183,118 views

We already live among robots: tools and machines like dishwashers and thermostats that are so integrated into our lives that it would never occur to us to call them robots. What will a future with more robots look like? Social scientist Leila Takayama shares some of the unique challenges of designing for human-robot interaction, and how experimenting with robotic futures can ultimately lead us to a better understanding of ourselves.


00:12
You only get one chance to make a first impression, and that's true if you're a robot as well as if you're a person. The first time that I met one of these robots was at a place called Willow Garage in 2008. When I went to visit there, my host walked me into the building and we met this little guy. He was rolling into the hallway, came up to me, sat there, stared blankly past me, did nothing for a while, rapidly spun his head around 180 degrees and then ran away. And that was not a great first impression.
00:42
The thing that I learned about robots that day is that they kind of do their own thing, and they're not totally aware of us. And I think as we're experimenting with these possible robot futures, we actually end up learning a lot more about ourselves as opposed to just these machines.
00:56
And what I learned that day was that I had pretty high expectations for this little dude. He was not only supposed to be able to navigate the physical world, but also be able to navigate my social world -- he's in my space; it's a personal robot. Why didn't it understand me?
01:11
My host explained to me, "Well, the robot is trying to get from point A to point B, and you were an obstacle in his way, so he had to replan his path, figure out where to go, and then get there some other way," which was actually not a very efficient thing to do. If that robot had figured out that I was a person, not a chair, and that I was willing to get out of its way if it was trying to get somewhere, then it actually would have been more efficient at getting its job done if it had bothered to notice that I was a human and that I have different affordances than things like chairs and walls do.
01:41
You know, we tend to think of these robots as being from outer space and from the future and from science fiction, and while that could be true, I'd actually like to argue that robots are here today, and they live and work amongst us right now.
01:54
These are two robots that live in my home. They vacuum the floors and they cut the grass every single day, which is more than I would do if I actually had time to do these tasks, and they probably do it better than I would, too. This one actually takes care of my kitty. Every single time he uses the box, it cleans it, which is not something I'm willing to do, and it actually makes his life better as well as mine.
02:16
And while we call these robot products -- it's a "robot vacuum cleaner, it's a robot lawnmower, it's a robot litter box" -- I think there's actually a bunch of other robots hiding in plain sight that have just become so darn useful and so darn mundane that we call them things like, "dishwasher," right? They get new names. They don't get called robot anymore because they actually serve a purpose in our lives. Similarly, a thermostat, right? I know my robotics friends out there are probably cringing at me calling this a robot, but it has a goal. Its goal is to make my house 66 degrees Fahrenheit, and it senses the world. It knows it's a little bit cold, it makes a plan and then it acts on the physical world. It's robotics. Even if it might not look like Rosie the Robot, it's doing something that's really useful in my life so I don't have to take care of turning the temperature up and down myself.
03:04
And I think these systems live and work amongst us now, and not only are these systems living amongst us but you are probably a robot operator, too. When you drive your car, it feels like you are operating machinery. You are also going from point A to point B, but your car probably has power steering, it probably has automatic braking systems, it might have an automatic transmission and maybe even adaptive cruise control. And while it might not be a fully autonomous car, it has bits of autonomy, and they're so useful and they make us drive safer, and we just sort of feel like they're invisible-in-use, right? So when you're driving your car, you should just feel like you're going from one place to another. It doesn't feel like it's this big thing that you have to deal with and operate and use these controls because we spent so long learning how to drive that they've become extensions of ourselves. When you park that car in that tight little garage space, you know where your corners are. And when you drive a rental car that maybe you haven't driven before, it takes some time to get used to your new robot body.
04:05
And this is also true for people who operate other types of robots, so I'd like to share with you a few stories about that.
04:12
Dealing with the problem of remote collaboration. So, at Willow Garage I had a coworker named Dallas, and Dallas looked like this. He worked from his home in Indiana in our company in California. He was a voice in a box on the table in most of our meetings, which was kind of OK except that, you know, if we had a really heated debate and we didn't like what he was saying, we might just hang up on him.

(Laughter)

Then we might have a meeting after that meeting and actually make the decisions in the hallway afterwards when he wasn't there anymore. So that wasn't so great for him. And as a robotics company at Willow, we had some extra robot body parts laying around, so Dallas and his buddy Curt put together this thing, which looks kind of like Skype on a stick on wheels, which seems like a techy, silly toy, but really it's probably one of the most powerful tools that I've seen ever made for remote collaboration.
04:59
So now, if I didn't answer Dallas' email question, he could literally roll into my office, block my doorway and ask me the question again --

(Laughter)

until I answered it. And I'm not going to turn him off, right? That's kind of rude. Not only was it good for these one-on-one communications, but also for just showing up at the company all-hands meeting. Getting your butt in that chair and showing people that you're present and committed to your project is a big deal and can help remote collaboration a ton. We saw this over the period of months and then years, not only at our company but at others, too.
05:32
The best thing that can happen with these systems is that it starts to feel like you're just there. It's just you, it's just your body, and so people actually start to give these things personal space. So when you're having a stand-up meeting, people will stand around the space just as they would if you were there in person. That's great until there's breakdowns and it's not. People, when they first see these robots, are like, "Wow, where's the components? There must be a camera over there," and they start poking your face. "You're talking too softly, I'm going to turn up your volume," which is like having a coworker walk up to you and say, "You're speaking too softly, I'm going to turn up your face." That's awkward and not OK, and so we end up having to build these new social norms around using these systems.
06:12
Similarly, as you start feeling like it's your body, you start noticing things like, "Oh, my robot is kind of short." Dallas would say things to me -- he was six-foot tall -- and we would take him via robot to cocktail parties and things like that, as you do, and the robot was about five-foot-tall, which is close to my height. And he would tell me, "You know, people are not really looking at me. I feel like I'm just looking at this sea of shoulders, and it's just -- we need a taller robot." And I told him, "Um, no. You get to walk in my shoes for today. You get to see what it's like to be on the shorter end of the spectrum." And he actually ended up building a lot of empathy for that experience, which was kind of great. So when he'd come visit in person, he no longer stood over me as he was talking to me, he would sit down and talk to me eye to eye, which was kind of a beautiful thing.
06:59
So we actually decided to look at this in the laboratory and see what other kinds of differences things like robot height would make. And so half of the people in our study used a shorter robot, half of the people in our study used a taller robot, and we actually found that the exact same person who has the exact same body and says the exact same things is more persuasive and perceived as being more credible if they're in a taller robot form. It makes no rational sense, but that's why we study psychology. And really, you know, the way that Cliff Nass would put this is that we're having to deal with these new technologies despite the fact that we have very old brains. Human psychology is not changing at the same speed that tech is and so we're always playing catch-up, trying to make sense of this world where these autonomous things are running around. Usually, things that talk are people, not machines, right? And so we breathe a lot of meaning into things like just the height of a machine, not a person, and attribute that to the person using the system.
07:55
You know, this, I think, is really important when you're thinking about robotics. It's not so much about reinventing humans, it's more about figuring out how we extend ourselves, right? And we end up using things in ways that are sort of surprising. So these guys can't play pool because the robots don't have arms, but they can heckle the guys who are playing pool and that can be an important thing for team bonding, which is kind of neat.

People who get really good at operating these systems will even do things like make up new games, like robot soccer in the middle of the night, pushing the trash cans around. But not everyone's good. A lot of people have trouble operating these systems. This is actually a guy who logged into the robot and his eyeball was turned 90 degrees to the left. He didn't know that, so he ended up just bashing around the office, running into people's desks, getting super embarrassed, laughing about it -- his volume was way too high. And this guy here in the image is telling me, "We need a robot mute button." And by that what he really meant was we don't want it to be so disruptive.
08:51
So as a robotics company, we added some obstacle avoidance to the system. It got a little laser range finder that could see the obstacles, and if I as a robot operator try to, say, run into a chair, it wouldn't let me, it would just plan a path around, which seems like a good idea. People did hit fewer obstacles using that system, obviously, but actually, for some of the people, it took them a lot longer to get through our obstacle course, and we wanted to know why.
09:17
It turns out that there's this important human dimension -- a personality dimension called locus of control, and people who have a strong internal locus of control, they need to be the masters of their own destiny -- really don't like giving up control to an autonomous system -- so much so that they will fight the autonomy; "If I want to hit that chair, I'm going to hit that chair." And so they would actually suffer from having that autonomous assistance, which is an important thing for us to know as we're building increasingly autonomous, say, cars, right? How are different people going to grapple with that loss of control? It's going to be different depending on human dimensions. We can't treat humans as if we're just one monolithic thing. We vary by personality, by culture, we even vary by emotional state moment to moment, and being able to design these systems, these human-robot interaction systems, we need to take into account the human dimensions, not just the technological ones.
10:11
Along with a sense of control also comes a sense of responsibility. And if you were a robot operator using one of these systems, this is what the interface would look like. It looks a little bit like a video game, which can be good because that's very familiar to people, but it can also be bad because it makes people feel like it's a video game.
10:29
We had a bunch of kids over at Stanford play with the system and drive the robot around our office in Menlo Park, and the kids started saying things like, "10 points if you hit that guy over there. 20 points for that one." And they would chase them down the hallway.

(Laughter)

I told them, "Um, those are real people. They're actually going to bleed and feel pain if you hit them." And they'd be like, "OK, got it." But five minutes later, they would be like, "20 points for that guy over there, he just looks like he needs to get hit."
10:55
It's a little bit like "Ender's Game," right? There is a real world on that other side and I think it's our responsibility as people designing these interfaces to help people remember that there's real consequences to their actions and to feel a sense of responsibility when they're operating these increasingly autonomous things.
11:13
These are kind of a great example of experimenting with one possible robotic future, and I think it's pretty cool that we can extend ourselves and learn about the ways that we extend ourselves into these machines while at the same time being able to express our humanity and our personality. We also build empathy for others in terms of being shorter, taller, faster, slower, and maybe even armless, which is kind of neat.
11:38
We also build empathy for the robots themselves. This is one of my favorite robots. It's called the Tweenbot. And this guy has a little flag that says, "I'm trying to get to this intersection in Manhattan," and it's cute and rolls forward, that's it. It doesn't know how to build a map, it doesn't know how to see the world, it just asks for help. The nice thing about people is that it can actually depend upon the kindness of strangers. It did make it across the park to the other side of Manhattan -- which is pretty great -- just because people would pick it up and point it in the right direction.

(Laughter)

And that's great, right?
12:11
We're trying to build this human-robot world in which we can coexist and collaborate with one another, and we don't need to be fully autonomous and just do things on our own. We actually do things together. And to make that happen, we actually need help from people like the artists and the designers, the policy makers, the legal scholars, psychologists, sociologists, anthropologists -- we need more perspectives in the room if we're going to do the thing that Stu Card says we should do, which is invent the future that we actually want to live in.
12:40
And I think we can continue to experiment with these different robotic futures together, and in doing so, we will end up learning a lot more about ourselves.
12:50
Thank you.

(Applause)
