ABOUT THE SPEAKER
Rodney Brooks - Roboticist
Rodney Brooks builds robots based on biological principles of movement and reasoning. The goal: a robot who can figure things out.

Why you should listen

Former MIT professor Rodney Brooks studies and engineers robot intelligence, looking for the holy grail of robotics: the AGI, or artificial general intelligence. For decades, we've been building robots to do highly specific tasks -- welding, riveting, delivering interoffice mail -- but what we all want, really, is a robot that can figure things out on its own, the way we humans do.

Brooks realized that a top-down approach -- just building the biggest brain possible and teaching it everything we could think of -- would never work. What would work is a robot who learns like we do, by trial and error, and with many separate parts that learn separate jobs. The thesis of his work was captured in Fast, Cheap and Out of Control, which went on to become the title of the great Errol Morris documentary.

A founder of iRobot, makers of the Roomba vacuum, Brooks is now founder and CTO of Rethink Robotics, whose mission is to apply advanced robotic intelligence to manufacturing and physical labor. Its first robots: the versatile two-armed Baxter and one-armed Sawyer. Brooks is the former director of CSAIL, MIT's Computer Science and Artificial Intelligence Laboratory.

TED2003

Rodney Brooks: Robots will invade our lives

689,918 views

In this prophetic talk from 2003, roboticist Rodney Brooks discusses how robots will come to be part of our lives -- from toy robots to household robots and beyond.


00:18
What I want to tell you about today is how I see robots invading our lives
00:23
at multiple levels, over multiple timescales.
00:26
And when I look out in the future, I can't imagine a world, 500 years from now,
00:30
where we don't have robots everywhere.
00:32
Assuming -- despite all the dire predictions from many people about our future --
00:37
assuming we're still around, I can't imagine the world not being populated with robots.
00:41
And then the question is, well, if they're going to be here in 500 years,
00:44
are they going to be everywhere sooner than that?
00:46
Are they going to be around in 50 years?
00:48
Yeah, I think that's pretty likely -- there's going to be lots of robots everywhere.
00:51
And in fact I think that's going to be a lot sooner than that.
00:54
I think we're sort of on the cusp of robots becoming common,
00:58
and I think we're sort of around 1978 or 1980 in personal computer years,
01:04
where the first few robots are starting to appear.
01:07
Computers sort of came around through games and toys.
01:11
And you know, the first computer most people had in the house
01:14
may have been a computer to play Pong,
01:16
a little microprocessor embedded,
01:18
and then other games that came after that.
01:21
And we're starting to see that same sort of thing with robots:
01:24
LEGO Mindstorms, Furbies -- who here -- did anyone here have a Furby?
01:28
Yeah, there's 38 million of them sold worldwide.
01:31
They are pretty common. And they're a little tiny robot,
01:33
a simple robot with some sensors,
01:35
a little bit of processing actuation.
01:37
On the right there is another robot doll, who you could get a couple of years ago.
01:40
And just as in the early days,
01:42
when there was a lot of sort of amateur interaction over computers,
01:47
you can now get various hacking kits, how-to-hack books.
01:51
And on the left there is a platform from Evolution Robotics,
01:55
where you put a PC on, and you program this thing with a GUI
01:58
to wander around your house and do various stuff.
02:01
And then there's a higher price point sort of robot toys --
02:04
the Sony Aibo. And on the right there, is one that the NEC developed,
02:08
the PaPeRo, which I don't think they're going to release.
02:11
But nevertheless, those sorts of things are out there.
02:14
And we've seen, over the last two or three years, lawn-mowing robots,
02:18
Husqvarna on the bottom, Friendly Robotics on top there, an Israeli company.
02:24
And then in the last 12 months or so
02:26
we've started to see a bunch of home-cleaning robots appear.
02:30
The top left one is a very nice home-cleaning robot
02:33
from a company called Dyson, in the U.K. Except it was so expensive --
02:37
3,500 dollars -- they didn't release it.
02:39
But at the bottom left, you see Electrolux, which is on sale.
02:42
Another one from Karcher.
02:44
At the bottom right is one that I built in my lab
02:46
about 10 years ago, and we finally turned that into a product.
02:49
And let me just show you that.
02:51
We're going to give this away I think, Chris said, after the talk.
02:55
This is a robot that you can go out and buy, and that will clean up your floor.
03:05
And it starts off sort of just going around in ever-increasing circles.
03:10
If it hits something -- you people see that?
03:14
Now it's doing wall-following, it's following around my feet
03:17
to clean up around me. Let's see, let's --
03:21
oh, who stole my Rice Krispies? They stole my Rice Krispies!
03:26
(Laughter)
03:32
Don't worry, relax, no, relax, it's a robot, it's smart!
03:35
(Laughter)
03:38
See, the three-year-old kids, they don't worry about it.
03:42
It's grown-ups that get really upset.
03:44
(Laughter)
03:45
We'll just put some crap here.
03:47
(Laughter)
03:51
Okay.
03:53
(Laughter)
03:57
I don't know if you see -- so, I put a bunch of Rice Krispies there,
04:00
I put some pennies, let's just shoot it at that, see if it cleans up.
04:10
Yeah, OK. So --
04:12
we'll leave that for later.
04:16
(Applause)
04:22
Part of the trick was building a better cleaning mechanism, actually;
04:26
the intelligence on board was fairly simple.
04:30
And that's true with a lot of robots.
04:32
We've all, I think, become, sort of computational chauvinists,
04:36
and think that computation is everything,
04:38
but the mechanics still matter.
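The coverage behavior demonstrated with the cleaning robot -- spiral outward in ever-increasing circles until a bump, then switch to wall-following -- can be sketched as a tiny behavior-based state machine. This is only an illustrative sketch: the function name, states, and constants are invented here, not the actual product firmware.

```python
# Toy sketch of the coverage behavior from the demo: spiral outward until
# the bumper fires, then wall-follow. All names and constants are invented
# for illustration.

def coverage_step(state, radius, bumped):
    """One control tick.

    Returns (new_state, new_radius, (left_speed, right_speed)).
    Speeds are unitless; a bigger left/right difference means a tighter turn.
    """
    if state == "spiral":
        if bumped:
            return "wall_follow", radius, (0.2, 0.5)  # turn along the obstacle
        radius += 0.01                                # widen the spiral each tick
        return "spiral", radius, (1.0, 1.0 / (1.0 + radius))
    # wall-following: arc gently toward the wall, turn away again on contact
    if bumped:
        return "wall_follow", radius, (0.2, 0.5)
    return "wall_follow", radius, (0.5, 0.45)
```

The point of the sketch is how little "intelligence" the behavior needs: two states, one bump sensor, and differential wheel speeds, which is consistent with the remark that the on-board intelligence was fairly simple.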
04:40
Here's another robot, the PackBot,
04:43
that we've been building for a bunch of years.
04:45
It's a military surveillance robot, to go in ahead of troops --
04:51
looking at caves, for instance.
04:54
But we had to make it fairly robust,
04:56
much more robust than the robots we build in our labs.
05:03
(Laughter)
05:12
On board that robot is a PC running Linux.
05:16
It can withstand a 400G shock. The robot has local intelligence:
05:22
it can flip itself over, can get itself into communication range,
05:28
can go upstairs by itself, et cetera.
05:38
Okay, so it's doing local navigation there.
05:42
A soldier gives it a command to go upstairs, and it does.
05:49
That was not a controlled descent.
05:52
(Laughter)
05:54
Now it's going to head off.
05:56
And the big breakthrough for these robots, really, was September 11th.
06:01
We had the robots down at the World Trade Center late that evening.
06:06
Couldn't do a lot in the main rubble pile,
06:08
things were just too -- there was nothing left to do.
06:11
But we did go into all the surrounding buildings that had been evacuated,
06:16
and searched for possible survivors in the buildings
06:19
that were too dangerous to go into.
06:21
Let's run this video.
06:23
Reporter: ...battlefield companions are helping to reduce the combat risks.
06:26
Nick Robertson has that story.
06:31
Rodney Brooks: Can we have another one of these?
06:38
Okay, good.
06:43
So, this is a corporal who had seen a robot two weeks previously.
06:48
He's sending robots into caves, looking at what's going on.
06:52
The robot's being totally autonomous.
06:54
The worst thing that's happened in the cave so far
06:58
was one of the robots fell down ten meters.
07:08
So one year ago, the US military didn't have these robots.
07:11
Now they're on active duty in Afghanistan every day.
07:13
And that's one of the reasons they say a robot invasion is happening.
07:16
There's a sea change happening in how -- where technology's going.
07:20
Thanks.
07:23
And over the next couple of months,
07:25
we're going to be sending robots in production
07:28
down producing oil wells to get that last few years of oil out of the ground.
07:32
Very hostile environments, 150˚ C, 10,000 PSI.
07:36
Autonomous robots going down, doing this sort of work.
07:40
But robots like this, they're a little hard to program.
07:43
How, in the future, are we going to program our robots
07:45
and make them easier to use?
07:47
And I want to actually use a robot here --
07:50
a robot named Chris -- stand up. Yeah. Okay.
07:57
Come over here. Now notice, he thinks robots have to be a bit stiff.
08:01
He sort of does that. But I'm going to --
08:04
Chris Anderson: I'm just British. RB: Oh.
08:06
(Laughter)
08:08
(Applause)
08:10
I'm going to show this robot a task. It's a very complex task.
08:13
Now notice, he nodded there, he was giving me some indication
08:16
he was understanding the flow of communication.
08:19
And if I'd said something completely bizarre
08:21
he would have looked askance at me, and regulated the conversation.
08:24
So now I brought this up in front of him.
08:27
I'd looked at his eyes, and I saw his eyes looked at this bottle top.
08:31
And I'm doing this task here, and he's checking up.
08:33
His eyes are going back and forth up to me, to see what I'm looking at --
08:36
so we've got shared attention.
08:38
And so I do this task, and he looks, and he looks to me
08:41
to see what's happening next. And now I'll give him the bottle,
08:45
and we'll see if he can do the task. Can you do that?
08:47
(Laughter)
08:50
Okay. He's pretty good. Yeah. Good, good, good.
08:54
I didn't show you how to do that.
08:56
Now see if you can put it back together.
08:58
(Laughter)
09:00
And he thinks a robot has to be really slow.
09:01
Good robot, that's good.
09:03
So we saw a bunch of things there.
09:06
We saw when we're interacting,
09:09
we're trying to show someone how to do something, we direct their visual attention.
09:13
The other thing communicates their internal state to us,
09:17
whether he's understanding or not, regulates a social interaction.
09:20
There was shared attention looking at the same sort of thing,
09:22
and recognizing socially communicated reinforcement at the end.
09:26
And we've been trying to put that into our lab robots
09:29
because we think this is how you're going to want to interact with robots in the future.
09:33
I just want to show you one technical diagram here.
09:35
The most important thing for building a robot that you can interact with socially
09:39
is its visual attention system.
09:41
Because what it pays attention to is what it's seeing
09:44
and interacting with, and what you're understanding what it's doing.
09:47
So in the videos I'm about to show you,
09:50
you're going to see a visual attention system on a robot
09:54
which has -- it looks for skin tone in HSV space,
09:58
so it works across all human colorings.
10:02
It looks for highly saturated colors, from toys.
10:04
And it looks for things that move around.
10:06
And it weights those together into an attention window,
10:09
and it looks for the highest-scoring place --
10:11
the stuff where the most interesting stuff is happening --
10:13
and that is what its eyes then segue to.
10:17
And it looks right at that.
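The weighting scheme just described -- skin-tone, saturated-color, and motion feature maps combined into one attention window, with the gaze going to the highest-scoring spot -- can be sketched roughly as follows. The dictionary-based feature maps and the weight values are stand-in assumptions; the real system computes these maps from camera frames.

```python
# Rough sketch of the weighted visual-attention scheme described above.
# Each feature map scores image locations in [0, 1]; the maps are combined
# with adjustable weights and the gaze target is the highest-scoring spot.
# Feature extraction itself is assumed, not implemented.

def attention_target(skin, saturation, motion,
                     w_skin=1.0, w_sat=1.0, w_motion=1.0):
    """Each feature map is {location: score}. Returns the winning location."""
    locations = set(skin) | set(saturation) | set(motion)

    def combined(loc):
        return (w_skin * skin.get(loc, 0.0)
                + w_sat * saturation.get(loc, 0.0)
                + w_motion * motion.get(loc, 0.0))

    return max(locations, key=combined)
```

Changing the weights is where top-down state would feed in: a "lonely" robot raises the skin-tone weight, a "bored" one raises the saturated-color weight, and the same scene yields a different gaze target.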
10:19
At the same time, some top-down sort of stuff:
10:22
might decide that it's lonely and look for skin tone,
10:25
or might decide that it's bored and look for a toy to play with.
10:28
And so these weights change.
10:30
And over here on the right,
10:32
this is what we call the Steven Spielberg memorial module.
10:35
Did people see the movie "AI"? (Audience: Yes.)
10:37
RB: Yeah, it was really bad, but --
10:39
remember, especially when Haley Joel Osment, the little robot,
10:43
looked at the blue fairy for 2,000 years without taking his eyes off it?
10:47
Well, this gets rid of that,
10:49
because this is a habituation Gaussian that gets negative,
10:53
and more and more intense as it looks at one thing.
10:56
And it gets bored, so it will then look away at something else.
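The habituation idea -- a Gaussian-shaped penalty that deepens the longer the robot stares at one thing, until some other target wins -- might look like this in miniature. The time constant and the exact shape are illustrative assumptions, not Kismet's actual parameters.

```python
import math

# Miniature of the habituation mechanism described above: the effective
# interest in a target is its base score minus a penalty that rises with a
# Gaussian-shaped profile the longer the gaze stays fixed on it.
# tau is an arbitrary illustrative time constant.

def habituated_interest(base_score, fixation_time, tau=5.0):
    """Return base_score reduced by habituation after fixation_time seconds."""
    penalty = 1.0 - math.exp(-((fixation_time / tau) ** 2))
    return base_score - penalty
```

After enough fixation the penalty saturates near 1, so even a strong target eventually scores below a fresh, mediocre one, and the eyes move on instead of staring for 2,000 years.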
10:59
So, once you've got that -- and here's a robot, here's Kismet,
11:03
looking around for a toy. You can tell what it's looking at.
11:07
You can estimate its gaze direction from those eyeballs covering its camera,
11:12
and you can tell when it's actually seeing the toy.
11:15
And it's got a little bit of an emotional response here.
11:17
(Laughter)
11:18
But it's still going to pay attention
11:20
if something more significant comes into its field of view --
11:24
such as Cynthia Breazeal, the builder of this robot, from the right.
11:28
It sees her, pays attention to her.
11:33
Kismet has an underlying, three-dimensional emotional space,
11:37
a vector space, of where it is emotionally.
11:40
And at different places in that space, it expresses --
11:46
can we have the volume on here?
11:48
Can you hear that now, out there? (Audience: Yeah.)
11:50
Kismet: Do you really think so? Do you really think so?
11:57
Do you really think so?
12:00
RB: So it's expressing its emotion through its face
12:03
and the prosody in its voice.
12:05
And when I was dealing with my robot over here,
12:09
Chris, the robot, was measuring the prosody in my voice,
12:12
and so we have the robot measure prosody for four basic messages
12:17
that mothers give their children pre-linguistically.
12:21
Here we've got naive subjects praising the robot:
12:26
Voice: Nice robot.
12:29
You're such a cute little robot.
12:31
(Laughter)
12:33
RB: And the robot's reacting appropriately.
12:35
Voice: ...very good, Kismet.
12:40
(Laughter)
12:42
Voice: Look at my smile.
12:46
RB: It smiles. She imitates the smile. This happens a lot.
12:49
These are naive subjects.
12:51
Here we asked them to get the robot's attention
12:54
and indicate when they have the robot's attention.
12:57
Voice: Hey, Kismet, ah, there it is.
13:01
RB: So she realizes she has the robot's attention.
13:08
Voice: Kismet, do you like the toy? Oh.
13:13
RB: Now, here they're asked to prohibit the robot,
13:15
and this first woman really pushes the robot into an emotional corner.
13:19
Voice: No. No. You're not to do that. No.
13:24
(Laughter)
13:27
Not appropriate. No. No.
13:33
(Laughter)
13:36
RB: I'm going to leave it at that.
13:38
We put that together. Then we put in turn taking.
13:40
When we talk to someone, we talk.
13:43
Then we sort of raise our eyebrows, move our eyes,
13:47
give the other person the idea it's their turn to talk.
13:50
And then they talk, and then we pass the baton back and forth between each other.
13:54
So we put this in the robot.
13:56
We got a bunch of naive subjects in,
13:58
we didn't tell them anything about the robot,
14:00
sat them down in front of the robot and said, talk to the robot.
14:02
Now what they didn't know was,
14:04
the robot wasn't understanding a word they said,
14:06
and that the robot wasn't speaking English.
14:09
It was just saying random English phonemes.
14:11
And I want you to watch carefully, at the beginning of this,
14:13
where this person, Ritchie, who happened to talk to the robot for 25 minutes --
14:17
(Laughter)
14:19
-- says, "I want to show you something.
14:21
I want to show you my watch."
14:23
And he brings the watch center, into the robot's field of vision,
14:28
points to it, gives it a motion cue,
14:30
and the robot looks at the watch quite successfully.
14:32
We don't know whether he understood or not that the robot --
14:36
Notice the turn-taking.
14:38
Ritchie: OK, I want to show you something. OK, this is a watch
14:41
that my girlfriend gave me.
14:44
Robot: Oh, cool.
14:46
Ritchie: Yeah, look, it's got a little blue light in it too. I almost lost it this week.
14:51
(Laughter)
14:55
RB: So it's making eye contact with him, following his eyes.
14:58
Ritchie: Can you do the same thing? Robot: Yeah, sure.
15:00
RB: And they successfully have that sort of communication.
15:02
And here's another aspect of the sorts of things that Chris and I were doing.
15:06
This is another robot, Cog.
15:08
They first make eye contact, and then, when Christie looks over at this toy,
15:14
the robot estimates her gaze direction
15:16
and looks at the same thing that she's looking at.
15:18
(Laughter)
15:19
So we're going to see more and more of this sort of robot
15:22
over the next few years in labs.
15:24
But then the big questions, two big questions that people ask me are:
15:29
if we make these robots more and more human-like,
15:31
will we accept them, will we -- will they need rights eventually?
15:36
And the other question people ask me is, will they want to take over?
15:39
(Laughter)
15:40
And on the first -- you know, this has been a very Hollywood theme
15:43
with lots of movies. You probably recognize these characters here --
15:46
where in each of these cases, the robots want more respect.
15:50
Well, do you ever need to give robots respect?
15:54
They're just machines, after all.
15:56
But I think, you know, we have to accept that we are just machines.
16:00
After all, that's certainly what modern molecular biology says about us.
16:05
You don't see a description of how, you know,
16:08
Molecule A, you know, comes up and docks with this other molecule.
16:12
And it's moving forward, you know, propelled by various charges,
16:15
and then the soul steps in and tweaks those molecules so that they connect.
16:19
It's all mechanistic. We are mechanism.
16:22
If we are machines, then in principle at least,
16:25
we should be able to build machines out of other stuff,
16:29
which are just as alive as we are.
16:33
But I think for us to admit that,
16:35
we have to give up on our special-ness, in a certain way.
16:38
And we've had the retreat from special-ness
16:40
under the barrage of science and technology many times
16:43
over the last few hundred years, at least.
16:45
500 years ago we had to give up the idea
16:47
that we are the center of the universe
16:50
when the earth started to go around the sun;
16:52
150 years ago, with Darwin, we had to give up the idea we were different from animals.
16:57
And to imagine -- you know, it's always hard for us.
17:00
Recently we've been battered with the idea that maybe
17:03
we didn't even have our own creation event, here on earth,
17:05
which people didn't like much. And then the human genome said,
17:08
maybe we only have 35,000 genes. And that was really --
17:11
people didn't like that, we've got more genes than that.
17:14
We don't like to give up our special-ness, so, you know,
17:17
having the idea that robots could really have emotions,
17:19
or that robots could be living creatures --
17:21
I think is going to be hard for us to accept.
17:23
But we're going to come to accept it over the next 50 years or so.
17:27
And the second question is, will the machines want to take over?
17:30
And here the standard scenario is that we create these things,
17:35
they grow, we nurture them, they learn a lot from us,
17:38
and then they start to decide that we're pretty boring, slow.
17:42
They want to take over from us.
17:44
And for those of you that have teenagers, you know what that's like.
17:47
(Laughter)
17:48
But Hollywood extends it to the robots.
17:51
And the question is, you know,
17:54
will someone accidentally build a robot that takes over from us?
17:58
And that's sort of like this lone guy in the backyard,
18:01
you know -- "I accidentally built a 747."
18:04
I don't think that's going to happen.
18:06
And I don't think --
18:08
(Laughter)
18:09
-- I don't think we're going to deliberately build robots
18:12
that we're uncomfortable with.
18:14
We'll -- you know, they're not going to have a super bad robot.
18:16
Before that has to come to be a mildly bad robot,
18:19
and before that a not so bad robot.
18:21
(Laughter)
18:22
And we're just not going to let it go that way.
18:24
(Laughter)
18:25
So, I think I'm going to leave it at that: the robots are coming,
18:31
we don't have too much to worry about, it's going to be a lot of fun,
18:34
and I hope you all enjoy the journey over the next 50 years.
18:38
(Applause)
