ABOUT THE SPEAKER
Sam Rodriques - Neuroengineer
Sam Rodriques invents new technologies to help neuroscientists understand how the brain works.

Why you should listen

Sam Rodriques has designed a new nanofabrication method and a new approach to sensing neural activity with probes in the bloodstream, and his forthcoming technologies are aimed at understanding the spatial organization of cells in the brain and the transcriptional activity of neurons over time. He is a graduate student in Physics at the Massachusetts Institute of Technology, with affiliations at the MIT Media Lab, the MIT Department of Brain and Cognitive Sciences and the Broad Institute of Harvard and MIT. He graduated summa cum laude with highest honors in Physics from Haverford College, where he worked on new methods for calculating quantum entanglement in multipartite quantum systems. He has received numerous national awards and fellowships to support his research, including the Hertz Foundation Graduate Fellowship, an NSF Graduate Research Fellowship and a Churchill Scholarship.

More profile about the speaker
Sam Rodriques | Speaker | TED.com
TEDxBeaconStreet

Sam Rodriques: What we'll learn about the brain in the next century


1,621,715 views

In this imaginative talk, neuroengineer Sam Rodriques takes us on a thrilling tour of the next hundred years of brain science. He imagines the strange (and sometimes frightening) innovations that may be the key to understanding and treating brain disease, such as using lasers to drill tiny holes in the skull and slipping in probes to study the electrical activity of neurons.

00:13
I want to tell you guys something about neuroscience. I'm a physicist by training. About three years ago, I left physics to come and try to understand how the brain works. And this is what I found.

00:24
Lots of people are working on depression. And that's really good, depression is something that we really want to understand. Here's how you do it: you take a jar and you fill it up, about halfway, with water. And then you take a mouse, and you put the mouse in the jar, OK? And the mouse swims around for a little while and then at some point, the mouse gets tired and decides to stop swimming. And when it stops swimming, that's depression. OK?

00:52
And I'm from theoretical physics, so I'm used to people making very sophisticated mathematical models to precisely describe physical phenomena, so when I saw that this is the model for depression, I thought to myself, "Oh my God, we have a lot of work to do." (Laughter)

01:09
But this is a kind of general problem in neuroscience. So for example, take emotion. Lots of people want to understand emotion. But you can't study emotion in mice or monkeys, because you can't ask them how they're feeling or what they're experiencing. So instead, people who want to understand emotion typically end up studying what's called motivated behavior, which is code for "what the mouse does when it really, really wants cheese." OK, I could go on and on.

01:35
I mean, the point is, the NIH spends about 5.5 billion dollars a year on neuroscience research. And yet there have been almost no significant improvements in outcomes for patients with brain diseases in the past 40 years. And I think a lot of that is basically due to the fact that mice might be OK as a model for cancer or diabetes, but the mouse brain is just not sophisticated enough to reproduce human psychology or human brain disease. OK?

02:05
So if the mouse models are so bad, why are we still using them? Well, it basically boils down to this: the brain is made up of neurons, which are these little cells that send electrical signals to each other. If you want to understand how the brain works, you have to be able to measure the electrical activity of these neurons. But to do that, you have to get really close to the neurons with some kind of electrical recording device or a microscope. And so you can do that in mice and you can do it in monkeys, because you can physically put things into their brain, but for some reason we still can't do that in humans, OK?

02:40
So instead, we've invented all these proxies. So the most popular one is probably this, functional MRI, fMRI, which allows you to make these pretty pictures like this, that show which parts of your brain light up when you're engaged in different activities. But this is a proxy. You're not actually measuring neural activity here. What you're doing is you're measuring, essentially, like, blood flow in the brain. Where there's more blood. It's actually where there's more oxygen, but you get the idea, OK?

03:10
The other thing that you can do is you can do this -- electroencephalography -- you can put these electrodes on your head, OK? And then you can measure your brain waves. And here, you're actually measuring electrical activity. But you're not measuring the activity of neurons. You're measuring these electrical currents, sloshing back and forth in your brain.

03:30
So the point is just that these technologies that we have are really measuring the wrong thing. Because, for most of the diseases that we want to understand -- like, Parkinson's is the classic example. In Parkinson's, there's one particular kind of neuron deep in your brain that is responsible for the disease, and these technologies just don't have the resolution that you need to get at that. And so that's why we're still stuck with the animals. Not that anyone wants to be studying depression by putting mice into jars, right? It's just that there's this pervasive sense that it's not possible to look at the activity of neurons in healthy humans.

04:08
So here's what I want to do. I want to take you into the future, to have a look at one way in which I think it could potentially be possible. And I want to preface this by saying, I don't have all the details. So I'm just going to provide you with a kind of outline. But we're going to go to the year 2100.

04:27
Now what does the year 2100 look like? Well, to start with, the climate is a bit warmer than what you're used to. (Laughter) And that robotic vacuum cleaner that you know and love went through a few generations, and the improvements were not always so good. (Laughter) It was not always for the better.

04:52
But actually, in the year 2100 most things are surprisingly recognizable. It's just the brain is totally different.

05:00
For example, in the year 2100, we understand the root causes of Alzheimer's. So we can deliver targeted genetic therapies or drugs to stop the degenerative process before it begins. So how did we do it? Well, there were essentially three steps.

05:18
The first step was that we had to figure out some way to get electrical connections through the skull so we could measure the electrical activity of neurons. And not only that, it had to be easy and risk-free. Something that basically anyone would be OK with, like getting a piercing. Because back in 2017, the only way that we knew of to get through the skull was to drill these holes the size of quarters. You would never let someone do that to you.

05:48
So in the 2020s, people began to experiment -- rather than drilling these gigantic holes, drilling microscopic holes, no thicker than a piece of hair. And the idea here was really for diagnosis -- there are lots of times in the diagnosis of brain disorders when you would like to be able to look at the neural activity beneath the skull, and being able to drill these microscopic holes would make that much easier for the patient. In the end, it would be like getting a shot. You just go in and you sit down, and there's a thing that comes down on your head, and a momentary sting and then it's done, and you can go back about your day.

06:24
So we're eventually able to do it using lasers to drill the holes. And with the lasers, it was fast and extremely reliable, you couldn't even tell the holes were there, any more than you could tell that one of your hairs was missing.

06:40
And I know it might sound crazy, using lasers to drill holes in your skull, but back in 2017, people were OK with surgeons shooting lasers into their eyes for corrective surgery. So when you're already here, it's not that big of a step. OK?

06:58
So the next step, that happened in the 2030s, was that it's not just about getting through the skull. To measure the activity of neurons, you have to actually make it into the brain tissue itself. And the risk, whenever you put something into the brain tissue, is essentially that of stroke. That you would hit a blood vessel and burst it, and that causes a stroke.

07:19
So, by the mid-2030s, we had invented these flexible probes that were capable of going around blood vessels, rather than through them. And thus, we could put huge batteries of these probes into the brains of patients and record from thousands of their neurons without any risk to them.

07:39
And what we discovered, sort of to our surprise, is that the neurons that we could identify were not responding to things like ideas or emotion, which was what we had expected. They were mostly responding to things like Jennifer Aniston or Halle Berry or Justin Trudeau. I mean -- (Laughter) In hindsight, we shouldn't have been that surprised. I mean, what do your neurons spend most of their time thinking about? (Laughter)

08:09
But really, the point is that this technology enabled us to begin studying neuroscience in individuals. So much like the transition to genetics at the single-cell level, we started to study neuroscience at the single-human level.

08:23
But we weren't quite there yet. Because these technologies were still restricted to medical applications, which meant that we were studying sick brains, not healthy brains. Because no matter how safe your technology is, you can't stick something into someone's brain for research purposes. They have to want it. And why would they want it?

08:46
Because as soon as you have an electrical connection to the brain, you can use it to hook the brain up to a computer. Oh, well, you know, the general public was very skeptical at first. I mean, who wants to hook their brain up to their computers? Well, just imagine being able to send an email with a thought. (Laughter) Imagine being able to take a picture with your eyes, OK? (Laughter) Imagine never forgetting anything anymore, because anything that you choose to remember will be stored permanently on a hard drive somewhere, able to be recalled at will. (Laughter)

09:25
The line here between crazy and visionary was never quite clear. But the systems were safe. So when the FDA decided to deregulate these laser-drilling systems in 2043, commercial demand just exploded. People started signing their emails, "Please excuse any typos. Sent from my brain." (Laughter)

09:45
Commercial systems popped up left and right, offering the latest and greatest in neural interfacing technology. There were 100 electrodes. A thousand electrodes. High bandwidth for only 99.99 a month. (Laughter) Soon, everyone had them. And that was the key.

10:03
Because, in the 2050s, if you were a neuroscientist, you could have someone come into your lab essentially from off the street. And you could have them engaged in some emotional task or social behavior or abstract reasoning, things you could never study in mice. And you could record the activity of their neurons using the interfaces that they already had. And then you could also ask them about what they were experiencing. So this link between psychology and neuroscience that you could never make in the animals was suddenly there.

10:35
So perhaps the classic example of this was the discovery of the neural basis for insight. That "Aha!" moment, the moment it all comes together, it clicks. And this was discovered by two scientists in 2055, Barry and Late, who observed, in the dorsal prefrontal cortex, how in the brain of someone trying to understand an idea, how different populations of neurons would reorganize themselves -- you're looking at neural activity here in orange -- until finally their activity aligns in a way that leads to positive feedback. Right there. That is understanding.

11:15
So finally, we were able to get at the things that make us human. And that's what really opened the way to major insights from medicine. Because, starting in the 2060s, with the ability to record the neural activity in the brains of patients with these different mental diseases, rather than defining the diseases on the basis of their symptoms, as we had at the beginning of the century, we started to define them on the basis of the actual pathology that we observed at the neural level.

11:48
So for example, in the case of ADHD, we discovered that there are dozens of different diseases, all of which had been called ADHD at the start of the century, that actually had nothing to do with each other, except that they had similar symptoms. And they needed to be treated in different ways. So it was kind of incredible, in retrospect, that at the beginning of the century, we had been treating all those different diseases with the same drug, just by giving people amphetamine, basically, is what we were doing. And schizophrenia and depression are the same way. So rather than prescribing drugs to people essentially at random, as we had, we learned how to predict which drugs would be most effective in which patients, and that just led to this huge improvement in outcomes.

12:33
OK, I want to bring you back now to the year 2017. Some of this may sound satirical or even far-fetched. And some of it is. I mean, I can't actually see into the future. I don't actually know if we're going to be drilling hundreds or thousands of microscopic holes in our heads in 30 years. But what I can tell you is that we're not going to make any progress towards understanding the human brain or human diseases until we figure out how to get at the electrical activity of neurons in healthy humans. And almost no one is working on figuring out how to do that today. That is the future of neuroscience.

13:14
And I think it's time for neuroscientists to put down the mouse brain and to dedicate the thought and investment necessary to understand the human brain and human disease.

13:27
Thank you. (Applause)

