ABOUT THE SPEAKER
Sam Rodriques - Neuroengineer
Sam Rodriques invents new technologies to help neuroscientists understand how the brain works.

Why you should listen

Sam Rodriques has designed a new nanofabrication method and a new approach to sensing neural activity with probes in the bloodstream, and his forthcoming technologies are aimed at understanding the spatial organization of cells in the brain and the transcriptional activity of neurons over time. He is a graduate student in Physics at the Massachusetts Institute of Technology, with affiliations at the MIT Media Lab, the MIT Department of Brain and Cognitive Sciences and the Broad Institute of Harvard and MIT. He graduated summa cum laude with highest honors in Physics from Haverford College, where he worked on new methods for calculating quantum entanglement in multipartite quantum systems. He has received numerous national awards and fellowships to support his research, including the Hertz Foundation Graduate Fellowship, an NSF Graduate Research Fellowship and a Churchill Scholarship.

More profile about the speaker
Sam Rodriques | Speaker | TED.com
TEDxBeaconStreet

Sam Rodriques: What we'll learn about the brain in the next century

Filmed:
1,621,715 views

In this imaginative talk, neuroengineer Sam Rodriques takes us on a thrilling tour of the next 100 years in brain science. He envisions strange (and sometimes frightening) innovations that may be the key to understanding and treating brain disease -- like lasers that drill tiny holes in our skulls and allow probes to study the electrical activity of our neurons.

00:13
I want to tell you guys something about neuroscience.
00:16
I'm a physicist by training.
00:18
About three years ago, I left physics
00:20
to come and try to understand how the brain works.
00:22
And this is what I found.
00:24
Lots of people are working on depression.
00:26
And that's really good,
00:27
depression is something that we really want to understand.
00:30
Here's how you do it:
00:31
you take a jar and you fill it up, about halfway, with water.
00:35
And then you take a mouse, and you put the mouse in the jar, OK?
00:39
And the mouse swims around for a little while
00:42
and then at some point, the mouse gets tired
00:44
and decides to stop swimming.
00:46
And when it stops swimming, that's depression.
00:50
OK?
00:52
And I'm from theoretical physics,
00:55
so I'm used to people making very sophisticated mathematical models
00:59
to precisely describe physical phenomena,
01:02
so when I saw that this is the model for depression,
01:04
I thought to myself, "Oh my God, we have a lot of work to do."
01:07
(Laughter)
01:09
But this is a kind of general problem in neuroscience.
01:12
So for example, take emotion.
01:14
Lots of people want to understand emotion.
01:17
But you can't study emotion in mice or monkeys
01:20
because you can't ask them
01:21
how they're feeling or what they're experiencing.
01:24
So instead, people who want to understand emotion,
01:26
typically end up studying what's called motivated behavior,
01:29
which is code for "what the mouse does when it really, really wants cheese."
01:33
OK, I could go on and on.
01:35
I mean, the point is, the NIH spends about 5.5 billion dollars a year
01:41
on neuroscience research.
01:43
And yet there have been almost no significant improvements in outcomes
01:47
for patients with brain diseases in the past 40 years.
01:51
And I think a lot of that is basically due to the fact
01:53
that mice might be OK as a model for cancer or diabetes,
01:57
but the mouse brain is just not sophisticated enough
02:00
to reproduce human psychology or human brain disease.
02:04
OK?
02:05
So if the mouse models are so bad, why are we still using them?
02:10
Well, it basically boils down to this:
02:12
the brain is made up of neurons
02:14
which are these little cells that send electrical signals to each other.
02:18
If you want to understand how the brain works,
02:20
you have to be able to measure the electrical activity of these neurons.
02:25
But to do that, you have to get really close to the neurons
02:28
with some kind of electrical recording device or a microscope.
02:31
And so you can do that in mice and you can do it in monkeys,
02:34
because you can physically put things into their brain
02:36
but for some reason we still can't do that in humans, OK?
02:40
So instead, we've invented all these proxies.
02:43
So the most popular one is probably this,
02:46
functional MRI, fMRI,
02:48
which allows you to make these pretty pictures like this,
02:51
that show which parts of your brain light up
02:53
when you're engaged in different activities.
02:55
But this is a proxy.
02:57
You're not actually measuring neural activity here.
03:01
What you're doing is you're measuring, essentially,
03:03
like, blood flow in the brain.
03:05
Where there's more blood.
03:07
It's actually where there's more oxygen, but you get the idea, OK?
03:10
The other thing that you can do is you can do this --
03:12
electroencephalography -- you can put these electrodes on your head, OK?
03:16
And then you can measure your brain waves.
03:19
And here, you're actually measuring electrical activity.
03:22
But you're not measuring the activity of neurons.
03:24
You're measuring these electrical currents,
03:27
sloshing back and forth in your brain.
03:30
So the point is just that these technologies that we have
03:32
are really measuring the wrong thing.
03:35
Because, for most of the diseases that we want to understand --
03:38
like, Parkinson's is the classic example.
03:40
In Parkinson's, there's one particular kind of neuron deep in your brain
03:44
that is responsible for the disease,
03:45
and these technologies just don't have the resolution that you need
03:49
to get at that.
03:50
And so that's why we're still stuck with the animals.
03:54
Not that anyone wants to be studying depression
03:57
by putting mice into jars, right?
03:59
It's just that there's this pervasive sense that it's not possible
04:03
to look at the activity of neurons in healthy humans.
04:08
So here's what I want to do.
04:09
I want to take you into the future.
04:12
To have a look at one way in which I think it could potentially be possible.
04:17
And I want to preface this by saying, I don't have all the details.
04:21
So I'm just going to provide you with a kind of outline.
04:24
But we're going to go to the year 2100.
04:27
Now what does the year 2100 look like?
04:30
Well, to start with, the climate is a bit warmer than what you're used to.
04:33
(Laughter)
04:37
And that robotic vacuum cleaner that you know and love
04:42
went through a few generations,
04:43
and the improvements were not always so good.
04:46
(Laughter)
04:48
It was not always for the better.
04:52
But actually, in the year 2100 most things are surprisingly recognizable.
04:57
It's just the brain is totally different.
05:00
For example, in the year 2100,
05:03
we understand the root causes of Alzheimer's.
05:06
So we can deliver targeted genetic therapies or drugs
05:09
to stop the degenerative process before it begins.
05:13
So how did we do it?
05:15
Well, there were essentially three steps.
05:18
The first step was that we had to figure out
05:21
some way to get electrical connections through the skull
05:24
so we could measure the electrical activity of neurons.
05:28
And not only that, it had to be easy and risk-free.
05:32
Something that basically anyone would be OK with,
05:35
like getting a piercing.
05:37
Because back in 2017,
05:39
the only way that we knew of to get through the skull
05:42
was to drill these holes the size of quarters.
05:46
You would never let someone do that to you.
05:48
So in the 2020s,
05:51
people began to experiment -- rather than drilling these gigantic holes,
05:54
drilling microscopic holes, no thicker than a piece of hair.
05:58
And the idea here was really for diagnosis --
06:00
there are lots of times in the diagnosis of brain disorders
06:03
when you would like to be able to look at the neural activity beneath the skull
06:08
and being able to drill these microscopic holes
06:11
would make that much easier for the patient.
06:13
In the end, it would be like getting a shot.
06:16
You just go in and you sit down
06:17
and there's a thing that comes down on your head,
06:20
and a momentary sting and then it's done,
06:22
and you can go back about your day.
06:24
So we were eventually able to do it
06:29
using lasers to drill the holes.
06:32
And with the lasers, it was fast and extremely reliable,
06:34
you couldn't even tell the holes were there,
06:37
any more than you could tell that one of your hairs was missing.
06:40
And I know it might sound crazy, using lasers to drill holes in your skull,
06:45
but back in 2017,
06:46
people were OK with surgeons shooting lasers into their eyes
06:51
for corrective surgery.
06:52
So when you're already here, it's not that big of a step.
06:57
OK?
06:58
So the next step, that happened in the 2030s,
07:02
was that it's not just about getting through the skull.
07:05
To measure the activity of neurons,
07:07
you have to actually make it into the brain tissue itself.
07:11
And the risk, whenever you put something into the brain tissue,
07:14
is essentially that of stroke.
07:15
That you would hit a blood vessel and burst it,
07:18
and that causes a stroke.
07:19
So, by the mid-2030s, we had invented these flexible probes
07:23
that were capable of going around blood vessels,
07:25
rather than through them.
07:27
And thus, we could put huge batteries of these probes
07:33
into the brains of patients
07:34
and record from thousands of their neurons without any risk to them.
07:39
And what we discovered, sort of to our surprise,
07:43
is that the neurons that we could identify
07:45
were not responding to things like ideas or emotion,
07:49
which was what we had expected.
07:50
They were mostly responding to things like Jennifer Aniston
07:54
or Halle Berry
07:57
or Justin Trudeau.
07:58
I mean --
07:59
(Laughter)
08:02
In hindsight, we shouldn't have been that surprised.
08:04
I mean, what do your neurons spend most of their time thinking about?
08:07
(Laughter)
08:09
But really, the point is that
08:11
this technology enabled us to begin studying neuroscience in individuals.
08:15
So much like the transition to genetics at the single-cell level,
08:20
we started to study neuroscience at the single-human level.
08:23
But we weren't quite there yet.
08:25
Because these technologies
08:27
were still restricted to medical applications,
08:30
which meant that we were studying sick brains, not healthy brains.
08:35
Because no matter how safe your technology is,
08:39
you can't stick something into someone's brain
08:41
for research purposes.
08:43
They have to want it.
08:44
And why would they want it?
08:46
Because as soon as you have an electrical connection to the brain,
08:49
you can use it to hook the brain up to a computer.
08:53
Oh, well, you know, the general public was very skeptical at first.
08:56
I mean, who wants to hook their brain up to their computers?
08:59
Well, just imagine being able to send an email with a thought.
09:04
(Laughter)
09:06
Imagine being able to take a picture with your eyes, OK?
09:10
(Laughter)
09:12
Imagine never forgetting anything anymore,
09:15
because anything that you choose to remember
09:17
will be stored permanently on a hard drive somewhere,
09:19
able to be recalled at will.
09:21
(Laughter)
09:25
The line here between crazy and visionary
09:28
was never quite clear.
09:30
But the systems were safe.
09:32
So when the FDA decided to deregulate these laser-drilling systems, in 2043,
09:37
commercial demand just exploded.
09:40
People started signing their emails,
09:42
"Please excuse any typos.
09:43
Sent from my brain."
09:44
(Laughter)
09:45
Commercial systems popped up left and right,
09:48
offering the latest and greatest in neural interfacing technology.
09:51
There were 100 electrodes.
09:53
A thousand electrodes.
09:55
High bandwidth for only 99.99 a month.
09:58
(Laughter)
09:59
Soon, everyone had them.
10:01
And that was the key.
10:03
Because, in the 2050s, if you were a neuroscientist,
10:06
you could have someone come into your lab essentially from off the street.
10:10
And you could have them engaged in some emotional task
10:13
or social behavior or abstract reasoning,
10:16
things you could never study in mice.
10:18
And you could record the activity of their neurons
10:21
using the interfaces that they already had.
10:25
And then you could also ask them about what they were experiencing.
10:28
So this link between psychology and neuroscience
10:31
that you could never make in the animals, was suddenly there.
10:35
So perhaps the classic example of this
10:37
was the discovery of the neural basis for insight.
10:41
That "Aha!" moment, the moment it all comes together, it clicks.
10:45
And this was discovered by two scientists in 2055,
10:49
Barry and Late,
10:51
who observed, in the dorsal prefrontal cortex,
10:54
how in the brain of someone trying to understand an idea,
11:00
how different populations of neurons would reorganize themselves --
11:03
you're looking at neural activity here in orange --
11:05
until finally their activity aligns in a way that leads to positive feedback.
11:10
Right there.
11:12
That is understanding.
11:15
So finally, we were able to get at the things that make us human.
11:21
And that's what really opened the way to major insights from medicine.
11:27
Because, starting in the 2060s,
11:30
with the ability to record the neural activity
11:32
in the brains of patients with these different mental diseases,
11:36
rather than defining the diseases on the basis of their symptoms,
11:41
as we had at the beginning of the century,
11:43
we started to define them
11:44
on the basis of the actual pathology that we observed at the neural level.
11:48
So for example, in the case of ADHD,
11:52
we discovered that there are dozens of different diseases,
11:55
all of which had been called ADHD at the start of the century,
11:58
that actually had nothing to do with each other,
12:01
except that they had similar symptoms.
12:03
And they needed to be treated in different ways.
12:06
So it was kind of incredible, in retrospect,
12:08
that at the beginning of the century,
12:10
we had been treating all those different diseases
12:12
with the same drug,
12:13
just by giving people amphetamine -- that's basically what we were doing.
12:17
And schizophrenia and depression are the same way.
12:19
So rather than prescribing drugs to people essentially at random,
12:23
as we had,
12:24
we learned how to predict which drugs would be most effective
12:28
in which patients,
12:29
and that just led to this huge improvement in outcomes.
12:33
OK, I want to bring you back now to the year 2017.
12:38
Some of this may sound satirical or even far-fetched.
12:41
And some of it is.
12:43
I mean, I can't actually see into the future.
12:45
I don't actually know
12:47
if we're going to be drilling hundreds or thousands of microscopic holes
12:51
in our heads in 30 years.
12:53
But what I can tell you
12:55
is that we're not going to make any progress
12:57
towards understanding the human brain or human diseases
13:01
until we figure out how to get at the electrical activity of neurons
13:05
in healthy humans.
13:07
And almost no one is working on figuring out how to do that today.
13:12
That is the future of neuroscience.
13:14
And I think it's time for neuroscientists to put down the mouse brain
13:19
and to dedicate the thought and investment necessary
13:21
to understand the human brain and human disease.
13:27
Thank you.
13:28
(Applause)
