ABOUT THE SPEAKER
Nita Farahany - Legal scholar, ethicist
Nita A. Farahany is a leading scholar on the ethical, legal, and social implications of biosciences and emerging technologies, particularly those related to neuroscience and behavioral genetics.

Why you should listen

Nita A. Farahany is a professor of law and philosophy, the founding director of the Duke Initiative for Science & Society and chair of the MA in Bioethics & Science Policy at Duke University. In 2010, Farahany was appointed by President Obama to the Presidential Commission for the Study of Bioethical Issues, and she served as a member until 2017. She is a member of the Neuroethics Division of the Multi-Council Working Group for the BRAIN Initiative, on the President's Research Council of the Canadian Institute for Advanced Research (CIFAR), and past member of the Global Agenda Council for Privacy, Technology and Governance at the World Economic Forum. 

Farahany presents her work to diverse audiences and is a frequent commentator for national media and radio shows. She is an elected member of the American Law Institute, President-Elect and board member of the International Neuroethics Society, co-editor-in-chief and co-founder of the Journal of Law and the Biosciences, and an editorial board member of the American Journal of Bioethics (Neuroscience). She's on the Ethics Advisory Board for Illumina, Inc., the Scientific Advisory Board of Helix, and the Board of Advisors of Scientific American. 

Farahany received her AB in genetics, cell and developmental biology at Dartmouth College, a JD and MA from Duke University, as well as a PhD in philosophy. She also holds an ALM in biology from Harvard University. In 2004-2005, Farahany clerked for Judge Judith W. Rogers of the US Court of Appeals for the D.C. Circuit, after which she joined the faculty at Vanderbilt University. In 2011, Farahany was the Leah Kaplan Visiting Professor of Human Rights at Stanford Law School.

TED Salon Zebra Technologies

Nita Farahany: When technology can read minds, how will we protect our privacy?


Filmed:
1,819,292 views

Tech that can decode your brain activity and reveal what you're thinking and feeling is on the horizon, says legal scholar and ethicist Nita Farahany. What will it mean for our already-violated sense of privacy? In a cautionary talk, Farahany warns of a society where people are arrested merely for thinking of committing a crime (as in "Minority Report") and private interests sell our brain data — and makes the case for a right to cognitive liberty that protects our freedom of thought and self-determination.


00:13
In the months following the 2009 presidential election in Iran, protests erupted across the country. The Iranian government violently suppressed what came to be known as the Iranian Green Movement, even blocking mobile signals to cut off communication between the protesters.

00:34
My parents, who emigrated to the United States in the late 1960s, spend substantial time there, where all of my large, extended family live. When I would call my family in Tehran during some of the most violent crackdowns of the protest, none of them dared discuss with me what was happening. They or I knew to quickly steer the conversation to other topics. All of us understood what the consequences could be of a perceived dissident action.

01:06
But I still wish I could have known what they were thinking or what they were feeling. What if I could have? Or more frighteningly, what if the Iranian government could have? Would they have arrested them based on what their brains revealed?

01:22
That day may be closer than you think. With our growing capabilities in neuroscience, artificial intelligence and machine learning, we may soon know a lot more of what's happening in the human brain.

01:37
As a bioethicist, a lawyer, a philosopher and an Iranian-American, I'm deeply concerned about what this means for our freedoms and what kinds of protections we need. I believe we need a right to cognitive liberty, as a human right that needs to be protected. If not, our freedom of thought, access and control over our own brains and our mental privacy will be threatened.

02:05
Consider this: the average person thinks thousands of thoughts each day. As a thought takes form, like a math calculation or a number, a word, neurons are interacting in the brain, creating a minuscule electrical discharge. When you have a dominant mental state, like relaxation, hundreds and thousands of neurons are firing in the brain, creating concurrent electrical discharges in characteristic patterns that can be measured with electroencephalography, or EEG.
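The idea that a dominant mental state leaves a characteristic, measurable signature in the EEG can be sketched in a few lines of code. The snippet below is a hypothetical illustration, not something from the talk: it builds a synthetic trace with a strong 10 Hz alpha rhythm (the band classically associated with relaxed wakefulness), builds an alpha-suppressed "alert" trace, and shows that a simple FFT band-power measure separates the two.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean power of `signal` in the [low, high] Hz band, via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

fs = 256                        # sample rate in Hz, typical for consumer EEG
t = np.arange(0, 4, 1.0 / fs)   # four seconds of samples
rng = np.random.default_rng(0)

# Synthetic "relaxed" trace: a strong 10 Hz alpha rhythm plus background noise.
relaxed = 40e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(len(t))
# Synthetic "alert" trace: alpha suppressed, broadband noise only.
alert = 5e-6 * rng.standard_normal(len(t))

alpha_relaxed = band_power(relaxed, fs, 8, 12)
alpha_alert = band_power(alert, fs, 8, 12)
print(alpha_relaxed > alpha_alert)  # relaxation shows up as elevated alpha power
```

Real EEG pipelines add artifact rejection, windowed PSD estimation, and per-user calibration, but the core of a "gauge a person's mood" feature is essentially this comparison of power across frequency bands.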
02:41
In fact, that's what you're seeing right now. You're seeing my brain activity that was recorded in real time with a simple device that was worn on my head. What you're seeing is my brain activity when I was relaxed and curious.

02:58
To share this information with you, I wore one of the early consumer-based EEG devices like this one, which recorded the electrical activity in my brain in real time. It's not unlike the fitness trackers that some of you may be wearing to measure your heart rate or the steps that you've taken, or even your sleep activity.

03:19
It's hardly the most sophisticated neuroimaging technique on the market. But it's already the most portable and the most likely to impact our everyday lives. This is extraordinary. Through a simple, wearable device, we can literally see inside the human brain and learn aspects of our mental landscape without ever uttering a word.

03:44
While we can't reliably decode complex thoughts just yet, we can already gauge a person's mood, and with the help of artificial intelligence, we can even decode some single-digit numbers or shapes or simple words that a person is thinking or hearing, or seeing.

04:06
Despite some inherent limitations in EEG, I think it's safe to say that with our advances in technology, more and more of what's happening in the human brain can and will be decoded over time. Already, using one of these devices, an epileptic can know they're going to have an epileptic seizure before it happens. A paraplegic can type on a computer with their thoughts alone.

04:34
A US-based company has developed a technology to embed these sensors into the headrest of automobiles so they can track driver concentration, distraction and cognitive load while driving. Nissan, insurance companies and AAA have all taken note.

04:51
You could even watch this choose-your-own-adventure movie "The Moment," which, with an EEG headset, changes the movie based on your brain-based reactions, giving you a different ending every time your attention wanes.

05:11
This may all sound great, and as a bioethicist, I am a huge proponent of empowering people to take charge of their own health and well-being by giving them access to information about themselves, including this incredible new brain-decoding technology.

05:29
But I worry. I worry that we will voluntarily or involuntarily give up our last bastion of freedom, our mental privacy. That we will trade our brain activity for rebates or discounts on insurance, or free access to social-media accounts ... or even to keep our jobs.

05:54
In fact, in China, the train drivers on the Beijing-Shanghai high-speed rail, the busiest of its kind in the world, are required to wear EEG devices to monitor their brain activity while driving. According to some news sources, in government-run factories in China, the workers are required to wear EEG sensors to monitor their productivity and their emotional state at work. Workers are even sent home if their brains show less-than-stellar concentration on their jobs, or emotional agitation.

06:35
It's not going to happen tomorrow, but we're headed to a world of brain transparency. And I don't think people understand that that could change everything. Everything from our definitions of data privacy to our laws, to our ideas about freedom.

06:50
In fact, in my lab at Duke University, we recently conducted a nationwide study in the United States to see if people appreciated the sensitivity of their brain information. We asked people to rate their perceived sensitivity of 33 different kinds of information, from their social security numbers to the content of their phone conversations, their relationship history, their emotions, their anxiety, the mental images in their mind and the thoughts in their mind.

07:21
Shockingly, people rated their social security number as far more sensitive than any other kind of information, including their brain data.

07:32
I think this is because people don't yet understand or believe the implications of this new brain-decoding technology. After all, if we can know the inner workings of the human brain, our social security numbers are the least of our worries.

07:46
(Laughter)

07:47
Think about it. In a world of total brain transparency, who would dare have a politically dissident thought? Or a creative one? I worry that people will self-censor in fear of being ostracized by society, or that people will lose their jobs because of their waning attention or emotional instability, or because they're contemplating collective action against their employers.

08:14
That coming out will no longer be an option, because people's brains will long ago have revealed their sexual orientation, their political ideology or their religious preferences, well before they were ready to consciously share that information with other people.

08:31
I worry about the ability of our laws to keep up with technological change. Take the First Amendment of the US Constitution, which protects freedom of speech. Does it also protect freedom of thought? And if so, does that mean that we're free to alter our thoughts however we want? Or can the government or society tell us what we can do with our own brains?

08:53
Can the NSA spy on our brains using these new mobile devices? Can the companies that collect the brain data through their applications sell this information to third parties?

09:05
Right now, no laws prevent them from doing so. It could be even more problematic in countries that don't share the same freedoms enjoyed by people in the United States. What would've happened during the Iranian Green Movement if the government had been monitoring my family's brain activity, and had believed them to be sympathetic to the protesters?

09:30
Is it so far-fetched to imagine a society in which people are arrested based on their thoughts of committing a crime, like in the science-fiction dystopian society in "Minority Report"?

09:42
Already, in the United States, in Indiana, an 18-year-old was charged with attempting to intimidate his school by posting a video of himself shooting people in the hallways ... Except the people were zombies and the video was of him playing an augmented-reality video game, all interpreted to be a mental projection of his subjective intent.

10:10
This is exactly why our brains need special protection. If our brains are just as subject to data tracking and aggregation as our financial records and transactions, if our brains can be hacked and tracked like our online activities, our mobile phones and applications, then we're on the brink of a dangerous threat to our collective humanity.

10:33
Before you panic, I believe that there are solutions to these concerns, but we have to start by focusing on the right things. When it comes to privacy protections in general, I think we're fighting a losing battle by trying to restrict the flow of information. Instead, we should be focusing on securing rights and remedies against the misuse of our information.

10:57
If people had the right to decide how their information was shared, and more importantly, have legal redress if their information was misused against them, say to discriminate against them in an employment setting or in health care or education, this would go a long way to build trust. In fact, in some instances, we want to be sharing more of our personal information. Studying aggregated information can tell us so much about our health and our well-being, but to be able to safely share our information, we need special protections for mental privacy.

11:33
This is why we need a right to cognitive liberty. This right would secure for us our freedom of thought and rumination, our freedom of self-determination, and it would ensure that we have the right to consent to or refuse access and alteration of our brains by others. This right could be recognized as part of the Universal Declaration of Human Rights, which has established mechanisms for the enforcement of these kinds of social rights.

12:03
During the Iranian Green Movement, the protesters used the internet and good old-fashioned word of mouth to coordinate their marches. And some of the most oppressive restrictions in Iran were lifted as a result.

12:20
But what if the Iranian government had used brain surveillance to detect and prevent the protest? Would the world have ever heard the protesters' cries?

12:33
The time has come for us to call for a cognitive liberty revolution. To make sure that we responsibly advance technology that could enable us to embrace the future while fiercely protecting all of us from any person, company or government that attempts to unlawfully access or alter our innermost lives.

12:58
Thank you.

12:59
(Applause)
Translated by Yizhuo He
Reviewed by Riley Zhang

