ABOUT THE SPEAKER
Poppy Crum - Neuroscientist, technologist
Poppy Crum builds technologies that best leverage human physiology to enhance our experiences and how we interact with the world.

Why you should listen

Poppy Crum is dedicated to the development of immersive technologies that leverage human physiology and perceptual realities to enhance our experiences and interactions in the world. She has advanced a mission to democratize the way people of all abilities benefit from sensory technologies -- and how effectively technology communicates back to each of us. She believes the power of intelligent technologies is only realized with dynamic optimization and learning of as much of our personal and contextual data as possible.

Crum is chief scientist at Dolby Laboratories, leading the company's integration of neuroscience and sensory data science into its entertainment, communication and future technologies. She is also adjunct professor at Stanford University, where her work focuses on the impact and feedback potential of gaming and immersive environments, such as augmented and virtual reality, on neuroplasticity and learning. She has been recognized with the Advanced Imaging Society's Distinguished Leadership Award and the Consumer Technology Association's Technology and Standards Achievement Award for work towards the introduction of affordable, over-the-counter hearing-aid devices, and she is a fellow of the Audio Engineering Society. She has also been named to Billboard Magazine's 100 most influential female executives in the music industry. Prior to joining Dolby Laboratories, Crum was Research Faculty in the Department of Biomedical Engineering at Johns Hopkins School of Medicine.

More profile about the speaker
Poppy Crum | Speaker | TED.com
TED2018

Poppy Crum: Technology that knows what you're feeling

1,589,033 views

What happens when technology knows us better than we know ourselves? Poppy Crum studies how humans express emotion, and she argues that the era of the poker face is coming to an end, because technology can now easily read the emotional signals we give off. In a talk and tech demo, Crum shows how "empathetic technology" can read physiological signals, such as body temperature and the chemical composition of our breath, that reveal how we feel. For better or worse, Crum believes that "if we recognize the power of becoming technological empaths, we get this opportunity where technology can help us bridge the emotional and cognitive divide."

00:12
What happens when technology knows more about us than we do?
00:17
A computer now can detect our slightest facial microexpressions
00:21
and be able to tell the difference between a real smile and a fake one.
00:25
That's only the beginning.
00:27
Technology has become incredibly intelligent
00:30
and already knows a lot about our internal states.
00:34
And whether we like it or not,
00:36
we already are sharing parts of our inner lives
00:39
that are out of our control.
00:43
That seems like a problem,
00:44
because a lot of us like to keep what's going on inside
00:48
from what people actually see.
00:50
We want to have agency over what we share and what we don't.
00:55
We all like to have a poker face.
00:59
But I'm here to tell you that I think that's a thing of the past.
01:03
And while that might sound scary, it's not necessarily a bad thing.
01:09
I've spent a lot of time studying the circuits in the brain
01:11
that create the unique perceptual realities that we each have.
01:16
And now I bring that together
01:17
with the capabilities of current technology
01:19
to create new technology that does make us better,
01:22
feel more, connect more.
01:24
And I believe to do that,
01:26
we have to be OK losing some of our agency.
01:30
With some animals, it's really amazing,
01:32
and we get to see into their internal experiences.
01:36
We get this upfront look at the mechanistic interaction
01:40
between how they respond to the world around them
01:43
and the state of their biological systems.
01:45
This is where evolutionary pressures like eating, mating
01:49
and making sure we don't get eaten
01:50
drive deterministic behavioral responses to information in the world.
01:55
And we get to see into this window,
01:58
into their internal states and their biological experiences.
02:02
It's really pretty cool.
02:03
Now, stay with me for a moment -- I'm a violinist, not a singer.
02:08
But the spider's already given me a critical review.
02:16
(Video) (Singing in a low pitch)
02:19
(Singing in a middle pitch)
02:23
(Singing in a high pitch)
02:27
(Singing in a low pitch)
02:29
(Singing in a middle pitch)
02:31
(Singing in a high pitch)
02:33
(Laughter)
02:36
Poppy Crum: It turns out, some spiders tune their webs like violins
02:39
to resonate with certain sounds.
02:41
And likely, the harmonics of my voice as it went higher
02:44
coupled with how loud I was singing
02:46
recreated either the predatory call of an echolocating bat or a bird,
02:50
and the spider did what it should.
02:53
It predictively told me to bug off.
02:56
I love this.
02:58
The spider's responding to its external world
03:01
in a way that we get to see and know what's happening in its internal world.
03:07
Biology is controlling the spider's response;
03:09
it's wearing its internal state on its sleeve.
03:13
But us, humans --
03:16
we're different.
03:17
We like to think we have cognitive control over what people see, know and understand
03:23
about our internal states --
03:25
our emotions, our insecurities, our bluffs, our trials and tribulations --
03:29
and how we respond.
03:31
We get to have our poker face.
03:35
Or maybe we don't.
03:37
Try this with me.
03:38
Your eye responds to how hard your brain is working.
03:42
The response you're about to see is driven entirely by mental effort
03:45
and has nothing to do with changes in lighting.
03:48
We know this from neuroscience.
03:49
I promise, your eyes are doing the same thing as the subject in our lab,
03:54
whether you want them to or not.
03:56
At first, you'll hear some voices.
03:58
Try and understand them and keep watching the eye in front of you.
04:01
It's going to be hard at first,
04:03
one voice should drop out, and it should get really easy.
04:05
You're going to see the change in effort in the diameter of the pupil.
04:10
(Video) (Two overlapping voices talking)
04:12
(Single voice) Intelligent technology depends on personal data.
04:15
(Two overlapping voices talking)
04:18
(Single voice) Intelligent technology depends on personal data.
04:21
PC: Your pupil doesn't lie.
04:23
Your eye gives away your poker face.
04:25
When your brain's having to work harder,
04:27
your autonomic nervous system drives your pupil to dilate.
04:30
When it's not, it contracts.
04:32
When I take away one of the voices,
04:34
the cognitive effort to understand the talkers
04:36
gets a lot easier.
04:37
I could have put the two voices in different spatial locations,
04:40
I could have made one louder.
04:42
You would have seen the same thing.
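The pupillometry demo above can be reduced to a small sketch. This is a toy illustration, not the lab's actual analysis: the diameter values, the baseline window, and the 0.3 mm threshold are all invented for the example. It simply flags samples where the pupil dilates well beyond a resting baseline, which is the signature of added mental effort.

```python
# Toy sketch of effort detection from a pupil-diameter trace.
# Values and threshold are invented; real pupillometry pipelines
# also correct for luminance, blinks, and drift.

def effort_from_pupil(trace_mm, baseline_window=5, threshold_mm=0.3):
    """Return indices where pupil dilation exceeds the rest baseline
    by more than threshold_mm.

    trace_mm: pupil diameter samples in millimeters.
    baseline_window: number of initial samples treated as rest.
    """
    baseline = sum(trace_mm[:baseline_window]) / baseline_window
    return [i for i, d in enumerate(trace_mm)
            if d - baseline > threshold_mm]

# Two overlapping voices (hard) -> dilation; one voice drops out -> easy.
trace = [3.0, 3.1, 3.0, 3.0, 2.9,   # resting baseline
         3.5, 3.6, 3.7,             # both talkers: pupil dilates
         3.1, 3.0, 3.0]             # one talker gone: effort falls
print(effort_from_pupil(trace))     # → [5, 6, 7]
```

The three flagged indices line up with the hard, two-talker stretch of the trace, mirroring what the audience sees in the dilating pupil on screen.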
04:45
We might think we have more agency over the reveal of our internal state
04:49
than that spider,
04:51
but maybe we don't.
04:53
Today's technology is starting to make it really easy
04:56
to see the signals and tells that give us away.
04:59
The amalgamation of sensors paired with machine learning
05:02
on us, around us and in our environments
05:04
is a lot more than cameras and microphones tracking our external actions.
05:12
Our bodies radiate our stories
05:15
from changes in the temperature of our physiology.
05:18
We can look at these as infrared thermal images
05:20
showing up behind me,
05:22
where reds are hotter and blues are cooler.
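That "reds are hotter, blues are cooler" convention is easy to make concrete. Below is a minimal mapping of my own devising (not Dolby's visualization code, and the 28-38 °C range is an assumed skin-temperature span): each temperature is linearly interpolated between blue for cool and red for hot.

```python
# Minimal hot-red / cold-blue colormap sketch for a thermal pixel.
# The 28-38 degree C range is an assumed span for skin temperature.

def thermal_to_rgb(temp_c, t_min=28.0, t_max=38.0):
    """Map a temperature (deg C) to an (r, g, b) tuple in 0..255,
    clamped so out-of-range values saturate to pure red or blue."""
    t = max(0.0, min(1.0, (temp_c - t_min) / (t_max - t_min)))
    return (int(255 * t), 0, int(255 * (1 - t)))

print(thermal_to_rgb(38.0))  # flushed cheek → (255, 0, 0)
print(thermal_to_rgb(28.0))  # cool skin    → (0, 0, 255)
```

Applied pixel by pixel to an infrared frame, this produces exactly the kind of false-color image described on stage.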
05:24
The dynamic signature of our thermal response
05:27
gives away our changes in stress,
05:30
how hard our brain is working,
05:32
whether we're paying attention
05:34
and engaged in the conversation we might be having
05:37
and even whether we're experiencing a picture of fire as if it were real.
05:41
We can actually see people give off heat on their cheeks
05:44
in response to an image of flame.
05:48
But aside from giving away our poker bluffs,
05:50
what if dimensions of data from someone's thermal response
05:55
gave away a glow of interpersonal interest?
05:58
Tracking the honesty of feelings in someone's thermal image
06:02
might be a new part of how we fall in love and see attraction.
06:06
Our technology can listen, develop insights and make predictions
06:09
about our mental and physical health
06:12
just by analyzing the timing dynamics of our speech and language
06:16
picked up by microphones.
06:18
Groups have shown that changes in the statistics of our language
06:21
paired with machine learning
06:23
can predict the likelihood someone will develop psychosis.
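To give a flavor of what "statistics of our language paired with machine learning" means, here is a deliberately toy sketch, not a clinical tool and not the method of the studies Crum cites: two simple text statistics are pushed through a logistic function whose weights are hand-set for the demo. Real work fits such weights on labeled speech data.

```python
# Toy illustration: simple language statistics fed to a logistic model.
# The weights (w_len, w_ttr, bias) are invented, not clinically fitted.
import math
import re

def language_stats(text):
    """Return (mean words per sentence, type-token ratio)."""
    words = re.findall(r"[a-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    mean_len = len(words) / max(len(sentences), 1)   # sentence length
    ttr = len(set(words)) / max(len(words), 1)       # vocabulary diversity
    return mean_len, ttr

def risk_score(text, w_len=-0.05, w_ttr=-2.0, bias=2.0):
    """Logistic score in (0, 1) from the two statistics above."""
    mean_len, ttr = language_stats(text)
    z = bias + w_len * mean_len + w_ttr * ttr
    return 1.0 / (1.0 + math.exp(-z))

print(language_stats("I went home. Then I slept."))  # (3.0, 0.833...)
print(risk_score("I went home. Then I slept."))      # a value in (0, 1)
```

The published studies use far richer features (semantic coherence across sentences, syntactic complexity) and properly trained classifiers; the point of the sketch is only the shape of the pipeline: text in, statistics out, probability from a learned model.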
06:27
I'm going to take it a step further
06:29
and look at linguistic changes and changes in our voice
06:31
that show up with a lot of different conditions.
06:34
Dementia and diabetes can alter the spectral coloration of our voice.
06:39
Changes in our language associated with Alzheimer's
06:42
can sometimes show up more than 10 years before clinical diagnosis.
06:47
What we say and how we say it tells a much richer story
06:51
than we used to think.
06:53
And devices we already have in our homes could, if we let them,
06:57
give us invaluable insight back.
06:59
The chemical composition of our breath
07:03
gives away our feelings.
07:06
There's a dynamic mixture of acetone, isoprene and carbon dioxide
07:10
that changes when our heart speeds up, when our muscles tense,
07:14
and all without any obvious change in our behaviors.
07:18
Alright, I want you to watch this clip with me.
07:21
Some things might be going on on the side screens,
07:24
but try and focus on the image in the front
07:27
and the man at the window.
07:31
(Eerie music)
07:39
(Woman screams)
07:50
PC: Sorry about that. I needed to get a reaction.
07:53
(Laughter)
07:55
I'm actually tracking the carbon dioxide you exhale in the room right now.
08:01
We've installed tubes throughout the theater,
08:05
lower to the ground, because CO2 is heavier than air.
08:07
But they're connected to a device in the back
08:10
that lets us measure, in real time, with high precision,
08:13
the continuous differential concentration of CO2.
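Reading "continuous differential concentration" as each sample minus a room baseline, the computation behind the demo can be sketched in a few lines. The ppm values below are invented, and the actual device and its calibration are not described in the talk.

```python
# Sketch under assumptions: differential CO2 = reading minus room baseline.
# The ppm numbers are invented for illustration.

def differential_co2(ppm_samples, baseline_ppm):
    """Difference of each CO2 reading (ppm) from a fixed room baseline."""
    return [round(p - baseline_ppm, 1) for p in ppm_samples]

# A spike when the audience gasps at the jump scare, then a decay.
readings = [412.0, 413.5, 420.0, 431.0, 415.0]
print(differential_co2(readings, baseline_ppm=412.0))
# → [0.0, 1.5, 8.0, 19.0, 3.0]
```

The colored clouds on the side screens would then be a visualization of exactly this differential series: bigger red areas for larger positive excursions above baseline.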
08:17
The clouds on the sides are actually the real-time data visualization
08:22
of the density of our CO2.
08:25
You might still see a patch of red on the screen,
08:29
because we're showing increases with larger colored clouds,
08:32
larger colored areas of red.
08:35
And that's the point where a lot of us jumped.
08:38
It's our collective suspense driving a change in carbon dioxide.
08:43
Alright, now, watch this with me one more time.
08:46
(Cheerful music)
08:54
(Woman laughs)
09:05
PC: You knew it was coming.
09:06
But it's a lot different when we changed the creator's intent.
09:10
Changing the music and the sound effects
09:13
completely alters the emotional impact of that scene.
09:17
And we can see it in our breath.
09:20
Suspense, fear, joy
09:22
all show up as reproducible, visually identifiable moments.
09:27
We broadcast a chemical signature of our emotions.
09:35
It is the end of the poker face.
09:38
Our spaces, our technology will know what we're feeling.
09:42
We will know more about each other than we ever have.
09:45
We get a chance to reach in and connect to the experience and sentiments
09:50
that are fundamental to us as humans
09:52
in our senses, emotionally and socially.
09:55
I believe it is the era of the empath.
09:58
And we are enabling the capabilities that true technological partners can bring
10:03
to how we connect with each other and with our technology.
10:06
If we recognize the power of becoming technological empaths,
10:09
we get this opportunity
10:11
where technology can help us bridge the emotional and cognitive divide.
10:16
And in that way, we get to change how we tell our stories.
10:19
We can enable a better future for technologies like augmented reality
10:23
to extend our own agency and connect us at a much deeper level.
10:27
Imagine a high school counselor being able to realize
10:30
that an outwardly cheery student really was having a deeply hard time,
10:34
where reaching out can make a crucial, positive difference.
10:37
Or authorities, being able to know the difference
10:41
between someone having a mental health crisis
10:43
and a different type of aggression,
10:45
and responding accordingly.
10:47
Or an artist, knowing the direct impact of their work.
10:52
Leo Tolstoy defined his perspective of art
10:54
by whether what the creator intended
10:56
was experienced by the person on the other end.
10:59
Today's artists can know what we're feeling.
11:02
But regardless of whether it's art or human connection,
11:06
today's technologies will know and can know
11:09
what we're experiencing on the other side,
11:11
and this means we can be closer and more authentic.
11:14
But I realize a lot of us have a really hard time
11:18
with the idea of sharing our data,
11:21
and especially the idea that people know things about us
11:24
that we didn't actively choose to share.
11:28
Anytime we talk to someone,
11:31
look at someone
11:33
or choose not to look,
11:35
data is exchanged, given away,
11:38
that people use to learn,
11:40
make decisions about their lives and about ours.
11:45
I'm not looking to create a world where our inner lives are ripped open
11:49
and our personal data and our privacy given away
11:51
to people and entities where we don't want to see it go.
11:55
But I am looking to create a world
11:57
where we can care about each other more effectively,
12:01
we can know more about when someone is feeling something
12:04
that we ought to pay attention to.
12:06
And we can have richer experiences from our technology.
12:10
Any technology can be used for good or bad.
12:13
Transparency to engagement and effective regulation
12:15
are absolutely critical to building the trust for any of this.
12:20
But the benefits that "empathetic technology" can bring to our lives
12:24
are worth solving the problems that make us uncomfortable.
12:29
And if we don't, there are too many opportunities and feelings
12:33
we're going to be missing out on.
12:35
Thank you.
12:36
(Applause)
Translated by Clare Wong
Reviewed by Haoyun Lo

