ABOUT THE SPEAKERS
Christiane Amanpour - Journalist
TV news legend Christiane Amanpour is known for her uncompromising approach to reporting and interviewing.

Why you should listen

Christiane Amanpour is CNN's chief international correspondent and anchor of the global affairs program "Amanpour," broadcast from the television network's London bureau. She has covered the most significant conflicts of recent decades, exposing both the brutality and human cost of war and its geopolitical impacts. From the 1991 Gulf War to the siege of Sarajevo (a city that later named her an honorary citizen), from the 2003 American-led invasion of Iraq to the trial of Saddam Hussein the following year, Amanpour's fearless and uncompromising approach has made her popular with audiences, and a force to be reckoned with by global influencers.

During the Balkan wars, Amanpour famously broke with the idea of journalistic neutrality by calling out human rights abuses and saying that "there are some situations one simply cannot be neutral about, because when you are neutral you are an accomplice." Since her interview show "Amanpour" launched in 2009, she has spoken to leaders and decision makers on the issues affecting the world today while continuing to report from all over the world, including the 2010 earthquake in Haiti and the 2011 tsunami in Japan.

Chris Anderson - TED Curator
After a long career in journalism and publishing, Chris Anderson became the curator of the TED Conference in 2002 and has developed it as a platform for identifying and disseminating ideas worth spreading.

Why you should listen

Chris Anderson is the Curator of TED, a nonprofit devoted to sharing valuable ideas, primarily through the medium of 'TED Talks' -- short talks that are offered free online to a global audience.

Chris was born in a remote village in Pakistan in 1957. He spent his early years in India, Pakistan and Afghanistan, where his parents worked as medical missionaries, and he attended an American school in the Himalayas for his early education. After boarding school in Bath, England, he went on to Oxford University, graduating in 1978 with a degree in philosophy, politics and economics.

Chris then trained as a journalist, working in newspapers and radio, including two years producing a world news service in the Seychelles Islands.

Back in the UK in 1984, Chris was captivated by the personal computer revolution and became an editor at one of the UK's early computer magazines. A year later he founded Future Publishing with a $25,000 bank loan. The new company initially focused on specialist computer publications but eventually expanded into other areas such as cycling, music, video games, technology and design, doubling in size every year for seven years. In 1994, Chris moved to the United States where he built Imagine Media, publisher of Business 2.0 magazine and creator of the popular video game users website IGN. Chris eventually merged Imagine and Future, taking the combined entity public in London in 1999, under the Future name. At its peak, it published 150 magazines and websites and employed 2,000 people.

This success allowed Chris to create a private nonprofit organization, the Sapling Foundation, with the hope of finding new ways to tackle tough global issues through media, technology, entrepreneurship and, most of all, ideas. In 2001, the foundation acquired the TED Conference, then an annual meeting of luminaries in the fields of Technology, Entertainment and Design held in Monterey, California, and Chris left Future to work full time on TED.

He expanded the conference's remit to cover all topics, including science, business and key global issues, while adding a Fellows program, which now has some 300 alumni, and the TED Prize, which grants its recipients "one wish to change the world." The TED stage has become a place for thinkers and doers from all fields to share their ideas and their work, capturing imaginations, sparking conversation and encouraging discovery along the way.

In 2006, TED experimented with posting some of its talks on the Internet. Their viral success encouraged Chris to begin positioning the organization as a global media initiative devoted to 'ideas worth spreading,' part of a new era of information dissemination using the power of online video. In June 2015, the organization posted its 2,000th talk online. The talks are free to view, and they have been translated into more than 100 languages with the help of volunteers from around the world. Viewership has grown to approximately one billion views per year.

Continuing a strategy of 'radical openness,' in 2009 Chris introduced the TEDx initiative, allowing free licenses to local organizers who wished to organize their own TED-like events. More than 8,000 such events have been held, generating an archive of 60,000 TEDx talks. And three years later, the TED-Ed program was launched, offering free educational videos and tools to students and teachers.

TEDGlobal>NYC

Christiane Amanpour: How to seek truth in the era of fake news


1,615,954 views

Known worldwide for her courage and clarity, Christiane Amanpour has spent the past decades interviewing business, cultural and political leaders who have shaped history. In conversation with TED curator Chris Anderson, Amanpour discusses fake news, objectivity in journalism, the leadership vacuum in global politics and more, sharing her wisdom along the way. "Be careful where you get information from," she says. "Unless we are all engaged as global citizens who appreciate the truth, who understand science, empirical evidence and facts, then we are just going to be wandering along to a potential catastrophe."


00:12
Chris Anderson: Christiane, great to have you here.
00:15
So you've had this amazing viewpoint,
00:17
and perhaps it's fair to say that in the last few years,
00:20
there have been some alarming developments that you're seeing.
00:24
What's alarmed you most?
00:25
Christiane Amanpour: Well, just listening to the earlier speakers,
00:28
I can frame it in what they've been saying:
00:31
climate change, for instance -- cities, the threat to our environment
00:34
and our lives.
00:36
It basically also boils down to understanding the truth
00:40
and to be able to get to the truth of what we're talking about
00:43
in order to really be able to solve it.
00:45
So if 99.9 percent of the science on climate
00:49
is empirical, scientific evidence,
00:52
but it's competing almost equally with a handful of deniers,
00:57
that is not the truth;
00:58
that is the epitome of fake news.
01:01
And so for me, the last few years -- certainly this last year --
01:06
has crystallized the notion of fake news in a way that's truly alarming
01:10
and not just some slogan to be thrown around.
01:13
Because when you can't distinguish between the truth and fake news,
01:17
you have a very much more difficult time trying to solve
01:21
some of the great issues that we face.
01:24
CH: Well, you've been involved in this question of,
01:27
what is balance, what is truth, what is impartiality,
01:30
for a long time.
01:32
You were on the front lines reporting the Balkan Wars 25 years ago.
01:38
And back then, you famously said,
01:41
by calling out human rights abuses,
01:44
you said, "Look, there are some situations one simply cannot be neutral about,
01:48
because when you're neutral,
01:49
you are an accomplice."
01:53
So, do you feel that today's journalists aren't heeding that advice
01:58
about balance?
01:59
CA: Well, look, I think for journalists, objectivity is the golden rule.
02:03
But I think sometimes we don't understand what objectivity means.
02:08
And I actually learned this very, very young in my career,
02:11
which was during the Balkan Wars.
02:12
I was young then.
02:14
It was about 25 years ago.
02:16
And what we faced was the wholesale violation, not just of human rights,
02:22
but all the way to ethnic cleansing and genocide,
02:25
and that has been adjudicated in the highest war crimes court
02:29
in the world.
02:30
So, we know what we were seeing.
02:32
Trying to tell the world what we were seeing
02:34
brought us accusations of bias,
02:37
of siding with one side,
02:39
of not seeing the whole side,
02:41
and just, you know, trying to tell one story.
02:43
I particularly and personally was accused of siding with,
02:48
for instance, the citizens of Sarajevo --
02:50
"siding with the Muslims,"
02:51
because they were the minority who were being attacked
02:54
by Christians on the Serb side
02:58
in this area.
03:00
And it worried me.
03:01
It worried me that I was being accused of this.
03:03
I thought maybe I was wrong,
03:05
maybe I'd forgotten what objectivity was.
03:07
But then I started to understand that what people wanted
03:10
was actually not to do anything --
03:12
not to step in,
03:13
not to change the situation,
03:15
not to find a solution.
03:16
And so, their fake news at that time,
03:19
their lie at that time --
03:20
including our government's, our democratically elected government's,
03:24
with values and principles of human rights --
03:26
their lie was to say that all sides are equally guilty,
03:30
that this has been centuries of ethnic hatred,
03:32
whereas we knew that wasn't true,
03:34
that one side had decided to kill, slaughter and ethnically cleanse
03:38
another side.
03:39
So that is where, for me,
03:41
I understood that objectivity means giving all sides an equal hearing
03:46
and talking to all sides,
03:48
but not treating all sides equally,
03:52
not creating a forced moral equivalence or a factual equivalence.
03:57
And when you come up against that crisis point
04:01
in situations of grave violations of international and humanitarian law,
04:07
if you don't understand what you're seeing,
04:09
if you don't understand the truth
04:11
and if you get trapped in the fake news paradigm,
04:15
then you are an accomplice.
04:17
And I refuse to be an accomplice to genocide.
04:20
(Applause)
04:26
CH: So there have always been these propaganda battles,
04:29
and you were courageous in taking the stand you took back then.
04:33
Today, there's a whole new way, though,
04:37
in which news seems to be becoming fake.
04:39
How would you characterize that?
04:41
CA: Well, look -- I am really alarmed.
04:43
And everywhere I look,
04:45
you know, we're buffeted by it.
04:47
Obviously, when the leader of the free world,
04:49
when the most powerful person in the entire world,
04:52
which is the president of the United States --
04:54
this is the most important, most powerful country in the whole world,
04:59
economically, militarily, politically in every which way --
05:04
and it seeks to, obviously, promote its values and power around the world.
05:09
So we journalists, who only seek the truth --
05:13
I mean, that is our mission --
05:15
we go around the world looking for the truth
05:17
in order to be everybody's eyes and ears,
05:19
people who can't go out in various parts of the world
05:21
to figure out what's going on about things that are vitally important
05:25
to everybody's health and security.
05:27
So when you have a major world leader accusing you of fake news,
05:33
it has an exponential ripple effect.
05:37
And what it does is, it starts to chip away
05:42
at not just our credibility,
05:45
but at people's minds --
05:48
people who look at us, and maybe they're thinking,
05:50
"Well, if the president of the United States says that,
05:53
maybe somewhere there's a truth in there."
05:56
CH: Presidents have always been critical of the media --
06:00
CA: Not in this way.
06:01
CH: So, to what extent --
06:03
(Laughter)
06:04
(Applause)
06:07
CH: I mean, someone a couple years ago looking at the avalanche of information
06:14
pouring through Twitter and Facebook and so forth,
06:17
might have said,
06:19
"Look, our democracies are healthier than they've ever been.
06:21
There's more news than ever.
06:23
Of course presidents will say what they'll say,
06:25
but everyone else can say what they will say.
06:28
What's not to like? How is there an extra danger?"
06:32
CA: So, I wish that was true.
06:34
I wish that the proliferation of platforms upon which we get our information
06:41
meant that there was a proliferation of truth and transparency
06:45
and depth and accuracy.
06:46
But I think the opposite has happened.
06:49
You know, I'm a little bit of a Luddite,
06:51
I will confess.
06:53
Even when we started to talk about the information superhighway,
06:56
which was a long time ago,
06:58
before social media, Twitter and all the rest of it,
07:00
I was actually really afraid
07:02
that that would put people into certain lanes and tunnels
07:06
and have them just focusing on areas of their own interest
07:11
instead of seeing the broad picture.
07:13
And I'm afraid to say that with algorithms, with logarithms,
07:18
with whatever the "-ithms" are
07:19
that direct us into all these particular channels of information,
07:24
that seems to be happening right now.
07:25
I mean, people have written about this phenomenon.
07:28
People have said that yes, the internet came,
07:30
its promise was to exponentially explode our access to more democracy,
07:36
more information,
07:38
less bias,
07:40
more varied information.
07:42
And, in fact, the opposite has happened.
07:44
And so that, for me, is incredibly dangerous.
07:48
And again, when you are the president of this country and you say things,
07:53
it also gives leaders in other undemocratic countries the cover
08:00
to affront us even worse,
08:02
and to really whack us -- and their own journalists --
08:05
with this bludgeon of fake news.
08:08
CH: To what extent is what happened, though,
08:10
in part, just an unintended consequence,
08:12
that the traditional media that you worked in
08:15
had this curation-mediation role,
08:17
where certain norms were observed,
08:19
certain stories would be rejected because they weren't credible,
08:22
but now that the standard for publication and for amplification
08:28
is just interest, attention, excitement, click,
08:32
"Did it get clicked on?"
08:33
"Send it out there!"
08:34
and that's what's -- is that part of what's caused the problem?
08:38
CA: I think it's a big problem, and we saw this in the election of 2016,
08:41
where the idea of "clickbait" was very sexy and very attractive,
08:46
and so all these fake news sites and fake news items
08:51
were not just haphazardly and by happenstance being put out there,
08:55
there's been a whole industry in the creation of fake news
08:59
in parts of Eastern Europe, wherever,
09:02
and you know, it's planted in real space and in cyberspace.
09:06
So I think that, also,
09:08
the ability of our technology to proliferate this stuff
09:13
at the speed of sound or light, just about --
09:17
we've never faced that before.
09:19
And we've never faced such a massive amount of information
09:24
which is not curated
09:25
by those whose profession leads them to abide by the truth,
09:31
to fact-check
09:32
and to maintain a code of conduct and a code of professional ethics.
09:37
CH: Many people here may know people who work at Facebook
09:40
or Twitter and Google and so on.
09:42
They all seem like great people with good intention --
09:46
let's assume that.
09:47
If you could speak with the leaders of those companies,
09:51
what would you say to them?
09:52
CA: Well, you know what --
09:54
I'm sure they are incredibly well-intentioned,
09:56
and they certainly developed an unbelievable, game-changing system,
10:01
where everybody's connected on this thing called Facebook.
10:05
And they've created a massive economy for themselves
10:08
and an amazing amount of income.
10:11
I would just say,
10:12
"Guys, you know, it's time to wake up and smell the coffee
10:17
and look at what's happening to us right now."
10:19
Mark Zuckerberg wants to create a global community.
10:22
I want to know: What is that global community going to look like?
10:26
I want to know where the codes of conduct actually are.
10:30
Mark Zuckerberg said --
10:31
and I don't blame him, he probably believed this --
10:34
that it was crazy to think
10:37
that the Russians or anybody else could be tinkering and messing around
10:41
with this avenue.
10:42
And what have we just learned in the last few weeks?
10:44
That, actually, there has been a major problem in that regard,
10:47
and now they're having to investigate it and figure it out.
10:51
Yes, they're trying to do what they can now
10:54
to prevent the rise of fake news,
10:56
but, you know,
10:58
it went pretty unrestricted for a long, long time.
11:03
So I guess I would say, you know,
11:05
you guys are brilliant at technology;
11:07
let's figure out another algorithm.
11:09
Can we not?
11:10
CH: An algorithm that includes journalistic investigation --
11:13
CA: I don't really know how they do it, but somehow, you know --
11:16
filter out the crap!
11:18
(Laughter)
11:19
And not just the unintentional --
11:21
(Applause)
11:24
but the deliberate商榷 lies that are planted種植的
227
672888
2206
也要濾掉刻意植入的謊言,
11:27
by people who've誰一直 been doing this
as a matter of warfare
228
675118
4325
由數十年來習以為常
把假新聞當作戰爭手段的那些人
11:31
for decades幾十年.
229
679467
1302
所植入的謊言。
11:32
The Soviets蘇聯, the Russians俄羅斯 --
230
680793
1933
蘇聯人、俄國人,
11:34
they are the masters主人 of war戰爭
by other means手段, of hybrid混合動力 warfare.
231
682750
5244
他們是利用另類手法
製造戰爭的大師,是混合戰的大師。
11:40
And this is a --
232
688618
1444
假新聞是他們決定採取的手段。
11:42
this is what they've他們已經 decided決定 to do.
233
690689
2984
11:45
It worked工作 in the United聯合的 States狀態,
234
693697
1605
在美國奏效了,
11:47
it didn't work in France法國,
235
695326
1321
在法國行不通,
11:48
it hasn't有沒有 worked工作 in Germany德國.
236
696671
1673
在德國還沒有用上。
11:50
During the elections選舉 there,
where they've他們已經 tried試著 to interfere干擾,
237
698368
2941
選舉期間,他們試圖干預,
11:53
the president主席 of France法國
right now, Emmanuel靈光 Macron音符,
238
701333
2602
法國的現任總統埃瑪紐耶爾馬克宏
11:55
took a very tough強硬 stand
and confronted面對 it head on,
239
703959
2523
採取非常強硬的立場正面迎戰,
11:58
as did Angela安吉拉 Merkel默克爾.
240
706506
1158
安格拉梅克爾也一樣。
(註:德國總理)
11:59
CHCH: There's some hope希望 to be had
from some of this, isn't there?
241
707688
2985
克里斯:這當中是有些希望的吧?
12:02
That the world世界 learns獲悉.
242
710697
1151
世界在學習。
12:03
We get fooled上當 once一旦,
243
711872
1318
我們被騙過一次,
12:05
maybe we get fooled上當 again,
244
713214
1332
也許我們會再被騙一次,
12:06
but maybe not the third第三 time.
245
714570
1455
但也許不會犯第三次錯。
12:08
Is that true真正?
246
716049
1168
是這樣的嗎?
12:09
CACA: I mean, let's hope希望.
247
717241
1156
克莉絲蒂安:咱們就希望如此吧。
12:10
But I think in this regard看待 that so much
of it is also about technology技術,
248
718421
3387
但我認為,在這方面
有很大一部份和科技相關,
12:13
that the technology技術 has to also be given特定
some kind of moral道德 compass羅盤.
249
721832
3445
科技也得要有某種道德羅盤。
12:17
I know I'm talking nonsense廢話,
but you know what I mean.
250
725301
2816
我知道我在說廢話,
但你們明白我的意思。
12:20
CHCH: We need a filter-the-crap過濾器的,廢話 algorithm算法
with a moral道德 compass羅盤 --
251
728141
3708
克里斯:我們需要一個道德羅盤
過濾狗屁的演算法。
12:23
CACA: There you go.
252
731873
1157
克莉絲蒂安:你說對了。
12:25
CHCH: I think that's good.
253
733054
1152
克里斯:我認為那很好。
12:26
CACA: No -- "moral道德 technology技術."
254
734230
1671
克莉絲蒂安:不,「道德科技」。
12:27
We all have moral道德 compasses圓規 --
moral道德 technology技術.
255
735925
3106
我們都要有道德羅盤──道德科技。
12:31
CHCH: I think that's a great challenge挑戰.
CACA: You know what I mean.
256
739055
2979
克里斯:我認為那是個大挑戰。
克莉絲蒂安:你懂我的意思。
12:34
CHCH: Talk just a minute分鐘 about leadership領導.
257
742058
1944
克里斯:花一分鐘談談領導。
12:36
You've had a chance機會 to speak說話
with so many許多 people across橫過 the world世界.
258
744026
3136
你有和世上那麼多人說話的機會。
12:39
I think for some of us --
259
747186
1239
我認為我們當中有些人──
12:40
I speak說話 for myself,
I don't know if others其他 feel this --
260
748449
2692
我代表自己發言,
不知道其他人是否有同感──
12:43
there's kind of been a disappointment失望 of:
261
751165
1996
一直懷有這樣的失望:
12:45
Where are the leaders領導者?
262
753185
1859
領導人在哪裡?
12:47
So many許多 of us have been disappointed失望 --
263
755068
2314
我們這麼多人一直覺得失望──
12:49
Aung San Suu Kyi,
what's happened發生 recently最近,
264
757406
2016
翁山蘇姬最近怎麼搞的,
12:51
it's like, "No! Another另一個 one
bites咬傷 the dust灰塵."
265
759446
2085
就像是:「不好!又陣亡了一個。」
12:53
You know, it's heartbreaking令人心碎.
266
761555
1599
很讓人心碎。
12:55
(Laughter笑聲)
267
763178
1235
(笑聲)
12:56
Who have you met
12:58
who you have been impressed by, inspired by?
13:01
CA: Well, you talk about the world in crisis,
13:03
which is absolutely true,
13:05
and those of us who spend our whole lives immersed in this crisis --
13:09
I mean, we're all on the verge of a nervous breakdown.
13:12
So it's pretty stressful right now.
13:15
And you're right --
13:16
there is this perceived and actual vacuum of leadership,
13:19
and it's not me saying it, I ask all these --
13:22
whoever I'm talking to, I ask about leadership.
13:25
I was speaking to the outgoing president of Liberia today,
13:29
[Ellen Johnson Sirleaf,]
13:31
who --
13:32
(Applause)
13:34
in three weeks' time,
13:36
will be one of the very rare heads of an African country
13:40
who actually abides by the constitution
13:42
and gives up power after her prescribed term.
13:46
She has said she wants to do that as a lesson.
13:50
But when I asked her about leadership,
13:52
and I gave a quick-fire round of certain names,
13:54
I presented her with the name of the new French president,
13:57
Emmanuel Macron.
13:59
And she said --
14:00
I said, "So what do you think when I say his name?"
14:03
And she said,
14:05
"Shaping up potentially to be
14:07
a leader to fill our current leadership vacuum."
14:12
I thought that was really interesting.
14:13
Yesterday, I happened to have an interview with him.
14:16
I'm very proud to say,
14:17
I got his first international interview. It was great. It was yesterday.
14:20
And I was really impressed.
14:22
I don't know whether I should be saying that in an open forum,
14:25
but I was really impressed.
14:26
(Laughter)
14:28
And it could be just because it was his first interview,
14:31
but -- I asked questions, and you know what?
14:33
He answered them!
14:34
(Laughter)
14:36
(Applause)
14:40
There was no spin,
14:41
there was no wiggle and waggle,
14:44
there was no spend-five-minutes-to-come-back-to-the-point.
14:47
I didn't have to keep interrupting,
14:48
which I've become rather renowned for doing,
14:50
because I want people to answer the question.
14:53
And he answered me,
14:55
and it was pretty interesting.
14:58
And he said --
14:59
CH: Tell me what he said.
15:01
CA: No, no, you go ahead.
15:02
CH: You're the interrupter, I'm the listener.
15:04
CA: No, no, go ahead.
15:06
CH: What'd he say?
15:07
CA: OK. You've talked about nationalism and tribalism here today.
15:10
I asked him, "How did you have the guts to confront the prevailing winds
15:14
of anti-globalization, nationalism, populism
15:18
when you can see what happened in Brexit,
15:20
when you could see what happened in the United States
15:23
and what might have happened in many European elections
15:25
at the beginning of 2017?"
15:27
And he said,
15:29
"For me, nationalism means war.
15:33
We have seen it before,
15:35
we have lived through it before on my continent,
15:37
and I am very clear about that."
15:40
So he was not going to, just for political expediency,
15:44
embrace the, kind of, lowest common denominator
15:47
that had been embraced in other political elections.
15:51
And he stood against Marine Le Pen, who is a very dangerous woman.
339
939655
4441
而他對抗瑪琳勒朋,
瑪琳勒朋是個很危險的女人。
15:56
CHCH: Last question for you, Christiane克莉絲蒂娜.
340
944928
2032
克里斯:克莉絲蒂安,最後一個問題。
16:00
Tell us about ideas思路 worth價值 spreading傳播.
341
948093
1998
和我們談談值得散播的想法。
16:02
If you could plant one idea理念
into the minds頭腦 of everyone大家 here,
342
950115
4647
如果你能在這裡的每個人
腦中植入一個想法,
16:06
what would that be?
343
954786
1197
會是什麼?
16:08
CACA: I would say really be careful小心
where you get your information信息 from;
344
956007
5114
克莉絲蒂安:我會說,
要非常留意你的資訊來自何處;
16:13
really take responsibility責任
for what you read, listen to and watch;
345
961145
5322
要對你所讀到、聽到、
看到的資訊主動負責;
16:18
make sure that you go to the trusted信任
brands品牌 to get your main主要 information信息,
346
966491
4887
確保你的主要資訊必須是從
可信任的品牌那兒取得的,
16:23
no matter whether是否 you have
a wide, eclectic不拘一格 intake錄取,
347
971402
4689
不論資訊的來源
有多麼廣泛或是多麼多樣化,
16:28
really stick with the brand
names that you know,
348
976115
2995
一定要守住你認識的品牌,
16:31
because in this world世界 right now,
at this moment時刻 right now,
349
979134
3592
因為在這個世界中,在目前這時刻,
16:34
our crises危機, our challenges挑戰,
our problems問題 are so severe嚴重,
350
982750
4339
我們的危機、我們的挑戰、
我們的問題,都非常嚴重,
16:39
that unless除非 we are all engaged訂婚
as global全球 citizens公民
351
987113
3551
除非我們都能以
全球市民的身份來參與,
16:42
who appreciate欣賞 the truth真相,
352
990688
1903
能夠意識到真相,
16:44
who understand理解 science科學,
empirical經驗 evidence證據 and facts事實,
353
992615
4345
能夠了解科學、實證證據與事實,
16:48
then we are just simply只是
going to be wandering飄零 along沿
354
996984
3499
不然我們就只會離開正道,
16:52
to a potential潛在 catastrophe災難.
355
1000507
1961
走向潛在的大災難。
16:54
So I would say, the truth真相,
356
1002492
1364
所以我會說「真相」,
16:55
and then I would come back
to Emmanuel靈光 Macron音符
357
1003880
2256
接著我會回到埃瑪紐耶爾馬克宏,
16:58
and talk about love.
358
1006160
1300
並且談「愛」。
17:00
I would say that there's not
enough足夠 love going around.
359
1008022
4469
我會說,還沒有足夠的愛。
17:04
And I asked him to tell me about love.
360
1012515
2692
我請他和我談談愛。
17:07
I said, "You know, your marriage婚姻
is the subject學科 of global全球 obsession困擾."
361
1015231
3592
我說:「你的婚姻
是全球都很迷戀的目標。」
17:10
(Laughter笑聲)
362
1018847
1635
(笑聲)
17:12
"Can you tell me about love?
363
1020506
1413
「你能跟我談談愛嗎?
17:13
What does it mean to you?"
364
1021943
1314
對你而言愛是什麼?」
17:15
I've never asked a president主席
or an elected當選 leader領導 about love.
365
1023281
2941
我從未請總統或民選領袖
跟我談論愛這議題,
17:18
I thought I'd try it.
366
1026246
1158
我想試一試。
17:19
And he said -- you know,
he actually其實 answered回答 it.
367
1027428
3915
而他──你們知道嗎,
他真的回答了。
17:23
And he said, "I love my wife妻子,
she is part部分 of me,
368
1031367
4161
他說:「我愛我太太,
她是我的一部份,
17:27
we've我們已經 been together一起 for decades幾十年."
369
1035552
1627
我們在一起數十年了。」
17:29
But here's這裡的 where it really counted,
370
1037203
1685
但真正重要,
17:30
what really stuck卡住 with me.
371
1038912
1503
真正讓我難忘的是
17:32
He said,
372
1040439
1241
他說:
17:33
"It is so important重要 for me
to have somebody at home
373
1041704
3520
「對我來說,家裡
有個人能告訴我真相
17:37
who tells告訴 me the truth真相."
374
1045248
1899
是非常重要的事。」
17:40
So you see, I brought it home.
It's all about the truth真相.
375
1048618
2712
看,我把話帶回到主題了,
重點就是真相。
17:43
(Laughter笑聲)
376
1051354
1006
(笑聲)
17:44
CHCH: So there you go. Truth真相 and love.
Ideas思路 worth價值 spreading傳播.
377
1052384
2807
克里斯:有你的。真相和愛。
值得散播的想法。
17:47
Christiane克莉絲蒂娜 Amanpour阿曼普爾, thank you
so much. That was great.
378
1055215
2663
克莉絲蒂安艾曼普,
非常謝謝你。很棒的訪談。
17:49
(Applause掌聲)
379
1057902
1068
(掌聲)
17:50
CACA: Thank you.
CHCH: That was really lovely可愛.
380
1058994
2334
克莉絲蒂安:謝謝您。
克里斯:訪談非常愉快。
17:53
(Applause掌聲)
381
1061352
1215
(掌聲)
17:54
CACA: Thank you.
382
1062591
1165
克莉絲蒂安:謝謝。
Translated by Lilian Chiu
Reviewed by Helen Chang

ABOUT THE SPEAKERS
Christiane Amanpour - Journalist
TV news legend Christiane Amanpour is known for her uncompromising approach to reporting and interviewing.

Why you should listen

Christiane Amanpour is CNN's chief international correspondent and anchor of the global affairs program "Amanpour," broadcast from the television network's London bureau. She has covered the most significant conflicts of recent decades, exposing both the brutality and human cost of war and its geopolitical impact. From the 1991 Gulf War to the siege of Sarajevo (a city that later named her an honorary citizen), from the 2003 American-led invasion of Iraq to the trial of Saddam Hussein the following year, Amanpour's fearless and uncompromising approach has made her popular with audiences, and a force to be reckoned with by global influencers.

During the Balkan wars, Amanpour famously broke with the idea of journalistic neutrality by calling out human rights abuses and saying that "there are some situations one simply cannot be neutral about, because when you are neutral you are an accomplice." Since her interview show "Amanpour" launched in 2009, she has spoken to leaders and decision makers on the issues affecting the world today while continuing to report from all over the world, including on the 2010 earthquake in Haiti and the 2011 tsunami in Japan.

Chris Anderson - TED Curator
After a long career in journalism and publishing, Chris Anderson became the curator of the TED Conference in 2002 and has developed it as a platform for identifying and disseminating ideas worth spreading.

Why you should listen

Chris Anderson is the Curator of TED, a nonprofit devoted to sharing valuable ideas, primarily through the medium of 'TED Talks' -- short talks that are offered free online to a global audience.

Chris was born in a remote village in Pakistan in 1957. He spent his early years in India, Pakistan and Afghanistan, where his parents worked as medical missionaries, and he attended an American school in the Himalayas for his early education. After boarding school in Bath, England, he went on to Oxford University, graduating in 1978 with a degree in philosophy, politics and economics.

Chris then trained as a journalist, working in newspapers and radio, including two years producing a world news service in the Seychelles Islands.

Back in the UK in 1984, Chris was captivated by the personal computer revolution and became an editor at one of the UK's early computer magazines. A year later he founded Future Publishing with a $25,000 bank loan. The new company initially focused on specialist computer publications but eventually expanded into other areas such as cycling, music, video games, technology and design, doubling in size every year for seven years. In 1994, Chris moved to the United States where he built Imagine Media, publisher of Business 2.0 magazine and creator of the popular video game users website IGN. Chris eventually merged Imagine and Future, taking the combined entity public in London in 1999, under the Future name. At its peak, it published 150 magazines and websites and employed 2,000 people.

This success allowed Chris to create a private nonprofit organization, the Sapling Foundation, with the hope of finding new ways to tackle tough global issues through media, technology, entrepreneurship and, most of all, ideas. In 2001, the foundation acquired the TED Conference, then an annual meeting of luminaries in the fields of Technology, Entertainment and Design held in Monterey, California, and Chris left Future to work full time on TED.

He expanded the conference's remit to cover all topics, including science, business and key global issues, while adding a Fellows program, which now has some 300 alumni, and the TED Prize, which grants its recipients "one wish to change the world." The TED stage has become a place for thinkers and doers from all fields to share their ideas and their work, capturing imaginations, sparking conversation and encouraging discovery along the way.

In 2006, TED experimented with posting some of its talks on the Internet. Their viral success encouraged Chris to begin positioning the organization as a global media initiative devoted to 'ideas worth spreading,' part of a new era of information dissemination using the power of online video. In June 2015, the organization posted its 2,000th talk online. The talks are free to view, and they have been translated into more than 100 languages with the help of volunteers from around the world. Viewership has grown to approximately one billion views per year.

Continuing a strategy of 'radical openness,' in 2009 Chris introduced the TEDx initiative, allowing free licenses to local organizers who wished to organize their own TED-like events. More than 8,000 such events have been held, generating an archive of 60,000 TEDx talks. And three years later, the TED-Ed program was launched, offering free educational videos and tools to students and teachers.
