ABOUT THE SPEAKER
Will MacAskill - Moral philosopher, writer
Will MacAskill is a cofounder of the effective altruism movement, a philosophy that tries to answer the question: How can we do as much good as possible?

Why you should listen

Will MacAskill is the author of Doing Good Better and the cofounder of the organization Giving What We Can and the social-impact-career advice project 80,000 Hours. He is a trustee of the Centre for Effective Altruism. As he writes: "I'm an associate professor of moral philosophy at Oxford University. I helped to create the philosophy of effective altruism: the use of evidence and reason to figure out how to do the most good we can.

"While growing up, I tried my best to do good things: I donated to high street charities, I worked at an old folks' home, I taught English in Ethiopia. But when I was a graduate student I realized that I'd never asked myself the question of whether these were the truly best ways of improving the world. So, being a nerd, I started doing some research to try to find out what activities would do the most to make the world a better place. Ten years later, I'm still asking that question.

"I'm still far from certain about what the best ways of doing good are, but I have learned that each and every one of us can do a tremendous amount to make the world a better place, whether that's by donating to buy bednets to protect children from malaria, campaigning to improve living conditions for animals in factory farms or working to ensure that the benefits of new technology are harnessed while avoiding the risks."

Will MacAskill | Speaker | TED.com
TED2018

Will MacAskill: What are the most important moral problems of our time?


2,112,093 views

Of all the problems humanity faces, which should we focus on solving first? In this compelling talk about how to make the world better, moral philosopher Will MacAskill provides a framework for answering this question based on the philosophy of effective altruism, and shares his thoughts on how to tackle three pressing global issues.


00:12
This is a graph that represents the economic history of human civilization.

00:18
[World GDP per capita over the last 200,000 years]

00:23
There's not much going on, is there. For the vast majority of human history, pretty much everyone lived on the equivalent of one dollar per day, and not much changed.

00:36
But then, something extraordinary happened: the Scientific and Industrial Revolutions. And the basically flat graph you just saw transforms into this.

00:50
What this graph means is that, in terms of power to change the world, we live in an unprecedented time in human history, and I believe our ethical understanding hasn't yet caught up with this fact.

01:03
The Scientific and Industrial Revolutions transformed both our understanding of the world and our ability to alter it. What we need is an ethical revolution so that we can work out how we use this tremendous bounty of resources to improve the world.

01:22
For the last 10 years, my colleagues and I have developed a philosophy and research program that we call effective altruism. It tries to respond to these radical changes in our world, and uses evidence and careful reasoning to try to answer this question: How can we do the most good?

01:44
Now, there are many issues you've got to address if you want to tackle this problem: whether to do good through your charity or your career or your political engagement, what programs to focus on, who to work with. But what I want to talk about is what I think is the most fundamental problem. Of all the many problems that the world faces, which should we be focused on trying to solve first?

02:10
Now, I'm going to give you a framework for thinking about this question, and the framework is very simple. A problem's higher priority the bigger, the more easily solvable and the more neglected it is. Bigger is better, because we've got more to gain if we do solve the problem. More easily solvable is better because I can solve the problem with less time or money. And most subtly, more neglected is better, because of diminishing returns. The more resources that have already been invested into solving a problem, the harder it will be to make additional progress.
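The three criteria can be read as a rough scoring heuristic. The sketch below is an illustration of that reading, not something from the talk itself; the multiplicative combination and all the example ratings are hypothetical:

```python
# A rough sketch of the prioritization heuristic described in the talk:
# a problem scores higher the bigger, the more solvable, and the more
# neglected it is. The numbers below are hypothetical illustrations.

def priority(scale: int, solvability: int, neglectedness: int) -> int:
    """Combine the three factors multiplicatively, so a problem must do
    reasonably well on all three dimensions to score highly overall."""
    return scale * solvability * neglectedness

# Hypothetical 1-10 ratings for two made-up problems:
well_funded_problem = priority(scale=9, solvability=6, neglectedness=2)
neglected_problem = priority(scale=7, solvability=5, neglectedness=9)

# Diminishing returns show up as a low neglectedness score: the
# well-funded problem loses despite being bigger.
assert neglected_problem > well_funded_problem
```

Multiplying rather than adding captures the point that a problem crowded with existing resources scores poorly even if it is large and tractable.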
02:50
Now, the key thing that I want to leave with you is this framework, so that you can think for yourself what are the highest global priorities. But I and others in the effective altruism community have converged on three moral issues that we believe are unusually important, score unusually well in this framework.

03:11
First is global health. This is supersolvable. We have an amazing track record in global health. Rates of death from measles, malaria, diarrheal disease are down by over 70 percent.

03:29
And in 1980, we eradicated smallpox. I estimate we thereby saved over 60 million lives. That's more lives saved than if we'd achieved world peace in that same time period.

03:43
On our current best estimates, we can save a life by distributing long-lasting insecticide-treated bed nets for just a few thousand dollars. This is an amazing opportunity.

03:55
The second big priority is factory farming. This is superneglected. There are 50 billion land animals used every year for food, and the vast majority of them are factory farmed, living in conditions of horrific suffering. They're probably among the worst-off creatures on this planet, and in many cases, we could significantly improve their lives for just pennies per animal.

04:19
Yet this is hugely neglected. There are 3,000 times more animals in factory farms than there are stray pets, but yet, factory farming gets one fiftieth of the philanthropic funding. That means additional resources in this area could have a truly transformative impact.

04:39
Now the third area is the one that I want to focus on the most, and that's the category of existential risks: events like a nuclear war or a global pandemic that could permanently derail civilization or even lead to the extinction of the human race. Let me explain why I think this is such a big priority in terms of this framework.

05:02
First, size. How bad would it be if there were a truly existential catastrophe? Well, it would involve the deaths of all seven billion people on this planet, and that means you and everyone you know and love. That's just a tragedy of unimaginable size.

05:25
But then, what's more, it would also mean the curtailment of humanity's future potential, and I believe that humanity's potential is vast.

05:35
The human race has been around for about 200,000 years, and if she lives as long as a typical mammalian species, she would last for about two million years. If the human race were a single individual, she would be just 10 years old today.

05:53
And what's more, the human race isn't a typical mammalian species. There's no reason why, if we're careful, we should die off after only two million years. The earth will remain habitable for 500 million years to come. And if someday, we took to the stars, the civilization could continue for billions more.

06:16
So I think the future is going to be really big, but is it going to be good? Is the human race even really worth preserving? Well, we hear all the time about how things have been getting worse, but I think that when we take the long run, things have been getting radically better.

06:37
Here, for example, is life expectancy over time. Here's the proportion of people not living in extreme poverty. Here's the number of countries over time that have decriminalized homosexuality. Here's the number of countries over time that have become democratic.

06:55
Then, when we look to the future, there could be so much more to gain again. We'll be so much richer, we can solve so many problems that are intractable today.

07:05
So if this is kind of a graph of how humanity has progressed in terms of total human flourishing over time, well, this is what we would expect future progress to look like. It's vast. Here, for example, is where we would expect no one to live in extreme poverty. Here is where we would expect everyone to be better off than the richest person alive today. Perhaps here is where we would discover the fundamental natural laws that govern our world. Perhaps here is where we discover an entirely new form of art, a form of music we currently lack the ears to hear.

07:45
And this is just the next few thousand years. Once we think past that, well, we can't even imagine the heights that human accomplishment might reach. So the future could be very big and it could be very good, but are there ways we could lose this value?

08:00
And sadly, I think there are. The last two centuries brought tremendous technological progress, but they also brought the global risks of nuclear war and the possibility of extreme climate change. When we look to the coming centuries, we should expect to see the same pattern again. And we can see some radically powerful technologies on the horizon.

08:20
Synthetic biology might give us the power to create viruses of unprecedented contagiousness and lethality. Geoengineering might give us the power to dramatically alter the earth's climate. Artificial intelligence might give us the power to create intelligent agents with abilities greater than our own.

08:40
Now, I'm not saying that any of these risks are particularly likely, but when there's so much at stake, even small probabilities matter a great deal. Imagine if you're getting on a plane and you're kind of nervous, and the pilot reassures you by saying, "There's only a one-in-a-thousand chance of crashing. Don't worry." Would you feel reassured?

09:04
For these reasons, I think that preserving the future of humanity is among the most important problems that we currently face. But let's keep using this framework. Is this problem neglected? And I think the answer is yes, and that's because problems that affect future generations are often hugely neglected. Why? Because future people don't participate in markets today. They don't have a vote. It's not like there's a lobby representing the interests of those born in 2300 AD. They don't get to influence the decisions we make today. They're voiceless.

09:46
And that means we still spend a paltry amount on these issues: nuclear nonproliferation, geoengineering, biorisk, artificial intelligence safety. All of these receive only a few tens of millions of dollars of philanthropic funding every year. That's tiny compared to the 390 billion dollars that's spent on US philanthropy in total.

10:13
The final aspect of our framework then: Is this solvable? I believe it is. You can contribute with your money, your career or your political engagement.

10:28
With your money, you can support organizations that focus on these risks, like the Nuclear Threat Initiative, which campaigns to take nuclear weapons off hair-trigger alert, or the Blue Ribbon Panel, which develops policy to minimize the damage from natural and man-made pandemics, or the Center for Human-Compatible AI, which does technical research to ensure that AI systems are safe and reliable.

10:52
With your political engagement, you can vote for candidates that care about these risks, and you can support greater international cooperation.

11:01
And then with your career, there is so much that you can do. Of course, we need scientists and policymakers and organization leaders, but just as importantly, we also need accountants and managers and assistants to work in these organizations that are tackling these problems.

11:20
Now, the research program of effective altruism is still in its infancy, and there's still a huge amount that we don't know. But even with what we've learned so far, we can see that by thinking carefully and by focusing on those problems that are big, solvable and neglected, we can make a truly tremendous difference to the world for thousands of years to come.

11:47
Thank you.

11:49
(Applause)
