ABOUT THE SPEAKER
Will MacAskill - Moral philosopher, writer
Will MacAskill is a cofounder of the effective altruism movement, a philosophy that tries to answer the question: How can we do as much good as possible?

Why you should listen

Will MacAskill is the author of Doing Good Better and the cofounder of the organization Giving What We Can and the social-impact-career advice project 80,000 Hours. He is a trustee of the Centre for Effective Altruism. As he writes: "I'm an associate professor of moral philosophy at Oxford University. I helped to create the philosophy of effective altruism: the use of evidence and reason to figure out how to do the most good we can.

"While growing up, I tried my best to do good things: I donated to high street charities, I worked at an old folks' home, I taught English in Ethiopia. But when I was a graduate student I realized that I'd never asked myself the question of whether these were the truly best ways of improving the world. So, being a nerd, I started doing some research to try to find out what activities would do the most to make the world a better place. Ten years later, I'm still asking that question.

"I'm still far from certain about what the best ways of doing good are, but I have learned that each and every one of us can do a tremendous amount to make the world a better place, whether that's by donating to buy bednets to protect children from malaria, campaigning to improve living conditions for animals in factory farms or working to ensure that the benefits of new technology are harnessed while avoiding the risks."

TED2018

Will MacAskill: What are the most important moral problems of our time?


2,112,093 views

Of all the problems facing humanity, which should we focus on solving first? In this compelling talk about how to make the world better, moral philosopher Will MacAskill provides a framework for answering that question based on the philosophy of "effective altruism," and shares ideas for taking on three pressing global problems.

00:12 This is a graph
00:14 that represents the economic history of human civilization.
00:18 [World GDP per capita over the last 200,000 years]
00:23 There's not much going on, is there?
00:26 For the vast majority of human history,
00:29 pretty much everyone lived on the equivalent of one dollar per day,
00:33 and not much changed.
00:36 But then, something extraordinary happened:
00:40 the Scientific and Industrial Revolutions.
00:43 And the basically flat graph you just saw
00:46 transforms into this.
00:50 What this graph means is that, in terms of power to change the world,
00:55 we live in an unprecedented time in human history,
00:58 and I believe our ethical understanding hasn't yet caught up with this fact.
01:03 The Scientific and Industrial Revolutions
01:05 transformed both our understanding of the world
01:08 and our ability to alter it.
01:11 What we need is an ethical revolution
01:15 so that we can work out
01:16 how we use this tremendous bounty of resources
01:19 to improve the world.
01:22 For the last 10 years,
01:23 my colleagues and I have developed a philosophy and research program
01:27 that we call effective altruism.
01:30 It tries to respond to these radical changes in our world,
01:33 and uses evidence and careful reasoning to try to answer this question:
01:40 How can we do the most good?
01:44 Now, there are many issues you've got to address
01:47 if you want to tackle this problem:
01:49 whether to do good through your charity
01:51 or your career or your political engagement,
01:54 what programs to focus on, who to work with.
01:57 But what I want to talk about
01:59 is what I think is the most fundamental problem.
02:02 Of all the many problems that the world faces,
02:05 which should we be focused on trying to solve first?
02:10 Now, I'm going to give you a framework for thinking about this question,
02:14 and the framework is very simple.
02:16 A problem is higher priority
02:19 the bigger, the more easily solvable and the more neglected it is.
02:24 Bigger is better,
02:26 because we've got more to gain if we do solve the problem.
02:30 More easily solvable is better
02:31 because I can solve the problem with less time or money.
02:35 And most subtly,
02:38 more neglected is better, because of diminishing returns.
02:42 The more resources that have already been invested into solving a problem,
02:46 the harder it will be to make additional progress.
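
That scale/solvability/neglectedness test can be turned into a rough scoring heuristic. Below is a minimal sketch of the idea in Python; the multiplicative form, the 1/(1 + invested) discount and every number are illustrative assumptions, not figures from the talk.

# Toy priority score: bigger, more easily solvable and more neglected
# problems score higher. Resources already invested discount the score,
# standing in for diminishing returns.
def priority_score(scale: float, solvability: float, invested_millions: float) -> float:
    # scale: how much good fully solving the problem would do (arbitrary units)
    # solvability: how much progress extra resources buy (0..1)
    # invested_millions: resources already devoted, a proxy for neglect
    neglectedness = 1.0 / (1.0 + invested_millions)
    return scale * solvability * neglectedness

# Two made-up problems of equal size and solvability, differing only in
# how crowded they already are:
crowded = priority_score(scale=100, solvability=0.5, invested_millions=1000)
neglected = priority_score(scale=100, solvability=0.5, invested_millions=10)
print(f"crowded:   {crowded:.3f}")    # ~0.050
print(f"neglected: {neglected:.3f}")  # ~4.545

Under these assumptions the otherwise identical but less crowded problem scores roughly ninety times higher, which is the diminishing-returns point in miniature.
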
02:50 Now, the key thing that I want to leave with you is this framework,
02:54 so that you can think for yourself
02:56 what are the highest global priorities.
02:59 But I and others in the effective altruism community
03:02 have converged on three moral issues that we believe are unusually important
03:08 and score unusually well in this framework.
03:11 First is global health.
03:13 This is supersolvable.
03:16 We have an amazing track record in global health.
03:19 Rates of death from measles, malaria, diarrheal disease
03:25 are down by over 70 percent.
03:29 And in 1980, we eradicated smallpox.
03:33 I estimate we thereby saved over 60 million lives.
03:37 That's more lives saved than if we'd achieved world peace
03:40 in that same time period.
03:43 On our current best estimates,
03:46 we can save a life by distributing long-lasting insecticide-treated bed nets
03:50 for just a few thousand dollars.
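
That "few thousand dollars" figure supports a simple back-of-the-envelope calculation. In the sketch below, the $3,000 cost per life is an assumed point within the range the talk quotes, and the donation amount is arbitrary.

# Lives saved per donation at an assumed cost per life.
cost_per_life_usd = 3_000   # assumed reading of "a few thousand dollars"
donation_usd = 30_000       # hypothetical donation
print(f"${donation_usd:,} saves roughly {donation_usd / cost_per_life_usd:.0f} lives")
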
03:52 This is an amazing opportunity.
03:55 The second big priority is factory farming.
03:58 This is superneglected.
04:00 There are 50 billion land animals used every year for food,
04:05 and the vast majority of them are factory farmed,
04:08 living in conditions of horrific suffering.
04:10 They're probably among the worst-off creatures on this planet,
04:13 and in many cases, we could significantly improve their lives
04:16 for just pennies per animal.
04:19 Yet this is hugely neglected.
04:21 There are 3,000 times more animals in factory farms
04:25 than there are stray pets, but yet, factory farming gets one fiftieth of the philanthropic funding.
04:34 That means additional resources in this area
04:36 could have a truly transformative impact.
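
The two ratios just quoted (3,000 times the animals, one fiftieth of the funding) already imply the size of the per-animal funding gap; a short sketch of that arithmetic, using only those ratios:

# Per-animal funding gap implied by the talk's own ratios.
animals_ratio = 3_000      # factory-farmed animals per stray pet
funding_ratio = 1 / 50     # factory-farming funding relative to pet funding
gap = animals_ratio / funding_ratio
print(f"Each factory-farmed animal gets about 1/{gap:,.0f} "
      f"of the funding a stray pet gets")  # 1/150,000
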
04:39 Now the third area is the one that I want to focus on the most,
04:42 and that's the category of existential risks:
04:45 events like a nuclear war or a global pandemic
04:50 that could permanently derail civilization
04:54 or even lead to the extinction of the human race.
04:57 Let me explain why I think this is such a big priority
05:00 in terms of this framework.
05:02 First, size.
05:05 How bad would it be if there were a truly existential catastrophe?
05:10 Well, it would involve the deaths of all seven billion people on this planet,
05:17 and that means you and everyone you know and love.
05:21 That's just a tragedy of unimaginable size.
05:25 But then, what's more,
05:27 it would also mean the curtailment of humanity's future potential,
05:31 and I believe that humanity's potential is vast.
05:35 The human race has been around for about 200,000 years,
05:39 and if she lives as long as a typical mammalian species,
05:41 she would last for about two million years.
05:46 If the human race were a single individual,
05:49 she would be just 10 years old today.
05:53 And what's more, the human race isn't a typical mammalian species.
05:58 There's no reason why, if we're careful,
06:00 we should die off after only two million years.
06:03 The earth will remain habitable for 500 million years to come.
06:08 And if someday, we took to the stars,
06:11 the civilization could continue for billions more.
06:16 So I think the future is going to be really big,
06:19 but is it going to be good?
06:21 Is the human race even really worth preserving?
06:26 Well, we hear all the time about how things have been getting worse,
06:31 but I think that when we take the long run,
06:34 things have been getting radically better.
06:37 Here, for example, is life expectancy over time.
06:40 Here's the proportion of people not living in extreme poverty.
06:45 Here's the number of countries over time that have decriminalized homosexuality.
06:50 Here's the number of countries over time that have become democratic.
06:55 Then, when we look to the future, there could be so much more to gain again.
06:59 We'll be so much richer,
07:00 we can solve so many problems that are intractable today.
07:05 So if this is kind of a graph of how humanity has progressed
07:09 in terms of total human flourishing over time,
07:12 well, this is what we would expect future progress to look like.
07:16 It's vast.
07:18 Here, for example,
07:20 is where we would expect no one to live in extreme poverty.
07:25 Here is where we would expect everyone to be better off
07:29 than the richest person alive today.
07:32 Perhaps here is where we would discover the fundamental natural laws
07:35 that govern our world.
07:37 Perhaps here is where we discover an entirely new form of art,
07:41 a form of music we currently lack the ears to hear.
07:45 And this is just the next few thousand years.
07:47 Once we think past that,
07:50 well, we can't even imagine the heights that human accomplishment might reach.
07:54 So the future could be very big and it could be very good,
07:57 but are there ways we could lose this value?
08:00 And sadly, I think there are.
08:02 The last two centuries brought tremendous technological progress,
08:06 but they also brought the global risks of nuclear war
08:08 and the possibility of extreme climate change.
08:11 When we look to the coming centuries,
08:13 we should expect to see the same pattern again.
08:16 And we can see some radically powerful technologies on the horizon.
08:20 Synthetic biology might give us the power to create viruses
08:23 of unprecedented contagiousness and lethality.
08:27 Geoengineering might give us the power to dramatically alter the earth's climate.
08:31 Artificial intelligence might give us the power to create intelligent agents
08:36 with abilities greater than our own.
08:40 Now, I'm not saying that any of these risks are particularly likely,
08:44 but when there's so much at stake,
08:45 even small probabilities matter a great deal.
08:49 Imagine if you're getting on a plane and you're kind of nervous,
08:52 and the pilot reassures you by saying,
08:56 "There's only a one-in-a-thousand chance of crashing. Don't worry."
09:02 Would you feel reassured?
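
The plane analogy is an expected-value argument: a very small probability multiplied by very large stakes is still a large expected loss. A minimal sketch of that reasoning follows; the one-in-a-thousand figure is the talk's, while the passenger count and the stylized "future lives" number are deliberately arbitrary stand-ins.

# Small probability x huge stakes = large expected loss.
p = 1 / 1_000                  # the pilot's "one-in-a-thousand"
passengers = 300               # hypothetical flight
print(f"Expected deaths per flight: {p * passengers:.2f}")     # 0.30

future_lives = 1e15            # stylized stand-in for humanity's potential
print(f"Expected future lives lost: {p * future_lives:,.0f}")  # 10^12
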
09:04 For these reasons, I think that preserving the future of humanity
09:08 is among the most important problems that we currently face.
09:12 But let's keep using this framework.
09:14 Is this problem neglected?
09:18 And I think the answer is yes,
09:20 and that's because problems that affect future generations
09:23 are often hugely neglected.
09:26 Why?
09:28 Because future people don't participate in markets today.
09:31 They don't have a vote.
09:33 It's not like there's a lobby representing the interests
09:36 of those born in 2300 AD.
09:40 They don't get to influence the decisions we make today.
09:43 They're voiceless.
09:46 And that means we still spend a paltry amount on these issues:
09:49 nuclear nonproliferation,
09:51 geoengineering, biorisk,
09:55 artificial intelligence safety.
09:57 All of these receive only a few tens of millions of dollars
10:00 of philanthropic funding every year.
10:04 That's tiny compared to the 390 billion dollars
10:08 that's spent on US philanthropy in total.
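
Dividing the two figures shows just how tiny; a one-line sketch, reading "a few tens of millions" as an assumed $50 million midpoint:

# Share of total US philanthropy going to these risk areas.
risk_funding_usd = 50e6          # assumed "a few tens of millions"
us_philanthropy_usd = 390e9      # figure from the talk
print(f"{risk_funding_usd / us_philanthropy_usd:.4%}")  # ~0.0128%
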
10:13 The final aspect of our framework then:
10:17 Is this solvable?
10:19 I believe it is.
10:21 You can contribute with your money,
10:24 your career or your political engagement.
10:28 With your money, you can support organizations
10:30 that focus on these risks,
10:31 like the Nuclear Threat Initiative,
10:34 which campaigns to take nuclear weapons off hair-trigger alert,
10:38 or the Blue Ribbon Panel, which develops policy to minimize the damage
10:41 from natural and man-made pandemics,
10:45 or the Center for Human-Compatible AI, which does technical research
10:48 to ensure that AI systems are safe and reliable.
10:52 With your political engagement,
10:54 you can vote for candidates that care about these risks,
10:57 and you can support greater international cooperation.
11:01 And then with your career, there is so much that you can do.
11:05 Of course, we need scientists and policymakers and organization leaders,
11:09 but just as importantly,
11:11 we also need accountants and managers and assistants
11:16 to work in these organizations that are tackling these problems.
11:20 Now, the research program of effective altruism
11:25 is still in its infancy,
11:27 and there's still a huge amount that we don't know.
11:31 But even with what we've learned so far,
11:34 we can see that by thinking carefully
11:37 and by focusing on those problems that are big, solvable and neglected,
11:43 we can make a truly tremendous difference to the world
11:45 for thousands of years to come.
11:47 Thank you.
11:49 (Applause)