ABOUT THE SPEAKERS
Jack Dorsey - Entrepreneur, programmer
Jack Dorsey is the CEO of Twitter, CEO & Chairman of Square, and a cofounder of both.

Chris Anderson - TED Curator
After a long career in journalism and publishing, Chris Anderson became the curator of the TED Conference in 2002 and has developed it as a platform for identifying and disseminating ideas worth spreading.

Why you should listen

Chris Anderson is the Curator of TED, a nonprofit devoted to sharing valuable ideas, primarily through the medium of 'TED Talks' -- short talks that are offered free online to a global audience.

Chris was born in a remote village in Pakistan in 1957. He spent his early years in India, Pakistan and Afghanistan, where his parents worked as medical missionaries, and he attended an American school in the Himalayas for his early education. After boarding school in Bath, England, he went on to Oxford University, graduating in 1978 with a degree in philosophy, politics and economics.

Chris then trained as a journalist, working in newspapers and radio, including two years producing a world news service in the Seychelles Islands.

Back in the UK in 1984, Chris was captivated by the personal computer revolution and became an editor at one of the UK's early computer magazines. A year later he founded Future Publishing with a $25,000 bank loan. The new company initially focused on specialist computer publications but eventually expanded into other areas such as cycling, music, video games, technology and design, doubling in size every year for seven years. In 1994, Chris moved to the United States where he built Imagine Media, publisher of Business 2.0 magazine and creator of the popular video game users website IGN. Chris eventually merged Imagine and Future, taking the combined entity public in London in 1999, under the Future name. At its peak, it published 150 magazines and websites and employed 2,000 people.

This success allowed Chris to create a private nonprofit organization, the Sapling Foundation, with the hope of finding new ways to tackle tough global issues through media, technology, entrepreneurship and, most of all, ideas. In 2001, the foundation acquired the TED Conference, then an annual meeting of luminaries in the fields of Technology, Entertainment and Design held in Monterey, California, and Chris left Future to work full time on TED.

He expanded the conference's remit to cover all topics, including science, business and key global issues, while adding a Fellows program, which now has some 300 alumni, and the TED Prize, which grants its recipients "one wish to change the world." The TED stage has become a place for thinkers and doers from all fields to share their ideas and their work, capturing imaginations, sparking conversation and encouraging discovery along the way.

In 2006, TED experimented with posting some of its talks on the Internet. Their viral success encouraged Chris to begin positioning the organization as a global media initiative devoted to 'ideas worth spreading,' part of a new era of information dissemination using the power of online video. In June 2015, the organization posted its 2,000th talk online. The talks are free to view, and they have been translated into more than 100 languages with the help of volunteers from around the world. Viewership has grown to approximately one billion views per year.

Continuing a strategy of 'radical openness,' in 2009 Chris introduced the TEDx initiative, allowing free licenses to local organizers who wished to organize their own TED-like events. More than 8,000 such events have been held, generating an archive of 60,000 TEDx talks. And three years later, the TED-Ed program was launched, offering free educational videos and tools to students and teachers.

Whitney Pennington Rodgers - TED Current Affairs Curator
Whitney Pennington Rodgers is an award-winning journalist and media professional.

Why you should listen

Prior to joining TED as current affairs curator, Whitney Pennington Rodgers produced for NBC's primetime news magazine Dateline NBC. She earned a duPont-Columbia award and a News & Documentary Emmy for her contributions to the Dateline NBC hour "The Cosby Accusers Speak" -- an extensive group interview with 27 of the women who accused entertainer Bill Cosby of sexual misconduct.

Pennington Rodgers has worked at NBC's in-house production company Peacock Productions, The Today Show, Nightly News, Rock Center with Brian Williams and New Jersey-centric public affairs shows Caucus: New Jersey and One-on-One with Steve Adubato. Prior to beginning her career in media, she had a short stint as a fourth-grade teacher through the Teach for America program.

Pennington Rodgers received her Bachelor's in journalism and media studies from Rutgers University. She completed her Master's of Journalism at the University of California at Berkeley, where she produced a documentary about recruitment of nonblack students at historically black colleges and universities.

TED2019

Jack Dorsey: How Twitter needs to change

Filmed:
2,089,470 views

Can Twitter be saved? In a wide-ranging conversation with TED's Chris Anderson and Whitney Pennington Rodgers, Twitter CEO Jack Dorsey discusses the future of the platform -- acknowledging its problems with harassment and moderation and proposing some fundamental changes that he hopes will encourage healthy, respectful conversation. "Can we deliver something that people value every single day?" Dorsey asks.


00:13
Chris Anderson: What worries you right now? You've been very open about lots of issues on Twitter. What would be your top worry about where things are right now?

00:23
Jack Dorsey: Right now, the health of the conversation. So, our purpose is to serve the public conversation, and we have seen a number of attacks on it. We've seen abuse, we've seen harassment, we've seen manipulation, automation, human coordination, misinformation. So these are all dynamics that we were not expecting 13 years ago when we were starting the company. But we do now see them at scale, and what worries me most is just our ability to address it in a systemic way that is scalable, that has a rigorous understanding of how we're taking action, a transparent understanding of how we're taking action and a rigorous appeals process for when we're wrong, because we will be wrong.

01:20
Whitney Pennington Rodgers: I'm really glad to hear that that's something that concerns you, because I think there's been a lot written about people who feel they've been abused and harassed on Twitter, and I think no one more so than women and women of color and black women. And there's been data that's come out -- Amnesty International put out a report a few months ago where they showed that among a subset of active black female Twitter users, on average, one in 10 of their tweets were some form of harassment. And so when you think about health for the community on Twitter, I'm interested to hear, "health for everyone," but specifically: How are you looking to make Twitter a safe space for that subset, for women, for women of color and black women?
02:05
JD: Yeah. So it's a pretty terrible situation when you're coming to a service that, ideally, you want to learn something about the world from, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment. So what we're looking most deeply at is just the incentives that the platform naturally provides and the service provides. Right now, the dynamic of the system makes it super-easy to harass and to abuse others through the service, and unfortunately, the majority of our system in the past worked entirely based on people reporting harassment and abuse. So about midway last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem, and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely. And we've made some progress recently. About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them. But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it. But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it, which was a lot of work for them, a lot of work for us and just ultimately unfair.

03:46
The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve. We can't build a business that is successful unless we have a diversity of perspective inside of our walls that actually feel these issues every single day. And that's not just with the team that's doing the work, it's also within our leadership as well. So we need to continue to build empathy for what people are experiencing and give them better tools to act on it and also give our customers a much better and easier approach to handle some of the things that they're seeing. So a lot of what we're doing is around technology, but we're also looking at the incentives on the service: What does Twitter incentivize you to do when you first open it up? And in the past, it's incented a lot of outrage, it's incented a lot of mob behavior, it's incented a lot of group harassment. And we have to look a lot deeper at some of the fundamentals of what the service is doing to make the bigger shifts. We can make a bunch of small shifts around technology, as I just described, but ultimately, we have to look deeply at the dynamics in the network itself, and that's what we're doing.
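[Editor's note: Dorsey is describing a flag-then-review pipeline, where machine-learning models proactively surface likely abuse and humans make the final enforcement call. The sketch below is only an illustration of that flow under assumed names: the abuse_score model, the 0.9 threshold and the ReviewQueue are hypothetical stand-ins, not Twitter's actual system.]

# Minimal sketch of "proactively flag, humans decide" (hypothetical, for illustration only).
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Tweet:
    tweet_id: int
    text: str


@dataclass
class ReviewQueue:
    """Holds model-flagged tweets until a human reviewer makes the final call."""
    pending: List[Tweet] = field(default_factory=list)

    def add(self, tweet: Tweet) -> None:
        self.pending.append(tweet)


def triage(tweets: List[Tweet],
           abuse_score: Callable[[str], float],
           queue: ReviewQueue,
           threshold: float = 0.9) -> None:
    """Score every tweet and enqueue likely abuse for human review.

    Nothing is removed automatically, mirroring the point that flagged
    content is still reviewed by a person before any action is taken.
    """
    for tweet in tweets:
        if abuse_score(tweet.text) >= threshold:
            queue.add(tweet)


if __name__ == "__main__":
    # Toy usage: a keyword heuristic stands in for the real ML model.
    naive_score = lambda text: 1.0 if "kill yourself" in text.lower() else 0.0
    q = ReviewQueue()
    triage([Tweet(1, "great talk!"), Tweet(2, "kill yourself")], naive_score, q)
    print([t.tweet_id for t in q.pending])  # -> [2]
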
05:05
CA: But what's your sense -- what is the kind of thing that you might be able to change that would actually fundamentally shift behavior?

05:15
JD: Well, one of the things -- we started the service with this concept of following an account, as an example, and I don't believe that's why people actually come to Twitter. I believe Twitter is best as an interest-based network. People come with a particular interest. They have to do a ton of work to find and follow the related accounts around those interests. What we could do instead is allow you to follow an interest, follow a hashtag, follow a trend, follow a community, which gives us the opportunity to show all of the accounts, all the topics, all the moments, all the hashtags that are associated with that particular topic and interest, which really opens up the perspective that you see. But that is a huge fundamental shift to bias the entire network away from just an account bias towards a topics and interest bias.
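[Editor's note: as a rough illustration of the interest-based model Dorsey sketches -- following hashtags, trends and communities in addition to accounts -- the toy data model below shows how a timeline could be assembled from everything associated with a followed interest rather than from followed accounts alone. The names and structures are hypothetical, not Twitter's schema.]

# Hypothetical sketch: typed "follow" edges and an interest-driven timeline.
from collections import defaultdict
from typing import Dict, List, Set

# What a user follows: entity kind -> set of entity names.
follows: Dict[str, Set[str]] = {
    "account": {"@nasa"},
    "hashtag": {"#climate"},
    "community": {"urban-cycling"},
}

# Index of tweets by the entities they are associated with (author, hashtag, topic).
tweets_by_entity: Dict[str, List[str]] = defaultdict(list)
tweets_by_entity["@nasa"].append("Artemis update ...")
tweets_by_entity["#climate"].append("New IPCC summary thread ...")
tweets_by_entity["urban-cycling"].append("Protected bike lanes map ...")


def build_timeline(followed: Dict[str, Set[str]]) -> List[str]:
    """Union of tweets tied to any followed entity, regardless of its kind."""
    seen: Set[str] = set()
    timeline: List[str] = []
    for entities in followed.values():
        for entity in entities:
            for tweet in tweets_by_entity[entity]:
                if tweet not in seen:
                    seen.add(tweet)
                    timeline.append(tweet)
    return timeline


print(build_timeline(follows))
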
06:15
CA: Because isn't it the case that one reason why you have so much content on there is a result of putting millions of people around the world in this kind of gladiatorial contest with each other for followers, for attention? Like, from the point of view of people who just read Twitter, that's not an issue, but for the people who actually create it, everyone's out there saying, "You know, I wish I had a few more 'likes,' followers, retweets." And so they're constantly experimenting, trying to find the path to do that. And what we've all discovered is that the number one path to do that is to be some form of provocative, obnoxious, eloquently obnoxious, like, eloquent insults are a dream on Twitter, where you rapidly pile up -- and it becomes this self-fueling process of driving outrage. How do you defuse that?

07:12
JD: Yeah, I mean, I think you're spot on, but that goes back to the incentives. Like, one of the choices we made in the early days was we had this number that showed how many people follow you. We decided that number should be big and bold, and anything that's on the page that's big and bold has importance, and those are the things that you want to drive. Was that the right decision at the time? Probably not. If I had to start the service again, I would not emphasize the follower count as much. I would not emphasize the "like" count as much. I don't think I would even create "like" in the first place, because it doesn't actually push what we believe now to be the most important thing, which is healthy contribution back to the network and conversation to the network, participation within conversation, learning something from the conversation. Those are not things that we thought of 13 years ago, and we believe are extremely important right now. So we have to look at how we display the follower count, how we display retweet count, how we display "likes," and just ask the deep question: Is this really the number that we want people to drive up? Is this the thing that, when you open Twitter, you see, "That's the thing I need to increase?" And I don't believe that's the case right now.

08:28
(Applause)
08:30
WPR: I think we should look at some of the tweets that are coming in from the audience as well.

08:35
CA: Let's see what you guys are asking. I mean, this is -- generally, one of the amazing things about Twitter is how you can use it for crowd wisdom, you know, that more knowledge, more questions, more points of view than you can imagine, and sometimes, many of them are really healthy.

08:53
WPR: I think one I saw that passed already quickly down here, "What's Twitter's plan to combat foreign meddling in the 2020 US election?" I think that's something that's an issue we're seeing on the internet in general, that we have a lot of malicious automated activity happening. And on Twitter, for example, in fact, we have some work that's come from our friends at Zignal Labs, and maybe we can even see that to give us an example of what exactly I'm talking about, where you have these bots, if you will, or coordinated automated malicious account activity, that is being used to influence things like elections. And in this example we have from Zignal which they've shared with us using the data that they have from Twitter, you actually see that in this case, white represents the humans -- human accounts, each dot is an account. The pinker it is, the more automated the activity is. And you can see how you have a few humans interacting with bots. In this case, it's related to the election in Israel and spreading misinformation about Benny Gantz, and as we know, in the end, that was an election that Netanyahu won by a slim margin, and that may have been in some case influenced by this. And when you think about that happening on Twitter, what are the things that you're doing, specifically, to ensure you don't have misinformation like this spreading in this way, influencing people in ways that could affect democracy?
10:23
JD: Just to back up a bit, we asked ourselves a question: Can we actually measure the health of a conversation, and what does that mean? And in the same way that you have indicators and we have indicators as humans in terms of are we healthy or not, such as temperature, the flushness of your face, we believe that we could find the indicators of conversational health. And we worked with a lab called Cortico at MIT to propose four starter indicators that we believe we could ultimately measure on the system.

11:05
And the first one is what we're calling shared attention. It's a measure of how much of the conversation is attentive on the same topic versus disparate. The second one is called shared reality, and this is what percentage of the conversation shares the same facts -- not whether those facts are truthful or not, but are we sharing the same facts as we converse? The third is receptivity: How much of the conversation is receptive or civil or the inverse, toxic? And then the fourth is variety of perspective. So, are we seeing filter bubbles or echo chambers, or are we actually getting a variety of opinions within the conversation? And implicit in all four of these is the understanding that, as they increase, the conversation gets healthier and healthier.

12:00
So our first step is to see if we can measure these online, which we believe we can. We have the most momentum around receptivity. We have a toxicity score, a toxicity model, on our system that can actually measure whether you are likely to walk away from a conversation that you're having on Twitter because you feel it's toxic, with some pretty high degree. We're working to measure the rest, and the next step is, as we build up solutions, to watch how these measurements trend over time and continue to experiment. And our goal is to make sure that these are balanced, because if you increase one, you might decrease another. If you increase variety of perspective, you might actually decrease shared reality.
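[Editor's note: for readers who want a concrete picture of the four starter indicators Dorsey attributes to the Cortico collaboration, here is a back-of-the-envelope sketch of how such signals might be computed from a labeled conversation. The formulas are illustrative guesses only -- shared attention as topic concentration, shared reality as agreement on cited facts, receptivity as one minus average toxicity, and variety of perspective as normalized stance entropy -- not Cortico's or Twitter's actual metrics.]

# Hypothetical per-conversation health indicators (illustrative formulas only).
import math
from collections import Counter
from typing import Dict, List


def shared_attention(topics: List[str]) -> float:
    """Fraction of the conversation on its single most-discussed topic."""
    counts = Counter(topics)
    return max(counts.values()) / len(topics)


def shared_reality(cited_facts: List[frozenset]) -> float:
    """Share of messages that cite the conversation's most common fact set."""
    counts = Counter(cited_facts)
    return max(counts.values()) / len(cited_facts)


def receptivity(toxicity_scores: List[float]) -> float:
    """One minus mean toxicity: higher means a more civil conversation."""
    return 1.0 - sum(toxicity_scores) / len(toxicity_scores)


def variety_of_perspective(stances: List[str]) -> float:
    """Normalized entropy of expressed stances; 0 = echo chamber, 1 = maximal variety."""
    counts = Counter(stances)
    total = len(stances)
    if len(counts) < 2:
        return 0.0
    entropy = -sum((c / total) * math.log(c / total) for c in counts.values())
    return entropy / math.log(len(counts))


def health_report(conv: Dict[str, list]) -> Dict[str, float]:
    return {
        "shared_attention": shared_attention(conv["topics"]),
        "shared_reality": shared_reality(conv["facts"]),
        "receptivity": receptivity(conv["toxicity"]),
        "variety_of_perspective": variety_of_perspective(conv["stances"]),
    }


# Toy conversation with hand-labeled attributes.
print(health_report({
    "topics": ["election", "election", "memes"],
    "facts": [frozenset({"turnout=67%"}), frozenset({"turnout=67%"}), frozenset()],
    "toxicity": [0.1, 0.4, 0.2],
    "stances": ["pro", "con", "con"],
}))
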
12:51
CA: Just picking up on some of the questions flooding in here.

12:56
JD: Constant questioning.

12:58
CA: A lot of people are puzzled why, like, how hard is it to get rid of Nazis from Twitter?

13:08
JD: (Laughs) So we have policies around violent extremist groups, and the majority of our work and our terms of service works on conduct, not content. So we're actually looking for conduct. Conduct being using the service to repeatedly or episodically harass someone, using hateful imagery that might be associated with the KKK or the American Nazi Party. Those are all things that we act on immediately. We're in a situation right now where that term is used fairly loosely, and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform. So a lot of our models are based around, number one: Is this account associated with a violent extremist group? And if so, we can take action. And we have done so on the KKK and the American Nazi Party and others. And number two: Are they using imagery or conduct that would associate them as such as well?
14:22
CA: How many people do you have working on content moderation to look at this?

14:26
JD: It varies. We want to be flexible on this, because we want to make sure that we're, number one, building algorithms instead of just hiring massive amounts of people, because we need to make sure that this is scalable, and there are no amount of people that can actually scale this. So this is why we've done so much work around proactive detection of abuse that humans can then review. We want to have a situation where algorithms are constantly scouring every single tweet and bringing the most interesting ones to the top so that humans can bring their judgment to whether we should take action or not, based on our terms of service.

15:05
WPR: But there's not an amount of people that are scalable, but how many people do you currently have monitoring these accounts, and how do you figure out what's enough?

15:14
JD: They're completely flexible. Sometimes we associate folks with spam. Sometimes we associate folks with abuse and harassment. We're going to make sure that we have flexibility in our people so that we can direct them at what is most needed. Sometimes, the elections. We've had a string of elections in Mexico, one coming up in India, obviously, the election last year, the midterm election, so we just want to be flexible with our resources.

15:42
So when people -- just as an example, if you go to our current terms of service and you bring the page up, and you're wondering about abuse and harassment that you just received and whether it was against our terms of service to report it, the first thing you see when you open that page is around intellectual property protection. You scroll down and you get to abuse, harassment and everything else that you might be experiencing. So I don't know how that happened over the company's history, but we put that above the thing that people want the most information on and to actually act on. And just our ordering shows the world what we believed was important. So we're changing all that. We're ordering it the right way, but we're also simplifying the rules so that they're human-readable, so that people can actually understand themselves when something is against our terms and when something is not.

16:48
And then we're making -- again, our big focus is on removing the burden of work from the victims. So that means push more towards technology, rather than humans doing the work -- that means the humans receiving the abuse and also the humans having to review that work. So we want to make sure that we're not just encouraging more work around something that's super, super negative, and we want to have a good balance between the technology and where humans can actually be creative, which is the judgment of the rules, and not just all the mechanical stuff of finding and reporting them. So that's how we think about it.
17:27
CA: I'm curious to dig in more about what you said. I mean, I love that you said you are looking for ways to re-tweak the fundamental design of the system to discourage some of the reactive behavior, and perhaps -- to use Tristan Harris-type language -- engage people's more reflective thinking. How far advanced is that? What would alternatives to that "like" button be?

17:55
JD: Well, first and foremost, my personal goal with the service is that I believe fundamentally that public conversation is critical. There are existential problems facing the world that are facing the entire world, not any one particular nation-state, that global public conversation benefits. And that is one of the unique dynamics of Twitter, that it is completely open, it is completely public, it is completely fluid, and anyone can see any other conversation and participate in it. So there are conversations like climate change. There are conversations like the displacement in the work through artificial intelligence. There are conversations like economic disparity. No matter what any one nation-state does, they will not be able to solve the problem alone. It takes coordination around the world, and that's where I think Twitter can play a part.

18:49
The second thing is that Twitter, right now, when you go to it, you don't necessarily walk away feeling like you learned something. Some people do. Some people have a very, very rich network, a very rich community that they learn from every single day. But it takes a lot of work and a lot of time to build up to that. So we want to get people to those topics and those interests much, much faster and make sure that they're finding something that, no matter how much time they spend on Twitter -- and I don't want to maximize the time on Twitter, I want to maximize what they actually take away from it and what they learn from it, and --

19:29
CA: Well, do you, though? Because that's the core question that a lot of people want to know. Surely, Jack, you're constrained, to a huge extent, by the fact that you're a public company, you've got investors pressing on you, the number one way you make your money is from advertising -- that depends on user engagement. Are you willing to sacrifice user time, if need be, to go for a more reflective conversation?

19:56
JD: Yeah; more relevance means less time on the service, and that's perfectly fine, because we want to make sure that, like, you're coming to Twitter, and you see something immediately that you learn from and that you push. We can still serve an ad against that. That doesn't mean you need to spend any more time to see more. The second thing we're looking at --
20:17
CA: But just -- on that goal, daily active usage, if you're measuring that, that doesn't necessarily mean things that people value every day. It may well mean things that people are drawn to like a moth to the flame, every day. We are addicted, because we see something that pisses us off, so we go in and add fuel to the fire, and the daily active usage goes up, and there's more ad revenue there, but we all get angrier with each other. How do you define ... "Daily active usage" seems like a really dangerous term to be optimizing.

20:49
(Applause)

20:54
JD: Taken alone, it is, but you didn't let me finish the other metric, which is, we're watching for conversations and conversation chains. So we want to incentivize healthy contribution back to the network, and what we believe that is is actually participating in conversation that is healthy, as defined by those four indicators I articulated earlier. So you can't just optimize around one metric. You have to balance and look constantly at what is actually going to create a healthy contribution to the network and a healthy experience for people. Ultimately, we want to get to a metric where people can tell us, "Hey, I learned something from Twitter, and I'm walking away with something valuable." That is our goal ultimately over time, but that's going to take some time.

21:43
CA: You come over to many, I think to me, as this enigma. This is possibly unfair, but I woke up the other night with this picture of how I found I was thinking about you and the situation, that we're on this great voyage with you on this ship called the "Twittanic" --

22:03
(Laughter)

22:04
and there are people on board in steerage who are expressing discomfort, and you, unlike many other captains, are saying, "Well, tell me, talk to me, listen to me, I want to hear." And they talk to you, and they say, "We're worried about the iceberg ahead." And you go, "You know, that is a powerful point, and our ship, frankly, hasn't been built properly for steering as well as it might." And we say, "Please do something." And you go to the bridge, and we're waiting, and we look, and then you're showing this extraordinary calm, but we're all standing outside, saying, "Jack, turn the fucking wheel!" You know?

22:42
(Laughter)

22:44
(Applause)

22:46
I mean --

22:47
(Applause)

22:49
It's democracy at stake. It's our culture at stake. It's our world at stake. And Twitter is amazing and shapes so much. It's not as big as some of the other platforms, but the people of influence use it to set the agenda, and it's just hard to imagine a more important role in the world than to ... I mean, you're doing a brilliant job of listening, Jack, and hearing people, but to actually dial up the urgency and move on this stuff -- will you do that?
23:24
JD: Yes, and we have been moving substantially. I mean, there's been a few dynamics in Twitter's history. One, when I came back to the company, we were in a pretty dire state in terms of our future, and not just from how people were using the platform, but from a corporate narrative as well. So we had to fix a bunch of the foundation, turn the company around, go through two crazy layoffs, because we just got too big for what we were doing, and we focused all of our energy on this concept of serving the public conversation. And that took some work.

24:07
And as we dived into that, we realized some of the issues with the fundamentals. We could do a bunch of superficial things to address what you're talking about, but we need the changes to last, and that means going really, really deep and paying attention to what we started 13 years ago and really questioning how the system works and how the framework works and what is needed for the world today, given how quickly everything is moving and how people are using it. So we are working as quickly as we can, but quickness will not get the job done. It's focus, it's prioritization, it's understanding the fundamentals of the network and building a framework that scales and that is resilient to change, and being open about where we are and being transparent about where we are so that we can continue to earn trust. So I'm proud of all the frameworks that we've put in place. I'm proud of our direction. We obviously can move faster, but that required just stopping a bunch of stupid stuff we were doing in the past.

25:21
CA: All right. Well, I suspect there are many people here who, if given the chance, would love to help you on this change-making agenda you're on, and I don't know if Whitney -- Jack, thank you for coming here and speaking so openly. It took courage. I really appreciate what you said, and good luck with your mission.

25:39
JD: Thank you so much. Thanks for having me.

25:41
(Applause)

25:45
Thank you.
