ABOUT THE SPEAKERS
Jack Dorsey - Entrepreneur, programmer
Jack Dorsey is the CEO of Twitter, CEO & Chairman of Square, and a cofounder of both.

Chris Anderson - TED Curator
After a long career in journalism and publishing, Chris Anderson became the curator of the TED Conference in 2002 and has developed it as a platform for identifying and disseminating ideas worth spreading.

Why you should listen

Chris Anderson is the Curator of TED, a nonprofit devoted to sharing valuable ideas, primarily through the medium of 'TED Talks' -- short talks that are offered free online to a global audience.

Chris was born in a remote village in Pakistan in 1957. He spent his early years in India, Pakistan and Afghanistan, where his parents worked as medical missionaries, and he attended an American school in the Himalayas for his early education. After boarding school in Bath, England, he went on to Oxford University, graduating in 1978 with a degree in philosophy, politics and economics.

Chris then trained as a journalist, working in newspapers and radio, including two years producing a world news service in the Seychelles Islands.

Back in the UK in 1984, Chris was captivated by the personal computer revolution and became an editor at one of the UK's early computer magazines. A year later he founded Future Publishing with a $25,000 bank loan. The new company initially focused on specialist computer publications but eventually expanded into other areas such as cycling, music, video games, technology and design, doubling in size every year for seven years. In 1994, Chris moved to the United States where he built Imagine Media, publisher of Business 2.0 magazine and creator of the popular video game users website IGN. Chris eventually merged Imagine and Future, taking the combined entity public in London in 1999, under the Future name. At its peak, it published 150 magazines and websites and employed 2,000 people.

This success allowed Chris to create a private nonprofit organization, the Sapling Foundation, with the hope of finding new ways to tackle tough global issues through media, technology, entrepreneurship and, most of all, ideas. In 2001, the foundation acquired the TED Conference, then an annual meeting of luminaries in the fields of Technology, Entertainment and Design held in Monterey, California, and Chris left Future to work full time on TED.

He expanded the conference's remit to cover all topics, including science, business and key global issues, while adding a Fellows program, which now has some 300 alumni, and the TED Prize, which grants its recipients "one wish to change the world." The TED stage has become a place for thinkers and doers from all fields to share their ideas and their work, capturing imaginations, sparking conversation and encouraging discovery along the way.

In 2006, TED experimented with posting some of its talks on the Internet. Their viral success encouraged Chris to begin positioning the organization as a global media initiative devoted to 'ideas worth spreading,' part of a new era of information dissemination using the power of online video. In June 2015, the organization posted its 2,000th talk online. The talks are free to view, and they have been translated into more than 100 languages with the help of volunteers from around the world. Viewership has grown to approximately one billion views per year.

Continuing a strategy of 'radical openness,' in 2009 Chris introduced the TEDx initiative, allowing free licenses to local organizers who wished to organize their own TED-like events. More than 8,000 such events have been held, generating an archive of 60,000 TEDx talks. And three years later, the TED-Ed program was launched, offering free educational videos and tools to students and teachers.

Whitney Pennington Rodgers - TED Current Affairs Curator
Whitney Pennington Rodgers is an award-winning journalist and media professional.

Why you should listen

Prior to joining TED as current affairs curator, Whitney Pennington Rodgers produced for NBC's primetime news magazine Dateline NBC. She earned a duPont-Columbia award and a News & Documentary Emmy for her contributions to the Dateline NBC hour "The Cosby Accusers Speak" -- an extensive group interview with 27 of the women who accused entertainer Bill Cosby of sexual misconduct.

Pennington Rodgers has worked at NBC's in-house production company Peacock Productions, The Today Show, Nightly News, Rock Center with Brian Williams and New Jersey-centric public affairs shows Caucus: New Jersey and One-on-One with Steve Adubato. Prior to beginning her career in media, she had a short stint as a fourth-grade teacher through the Teach for America program.

Pennington Rodgers received her Bachelor's in journalism and media studies from Rutgers University. She completed her Master's of Journalism at the University of California at Berkeley, where she produced a documentary about recruitment of nonblack students at historically black colleges and universities.

TED2019

Jack Dorsey: How Twitter needs to change


2,089,470 views

Can Twitter be saved? In this wide-ranging conversation, TED's Chris Anderson and Whitney Pennington Rodgers sit down with Twitter CEO Jack Dorsey to discuss the future of the platform -- raising problems including harassment, content moderation and proposed changes to its fundamentals, in the hope of encouraging healthy, respectful conversation. Does the service we provide reflect what people actually value, Dorsey asks?


00:13
Chris Anderson: What worries you right now? You've been very open about lots of issues on Twitter. What would be your top worry about where things are right now?

00:23
Jack Dorsey: Right now, the health of the conversation. So, our purpose is to serve the public conversation, and we have seen a number of attacks on it. We've seen abuse, we've seen harassment, we've seen manipulation, automation, human coordination, misinformation. So these are all dynamics that we were not expecting 13 years ago when we were starting the company. But we do now see them at scale, and what worries me most is just our ability to address it in a systemic way that is scalable, that has a rigorous understanding of how we're taking action, a transparent understanding of how we're taking action and a rigorous appeals process for when we're wrong, because we will be wrong.

01:20
Whitney Pennington Rodgers: I'm really glad to hear that that's something that concerns you, because I think there's been a lot written about people who feel they've been abused and harassed on Twitter, and I think no one more so than women and women of color and black women. And there's been data that's come out -- Amnesty International put out a report a few months ago where they showed that a subset of active black female Twitter users were receiving, on average, one in 10 of their tweets were some form of harassment. And so when you think about health for the community on Twitter, I'm interested to hear, "health for everyone," but specifically: How are you looking to make Twitter a safe space for that subset, for women, for women of color and black women?

02:05
JD: Yeah. So it's a pretty terrible situation when you're coming to a service that, ideally, you want to learn something about the world, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment. So what we're looking most deeply at is just the incentives that the platform naturally provides and the service provides. Right now, the dynamic of the system makes it super-easy to harass and to abuse others through the service, and unfortunately, the majority of our system in the past worked entirely based on people reporting harassment and abuse. So about midway last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem, and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely. And we've made some progress recently. About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them. But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it. But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it, which was a lot of work for them, a lot of work for us and just ultimately unfair.
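To make the flag-then-review loop Dorsey describes concrete, here is a minimal Python sketch, assuming a scoring model is available; every name in it (Tweet, ReviewQueue, score_abuse) is hypothetical rather than Twitter's actual code. The key property is that the model only flags: nothing is taken down until a human reviewer confirms it.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Tweet:
    tweet_id: int
    text: str

@dataclass
class ReviewQueue:
    """Holds model-flagged tweets until a human reviews them."""
    items: List[Tweet] = field(default_factory=list)

    def enqueue(self, tweet: Tweet) -> None:
        self.items.append(tweet)

def triage(tweets: List[Tweet],
           score_abuse: Callable[[str], float],
           queue: ReviewQueue,
           threshold: float = 0.8) -> None:
    # The model proactively flags likely abuse so victims don't have
    # to report it themselves; takedowns happen only after a human
    # reviewer looks at what was flagged.
    for tweet in tweets:
        if score_abuse(tweet.text) >= threshold:
            queue.enqueue(tweet)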
03:46
The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve. We can't build a business that is successful unless we have a diversity of perspective inside of our walls that actually feel these issues every single day. And that's not just with the team that's doing the work, it's also within our leadership as well. So we need to continue to build empathy for what people are experiencing and give them better tools to act on it and also give our customers a much better and easier approach to handle some of the things that they're seeing. So a lot of what we're doing is around technology, but we're also looking at the incentives on the service: What does Twitter incentivize you to do when you first open it up? And in the past, it's incented a lot of outrage, it's incented a lot of mob behavior, it's incented a lot of group harassment. And we have to look a lot deeper at some of the fundamentals of what the service is doing to make the bigger shifts. We can make a bunch of small shifts around technology, as I just described, but ultimately, we have to look deeply at the dynamics in the network itself, and that's what we're doing.

05:05
CA: But what's your sense -- what is the kind of thing that you might be able to change that would actually fundamentally shift behavior?

05:15
JD: Well, one of the things -- we started the service with this concept of following an account, as an example, and I don't believe that's why people actually come to Twitter. I believe Twitter is best as an interest-based network. People come with a particular interest. They have to do a ton of work to find and follow the related accounts around those interests. What we could do instead is allow you to follow an interest, follow a hashtag, follow a trend, follow a community, which gives us the opportunity to show all of the accounts, all the topics, all the moments, all the hashtags that are associated with that particular topic and interest, which really opens up the perspective that you see. But that is a huge fundamental shift to bias the entire network away from just an account bias towards a topics and interest bias.
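As a rough illustration of that shift, the sketch below models following an interest instead of an account and assembling a timeline from everything associated with it. The interest graph and all names here are invented for illustration, not Twitter's data model.

from collections import defaultdict
from typing import Dict, List, Set

# Hypothetical mapping from an interest to the accounts, hashtags
# and communities the platform has associated with it.
interest_graph: Dict[str, Set[str]] = defaultdict(set)
interest_graph["astronomy"] |= {"@NASA", "#JWST", "@esa"}

def follow_interest(followed_sources: Set[str], interest: str) -> Set[str]:
    # Following one interest pulls in every associated source at once,
    # instead of making the user hunt down accounts one by one.
    return followed_sources | interest_graph[interest]

def build_timeline(sources: Set[str],
                   posts_by_source: Dict[str, List[str]]) -> List[str]:
    # Merge posts from every source tied to the followed interests.
    timeline: List[str] = []
    for source in sorted(sources):
        timeline.extend(posts_by_source.get(source, []))
    return timeline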
06:15
CA: Because isn't it the case that one reason why you have so much content on there is a result of putting millions of people around the world in this kind of gladiatorial contest with each other for followers, for attention? Like, from the point of view of people who just read Twitter, that's not an issue, but for the people who actually create it, everyone's out there saying, "You know, I wish I had a few more 'likes,' followers, retweets." And so they're constantly experimenting, trying to find the path to do that. And what we've all discovered is that the number one path to do that is to be some form of provocative, obnoxious, eloquently obnoxious, like, eloquent insults are a dream on Twitter, where you rapidly pile up -- and it becomes this self-fueling process of driving outrage. How do you defuse that?

07:12
JD: Yeah, I mean, I think you're spot on, but that goes back to the incentives. Like, one of the choices we made in the early days was we had this number that showed how many people follow you. We decided that number should be big and bold, and anything that's on the page that's big and bold has importance, and those are the things that you want to drive. Was that the right decision at the time? Probably not. If I had to start the service again, I would not emphasize the follower count as much. I would not emphasize the "like" count as much. I don't think I would even create "like" in the first place, because it doesn't actually push what we believe now to be the most important thing, which is healthy contribution back to the network and conversation to the network, participation within conversation, learning something from the conversation. Those are not things that we thought of 13 years ago, and we believe are extremely important right now. So we have to look at how we display the follower count, how we display retweet count, how we display "likes," and just ask the deep question: Is this really the number that we want people to drive up? Is this the thing that, when you open Twitter, you see, "That's the thing I need to increase?" And I don't believe that's the case right now.

08:28
(Applause)

08:30
WPR: I think we should look at some of the tweets that are coming in from the audience as well.

08:35
CA: Let's see what you guys are asking. I mean, this is -- generally, one of the amazing things about Twitter is how you can use it for crowd wisdom, you know, that more knowledge, more questions, more points of view than you can imagine, and sometimes, many of them are really healthy.

08:53
WPR: I think one I saw that passed already quickly down here, "What's Twitter's plan to combat foreign meddling in the 2020 US election?" I think that's something that's an issue we're seeing on the internet in general, that we have a lot of malicious automated activity happening. And on Twitter, for example, in fact, we have some work that's come from our friends at Zignal Labs, and maybe we can even see that to give us an example of what exactly I'm talking about, where you have these bots, if you will, or coordinated automated malicious account activity, that is being used to influence things like elections. And in this example we have from Zignal which they've shared with us using the data that they have from Twitter, you actually see that in this case, white represents the humans -- human accounts, each dot is an account. The pinker it is, the more automated the activity is. And you can see how you have a few humans interacting with bots. In this case, it's related to the election in Israel and spreading misinformation about Benny Gantz, and as we know, in the end, that was an election that Netanyahu won by a slim margin, and that may have been in some case influenced by this. And when you think about that happening on Twitter, what are the things that you're doing, specifically, to ensure you don't have misinformation like this spreading in this way, influencing people in ways that could affect democracy?

10:23
JD: Just to back up a bit, we asked ourselves a question: Can we actually measure the health of a conversation, and what does that mean? And in the same way that you have indicators and we have indicators as humans in terms of are we healthy or not, such as temperature, the flushness of your face, we believe that we could find the indicators of conversational health. And we worked with a lab called Cortico at MIT to propose four starter indicators that we believe we could ultimately measure on the system. And the first one is what we're calling shared attention. It's a measure of how much of the conversation is attentive on the same topic versus disparate. The second one is called shared reality, and this is what percentage of the conversation shares the same facts -- not whether those facts are truthful or not, but are we sharing the same facts as we converse? The third is receptivity: How much of the conversation is receptive or civil or the inverse, toxic? And then the fourth is variety of perspective. So, are we seeing filter bubbles or echo chambers, or are we actually getting a variety of opinions within the conversation? And implicit in all four of these is the understanding that, as they increase, the conversation gets healthier and healthier. So our first step is to see if we can measure these online, which we believe we can. We have the most momentum around receptivity. We have a toxicity score, a toxicity model, on our system that can actually measure whether you are likely to walk away from a conversation that you're having on Twitter because you feel it's toxic, with some pretty high degree. We're working to measure the rest, and the next step is, as we build up solutions, to watch how these measurements trend over time and continue to experiment. And our goal is to make sure that these are balanced, because if you increase one, you might decrease another. If you increase variety of perspective, you might actually decrease shared reality.
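One way to picture those four indicators is as simple per-conversation statistics. The sketch below is only an illustrative guess at how they might be computed -- the talk does not spell out the actual Cortico/MIT definitions -- but it shows the trade-off Dorsey mentions: pushing variety of perspective up can push shared reality down.

import math
from collections import Counter
from typing import List

def shared_attention(message_topics: List[str]) -> float:
    # Share of messages on the single most-discussed topic.
    return Counter(message_topics).most_common(1)[0][1] / len(message_topics)

def shared_reality(cited_sources: List[str]) -> float:
    # Share of messages citing the conversation's most-cited source,
    # a stand-in for "sharing the same facts" (true or not).
    return Counter(cited_sources).most_common(1)[0][1] / len(cited_sources)

def receptivity(toxicity_scores: List[float]) -> float:
    # Inverse of mean toxicity: 1.0 is fully civil, 0.0 fully toxic.
    return 1.0 - sum(toxicity_scores) / len(toxicity_scores)

def variety_of_perspective(stances: List[str]) -> float:
    # Normalized entropy of viewpoints: 0 = echo chamber,
    # 1 = maximal diversity of opinion within the conversation.
    counts = Counter(stances)
    total = len(stances)
    if len(counts) < 2:
        return 0.0
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h / math.log2(len(counts))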
12:51
CA: Just picking up on some of the questions flooding in here.

12:56
JD: Constant questioning.

12:58
CA: A lot of people are puzzled why, like, how hard is it to get rid of Nazis from Twitter?

13:08
JD: (Laughs) So we have policies around violent extremist groups, and the majority of our work and our terms of service works on conduct, not content. So we're actually looking for conduct. Conduct being using the service to repeatedly or episodically harass someone, using hateful imagery that might be associated with the KKK or the American Nazi Party. Those are all things that we act on immediately. We're in a situation right now where that term is used fairly loosely, and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform. So a lot of our models are based around, number one: Is this account associated with a violent extremist group? And if so, we can take action. And we have done so on the KKK and the American Nazi Party and others. And number two: Are they using imagery or conduct that would associate them as such as well?
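Those two tests can be read as a simple decision rule over conduct signals. The sketch below restates them; all field names are hypothetical, not Twitter's enforcement code.

from dataclasses import dataclass

@dataclass
class AccountSignals:
    linked_to_violent_extremist_group: bool   # check number one
    uses_associated_imagery_or_conduct: bool  # check number two

def should_enforce(signals: AccountSignals) -> bool:
    # Enforcement keys off conduct, not content: a loose mention of
    # a term is not enough; evidence of association or of associated
    # imagery/conduct is required before action is taken.
    return (signals.linked_to_violent_extremist_group
            or signals.uses_associated_imagery_or_conduct)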
14:22
CA: How many people do you have working on content moderation to look at this?

14:26
JD: It varies. We want to be flexible on this, because we want to make sure that we're, number one, building algorithms instead of just hiring massive amounts of people, because we need to make sure that this is scalable, and there are no amount of people that can actually scale this. So this is why we've done so much work around proactive detection of abuse that humans can then review. We want to have a situation where algorithms are constantly scouring every single tweet and bringing the most interesting ones to the top so that humans can bring their judgment to whether we should take action or not, based on our terms of service.

15:05
WPR: But there's not an amount of people that are scalable, but how many people do you currently have monitoring these accounts, and how do you figure out what's enough?

15:14
JD: They're completely flexible. Sometimes we associate folks with spam. Sometimes we associate folks with abuse and harassment. We're going to make sure that we have flexibility in our people so that we can direct them at what is most needed. Sometimes, the elections. We've had a string of elections in Mexico, one coming up in India, obviously, the election last year, the midterm election, so we just want to be flexible with our resources. So when people -- just as an example, if you go to our current terms of service and you bring the page up, and you're wondering about abuse and harassment that you just received and whether it was against our terms of service to report it, the first thing you see when you open that page is around intellectual property protection. You scroll down and you get to abuse, harassment and everything else that you might be experiencing. So I don't know how that happened over the company's history, but we put that above the thing that people want the most information on and to actually act on. And just our ordering shows the world what we believed was important. So we're changing all that. We're ordering it the right way, but we're also simplifying the rules so that they're human-readable so that people can actually understand themselves when something is against our terms and when something is not. And then we're making -- again, our big focus is on removing the burden of work from the victims. So that means push more towards technology, rather than humans doing the work -- that means the humans receiving the abuse and also the humans having to review that work. So we want to make sure that we're not just encouraging more work around something that's super, super negative, and we want to have a good balance between the technology and where humans can actually be creative, which is the judgment of the rules, and not just all the mechanical stuff of finding and reporting them. So that's how we think about it.

17:27
CA: I'm curious to dig in more about what you said. I mean, I love that you said you are looking for ways to re-tweak the fundamental design of the system to discourage some of the reactive behavior, and perhaps -- to use Tristan Harris-type language -- engage people's more reflective thinking. How far advanced is that? What would alternatives to that "like" button be?

17:55
JD: Well, first and foremost, my personal goal with the service is that I believe fundamentally that public conversation is critical. There are existential problems facing the world that are facing the entire world, not any one particular nation-state, that global public conversation benefits. And that is one of the unique dynamics of Twitter, that it is completely open, it is completely public, it is completely fluid, and anyone can see any other conversation and participate in it. So there are conversations like climate change. There are conversations like the displacement in the work through artificial intelligence. There are conversations like economic disparity. No matter what any one nation-state does, they will not be able to solve the problem alone. It takes coordination around the world, and that's where I think Twitter can play a part. The second thing is that Twitter, right now, when you go to it, you don't necessarily walk away feeling like you learned something. Some people do. Some people have a very, very rich network, a very rich community that they learn from every single day. But it takes a lot of work and a lot of time to build up to that. So we want to get people to those topics and those interests much, much faster and make sure that they're finding something that, no matter how much time they spend on Twitter -- and I don't want to maximize the time on Twitter, I want to maximize what they actually take away from it and what they learn from it, and --

19:29
CA: Well, do you, though? Because that's the core question that a lot of people want to know. Surely, Jack, you're constrained, to a huge extent, by the fact that you're a public company, you've got investors pressing on you, the number one way you make your money is from advertising -- that depends on user engagement. Are you willing to sacrifice user time, if need be, to go for a more reflective conversation?

19:56
JD: Yeah; more relevance means less time on the service, and that's perfectly fine, because we want to make sure that, like, you're coming to Twitter, and you see something immediately that you learn from and that you push. We can still serve an ad against that. That doesn't mean you need to spend any more time to see more. The second thing we're looking at --

20:17
CA: But just -- on that goal, daily active usage, if you're measuring that, that doesn't necessarily mean things that people value every day. It may well mean things that people are drawn to like a moth to the flame, every day. We are addicted, because we see something that pisses us off, so we go in and add fuel to the fire, and the daily active usage goes up, and there's more ad revenue there, but we all get angrier with each other. How do you define ... "Daily active usage" seems like a really dangerous term to be optimizing.

20:49
(Applause)

20:54
JD: Taken alone, it is, but you didn't let me finish the other metric, which is, we're watching for conversations and conversation chains. So we want to incentivize healthy contribution back to the network, and what we believe that is is actually participating in conversation that is healthy, as defined by those four indicators I articulated earlier. So you can't just optimize around one metric. You have to balance and look constantly at what is actually going to create a healthy contribution to the network and a healthy experience for people. Ultimately, we want to get to a metric where people can tell us, "Hey, I learned something from Twitter, and I'm walking away with something valuable." That is our goal ultimately over time, but that's going to take some time.
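A sketch of what "not optimizing around one metric" could look like in practice: a composite health score over the four indicators, with a guard so that a change which boosts the total cannot crater any single indicator. The weights and threshold are illustrative assumptions, not Twitter's.

from typing import Dict

WEIGHTS: Dict[str, float] = {
    "shared_attention": 0.25,
    "shared_reality": 0.25,
    "receptivity": 0.25,
    "variety_of_perspective": 0.25,
}

def health_score(indicators: Dict[str, float]) -> float:
    # Weighted average of the four indicators, each scaled to [0, 1].
    return sum(WEIGHTS[name] * indicators[name] for name in WEIGHTS)

def change_is_acceptable(before: Dict[str, float],
                         after: Dict[str, float],
                         max_single_drop: float = 0.05) -> bool:
    # Overall health must improve, and no single indicator may regress
    # by more than the allowed margin -- e.g. more variety of
    # perspective must not wipe out shared reality.
    if health_score(after) <= health_score(before):
        return False
    return all(after[k] >= before[k] - max_single_drop for k in WEIGHTS)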
21:43
CA: You come over to many, I think to me, as this enigma. This is possibly unfair, but I woke up the other night with this picture of how I found I was thinking about you and the situation, that we're on this great voyage with you on this ship called the "Twittanic" --

22:03
(Laughter)

22:04
and there are people on board in steerage who are expressing discomfort, and you, unlike many other captains, are saying, "Well, tell me, talk to me, listen to me, I want to hear." And they talk to you, and they say, "We're worried about the iceberg ahead." And you go, "You know, that is a powerful point, and our ship, frankly, hasn't been built properly for steering as well as it might." And we say, "Please do something." And you go to the bridge, and we're waiting, and we look, and then you're showing this extraordinary calm, but we're all standing outside, saying, "Jack, turn the fucking wheel!" You know?

22:42
(Laughter)

22:44
(Applause)

22:46
I mean --

22:47
(Applause)

22:49
It's democracy at stake. It's our culture at stake. It's our world at stake. And Twitter is amazing and shapes so much. It's not as big as some of the other platforms, but the people of influence use it to set the agenda, and it's just hard to imagine a more important role in the world than to ... I mean, you're doing a brilliant job of listening, Jack, and hearing people, but to actually dial up the urgency and move on this stuff -- will you do that?

23:24
JD: Yes, and we have been moving substantially. I mean, there's been a few dynamics in Twitter's history. One, when I came back to the company, we were in a pretty dire state in terms of our future, and not just from how people were using the platform, but from a corporate narrative as well. So we had to fix a bunch of the foundation, turn the company around, go through two crazy layoffs, because we just got too big for what we were doing, and we focused all of our energy on this concept of serving the public conversation. And that took some work. And as we dived into that, we realized some of the issues with the fundamentals. We could do a bunch of superficial things to address what you're talking about, but we need the changes to last, and that means going really, really deep and paying attention to what we started 13 years ago and really questioning how the system works and how the framework works and what is needed for the world today, given how quickly everything is moving and how people are using it. So we are working as quickly as we can, but quickness will not get the job done. It's focus, it's prioritization, it's understanding the fundamentals of the network and building a framework that scales and that is resilient to change, and being open about where we are and being transparent about where we are so that we can continue to earn trust. So I'm proud of all the frameworks that we've put in place. I'm proud of our direction. We obviously can move faster, but that required just stopping a bunch of stupid stuff we were doing in the past.

25:21
CA: All right. Well, I suspect there are many people here who, if given the chance, would love to help you on this change-making agenda you're on, and I don't know if Whitney -- Jack, thank you for coming here and speaking so openly. It took courage. I really appreciate what you said, and good luck with your mission.

25:39
JD: Thank you so much. Thanks for having me.

25:41
(Applause)

25:45
Thank you.
