ABOUT THE SPEAKERS
Jack Dorsey - Entrepreneur, programmer
Jack Dorsey is the CEO of Twitter, CEO & Chairman of Square, and a cofounder of both.

Chris Anderson - TED Curator
After a long career in journalism and publishing, Chris Anderson became the curator of the TED Conference in 2002 and has developed it as a platform for identifying and disseminating ideas worth spreading.

Why you should listen

Chris Anderson is the Curator of TED, a nonprofit devoted to sharing valuable ideas, primarily through the medium of 'TED Talks' -- short talks that are offered free online to a global audience.

Chris was born in a remote village in Pakistan in 1957. He spent his early years in India, Pakistan and Afghanistan, where his parents worked as medical missionaries, and he attended an American school in the Himalayas for his early education. After boarding school in Bath, England, he went on to Oxford University, graduating in 1978 with a degree in philosophy, politics and economics.

Chris then trained as a journalist, working in newspapers and radio, including two years producing a world news service in the Seychelles Islands.

Back in the UK in 1984, Chris was captivated by the personal computer revolution and became an editor at one of the UK's early computer magazines. A year later he founded Future Publishing with a $25,000 bank loan. The new company initially focused on specialist computer publications but eventually expanded into other areas such as cycling, music, video games, technology and design, doubling in size every year for seven years. In 1994, Chris moved to the United States where he built Imagine Media, publisher of Business 2.0 magazine and creator of the popular video game users website IGN. Chris eventually merged Imagine and Future, taking the combined entity public in London in 1999, under the Future name. At its peak, it published 150 magazines and websites and employed 2,000 people.

This success allowed Chris to create a private nonprofit organization, the Sapling Foundation, with the hope of finding new ways to tackle tough global issues through media, technology, entrepreneurship and, most of all, ideas. In 2001, the foundation acquired the TED Conference, then an annual meeting of luminaries in the fields of Technology, Entertainment and Design held in Monterey, California, and Chris left Future to work full time on TED.

He expanded the conference's remit to cover all topics, including science, business and key global issues, while adding a Fellows program, which now has some 300 alumni, and the TED Prize, which grants its recipients "one wish to change the world." The TED stage has become a place for thinkers and doers from all fields to share their ideas and their work, capturing imaginations, sparking conversation and encouraging discovery along the way.

In 2006, TED experimented with posting some of its talks on the Internet. Their viral success encouraged Chris to begin positioning the organization as a global media initiative devoted to 'ideas worth spreading,' part of a new era of information dissemination using the power of online video. In June 2015, the organization posted its 2,000th talk online. The talks are free to view, and they have been translated into more than 100 languages with the help of volunteers from around the world. Viewership has grown to approximately one billion views per year.

Continuing a strategy of 'radical openness,' in 2009 Chris introduced the TEDx initiative, allowing free licenses to local organizers who wished to organize their own TED-like events. More than 8,000 such events have been held, generating an archive of 60,000 TEDx talks. And three years later, the TED-Ed program was launched, offering free educational videos and tools to students and teachers.

Whitney Pennington Rodgers - TED Current Affairs Curator
Whitney Pennington Rodgers is an award-winning journalist and media professional.

Why you should listen

Prior to joining TED as current affairs curator, Whitney Pennington Rodgers produced for NBC's primetime news magazine Dateline NBC. She earned a duPont-Columbia award and a News & Documentary Emmy for her contributions to the Dateline NBC hour "The Cosby Accusers Speak" -- an extensive group interview with 27 of the women who accused entertainer Bill Cosby of sexual misconduct.

Pennington Rodgers has worked at NBC's in-house production company Peacock Productions, The Today Show, Nightly News, Rock Center with Brian Williams and New Jersey-centric public affairs shows Caucus: New Jersey and One-on-One with Steve Adubato. Prior to beginning her career in media, she had a short stint as a fourth-grade teacher through the Teach for America program.

Pennington Rodgers received her bachelor's in journalism and media studies from Rutgers University. She completed her master's in journalism at the University of California, Berkeley, where she produced a documentary about the recruitment of nonblack students at historically black colleges and universities.

TED2019

Jack Dorsey: How Twitter needs to change

2,089,470 views

Can Twitter be saved? In a wide-ranging conversation with TED's Chris Anderson and Whitney Pennington Rodgers, Twitter CEO Jack Dorsey discusses the future of the platform -- acknowledging problems with harassment and moderation and proposing some fundamental changes that he hopes will encourage healthy, respectful conversations. "Are we actually delivering something that people value every single day?" Dorsey asks.


00:13
Chris Anderson: What worries you right now? You've been very open about lots of issues on Twitter. What would be your top worry about where things are right now?

00:23
Jack Dorsey: Right now, the health of the conversation. So, our purpose is to serve the public conversation, and we have seen a number of attacks on it. We've seen abuse, we've seen harassment, we've seen manipulation, automation, human coordination, misinformation. So these are all dynamics that we were not expecting 13 years ago when we were starting the company. But we do now see them at scale, and what worries me most is just our ability to address it in a systemic way that is scalable, that has a rigorous understanding of how we're taking action, a transparent understanding of how we're taking action and a rigorous appeals process for when we're wrong, because we will be wrong.

01:20
Whitney Pennington Rodgers: I'm really glad to hear that that's something that concerns you, because I think there's been a lot written about people who feel they've been abused and harassed on Twitter, and I think no one more so than women and women of color and black women. And there's been data that's come out -- Amnesty International put out a report a few months ago where they showed that for a subset of active black female Twitter users, on average, one in 10 of their tweets was some form of harassment. And so when you think about health for the community on Twitter, I'm interested to hear how you think about "health for everyone," but specifically: How are you looking to make Twitter a safe space for that subset, for women, for women of color and black women?

02:05
JD: Yeah. So it's a pretty terrible situation when you come to a service that, ideally, you want to learn something about the world from, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment. So what we're looking most deeply at is just the incentives that the platform naturally provides and the service provides. Right now, the dynamic of the system makes it super-easy to harass and to abuse others through the service, and unfortunately, the majority of our system in the past worked entirely based on people reporting harassment and abuse. So about midway through last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem, and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely. And we've made some progress recently. About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them. But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it. But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it, which was a lot of work for them, a lot of work for us and just ultimately unfair.

The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve. We can't build a business that is successful unless we have a diversity of perspective inside of our walls, people who actually feel these issues every single day. And that's not just with the team that's doing the work, it's also within our leadership as well. So we need to continue to build empathy for what people are experiencing and give them better tools to act on it, and also give our customers a much better and easier approach to handle some of the things that they're seeing. So a lot of what we're doing is around technology, but we're also looking at the incentives on the service: What does Twitter incentivize you to do when you first open it up? And in the past, it's incented a lot of outrage, it's incented a lot of mob behavior, it's incented a lot of group harassment. And we have to look a lot deeper at some of the fundamentals of what the service is doing to make the bigger shifts. We can make a bunch of small shifts around technology, as I just described, but ultimately, we have to look deeply at the dynamics in the network itself, and that's what we're doing.

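What Dorsey describes here is a human-in-the-loop triage pipeline: a model proactively flags likely abuse so victims don't have to report it, and humans confirm before anything is taken down. A minimal sketch of that flow -- purely illustrative, with invented names, features and thresholds; only the overall shape comes from the conversation:

```python
# Purely illustrative -- not Twitter's actual system. A classifier flags
# likely-abusive tweets proactively, but no content or account is actioned
# until a human reviewer confirms. All names and thresholds are invented.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Tweet:
    tweet_id: int
    text: str

def abuse_score(tweet: Tweet) -> float:
    """Stand-in for a trained model; returns a probability-like score."""
    toy_signals = ["get out", "you people"]  # placeholder features, not real ones
    return 0.9 if any(s in tweet.text.lower() for s in toy_signals) else 0.1

def proactive_flag(tweets: List[Tweet], threshold: float = 0.8) -> List[Tweet]:
    """Flag likely abuse without waiting on a victim's report."""
    return [t for t in tweets if abuse_score(t) >= threshold]

def enforce(flagged: List[Tweet], human_confirms: Callable[[Tweet], bool]) -> List[Tweet]:
    """Act only on tweets a human reviewer has confirmed violate policy."""
    return [t for t in flagged if human_confirms(t)]

queue = proactive_flag([Tweet(1, "Get out of this thread"), Tweet(2, "Nice talk!")])
removed = enforce(queue, human_confirms=lambda t: True)  # reviewer agrees here
print([t.tweet_id for t in removed])
```
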
05:05
CA: But what's your sense -- what is the kind of thing that you might be able to change that would actually fundamentally shift behavior?

05:15
JD: Well, one of the things -- we started the service with this concept of following an account, as an example, and I don't believe that's why people actually come to Twitter. I believe Twitter is best as an interest-based network. People come with a particular interest. They have to do a ton of work to find and follow the related accounts around those interests. What we could do instead is allow you to follow an interest, follow a hashtag, follow a trend, follow a community, which gives us the opportunity to show all of the accounts, all the topics, all the moments, all the hashtags that are associated with that particular topic and interest, which really opens up the perspective that you see. But that is a huge fundamental shift to bias the entire network away from just an account bias towards a topics and interest bias.

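A toy sketch of the interest-first model Dorsey outlines: follow a topic, hashtag, trend or community, and the service fans out to everything associated with it. The index structure and all names here are invented for illustration; only the idea of following interests rather than accounts comes from the talk:

```python
# Invented illustration of interest-first following -- not Twitter's data model.
from collections import defaultdict

interest_index: dict = defaultdict(set)  # interest -> associated items

def associate(interest: str, item: str) -> None:
    """Record that an account, hashtag or moment belongs to an interest."""
    interest_index[interest].add(item)

def timeline_for(followed_interests: set) -> list:
    """Union everything tied to the followed interests -- no hunting for
    individual accounts, and a wider range of perspectives surfaces."""
    items: set = set()
    for interest in followed_interests:
        items |= interest_index[interest]
    return sorted(items)

associate("climate change", "@climate_reporter")
associate("climate change", "#EarthDay")
associate("climate change", "moment:ipcc-report")
print(timeline_for({"climate change"}))
```
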
06:15
CA: Because isn't it the case that one reason why you have so much content on there is a result of putting millions of people around the world in this kind of gladiatorial contest with each other for followers, for attention? Like, from the point of view of people who just read Twitter, that's not an issue, but for the people who actually create it, everyone's out there saying, "You know, I wish I had a few more 'likes,' followers, retweets." And so they're constantly experimenting, trying to find the path to do that. And what we've all discovered is that the number one path to do that is to be some form of provocative, obnoxious, eloquently obnoxious, like, eloquent insults are a dream on Twitter, where you rapidly pile up -- and it becomes this self-fueling process of driving outrage. How do you defuse that?

07:12
JD: Yeah, I mean, I think you're spot on, but that goes back to the incentives. Like, one of the choices we made in the early days was we had this number that showed how many people follow you. We decided that number should be big and bold, and anything that's on the page that's big and bold has importance, and those are the things that you want to drive. Was that the right decision at the time? Probably not. If I had to start the service again, I would not emphasize the follower count as much. I would not emphasize the "like" count as much. I don't think I would even create "like" in the first place, because it doesn't actually push what we believe now to be the most important thing, which is healthy contribution back to the network and conversation to the network, participation within conversation, learning something from the conversation. Those are not things that we thought of 13 years ago, and we believe are extremely important right now. So we have to look at how we display the follower count, how we display retweet count, how we display "likes," and just ask the deep question: Is this really the number that we want people to drive up? Is this the thing that, when you open Twitter, you see, "That's the thing I need to increase"? And I don't believe that's the case right now.

(Applause)

08:30
WPR: I think we should look at some of the tweets that are coming in from the audience as well.

08:35
CA: Let's see what you guys are asking. I mean, this is -- generally, one of the amazing things about Twitter is how you can use it for crowd wisdom, you know, that more knowledge, more questions, more points of view than you can imagine, and sometimes, many of them are really healthy.

08:53
WPR: I think one I saw that passed already quickly down here, "What's Twitter's plan to combat foreign meddling in the 2020 US election?" I think that's something that's an issue we're seeing on the internet in general, that we have a lot of malicious automated activity happening. And on Twitter, for example, in fact, we have some work that's come from our friends at Zignal Labs, and maybe we can even see that to give us an example of what exactly I'm talking about, where you have these bots, if you will, or coordinated automated malicious account activity, that is being used to influence things like elections. And in this example we have from Zignal, which they've shared with us using the data that they have from Twitter, you actually see that in this case, white represents the humans -- human accounts, each dot is an account. The pinker it is, the more automated the activity is. And you can see how you have a few humans interacting with bots. In this case, it's related to the election in Israel and spreading misinformation about Benny Gantz, and as we know, in the end, that was an election that Netanyahu won by a slim margin, and that may have been in some cases influenced by this. And when you think about that happening on Twitter, what are the things that you're doing, specifically, to ensure you don't have misinformation like this spreading in this way, influencing people in ways that could affect democracy?

10:23
JD: Just to back up a bit, we asked ourselves a question: Can we actually measure the health of a conversation, and what does that mean? And in the same way that you have indicators, and we have indicators as humans in terms of are we healthy or not, such as temperature, the flushness of your face, we believe that we could find the indicators of conversational health. And we worked with a lab called Cortico at MIT to propose four starter indicators that we believe we could ultimately measure on the system.

The first one is what we're calling shared attention. It's a measure of how much of the conversation is attentive on the same topic versus disparate. The second one is called shared reality, and this is what percentage of the conversation shares the same facts -- not whether those facts are truthful or not, but are we sharing the same facts as we converse? The third is receptivity: How much of the conversation is receptive or civil or, the inverse, toxic? And then the fourth is variety of perspective. So, are we seeing filter bubbles or echo chambers, or are we actually getting a variety of opinions within the conversation?

And implicit in all four of these is the understanding that, as they increase, the conversation gets healthier and healthier. So our first step is to see if we can measure these online, which we believe we can. We have the most momentum around receptivity. We have a toxicity score, a toxicity model, on our system that can actually measure, to a pretty high degree, whether you are likely to walk away from a conversation that you're having on Twitter because you feel it's toxic. We're working to measure the rest, and the next step is, as we build up solutions, to watch how these measurements trend over time and continue to experiment. And our goal is to make sure that these are balanced, because if you increase one, you might decrease another. If you increase variety of perspective, you might actually decrease shared reality.

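The four starter indicators lend themselves to a simple representation. A minimal sketch follows, assuming each indicator is normalized to a 0-1 score (an assumption; the talk doesn't specify scales). The indicator names and the caution that raising one can lower another come from Dorsey; the scoring, threshold and method names are invented:

```python
# A minimal sketch of the four conversational-health indicators named in the
# talk (work Dorsey attributes to the Cortico lab at MIT). Assumes 0-1 scores;
# only the indicator names and the "balance, don't maximize one" idea come
# from the conversation -- the rest is invented for illustration.
from dataclasses import dataclass

@dataclass
class ConversationHealth:
    shared_attention: float        # on the same topic vs. disparate
    shared_reality: float          # share of the conversation using the same facts
    receptivity: float             # receptive/civil vs. the inverse, toxic
    variety_of_perspective: float  # range of opinions vs. echo chamber

    def balanced(self, floor: float = 0.4) -> bool:
        """Increasing one indicator can decrease another (e.g. more variety
        of perspective can mean less shared reality), so check all four."""
        scores = (self.shared_attention, self.shared_reality,
                  self.receptivity, self.variety_of_perspective)
        return min(scores) >= floor

print(ConversationHealth(0.7, 0.5, 0.8, 0.6).balanced())  # True
```
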
12:51
CA: Just picking up on some of the questions flooding in here.

12:56
JD: Constant questioning.

12:58
CA: A lot of people are puzzled why, like, how hard is it to get rid of Nazis from Twitter?

13:08
JD: (Laughs) So we have policies around violent extremist groups, and the majority of our work and our terms of service works on conduct, not content. So we're actually looking for conduct. Conduct being using the service to repeatedly or episodically harass someone, using hateful imagery that might be associated with the KKK or the American Nazi Party. Those are all things that we act on immediately. We're in a situation right now where that term is used fairly loosely, and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform. So a lot of our models are based around, number one: Is this account associated with a violent extremist group? And if so, we can take action. And we have done so on the KKK and the American Nazi Party and others. And number two: Are they using imagery or conduct that would associate them as such as well?

14:22
CA: How many people do you have working on content moderation to look at this?

14:26
JD: It varies. We want to be flexible on this, because we want to make sure that we're, number one, building algorithms instead of just hiring massive amounts of people, because we need to make sure that this is scalable, and there's no amount of people that can actually scale this. So this is why we've done so much work around proactive detection of abuse that humans can then review. We want to have a situation where algorithms are constantly scouring every single tweet and bringing the most interesting ones to the top so that humans can bring their judgment to whether we should take action or not, based on our terms of service.

15:05
WPR: But there's not an amount of people that's scalable -- so how many people do you currently have monitoring these accounts, and how do you figure out what's enough?

15:14
JD: They're completely flexible. Sometimes we associate folks with spam. Sometimes we associate folks with abuse and harassment. We're going to make sure that we have flexibility in our people so that we can direct them at what is most needed. Sometimes, the elections. We've had a string of elections in Mexico, one coming up in India, obviously, the election last year, the midterm election, so we just want to be flexible with our resources.

So when people -- just as an example, if you go to our current terms of service and you bring the page up, and you're wondering about abuse and harassment that you just received and whether it was against our terms of service to report it, the first thing you see when you open that page is around intellectual property protection. You scroll down and you get to abuse, harassment and everything else that you might be experiencing. So I don't know how that happened over the company's history, but we put that above the thing that people want the most information on and actually want to act on. And just our ordering shows the world what we believed was important. So we're changing all that. We're ordering it the right way, but we're also simplifying the rules so that they're human-readable, so that people can actually understand themselves when something is against our terms and when something is not.

And then we're making -- again, our big focus is on removing the burden of work from the victims. So that means pushing more towards technology, rather than humans doing the work -- that means the humans receiving the abuse and also the humans having to review that work. So we want to make sure that we're not just encouraging more work around something that's super, super negative, and we want to have a good balance between the technology and where humans can actually be creative, which is the judgment of the rules, and not just all the mechanical stuff of finding and reporting them. So that's how we think about it.

17:27
CA: I'm curious to dig in more about what you said. I mean, I love that you said you are looking for ways to re-tweak the fundamental design of the system to discourage some of the reactive behavior, and perhaps -- to use Tristan Harris-type language -- engage people's more reflective thinking. How far advanced is that? What would alternatives to that "like" button be?

17:55
JD: Well, first and foremost, my personal goal with the service is rooted in my fundamental belief that public conversation is critical. There are existential problems facing the entire world, not any one particular nation-state, that global public conversation benefits. And that is one of the unique dynamics of Twitter: it is completely open, it is completely public, it is completely fluid, and anyone can see any other conversation and participate in it. So there are conversations like climate change. There are conversations like the displacement of work through artificial intelligence. There are conversations like economic disparity. No matter what any one nation-state does, they will not be able to solve the problem alone. It takes coordination around the world, and that's where I think Twitter can play a part.

The second thing is that Twitter, right now, when you go to it, you don't necessarily walk away feeling like you learned something. Some people do. Some people have a very, very rich network, a very rich community that they learn from every single day. But it takes a lot of work and a lot of time to build up to that. So we want to get people to those topics and those interests much, much faster and make sure that they're finding something that, no matter how much time they spend on Twitter -- and I don't want to maximize the time on Twitter, I want to maximize what they actually take away from it and what they learn from it, and --

19:29
CA: Well, do you, though? Because that's the core question that a lot of people want to know. Surely, Jack, you're constrained, to a huge extent, by the fact that you're a public company, you've got investors pressing on you, the number one way you make your money is from advertising -- that depends on user engagement. Are you willing to sacrifice user time, if need be, to go for a more reflective conversation?

19:56
JD: Yeah; more relevance means less time on the service, and that's perfectly fine, because we want to make sure that, like, you're coming to Twitter, and you see something immediately that you learn from and that you push. We can still serve an ad against that. That doesn't mean you need to spend any more time to see more. The second thing we're looking at --

20:17
CA: But just -- on that goal, daily active usage, if you're measuring that, that doesn't necessarily mean things that people value every day. It may well mean things that people are drawn to like a moth to the flame, every day. We are addicted, because we see something that pisses us off, so we go in and add fuel to the fire, and the daily active usage goes up, and there's more ad revenue there, but we all get angrier with each other. How do you define ... "Daily active usage" seems like a really dangerous term to be optimizing.

(Applause)

20:54
JD: Taken alone, it is, but you didn't let me finish the other metric, which is, we're watching for conversations and conversation chains. So we want to incentivize healthy contribution back to the network, and what we believe that is is actually participating in conversation that is healthy, as defined by those four indicators I articulated earlier. So you can't just optimize around one metric. You have to balance and look constantly at what is actually going to create a healthy contribution to the network and a healthy experience for people. Ultimately, we want to get to a metric where people can tell us, "Hey, I learned something from Twitter, and I'm walking away with something valuable." That is our goal ultimately over time, but that's going to take some time.

21:43
CA: You come over to many, I think to me, as this enigma. This is possibly unfair, but I woke up the other night with this picture of how I found I was thinking about you and the situation, that we're on this great voyage with you on this ship called the "Twittanic" --

(Laughter)

and there are people on board in steerage who are expressing discomfort, and you, unlike many other captains, are saying, "Well, tell me, talk to me, listen to me, I want to hear." And they talk to you, and they say, "We're worried about the iceberg ahead." And you go, "You know, that is a powerful point, and our ship, frankly, hasn't been built properly for steering as well as it might." And we say, "Please do something." And you go to the bridge, and we're waiting, and we look, and then you're showing this extraordinary calm, but we're all standing outside, saying, "Jack, turn the fucking wheel!" You know?

(Laughter)

(Applause)

I mean --

(Applause)

It's democracy at stake. It's our culture at stake. It's our world at stake. And Twitter is amazing and shapes so much. It's not as big as some of the other platforms, but the people of influence use it to set the agenda, and it's just hard to imagine a more important role in the world than to ... I mean, you're doing a brilliant job of listening, Jack, and hearing people, but to actually dial up the urgency and move on this stuff -- will you do that?

23:24
JD: Yes, and we have been moving substantially. I mean, there have been a few dynamics in Twitter's history. One, when I came back to the company, we were in a pretty dire state in terms of our future, and not just from how people were using the platform, but from a corporate narrative as well. So we had to fix a bunch of the foundation, turn the company around, go through two crazy layoffs, because we just got too big for what we were doing, and we focused all of our energy on this concept of serving the public conversation. And that took some work. And as we dived into that, we realized some of the issues with the fundamentals.

We could do a bunch of superficial things to address what you're talking about, but we need the changes to last, and that means going really, really deep and paying attention to what we started 13 years ago and really questioning how the system works and how the framework works and what is needed for the world today, given how quickly everything is moving and how people are using it. So we are working as quickly as we can, but quickness will not get the job done. It's focus, it's prioritization, it's understanding the fundamentals of the network and building a framework that scales and that is resilient to change, and being open about where we are and being transparent about where we are so that we can continue to earn trust.

So I'm proud of all the frameworks that we've put in place. I'm proud of our direction. We obviously can move faster, but that required just stopping a bunch of stupid stuff we were doing in the past.

25:21
CA: All right. Well, I suspect there are many people here who, if given the chance, would love to help you on this change-making agenda you're on, and I don't know if Whitney -- Jack, thank you for coming here and speaking so openly. It took courage. I really appreciate what you said, and good luck with your mission.

25:39
JD: Thank you so much. Thanks for having me.

(Applause)

Thank you.
