ABOUT THE SPEAKER
Julie Cordua - Social entrepreneur
Julie Cordua is dedicated to building the tools, partnerships and communities that will eliminate child sexual abuse from the internet.

Why you should listen

Julie Cordua is driven by a desire to break down traditional silos and barriers and bring the resources and talent of the private sector to bear on some of our society's worst social problems. In 2006, she left the wireless technology industry to help launch (RED), bringing private sector resources to work on AIDS in Africa. In 2011, she joined co-founders Ashton Kutcher and Demi Moore to create Thorn, a technology-led non-profit that builds products and systems aimed at eliminating online child sexual abuse. Today, Thorn's tools are deployed in more than 50 countries, having helped to identify more than 14,000 child victims of abuse and reduce investigative time by more than 65 percent.

TED2019

Julie Cordua: How we can eliminate child sexual abuse material from the internet

1,173,650 views

Social entrepreneur Julie Cordua works on a problem that isn't easy to talk about: the sexual abuse of children in images and videos on the internet. At Thorn, she's building technology to connect the dots between the tech industry, law enforcement and government -- so we can swiftly end the viral distribution of abuse material and rescue children faster. Learn more about how this scalable solution could help dismantle the communities normalizing child sexual abuse around the world today. (This ambitious plan is part of the Audacious Project, TED's initiative to inspire and fund global change.)


00:12
[This talk contains mature content]

00:17
Five years ago, I received a phone call that would change my life. I remember so vividly that day. It was about this time of year, and I was sitting in my office. I remember the sun streaming through the window. And my phone rang. And I picked it up, and it was two federal agents, asking for my help in identifying a little girl featured in hundreds of child sexual abuse images they had found online.

00:53
They had just started working the case, but what they knew was that her abuse had been broadcast to the world for years on dark web sites dedicated to the sexual abuse of children. And her abuser was incredibly technologically sophisticated: new images and new videos every few weeks, but very few clues as to who she was or where she was.

01:25
And so they called us, because they had heard we were a new nonprofit building technology to fight child sexual abuse. But we were only two years old, and we had only worked on child sex trafficking. And I had to tell them we had nothing. We had nothing that could help them stop this abuse.

01:49
It took those agents another year to ultimately find that child. And by the time she was rescued, hundreds of images and videos documenting her rape had gone viral, from the dark web to peer-to-peer networks, private chat rooms and to the websites you and I use every single day. And today, as she struggles to recover, she lives with the fact that thousands around the world continue to watch her abuse.

02:29
I have come to learn in the last five years that this case is far from unique. How did we get here as a society?

02:41
In the late 1980s, child pornography -- or what it actually is, child sexual abuse material -- was nearly eliminated. New laws and increased prosecutions made it simply too risky to trade it through the mail. And then came the internet, and the market exploded.

03:05
The amount of content in circulation today is massive and growing. This is a truly global problem, but if we just look at the US: in the US alone last year, more than 45 million images and videos of child sexual abuse material were reported to the National Center for Missing and Exploited Children, and that is nearly double the amount the year prior.

03:34
And the details behind these numbers are hard to contemplate, with more than 60 percent of the images featuring children younger than 12, and most of them including extreme acts of sexual violence. Abusers are cheered on in chat rooms dedicated to the abuse of children, where they gain rank and notoriety with more abuse and more victims. In this market, the currency has become the content itself.

04:10
It's clear that abusers have been quick to leverage new technologies, but our response as a society has not. These abusers don't read user agreements of websites, and the content doesn't honor geographic boundaries. And they win when we look at one piece of the puzzle at a time, which is exactly how our response today is designed. Law enforcement works in one jurisdiction. Companies look at just their platform. And whatever data they learn along the way is rarely shared. It is so clear that this disconnected approach is not working.

04:55
We have to redesign our response to this epidemic for the digital age. And that's exactly what we're doing at Thorn. We're building the technology to connect these dots, to arm everyone on the front lines -- law enforcement, NGOs and companies -- with the tools they need to ultimately eliminate child sexual abuse material from the internet.

05:21
Let's talk for a minute --

05:22
(Applause)

05:24
Thank you.

05:25
(Applause)

05:29
Let's talk for a minute about what those dots are. As you can imagine, this content is horrific. If you don't have to look at it, you don't want to look at it. And so, most companies or law enforcement agencies that have this content can translate every file into a unique string of numbers. This is called a "hash." It's essentially a fingerprint for each file or each video. And what this allows them to do is use the information in investigations or for a company to remove the content from their platform, without having to relook at every image and every video each time.
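
To make the idea concrete, here is a minimal sketch in Python of how a file becomes a "unique string of numbers" and how a match against a list of known hashes lets a company or investigator act without viewing the content again. The hash value and names below are hypothetical; real deployments typically rely on perceptual hashes (such as Microsoft's PhotoDNA) rather than an exact cryptographic hash, so that resized or re-encoded copies still match, but the workflow is the same.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return the SHA-256 hex digest -- the file's 'unique string of numbers'."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Hash in chunks so large videos never need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical set of hashes already confirmed as abuse material.
KNOWN_HASHES = {
    "3c363836cf4e16666669a25da280a1865c2d2874f532be93b997a494dbbbb9ae",
}

def is_known_material(path: Path) -> bool:
    # A match means the file can be removed or used as an investigative lead
    # without anyone having to look at it again.
    return fingerprint(path) in KNOWN_HASHES
```

Matching on fingerprints rather than on the images themselves is what lets this approach scale across millions of files.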

06:10
The problem today, though, is that there are hundreds of millions of these hashes sitting in siloed databases all around the world. In a silo, it might work for the one agency that has control over it, but not connecting this data means we don't know how many are unique. We don't know which ones represent children who have already been rescued or still need to be identified. So our first, most basic premise is that all of this data must be connected.

06:42
There are two ways this data, combined with software on a global scale, can have transformative impact in this space. The first is with law enforcement: helping them identify new victims faster, stopping abuse and stopping those producing this content. The second is with companies: using it as clues to identify the hundreds of millions of files in circulation today, pulling them down and then stopping the upload of new material before it ever goes viral.

07:21
Four years ago, when that case ended, our team sat there, and we just felt this, um ... deep sense of failure, is the way I can put it, because we watched that whole year while they looked for her. And we saw every place in the investigation where, if the technology had existed, they would have found her faster. And so we walked away from that and went and did the only thing we knew how to do: we began to build software.

07:57
So we started with law enforcement. Our dream was an alarm bell on the desks of officers all around the world, so that if anyone dared to post a new victim online, someone would start looking for them immediately. I obviously can't talk about the details of that software, but today it's at work in 38 countries, having reduced the time it takes to get to a child by more than 65 percent.

08:24
(Applause)

08:33
And now we're embarking on that second horizon: building the software to help companies identify and remove this content.

08:43
Let's talk for a minute about these companies. So, I told you -- 45 million images and videos in the US alone last year. Those come from just 12 companies.

08:57
Twelve companies, 45 million files of child sexual abuse material. These come from the companies that have the money to build the infrastructure it takes to pull this content down. But there are hundreds of other companies, small- to medium-size companies around the world, that need to do this work, but they either: 1) can't imagine that their platform would be used for abuse, or 2) don't have the money to spend on something that is not driving revenue.

09:30
So we went ahead and built it for them, and this system now gets smarter the more companies participate.

09:39
Let me give you an example.

09:42
Our first partner, Imgur -- if you haven't heard of this company, it's one of the most visited websites in the US -- millions of pieces of user-generated content uploaded every single day, in a mission to make the internet a more fun place.

09:58
They partnered with us first. Within 20 minutes of going live on our system, someone tried to upload a known piece of abuse material. They were able to stop it, they pulled it down, and they reported it to the National Center for Missing and Exploited Children. But they went a step further and inspected the account of the person who had uploaded it.

10:19
Hundreds more pieces of child sexual abuse material that we had never seen.

10:26
And this is where we start to see exponential impact. We pull that material down, it gets reported to the National Center for Missing and Exploited Children, and then those hashes go back into the system and benefit every other company on it.
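
In code terms, that feedback loop can be pictured as a shared hash set that every participating platform checks before an upload goes live, and that grows each time any one of them confirms a new match. This is only an illustrative sketch with hypothetical names, not Thorn's actual system.

```python
# Fingerprints contributed by every participating company (illustrative only).
shared_hashes: set[str] = set()

def may_publish(upload_hash: str) -> bool:
    """Called by any platform before an upload goes live."""
    # A match blocks the file before it can go viral.
    return upload_hash not in shared_hashes

def register_confirmed_match(upload_hash: str) -> None:
    """A match confirmed and reported by one company is added to the shared set,
    so every other participant will now block the same file."""
    shared_hashes.add(upload_hash)
```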

10:40
And when the millions of hashes we have lead to millions more and, in real time, companies around the world are identifying and pulling this content down, we will have dramatically increased the speed at which we are removing child sexual abuse material from the internet around the world.

10:58
(Applause)

11:06
But this is why it can't just be about software and data; it has to be about scale. We have to activate thousands of officers and hundreds of companies around the world if technology is going to allow us to outrun the perpetrators and dismantle the communities that are normalizing child sexual abuse around the world today.

11:27
And the time to do this is now. We can no longer say we don't know the impact this is having on our children.

11:36
The first generation of children whose abuse has gone viral are now young adults. The Canadian Centre for Child Protection recently did a study of these young adults to understand the unique trauma they try to recover from, knowing that their abuse lives on.

11:57
Eighty percent of these young adults have thought about suicide. More than 60 percent have attempted suicide. And most of them live with the fear, every single day, that as they walk down the street, or interview for a job, or go to school, or meet someone online, that person has seen their abuse.

12:26
And that fear came true for more than 30 percent of them: they had been recognized from their abuse material online.

12:38
This is not going to be easy, but it is not impossible. Now, it's going to take will: the will of our society to look at something that is really hard to look at, to take something out of the darkness so these kids have a voice; the will of companies to take action and make sure that their platforms are not complicit in the abuse of a child; the will of governments to invest in their law enforcement and the tools they need to investigate a digital-first crime, even when the victims cannot speak for themselves.

13:21
This audacious commitment is part of that will. It's a declaration of war against one of humanity's darkest evils. But what I hang on to is that it's actually an investment in a future where every child can simply be a kid.

13:41
Thank you.

13:42
(Applause)
