ABOUT THE SPEAKER
Finn Lützow-Holm Myrstad - Information security researcher
Finn Lützow-Holm Myrstad leads the development of better, more ethical digital policies and standards at the Norwegian Consumer Council.

Why you should listen

Using targeted research and global coalition building, Finn Lützow-Holm Myrstad and his team convince governments and companies to improve their policies. Myrstad is the EU Chair of the Transatlantic Consumer Dialogue's INFOSOC Committee, which gathers 75 organizations from both sides of the Atlantic. Myrstad shapes opinion through addresses to audiences such as the G20, European Parliament, SXSW, Princeton University and TED.

Some of the team's projects so far include privacy and security challenges posed by internet-connected devices, user terms in apps, and tech companies' use of dark patterns. Myrstad informs his digital work with his business and LSE degrees as well as his previous experience in politics, energy and the private sector. In 2018 he was named among Politico's top influencers in the "GDPR power matrix," and in 2017 he and his team won the European Excellence Awards for best Public Affairs campaign for their work on connected toys.

TED Salon Samsung

Finn Lützow-Holm Myrstad: How tech companies deceive you into giving up your data and privacy

1,878,535 views

Have you ever actually read the terms and conditions for the apps you use? Finn Lützow-Holm Myrstad and his team at the Norwegian Consumer Council have, and it took them nearly a day and a half to read the terms of all the apps on an average phone. In a talk about the alarming ways tech companies deceive their users, Myrstad shares insights about the personal information you've agreed to let companies collect -- and how they use your data at a scale you could never imagine.


00:13
Do you remember when you were a child, you probably had a favorite toy that was a constant companion, like Christopher Robin had Winnie the Pooh, and your imagination fueled endless adventures? What could be more innocent than that?

00:28
Well, let me introduce you to my friend Cayla. Cayla was voted toy of the year in countries around the world. She connects to the internet and uses speech recognition technology to answer your child's questions, respond just like a friend. But the power doesn't lie with your child's imagination. It actually lies with the company harvesting masses of personal information while your family is innocently chatting away in the safety of their home, a dangerously false sense of security.

01:04
This case sounded alarm bells for me, as it is my job to protect consumers' rights in my country. And with billions of devices such as cars, energy meters and even vacuum cleaners expected to come online by 2020, we thought this was a case worth investigating further. Because what was Cayla doing with all the interesting things she was learning? Did she have another friend she was loyal to and shared her information with? Yes, you guessed right. She did.

01:36
In order to play with Cayla, you need to download an app to access all her features. Parents must consent to the terms being changed without notice. The recordings of the child, her friends and family, can be used for targeted advertising. And all this information can be shared with unnamed third parties. Enough? Not quite. Anyone with a smartphone can connect to Cayla within a certain distance.

02:09
When we confronted the company that made and programmed Cayla, they issued a series of statements that one had to be an IT expert in order to breach the security. Shall we fact-check that statement and live hack Cayla together?

02:29
Here she is. Cayla is equipped with a Bluetooth device which can transmit up to 60 feet, a bit less if there's a wall between. That means I, or any stranger, can connect to the doll while being outside the room where Cayla and her friends are. And to illustrate this, I'm going to turn Cayla on now. Let's see, one, two, three. There. She's on. And I asked a colleague to stand outside with his smartphone, and he's connected, and to make this a bit creepier ...

(Laughter)

03:09
let's see what kids could hear Cayla say in the safety of their room.

03:15
Man: Hi. My name is Cayla. What is yours?
Finn Myrstad: Uh, Finn.
Man: Is your mom close by?
FM: Uh, no, she's in the store.
Man: Ah. Do you want to come out and play with me?
FM: That's a great idea.
Man: Ah, great.

03:32
FM: I'm going to turn Cayla off now.

(Laughter)

03:39
We needed no password or to circumvent any other type of security to do this. We published a report in 20 countries around the world, exposing this significant security flaw and many other problematic issues.

03:56
So what happened? Cayla was banned in Germany, taken off the shelves by Amazon and Wal-Mart, and she's now peacefully resting at the German Spy Museum in Berlin.

(Laughter)

04:13
However, Cayla was also for sale in stores around the world for more than a year after we published our report. What we uncovered is that there are few rules to protect us and the ones we have are not being properly enforced.

04:30
We need to get the security and privacy of these devices right before they enter the market, because what is the point of locking a house with a key if anyone can enter it through a connected device?

04:45
You may well think, "This will not happen to me. I will just stay away from these flawed devices." But that won't keep you safe, because simply by connecting to the internet, you are put in an impossible take-it-or-leave-it position.

05:02
Let me show you. Like most of you, I have dozens of apps on my phone, and used properly, they can make our lives easier, more convenient and maybe even healthier. But have we been lulled into a false sense of security?

05:18
It starts simply by ticking a box. Yes, we say, I've read the terms.

05:27
But have you really read the terms? Are you sure they didn't look too long and your phone was running out of battery, and the last time you tried they were impossible to understand, and you needed to use the service now? And now, the power imbalance is established, because we have agreed to our personal information being gathered and used on a scale we could never imagine.

05:53
This is why my colleagues and I decided to take a deeper look at this. We set out to read the terms of popular apps on an average phone. And to show the world how unrealistic it is to expect consumers to actually read the terms, we printed them, more than 900 pages, and sat down in our office and read them out loud ourselves, streaming the experiment live on our websites. As you can see, it took quite a long time. It took us 31 hours, 49 minutes and 11 seconds to read the terms on an average phone. That is longer than a movie marathon of the "Harry Potter" movies and the "Godfather" movies combined.

(Laughter)

06:41
And reading is one thing. Understanding is another story. That would have taken us much, much longer. And this is a real problem, because companies have argued for 20 to 30 years against regulating the internet better, because users have consented to the terms and conditions.

07:02
As we've shown with this experiment, achieving informed consent is close to impossible. Do you think it's fair to put the burden of responsibility on the consumer? I don't. I think we should demand less take-it-or-leave-it and more understandable terms before we agree to them.

(Applause)

07:23
Thank you.

07:28
Now, I would like to tell you a story about love. Some of the world's most popular apps are dating apps, an industry now worth more than, or close to, three billion dollars a year. And of course, we're OK sharing our intimate details with our other half. But who else is snooping, saving and sharing our information while we are baring our souls? My team and I decided to investigate this.

08:00
And in order to understand the issue from all angles and to truly do a thorough job, I realized I had to download one of the world's most popular dating apps myself. So I went home to my wife ...

(Laughter)

08:18
who I had just married. "Is it OK if I establish a profile on a very popular dating app for purely scientific purposes?"

(Laughter)

08:28
This is what we found. Hidden behind the main menu was a preticked box that gave the dating company access to all my personal pictures on Facebook, in my case more than 2,000 of them, and some were quite personal. And to make matters worse, when we read the terms and conditions, we discovered the following, and I'm going to need to take out my reading glasses for this one. And I'm going to read it for you, because this is complicated. All right.

09:01
"By posting content" -- and content refers to your pictures, chat and other interactions in the dating service -- "as a part of the service, you automatically grant to the company, its affiliates, licensees and successors an irrevocable" -- which means you can't change your mind -- "perpetual" -- which means forever -- "nonexclusive, transferrable, sublicensable, fully paid-up, worldwide right and license to use, copy, store, perform, display, reproduce, record, play, adapt, modify and distribute the content, prepare derivative works of the content, or incorporate the content into other works and grant and authorize sublicenses of the foregoing in any media now known or hereafter created."

09:40
That basically means that all your dating history and everything related to it can be used for any purpose for all time. Just imagine your children seeing your sassy dating photos in a birth control ad 20 years from now.

10:00
But seriously, though --

(Laughter)

10:04
what might these commercial practices mean to you? For example, financial loss: based on your web browsing history, algorithms might decide whether you will get a mortgage or not. Subconscious manipulation: companies can analyze your emotions based on your photos and chats, targeting you with ads when you are at your most vulnerable. Discrimination: a fitness app can sell your data to a health insurance company, preventing you from getting coverage in the future. All of this is happening in the world today.

10:37
But of course, not all uses of data are malign. Some are just flawed or need more work, and some are truly great.

10:47
And there is some good news as well. The dating companies changed their policies globally after we filed a legal complaint. But organizations such as mine that fight for consumers' rights can't be everywhere. Nor can consumers fix this on their own, because if we know that something innocent we said will come back to haunt us, we will stop speaking. If we know that we are being watched and monitored, we will change our behavior. And if we can't control who has our data and how it is being used, we have lost the control of our lives.

11:26
The stories I have told you today are not random examples. They are everywhere, and they are a sign that things need to change. And how can we achieve that change? Well, companies need to realize that by prioritizing privacy and security, they can build trust and loyalty to their users. Governments must create a safer internet by ensuring enforcement and up-to-date rules. And us, the citizens? We can use our voice to remind the world that technology can only truly benefit society if it respects basic rights.

12:05
Thank you so much.

(Applause)
