ABOUT THE SPEAKERS
Kashmir Hill - Technology journalist
Kashmir Hill writes about privacy and technology.

Why you should listen

Kashmir Hill is a senior reporter for the Gizmodo Media Group. As she writes: "I started out in journalism blogging at what was essentially an online tabloid for lawyers. Then I got interested in privacy, and that forced me to write about Facebook and eventually about other technologies; along the way people started describing me as a technology journalist instead of a legal blogger.

"I've always wanted my writing to be approachable for a lay audience, so I usually use humor, a first-person approach or, ideally, both. So I've hacked a smart home, lived in a monitored one, created a fake business and bought it a fake reputation, worked as a crowdsourced girlfriend, lived on Bitcoin and spent a whole week WRITING IN CAPS LOCK. The best way to prepare people for future possible tech dystopias is for me to live in them and report back."

Kashmir Hill | Speaker | TED.com
Surya Mattu - Artist, investigative journalist, engineer
Think of Surya Mattu as a data detective. As he writes: "I am interested in figuring out the ways in which algorithmic systems harm people."

Why you should listen

Surya Mattu is a data reporter on Gizmodo's Special Projects Desk and an R&D journalism resident at Eyebeam NYC. As he writes: "My practice combines art, investigative journalism, engineering and creative technology. The aim is to reverse-engineer the specific ways in which the tools or technology we create are imbued with the ethics of the culture in which they're created. Currently, I am a data reporter. Previously, I was a contributing researcher at ProPublica, where I worked on 'Machine Bias,' a series that aims to highlight how algorithmic systems can be biased and discriminate against people."

Surya Mattu | Speaker | TED.com
TED2018

Kashmir Hill and Surya Mattu: What your smart devices know (and share) about you

1,504,690 views

Once your smart devices can talk to you, who else are they talking to? Kashmir Hill and Surya Mattu wanted to find out -- so they outfitted Hill's apartment with 18 different internet-connected devices and built a special router to track how often they contacted their servers and see what they were reporting back. The results were surprising -- and more than a little bit creepy. Learn more about what the data from your smart devices reveals about your sleep schedule, TV binges and even your tooth-brushing habits -- and how tech companies could use it to target and profile you. (This talk contains mature language.)


00:12
Kashmir Hill: So for my birthday last year, my husband got me an Amazon Echo. I was kind of shocked, actually, because we both work in privacy and security. (Laughter) And this was a device that would sit in the middle of our home with a microphone on, constantly listening.

00:31
We're not alone, though. According to a survey by NPR and Edison Research, one in six American adults now has a smart speaker, which means that they have a virtual assistant at home. Like, that's wild. The future, or the future dystopia, is getting here fast.

00:48
Beyond that, companies are offering us all kinds of internet-connected devices. There are smart lights, smart locks, smart toilets, smart toys, smart sex toys. Being smart means the device can connect to the internet, it can gather data, and it can talk to its owner. But once your appliances can talk to you, who else are they going to be talking to?

01:12
I wanted to find out, so I went all-in and turned my one-bedroom apartment in San Francisco into a smart home. I even connected our bed to the internet. As far as I know, it was just measuring our sleeping habits.

01:26
I can now tell you that the only thing worse than getting a terrible night's sleep is to have your smart bed tell you the next day that you "missed your goal and got a low sleep score." (Laughter) It's like, "Thanks, smart bed. As if I didn't already feel like shit today." (Laughter)

01:42
All together, I installed 18 internet-connected devices in my home. I also installed a Surya.

01:49
Surya Mattu: Hi, I'm Surya. (Laughter) I monitored everything the smart home did. I built a special router that let me look at all the network activity. You can think of my router sort of like a security guard, compulsively logging all the network packets as they entered and left the smart home.

02:06
KH: Surya and I are both journalists, he's not my husband, we just work together at Gizmodo.

02:10
SM: Thank you for clarifying. The devices Kashmir bought -- we were interested in understanding what they were saying to their manufacturers. But we were also interested in understanding what the home's digital emissions look like to the internet service provider. We were seeing what the ISP could see, but more importantly, what they could sell.

02:28
KH: We ran the experiment for two months. In that two months, there wasn't a single hour of digital silence in the house -- not even when we went away for a week.

02:36
SM: Yeah, it's so true. Based on the data, I knew when you guys woke up and went to bed. I even knew when Kashmir brushed her teeth. I'm not going to out your brushing habits, but let's just say it was very clear to me when you were working from home.

02:48
KH: Uh, I think you just outed them to, like, a lot of people here.

02:51
SM: Don't be embarrassed, it's just metadata. I knew when you turned on your TV and how long you watched it for. Fun fact about the Hill household: they don't watch a lot of television, but when they do, it's usually in binge mode. Favorite shows include "Difficult People" and "Party Down."

03:06
KH: OK, you're right, I loved "Party Down." It's a great show, and you should definitely watch it. But "Difficult People" was all my husband, Trevor. And Trevor was actually a little upset that you knew about his binges, because even though he'd been the one to connect the TV to the router, he forgot that the TV was watching us.

03:23
It's actually not the first time that our TV has spied on us. The company that made it, VIZIO, paid a 2.2 million-dollar settlement to the government just last year, because it had been collecting second-by-second information about what millions of people were watching on TV, including us, and then it was selling that information to data brokers and advertisers.

03:43
SM: Ah, classic surveillance economy move. The devices Kashmir bought almost all pinged their servers daily. But do you know which device was especially chatty? The Amazon Echo. It contacted its servers every three minutes, regardless of whether you were using it or not.

03:59
KH: In general, it was disconcerting that all these devices were having ongoing conversations that were invisible to me. I mean, I would have had no idea, without your router. If you buy a smart device, you should probably know -- you're going to own the device, but in general, the company is going to own your data. And you know, I mean, maybe that's to be expected -- you buy an internet-connected device, it's going to use the internet.

04:24
But it's strange to have these devices moving into the intimate space that is the home and allowing companies to track our really basic behavior there.

04:32
SM: So true. Even the most banal-seeming data can be mined by the surveillance economy. For example, who cares how often you brush your teeth? Well, as it turns out, there's a dental insurance company called Beam. They've been monitoring their customers' smart toothbrushes since 2015 -- for discounts on their premiums, of course.

04:49
KH: We know what some of you are thinking: this is the contract of the modern world. You give up a little privacy, and you get some convenience or some price breaks in return. But that wasn't my experience in my smart home. It wasn't convenient, it was infuriating.

05:05
I'll admit, I love my smart vacuum, but many other things in the house drove me insane: we ran out of electrical outlets, and I had to download over a dozen apps to my phone to control everything. And then every device had its own log-in, my toothbrush had a password ... (Laughter) And smart coffee, especially, was just a world of hell.

05:28
SM: Wait, really? Cloud-powered coffee wasn't really working for you?

05:32
KH: I mean, maybe I'm naive, but I thought it was going to be great. I thought we'd just wake up in the morning and we'd say, "Alexa, make us coffee." But that's not how it went down. We had to use this really particular, brand-specific phrase to make it work. It was, "Alexa, ask the Behmor to run quick start."

05:51
And this was just, like, really hard to remember first thing in the morning, before you have had your caffeine. (Laughter) And apparently, it was hard to say, because the Echo Dot that was right next to our bed just couldn't understand us. So we would basically start every day by screaming this phrase at the Echo Dot. (Laughter)

06:11
And Trevor hated this. He'd be like, "Please, Kashmir, just let me go to the kitchen and push the button to make the coffee run." And I'd be like, "No, you can't! We have to do it the smart way!" (Laughter) I'm happy to report that our marriage survived the experiment, but just barely.

06:30
SM: If you decide to make your home smart, hopefully, you'll find it less infuriating than Kashmir did. But regardless, the smart things you buy can and probably are used to target and profile you. Just the number of devices you have can be used to predict how rich or poor you are. Facebook's made this tech, and they've also patented it.

06:48
KH: All the anxiety you currently feel every time you go online, about being tracked, is about to move into your living room. Or into your bedroom.

06:57
There's this sex toy called the We-Vibe. You might wonder why a sex toy connects to the internet, but it's for two people who are in a long-distance relationship, so they can share their love from afar.

07:10
Some hackers took a close look at this toy and saw it was sending a lot of information back to the company that made it -- when it was used, how long it was used for, what the vibration settings were, how hot the toy got. It was all going into a database.

07:25
So I reached out to the company, and I said, "Why are you collecting this really sensitive data?" And they said, "Well, it's great for market research." But they were data-mining their customers' orgasms. And they weren't telling them about it. I mean, even if you're cavalier about privacy, I hope that you would admit that's a step too far.

07:46
SM: This is why I want to keep my sex toys dumb.

07:49
KH: That's great. We're all very glad to know that. (Laughter)

07:53
SM: A data point I'm willing to share. (Laughter) The devices Kashmir bought range from useful to annoying. But the thing they all had in common was sharing data with the companies that made them. With email service providers and social media, we've long been told that if it's free, you're the product. But with the internet of things, it seems, even if you pay, you're still the product. So you really have to ask: Who's the true beneficiary of your smart home, you or the company mining you?

08:19
KH: Look, we're a tech-savvy crowd here. I think most of us know that these things connect to the internet and send data out. And fine, maybe you're OK with living in that commercial panopticon, but others aren't. We need the companies to rethink the design of these devices with our privacy in mind, because we're not all willing to participate in "market research," just because a device we bought has a Wi-Fi connection.

08:42
And I have to tell you, even when you're aware, generally, this is happening, it's really easy to forget that normal household items are spying on you. It's easy to forget these things are watching you, because they don't look like cameras. They could look like ... well, they could look like a dildo. Thank you.

09:00
(Applause)
