ABOUT THE SPEAKERS
Kashmir Hill - Technology journalist
Kashmir Hill writes about privacy and technology.

Why you should listen

Kashmir Hill is a senior reporter for the Gizmodo Media Group. As she writes: "I started out in journalism blogging at what was essentially an online tabloid for lawyers. Then I got interested in privacy, and that forced me to write about Facebook and eventually about other technologies; along the way people started describing me as a technology journalist instead of a legal blogger.

"I've always wanted my writing to be approachable for a lay audience, so I usually use humor, a first-person approach or, ideally, both. So I've hacked a smart home, lived in a monitored one, created a fake business and bought it a fake reputation, worked as a crowdsourced girlfriend, lived on Bitcoin and spent a whole week WRITING IN CAPS LOCK. The best way to prepare people for future possible tech dystopias is for me to live in them and report back."

More profile about the speaker
Kashmir Hill | Speaker | TED.com
Surya Mattu - Artist, investigative journalist, engineer
Think of Surya Mattu as a data detective. As he writes: "I am interested in figuring out the ways in which algorithmic systems harm people."

Why you should listen

Surya Mattu is a data reporter on Gizmodo's Special Projects Desk and an R&D journalism resident at Eyebeam NYC. As he writes: "My practice combines art, investigative journalism, engineering and creative technology. The aim is to reverse-engineer the specific ways in which the tools or technology we create are imbued with the ethics of the culture in which they're created. Currently, I am a data reporter. Previously, I was a contributing researcher at ProPublica, where I worked on "Machine Bias," a series that aims to highlight how algorithmic systems can be biased and discriminate against people."

More profile about the speaker
Surya Mattu | Speaker | TED.com
TED2018

Kashmir Hill and Surya Mattu: What your smart devices know (and share) about you


2,030,169 views

Once your "smart" devices can talk to you, who else are they talking to? Kashmir Hill and Surya Mattu wanted to find out, so they outfitted Hill's apartment with 18 internet-connected devices and built a special router to track how often those devices contacted their servers and what information they sent back. The results were surprising, and more than a little creepy. Hear what your smart devices can tell the companies behind them about you, from your sleep schedule and TV binges to even your tooth-brushing habits, and how tech companies can use that information to profile you. (This talk contains mature content.)


00:12
Kashmir Hill: So for my birthday last year,
00:15
my husband got me an Amazon Echo.
00:17
I was kind of shocked, actually,
00:19
because we both work in privacy and security.
00:22
(Laughter)
00:24
And this was a device that would sit in the middle of our home
00:28
with a microphone on,
00:29
constantly listening.
00:31
We're not alone, though.
00:32
According to a survey by NPR and Edison Research,
00:35
one in six American adults now has a smart speaker,
00:39
which means that they have a virtual assistant at home.
00:42
Like, that's wild.
00:44
The future, or the future dystopia, is getting here fast.
00:48
Beyond that, companies are offering us all kinds of internet-connected devices.
00:53
There are smart lights, smart locks, smart toilets, smart toys,
00:58
smart sex toys.
01:00
Being smart means the device can connect to the internet,
01:03
it can gather data,
01:04
and it can talk to its owner.
01:06
But once your appliances can talk to you,
01:09
who else are they going to be talking to?
01:12
I wanted to find out,
01:13
so I went all-in and turned my one-bedroom apartment in San Francisco
01:17
into a smart home.
01:18
I even connected our bed to the internet.
01:22
As far as I know, it was just measuring our sleeping habits.
01:26
I can now tell you that the only thing worse
01:28
than getting a terrible night's sleep
01:30
is to have your smart bed tell you the next day
01:32
that you "missed your goal and got a low sleep score."
01:35
(Laughter)
01:37
It's like, "Thanks, smart bed.
01:38
As if I didn't already feel like shit today."
01:41
(Laughter)
01:42
All together, I installed 18 internet-connected devices in my home.
01:47
I also installed a Surya.
01:49
Surya Mattu: Hi, I'm Surya.
01:50
(Laughter)
01:51
I monitored everything the smart home did.
01:54
I built a special router that let me look at all the network activity.
01:58
You can think of my router sort of like a security guard,
02:01
compulsively logging all the network packets
02:03
as they entered and left the smart home.
02:06
KH: Surya and I are both journalists, he's not my husband,
02:08
we just work together at Gizmodo.
02:10
SM: Thank you for clarifying.
02:11
The devices Kashmir bought --
02:13
we were interested in understanding
02:15
what they were saying to their manufacturers.
02:17
But we were also interested in understanding
02:19
what the home's digital emissions look like
02:21
to the internet service provider.
02:23
We were seeing what the ISP could see, but more importantly,
02:26
what they could sell.
02:28
KH: We ran the experiment for two months.
02:30
In that two months,
02:31
there wasn't a single hour of digital silence in the house --
02:34
not even when we went away for a week.
02:36
SM: Yeah, it's so true.
02:37
Based on the data, I knew when you guys woke up and went to bed.
02:40
I even knew when Kashmir brushed her teeth.
02:42
I'm not going to out your brushing habits,
02:44
but let's just say it was very clear to me when you were working from home.
02:48
KH: Uh, I think you just outed them to, like, a lot of people here.
02:51
SM: Don't be embarrassed, it's just metadata.
02:54
I knew when you turned on your TV and how long you watched it for.
02:57
Fun fact about the Hill household:
02:59
they don't watch a lot of television,
03:01
but when they do, it's usually in binge mode.
03:03
Favorite shows include "Difficult People" and "Party Down."
03:06
KH: OK, you're right, I loved "Party Down."
03:08
It's a great show, and you should definitely watch it.
03:10
But "Difficult People" was all my husband, Trevor.
03:13
And Trevor was actually a little upset that you knew about his binges,
03:16
because even though he'd been the one to connect the TV to the router,
03:20
he forgot that the TV was watching us.
03:23
It's actually not the first time that our TV has spied on us.
03:26
The company that made it, VIZIO,
03:28
paid a 2.2 million-dollar settlement to the government just last year,
03:32
because it had been collecting second-by-second information
03:35
about what millions of people were watching on TV, including us,
03:39
and then it was selling that information to data brokers and advertisers.
03:43
SM: Ah, classic surveillance economy move.
03:46
The devices Kashmir bought almost all pinged their servers daily.
03:50
But do you know which device was especially chatty?
03:53
The Amazon Echo.
03:54
It contacted its servers every three minutes,
03:56
regardless of whether you were using it or not.
03:59
KH: In general, it was disconcerting
04:01
that all these devices were having ongoing conversations
04:04
that were invisible to me.
04:05
I mean, I would have had no idea, without your router.
04:08
If you buy a smart device, you should probably know --
04:12
you're going to own the device,
04:14
but in general, the company is going to own your data.
04:17
And you know, I mean, maybe that's to be expected --
04:20
you buy an internet-connected device, it's going to use the internet.
04:24
But it's strange to have these devices
04:26
moving into the intimate space that is the home
04:28
and allowing companies to track our really basic behavior there.
04:32
SM: So true.
04:33
Even the most banal-seeming data can be mined by the surveillance economy.
04:36
For example, who cares how often you brush your teeth?
04:39
Well, as it turns out, there's a dental insurance company called Beam.
04:43
They've been monitoring their customers' smart toothbrushes since 2015 --
04:46
for discounts on their premiums, of course.
04:49
KH: We know what some of you are thinking:
04:51
this is the contract of the modern world.
04:54
You give up a little privacy,
04:55
and you get some convenience or some price breaks in return.
04:59
But that wasn't my experience in my smart home.
05:01
It wasn't convenient, it was infuriating.
05:05
I'll admit, I love my smart vacuum,
05:08
but many other things in the house drove me insane:
05:10
we ran out of electrical outlets,
05:12
and I had to download over a dozen apps to my phone
05:16
to control everything.
05:17
And then every device had its own log-in,
05:19
my toothbrush had a password ...
05:22
(Laughter)
05:23
And smart coffee, especially, was just a world of hell.
05:28
SM: Wait, really? Cloud-powered coffee wasn't really working for you?
05:32
KH: I mean, maybe I'm naive, but I thought it was going to be great.
05:35
I thought we'd just wake up in the morning and we'd say, "Alexa, make us coffee."
05:39
But that's not how it went down.
05:41
We had to use this really particular, brand-specific phrase to make it work.
05:45
It was, "Alexa, ask the Behmor to run quick start."
05:51
And this was just, like, really hard to remember
05:54
first thing in the morning,
05:55
before you have had your caffeine.
05:57
(Laughter)
05:58
And apparently, it was hard to say,
06:00
because the Echo Dot that was right next to our bed
06:03
just couldn't understand us.
06:05
So we would basically start every day by screaming this phrase at the Echo Dot.
06:10
(Laughter)
06:11
And Trevor hated this.
06:13
He'd be like, "Please, Kashmir,
06:15
just let me go to the kitchen and push the button to make the coffee run."
06:19
And I'd be like, "No, you can't!
06:21
We have to do it the smart way!"
06:23
(Laughter)
06:25
I'm happy to report that our marriage survived the experiment,
06:28
but just barely.
06:30
SM: If you decide to make your home smart,
06:32
hopefully, you'll find it less infuriating than Kashmir did.
06:35
But regardless, the smart things you buy
06:37
can and probably are used to target and profile you.
06:41
Just the number of devices you have can be used to predict
06:44
how rich or poor you are.
06:45
Facebook's made this tech, and they've also patented it.
06:48
KH: All the anxiety you currently feel every time you go online,
06:52
about being tracked,
06:53
is about to move into your living room.
06:55
Or into your bedroom.
06:57
There's this sex toy called the We-Vibe.
07:00
You might wonder why a sex toy connects to the internet,
07:03
but it's for two people who are in a long-distance relationship,
07:06
so they can share their love from afar.
07:10
Some hackers took a close look at this toy
07:12
and saw it was sending a lot of information
07:14
back to the company that made it --
07:16
when it was used, how long it was used for,
07:19
what the vibration settings were, how hot the toy got.
07:23
It was all going into a database.
07:25
So I reached out to the company,
07:28
and I said, "Why are you collecting this really sensitive data?"
07:32
And they said, "Well, it's great for market research."
07:36
But they were data-mining their customers' orgasms.
07:39
And they weren't telling them about it.
07:41
I mean, even if you're cavalier about privacy,
07:43
I hope that you would admit that's a step too far.
07:46
SM: This is why I want to keep my sex toys dumb.
07:49
KH: That's great.
07:50
We're all very glad to know that.
07:52
(Laughter)
07:53
SM: A data point I'm willing to share.
07:55
(Laughter)
07:57
The devices Kashmir bought range from useful to annoying.
08:00
But the thing they all had in common
08:02
was sharing data with the companies that made them.
08:04
With email service providers and social media,
08:07
we've long been told that if it's free, you're the product.
08:10
But with the internet of things, it seems,
08:12
even if you pay, you're still the product.
08:14
So you really have to ask:
08:15
Who's the true beneficiary of your smart home,
08:17
you or the company mining you?
08:19
KH: Look, we're a tech savvy crowd here.
08:21
I think most of us know that these things connect to the internet
08:24
and send data out.
08:25
And fine, maybe you're OK with living in that commercial panopticon,
08:29
but others aren't.
08:31
We need the companies to rethink the design of these devices
08:34
with our privacy in mind,
08:35
because we're not all willing to participate in "market research,"
08:38
just because a device we bought has a Wi-Fi connection.
08:42
And I have to tell you,
08:43
even when you're aware, generally, this is happening,
08:45
it's really easy to forget that normal household items are spying on you.
08:50
It's easy to forget these things are watching you,
08:53
because they don't look like cameras.
08:55
They could look like ...
08:56
well, they could look like a dildo.
08:59
Thank you.
09:00
(Applause)
