ABOUT THE SPEAKER
Jennifer Golbeck - Computer scientist
As the director of the Human-Computer Interaction Lab at the University of Maryland, Jennifer Golbeck studies how people use social media -- and thinks about ways to improve their interactions.

Why you should listen

Jennifer Golbeck is an associate professor in the College of Information Studies at the University of Maryland, where she also moonlights in the department of computer science. Her work invariably focuses on how to enhance and improve the way that people interact with their own information online. "I approach this from a computer science perspective and my general research hits social networks, trust, web science, artificial intelligence, and human-computer interaction," she writes.

Author of the 2013 book, Analyzing the Social Web, Golbeck likes nothing more than to immerse herself in the inner workings of the Internet tools so many millions of people use daily, to understand the implications of our choices and actions. Recently, she has also been working to bring human-computer interaction ideas to the world of security and privacy systems.

More profile about the speaker
Jennifer Golbeck | Speaker | TED.com
TEDxMidAtlantic 2013

Jennifer Golbeck: Your social media "likes" expose more than you think


Filmed:
2,366,837 views

Do you like curly fries? Have you "liked" them on Facebook? Watch this talk and you'll be surprised to learn how Facebook (and other sites) can predict information about you from your casual "likes" and "shares." Computer scientist Jennifer Golbeck explains how this is done, why some applications of the technology are not so cute -- and why she thinks we should return control of information to its rightful owners.


00:12
If you remember that first decade of the web, it was really a static place. You could go online, you could look at pages, and they were put up either by organizations who had teams to do it or by individuals who were really tech-savvy for the time. And with the rise of social media and social networks in the early 2000s, the web was completely changed to a place where now the vast majority of content we interact with is put up by average users, either in YouTube videos or blog posts or product reviews or social media postings. And it's also become a much more interactive place, where people are interacting with others, they're commenting, they're sharing, they're not just reading.
00:54
So Facebook is not the only place you can do this, but it's the biggest, and it serves to illustrate the numbers. Facebook has 1.2 billion users per month. So half the Earth's Internet population is using Facebook. They are a site, along with others, that has allowed people to create an online persona with very little technical skill, and people responded by putting huge amounts of personal data online. So the result is that we have behavioral, preference, demographic data for hundreds of millions of people, which is unprecedented in history.
01:26
And as a computer scientist, what this means is that I've been able to build models that can predict all sorts of hidden attributes for all of you that you don't even know you're sharing information about. As scientists, we use that to help the way people interact online, but there's less altruistic applications, and there's a problem in that users don't really understand these techniques and how they work, and even if they did, they don't have a lot of control over it. So what I want to talk to you about today is some of these things that we're able to do, and then give us some ideas of how we might go forward to move some control back into the hands of users.
02:02
So this is Target, the company. I didn't just put that logo on this poor, pregnant woman's belly. You may have seen this anecdote that was printed in Forbes magazine where Target sent a flyer to this 15-year-old girl with advertisements and coupons for baby bottles and diapers and cribs two weeks before she told her parents that she was pregnant. Yeah, the dad was really upset. He said, "How did Target figure out that this high school girl was pregnant before she told her parents?" It turns out that they have the purchase history for hundreds of thousands of customers and they compute what they call a pregnancy score, which is not just whether or not a woman's pregnant, but what her due date is. And they compute that not by looking at the obvious things, like, she's buying a crib or baby clothes, but things like, she bought more vitamins than she normally had, or she bought a handbag that's big enough to hold diapers. And by themselves, those purchases don't seem like they might reveal a lot, but it's a pattern of behavior that, when you take it in the context of thousands of other people, starts to actually reveal some insights.
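Target has never published its model, but a "pattern of behavior" score of this kind is commonly built as a weighted combination of weak signals pushed through a logistic function. A minimal sketch, with every feature name, weight, and bias invented purely for illustration:

```python
import math

# Hypothetical feature weights: each purchase signal is weak on its own,
# but their sum can become a confident prediction.
WEIGHTS = {
    "extra_vitamins": 0.9,
    "unscented_lotion": 1.1,
    "large_handbag": 0.6,
    "cotton_balls": 0.5,
}
BIAS = -2.5  # most shoppers are not pregnant, so the score starts low

def pregnancy_score(observed_signals):
    """Logistic score in [0, 1] from a set of observed purchase signals."""
    z = BIAS + sum(WEIGHTS.get(s, 0.0) for s in observed_signals)
    return 1.0 / (1.0 + math.exp(-z))

# One signal alone says little...
low = pregnancy_score({"large_handbag"})
# ...but several together shift the score dramatically.
high = pregnancy_score({"extra_vitamins", "unscented_lotion",
                        "large_handbag", "cotton_balls"})
```

No single purchase moves the score much; the combination does, which is exactly the point of the anecdote.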
03:06
So that's the kind of thing that we do when we're predicting stuff about you on social media. We're looking for little patterns of behavior that, when you detect them among millions of people, lets us find out all kinds of things. So in my lab and with colleagues, we've developed mechanisms where we can quite accurately predict things like your political preference, your personality score, gender, sexual orientation, religion, age, intelligence, along with things like how much you trust the people you know and how strong those relationships are. We can do all of this really well. And again, it doesn't come from what you might think of as obvious information.
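The lab's actual mechanisms aren't described in detail here; as a hedged toy illustration of how shared items can predict a trait such as political preference, one can count how often each item co-occurs with each label in training data and let a new user's items vote. All page names, labels, and data below are fabricated:

```python
from collections import defaultdict

# Fabricated training data: (set of followed pages, labeled trait).
TRAINING = [
    ({"page_a", "page_b"}, "party_1"),
    ({"page_a", "page_c"}, "party_1"),
    ({"page_d"}, "party_2"),
    ({"page_d", "page_b"}, "party_2"),
]

def item_trait_counts(training):
    """Count how often each item co-occurs with each trait label."""
    counts = defaultdict(lambda: defaultdict(int))
    for items, trait in training:
        for item in items:
            counts[item][trait] += 1
    return counts

def predict_trait(items, counts):
    """Vote: each observed item adds its co-occurrence counts per trait."""
    scores = defaultdict(int)
    for item in items:
        for trait, n in counts[item].items():
            scores[trait] += n
    return max(scores, key=scores.get) if scores else None

counts = item_trait_counts(TRAINING)
```

Real systems use far larger feature sets and proper statistical models, but the principle is the same: the prediction comes from co-occurrence patterns, not from the content of any single item.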
03:44
So my favorite example is from this study that was published this year in the Proceedings of the National Academies. If you Google this, you'll find it. It's four pages, easy to read. And they looked at just people's Facebook likes, so just the things you like on Facebook, and used that to predict all these attributes, along with some other ones. And in their paper they listed the five likes that were most indicative of high intelligence. And among those was liking a page for curly fries. (Laughter) Curly fries are delicious, but liking them does not necessarily mean that you're smarter than the average person. So how is it that one of the strongest indicators of your intelligence is liking this page when the content is totally irrelevant to the attribute that's being predicted?
04:28
And it turns out that we have to look at a whole bunch of underlying theories to see why we're able to do this. One of them is a sociological theory called homophily, which basically says people are friends with people like them. So if you're smart, you tend to be friends with smart people, and if you're young, you tend to be friends with young people, and this is well established for hundreds of years. We also know a lot about how information spreads through networks. It turns out things like viral videos or Facebook likes or other information spreads in exactly the same way that diseases spread through social networks. So this is something we've studied for a long time. We have good models of it. And so you can put those things together and start seeing why things like this happen. So if I were to give you a hypothesis, it would be that a smart guy started this page, or maybe one of the first people who liked it would have scored high on that test. And they liked it, and their friends saw it, and by homophily, we know that he probably had smart friends, and so it spread to them, and some of them liked it, and they had smart friends, and so it spread to them, and so it propagated through the network to a host of smart people, so that by the end, the action of liking the curly fries page is indicative of high intelligence, not because of the content, but because the actual action of liking reflects back the common attributes of other people who have done it.
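This hypothesis can be simulated. The sketch below builds a hypothetical friendship network with homophilous ties, seeds the "like" with a high scorer, spreads it a few hops, and compares the likers to the whole population; every number in it is invented for the illustration:

```python
import random

random.seed(7)

# Hypothetical intelligence-like scores for 200 people.
people = {i: random.gauss(100, 15) for i in range(200)}
friends = {i: set() for i in people}
ids = list(people)

# Homophily: sample candidate pairs, keep only ties between similar people.
for _ in range(3000):
    a, b = random.sample(ids, 2)
    if abs(people[a] - people[b]) < 6:
        friends[a].add(b)
        friends[b].add(a)

# The page starts with the highest scorer, then spreads along friendships:
# each round, friends of current likers like it with some probability.
seed_person = max(people, key=people.get)
likers, frontier = {seed_person}, {seed_person}
for _ in range(3):
    new = set()
    for p in frontier:
        for f in friends[p]:
            if f not in likers and random.random() < 0.5:
                new.add(f)
    likers |= new
    frontier = new

avg_everyone = sum(people.values()) / len(people)
avg_likers = sum(people[p] for p in likers) / len(likers)
# The likers end up above average -- not because of the page's content,
# but because the like propagated between similar people.
```

Because every tie joins similar people, each hop can only drift a bounded distance from the high-scoring seed, so the liking action itself becomes informative.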
05:48
So this is pretty complicated stuff, right? It's a hard thing to sit down and explain to an average user, and even if you do, what can the average user do about it? How do you know that you've liked something that indicates a trait for you that's totally irrelevant to the content of what you've liked? There's a lot of power that users don't have to control how this data is used. And I see that as a real problem going forward. So I think there's a couple paths that we want to look at if we want to give users some control over how this data is used, because it's not always going to be used for their benefit. An example I often give is that, if I ever get bored being a professor, I'm going to go start a company that predicts all of these attributes and things like how well you work in teams and if you're a drug user, if you're an alcoholic. We know how to predict all that. And I'm going to sell reports to H.R. companies and big businesses that want to hire you. We totally can do that now. I could start that business tomorrow, and you would have absolutely no control over me using your data like that. That seems to me to be a problem.
06:50
So one of the paths we can go down is the policy and law path. And in some respects, I think that that would be most effective, but the problem is we'd actually have to do it. Observing our political process in action makes me think it's highly unlikely that we're going to get a bunch of representatives to sit down, learn about this, and then enact sweeping changes to intellectual property law in the U.S. so users control their data. We could go the policy route, where social media companies say, you know what? You own your data. You have total control over how it's used. The problem is that the revenue models for most social media companies rely on sharing or exploiting users' data in some way. It's sometimes said of Facebook that the users aren't the customer, they're the product. And so how do you get a company to cede control of their main asset back to the users? It's possible, but I don't think it's something that we're going to see change quickly.
07:45
So I think the other path that we can go down that's going to be more effective is one of more science. It's doing science that allowed us to develop all these mechanisms for computing this personal data in the first place. And it's actually very similar research that we'd have to do if we want to develop mechanisms that can say to a user, "Here's the risk of that action you just took." By liking that Facebook page, or by sharing this piece of personal information, you've now improved my ability to predict whether or not you're using drugs or whether or not you get along well in the workplace. And that, I think, can affect whether or not people want to share something, keep it private, or just keep it offline altogether. We can also look at things like allowing people to encrypt data that they upload, so it's kind of invisible and worthless to sites like Facebook or third party services that access it, but that select users who the person who posted it want to see it have access to see it. This is all super exciting research from an intellectual perspective, and so scientists are going to be willing to do it. So that gives us an advantage over the law side.
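As a sketch of that selective-sharing idea (encrypt the post once, then grant per-reader access by wrapping the content key), the toy code below uses a SHA-256 keystream XOR as a stand-in cipher. This is not a real scheme: a production system would use an authenticated cipher and proper key exchange from a vetted cryptography library. All names are hypothetical:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with a SHA-256-derived keystream.
    Stand-in only; real systems use vetted authenticated ciphers."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def post_encrypted(plaintext: bytes, reader_keys: dict):
    """Encrypt a post once; wrap the content key for each selected reader.
    The platform stores only ciphertext and wrapped keys."""
    content_key = secrets.token_bytes(32)
    ciphertext = keystream_xor(content_key, plaintext)
    wrapped = {reader: keystream_xor(k, content_key)
               for reader, k in reader_keys.items()}
    return ciphertext, wrapped

def read_post(ciphertext, wrapped, reader, reader_key):
    """Unwrap the content key with the reader's own key, then decrypt."""
    content_key = keystream_xor(reader_key, wrapped[reader])
    return keystream_xor(content_key, ciphertext)
```

A reader the poster never selected has no entry in `wrapped`, so the site (and any third party reading its database) holds only bytes it cannot interpret.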
08:49
One of the problems that people bring up when I talk about this is, they say, you know, if people start keeping all this data private, all those methods that you've been developing to predict their traits are going to fail. And I say, absolutely, and for me, that's success, because as a scientist, my goal is not to infer information about users, it's to improve the way people interact online. And sometimes that involves inferring things about them, but if users don't want me to use that data, I think they should have the right to do that. I want users to be informed and consenting users of the tools that we develop. And so I think encouraging this kind of science and supporting researchers who want to cede some of that control back to users and away from the social media companies means that going forward, as these tools evolve and advance, means that we're going to have an educated and empowered user base, and I think all of us can agree that that's a pretty ideal way to go forward.

09:45
Thank you.

09:47
(Applause)
Translated by Lee Li
Reviewed by Xingyi Ouyang 歐陽杏儀
