ABOUT THE SPEAKER
Lorrie Faith Cranor - Security researcher
At Carnegie Mellon University, Lorrie Faith Cranor studies online privacy, usable security, phishing, spam and other research around keeping us safe online.

Why you should listen

Lorrie Faith Cranor is an Associate Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University, where she is director of the CyLab Usable Privacy and Security Laboratory (CUPS) and co-director of the MSIT-Privacy Engineering master's program. She is also a co-founder of Wombat Security Technologies, Inc. She has authored over 100 research papers on online privacy, usable security, phishing, spam, electronic voting, anonymous publishing, and other topics.

Cranor plays a key role in building the usable privacy and security research community, having co-edited the seminal book Security and Usability and founded the Symposium On Usable Privacy and Security (SOUPS). She also chaired the Platform for Privacy Preferences Project (P3P) Specification Working Group at the W3C and authored the book Web Privacy with P3P. She has served on a number of boards, including the Electronic Frontier Foundation Board of Directors, and on the editorial boards of several journals. In 2003 she was named one of the top 100 innovators 35 or younger by Technology Review.

More profile about the speaker
Lorrie Faith Cranor | Speaker | TED.com
TEDxCMU

Lorrie Faith Cranor: What’s wrong with your pa$$w0rd?

1,566,161 views

Lorrie Faith Cranor studied thousands of real passwords to figure out the surprising, very common mistakes that users -- and secured sites -- make to compromise security. And how, you may ask, did she study thousands of real passwords without compromising the security of any users? That's a story in itself. It's secret data worth knowing, especially if your password is 123456 ...


I am a computer science and engineering professor here at Carnegie Mellon, and my research focuses on usable privacy and security, and so my friends like to give me examples of their frustrations with computing systems, especially frustrations related to unusable privacy and security.

So passwords are something that I hear a lot about. A lot of people are frustrated with passwords, and it's bad enough when you have to have one really good password that you can remember but nobody else is going to be able to guess. But what do you do when you have accounts on a hundred different systems and you're supposed to have a unique password for each of these systems? It's tough.
At Carnegie Mellon, they used to make it actually pretty easy for us to remember our passwords. The password requirement up through 2009 was just that you had to have a password with at least one character. Pretty easy. But then they changed things, and at the end of 2009, they announced that we were going to have a new policy, and this new policy required passwords that were at least eight characters long, with an uppercase letter, lowercase letter, a digit, a symbol, you couldn't use the same character more than three times, and it wasn't allowed to be in a dictionary.

Now, when they implemented this new policy, a lot of people, my colleagues and friends, came up to me and they said, "Wow, now that's really unusable. Why are they doing this to us, and why didn't you stop them?" And I said, "Well, you know what? They didn't ask me."
But I got curious, and I decided to go talk to the people in charge of our computer systems and find out what led them to introduce this new policy, and they said that the university had joined a consortium of universities, and one of the requirements of membership was that we had to have stronger passwords that complied with some new requirements, and these requirements were that our passwords had to have a lot of entropy.

Now, entropy is a complicated term, but basically it measures the strength of passwords. But the thing is, there isn't actually a standard measure of entropy. Now, the National Institute of Standards and Technology has a set of guidelines which have some rules of thumb for measuring entropy, but they don't have anything too specific, and the reason they only have rules of thumb is it turns out they don't actually have any good data on passwords.
In fact, their report states, "Unfortunately, we do not have much data on the passwords users choose under particular rules. NIST would like to obtain more data on the passwords users actually choose, but system administrators are understandably reluctant to reveal password data to others."
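The idealized version of that entropy calculation is easy to state: a password of length L drawn uniformly at random from a pool of N characters has L × log2(N) bits of entropy. The sketch below is an illustration of that bound only, not NIST's rule-of-thumb procedure, which discounts heavily because human-chosen passwords are far from uniform:

```python
import math

def ideal_entropy_bits(length: int, pool_size: int) -> float:
    """Bits of entropy for a password chosen uniformly at random.
    Human-chosen passwords have far less entropy than this bound."""
    return length * math.log2(pool_size)

# 8 characters drawn from the 94 printable ASCII characters:
print(round(ideal_entropy_bits(8, 94), 1))  # 52.4
# 8 lowercase-only characters:
print(round(ideal_entropy_bits(8, 26), 1))  # 37.6
```

The gap between those two numbers is why composition rules exist at all: forcing a bigger character pool raises the idealized bound, even though, as the talk goes on to show, real users defeat much of that gain.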
So this is a problem, but our research group looked at it as an opportunity. We said, "Well, there's a need for good password data. Maybe we can collect some good password data and actually advance the state of the art here."

So the first thing we did is, we got a bag of candy bars and we walked around campus and talked to students, faculty and staff, and asked them for information about their passwords. Now we didn't say, "Give us your password." No, we just asked them about their password. How long is it? Does it have a digit? Does it have a symbol? And were you annoyed at having to create a new one last week?
So we got results from 470 students, faculty and staff, and indeed we confirmed that the new policy was very annoying, but we also found that people said they felt more secure with these new passwords. We found that most people knew they were not supposed to write their password down, and only 13 percent of them did, but disturbingly, 80 percent of people said they were reusing their password. Now, this is actually more dangerous than writing your password down, because it makes you much more susceptible to attackers. So if you have to, write your passwords down, but don't reuse them.
We also found some interesting things about the symbols people use in passwords. So CMU allows 32 possible symbols, but as you can see, there's only a small number that most people are using, so we're not actually getting very much strength from the symbols in our passwords. So this was a really interesting study, and now we had data from 470 people, but in the scheme of things, that's really not very much password data, and so we looked around to see where we could find additional password data.
So it turns out there are a lot of people going around stealing passwords, and they often go and post these passwords on the Internet. So we were able to get access to some of these stolen password sets. This is still not really ideal for research, though, because it's not entirely clear where all of these passwords came from, or exactly what policies were in effect when people created these passwords. So we wanted to find some better source of data.
So we decided that one thing we could do is we could do a study and have people actually create passwords for our study. So we used a service called Amazon Mechanical Turk, and this is a service where you can post a small job online that takes a minute, a few minutes, an hour, and pay people a penny, ten cents, a few dollars, to do a task for you, and then you pay them through Amazon.com. So we paid people about 50 cents to create a password following our rules and answer a survey, and then we paid them again to come back two days later and log in using their password and answer another survey. So we did this, and we collected 5,000 passwords, and we gave people a bunch of different policies to create passwords with.
So some people had a pretty easy policy, we call it Basic8, and here the only rule was that your password had to have at least eight characters. Then some people had a much harder policy, and this was very similar to the CMU policy, in that it had to have eight characters including uppercase, lowercase, digit, symbol, and pass a dictionary check. And one of the other policies we tried, and there were a whole bunch more, but one of the ones we tried was called Basic16, and the only requirement here was that your password had to have at least 16 characters.
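The policies just described can be sketched as simple validators. This is a hypothetical reconstruction from the talk's description, not the study's actual code; in particular, the tiny dictionary here stands in for a real wordlist:

```python
import string

SYMBOLS = set(string.punctuation)

# A tiny stand-in dictionary; a real check would use a full wordlist.
TINY_DICTIONARY = {"password", "monkey", "iloveyou", "princess"}

def basic8(pw: str) -> bool:
    # Basic8: the only rule is at least eight characters.
    return len(pw) >= 8

def basic16(pw: str) -> bool:
    # Basic16: the only rule is at least sixteen characters.
    return len(pw) >= 16

def cmu_style(pw: str) -> bool:
    """CMU-like policy: >= 8 characters with an uppercase letter, a
    lowercase letter, a digit, a symbol, and not a dictionary word."""
    return (len(pw) >= 8
            and any(c.isupper() for c in pw)
            and any(c.islower() for c in pw)
            and any(c.isdigit() for c in pw)
            and any(c in SYMBOLS for c in pw)
            and pw.lower() not in TINY_DICTIONARY)

print(basic8("monkey12"))     # True
print(basic16("monkey12"))    # False
print(cmu_style("monkey12"))  # False: no uppercase, no symbol
print(cmu_style("M0nkey!2"))  # True
```

Note how "monkey12" sails through Basic8 while failing the CMU-style rules, which is exactly the usability-versus-strength trade-off the study set out to measure.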
All right, so now we had 5,000 passwords, and so we had much more detailed information. Again we see that there's only a small number of symbols that people are actually using in their passwords.

We also wanted to get an idea of how strong the passwords were that people were creating, but as you may recall, there isn't a good measure of password strength. So what we decided to do was to see how long it would take to crack these passwords using the best cracking tools that the bad guys are using, or that we could find information about in the research literature.
So to give you an idea of how bad guys go about cracking passwords, they will steal a password file that will have all of the passwords in kind of a scrambled form, called a hash, and so what they'll do is they'll make a guess as to what a password is, run it through a hashing function, and see whether it matches the passwords they have on their stolen password list.

So a dumb attacker will try every password in order. They'll start with AAAAA and move on to AAAAB, and this is going to take a really long time before they get any passwords that people are really likely to actually have. A smart attacker, on the other hand, does something much more clever. They look at the passwords that are known to be popular from these stolen password sets, and they guess those first. So they're going to start by guessing "password," and then they'll guess "I love you," and "monkey," and "12345678," because these are the passwords that are most likely for people to have. In fact, some of you probably have these passwords.
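The guess-hash-compare loop described above fits in a few lines. This toy uses unsalted SHA-256 and a made-up "stolen" hash set purely for illustration; real cracking tools handle salted and slow hashes and generate billions of candidate guesses:

```python
import hashlib

def sha256_hex(pw: str) -> str:
    """Hash a candidate password the way the stolen file was hashed."""
    return hashlib.sha256(pw.encode()).hexdigest()

# A pretend stolen file of hashes (built here for the demo, not a real leak).
stolen_hashes = {sha256_hex(p) for p in ["monkey", "12345678", "dragon"]}

# A smart attacker tries known-popular passwords first.
popular_guesses = ["password", "iloveyou", "monkey", "12345678"]

def crack(hashes, guesses):
    """Hash each guess and report the ones matching a stolen hash."""
    return [g for g in guesses if sha256_hex(g) in hashes]

print(crack(stolen_hashes, popular_guesses))  # ['monkey', '12345678']
```

The attacker never has to reverse the hash: guessing in popularity order recovers the common passwords almost immediately, which is why "time to crack" is a meaningful strength measure.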
So we ran all of these 5,000 passwords we collected through these tests to see how strong they were, and we found that the long passwords were actually pretty strong, and the complex passwords were pretty strong too. However, when we looked at the survey data, we saw that people were really frustrated by the very complex passwords, and the long passwords were a lot more usable, and in some cases, they were actually even stronger than the complex passwords. So this suggests that, instead of telling people that they need to put all these symbols and numbers and crazy things into their passwords, we might be better off just telling people to have long passwords.
Now here's the problem, though: some people had long passwords that actually weren't very strong. You can make long passwords that are still the sort of thing that an attacker could easily guess. So we need to do more than just say long passwords. There have to be some additional requirements, and some of our ongoing research is looking at what additional requirements we should add to make for stronger passwords that also are going to be easy for people to remember and type.
Another approach to getting people to have stronger passwords is to use a password meter. Here are some examples. You may have seen these on the Internet when you were creating passwords. We decided to do a study to find out whether these password meters actually work. Do they actually help people have stronger passwords, and if so, which ones are better? So we tested password meters that were different sizes, shapes, colors, different words next to them, and we even tested one that was a dancing bunny. As you type a better password, the bunny dances faster and faster. So this was pretty fun.
What we found was that password meters do work. (Laughter) Most of the password meters were actually effective, and the dancing bunny was very effective too, but the password meters that were the most effective were the ones that made you work harder before they gave you that thumbs up and said you were doing a good job, and in fact we found that most of the password meters on the Internet today are too soft. They tell you you're doing a good job too early, and if they would just wait a little bit before giving you that positive feedback, you probably would have better passwords.
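As a rough illustration of "soft" versus stringent feedback, here is a toy meter that scores length and character variety and only says "good" once the score clears a threshold. The scoring rule and thresholds are invented for this sketch and are not the meters from the study:

```python
import string

def meter_score(pw: str) -> int:
    """Toy heuristic: up to 20 points for length, plus 5 points for
    each character class used (lower, upper, digit, symbol)."""
    score = min(len(pw), 20)
    classes = [any(c.islower() for c in pw),
               any(c.isupper() for c in pw),
               any(c.isdigit() for c in pw),
               any(c in string.punctuation for c in pw)]
    return score + 5 * sum(classes)

def verdict(pw: str, stringent: bool = True) -> str:
    # A stringent meter withholds praise until the score is high;
    # a soft meter says "good" much earlier.
    threshold = 30 if stringent else 15
    return "good" if meter_score(pw) >= threshold else "keep going"

print(verdict("monkey12"))                   # keep going
print(verdict("monkey12", stringent=False))  # good
print(verdict("Correct!Horse!Battery!9"))    # good
```

The same password gets opposite feedback depending on the threshold, which is the study's point: where the meter places "good" changes how hard people work.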
Now another approach to better passwords, perhaps, is to use pass phrases instead of passwords. So this was an xkcd cartoon from a couple of years ago, and the cartoonist suggests that we should all use pass phrases, and if you look at the second row of this cartoon, you can see the cartoonist is suggesting that the pass phrase "correct horse battery staple" would be a very strong pass phrase and something really easy to remember. He says, in fact, you've already remembered it.

And so we decided to do a research study to find out whether this was true or not. In fact, everybody I talk to, when I mention I'm doing password research, points out this cartoon. "Oh, have you seen it? That xkcd. Correct horse battery staple." So we did the research study to see what would actually happen.
So in our study, we used Mechanical Turk again, and we had the computer pick the random words in the pass phrase. Now the reason we did this is that humans are not very good at picking random words. If we asked a human to do it, they would pick things that were not very random.

So we tried a few different conditions. In one condition, the computer picked from a dictionary of the very common words in the English language, and so you'd get pass phrases like "try there three come." And we looked at that, and we said, "Well, that doesn't really seem very memorable." So then we tried picking words that came from specific parts of speech, so how about noun-verb-adjective-noun. That comes up with something that's sort of sentence-like. So you can get a pass phrase like "plan builds sure power" or "end determines red drug." And these seemed a little bit more memorable, and maybe people would like those a little bit better.
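Both generation conditions can be sketched like this. The word lists below are tiny illustrative stand-ins seeded with examples from the talk; the study drew from real dictionaries, and a cryptographically random choice (`secrets.choice`) matters precisely because humans pick non-randomly:

```python
import secrets

# Tiny stand-in word lists; the study used much larger dictionaries.
NOUNS = ["plan", "power", "drug", "horse", "battery"]
VERBS = ["builds", "determines", "takes", "finds"]
ADJECTIVES = ["sure", "red", "quick", "small"]

def random_passphrase(n_words: int = 4) -> str:
    """Condition 1: words drawn uniformly from one common-word pool."""
    pool = NOUNS + VERBS + ADJECTIVES
    return " ".join(secrets.choice(pool) for _ in range(n_words))

def structured_passphrase() -> str:
    """Condition 2: noun-verb-adjective-noun, giving something
    sentence-like, e.g. in the style of 'plan builds sure power'."""
    lists = (NOUNS, VERBS, ADJECTIVES, NOUNS)
    return " ".join(secrets.choice(words) for words in lists)

print(random_passphrase())
print(structured_passphrase())
```

The structured condition trades a little entropy (each slot draws from a smaller list) for phrases that read more like sentences.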
We wanted to compare them with passwords, and so we had the computer pick random passwords, and these were nice and short, but as you can see, they don't really look very memorable. And then we decided to try something called a pronounceable password. So here the computer picks random syllables and puts them together so you have something sort of pronounceable, like "tufritvi" and "vadasabi." That one kind of rolls off your tongue. So these were random passwords that were generated by our computer.

So what we found in this study was that, surprisingly, pass phrases were not actually all that good. People were not really better at remembering the pass phrases than these random passwords, and because the pass phrases are longer, they took longer to type and people made more errors while typing them in. So it's not really a clear win for pass phrases. Sorry, all of you xkcd fans. On the other hand, we did find that pronounceable passwords worked surprisingly well, and so we actually are doing some more research to see if we can make that approach work even better.
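A minimal version of the syllable-gluing idea, assuming simple consonant-vowel syllables; the study's generator presumably allowed richer syllable shapes (the "tufritvi" example contains consonant clusters), so treat this as a sketch of the approach rather than its implementation:

```python
import secrets

CONSONANTS = "bdfgklmnprstv"
VOWELS = "aeiou"

def pronounceable_password(n_syllables: int = 4) -> str:
    """Glue together random consonant-vowel syllables so the result is
    random yet sort of pronounceable, in the spirit of 'vadasabi'."""
    return "".join(secrets.choice(CONSONANTS) + secrets.choice(VOWELS)
                   for _ in range(n_syllables))

print(pronounceable_password())
```

Alternating consonants and vowels shrinks the space compared with fully random characters of the same length, but the pronounceable structure is exactly what made these passwords easier to remember in the study.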
So one of the problems with some of the studies that we've done is that because they're all done using Mechanical Turk, these are not people's real passwords. They're the passwords that they created, or that the computer created for them, for our study. And we wanted to know whether people would actually behave the same way with their real passwords.

So we talked to the information security office at Carnegie Mellon and asked them if we could have everybody's real passwords. Not surprisingly, they were a little bit reluctant to share them with us, but we were actually able to work out a system with them where they put all of the real passwords for 25,000 CMU students, faculty and staff into a locked computer in a locked room, not connected to the Internet, and they ran code on it that we wrote to analyze these passwords. They audited our code. They ran the code. And so we never actually saw anybody's password.
We got some interesting results, and those of you Tepper students in the back will be very interested in this. So we found that the passwords created by people affiliated with the school of computer science were actually 1.8 times stronger than those created by people affiliated with the business school. We have lots of other really interesting demographic information as well.

The other interesting thing that we found is that when we compared the Carnegie Mellon passwords to the Mechanical Turk-generated passwords, there were actually a lot of similarities, and so this helped validate our research method and show that collecting passwords using these Mechanical Turk studies is actually a valid way to study passwords. So that was good news.
Okay, I want to close by talking about some insights I gained while on sabbatical last year in the Carnegie Mellon art school. One of the things that I did is I made a number of quilts, and I made this quilt here. It's called "Security Blanket." (Laughter) And this quilt has the 1,000 most frequent passwords stolen from the RockYou website. And the size of the passwords is proportional to how frequently they appeared in the stolen dataset. And what I did is I created this word cloud, and I went through all 1,000 words, and I categorized them into loose thematic categories. In some cases, it was kind of difficult to figure out what category they should be in, and then I color-coded them. So here are some examples of the difficulty.
So "justin." Is that the name of the user, their boyfriend, their son? Maybe they're a Justin Bieber fan. Or "princess." Is that a nickname? Are they Disney princess fans? Or maybe that's the name of their cat. "Iloveyou" appears many times in many different languages. There's a lot of love in these passwords. If you look carefully, you'll see there's also some profanity, but it was really interesting to me to see that there's a lot more love than hate in these passwords. And there are animals, a lot of animals, and "monkey" is the most common animal and the 14th most popular password overall.
And this was really curious to me, and I wondered, "Why are monkeys so popular?" And so in our last password study, any time we detected somebody creating a password with the word "monkey" in it, we asked them why they had a monkey in their password. And what we found out -- we found 17 people so far, I think, who have the word "monkey" -- we found out about a third of them said they have a pet named "monkey" or a friend whose nickname is "monkey," and about a third of them said that they just like monkeys and monkeys are really cute. And that guy is really cute.
So it seems that at the end of the day, when we make passwords, we either make something that's really easy to type, a common pattern, or things that remind us of the word password or the account that we've created the password for, or whatever. Or we think about things that make us happy, and we create our password based on things that make us happy. And while this makes typing and remembering your password more fun, it also makes it a lot easier to guess your password.

So I know a lot of these TED Talks are inspirational and they make you think about nice, happy things, but when you're creating your password, try to think about something else. Thank you. (Applause)
