ABOUT THE SPEAKER
Priyanka Jain - Technologist
Priyanka Jain heads up product for pymetrics, an NYC-based startup that uses neuroscience and AI to make hiring more diverse and effective.

Why you should listen

Passionate about using technology to create a fairer workplace and global economy, Priyanka Jain is a spokesperson for the United Nations Foundation's Girl Up Campaign, Chair of the Acumen Fund's Junior Council and on the Innovation Board for the XPrize Foundation. She received her B.S. from Stanford University, where she was President of Stanford Women in Business and one of 12 Mayfield Entrepreneurship Fellows. Her previous experience includes internships at IBM Watson, Shift Technologies, Canvas Ventures and the Institute for Learning and Brain Sciences. Outside of work, she loves playing tennis and eating anything covered in dark chocolate.

The Way We Work

Priyanka Jain: How to make applying for jobs less painful

548,985 views

Finding a job used to start with submitting your résumé to a million listings and never hearing back from most of them. But more and more companies are using tech-forward methods to identify candidates. If AI is the future of hiring, what does that mean for you? Technologist Priyanka Jain gives a look at this new hiring landscape.


00:00
Applying for jobs online is one of the worst digital experiences of our time. And applying for jobs in person really isn't much better.

00:07
[The Way We Work]
00:11
Hiring as we know it is broken on many fronts. It's a terrible experience for people. About 75 percent of people who applied to jobs using various methods in the past year said they never heard anything back from the employer. And at the company level, it's not much better: 46 percent of people get fired or quit within the first year of starting their jobs. It's pretty mind-blowing. It's also bad for the economy. For the first time in history, we have more open jobs than we have unemployed people, and to me that screams that we have a problem. I believe that at the crux of all of this is a single piece of paper: the résumé. A résumé definitely has some useful pieces in it: what roles people have had, computer skills, what languages they speak. But what it misses is what they have the potential to do that they might not have had the opportunity to do in the past. And with such a quickly changing economy, where jobs are coming online that might require skills that nobody has, if we only look at what someone has done in the past, we're not going to be able to match people to the jobs of the future.
01:07
So this is where I think technology can be really helpful. You've probably seen that algorithms have gotten pretty good at matching people to things, but what if we could use that same technology to actually help us find jobs that we're really well-suited for?
01:19
But I know what you're thinking. Algorithms picking your next job sounds a little bit scary, but there is one thing that has been shown to be really predictive of someone's future success in a job, and that's what's called a multimeasure test. Multimeasure tests really aren't anything new, but they used to be really expensive and required a PhD sitting across from you, asking lots of questions and writing reports. Multimeasure tests are a way to understand someone's inherent traits -- your memory, your attentiveness.
01:46
What if we could take multimeasure tests and make them scalable and accessible, and provide data to employers about the traits that really make someone a good fit for a job? This all sounds abstract. Let's try one of the games together. You're about to see a flashing circle, and your job is going to be to clap when the circle is red and do nothing when it's green.
02:07
[Ready?] [Begin!]

02:11
[Green circle] [Green circle] [Red circle] [Green circle] [Red circle]
02:21
Maybe you're the type of person who claps the millisecond after a red circle appears. Or maybe you're the type of person who takes just a little bit longer to be 100 percent sure. Or maybe you clap on green even though you're not supposed to. The cool thing here is that this isn't like a standardized test where some people are employable and some people aren't. Instead, it's about understanding the fit between your characteristics and what would make you good at a certain job. We found that if you clap late on red and you never clap on green, you might be high in attentiveness and high in restraint. People in that quadrant tend to be great students, great test-takers, great at project management or accounting. But if you clap immediately on red and sometimes clap on green, that might mean that you're more impulsive and creative, and we've found that top-performing salespeople often embody these traits.
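A minimal sketch of how responses to a game like this could be turned into trait scores, assuming hypothetical trial data and simple proxies (reaction time on red for attentiveness, held-back claps on green for restraint). The data model, names and numbers below are illustrative assumptions, not pymetrics' actual scoring:

# Toy scoring for the clap game above (a go/no-go task).
# Hypothetical data model and proxy scores -- not the real pymetrics method.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Trial:
    color: str          # "red" (clap) or "green" (do nothing)
    clapped: bool
    reaction_ms: float  # time from circle onset to clap, 0 if no clap

def score_traits(trials):
    red = [t for t in trials if t.color == "red"]
    green = [t for t in trials if t.color == "green"]
    hits = [t for t in red if t.clapped]
    # Attentiveness proxy: did you clap on red, and how deliberately?
    hit_rate = len(hits) / len(red) if red else 0.0
    avg_rt = mean(t.reaction_ms for t in hits) if hits else None
    # Restraint proxy: how often did you hold back on green?
    restraint = 1 - sum(t.clapped for t in green) / len(green) if green else 0.0
    return {"hit_rate": hit_rate, "avg_reaction_ms": avg_rt, "restraint": restraint}

# Someone who claps late on red and never on green scores high on restraint
# with a slower average reaction time; green claps pull the restraint score down.
sample = [Trial("green", False, 0), Trial("green", False, 0),
          Trial("red", True, 620), Trial("green", False, 0), Trial("red", True, 580)]
print(score_traits(sample))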
03:07
The way we actually use this in hiring is we have top performers in a role go through neuroscience exercises like this one. Then we develop an algorithm that understands what makes those top performers unique. And then, when people apply to the job, we're able to surface the candidates who might be best suited for that job.
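To make the step from game data to candidate matching concrete, here is one deliberately simple approach, sketched under assumptions of my own: average the trait scores of a role's current top performers into a profile, then rank applicants by how close their own scores come to it. The nearest-profile idea and the function names are illustrative, not the actual pymetrics algorithm:

# Illustrative only: rank applicants by how close their trait scores are
# to the average profile of a role's top performers.
import math

def centroid(profiles):
    keys = profiles[0].keys()
    return {k: sum(p[k] for p in profiles) / len(profiles) for k in keys}

def distance(a, b):
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def rank_applicants(top_performer_profiles, applicant_profiles):
    target = centroid(top_performer_profiles)
    # Smaller distance = trait profile closer to the role's top performers.
    return sorted(applicant_profiles,
                  key=lambda name: distance(applicant_profiles[name], target))

top = [{"attentiveness": 0.9, "restraint": 0.8},
       {"attentiveness": 0.8, "restraint": 0.9}]
pool = {"applicant_a": {"attentiveness": 0.85, "restraint": 0.9},
        "applicant_b": {"attentiveness": 0.30, "restraint": 0.4}}
print(rank_applicants(top, pool))  # ['applicant_a', 'applicant_b']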
03:24
So you might be thinking there's a danger in this. The work world today is not the most diverse, and if we're building algorithms based on current top performers, how do we make sure that we're not just perpetuating the biases that already exist? For example, if we were building an algorithm based on top-performing CEOs and used the S&P 500 as a training set, you would actually find that you're more likely to hire a white man named John than any woman. And that's the reality of who's in those roles right now.
03:51
But technology actually poses a really interesting opportunity. We can create algorithms that are more equitable and more fair than human beings have ever been. Every algorithm that we put into production has been pretested to ensure that it doesn't favor any gender or ethnicity. And if there's any population that's being overfavored, we can actually alter the algorithm until that's no longer true.
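The kind of pretest described here can be illustrated with a standard adverse-impact check: compare the rate at which the algorithm recommends each group, and flag the model if any group falls too far below the most-favored one. The 0.8 threshold below is the "four-fifths rule" commonly used in US hiring audits; the numbers and group names are made up, and this is a sketch of the general idea, not pymetrics' actual audit procedure:

# Illustrative adverse-impact check with made-up numbers.
def selection_rates(recommended, applied):
    # Fraction of each group's applicants that the algorithm recommends.
    return {group: recommended[group] / applied[group] for group in applied}

def passes_four_fifths(recommended, applied, threshold=0.8):
    rates = selection_rates(recommended, applied)
    best = max(rates.values())
    # Every group's rate must be at least 80% of the highest group's rate;
    # otherwise the model would be altered and retested before production.
    return all(rate >= threshold * best for rate in rates.values())

applied = {"group_a": 200, "group_b": 180}
recommended = {"group_a": 60, "group_b": 30}
print(selection_rates(recommended, applied))     # {'group_a': 0.3, 'group_b': 0.166...}
print(passes_four_fifths(recommended, applied))  # False -> adjust and retest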
04:12
When we focus on the inherent characteristics that can make somebody a good fit for a job, we can transcend racism, classism, sexism, ageism -- even good schoolism. Our best technology and algorithms shouldn't just be used for helping us find our next movie binge or new favorite Justin Bieber song. Imagine if we could harness the power of technology to get real guidance on what we should be doing based on who we are at a deeper level.
Translated by Leslie Gauthier
Reviewed by Joanna Pietrulewicz
