ABOUT THE SPEAKER
Priyanka Jain - Technologist
Priyanka Jain heads up product for pymetrics, an NYC-based startup that uses neuroscience and AI to make hiring more diverse and effective.

Why you should listen

Passionate about using technology to create a fairer workplace and global economy, Priyanka Jain is a spokesperson for the United Nations Foundation's Girl Up Campaign, Chair of the Acumen Fund's Junior Council and on the Innovation Board for the XPrize Foundation. She received her B.S. from Stanford University, where she was President of Stanford Women in Business and one of 12 Mayfield Entrepreneurship Fellows. Her previous experience includes internships at IBM Watson, Shift Technologies, Canvas Ventures and the Institute for Learning and Brain Sciences. Outside of work, she loves playing tennis and eating anything covered in dark chocolate.

The Way We Work

Priyanka Jain: How to make applying for jobs less painful


548,985 views

Finding a job used to start with submitting your résumé to a million job postings and never hearing back from most of them. But more and more companies are using advanced technology to identify candidates. If AI is the future of hiring, what does that mean for you? Technologist Priyanka Jain takes a look at this new hiring landscape.


00:00 Applying for jobs online is one of the worst digital experiences of our time. And applying for jobs in person really isn't much better.
00:07 [The Way We Work]
00:11 Hiring as we know it is broken on many fronts. It's a terrible experience for people. About 75 percent of people who applied to jobs using various methods in the past year said they never heard anything back from the employer. And at the company level it's not much better. 46 percent of people get fired or quit within the first year of starting their jobs. It's pretty mind-blowing. It's also bad for the economy. For the first time in history, we have more open jobs than we have unemployed people, and to me that screams that we have a problem.
00:39 I believe that at the crux of all of this is a single piece of paper: the résumé. A résumé definitely has some useful pieces in it: what roles people have had, computer skills, what languages they speak, but what it misses is what they have the potential to do that they might not have had the opportunity to do in the past. And with such a quickly changing economy where jobs are coming online that might require skills that nobody has, if we only look at what someone has done in the past, we're not going to be able to match people to the jobs of the future.
01:07 So this is where I think technology can be really helpful. You've probably seen that algorithms have gotten pretty good at matching people to things, but what if we could use that same technology to actually help us find jobs that we're really well-suited for?
01:19 But I know what you're thinking. Algorithms picking your next job sounds a little bit scary, but there is one thing that has been shown to be really predictive of someone's future success in a job, and that's what's called a multimeasure test. Multimeasure tests really aren't anything new, but they used to be really expensive and required a PhD sitting across from you and answering lots of questions and writing reports. Multimeasure tests are a way to understand someone's inherent traits -- your memory, your attentiveness.
01:46 What if we could take multimeasure tests and make them scalable and accessible, and provide data to employers about really what the traits are of someone who can make them a good fit for a job? This all sounds abstract. Let's try one of the games together. You're about to see a flashing circle, and your job is going to be to clap when the circle is red and do nothing when it's green.
02:07 [Ready?]
02:09 [Begin!]
02:11 [Green circle]
02:13 [Green circle]
02:15 [Red circle]
02:17 [Green circle]
02:19 [Red circle]
02:21 Maybe you're the type of person who claps the millisecond after a red circle appears. Or maybe you're the type of person who takes just a little bit longer to be 100 percent sure. Or maybe you clap on green even though you're not supposed to.
02:33 The cool thing here is that this isn't like a standardized test where some people are employable and some people aren't. Instead it's about understanding the fit between your characteristics and what would make you good at a certain job. We found that if you clap late on red and you never clap on the green, you might be high in attentiveness and high in restraint. People in that quadrant tend to be great students, great test-takers, great at project management or accounting. But if you clap immediately on red and sometimes clap on green, that might mean that you're more impulsive and creative, and we've found that top-performing salespeople often embody these traits.
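The quadrant idea described above can be sketched in a few lines of code. This is a minimal illustration, not pymetrics' actual scoring model: the trait labels, the 400 ms cutoff, and the 10 percent false-clap threshold are all assumptions made up for the example.

```python
# Hypothetical scoring sketch for the go/no-go "clap game" described above.
# All thresholds and labels are illustrative assumptions.

def score_clap_game(red_reaction_ms, green_claps, red_trials, green_trials):
    """Map raw game results onto two illustrative trait axes."""
    # Slower, deliberate responses on red suggest higher restraint/attentiveness.
    attentive = red_reaction_ms > 400              # assumed cutoff in milliseconds
    # Clapping on more than 10% of green circles suggests impulsivity.
    impulsive = green_claps / green_trials > 0.1   # assumed false-clap threshold

    if attentive and not impulsive:
        return "high attentiveness / high restraint"
    if not attentive and impulsive:
        return "impulsive / creative"
    return "mixed profile"

print(score_clap_game(red_reaction_ms=520, green_claps=0, red_trials=2, green_trials=3))
# → high attentiveness / high restraint
```

The point of the sketch is the talk's framing: neither output is "employable" or "unemployable"; each maps to different roles.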
03:07 The way we actually use this in hiring is we have top performers in a role go through neuroscience exercises like this one. Then we develop an algorithm that understands what makes those top performers unique. And then when people apply to the job, we're able to surface the candidates who might be best suited for that job.
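One simple way to picture the "surface the candidates" step is nearest-centroid ranking: average the top performers' trait vectors into a profile, then rank applicants by distance to it. The trait names, numbers, and choice of nearest-centroid matching here are illustrative assumptions, not the actual pymetrics algorithm.

```python
# Sketch: rank applicants by similarity to a top-performer trait profile.
# Trait vectors are (attentiveness, restraint, risk tolerance) -- assumed data.
import math

def centroid(vectors):
    """Average a list of equal-length trait vectors dimension by dimension."""
    return [sum(dim) / len(vectors) for dim in zip(*vectors)]

def distance(a, b):
    """Euclidean distance between two trait vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Trait vectors measured from current top performers in the role.
top_performers = [[0.9, 0.8, 0.3], [0.85, 0.9, 0.2], [0.8, 0.85, 0.25]]
profile = centroid(top_performers)

# Rank applicants: smaller distance = closer fit to the top-performer profile.
applicants = {"A": [0.88, 0.84, 0.3], "B": [0.2, 0.3, 0.9]}
ranked = sorted(applicants, key=lambda name: distance(applicants[name], profile))
print(ranked)  # applicant A sorts ahead of B
```

A real system would use far richer features and a learned model, but the shape is the same: learn what distinguishes top performers, then score new applicants against it.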
03:24 So you might be thinking there's a danger in this. The work world today is not the most diverse, and if we're building algorithms based on current top performers, how do we make sure that we're not just perpetuating the biases that already exist? For example, if we were building an algorithm based on top-performing CEOs and used the S&P 500 as a training set, you would actually find that you're more likely to hire a white man named John than any woman. And that's the reality of who's in those roles right now.
03:51 But technology actually poses a really interesting opportunity. We can create algorithms that are more equitable and more fair than human beings have ever been. Every algorithm that we put into production has been pretested to ensure that it doesn't favor any gender or ethnicity. And if there's any population that's being overfavored, we can actually alter the algorithm until that's no longer true. When we focus on the inherent characteristics that can make somebody a good fit for a job, we can transcend racism, classism, sexism, ageism -- even good schoolism.
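One widely used pretest of the kind described above is the "four-fifths rule" from US employee-selection guidelines: no group's selection rate should fall below 80 percent of the highest group's rate. Whether pymetrics applies this exact check is an assumption; the sketch only illustrates how pass rates per group could be audited before an algorithm goes to production.

```python
# Sketch of an adverse-impact audit using the four-fifths rule.
# Group names and counts are made-up example data.

def passes_four_fifths(selected_by_group, applicants_by_group):
    """True if every group's selection rate is >= 80% of the highest rate."""
    rates = {g: selected_by_group[g] / applicants_by_group[g]
             for g in applicants_by_group}
    best = max(rates.values())
    return all(rate >= 0.8 * best for rate in rates.values())

# 50% vs 45% selection rate -> ratio 0.9, within the four-fifths threshold.
print(passes_four_fifths({"group_a": 50, "group_b": 45},
                         {"group_a": 100, "group_b": 100}))  # True
# 50% vs 30% -> ratio 0.6; the algorithm would be altered and retested.
print(passes_four_fifths({"group_a": 50, "group_b": 30},
                         {"group_a": 100, "group_b": 100}))  # False
```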
04:21 Our best technology and algorithms shouldn't just be used for helping us find our next movie binge or new favorite Justin Bieber song. Imagine if we could harness the power of technology to get real guidance on what we should be doing based on who we are at a deeper level.
Translated by Penny Martínez
Reviewed by Alicia Ferreiro
