
TED2015

Laura Schulz: The surprisingly logical minds of babies

Lora Šulc (Laura Schulz): Iznenađujuće logičan um beba

Views 1,632,838

How do babies learn so much from so little so quickly? In a fun talk full of stories about experiments, cognitive scientist Laura Schulz shows how our young ones make decisions with a surprisingly strong sense of logic, long before they can talk.


Mark Tven je sumirao ono što smatram
00:12
Mark Twain summed up
what I take to be
jednim od temeljnih problema
kognitivne nauke
00:14
one of the fundamental problems
of cognitive science
samo jednom dosetkom.
00:18
with a single witticism.
Rekao je: "Postoji nešto fascinantno
u vezi sa naukom.
00:20
He said, "There's something
fascinating about science.
Dobija se veliki obrt pretpostavki
00:23
One gets such wholesale
returns of conjecture
od tako sitnog ulaganja u činjenice."
00:26
out of such a trifling
investment in fact."
(Smeh)
00:29
(Laughter)
Tven je mislio to kao šalu, naravno,
ali u pravu je,
00:32
Twain meant it as a joke,
of course, but he's right:
postoji nešto fascinantno
u vezi sa naukom.
00:34
There's something
fascinating about science.
Na osnovu nekoliko kostiju,
zaključujemo o postojanju dinosaurusa.
00:37
From a few bones, we infer
the existence of dinosaurs.
Na osnovu spektralnih linija,
o sastavu nebula.
00:42
From spectral lines,
the composition of nebulae.
Od voćne mušice,
00:47
From fruit flies,
o mehanizmima nasleđivanja,
00:50
the mechanisms of heredity,
a na osnovu rekonstruisanih snimaka
protoka krvi kroz mozak,
00:53
and from reconstructed images
of blood flowing through the brain,
ili u mom slučaju,
na osnovu ponašanja veoma male dece,
00:57
or in my case, from the behavior
of very young children,
pokušavamo da kažemo
nešto o osnovnim mehanizmima
01:02
we try to say something about
the fundamental mechanisms
ljudske kognicije.
01:05
of human cognition.
Konkretno, u mojoj laboratoriji
na Odeljenju za mozak i kognitivne nauke
na Masačusetskom tehnološkom institutu,
01:07
In particular, in my lab in the Department
of Brain and Cognitive Sciences at MIT,
provela sam proteklu deceniju
pokušavajući da razumem misteriju
01:12
I have spent the past decade
trying to understand the mystery
kako deca uče tako mnogo
iz tako malo tako brzo.
01:16
of how children learn so much
from so little so quickly.
Jer, ispostavlja se
da je fascinantna stvar u vezi sa naukom
01:20
Because, it turns out that
the fascinating thing about science
takođe i fascinantna stvar
u vezi sa decom,
01:23
is also a fascinating
thing about children,
a to je, da ublažim verziju Marka Tvena,
01:27
which, to put a gentler
spin on Mark Twain,
upravo njihova sposobnost da izvuku
bujne, apstraktne zaključke
01:29
is precisely their ability
to draw rich, abstract inferences
brzo i tačno
iz oskudnih, izmešanih podataka.
01:34
rapidly and accurately
from sparse, noisy data.
Daću vam dva primera.
01:40
I'm going to give you
just two examples today.
Jedan je o problemu generalizacije,
01:42
One is about a problem of generalization,
a drugi je
o problemu uzročnog rezonovanja.
01:45
and the other is about a problem
of causal reasoning.
I mada ću govoriti
o radu u mojoj laboratoriji,
01:47
And although I'm going to talk
about work in my lab,
ovaj rad je imao inspiraciju na terenu
i njemu ga dugujem.
01:50
this work is inspired by
and indebted to a field.
Zahvalna sam mentorima,
kolegama i saradnicima širom sveta.
01:53
I'm grateful to mentors, colleagues,
and collaborators around the world.
Počeću problemom generalizacije.
01:59
Let me start with the problem
of generalization.
Uopštavanje
na osnovu malih uzoraka podataka
je osnovni izvor nauke.
02:02
Generalizing from small samples of data
is the bread and butter of science.
Izbrojimo mali deo izbornog tela
02:06
We poll a tiny fraction of the electorate
i predviđamo ishod nacionalnih izbora.
02:09
and we predict the outcome
of national elections.
Vidimo kako nekolicina pacijenata reaguje
na tretman u kliničkom ispitivanju,
02:12
We see how a handful of patients
responds to treatment in a clinical trial,
i donosimo lekove na domaće tržište.
02:16
and we bring drugs to a national market.
Ali ovo funkcioniše samo ako je naš uzorak
nasumično izvučen iz populacije.
02:19
But this only works if our sample
is randomly drawn from the population.
Ako je naš uzorak biran na neki način -
02:23
If our sample is cherry-picked
in some way --
recimo, ispitamo samo gradske birače,
02:26
say, we poll only urban voters,
ili recimo, u kliničkim ispitivanjima
tretmana bolesti srca
02:28
or say, in our clinical trials
for treatments for heart disease,
uključimo samo muškarce -
02:32
we include only men --
rezultati se možda neće generalizovati
na širu populaciju.
02:34
the results may not generalize
to the broader population.
Dakle, naučnike zanima da li su dokazi
slučajno uzorkovani ili ne,
02:38
So scientists care whether evidence
is randomly sampled or not,
ali kakve to ima veze sa bebama?
02:42
but what does that have to do with babies?
Pa, bebe moraju stalno da generalizuju
na osnovu malih uzoraka podataka.
02:44
Well, babies have to generalize
from small samples of data all the time.
Vide nekoliko gumenih pataka
i nauče da one plutaju,
02:49
They see a few rubber ducks
and learn that they float,
ili nekoliko lopti i nauče da one odskaču.
02:52
or a few balls and learn that they bounce.
I razvijaju očekivanja
u vezi sa patkama i loptama
02:55
And they develop expectations
about ducks and balls
koje će proširiti na gumene patke i lopte
02:58
that they're going to extend
to rubber ducks and balls
do kraja njihovih života.
03:01
for the rest of their lives.
A vrste generalizacija koje bebe prave
o patkama i loptama
03:03
And the kinds of generalizations
babies have to make about ducks and balls
moraju da prave o gotovo svemu:
03:07
they have to make about almost everything:
cipelama, brodovima,
vosku za pečaćenje, kupusu i kraljevima.
03:09
shoes and ships and sealing wax
and cabbages and kings.
Da li bebe zanima
da li delić dokaza koji one vide
03:14
So do babies care whether
the tiny bit of evidence they see
verodostojno predstavlja veću populaciju?
03:17
is plausibly representative
of a larger population?
Hajde da to otkrijemo.
03:21
Let's find out.
Pokazaću vam dva filma,
03:23
I'm going to show you two movies,
jedan iz svake od situacija
u eksperimentu,
03:25
one from each of two conditions
of an experiment,
i pošto ćete videti samo dva filma,
03:27
and because you're going to see
just two movies,
videćete samo dve bebe,
03:30
you're going to see just two babies,
a bilo koje dve bebe
se razlikuju međusobno na bezbroj načina.
03:32
and any two babies differ from each other
in innumerable ways.
Ali ove bebe, naravno,
ovde zastupaju grupe beba,
03:36
But these babies, of course,
here stand in for groups of babies,
i razlike koje ćete videti
03:39
and the differences you're going to see
predstavljaju prosečne grupne razlike
u ponašanju beba kroz različite uslove.
03:41
represent average group differences
in babies' behavior across conditions.
U svakom filmu ćete videti
kako beba radi
03:47
In each movie, you're going to see
a baby doing maybe
možda baš ono što biste očekivali
da će beba uraditi,
03:49
just exactly what you might
expect a baby to do,
a teško da možemo da učinimo bebe
čarobnijim nego što već jesu.
03:53
and we can hardly make babies
more magical than they already are.
Ali za mene je čarobna stvar,
03:58
But to my mind the magical thing,
i ono na šta želim da obratite pažnju,
04:00
and what I want you to pay attention to,
kontrast između ova dva uslova,
04:02
is the contrast between
these two conditions,
jer jedino što razlikuje ova dva filma
04:05
because the only thing
that differs between these two movies
je statistički dokaz
koji će bebe primetiti.
04:08
is the statistical evidence
the babies are going to observe.
Pokazaćemo bebama
kutiju plavih i žutih lopti,
04:13
We're going to show babies
a box of blue and yellow balls,
a moja tadašnja studentkinja,
sada koleginica na Stenfordu, Jouon Gvon,
04:16
and my then-graduate student,
now colleague at Stanford, Hyowon Gweon,
izvući će tri plave lopte
zaredom iz ove kutije,
04:21
is going to pull three blue balls
in a row out of this box,
i kada izvuče te lopte, stisnuće ih,
04:24
and when she pulls those balls out,
she's going to squeeze them,
a lopte će zapištati.
04:27
and the balls are going to squeak.
Ako ste beba, to je kao TED govor.
04:29
And if you're a baby,
that's like a TED Talk.
Ne može biti bolje od toga.
04:32
It doesn't get better than that.
(Smeh)
04:34
(Laughter)
Ali bitna poenta je da je veoma lako
izvući tri plave loptice zaredom
04:38
But the important point is it's really
easy to pull three blue balls in a row
iz kutije sa pretežno plavim lopticama.
04:42
out of a box of mostly blue balls.
Možete to da uradite sa zatvorenim očima.
04:44
You could do that with your eyes closed.
To je verovatno slučajni uzorak
iz ove populacije.
04:46
It's plausibly a random sample
from this population.
A ako možete posegnuti u kutiju nasumice
i izvaditi stvari koje pište,
04:49
And if you can reach into a box at random
and pull out things that squeak,
onda možda sve u toj kutiji pišti.
04:53
then maybe everything in the box squeaks.
Možda bebe očekuju
da žute lopte takođe pište.
04:56
So maybe babies should expect
those yellow balls to squeak as well.
Te žute lopte
imaju zabavne štapiće na kraju,
05:00
Now, those yellow balls
have funny sticks on the end,
tako da bebe mogu
da rade druge stvari sa njima ako hoće.
05:02
so babies could do other things
with them if they wanted to.
Mogu da ih lupaju ili udaraju.
05:05
They could pound them or whack them.
Ali hajde da vidimo šta beba radi.
05:07
But let's see what the baby does.
(Video) Jouon Gvon: Vidiš ovo?
(Lopta pišti)
05:12
(Video) Hyowon Gweon: See this?
(Ball squeaks)
Jesi li videla to? (Lopta pišti)
05:16
Did you see that?
(Ball squeaks)
Kul.
05:20
Cool.
Vidiš ovu?
05:24
See this one?
(Lopta pišti)
05:26
(Ball squeaks)
Opa!
05:28
Wow.
Lora Šulc: Rekla sam vam. (Smeh)
05:33
Laura Schulz: Told you. (Laughs)
(Video) JG: Vidiš ovu? (Lopta pišti)
05:35
(Video) HG: See this one?
(Ball squeaks)
Hej Klara, ova je za tebe.
Možeš da se igraš.
05:39
Hey Clara, this one's for you.
You can go ahead and play.
(Smeh)
05:51
(Laughter)
LŠ: Ne moram ni da pričam, zar ne?
05:56
LS: I don't even have to talk, right?
U redu, lepo je to što će bebe
generalizovati osobine
05:59
All right, it's nice that babies
will generalize properties
plavih loptica na žute loptice,
06:02
of blue balls to yellow balls,
i impresivno je to što bebe
mogu da uče imitirajući nas,
06:03
and it's impressive that babies
can learn from imitating us,
ali to sve znamo o bebama još odavno.
06:06
but we've known those things about babies
for a very long time.
Zaista zanimljivo pitanje
06:10
The really interesting question
je šta se dešava
kada pokažemo bebama isto to,
06:12
is what happens when we show babies
exactly the same thing,
a možemo da obezbedimo da bude baš isto
jer imamo tajnu pregradu
06:15
and we can ensure it's exactly the same
because we have a secret compartment
i izvlačimo lopte odatle,
06:18
and we actually pull the balls from there,
ali ovog puta
menjamo samo vidljivu populaciju
06:20
but this time, all we change
is the apparent population
iz koje se izvlači dokaz.
06:24
from which that evidence was drawn.
Ovoga puta ćemo bebama pokazati
tri plave loptice
06:27
This time, we're going to show babies
three blue balls
izvučene iz kutije
sa pretežno žutim lopticama,
06:30
pulled out of a box
of mostly yellow balls,
i pogodite šta?
06:34
and guess what?
Verovatno nećete nasumično izvući
tri loptice zaredom
06:35
You [probably won't] randomly draw
three blue balls in a row
iz kutije sa većinom žutim lopticama.
06:38
out of a box of mostly yellow balls.
To nije
verovatan slučajno uzorkovani dokaz.
06:40
That is not plausibly
randomly sampled evidence.
Taj dokaz ukazuje da je možda Jouon
namerno uzorkovala plave loptice.
06:44
That evidence suggests that maybe Hyowon
was deliberately sampling the blue balls.
Možda postoji
nešto posebno u vezi sa plavim lopticama.
06:49
Maybe there's something special
about the blue balls.
Možda samo plave loptice pište.
06:52
Maybe only the blue balls squeak.
Hajde da vidimo šta beba radi.
06:55
Let's see what the baby does.
(Video) JG: Vidiš ovo? (Lopta pišti)
06:57
(Video) HG: See this?
(Ball squeaks)
Vidiš ovu igračku? (Lopta pišti)
07:02
See this toy?
(Ball squeaks)
O, to je bilo kul. Vidiš? (Lopta pišti)
07:05
Oh, that was cool. See?
(Ball squeaks)
Ova je za tebe da se igraš.
Možeš da se igraš.
07:10
Now this one's for you to play.
You can go ahead and play.
(Beba negoduje) (Smeh)
07:18
(Fussing)
(Laughter)
LŠ: Upravo ste videli
dve bebe stare 15 meseci
07:26
LS: So you just saw
two 15-month-old babies
koje rade potpuno različite stvari
07:29
do entirely different things
samo na osnovu verovatnoće uzorka
koji su zapazile.
07:31
based only on the probability
of the sample they observed.
Dozvolite da vam pokažem
eksperimentalne rezultate.
07:35
Let me show you the experimental results.
Na vertikalnoj osi
ćete videti procenat beba
07:37
On the vertical axis, you'll see
the percentage of babies
koje su stiskale loptu u svakoj situaciji,
07:40
who squeezed the ball in each condition,
i kao što ćete videti,
mnogo je verovatnije
da će bebe generalizovati dokaz
07:42
and as you'll see, babies are much
more likely to generalize the evidence
kada verodostojnije predstavlja populaciju
07:46
when it's plausibly representative
of the population
nego kada je očigledno probran.
07:49
than when the evidence
is clearly cherry-picked.
A to navodi na zabavno predviđanje:
07:53
And this leads to a fun prediction:
recimo da ste izvukli
samo jednu plavu loptu
iz uglavnom žute kutije.
07:55
Suppose you pulled just one blue ball
out of the mostly yellow box.
Verovatno nećete izvući
tri plave lopte zaredom iz žute kutije,
08:00
You [probably won't] pull three blue balls
in a row at random out of a yellow box,
ali biste mogli nasumice uzeti
samo jednu plavu loptu.
08:04
but you could randomly sample
just one blue ball.
To nije neverovatan uzorak.
08:07
That's not an improbable sample.
A ako posegnete u kutiju nasumice
08:09
And if you could reach into
a box at random
i izvučete nešto što pišti,
možda sve u kutiji pišti.
08:11
and pull out something that squeaks,
maybe everything in the box squeaks.
Dakle, iako će bebe videti
mnogo manje dokaza za pištanje,
08:15
So even though babies are going to see
much less evidence for squeaking,
i imati mnogo manje radnji za oponašanje
08:20
and have many fewer actions to imitate
u situaciji sa jednom loptom
nego u situaciji koju ste upravo videli,
08:22
in this one ball condition than in
the condition you just saw,
predvideli smo da će bebe stiskati više,
08:25
we predicted that babies themselves
would squeeze more,
i to je upravo ono što smo pronašli.
08:29
and that's exactly what we found.
Dakle, bebama od 15 meseci,
u ovom pogledu, kao i naučnicima,
08:32
So 15-month-old babies,
in this respect, like scientists,
je bitno da li je dokaz
nasumično uzorkovan ili ne,
08:37
care whether evidence
is randomly sampled or not,
i one koriste to da stvore
očekivanja o svetu:
08:40
and they use this to develop
expectations about the world:
šta pišti, a šta ne,
08:43
what squeaks and what doesn't,
šta istražiti, a šta ignorisati.
08:45
what to explore and what to ignore.
Dozvolite mi da vam sada pokažem
još jedan primer,
08:50
Let me show you another example now,
ovog puta o problemu uzročnog rasuđivanja.
08:52
this time about a problem
of causal reasoning.
Počinje problemom zbunjujućeg dokaza,
08:55
And it starts with a problem
of confounded evidence
koji postoji kod svih nas,
08:57
that all of us have,
a to je da smo deo sveta.
08:59
which is that we are part of the world.
I to vam možda ne deluje kao problem
ali, kao i većina problema,
09:01
And this might not seem like a problem
to you, but like most problems,
postaje problem
tek kada stvari krenu naopako.
09:04
it's only a problem when things go wrong.
Uzmite ovu bebu, na primer.
09:07
Take this baby, for instance.
Stvari mu ne polaze za rukom.
09:09
Things are going wrong for him.
Želeo bi da pokrene ovu igračku,
ali ne može.
09:10
He would like to make
this toy go, and he can't.
Pokazaću vam snimak od nekoliko sekundi.
09:13
I'll show you a few-second clip.
Postoje dve mogućnosti, uglavnom.
09:21
And there's two possibilities, broadly:
Možda radi nešto pogrešno,
09:23
Maybe he's doing something wrong,
ili možda nešto nije u redu sa igračkom.
09:25
or maybe there's something
wrong with the toy.
Dakle, u sledećem eksperimentu,
09:30
So in this next experiment,
daćemo bebama
samo delić statističkih podataka
09:32
we're going to give babies
just a tiny bit of statistical data
koji podržavaju jednu od hipoteza,
09:35
supporting one hypothesis over the other,
i videćemo da li bebe mogu to da koriste
kako bi donosile različite odluke
09:38
and we're going to see if babies
can use that to make different decisions
o onome što će činiti.
09:41
about what to do.
Evo postavke.
09:43
Here's the setup.
Jouon će pokušati da pokrene igračku
i uspeti u tome.
09:46
Hyowon is going to try to make
the toy go and succeed.
Ja ću potom pokušati dva puta
i oba puta neću uspeti,
09:49
I am then going to try twice
and fail both times,
zatim će Jouon pokušati ponovo i uspeti,
09:52
and then Hyowon is going
to try again and succeed,
i to otprilike rezimira
odnos koji imam sa mojim studentima
09:55
and this roughly sums up my relationship
to my graduate students
po pitanju svih vrsta tehnologija.
09:58
in technology across the board.
Ali, ono što je ovde važno
jeste to da se pruža malo dokaza
10:02
But the important point here is
it provides a little bit of evidence
da problem nije sa igračkom,
već sa osobom.
10:05
that the problem isn't with the toy,
it's with the person.
Neki ljudi mogu da pokrenu ovu igračku,
10:08
Some people can make this toy go,
a neki ne mogu.
10:11
and some can't.
Sad, kada beba dobije igračku,
imaće izbor.
10:12
Now, when the baby gets the toy,
he's going to have a choice.
Njegova mama je tu pored,
10:16
His mom is right there,
tako da može da joj priđe,
preda igračku i promeni osobu,
10:18
so he can go ahead and hand off the toy
and change the person,
ali na kraju te krpe
će biti još jedna igračka,
10:21
but there's also going to be
another toy at the end of that cloth,
i on može da povuče krpu ka sebi
i promeni igračku.
10:24
and he can pull the cloth towards him
and change the toy.
Hajde da vidimo šta će beba uraditi.
10:28
So let's see what the baby does.
(Video) JG: Dva, tri. Sad! (Muzika)
10:30
(Video) HG: Two, three. Go!
(Music)
LŠ: Jedan, dva, tri, sad!
10:34
LS: One, two, three, go!
Arture, pokušaću ponovo.
Jedan, dva, tri, sad!
10:37
Arthur, I'm going to try again.
One, two, three, go!
JG: Arture, dopusti
da ja pokušam ponovo, okej?
10:45
HG: Arthur, let me try again, okay?
Jedan, dva, tri, sad! (Muzika)
10:48
One, two, three, go!
(Music)
Pogledaj. Sećaš li se tih igračaka?
10:53
Look at that. Remember these toys?
Vidiš te igračke? Da, staviću ovu ovde,
10:55
See these toys? Yeah, I'm going
to put this one over here,
a ovu ću ti dati.
10:58
and I'm going to give this one to you.
Možeš da se igraš.
11:00
You can go ahead and play.
LŠ: Okej, Lora, ali naravno,
bebe vole svoje mame.
11:23
LS: Okay, Laura, but of course,
babies love their mommies.
Naravno da bebe daju igračke svojim mamama
11:27
Of course babies give toys
to their mommies
kada ne mogu da učine da prorade.
11:30
when they can't make them work.
Još jednom, zaista bitno pitanje
je šta se dešava kada promenimo
11:32
So again, the really important question
is what happens when we change
statističke podatke neznatno.
11:35
the statistical data ever so slightly.
Ovog puta, bebe će videti
kako igračka radi i ne radi
potpuno istim redosledom,
11:38
This time, babies are going to see the toy
work and fail in exactly the same order,
ali ćemo izmeniti raspodelu dokaza.
11:42
but we're changing
the distribution of evidence.
Ovog puta će Jouon uspeti jednom
i neće uspeti jednom, a isto tako ću i ja.
11:45
This time, Hyowon is going to succeed
once and fail once, and so am I.
Ovo ukazuje da nije bitno ko isprobava
igračku, igračka je pokvarena.
11:49
And this suggests it doesn't matter
who tries this toy, the toy is broken.
Ne radi uvek.
11:55
It doesn't work all the time.
Još jednom, beba će imati izbor.
11:57
Again, the baby's going to have a choice.
Njena mama je tu pored,
tako da može da promeni osobu,
11:59
Her mom is right next to her,
so she can change the person,
i biće tu još jedna igračka na kraju krpe.
12:02
and there's going to be another toy
at the end of the cloth.
Hajde da vidimo šta će uraditi.
12:05
Let's watch what she does.
(Video) JG: Dva, tri, sad! (Muzika)
12:07
(Video) HG: Two, three, go!
(Music)
Daj da probam još jednom.
Jedan, dva, tri, sad!
12:11
Let me try one more time.
One, two, three, go!
Hmmm.
12:17
Hmm.
LŠ: Daj da ja probam, Klara.
12:19
LS: Let me try, Clara.
Jedan, dva, tri, sad!
12:22
One, two, three, go!
Hmmm, daj da probam još jednom.
12:27
Hmm, let me try again.
Jedan, dva, tri, sad! (Muzika)
12:29
One, two, three, go!
(Music)
JG: Staviću ovu ovde,
12:35
HG: I'm going
to put this one over here,
a ovu ću ti dati.
12:37
and I'm going to give this one to you.
Možeš da se igraš.
12:39
You can go ahead and play.
(Aplauz)
12:58
(Applause)
LŠ: Dozvolite da vam pokažem
rezultate eksperimenta.
13:04
LS: Let me show you
the experimental results.
Na vertikalnoj osi ćete videti raspodelu
13:07
On the vertical axis,
you'll see the distribution
izbora dece u svakoj od situacija,
13:09
of children's choices in each condition,
i videćete da raspodela izbora
koji deca donose
13:12
and you'll see that the distribution
of the choices children make
zavisi od dokaza koje posmatraju.
13:16
depends on the evidence they observe.
U drugoj godini života
13:19
So in the second year of life,
bebe mogu da koriste
malo statističkih podataka
13:21
babies can use a tiny bit
of statistical data
da bi odabrali između dve
fundamentalno različite strategije
13:24
to decide between two
fundamentally different strategies
za postupanje u svetu:
13:27
for acting in the world:
pitati za pomoć i istraživati.
13:29
asking for help and exploring.
Upravo sam vam pokazala
dva laboratorijska eksperimenta
13:33
I've just shown you
two laboratory experiments
od bukvalno stotina u ovoj oblasti
koji imaju sličnu poentu,
13:37
out of literally hundreds in the field
that make similar points,
jer je presudna poenta
13:40
because the really critical point
da se sposobnost dece da donose
bogate zaključke iz oskudnih podataka
13:43
is that children's ability
to make rich inferences from sparse data
nalazi u osnovi
svakog specifičnog kulturnog učenja.
13:48
underlies all the species-specific
cultural learning that we do.
Deca uče o novim alatkama
na osnovu samo nekoliko primera.
13:53
Children learn about new tools
from just a few examples.
Uče nove uzročno-posledične veze
iz samo nekoliko primera.
13:58
They learn new causal relationships
from just a few examples.
Čak uče i nove reči,
u ovom slučaju američki znakovni jezik.
14:03
They even learn new words,
in this case in American Sign Language.
Želim da završim sa samo dve poente.
14:08
I want to close with just two points.
Ako ste pratili moj svet,
oblast mozga i kognitivne nauke,
14:12
If you've been following my world,
the field of brain and cognitive sciences,
poslednjih nekoliko godina,
14:15
for the past few years,
tri ideje su vam privukle pažnju.
14:17
three big ideas will have come
to your attention.
Prva je da je ovo era mozga.
14:20
The first is that this is
the era of the brain.
I zaista, bilo je neverovatnih otkrića
u neuronaukama:
14:23
And indeed, there have been
staggering discoveries in neuroscience:
lokalizacija funkcionalno specijalizovanih
regija korteksa,
14:27
localizing functionally specialized
regions of cortex,
dovođenje mišjeg mozga
u transparentno stanje,
14:30
turning mouse brains transparent,
aktiviranje neurona svetlošću.
14:33
activating neurons with light.
Druga velika ideja
14:36
A second big idea
je da je ovo era velikih podataka
i mašinskog učenja,
14:38
is that this is the era of big data
and machine learning,
a mašinsko učenje
obećava revoluciju u našem razumevanju
14:43
and machine learning promises
to revolutionize our understanding
svega, od društvenih mreža
do epidemiologije.
14:46
of everything from social networks
to epidemiology.
I možda će nam, kako se bavi
problemima razumevanja scene
14:50
And maybe, as it tackles problems
of scene understanding
i obrade prirodnog jezika,
14:53
and natural language processing,
reći nešto o ljudskoj kogniciji.
14:55
to tell us something
about human cognition.
A poslednja velika ideja koju ćete čuti
14:59
And the final big idea you'll have heard
je da je možda dobra ideja
da ćemo toliko znati o mozgu
15:01
is that maybe it's a good idea we're going
to know so much about brains
i imati toliko pristupa velikim podacima,
15:05
and have so much access to big data,
jer prepušteni sami sebi,
15:06
because left to our own devices,
ljudi su skloni greškama,
koristimo prečice,
15:09
humans are fallible, we take shortcuts,
grešimo, pravimo pogreške,
15:13
we err, we make mistakes,
imamo predrasude, i na bezbroj načina,
15:16
we're biased, and in innumerable ways,
shvatamo svet pogrešno.
15:20
we get the world wrong.
Mislim da su ovo sve važne priče,
15:24
I think these are all important stories,
i imaju mnogo toga da nam kažu
o tome šta znači biti čovek,
15:27
and they have a lot to tell us
about what it means to be human,
ali želim da primite k znanju da sam
vam danas ispričala veoma drugačiju priču.
15:31
but I want you to note that today
I told you a very different story.
To je priča o umu, a ne o mozgu,
15:35
It's a story about minds and not brains,
a naročito,
to je priča o vrstama proračuna
15:39
and in particular, it's a story
about the kinds of computations
koje jedino ljudski um može da vrši,
15:42
that uniquely human minds can perform,
što podrazumeva bogato,
strukturirano znanje i sposobnost učenja
15:45
which involve rich, structured knowledge
and the ability to learn
iz malih količina podataka,
dokaz samo na osnovu nekoliko primera.
15:49
from small amounts of data,
the evidence of just a few examples.
I u osnovi, to je priča
o tome kako počevši kao veoma mala deca
15:56
And fundamentally, it's a story
about how starting as very small children
i nastavljajući
sve do najvećih dostignuća
16:00
and continuing out all the way
to the greatest accomplishments
naše kulture,
16:04
of our culture,
shvatamo svet na pravi način.
16:08
we get the world right.
Narode, ljudski um
ne uči samo iz malih količina podataka.
16:12
Folks, human minds do not only learn
from small amounts of data.
Ljudski umovi smišljaju
potpuno nove ideje.
16:18
Human minds think
of altogether new ideas.
Ljudski umovi rađaju
istraživanja i otkrića,
16:20
Human minds generate
research and discovery,
rađaju umetnost i književnost,
poeziju i pozorište,
16:23
and human minds generate
art and literature and poetry and theater,
i ljudski umovi se brinu o drugim ljudima:
16:29
and human minds take care of other humans:
našim starima, mladima, bolesnima.
16:32
our old, our young, our sick.
Čak ih i lečimo.
16:36
We even heal them.
U godinama koje su pred nama,
videćemo tehnološke inovacije
16:39
In the years to come, we're going
to see technological innovations
kakve ne mogu ni da zamislim,
16:42
beyond anything I can even envision,
ali je veoma malo verovatno
16:46
but we are very unlikely
da ćemo videti bilo šta čak ni približno
moći proračuna ljudskog deteta
16:48
to see anything even approximating
the computational power of a human child
tokom mog života ili vašeg.
16:54
in my lifetime or in yours.
Ako ulažemo
u te najmoćnije učenike i njihov razvoj,
16:58
If we invest in these most powerful
learners and their development,
u bebe i decu
17:03
in babies and children
i majke i očeve
17:06
and mothers and fathers
i staratelje i učitelje
17:08
and caregivers and teachers
onako kako ulažemo u druge naše
najmoćnije i najelegantnije oblike
17:11
the ways we invest in our other
most powerful and elegant forms
tehnologije, inženjeringa i dizajna,
17:15
of technology, engineering and design,
nećemo samo sanjati o boljoj budućnosti,
17:18
we will not just be dreaming
of a better future,
već ćemo je planirati.
17:21
we will be planning for one.
Mnogo vam hvala.
17:23
Thank you very much.
(Aplauz)
17:25
(Applause)
Kris Anderson: Lora, hvala.
Ja zapravo imam jedno pitanje za tebe.
17:29
Chris Anderson: Laura, thank you.
I do actually have a question for you.
Pre svega, istraživanje je suludo.
17:34
First of all, the research is insane.
Mislim, ko bi osmislio
takav eksperiment? (Smeh)
17:36
I mean, who would design
an experiment like that? (Laughter)
Video sam to par puta,
17:41
I've seen that a couple of times,
i još uvek iskreno ne verujem
da se to stvarno dešava,
17:42
and I still don't honestly believe
that that can truly be happening,
ali i drugi su uradili
slične eksperimente; provereno je.
17:46
but other people have done
similar experiments; it checks out.
Bebe su stvarno toliko genijalne.
17:49
The babies really are that genius.
LŠ: Znaš, izgledaju stvarno impresivno
u našim eksperimentima,
17:50
LS: You know, they look really impressive
in our experiments,
ali pomisli na to kako izgledaju
u stvarnom životu.
17:53
but think about what they
look like in real life, right?
Počinje kao beba.
17:56
It starts out as a baby.
Osamnaest meseci kasnije priča sa vama,
17:57
Eighteen months later,
it's talking to you,
a bebine prve reči
nisu samo one poput lopte i patke,
17:59
and babies' first words aren't just
things like balls and ducks,
to su i "nema",
što se odnosi na nestajanje,
18:02
they're things like "all gone,"
which refer to disappearance,
ili "o-o", što se odnosi
na nenamerne postupke.
18:05
or "uh-oh," which refer
to unintentional actions.
To mora da je toliko moćno.
18:07
It has to be that powerful.
To mora da je mnogo moćnije
od svega što sam vam pokazala.
18:09
It has to be much more powerful
than anything I showed you.
Oni otkrivaju ceo svet.
18:12
They're figuring out the entire world.
Dete od četiri godine
može da priča sa vama o gotovo svemu.
18:14
A four-year-old can talk to you
about almost anything.
(Aplauz)
18:17
(Applause)
KA: I ako sam te dobro razumeo,
druga tvoja ključna poenta je,
18:19
CA: And if I understand you right,
the other key point you're making is,
protekle su godine sa tom pričom
18:22
we've been through these years
where there's all this talk
o tome kako je um uvrnut i blesav,
18:25
of how quirky and buggy our minds are,
bihejvioralna ekonomija i čitave teorije
18:27
that behavioral economics
and the whole theories behind that
o tome kako nismo razumni izvršioci.
18:29
that we're not rational agents.
Ti u stvari govoriš da je veća priča
kako je izvanredan,
18:31
You're really saying that the bigger
story is how extraordinary,
i da je tu zapravo genije
koji se potcenjuje.
18:35
and there really is genius there
that is underappreciated.
LŠ: Jedan od mojih omiljenih citata
u psihologiji
18:40
LS: One of my favorite
quotes in psychology
potiče od socijalnog psihologa
Solomona Eša,
18:42
comes from the social
psychologist Solomon Asch,
a on je rekao da je osnovni zadatak
psihologije da ukloni
18:45
and he said the fundamental task
of psychology is to remove
zavesu samodokazivanja.
18:47
the veil of self-evidence from things.
Postoji milion redova veličine više odluka
koje donosite svakodnevno
18:50
There are orders of magnitude
more decisions you make every day
koje pravilno shvataju svet.
18:55
that get the world right.
Imate znanje o predmetima
i njihovim osobinama.
18:56
You know about objects
and their properties.
Prepoznajete ih kada su zaklonjeni.
Prepoznajete ih u mraku.
18:58
You know them when they're occluded.
You know them in the dark.
Možete da se krećete kroz prostorije.
19:01
You can walk through rooms.
Možete da shvatite šta drugi ljudi misle.
Možete da razgovarate sa njima.
19:02
You can figure out what other people
are thinking. You can talk to them.
Možete se kretati u prostoru.
Razumete brojeve.
19:06
You can navigate space.
You know about numbers.
Razumete uzročno-posledične veze.
Razumete moralno rasuđivanje.
19:08
You know causal relationships.
You know about moral reasoning.
Radite to bez napora, tako da se ne vidi,
19:11
You do this effortlessly,
so we don't see it,
ali to je način na koji poimamo svet,
a to je neverovatno dostignuće
19:14
but that is how we get the world right,
and it's a remarkable
i veoma teško za razumevanje.
19:16
and very difficult-to-understand
accomplishment.
KA: Pretpostavljam da postoje
ljudi u publici koji imaju
19:19
CA: I suspect there are people
in the audience who have
gledište o ubrzanoj tehnološkoj moći
19:21
this view of accelerating
technological power
koji bi mogli da ospore tvoju izjavu
da nikada za vreme našeg života
19:24
who might dispute your statement
that never in our lifetimes
računar neće uraditi
ono što može trogodišnje dete,
19:27
will a computer do what
a three-year-old child can do,
ali ono što je jasno jeste
da u bilo kom scenariju,
19:29
but what's clear is that in any scenario,
naše mašine mogu
mnogo toga da nauče od naših beba.
19:32
our machines have so much to learn
from our toddlers.
LŠ: Mislim da je tako. Tu su neki ljudi
koji se bave mašinama koje uče.
19:38
LS: I think so. You'll have some
machine learning folks up here.
Mislim, nikada se ne treba kladiti
protiv beba ili šimpanzi
19:41
I mean, you should never bet
against babies or chimpanzees
ili tehnologije tek tako,
19:45
or technology as a matter of practice,
ali nije u pitanju
samo razlika u količini,
19:49
but it's not just
a difference in quantity,
već razlika u vrsti.
19:53
it's a difference in kind.
Imamo neverovatno moćne kompjutere,
19:55
We have incredibly powerful computers,
i oni stvarno obavljaju
neverovatno sofisticirane stvari,
19:57
and they do do amazingly
sophisticated things,
često sa veoma velikom količinom podataka.
20:00
often with very big amounts of data.
Ljudski um čini, po meni,
nešto sasvim drugačije,
20:03
Human minds do, I think,
something quite different,
a mislim da je strukturirana,
hijerarhijska priroda ljudskog znanja
20:05
and I think it's the structured,
hierarchical nature of human knowledge
ono što ostaje pravi izazov.
20:09
that remains a real challenge.
KA: Lora Šulc, sjajna hrana za misli.
Mnogo ti hvala.
20:11
CA: Laura Schulz, wonderful
food for thought. Thank you so much.
LŠ: Hvala. (Aplauz)
20:14
LS: Thank you.
(Applause)
Translated by Ivana Krivokuća
Reviewed by Ivana Korom


About the speaker:

Laura Schulz - Cognitive scientist
Developmental behavior studies spearheaded by Laura Schulz are changing our notions of how children learn.

Why you should listen

MIT Early Childhood Cognition Lab lead investigator Laura Schulz studies learning in early childhood. Her research bridges computational models of cognitive development and behavioral studies in order to understand the origins of inquiry and discovery.

Working in play labs, children’s museums, and a recently launched citizen science website, Schulz is reshaping how we view young children’s perceptions of the world around them. Some of the surprising results of her research: before the age of four, children expect hidden causes when events happen probabilistically, use simple experiments to distinguish causal hypotheses, and trade off learning from instruction and exploration.
