ABOUT THE SPEAKER
Dan Ariely - Behavioral economist
The dismal science of economics is not as firmly grounded in actual behavior as was once supposed. In "Predictably Irrational," Dan Ariely told us why.

Why you should listen

Dan Ariely is a professor of psychology and behavioral economics at Duke University and a founding member of the Center for Advanced Hindsight. He is the author of the bestsellers Predictably Irrational, The Upside of Irrationality, and The Honest Truth About Dishonesty -- as well as the TED Book Payoff: The Hidden Logic That Shapes Our Motivations.

Through his research and his (often amusing and unorthodox) experiments, he questions the forces that influence human behavior and the irrational ways in which we often all behave.

EG 2008

Dan Ariely: Are we in control of our own decisions?


5,917,967 views

Behavioral economist Dan Ariely, the author of Predictably Irrational, uses classic visual illusions and his own counterintuitive (and sometimes shocking) research findings to show that we're not as rational as we think when we make decisions.


00:16
I'll tell you a little bit about irrational behavior.
00:19
Not yours, of course -- other people's.
00:21
(Laughter)
00:22
So after being at MIT for a few years,
00:26
I realized that writing academic papers is not that exciting.
00:30
You know, I don't know how many of those you read,
00:32
but it's not fun to read and often not fun to write --
00:35
even worse to write.
00:37
So I decided to try and write something more fun.
00:40
And I came up with an idea that I will write a cookbook.
00:44
And the title for my cookbook was going to be
00:46
"Dining Without Crumbs: The Art of Eating Over the Sink."
00:49
(Laughter)
00:51
And it was going to be a look at life through the kitchen.
00:54
And I was quite excited about this. I was going to talk
00:56
a little bit about research, a little bit about the kitchen.
00:59
You know, we do so much in the kitchen I thought this would be interesting.
01:02
And I wrote a couple of chapters.
01:04
And I took it to MIT Press and they said,
01:06
"Cute, but not for us. Go and find somebody else."
01:10
I tried other people and everybody said the same thing,
01:12
"Cute. Not for us."
01:15
Until somebody said,
01:18
"Look, if you're serious about this,
01:20
you first have to write a book about your research. You have to publish something,
01:23
and then you'll get the opportunity to write something else.
01:25
If you really want to do it you have to do it."
01:27
So I said, "You know, I really don't want to write about my research.
01:30
I do this all day long. I want to write something else.
01:32
Something a bit more free, less constrained."
01:35
And this person was very forceful and said,
01:38
"Look. That's the only way you'll ever do it."
01:40
So I said, "Okay, if I have to do it --"
01:43
I had a sabbatical. I said, "I'll write about my research
01:46
if there is no other way. And then I'll get to do my cookbook."
01:48
So I wrote a book on my research.
01:51
And it turned out to be quite fun in two ways.
01:54
First of all, I enjoyed writing.
01:57
But the more interesting thing was that
01:59
I started learning from people.
02:01
It's a fantastic time to write,
02:03
because there is so much feedback you can get from people.
02:05
People write me about their personal experience,
02:08
and about their examples, and what they disagree with,
02:10
and nuances.
02:12
And even being here -- I mean the last few days,
02:14
I've seen real heights of obsessive behavior
02:17
I never thought about.
02:19
(Laughter)
02:20
Which I think is just fascinating.
02:22
I will tell you a little bit about irrational behavior.
02:25
And I want to start by giving you some examples of visual illusion
02:28
as a metaphor for rationality.
02:30
So think about these two tables.
02:32
And you must have seen this illusion.
02:34
If I asked you what's longer, the vertical line on the table on the left,
02:37
or the horizontal line on the table on the right?
02:40
Which one seems longer?
02:43
Can anybody see anything but the left one being longer?
02:46
No, right? It's impossible.
02:48
But the nice thing about visual illusion is we can easily demonstrate mistakes.
02:51
So I can put some lines on; it doesn't help.
02:54
I can animate the lines.
02:56
And to the extent you believe I didn't shrink the lines,
02:58
which I didn't, I've proven to you that your eyes were deceiving you.
03:03
Now, the interesting thing about this
03:05
is when I take the lines away,
03:07
it's as if you haven't learned anything in the last minute.
03:09
(Laughter)
03:12
You can't look at this and say, "Okay now I see reality as it is."
03:15
Right? It's impossible to overcome this
03:17
sense that this is indeed longer.
03:20
Our intuition is really fooling us in a repeatable, predictable, consistent way.
03:23
And there is almost nothing we can do about it,
03:26
aside from taking a ruler and starting to measure it.
03:29
Here is another one -- this is one of my favorite illusions.
03:32
What do you see the color that top arrow is pointing to?
03:35
Brown. Thank you.
03:37
The bottom one? Yellow.
03:39
Turns out they're identical.
03:41
Can anybody see them as identical?
03:43
Very very hard.
03:45
I can cover the rest of the cube up.
03:47
And if I cover the rest of the cube you can see that they are identical.
03:50
And if you don't believe me you can get the slide later
03:52
and do some arts and crafts and see that they're identical.
03:55
But again it's the same story
03:57
that if we take the background away,
03:59
the illusion comes back. Right.
04:01
There is no way for us not to see this illusion.
04:04
I guess maybe if you're colorblind I don't think you can see that.
04:07
I want you to think about illusion as a metaphor.
04:10
Vision is one of the best things we do.
04:12
We have a huge part of our brain dedicated to vision --
04:14
bigger than dedicated to anything else.
04:16
We do more vision more hours of the day than we do anything else.
04:20
And we are evolutionarily designed to do vision.
04:22
And if we have these predictable repeatable mistakes in vision,
04:25
which we're so good at,
04:27
what's the chance that we don't make even more mistakes
04:29
in something we're not as good at --
04:31
for example, financial decision making:
04:33
(Laughter)
04:35
something we don't have an evolutionary reason to do,
04:37
we don't have a specialized part of the brain,
04:39
and we don't do that many hours of the day.
04:41
And the argument is in those cases
04:44
it might be the issue that we actually make many more mistakes
04:48
and, worse, not have an easy way to see them.
04:51
Because in visual illusions we can easily demonstrate the mistakes;
04:54
in cognitive illusion it's much, much harder
04:56
to demonstrate to people the mistakes.
04:58
So I want to show you some cognitive illusions,
05:01
or decision-making illusions, in the same way.
05:04
And this is one of my favorite plots in social sciences.
05:07
It's from a paper by Johnson and Goldstein.
05:11
And it basically shows
05:13
the percentage of people who indicated
05:15
they would be interested in giving their organs to donation.
05:19
And these are different countries in Europe. And you basically
05:21
see two types of countries:
05:23
countries on the right, that seem to be giving a lot;
05:25
and countries on the left that seem to be giving very little,
05:28
or much less.
05:30
The question is, why? Why do some countries give a lot
05:32
and some countries give a little?
05:34
When you ask people this question,
05:36
they usually think that it has to be something about culture.
05:38
Right? How much do you care about people?
05:40
Giving your organs to somebody else
05:42
is probably about how much you care about society, how linked you are.
05:45
Or maybe it is about religion.
05:47
But, if you look at this plot,
05:49
you can see that countries that we think about as very similar
05:52
actually exhibit very different behavior.
05:55
For example, Sweden is all the way on the right,
05:57
and Denmark, that we think is culturally very similar,
06:00
is all the way on the left.
06:02
Germany is on the left. And Austria is on the right.
06:06
The Netherlands is on the left. And Belgium is on the right.
06:09
And finally, depending on your particular version
06:12
of European similarity,
06:14
you can think about the U.K. and France as either similar culturally or not.
06:19
But it turns out that from organ donation they are very different.
06:23
By the way, the Netherlands is an interesting story.
06:25
You see the Netherlands is kind of the biggest of the small group.
06:30
Turns out that they got to 28 percent
06:33
after mailing every household in the country a letter
06:36
begging people to join this organ donation program.
06:39
You know the expression, "Begging only gets you so far"?
06:42
It's 28 percent in organ donation.
06:45
(Laughter)
06:47
But whatever the countries on the right are doing
06:49
they are doing a much better job than begging.
06:51
So what are they doing?
06:53
Turns out the secret has to do with a form at the DMV.
06:56
And here is the story.
06:58
The countries on the left have a form at the DMV
07:00
that looks something like this.
07:02
Check the box below if you want to participate
07:04
in the organ donor program.
07:06
And what happens?
07:08
People don't check, and they don't join.
07:11
The countries on the right, the ones that give a lot,
07:13
have a slightly different form.
07:15
It says check the box below if you don't want to participate.
07:18
Interestingly enough, when people get this,
07:20
they again don't check -- but now they join.
07:23
(Laughter)
07:26
Now think about what this means.
07:29
We wake up in the morning and we feel we make decisions.
07:33
We wake up in the morning and we open the closet
07:35
and we feel that we decide what to wear.
07:37
And we open the refrigerator and we feel that we decide what to eat.
07:40
What this is actually saying is that
07:42
much of these decisions are not residing within us.
07:44
They are residing in the person who is designing that form.
07:47
When you walk into the DMV,
07:50
the person who designed the form will have a huge influence
07:52
on what you'll end up doing.
07:54
Now it's also very hard to intuit these results. Think about it for yourself.
07:58
How many of you believe
08:00
that if you went to renew your license tomorrow,
08:02
and you went to the DMV,
08:04
and you would encounter one of these forms,
08:06
that it would actually change your own behavior?
08:09
Very, very hard to think that it will influence us.
08:11
We can say, "Oh, these funny Europeans, of course it would influence them."
08:13
But when it comes to us,
08:16
we have such a feeling that we are in the driver's seat,
08:18
we have such a feeling that we are in control,
08:20
and we are making the decision,
08:22
that it's very hard to even accept
08:24
the idea that we actually have
08:26
an illusion of making a decision, rather than an actual decision.
08:30
Now, you might say,
08:32
"These are decisions we don't care about."
08:35
In fact, by definition, these are decisions
08:37
about something that will happen to us after we die.
08:39
How could we care about something less
08:42
than something that happens after we die?
08:44
So a standard economist, someone who believes in rationality,
08:47
would say, "You know what? The cost of lifting the pencil
08:50
and marking a V is higher than the possible
08:52
benefit of the decision,
08:54
so that's why we get this effect."
08:56
But, in fact, it's not because it's easy.
08:59
It's not because it's trivial. It's not because we don't care.
09:02
It's the opposite. It's because we care.
09:05
It's difficult and it's complex.
09:07
And it's so complex that we don't know what to do.
09:09
And because we have no idea what to do
09:11
we just pick whatever it was that was chosen for us.
09:15
I'll give you one more example for this.
09:17
This is from a paper by Redelmeier and Schaefer.
09:20
And they said, "Well, this effect also happens to experts,
09:23
people who are well-paid, experts in their decisions,
09:26
do it a lot."
09:28
And they basically took a group of physicians.
09:30
And they presented to them a case study of a patient.
09:32
Here is a patient. He is a 67-year-old farmer.
09:36
He's been suffering from a right hip pain for a while.
09:38
And then they said to the physician,
09:40
"You decided a few weeks ago
09:42
that nothing is working for this patient.
09:44
All these medications, nothing seems to be working.
09:46
So you refer the patient to hip replacement therapy.
09:49
Hip replacement. Okay?"
09:51
So the patient is on a path to have his hip replaced.
09:54
And then they said to half the physicians, they said,
09:56
"Yesterday you reviewed the patient's case
09:58
and you realized that you forgot to try one medication.
10:01
You did not try ibuprofen.
10:04
What do you do? Do you pull the patient back and try ibuprofen?
10:07
Or do you let them go and have hip replacement?"
10:10
Well the good news is that most physicians in this case
10:12
decided to pull the patient and try the ibuprofen.
10:15
Very good for the physicians.
10:17
The other group of the physicians, they said,
10:19
"Yesterday when you reviewed the case
10:21
you discovered there were two medications you didn't try out yet,
10:23
ibuprofen and piroxicam."
10:26
And they said, "You have two medications you didn't try out yet. What do you do?
10:29
You let them go. Or you pull them back.
10:31
And if you pull them back do you try ibuprofen or piroxicam? Which one?"
10:34
Now think of it. This decision
10:36
makes it as easy to let the patient continue with hip replacement.
10:39
But pulling them back, all of a sudden it becomes more complex.
10:42
There is one more decision.
10:44
What happens now?
10:46
A majority of the physicians now choose to let the patient go
10:49
to hip replacement.
10:51
I hope this worries you, by the way --
10:53
(Laughter)
10:54
when you go to see your physician.
10:56
The thing is that no physician would ever say,
10:59
"Piroxicam, ibuprofen, hip replacement.
11:01
Let's go for hip replacement."
11:03
But the moment you set this as the default
11:06
it has a huge power over whatever people end up doing.
11:10
I'll give you a couple of more examples on irrational decision-making.
260
654000
3000
Daću vam još par primera o iracionalnom donošenju odluka.
11:13
Imagine I give you a choice.
261
657000
2000
Zamislite da vam dajem da birate.
11:15
Do you want to go for a weekend to Rome?
262
659000
2000
Da li želite da odete na vikend u Rim?
11:17
All expenses paid:
263
661000
2000
Svi troškovi su plaćeni:
11:19
hotel, transportation, food, breakfast,
264
663000
2000
hotel, prevoz, hrana, doručak,
11:21
a continental breakfast, everything.
265
665000
2000
kontinentalni doručak, sve.
11:23
Or a weekend in Paris?
266
667000
2000
Ili vikend u Parizu?
11:25
Now, a weekend in Paris, a weekend in Rome, these are different things;
267
669000
3000
Sada, vikend u Parizu i vikend u Rimu su različite stvari;
11:28
they have different food, different culture, different art.
268
672000
2000
imaju različitu hranu, različite kulture, različitu umetnost.
11:30
Now imagine I added a choice to the set
269
674000
2000
Sada zamislite da sam ovom izboru dodao nešto
11:32
that nobody wanted.
270
676000
2000
što niko nije želio.
11:34
Imagine I said, "A weekend in Rome,
271
678000
2000
Zamislite da sam rekao: "Vikend u Rimu,
11:36
a weekend in Paris, or having your car stolen?"
272
680000
3000
vikend u Parizu, ili da vam ukradu kola?"
11:39
(Laughter)
273
683000
3000
(Smeh)
11:42
It's a funny idea, because why would having your car stolen,
274
686000
3000
Ovo je zabavna ideja, jer zašto bi vam krađa kola
11:45
in this set, influence anything?
275
689000
2000
u ovom setu, uticala na bilo šta?
11:47
(Laughter)
276
691000
2000
(Smeh)
11:49
But what if the option to have your car stolen
277
693000
3000
Ali šta ako opcija da vam ukradu kola
11:52
was not exactly like this.
278
696000
2000
nije baš ovakva.
11:54
What if it was a trip to Rome, all expenses paid,
279
698000
2000
Šta ako je to put u Rim, svi troškovi plaćeni,
11:56
transportation, breakfast,
280
700000
2000
prevoz, doručak,
11:58
but doesn't include coffee in the morning.
281
702000
3000
ali ne uključuje kafu ujutru.
12:01
If you want coffee you have to pay for it yourself. It's two euros 50.
282
705000
3000
Ako želite kafu, morate sami da platite za nju. To bi bilo dva eura i 50.
12:04
Now in some ways,
283
708000
3000
Sada na neki način,
12:07
given that you can have Rome with coffee,
284
711000
2000
pošto možete da dobijete Rim sa kafom,
12:09
why would you possibly want Rome without coffee?
285
713000
3000
zašto biste uopšte poželeli Rim bez kafe?
12:12
It's like having your car stolen. It's an inferior option.
286
716000
3000
Ovo je kao da su vam ukrali kola. Ovo je inferiorna opcija.
12:15
But guess what happened. The moment you add Rome without coffee,
287
719000
2000
Ali pogodite šta se desilo. Čim dodate Rim bez kafe,
12:17
Rome with coffee becomes more popular. And people choose it.
288
721000
5000
Rim sa kafom postaje popularniji. I ljudi to izaberu.
12:22
The fact that you have Rome without coffee
289
726000
3000
Činjenica da imate Rim bez kafe
12:25
makes Rome with coffee look superior,
290
729000
2000
čini da Rim sa kafom deluje superiornije,
12:27
and not just to Rome without coffee -- even superior to Paris.
291
731000
3000
i to ne samo u odnosu na Rim bez kafe - nego je superiorniji i u odnosu na Pariz.
12:30
(Laughter)
292
734000
4000
(Smeh)
12:34
Here are two examples of this principle.
293
738000
2000
Evo dva primera ovog principa.
12:36
This was an ad from The Economist a few years ago
294
740000
3000
Reklama iz časopisa Ekonomist od pre nekoliko godina
12:39
that gave us three choices.
295
743000
2000
nam je dala tri opcije.
12:41
An online subscription for 59 dollars.
296
745000
3000
Onlajn pretplata za 59 dolara.
12:44
A print subscription for 125.
297
748000
4000
Štampana pretplata za 125.
12:48
Or you could get both for 125.
298
752000
2000
Ili možete da dobijete obe za 125.
12:50
(Laughter)
299
754000
2000
(Smeh)
12:52
Now I looked at this and I called up The Economist.
300
756000
2000
Kada sam ovo video, nazvao sam Ekonomist.
12:54
And I tried to figure out what were they thinking.
301
758000
3000
Hteo sam da razumem kakva im je ideja.
12:57
And they passed me from one person to another to another,
302
761000
3000
A oni su me prebacivali od jedne do druge osobe,
13:00
until eventually I got to a person who was in charge of the website.
303
764000
4000
dok konačno nisam došao do osobe koja je bila zadužena za veb sajt.
13:04
And I called them up. And they went to check what was going on.
304
768000
3000
Pozvao sam ih i otišli su da provere šta se dešava.
13:07
The next thing I know, the ad is gone. And no explanation.
305
771000
4000
Sledeće što se desilo, reklama je uklonjena. Bez objašnjenja.
13:11
So I decided to do the experiment
306
775000
2000
Onda sam odlučio da sprovedem eksperiment
13:13
that I would have loved The Economist to do with me.
307
777000
3000
koji bih voleo da Ekonomist uradi sa mnom.
13:16
I took this and I gave it to 100 MIT students.
308
780000
2000
Uzeo sam ovo i prosledio 100 studenata MIT-a.
13:18
I said, "What would you choose?"
309
782000
2000
Pitao sam: "Šta biste vi odabrali?"
13:20
These are the market share. Most people wanted the combo deal.
310
784000
4000
Ovo su tržišni udeli. Većina je htela kombinovanu ponudu.
13:24
Thankfully nobody wanted the dominated option.
311
788000
2000
Srećom, niko nije odabrao dominiranu opciju.
13:26
That means our students can read.
312
790000
2000
Ovo nam pokazuje da naši studenti znaju da čitaju.
13:28
(Laughter)
313
792000
1000
(Smeh)
13:29
But now if you have an option that nobody wants,
314
793000
3000
Ali sada, ako imate opciju koju niko ne želi,
13:32
you can take it off. Right?
315
796000
2000
možete da je uklonite. Jel tako?
13:34
So I printed another version of this,
316
798000
2000
Onda sam odštampao drugu verziju ovoga,
13:36
where I eliminated the middle option.
317
800000
2000
odakle sam sklonio srednju opciju.
13:38
I gave it to another 100 students. Here is what happens.
318
802000
3000
Dao sam je drugoj stotini studenata. Evo šta se dogodilo.
13:41
Now the most popular option became the least popular.
319
805000
3000
Sada je najpopularnija opcija postala najmanje popularna.
13:44
And the least popular became the most popular.
320
808000
3000
A najmanje popularna postala je najpopularnija.
13:47
What was happening was the option that was useless,
321
811000
4000
Ono što se desilo je da je beskorisna opcija,
13:51
in the middle, was useless in the sense that nobody wanted it.
322
815000
4000
u sredini, bila beskorisna u smislu da je niko nije hteo.
13:55
But it wasn't useless in the sense that it helped people figure out
323
819000
2000
Ali nije bila beskorisna u smislu da je pomogla ljudima da shvate
13:57
what they wanted.
324
821000
2000
šta žele.
13:59
In fact, relative to the option in the middle,
325
823000
3000
Zapravo, u poređenju sa opcijom u sredini,
14:02
which was get only the print for 125,
326
826000
4000
koja je nudila da se dobije samo štampano izdanje za 125,
14:06
the print and web for 125 looked like a fantastic deal.
327
830000
4000
kombinacija štampanog i onlajn izdanja za 125 izgledala je kao fantastična ponuda.
14:10
And as a consequence, people chose it.
328
834000
2000
I kao posledica toga, ljudi su birali ovu opciju.
14:12
The general idea here, by the way,
329
836000
2000
Inače, osnovna ideja ovde je
14:14
is that we actually don't know our preferences that well.
330
838000
2000
da mi zapravo i ne znamo tako dobro šta stvarno hoćemo.
14:16
And because we don't know our preferences that well
331
840000
2000
I zbog toga što ne poznajemo svoje preferencije tako dobro
14:18
we're susceptible to all of these influences from the external forces:
332
842000
4000
podložni smo svim ovim uticajima spoljnih faktora:
14:22
the defaults, the particular options that are presented to us, and so on.
333
846000
4000
podrazumevane opcije, konkretne opcije koje su nam predstavljene, i tako dalje.
14:26
One more example of this.
334
850000
2000
Još jedan primer ovoga.
14:28
People believe that when we deal with physical attraction,
335
852000
3000
Ljudi veruju da, kada se suočimo sa fizičkom privlačnošću,
14:31
we see somebody, and we know immediately whether we like them or not,
336
855000
3000
kada vidimo nekoga, da odmah znamo da li nam se dopada ili ne,
14:34
attracted or not.
337
858000
2000
da li nas privlači ili ne.
14:36
Which is why we have these four-minute dates.
338
860000
2000
Zbog toga postoje oni brzinski sastanci od četiri minute.
14:38
So I decided to do this experiment with people.
339
862000
3000
Stoga sam odlučio da uradim ovaj eksperiment sa ljudima.
14:41
I'll show you graphic images of people -- not real people.
340
865000
2000
Pokazaću vam slike ljudi, ne pravih ljudi.
14:43
The experiment was with people.
341
867000
2000
Ovaj eksperiment je bio sa ljudima.
14:45
I showed some people a picture of Tom, and a picture of Jerry.
342
869000
3000
Pokazao sam nekim ljudima sliku Toma i sliku Džerija.
14:48
I said "Who do you want to date? Tom or Jerry?"
343
872000
3000
I rekao sam "Sa kim biste hteli da izađete? Sa Tomom ili Džerijem?"
14:51
But for half the people I added an ugly version of Jerry.
344
875000
4000
Ali jednoj polovini ljudi dodao sam ružnu verziju Džerija.
14:55
I took Photoshop and I made Jerry slightly less attractive.
345
879000
5000
Pomoću Fotošopa učinio sam Džerija malo manje atraktivnim.
15:00
(Laughter)
346
884000
1000
(Smeh)
15:01
The other people, I added an ugly version of Tom.
347
885000
4000
Drugoj polovini dodao sam ružnu verziju Toma.
15:05
And the question was, will ugly Jerry and ugly Tom
348
889000
3000
I pitanje je bilo, da li će ružni Džeri i ružni Tom
15:08
help their respective, more attractive brothers?
349
892000
4000
pomoći svojoj atraktivnijoj braći?
15:12
The answer was absolutely yes.
350
896000
2000
Odgovor je - apsolutno da.
15:14
When ugly Jerry was around, Jerry was popular.
351
898000
2000
Kada je ružni Džeri bio prisutan, Džeri je bio popularan.
15:16
When ugly Tom was around, Tom was popular.
352
900000
2000
Kada je ružni Tom bio prisutan, Tom je bio popularan.
15:18
(Laughter)
353
902000
2000
(Smeh)
15:20
This of course has two very clear implications
354
904000
2000
Ovo naravno ima dve veoma jasne implikacije
15:22
for life in general.
355
906000
4000
na život uopšte.
15:26
If you ever go bar hopping, who do you want to take with you?
356
910000
3000
Ako budete išli u noćni provod, koga želite da povedete sa sobom?
15:29
(Laughter)
357
913000
6000
(Smeh)
15:35
You want a slightly uglier version of yourself.
358
919000
3000
Želite malo ružniju verziju sebe.
15:38
(Laughter)
359
922000
2000
(Smeh)
15:40
Similar. Similar ... but slightly uglier.
360
924000
2000
Sličnu. Sličnu... ali malo ružniju.
15:42
(Laughter)
361
926000
2000
(Smeh)
15:44
The second point, of course, is that
362
928000
2000
Druga stvar je, naravno, to
15:46
if somebody else invites you, you know how they think about you.
363
930000
3000
da ako vas neko drugi pozove, znate šta misle o vama.
15:49
(Laughter)
364
933000
3000
(Smeh)
15:52
Now you're getting it.
365
936000
2000
Sada vam je jasno.
15:54
What is the general point?
366
938000
2000
Šta je opšti zaključak?
15:56
The general point is that when we think about economics we have
367
940000
2000
Opšti zaključak je da, kada razmišljamo o ekonomiji, imamo
15:58
this beautiful view of human nature.
368
942000
3000
ovo divno mišljenje o ljudskoj prirodi.
16:01
"What a piece of work is man! How noble in reason!"
369
945000
2000
"Kakvo je delo čovek! Kako plemenit umom!"
16:03
We have this view of ourselves, of others.
370
947000
3000
Imamo takvo mišljenje o sebi, o drugima.
16:06
The behavioral economics perspective
371
950000
2000
Perspektiva bihevioralne ekonomije
16:08
is slightly less generous to people.
372
952000
3000
je malo manje velikodušna prema ljudima.
16:11
In fact in medical terms, that's our view.
373
955000
3000
U stvari, medicinski rečeno, to je naše viđenje.
16:14
(Laughter)
374
958000
6000
(Smeh)
16:20
But there is a silver lining.
375
964000
2000
Ali postoji pozitivna strana.
16:22
The silver lining is, I think,
376
966000
2000
Pozitivna strana je, ja mislim,
16:24
kind of the reason that behavioral economics is interesting and exciting.
377
968000
4000
donekle razlog zbog kog je bihevioralna ekonomija interesantna i uzbudljiva.
16:28
Are we Superman? Or are we Homer Simpson?
378
972000
2000
Jesmo li mi Supermen? Ili smo Homer Simpson?
16:30
When it comes to building the physical world,
379
974000
4000
Kada se radi o izgradnji fizičkog sveta,
16:34
we kind of understand our limitations.
380
978000
2000
nekako razumemo naša ograničenja.
16:36
We build steps. And we build these things
381
980000
2000
Gradimo stepenice. I gradimo one stvari
16:38
that not everybody can use obviously.
382
982000
3000
koje, očigledno, ne može svako da koristi.
16:41
(Laughter)
383
985000
1000
(Smeh)
16:42
We understand our limitations,
384
986000
2000
Mi razumemo naša ograničenja,
16:44
and we build around it.
385
988000
2000
i uzimamo ih u obzir.
16:46
But for some reason when it comes to the mental world,
386
990000
2000
Ali iz nekog razloga kada se radi o mentalnom svetu,
16:48
when we design things like healthcare and retirement and stockmarkets,
387
992000
4000
kada pravimo stvari poput zdravstva i penzija i berze,
16:52
we somehow forget the idea that we are limited.
388
996000
2000
nekako zaboravljamo ideju da smo ograničeni.
16:54
I think that if we understood our cognitive limitations
389
998000
3000
Mislim da kada bismo razumeli svoja kognitivna ograničenja
16:57
in the same way that we understand our physical limitations,
390
1001000
2000
kao što razumemo naša fizička ograničenja,
16:59
even though they don't stare us in the face in the same way,
391
1003000
2000
mada nam nisu očigledna na isti način,
17:01
we could design a better world.
392
1005000
3000
mogli bismo da napravimo bolji svet.
17:04
And that, I think, is the hope of this thing.
393
1008000
2000
I to je, mislim, nada koju ovde pronalazimo.
17:06
Thank you very much.
394
1010000
2000
Hvala vam puno.
17:08
(Applause)
395
1012000
8000
(Aplauz)
Translated by Majda Škrijelj
Reviewed by Ivana Korom
