ABOUT THE SPEAKER
Dan Gilbert - Psychologist; happiness expert
Harvard psychologist Dan Gilbert says our beliefs about what will make us happy are often wrong -- a premise he supports with intriguing research, and explains in his accessible and unexpectedly funny book, Stumbling on Happiness.

Why you should listen

Dan Gilbert believes that, in our ardent, lifelong pursuit of happiness, most of us have the wrong map. In the same way that optical illusions fool our eyes -- and fool everyone's eyes in the same way -- Gilbert argues that our brains systematically misjudge what will make us happy. And these quirks in our cognition make humans very poor predictors of our own bliss.

The premise of his current research -- that our assumptions about what will make us happy are often wrong -- is supported with clinical research drawn from psychology and neuroscience. But his delivery is what sets him apart. His engaging -- and often hilarious -- style pokes fun at typical human behavior and invokes pop-culture references everyone can relate to. This winning style translates also to Gilbert's writing, which is lucid, approachable and laugh-out-loud funny. The immensely readable Stumbling on Happiness, published in 2006, became a New York Times bestseller and has been translated into 20 languages.

In fact, the title of his book could be drawn from his own life. At 19, he was a high school dropout with dreams of writing science fiction. When a creative writing class at his community college was full, he enrolled in the only available course: psychology. He found his passion there, earned a doctorate in social psychology in 1985 at Princeton, and has since won a Guggenheim Fellowship and the Phi Beta Kappa teaching prize for his work at Harvard. He has written essays and articles for The New York Times, Time and even Starbucks, while continuing his research into happiness at his Hedonic Psychology Laboratory.

TED2004

Dan Gilbert: The surprising science of happiness


18,394,509 views

Dan Gilbert, author of *Stumbling on Happiness*, challenges the idea that we'll be miserable if we don't get what we want. Our "psychological immune system" lets us feel truly happy even when things don't go as planned.

00:15
When you have 21 minutes to speak, two million years seems like a really long time. But evolutionarily, two million years is nothing. And yet in two million years the human brain has nearly tripled in mass, going from the one-and-a-quarter pound brain of our ancestor here, Habilis, to the almost three-pound meatloaf that everybody here has between their ears.

00:38
What is it about a big brain that nature was so eager for every one of us to have one? Well, it turns out when brains triple in size, they don't just get three times bigger; they gain new structures. And one of the main reasons our brain got so big is because it got a new part, called the "frontal lobe." And particularly, a part called the "pre-frontal cortex." Now what does a pre-frontal cortex do for you that should justify the entire architectural overhaul of the human skull in the blink of evolutionary time?

01:09
Well, it turns out the pre-frontal cortex does lots of things, but one of the most important things it does is it is an experience simulator. Flight pilots practice in flight simulators so that they don't make real mistakes in planes. Human beings have this marvelous adaptation that they can actually have experiences in their heads before they try them out in real life. This is a trick that none of our ancestors could do, and that no other animal can do quite like we can. It's a marvelous adaptation. It's up there with opposable thumbs and standing upright and language as one of the things that got our species out of the trees and into the shopping mall.

01:50
Now -- (Laughter) -- all of you have done this. I mean, you know, Ben and Jerry's doesn't have liver-and-onion ice cream, and it's not because they whipped some up, tried it and went, "Yuck." It's because, without leaving your armchair, you can simulate that flavor and say "yuck" before you make it.

02:08
Let's see how your experience simulators are working. Let's just run a quick diagnostic before I proceed with the rest of the talk. Here's two different futures that I invite you to contemplate, and you can try to simulate them and tell me which one you think you might prefer. One of them is winning the lottery. This is about 314 million dollars. And the other is becoming paraplegic. So, just give it a moment of thought. You probably don't feel like you need a moment of thought.

02:39
Interestingly, there are data on these two groups of people, data on how happy they are. And this is exactly what you expected, isn't it? But these aren't the data. I made these up! These are the data. You failed the pop quiz, and you're hardly five minutes into the lecture. Because the fact is that a year after losing the use of their legs, and a year after winning the lotto, lottery winners and paraplegics are equally happy with their lives.

03:07
Now, don't feel too bad about failing the first pop quiz, because everybody fails all of the pop quizzes all of the time. The research that my laboratory has been doing, that economists and psychologists around the country have been doing, has revealed something really quite startling to us, something we call the "impact bias," which is the tendency for the simulator to work badly. For the simulator to make you believe that different outcomes are more different than in fact they really are.

03:35
From field studies to laboratory studies, we see that winning or losing an election, gaining or losing a romantic partner, getting or not getting a promotion, passing or not passing a college test, on and on, have far less impact, less intensity and much less duration than people expect them to have. In fact, a recent study -- this almost floors me -- a recent study showing how major life traumas affect people suggests that if it happened over three months ago, with only a few exceptions, it has no impact whatsoever on your happiness.

04:09
Why? Because happiness can be synthesized. Sir Thomas Browne wrote in 1642, "I am the happiest man alive. I have that in me that can convert poverty to riches, adversity to prosperity. I am more invulnerable than Achilles; fortune hath not one place to hit me." What kind of remarkable machinery does this guy have in his head? Well, it turns out it's precisely the same remarkable machinery that all of us have.

04:39
Human beings have something that we might think of as a "psychological immune system." A system of cognitive processes, largely non-conscious cognitive processes, that help them change their views of the world, so that they can feel better about the worlds in which they find themselves. Like Sir Thomas, you have this machine. Unlike Sir Thomas, you seem not to know it. (Laughter) We synthesize happiness, but we think happiness is a thing to be found.

05:10
Now, you don't need me to give you too many examples of people synthesizing happiness, I suspect. Though I'm going to show you some experimental evidence, you don't have to look very far for evidence. As a challenge to myself, since I say this once in a while in lectures, I took a copy of the New York Times and tried to find some instances of people synthesizing happiness. And here are three guys synthesizing happiness. "I am so much better off physically, financially, emotionally, mentally and almost every other way." "I don't have one minute's regret. It was a glorious experience." "I believe it turned out for the best." Who are these characters who are so damn happy?

05:42
Well, the first one is Jim Wright. Some of you are old enough to remember: he was the chairman of the House of Representatives and he resigned in disgrace when this young Republican named Newt Gingrich found out about a shady book deal he had done. He lost everything. The most powerful Democrat in the country, he lost everything. He lost his money; he lost his power. What does he have to say all these years later about it? "I am so much better off physically, financially, mentally and in almost every other way." What other way would there be to be better off? Vegetably? Minerally? Animally? He's pretty much covered them there.

06:14
Moreese Bickham is somebody you've never heard of. Moreese Bickham uttered these words upon being released. He was 78 years old. He spent 37 years in a Louisiana State Penitentiary for a crime he didn't commit. [He was ultimately released for good behavior halfway through his sentence.]

06:30
And what did he have to say about his experience? "I don't have one minute's regret. It was a glorious experience." Glorious! This guy is not saying, "Well, you know, there were some nice guys. They had a gym." It's "glorious," a word we usually reserve for something like a religious experience.

06:43
Harry S. Langerman uttered these words, and he's somebody you might have known but didn't, because in 1949 he read a little article in the paper about a hamburger stand owned by these two brothers named McDonalds. And he thought, "That's a really neat idea!" So he went to find them. They said, "We can give you a franchise on this for 3,000 bucks."

07:00
Harry went back to New York, asked his brother who's an investment banker to loan him the 3,000 dollars, and his brother's immortal words were, "You idiot, nobody eats hamburgers." He wouldn't lend him the money, and of course six months later Ray Kroc had exactly the same idea. It turns out people do eat hamburgers, and Ray Kroc, for a while, became the richest man in America.

07:20
And then finally -- you know, the best of all possible worlds -- some of you recognize this young photo of Pete Best, who was the original drummer for the Beatles, until they, you know, sent him out on an errand and snuck away and picked up Ringo on a tour. Well, in 1994, when Pete Best was interviewed -- yes, he's still a drummer; yes, he's a studio musician -- he had this to say: "I'm happier than I would have been with the Beatles."

07:43
Okay. There's something important to be learned from these people, and it is the secret of happiness. Here it is, finally to be revealed. First: accrue wealth, power, and prestige, then lose it. (Laughter) Second: spend as much of your life in prison as you possibly can. (Laughter) Third: make somebody else really, really rich. (Laughter) And finally: never ever join the Beatles. (Laughter)

08:08
OK. Now I, like Ze Frank, can predict your next thought, which is, "Yeah, right." Because when people synthesize happiness, as these gentlemen seem to have done, we all smile at them, but we kind of roll our eyes and say, "Yeah right, you never really wanted the job." "Oh yeah, right. You really didn't have that much in common with her, and you figured that out just about the time she threw the engagement ring in your face." We smirk because we believe that synthetic happiness is not of the same quality as what we might call "natural happiness."

08:41
What are these terms? Natural happiness is what we get when we get what we wanted, and synthetic happiness is what we make when we don't get what we wanted. And in our society, we have a strong belief that synthetic happiness is of an inferior kind.

08:57
Why do we have that belief? Well, it's very simple. What kind of economic engine would keep churning if we believed that not getting what we want could make us just as happy as getting it? With all apologies to my friend Matthieu Ricard, a shopping mall full of Zen monks is not going to be particularly profitable because they don't want stuff enough.

09:22
I want to suggest to you that synthetic happiness is every bit as real and enduring as the kind of happiness you stumble upon when you get exactly what you were aiming for. Now, I'm a scientist, so I'm going to do this not with rhetoric, but by marinating you in a little bit of data.

09:38
Let me first show you an experimental paradigm that is used to demonstrate the synthesis of happiness among regular old folks. And this isn't mine. This is a 50-year-old paradigm called the "free choice paradigm." It's very simple. You bring in, say, six objects, and you ask a subject to rank them from the most to the least liked. In this case, because the experiment I'm going to tell you about uses them, these are Monet prints. So, everybody can rank these Monet prints from the one they like the most, to the one they like the least.

10:05
Now we give you a choice: "We happen to have some extra prints in the closet. We're going to give you one as your prize to take home. We happen to have number three and number four," we tell the subject. This is a bit of a difficult choice, because neither one is preferred strongly to the other, but naturally, people tend to pick number three because they liked it a little better than number four.

10:27
Sometime later -- it could be 15 minutes; it could be 15 days -- the same stimuli are put before the subject, and the subject is asked to re-rank the stimuli. "Tell us how much you like them now." What happens? Watch as happiness is synthesized. This is the result that has been replicated over and over again. You're watching happiness be synthesized. Would you like to see it again? Happiness! "The one I got is really better than I thought! That other one I didn't get sucks!" (Laughter) That's the synthesis of happiness. Now what's the right response to that? "Yeah, right!"

11:02
Now, here's the experiment we did, and I would hope this is going to convince you that "Yeah, right!" was not the right response. We did this experiment with a group of patients who had anterograde amnesia. These are hospitalized patients. Most of them have Korsakoff's syndrome, a polyneuritic psychosis that -- they drank way too much, and they can't make new memories. OK? They remember their childhood, but if you walk in and introduce yourself, and then leave the room, when you come back, they don't know who you are.

11:31
We took our Monet prints to the hospital. And we asked these patients to rank them from the one they liked the most to the one they liked the least. We then gave them the choice between number three and number four. Like everybody else, they said, "Gee, thanks Doc! That's great! I could use a new print. I'll take number three." We explained we would have number three mailed to them. We gathered up our materials and we went out of the room, and counted to a half hour.

12:00
Back into the room, we say, "Hi, we're back." The patients, bless them, say, "Ah, Doc, I'm sorry, I've got a memory problem; that's why I'm here. If I've met you before, I don't remember." "Really, Jim, you don't remember? I was just here with the Monet prints?" "Sorry, Doc, I just don't have a clue." "No problem, Jim. All I want you to do is rank these for me from the one you like the most to the one you like the least."

12:25
What do they do? Well, let's first check and make sure they're really amnesiac. We ask these amnesiac patients to tell us which one they own, which one they chose last time, which one is theirs. And what we find is amnesiac patients just guess. These are normal controls, where if I did this with you, all of you would know which print you chose. But if I do this with amnesiac patients, they don't have a clue. They can't pick their print out of a lineup.

12:52
Here's what normal controls do: they synthesize happiness. Right? This is the change in liking score, the change from the first time they ranked to the second time they ranked. Normal controls show -- that was the magic I showed you; now I'm showing it to you in graphical form -- "The one I own is better than I thought. The one I didn't own, the one I left behind, is not as good as I thought." Amnesiacs do exactly the same thing. Think about this result. These people like better the one they own, but they don't know they own it.

13:25
"Yeah, right" is not the right response! What these people did when they synthesized happiness is they really, truly changed their affective, hedonic, aesthetic reactions to that poster. They're not just saying it because they own it, because they don't know they own it.

13:47
Now, when psychologists show you bars, you know that they are showing you averages of lots of people. And yet, all of us have this psychological immune system, this capacity to synthesize happiness, but some of us do this trick better than others. And some situations allow anybody to do it more effectively than other situations do.

14:10
It turns out that freedom -- the ability to make up your mind and change your mind -- is the friend of natural happiness, because it allows you to choose among all those delicious futures and find the one that you would most enjoy. But freedom to choose -- to change and make up your mind -- is the enemy of synthetic happiness. And I'm going to show you why. Dilbert already knows, of course. You're reading the cartoon as I'm talking.

14:36
"Dogbert's tech support. How may I abuse you?" "My printer prints a blank page after every document." "Why would you complain about getting free paper?" "Free? Aren't you just giving me my own paper?" "Egad, man! Look at the quality of the free paper compared to your lousy regular paper! Only a fool or a liar would say that they look the same!" "Ah! Now that you mention it, it does seem a little silkier!" "What are you doing?" "I'm helping people accept the things they cannot change." Indeed.

14:59
The psychological immune system works best when we are totally stuck, when we are trapped. This is the difference between dating and marriage, right? I mean, you go out on a date with a guy, and he picks his nose; you don't go out on another date. You're married to a guy and he picks his nose? Yeah, he has a heart of gold; don't touch the fruitcake. Right? (Laughter) You find a way to be happy with what's happened.

15:21
Now what I want to show you is that people don't know this about themselves, and not knowing this can work to our supreme disadvantage.

15:30
Here's an experiment we did at Harvard. We created a photography course, a black-and-white photography course, and we allowed students to come in and learn how to use a darkroom. So we gave them cameras; they went around campus; they took 12 pictures of their favorite professors and their dorm room and their dog, and all the other things they wanted to have Harvard memories of. They bring us the camera; we make up a contact sheet; they figure out which are the two best pictures; and we now spend six hours teaching them about darkrooms.

15:55
And they blow two of them up, and they have two gorgeous eight-by-10 glossies of meaningful things to them, and we say, "Which one would you like to give up?" They say, "I have to give one up?" "Oh, yes. We need one as evidence of the class project. So you have to give me one. You have to make a choice. You get to keep one, and I get to keep one."

16:14
Now, there are two conditions in this experiment. In one case, the students are told, "But you know, if you want to change your mind, I'll always have the other one here, and in the next four days, before I actually mail it to headquarters, I'll be glad to" -- (Laughter) -- yeah, "headquarters" -- "I'll be glad to swap it out with you. In fact, I'll come to your dorm room and give -- just give me an email. Better yet, I'll check with you. You ever want to change your mind, it's totally returnable."

16:40
The other half of the students are told exactly the opposite: "Make your choice. And by the way, the mail is going out, gosh, in two minutes, to England. Your picture will be winging its way over the Atlantic. You will never see it again."

16:52
Now, half of the students in each of these conditions are asked to make predictions about how much they're going to come to like the picture that they keep and the picture they leave behind. Other students are just sent back to their little dorm rooms and they are measured over the next three to six days on their liking, satisfaction with the pictures. And look at what we find.

17:13
First of all, here's what students think is going to happen. They think they're going to maybe come to like the picture they chose a little more than the one they left behind, but these are not statistically significant differences. It's a very small increase, and it doesn't much matter whether they were in the reversible or irreversible condition.

17:32
Wrong-o. Bad simulators. Because here's what's really happening. Both right before the swap and five days later, people who are stuck with that picture, who have no choice, who can never change their mind, like it a lot! And people who are deliberating -- "Should I return it? Have I gotten the right one? Maybe this isn't the good one? Maybe I left the good one?" -- have killed themselves. They don't like their picture, and in fact even after the opportunity to swap has expired, they still don't like their picture. Why? Because the reversible condition is not conducive to the synthesis of happiness.

18:10
So here's the final piece of this experiment. We bring in a whole new group of naive Harvard students and we say, "You know, we're doing a photography course, and we can do it one of two ways. We could do it so that when you take the two pictures, you'd have four days to change your mind, or we're doing another course where you take the two pictures and you make up your mind right away and you can never change it. Which course would you like to be in?"

18:33
Duh! 66 percent of the students, two-thirds, prefer to be in the course where they have the opportunity to change their mind. Hello? 66 percent of the students choose to be in the course in which they will ultimately be deeply dissatisfied with the picture. Because they do not know the conditions under which synthetic happiness grows.

18:56
The Bard said everything best, of course, and he's making my point here but he's making it hyperbolically: "'Tis nothing good or bad / But thinking makes it so." It's nice poetry, but that can't exactly be right. Is there really nothing good or bad? Is it really the case that gall bladder surgery and a trip to Paris are just the same thing? That seems like a one-question IQ test. They can't be exactly the same.

19:25
In more turgid prose, but closer to the truth, was the father of modern capitalism, Adam Smith, and he said this. This is worth contemplating: "The great source of both the misery and disorders of human life seems to arise from overrating the difference between one permanent situation and another ... Some of these situations may, no doubt, deserve to be preferred to others, but none of them can deserve to be pursued with that passionate ardor which drives us to violate the rules either of prudence or of justice, or to corrupt the future tranquility of our minds, either by shame from the remembrance of our own folly, or by remorse for the horror of our own injustice."

20:11
In other words: yes, some things are better than others. We should have preferences that lead us into one future over another. But when those preferences drive us too hard and too fast because we have overrated the difference between these futures, we are at risk.

20:32
When our ambition is bounded, it leads us to work joyfully. When our ambition is unbounded, it leads us to lie, to cheat, to steal, to hurt others, to sacrifice things of real value. When our fears are bounded, we're prudent; we're cautious; we're thoughtful. When our fears are unbounded and overblown, we're reckless, and we're cowardly.

20:55
The lesson I want to leave you with from these data is that our longings and our worries are both to some degree overblown, because we have within us the capacity to manufacture the very commodity we are constantly chasing when we choose experience.

21:13
Thank you. (Applause)
