ABOUT THE SPEAKER
Pamela Meyer - Lie detector
Pamela Meyer thinks we’re facing a pandemic of deception, but she’s arming people with tools that can help take back the truth.

Why you should listen
Social media expert Pamela Meyer can tell when you’re lying. If it’s not your words that give you away, it’s your posture, eyes, breathing rate, fidgets, and a host of other indicators. Worse, we are all lied to up to 200 times a day, she says, from the white lies that allow society to function smoothly to the devastating duplicities that bring down corporations and break up families.

Working with a team of researchers over several years, Meyer, who is CEO of social networking company Simpatico Networks, collected and reviewed most of the research on deception that has been published, from fields such as law enforcement, the military, psychology and espionage. She then became an expert herself, receiving advanced training in deception detection, including multiple courses of advanced training in interrogation, microexpression analysis, statement analysis, behavior and body language interpretation, and emotion recognition. Her research is synthesized in her bestselling book Liespotting.
TEDGlobal 2011

Pamela Meyer: How to spot a liar

22,447,282 views

We are lied to between 10 and 200 times every day, and the clues that reveal when we're being lied to can be subtle and counterintuitive. Pamela Meyer, author of Liespotting, presents the methods and hot spots used by people trained to recognize deception, arguing that honesty is a value worth preserving.


00:15
Okay, now I don't want to alarm anybody in this room,
00:20
but it's just come to my attention
00:22
that the person to your right is a liar.
00:24
(Laughter)
00:26
Also, the person to your left is a liar.
00:29
Also the person sitting in your very seats is a liar.
00:32
We're all liars.
00:34
What I'm going to do today
00:36
is I'm going to show you what the research says about why we're all liars,
00:39
how you can become a liespotter
00:41
and why you might want to go the extra mile
00:44
and go from liespotting to truth seeking,
00:47
and ultimately to trust building.
00:49
Now speaking of trust,
00:52
ever since I wrote this book, "Liespotting,"
00:55
no one wants to meet me in person anymore, no, no, no, no, no.
00:58
They say, "It's okay, we'll email you."
01:01
(Laughter)
01:03
I can't even get a coffee date at Starbucks.
01:07
My husband's like, "Honey, deception?
01:09
Maybe you could have focused on cooking. How about French cooking?"
01:12
So before I get started, what I'm going to do
01:14
is I'm going to clarify my goal for you,
01:17
which is not to teach a game of Gotcha.
01:19
Liespotters aren't those nitpicky kids,
01:21
those kids in the back of the room that are shouting, "Gotcha! Gotcha!
01:24
Your eyebrow twitched. You flared your nostril.
01:27
I watch that TV show 'Lie To Me.' I know you're lying."
01:30
No, liespotters are armed
01:32
with scientific knowledge of how to spot deception.
01:35
They use it to get to the truth,
01:37
and they do what mature leaders do every day;
01:39
they have difficult conversations with difficult people,
01:42
sometimes during very difficult times.
01:44
And they start up that path
01:46
by accepting a core proposition,
01:48
and that proposition is the following:
01:50
Lying is a cooperative act.
01:53
Think about it, a lie has no power whatsoever by its mere utterance.
01:57
Its power emerges
01:59
when someone else agrees to believe the lie.
02:01
So I know it may sound like tough love,
02:03
but look, if at some point you got lied to,
02:07
it's because you agreed to get lied to.
02:09
Truth number one about lying: Lying's a cooperative act.
02:12
Now not all lies are harmful.
02:14
Sometimes we're willing participants in deception
02:17
for the sake of social dignity,
02:20
maybe to keep a secret that should be kept secret, secret.
02:23
We say, "Nice song."
02:25
"Honey, you don't look fat in that, no."
02:28
Or we say, favorite of the digerati,
02:30
"You know, I just fished that email out of my spam folder.
02:33
So sorry."
02:36
But there are times when we are unwilling participants in deception.
02:39
And that can have dramatic costs for us.
02:42
Last year saw 997 billion dollars
02:45
in corporate fraud alone in the United States.
02:49
That's an eyelash under a trillion dollars.
02:51
That's seven percent of revenues.
02:53
Deception can cost billions.
02:55
Think Enron, Madoff, the mortgage crisis.
02:58
Or in the case of double agents and traitors,
03:01
like Robert Hanssen or Aldrich Ames,
03:03
lies can betray our country,
03:05
they can compromise our security, they can undermine democracy,
03:08
they can cause the deaths of those that defend us.
03:11
Deception is actually serious business.
03:14
This con man, Henry Oberlander,
03:16
he was such an effective con man
03:18
British authorities say
03:20
he could have undermined the entire banking system of the Western world.
03:23
And you can't find this guy on Google; you can't find him anywhere.
03:25
He was interviewed once, and he said the following.
03:28
He said, "Look, I've got one rule."
03:30
And this was Henry's rule, he said,
03:33
"Look, everyone is willing to give you something.
03:35
They're ready to give you something for whatever it is they're hungry for."
03:38
And that's the crux of it.
03:40
If you don't want to be deceived, you have to know,
03:42
what is it that you're hungry for?
03:44
And we all kind of hate to admit it.
03:47
We wish we were better husbands, better wives,
03:50
smarter, more powerful,
03:52
taller, richer --
03:54
the list goes on.
03:56
Lying is an attempt to bridge that gap,
03:58
to connect our wishes and our fantasies
04:00
about who we wish we were, how we wish we could be,
04:03
with what we're really like.
04:06
And boy are we willing to fill in those gaps in our lives with lies.
04:09
On a given day, studies show that you may be lied to
04:12
anywhere from 10 to 200 times.
04:14
Now granted, many of those are white lies.
04:17
But in another study,
04:19
it showed that strangers lied three times
04:21
within the first 10 minutes of meeting each other.
04:23
(Laughter)
04:25
Now when we first hear this data, we recoil.
04:28
We can't believe how prevalent lying is.
04:30
We're essentially against lying.
04:32
But if you look more closely,
04:34
the plot actually thickens.
04:36
We lie more to strangers than we lie to coworkers.
04:39
Extroverts lie more than introverts.
04:43
Men lie eight times more about themselves
04:46
than they do other people.
04:48
Women lie more to protect other people.
04:51
If you're an average married couple,
04:54
you're going to lie to your spouse
04:56
in one out of every 10 interactions.
04:58
Now you may think that's bad.
05:00
If you're unmarried, that number drops to three.
05:02
Lying's complex.
05:04
It's woven into the fabric of our daily and our business lives.
05:07
We're deeply ambivalent about the truth.
05:09
We parse it out on an as-needed basis,
05:11
sometimes for very good reasons,
05:13
other times just because we don't understand the gaps in our lives.
05:16
That's truth number two about lying.
05:18
We're against lying,
05:20
but we're covertly for it
05:22
in ways that our society has sanctioned
05:24
for centuries and centuries and centuries.
05:26
It's as old as breathing.
05:28
It's part of our culture, it's part of our history.
05:30
Think Dante, Shakespeare,
05:33
the Bible, News of the World.
05:36
(Laughter)
05:38
Lying has evolutionary value to us as a species.
05:40
Researchers have long known
05:42
that the more intelligent the species,
05:44
the larger the neocortex,
05:46
the more likely it is to be deceptive.
05:48
Now you might remember Koko.
05:50
Does anybody remember Koko the gorilla who was taught sign language?
05:53
Koko was taught to communicate via sign language.
05:56
Here's Koko with her kitten.
05:58
It's her cute little, fluffy pet kitten.
06:01
Koko once blamed her pet kitten
06:03
for ripping a sink out of the wall.
06:05
(Laughter)
06:07
We're hardwired to become leaders of the pack.
06:09
It starts really, really early.
06:11
How early?
06:13
Well babies will fake a cry,
06:15
pause, wait to see who's coming
06:17
and then go right back to crying.
06:19
One-year-olds learn concealment.
06:21
(Laughter)
06:23
Two-year-olds bluff.
06:25
Five-year-olds lie outright.
06:27
They manipulate via flattery.
06:29
Nine-year-olds, masters of the cover-up.
06:32
By the time you enter college,
06:34
you're going to lie to your mom in one out of every five interactions.
06:37
By the time we enter this work world and we're breadwinners,
06:40
we enter a world that is just cluttered
06:42
with spam, fake digital friends,
06:44
partisan media,
06:46
ingenious identity thieves,
06:48
world-class Ponzi schemers,
06:50
a deception epidemic --
06:52
in short, what one author calls
06:54
a post-truth society.
06:57
It's been very confusing
06:59
for a long time now.
07:03
What do you do?
07:05
Well there are steps we can take
07:07
to navigate our way through the morass.
07:09
Trained liespotters get to the truth 90 percent of the time.
07:12
The rest of us, we're only 54 percent accurate.
07:15
Why is it so easy to learn?
07:17
There are good liars and there are bad liars. There are no real original liars.
07:20
We all make the same mistakes. We all use the same techniques.
07:23
So what I'm going to do
07:25
is I'm going to show you two patterns of deception.
07:27
And then we're going to look at the hot spots and see if we can find them ourselves.
07:30
We're going to start with speech.
07:33
(Video) Bill Clinton: I want you to listen to me.
07:35
I'm going to say this again.
07:37
I did not have sexual relations
07:40
with that woman, Miss Lewinsky.
07:44
I never told anybody to lie,
07:46
not a single time, never.
07:48
And these allegations are false.
07:51
And I need to go back to work for the American people.
07:53
Thank you.
07:58
Pamela Meyer: Okay, what were the telltale signs?
08:01
Well first we heard what's known as a non-contracted denial.
08:05
Studies show that people who are overdetermined in their denial
08:08
will resort to formal rather than informal language.
08:11
We also heard distancing language: "that woman."
08:14
We know that liars will unconsciously distance themselves
08:16
from their subject
08:18
using language as their tool.
08:21
Now if Bill Clinton had said, "Well, to tell you the truth ..."
08:24
or Richard Nixon's favorite, "In all candor ..."
08:26
he would have been a dead giveaway
08:28
for any liespotter that knows
08:30
that qualifying language, as it's called, qualifying language like that,
08:33
further discredits the subject.
08:35
Now if he had repeated the question in its entirety,
08:38
or if he had peppered his account with a little too much detail --
08:42
and we're all really glad he didn't do that --
08:44
he would have further discredited himself.
08:46
Freud had it right.
08:48
Freud said, look, there's much more to it than speech:
08:51
"No mortal can keep a secret.
08:54
If his lips are silent, he chatters with his fingertips."
08:57
And we all do it no matter how powerful you are.
09:00
We all chatter with our fingertips.
09:02
I'm going to show you Dominique Strauss-Kahn with Obama
09:05
who's chattering with his fingertips.
09:08
(Laughter)
09:11
Now this brings us to our next pattern,
09:14
which is body language.
09:17
With body language, here's what you've got to do.
09:20
You've really got to just throw your assumptions out the door.
09:23
Let the science temper your knowledge a little bit.
09:25
Because we think liars fidget all the time.
09:28
Well guess what, they're known to freeze their upper bodies when they're lying.
09:31
We think liars won't look you in the eyes.
09:34
Well guess what, they look you in the eyes a little too much
09:36
just to compensate for that myth.
09:38
We think warmth and smiles
09:40
convey honesty, sincerity.
09:42
But a trained liespotter
09:44
can spot a fake smile a mile away.
09:46
Can you all spot the fake smile here?
09:50
You can consciously contract
09:52
the muscles in your cheeks.
09:55
But the real smile's in the eyes, the crow's feet of the eyes.
09:58
They cannot be consciously contracted,
10:00
especially if you overdid the Botox.
10:02
Don't overdo the Botox; nobody will think you're honest.
10:05
Now we're going to look at the hot spots.
10:07
Can you tell what's happening in a conversation?
10:09
Can you start to find the hot spots
10:12
to see the discrepancies
10:14
between someone's words and someone's actions?
10:16
Now I know it seems really obvious,
10:18
but when you're having a conversation
10:20
with someone you suspect of deception,
10:23
attitude is by far the most overlooked but telling of indicators.
10:26
An honest person is going to be cooperative.
10:28
They're going to show they're on your side.
10:30
They're going to be enthusiastic.
10:32
They're going to be willing and helpful to getting you to the truth.
10:34
They're going to be willing to brainstorm, name suspects,
10:37
provide details.
10:39
They're going to say, "Hey,
10:41
maybe it was those guys in payroll that forged those checks."
10:44
They're going to be infuriated if they sense they're wrongly accused
10:47
throughout the entire course of the interview, not just in flashes;
10:49
they'll be infuriated throughout the entire course of the interview.
10:52
And if you ask someone honest
10:54
what should happen to whomever did forge those checks,
10:57
an honest person is much more likely
10:59
to recommend strict rather than lenient punishment.
11:03
Now let's say you're having that exact same conversation
11:05
with someone deceptive.
11:07
That person may be withdrawn,
11:09
look down, lower their voice,
11:11
pause, be kind of herky-jerky.
11:13
Ask a deceptive person to tell their story,
11:15
they're going to pepper it with way too much detail
11:18
in all kinds of irrelevant places.
11:21
And then they're going to tell their story in strict chronological order.
11:24
And what a trained interrogator does
11:26
is they come in and in very subtle ways
11:28
over the course of several hours,
11:30
they will ask that person to tell that story backwards,
11:33
and then they'll watch them squirm,
11:35
and track which questions produce the highest volume of deceptive tells.
11:38
Why do they do that? Well we all do the same thing.
11:41
We rehearse our words,
11:43
but we rarely rehearse our gestures.
11:45
We say "yes," we shake our heads "no."
11:47
We tell very convincing stories, we slightly shrug our shoulders.
11:50
We commit terrible crimes,
11:52
and we smile at the delight in getting away with it.
11:55
Now that smile is known in the trade as "duping delight."
11:58
And we're going to see that in several videos moving forward,
12:01
but we're going to start -- for those of you who don't know him,
12:03
this is presidential candidate John Edwards
12:06
who shocked America by fathering a child out of wedlock.
12:09
We're going to see him talk about getting a paternity test.
12:12
See now if you can spot him
12:14
saying, "yes" while shaking his head "no,"
12:16
slightly shrugging his shoulders.
12:18
12:20
(Video) John Edwards: I'd be happy to participate in one. I know that it's not possible that this child could be mine, because of the timing of events. So I know it's not possible. Happy to take a paternity test, and would love to see it happen.

12:34
Interviewer: Are you going to do that soon? Is there somebody --

12:37
JE: Well, I'm only one side. I'm only one side of the test. But I'm happy to participate in one.
12:42
PM: Okay, those head shakes are much easier to spot once you know to look for them. There are going to be times when someone makes one expression while masking another that just kind of leaks through in a flash. Murderers are known to leak sadness. Your new joint venture partner might shake your hand, celebrate, go out to dinner with you and then leak an expression of anger.

13:04
And we're not all going to become facial expression experts overnight here, but there's one I can teach you that's very dangerous, and it's easy to learn, and that's the expression of contempt. Now with anger, you've got two people on an even playing field. It's still somewhat of a healthy relationship. But when anger turns to contempt, you've been dismissed. It's associated with moral superiority. And for that reason, it's very, very hard to recover from. Here's what it looks like. It's marked by one lip corner pulled up and in. It's the only asymmetrical expression. And in the presence of contempt, whether or not deception follows -- and it doesn't always follow -- look the other way, go the other direction, reconsider the deal, say, "No thank you. I'm not coming up for just one more nightcap. Thank you."
13:49
Science has surfaced many, many more indicators. We know, for example, we know liars will shift their blink rate, point their feet towards an exit. They will take barrier objects and put them between themselves and the person that is interviewing them. They'll alter their vocal tone, often making their vocal tone much lower.

14:09
Now here's the deal. These behaviors are just behaviors. They're not proof of deception. They're red flags. We're human beings. We make deceptive flailing gestures all over the place all day long. They don't mean anything in and of themselves. But when you see clusters of them, that's your signal. Look, listen, probe, ask some hard questions, get out of that very comfortable mode of knowing, walk into curiosity mode, ask more questions, have a little dignity, treat the person you're talking to with rapport. Don't try to be like those folks on "Law & Order" and those other TV shows that pummel their subjects into submission. Don't be too aggressive, it doesn't work.
14:48
Now we've talked a little bit about how to talk to someone who's lying and how to spot a lie. And as I promised, we're now going to look at what the truth looks like. But I'm going to show you two videos, two mothers -- one is lying, one is telling the truth. And these were surfaced by researcher David Matsumoto in California. And I think they're an excellent example of what the truth looks like.

15:10
This mother, Diane Downs, shot her kids at close range, drove them to the hospital while they bled all over the car, claimed a scraggy-haired stranger did it. And you'll see when you see the video, she can't even pretend to be an agonizing mother. What you want to look for here is an incredible discrepancy between horrific events that she describes and her very, very cool demeanor. And if you look closely, you'll see duping delight throughout this video.
15:35
(Video) Diane Downs: At night when I close my eyes, I can see Christie reaching her hand out to me while I'm driving, and the blood just kept coming out of her mouth. And that -- maybe it'll fade too with time -- but I don't think so. That bothers me the most.
15:57
PM: Now I'm going to show you a video of an actual grieving mother, Erin Runnion, confronting her daughter's murderer and torturer in court. Here you're going to see no false emotion, just the authentic expression of a mother's agony.
16:10
(Video) Erin Runnion: I wrote this statement on the third anniversary of the night you took my baby, and you hurt her, and you crushed her, you terrified her until her heart stopped. And she fought, and I know she fought you. But I know she looked at you with those amazing brown eyes, and you still wanted to kill her. And I don't understand it, and I never will.
16:39
PM: Okay, there's no doubting the veracity of those emotions. Now the technology around what the truth looks like is progressing on, the science of it. We know, for example, that we now have specialized eye trackers and infrared brain scans, MRIs that can decode the signals that our bodies send out when we're trying to be deceptive. And these technologies are going to be marketed to all of us as panaceas for deceit, and they will prove incredibly useful some day. But you've got to ask yourself in the meantime: Who do you want on your side of the meeting, someone who's trained in getting to the truth or some guy who's going to drag a 400-pound electroencephalogram through the door?

17:18
Liespotters rely on human tools. They know, as someone once said, "Character's who you are in the dark." And what's kind of interesting is that today we have so little darkness. Our world is lit up 24 hours a day. It's transparent, with blogs and social networks broadcasting the buzz of a whole new generation of people that have made a choice to live their lives in public. It's a much more noisy world.
17:44
So one challenge we have is to remember: oversharing, that's not honesty. Our manic tweeting and texting can blind us to the fact that the subtleties of human decency -- character integrity -- that's still what matters, that's always what's going to matter. So in this much noisier world, it might make sense for us to be just a little bit more explicit about our moral code.

18:10
When you combine the science of recognizing deception with the art of looking, listening, you exempt yourself from collaborating in a lie. You start up that path of being just a little bit more explicit, because you signal to everyone around you, you say, "Hey, my world, our world, it's going to be an honest one. My world is going to be one where truth is strengthened and falsehood is recognized and marginalized." And when you do that, the ground around you starts to shift just a little bit.

18:39
And that's the truth. Thank you.

(Applause)