ABOUT THE SPEAKER
Pamela Meyer - Lie detector
Pamela Meyer thinks we’re facing a pandemic of deception, but she’s arming people with tools that can help take back the truth.

Why you should listen
Social media expert Pamela Meyer can tell when you’re lying. If it’s not your words that give you away, it’s your posture, eyes, breathing rate, fidgets, and a host of other indicators. Worse, we are all lied to up to 200 times a day, she says, from the white lies that allow society to function smoothly to the devastating duplicities that bring down corporations and break up families.

Working with a team of researchers over several years, Meyer, who is CEO of social networking company Simpatico Networks, collected and reviewed most of the published research on deception, drawing from fields such as law enforcement, the military, psychology and espionage. She then became an expert herself, receiving advanced training in deception detection, including multiple courses in interrogation, microexpression analysis, statement analysis, behavior and body language interpretation, and emotion recognition. Her research is synthesized in her bestselling book Liespotting.
TEDGlobal 2011

Pamela Meyer: How to spot a liar


28,415,176 views

We encounter lies 10 to 200 times every day, and the clues for exposing them can be varied and hard to read. Pamela Meyer, author of "Liespotting," talks about the "hot spots" of lie detection and the telltale behaviors behind them. She argues that honesty is a treasure we should value and protect.


00:15
Okay, now I don't want to alarm anybody in this room, but it's just come to my attention that the person to your right is a liar. (Laughter) Also, the person to your left is a liar. Also the person sitting in your very seats is a liar. We're all liars.
00:34
What I'm going to do today is I'm going to show you what the research says about why we're all liars, how you can become a liespotter and why you might want to go the extra mile and go from liespotting to truth seeking, and ultimately to trust building.
00:49
Now speaking of trust, ever since I wrote this book, "Liespotting," no one wants to meet me in person anymore, no, no, no, no, no. They say, "It's okay, we'll email you." (Laughter) I can't even get a coffee date at Starbucks. My husband's like, "Honey, deception? Maybe you could have focused on cooking. How about French cooking?"
01:12
So before I get started, what I'm going to do is I'm going to clarify my goal for you, which is not to teach a game of Gotcha. Liespotters aren't those nitpicky kids, those kids in the back of the room that are shouting, "Gotcha! Gotcha! Your eyebrow twitched. You flared your nostril. I watch that TV show 'Lie To Me.' I know you're lying." No, liespotters are armed with scientific knowledge of how to spot deception. They use it to get to the truth, and they do what mature leaders do every day; they have difficult conversations with difficult people, sometimes during very difficult times.
01:44
And they start up that path by accepting a core proposition, and that proposition is the following: Lying is a cooperative act. Think about it, a lie has no power whatsoever by its mere utterance. Its power emerges when someone else agrees to believe the lie. So I know it may sound like tough love, but look, if at some point you got lied to, it's because you agreed to get lied to. Truth number one about lying: Lying's a cooperative act.
02:12
Now not all lies are harmful. Sometimes we're willing participants in deception for the sake of social dignity, maybe to keep a secret that should be kept secret, secret. We say, "Nice song." "Honey, you don't look fat in that, no." Or we say, favorite of the digiratti, "You know, I just fished that email out of my spam folder. So sorry."
02:36
But there are times when we are unwilling participants in deception. And that can have dramatic costs for us. Last year saw 997 billion dollars in corporate fraud alone in the United States. That's an eyelash under a trillion dollars. That's seven percent of revenues. Deception can cost billions. Think Enron, Madoff, the mortgage crisis. Or in the case of double agents and traitors, like Robert Hanssen or Aldrich Ames, lies can betray our country, they can compromise our security, they can undermine democracy, they can cause the deaths of those that defend us. Deception is actually serious business.
03:14
This con man, Henry Oberlander, he was such an effective con man British authorities say he could have undermined the entire banking system of the Western world. And you can't find this guy on Google; you can't find him anywhere. He was interviewed once, and he said the following. He said, "Look, I've got one rule." And this was Henry's rule, he said, "Look, everyone is willing to give you something. They're ready to give you something for whatever it is they're hungry for." And that's the crux of it.
03:40
If you don't want to be deceived, you have to know, what is it that you're hungry for? And we all kind of hate to admit it. We wish we were better husbands, better wives, smarter, more powerful, taller, richer -- the list goes on. Lying is an attempt to bridge that gap, to connect our wishes and our fantasies about who we wish we were, how we wish we could be, with what we're really like. And boy are we willing to fill in those gaps in our lives with lies.
04:09
On a given day, studies show that you may be lied to anywhere from 10 to 200 times. Now granted, many of those are white lies. But in another study, it showed that strangers lied three times within the first 10 minutes of meeting each other. (Laughter)
04:25
Now when we first hear this data, we recoil. We can't believe how prevalent lying is. We're essentially against lying. But if you look more closely, the plot actually thickens. We lie more to strangers than we lie to coworkers. Extroverts lie more than introverts. Men lie eight times more about themselves than they do other people. Women lie more to protect other people. If you're an average married couple, you're going to lie to your spouse in one out of every 10 interactions. Now you may think that's bad. If you're unmarried, that number drops to three.
05:02
Lying's complex. It's woven into the fabric of our daily and our business lives. We're deeply ambivalent about the truth. We parse it out on an as-needed basis, sometimes for very good reasons, other times just because we don't understand the gaps in our lives. That's truth number two about lying.
05:18
We're against lying, but we're covertly for it in ways that our society has sanctioned for centuries and centuries and centuries. It's as old as breathing. It's part of our culture, it's part of our history. Think Dante, Shakespeare, the Bible, News of the World. (Laughter)
05:38
Lying has evolutionary value to us as a species. Researchers have long known that the more intelligent the species, the larger the neocortex, the more likely it is to be deceptive. Now you might remember Koko. Does anybody remember Koko the gorilla who was taught sign language? Koko was taught to communicate via sign language. Here's Koko with her kitten. It's her cute little, fluffy pet kitten. Koko once blamed her pet kitten for ripping a sink out of the wall. (Laughter)
06:07
We're hardwired to become leaders of the pack. It starts really, really early. How early? Well babies will fake a cry, pause, wait to see who's coming and then go right back to crying. One-year-olds learn concealment. (Laughter) Two-year-olds bluff. Five-year-olds lie outright. They manipulate via flattery. Nine-year-olds, masters of the cover-up.
06:32
By the time you enter college, you're going to lie to your mom in one out of every five interactions. By the time we enter this work world and we're breadwinners, we enter a world that is just cluttered with spam, fake digital friends, partisan media, ingenious identity thieves, world-class Ponzi schemers, a deception epidemic -- in short, what one author calls a post-truth society. It's been very confusing for a long time now.
07:03
What do you do? Well there are steps we can take to navigate our way through the morass. Trained liespotters get to the truth 90 percent of the time. The rest of us, we're only 54 percent accurate. Why is it so easy to learn? There are good liars and there are bad liars. There are no real original liars. We all make the same mistakes. We all use the same techniques. So what I'm going to do is I'm going to show you two patterns of deception. And then we're going to look at the hot spots and see if we can find them ourselves. We're going to start with speech.
07:33
(Video) Bill Clinton: I want you to listen to me. I'm going to say this again. I did not have sexual relations with that woman, Miss Lewinsky. I never told anybody to lie, not a single time, never. And these allegations are false. And I need to go back to work for the American people. Thank you.
07:58
Pamela Meyer: Okay, what were the telltale signs? Well first we heard what's known as a non-contracted denial. Studies show that people who are overdetermined in their denial will resort to formal rather than informal language. We also heard distancing language: "that woman." We know that liars will unconsciously distance themselves from their subject using language as their tool. Now if Bill Clinton had said, "Well, to tell you the truth ..." or Richard Nixon's favorite, "In all candor ..." he would have been a dead giveaway for any liespotter that knows that qualifying language, as it's called, qualifying language like that, further discredits the subject.
08:35
Now if he had repeated the question in its entirety, or if he had peppered his account with a little too much detail -- and we're all really glad he didn't do that -- he would have further discredited himself.
08:46
Freud had it right. Freud said, look, there's much more to it than speech: "No mortal can keep a secret. If his lips are silent, he chatters with his fingertips." And we all do it no matter how powerful you are. We all chatter with our fingertips.
09:02
I'm going to show you Dominique Strauss-Kahn with Obama who's chattering with his fingertips. (Laughter)
09:11
Now this brings us to our next pattern, which is body language. With body language, here's what you've got to do. You've really got to just throw your assumptions out the door. Let the science temper your knowledge a little bit.
09:25
Because we think liars fidget all the time. Well guess what, they're known to freeze their upper bodies when they're lying. We think liars won't look you in the eyes. Well guess what, they look you in the eyes a little too much just to compensate for that myth.
09:38
We think warmth and smiles convey honesty, sincerity. But a trained liespotter can spot a fake smile a mile away. Can you all spot the fake smile here? You can consciously contract the muscles in your cheeks. But the real smile's in the eyes, the crow's feet of the eyes. They cannot be consciously contracted, especially if you overdid the Botox. Don't overdo the Botox; nobody will think you're honest.
10:05
Now we're going to look at the hot spots. Can you tell what's happening in a conversation? Can you start to find the hot spots to see the discrepancies between someone's words and someone's actions?
10:16
Now I know it seems really obvious, but when you're having a conversation with someone you suspect of deception, attitude is by far the most overlooked but telling of indicators.
10:26
An honest person is going to be cooperative. They're going to show they're on your side. They're going to be enthusiastic. They're going to be willing and helpful to getting you to the truth. They're going to be willing to brainstorm, name suspects, provide details.
10:39
They're going to say, "Hey, maybe it was those guys in payroll that forged those checks." They're going to be infuriated if they sense they're wrongly accused throughout the entire course of the interview, not just in flashes; they'll be infuriated throughout the entire course of the interview. And if you ask someone honest what should happen to whomever did forge those checks, an honest person is much more likely to recommend strict rather than lenient punishment.
11:03
Now let's say you're having that exact same conversation with someone deceptive. That person may be withdrawn, look down, lower their voice, pause, be kind of herky-jerky. Ask a deceptive person to tell their story, they're going to pepper it with way too much detail in all kinds of irrelevant places.
11:21
And then they're going to tell their story in strict chronological order. And what a trained interrogator does is they come in and in very subtle ways over the course of several hours, they will ask that person to tell that story backwards, and then they'll watch them squirm, and track which questions produce the highest volume of deceptive tells.
11:38
Why do they do that? Well we all do the same thing. We rehearse our words, but we rarely rehearse our gestures. We say "yes," we shake our heads "no." We tell very convincing stories, we slightly shrug our shoulders. We commit terrible crimes, and we smile at the delight in getting away with it.
11:55
Now that smile is known in the trade as "duping delight."
282
700000
3000
quruq chiqishni o'ylab.
Savdoda bunaqa tabassum "ahmoq qildim"
11:58
And we're going to see that in several videos moving forward,
283
703000
3000
deb nomlanadi. Hozir biz bu
videoda shuning guvohi bo'lamiz.
12:01
but we're going to start -- for those of you who don't know him,
284
706000
2000
Boshladik, bu insonni bilmaydiganlar
uchun
12:03
this is presidential candidate John Edwards
285
708000
3000
— AQSH prezidentligiga
nomzod Jon Edvards, nikohsiz farzandning
12:06
who shocked America by fathering a child out of wedlock.
286
711000
3000
otasi bo'lib butun Amerikani
hayron qoldirgan. Biz hozir uning
12:09
We're going to see him talk about getting a paternity test.
287
714000
3000
otalik genetik analizi o'tkazish
haqidagi suhbatini kuzatamiz.
12:12
See now if you can spot him
288
717000
2000
Kuzating, siz uni fosh eta olasizmi?
12:14
saying, "yes" while shaking his head "no,"
289
719000
2000
Xa deyayotib,
u boshi bilan "yo'q" deyapti,
12:16
slightly shrugging his shoulders.
290
721000
2000
yelkalarini ozgina qisib qo'ygan holda.
12:18
(Video) John Edwards: I'd be happy to participate in one.
291
723000
2000
Jon Edvards:
Men bunda qatnashishdan mamnun
12:20
I know that it's not possible that this child could be mine,
292
725000
3000
bo'lgan bo'lar edim.
Men bilamanki, bu go'dak meniki bo'lishi
12:23
because of the timing of events.
293
728000
2000
mumkin emas, agar muddatini hisobga olsak.
12:25
So I know it's not possible.
294
730000
2000
Men aniq bilaman, buning iloji yo'q.
Men bu
12:27
Happy to take a paternity test,
295
732000
2000
testdan o'tishdan baxtiyor bo'lar edim va
12:29
and would love to see it happen.
296
734000
2000
uning natijasini ko'rib hursand bo'lardim.
12:31
Interviewer: Are you going to do that soon? Is there somebody --
297
736000
3000
Jurnalist: Va siz buni tez orada
qilasizmi? Kimdir...
12:34
JE: Well, I'm only one side. I'm only one side of the test.
298
739000
3000
Jon Edvards: Bilasizmi, men bu testning
bir tomoniman xolos.
12:37
But I'm happy to participate in one.
299
742000
3000
Lekin men bunda qatnashishdan hursand
bo'lgan bo'lar edim.
12:40
PM: Okay, those head shakes are much easier to spot
300
745000
2000
Pamela: Agar siz bu yerda nima izlashni
12:42
once you know to look for them.
301
747000
2000
bilsangiz, aynan bosh
harakatlaridan bilish
12:44
There're going to be times
302
749000
2000
osonligini bilasiz.
Vaqti-vaqti bilan
12:46
when someone makes one expression
303
751000
2000
insonning yuzi bir xil ifoda ko'rsatadi,
12:48
while masking another that just kind of leaks through in a flash.
304
753000
3000
aslida boshqa ifodani niqoblagan va
onda-sonda ko'ringan holda.
12:52
Murderers are known to leak sadness.
305
757000
2000
Ma'lumki, qotillarni qayg'u fosh etadi.
12:54
Your new joint venture partner might shake your hand,
306
759000
2000
Yangi biznes-sherigingiz siz bilan qo'l
12:56
celebrate, go out to dinner with you
307
761000
2000
tashlashi mumkin, siz bilan ovqatlanishi,
12:58
and then leak an expression of anger.
308
763000
3000
va birdan g'azab belgilarini sezib
qolishi mumkin.
13:01
And we're not all going to become facial expression experts overnight here,
309
766000
3000
Biz bir kechada yuz ifodalari bo'yicha
ekspert bo'lib qolmasmiz,
13:04
but there's one I can teach you that's very dangerous, and it's easy to learn,
310
769000
3000
lekin bir narsa borki, men sizga buni
oson o'rgata olaman.
13:07
and that's the expression of contempt.
311
772000
3000
Bu nafrat ifodasi.
Gap g'azab haqda ketganda,
13:10
Now with anger, you've got two people on an even playing field.
312
775000
3000
biz buni bir xil sharoyitda turgan
ikkita insonning ishi deymiz.
13:13
It's still somewhat of a healthy relationship.
313
778000
2000
Bu sog'lom munosabatlarga o'xshaydi.
13:15
But when anger turns to contempt,
314
780000
2000
Lekin agar g'azab nafratga aylansa,
13:17
you've been dismissed.
315
782000
2000
siz hisobdan o'chirilasiz.
13:19
It's associated with moral superiority.
316
784000
2000
Nafrat axloqiy ustunlik bilan bog'liqdir.
13:21
And for that reason, it's very, very hard to recover from.
317
786000
3000
Va shu sabab undan qutulish juda
ham qiyin.
13:24
Here's what it looks like.
318
789000
2000
Mana u nimaga o'xshaydi.
13:26
It's marked by one lip corner
319
791000
2000
Nafratda shunday lab burchaklari bo'ladi,
13:28
pulled up and in.
320
793000
2000
sal ko'tarilgan va orqaga tortilgan.
13:30
It's the only asymmetrical expression.
321
795000
3000
Bu assimetrik yuz ifodasi xolos.
Agar nafrat mavjud bo'lsa va
13:33
And in the presence of contempt,
322
798000
2000
uning ortida yolg'on yashiringan
bo'lishiga
13:35
whether or not deception follows --
323
800000
2000
qaramay,
odatda ular nafratli
13:37
and it doesn't always follow --
324
802000
2000
munosabatlarda yonma-yon kelmaydi,
13:39
look the other way, go the other direction,
325
804000
2000
belgilarni boshqa tomondan qidiring,
13:41
reconsider the deal,
326
806000
2000
bitimni ko'rib chiqing,
13:43
say, "No thank you. I'm not coming up for just one more nightcap. Thank you."
327
808000
4000
shunday deng: "Yo'q, rahmat.
Menimcha shu joyida to'xtaymiz. Rahmat".
13:47
Science has surfaced
328
812000
2000
Fan tomonidan
13:49
many, many more indicators.
329
814000
2000
juda ko'p indikatorlar aniqlangan.
13:51
We know, for example,
330
816000
2000
Masalan, biz bilamizki, aldayotgan insonda
13:53
we know liars will shift their blink rate,
331
818000
2000
ko'z ochib-yumish chastotasi o'zgaradi,
13:55
point their feet towards an exit.
332
820000
2000
oyoqlari esa chiqish tomonga qarab turadi.
13:57
They will take barrier objects
333
822000
2000
Ular turli metodlarni qo'llashadi,
13:59
and put them between themselves and the person that is interviewing them.
334
824000
3000
to'siq sifatida ularni o'zi va suhbatdoshi
orasiga qo'yadi.
14:02
They'll alter their vocal tone,
335
827000
2000
Ular ovoz tonini o'zgartirishadi,
14:04
often making their vocal tone much lower.
336
829000
3000
odatda, uni sal tushirgan holda.
14:07
Now here's the deal.
337
832000
2000
Endi esa — eng muhimi.
14:09
These behaviors are just behaviors.
338
834000
3000
Bunday harakat — oddiy bir harakat xolos.
14:12
They're not proof of deception.
339
837000
2000
Uni yolg'onning isboti deyish mumkin emas.
14:14
They're red flags.
340
839000
2000
Bu qizil bayroqchalar.
Hammamiz insonmiz.
14:16
We're human beings.
341
841000
2000
Kun davomida hammamiz yolg'on haqida xabar
14:18
We make deceptive flailing gestures all over the place all day long.
342
843000
3000
beruvchi turliv ishoralar qilamiz.
Yakka holda ular ko'p narsa
14:21
They don't mean anything in and of themselves.
343
846000
2000
anglatmaydi.
Lekin siz bunday belgilarning
14:23
But when you see clusters of them, that's your signal.
344
848000
3000
juda ko'pini bir vaqtda ko'rsangiz,
bu sizga signal bo'lishi
14:26
Look, listen, probe, ask some hard questions,
345
851000
3000
kerak. Kuzating, yaxshilab
tinglang, qiyin savollar bering va
14:29
get out of that very comfortable mode of knowing,
346
854000
3000
oson o'rganish rejimini tark eting,
qiziquvchanlik rejimiga
14:32
walk into curiosity mode, ask more questions,
347
857000
3000
o'ting, ko'proq savollar bering,
shaxsiy qadr-qimmatingizni
14:35
have a little dignity, treat the person you're talking to with rapport.
348
860000
3000
saqlab qoling, suhbatdosh
bilan o'zaro tushinishga erishing.
14:38
Don't try to be like those folks on "Law & Order" and those other TV shows
349
863000
3000
O'z o'ljasini tan olishga majburlaydigan
"Qonun va tartib" kabi
14:41
that pummel their subjects into submission.
350
866000
2000
TV-shoulardagilardek bo'lishga intilmang.
14:43
Don't be too aggressive, it doesn't work.
351
868000
3000
Haddan ortiq agressiv bo'lmang,
bu ishlamaydi.
14:46
Now we've talked a little bit
352
871000
2000
Demak, biz yolg'onchi bilan
qanday muloqot
14:48
about how to talk to someone who's lying
353
873000
2000
qilish kerakligi va qanday uni fosh etish
14:50
and how to spot a lie.
354
875000
2000
mumkinligi haqda biroz gaplashib oldik.
14:52
And as I promised, we're now going to look at what the truth looks like.
355
877000
3000
Va'da qilganimdek, endi rost nimaga
o'xshashini ko'rib chiqamiz.
14:55
But I'm going to show you two videos,
356
880000
2000
Sizga ikki videoni ko'rsatishni
xohlayman,
14:57
two mothers -- one is lying, one is telling the truth.
357
882000
3000
biri rost, biri yolg'on gapirayotgan
ikkita onani.
15:00
And these were surfaced
358
885000
2000
Bu roliklar Kaliforniyalik tadqiqotchi
15:02
by researcher David Matsumoto in California.
359
887000
2000
Devid Matsumoto tomonidan o'rganilgan.
15:04
And I think they're an excellent example
360
889000
2000
Va menimcha, ular haqiqat nimaga o'xshash
15:06
of what the truth looks like.
361
891000
2000
ekanligining yorqin misoli bo'la oladi.
15:08
This mother, Diane Downs,
362
893000
2000
Bu ona Dian Dauns, u o'z bolalarini
15:10
shot her kids at close range,
363
895000
2000
masofadan otib tashlagan, ularni
15:12
drove them to the hospital
364
897000
2000
shifoxonaga olib borgan, ular mashinada
15:14
while they bled all over the car,
365
899000
2000
qonga botib yotgan
bir paytda buni notanish
15:16
claimed a scraggy-haired stranger did it.
366
901000
2000
kal odam qildi deb ko'rsatma bergan.
15:18
And you'll see when you see the video,
367
903000
2000
Videoni ko'rib, bolalari
uchun kuyinayotgan
15:20
she can't even pretend to be an agonizing mother.
368
905000
2000
onaga umuman o'xshamasligini tushunasiz.
15:22
What you want to look for here
369
907000
2000
Bunda siz topishingiz mumkin bo'lgan
jihat,
15:24
is an incredible discrepancy
370
909000
2000
u aytib berayotgan daxshatli
voqealar bilan
15:26
between horrific events that she describes
371
911000
2000
uning sovuq, ehtiyotkor o'zini tutishi
15:28
and her very, very cool demeanor.
372
913000
2000
o'rtasida aqlbovar qilmas
nomuvofiqlikdir.
15:30
And if you look closely, you'll see duping delight throughout this video.
373
915000
3000
Butun video davomida "ahmoq qildim"
ifodasini ko'rish mumkin.
15:33
(Video) Diane Downs: At night when I close my eyes,
374
918000
2000
Diana Dauns:
Tunda ko'zimni yumganimda,
15:35
I can see Christie reaching her hand out to me while I'm driving,
375
920000
3000
men mashinadan menga qo'lini uzatayotgan
Kristini ko'raman,
15:38
and the blood just kept coming out of her mouth.
376
923000
3000
va uning og'zidan qon kelishda davom
etayotgan bo'ladi.
15:41
And that -- maybe it'll fade too with time --
377
926000
2000
Balki, vaqt o'tib bu o'tib ketar,
15:43
but I don't think so.
378
928000
2000
lekin men unday o'ylamayman.
15:45
That bothers me the most.
379
930000
3000
Mana meni nima ko'proq xavotirlantiradi.
15:55
PM: Now I'm going to show you a video
380
940000
2000
Pamela Meyer:
Hozir esa men sizlarga
15:57
of an actual grieving mother, Erin Runnion,
381
942000
2000
haqiqiy ona qayg'usini, Erin Runnion,
15:59
confronting her daughter's murderer and torturer in court.
382
944000
4000
qizining qotiliga qarshi suddagi nutqini
ko'rsataman.
16:03
Here you're going to see no false emotion,
383
948000
2000
Siz bunda soxta emotsiyalarni
ko'rmaysiz,
16:05
just the authentic expression of a mother's agony.
384
950000
3000
faqat onaning chuqur qayg'usi.
16:08
(Video) Erin Runnion: I wrote this statement on the third anniversary
385
953000
2000
Erin Runnion:
Men bu arizani sen qizimni
16:10
of the night you took my baby,
386
955000
2000
o'g'irlab ketganingdan uch kun
o'tib yozdim
16:12
and you hurt her,
387
957000
2000
va sen uni qiynading,
16:14
and you crushed her,
388
959000
2000
sen uni yo'q qilding,
16:16
you terrified her until her heart stopped.
389
961000
4000
sen uni yuragi to'xtab
qolgunicha qo'rqitding.
16:20
And she fought, and I know she fought you.
390
965000
3000
U kurashgan, men bilaman,
u sen bilan kurashgan.
16:23
But I know she looked at you
391
968000
2000
Men bilaman, u senga ajoyib jigarrang
16:25
with those amazing brown eyes,
392
970000
2000
ko'zlari bilan qaragan,
16:27
and you still wanted to kill her.
393
972000
3000
sen esa shundayam uni o'ldirishni
xohlagansan.
16:30
And I don't understand it,
394
975000
2000
Va men buni tushunmayman,
16:32
and I never will.
395
977000
3000
hech qachon tushuna olmayman.
16:35
PM: Okay, there's no doubting the veracity of those emotions.
396
980000
4000
Pamela Meyer: Xo'sh, bu hissiyotlarning
haqiqiy ekaniga hech qanday shubha yo'q.
16:39
Now the technology around what the truth looks like
397
984000
3000
Bugungi kunda haqiqatni aniqlovchi
texnologiya
16:42
is progressing on, the science of it.
398
987000
3000
ilm-fan yutuqlari bilan borgan sari
rivojlanmoqda.
16:45
We know for example
399
990000
2000
Masalan, bizga ma'lumki, hozirda
16:47
that we now have specialized eye trackers and infrared brain scans,
400
992000
3000
ko'z harakatlari trekeri va infraqizil
miya skanerlari, rezonans
16:50
MRI's that can decode the signals that our bodies send out
401
995000
3000
magnitli tomograflar bor va
ular biz aldashga urinayotganimizda
16:53
when we're trying to be deceptive.
402
998000
2000
bu haqda signallar beradi.
16:55
And these technologies are going to be marketed to all of us
403
1000000
3000
Va bu texnologiyalar tez orada hammamizga
taklif qilinadi,
16:58
as panaceas for deceit,
404
1003000
2000
yolg'ondan qutilishning yo'li sifatida.
17:00
and they will prove incredibly useful some day.
405
1005000
3000
Qachondir ular o'zining foydasini
isbotlashadi. Lekin shunday
17:03
But you've got to ask yourself in the meantime:
406
1008000
2000
vaqtda siz o'zingizdan so'rashingiz kerak:
17:05
Who do you want on your side of the meeting,
407
1010000
2000
Kim bilan siz barrikadaning bir tarafida
17:07
someone who's trained in getting to the truth
408
1012000
3000
bo'lishni xoxlardingiz,
haqiqatni aniqlay oladigan inson bilanmi
17:10
or some guy who's going to drag a 400-pound electroencephalogram
409
1015000
2000
yoki 200 kg elektroensefalografni joyidan
17:12
through the door?
410
1017000
2000
siljitishga intilayotgan inson bilanmi?
17:14
Liespotters rely on human tools.
411
1019000
4000
Yolg'onni ko'ra oluvchilar ko'proq insoniy
emotsiyalar mavjudligiga qarashadi.
17:18
They know, as someone once said,
412
1023000
2000
Bir paytlar kimdir ushbu gapni aytgan:
17:20
"Character's who you are in the dark."
413
1025000
2000
"Xarakter bu sening qorong'ulikda kim
17:22
And what's kind of interesting
414
1027000
2000
ekanligingdir".
Va eng qizig'i shuki,
17:24
is that today we have so little darkness.
415
1029000
2000
bugun shu qorong'ulik juda kam.
17:26
Our world is lit up 24 hours a day.
416
1031000
3000
Bizning dunyoda sutkasiga 24 soat
chiroqlar yoniq.
17:29
It's transparent
417
1034000
2000
U shaffof,
17:31
with blogs and social networks
418
1036000
2000
ijtimoiy hayotni tanlagan butun
17:33
broadcasting the buzz of a whole new generation of people
419
1038000
2000
yangi avlodning mish-mishlarini keng
17:35
that have made a choice to live their lives in public.
420
1040000
3000
yoyadigan bloglar va ijtimoiy tarmoqlar
hisobiga.
17:38
It's a much more noisy world.
421
1043000
4000
Hozirgi dunyo ancha shovqinli.
17:42
So one challenge we have
422
1047000
2000
Shuning uchun ham bizning vazifamiz qiyin:
17:44
is to remember,
423
1049000
2000
haddan tashqari ochiqlik —
17:46
oversharing, that's not honesty.
424
1051000
3000
rostgo'ylik emasligini unutmaslik.
Bizning daxshatli darajada
17:49
Our manic tweeting and texting
425
1054000
2000
Twitterda muloqot qilishga va xabarlar
17:51
can blind us to the fact
426
1056000
2000
almashishga tashnaligimiz qadr-qimmatga
17:53
that the subtleties of human decency -- character integrity --
427
1058000
3000
ega bo'lib kelgan insoniyat xulqining
nozik jihatlari, xarakter
17:56
that's still what matters, that's always what's going to matter.
428
1061000
3000
butunligi kabi faktorlarni soyada
qoldirishi mumkin.
17:59
So in this much noisier world,
429
1064000
2000
Shuning uchun hozirgidek shovqinli dunyoda
18:01
it might make sense for us
430
1066000
2000
axloqiy me'yorlarimiz bilan bog'liq
18:03
to be just a little bit more explicit
431
1068000
2000
bo'lgan jihatlarga yanada jiddiyroq
18:05
about our moral code.
432
1070000
3000
yondashish alohida o'rin tutadi.
18:08
When you combine the science of recognizing deception
433
1073000
2000
Siz yolg'onni fosh etishni o'rganish bilan
18:10
with the art of looking, listening,
434
1075000
2000
diqqat bilan eshitish va kuzatishni
18:12
you exempt yourself from collaborating in a lie.
435
1077000
3000
birlashtirsangiz o'zingizni yolg'onlardan
asragan bo'lasiz.
18:15
You start up that path
436
1080000
2000
Atrofingizdagilarga: "Mening dunyoyim,
18:17
of being just a little bit more explicit,
437
1082000
2000
bizning dunyo rost dunyo bo'ladi.
18:19
because you signal to everyone around you,
438
1084000
2000
Mening dunyoyimda rost hamma narsaga qodir
18:21
you say, "Hey, my world, our world,
439
1086000
3000
va yolg'onni fosh etib, unga e'tibor
bermaydilar"
18:24
it's going to be an honest one.
440
1089000
2000
deyotgandan holda,
18:26
My world is going to be one where truth is strengthened
441
1091000
2000
siz ushbu katta aniqlikka qarab
18:28
and falsehood is recognized and marginalized."
442
1093000
3000
yo'l boshlaysiz.
18:31
And when you do that,
443
1096000
2000
Va siz buni uddalaganingizda
18:33
the ground around you starts to shift just a little bit.
444
1098000
3000
atrofingizdagi dunyo asta-sekin
o'zgarishni boshlaydi.
18:36
And that's the truth. Thank you.
445
1101000
3000
Va bu chin haqiqat. Rahmat.
18:39
(Applause)
446
1104000
5000
(Qarsaklar)
Translated by Nazarbek Nazarov

