ABOUT THE SPEAKER
Ray Dalio - Hedge fund chair
Ray Dalio is the founder, chair and co-chief investment officer of Bridgewater Associates, a global leader in institutional portfolio management and the largest hedge fund in the world.

Why you should listen

Dalio started Bridgewater out of his two-bedroom apartment in New York City in 1975 and has grown it into the fifth most important private company in the U.S. (according to Fortune magazine). Because of the firm’s many industry-changing innovations over its 40-year history, he has been called the “Steve Jobs of investing” by aiCIO magazine and named one of TIME magazine’s "100 Most Influential People."

Dalio attributes Bridgewater’s success to its unique culture. He describes it as “a believability-weighted idea meritocracy” in which the people strive for “meaningful work and meaningful relationships through radical truth and radical transparency.” He has explained this approach in his book Principles, which has been downloaded more than three million times and has produced considerable curiosity and controversy.

TED2017

Ray Dalio: How to build a company where the best ideas win

3,449,614 views

What if you knew what your coworkers really thought about you, and what they were really like? Ray Dalio uses radical transparency and algorithmic decision-making to run a company where the best ideas win -- one where people can speak freely, even to challenge the boss. In this talk, learn how these strategies helped Dalio build the world's most successful hedge fund, and how you can harness the power of data-driven group decision-making.
00:12 Whether you like it or not,
00:13 radical transparency and algorithmic decision-making is coming at you fast,
00:19 and it's going to change your life.
00:21 That's because it's now easy to take algorithms
00:24 and embed them into computers
00:26 and gather all that data that you're leaving on yourself
00:29 all over the place,
00:30 and know what you're like,
00:32 and then direct the computers to interact with you
00:35 in ways that are better than most people can.
00:38 Well, that might sound scary.
00:39 I've been doing this for a long time and I have found it to be wonderful.
00:44 My objective has been to have meaningful work
00:46 and meaningful relationships with the people I work with,
00:49 and I've learned that I couldn't have that
00:51 unless I had that radical transparency and that algorithmic decision-making.
00:56 I want to show you why that is,
00:58 I want to show you how it works.
01:00 And I warn you that some of the things that I'm going to show you
01:03 probably are a little bit shocking.
01:05 Since I was a kid, I've had a terrible rote memory.
01:10 And I didn't like following instructions,
01:12 I was no good at following instructions.
01:14 But I loved to figure out how things worked for myself.
01:18 When I was 12,
01:20 I hated school but I fell in love with trading the markets.
01:23 I caddied at the time,
01:25 earned about five dollars a bag.
01:27 And I took my caddying money, and I put it in the stock market.
01:31 And that was just because the stock market was hot at the time.
01:34 And the first company I bought
01:36 was a company by the name of Northeast Airlines.
01:39 Northeast Airlines was the only company I heard of
01:42 that was selling for less than five dollars a share.
01:44 (Laughter)
01:46 And I figured I could buy more shares,
01:48 and if it went up, I'd make more money.
01:50 So, it was a dumb strategy, right?
01:54 But I tripled my money,
01:55 and I tripled my money because I got lucky.
01:58 The company was about to go bankrupt,
02:00 but some other company acquired it,
02:02 and I tripled my money.
02:03 And I was hooked.
02:05 And I thought, "This game is easy."
02:09 With time,
02:10 I learned this game is anything but easy.
02:12 In order to be an effective investor,
02:15 one has to bet against the consensus
02:17 and be right.
02:19 And it's not easy to bet against the consensus and be right.
02:22 One has to bet against the consensus and be right
02:24 because the consensus is built into the price.
02:28 And in order to be an entrepreneur,
02:30 a successful entrepreneur,
02:32 one has to bet against the consensus and be right.
02:37 I had to be an entrepreneur and an investor --
02:40 and what goes along with that is making a lot of painful mistakes.
02:45 So I made a lot of painful mistakes,
02:48 and with time,
02:49 my attitude about those mistakes began to change.
02:53 I began to think of them as puzzles.
02:55 That if I could solve the puzzles,
02:57 they would give me gems.
02:59 And the puzzles were:
03:00 What would I do differently in the future so I wouldn't make that painful mistake?
03:05 And the gems were principles
03:07 that I would then write down so I would remember them
03:11 that would help me in the future.
03:13 And because I wrote them down so clearly,
03:15 I could then --
03:17 eventually discovered --
03:18 I could then embed them into algorithms.
03:23 And those algorithms would be embedded in computers,
03:26 and the computers would make decisions along with me;
03:30 and so in parallel, we would make these decisions.
03:33 And I could see how those decisions then compared with my own decisions,
03:37 and I could see that those decisions were a lot better.
03:40 And that was because the computer could make decisions much faster,
03:45 it could process a lot more information
03:47 and it can process decisions much more --
03:51 less emotionally.
03:54 So it radically improved my decision-making.
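To make that concrete: a principle written down in plain language can be encoded as a small rule that produces its own decision alongside the human one, so the two answers can be compared. Dalio doesn't show his actual systems on stage, so the sketch below is purely illustrative -- the principle, names and threshold are assumptions, not Bridgewater's real rules.

```python
# A minimal, hypothetical sketch of "writing a principle down so clearly
# you can embed it in an algorithm." Nothing here is Bridgewater's code.

def principle_diversify(portfolio, candidate):
    """Illustrative principle: don't concentrate -- cap any position at 20%."""
    return "buy" if portfolio.get(candidate, 0.0) < 0.20 else "pass"

def decide_in_parallel(human_decision, portfolio, candidate):
    # The encoded principle decides alongside the human, as in the talk.
    algo_decision = principle_diversify(portfolio, candidate)
    return algo_decision, human_decision == algo_decision

portfolio = {"Northeast Airlines": 0.35}   # hypothetical: already 35% of the book
print(decide_in_parallel("buy", portfolio, "Northeast Airlines"))
# -> ('pass', False): the written-down principle disagrees with the gut call
```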
04:00 Eight years after I started Bridgewater,
04:05 I had my greatest failure,
04:06 my greatest mistake.
04:09 It was late 1970s,
04:11 I was 34 years old,
04:13 and I had calculated that American banks
04:17 had lent much more money to emerging countries
04:20 than those countries were going to be able to pay back
04:23 and that we would have the greatest debt crisis
04:25 since the Great Depression.
04:28 And with it, an economic crisis
04:30 and a big bear market in stocks.
04:33 It was a controversial view at the time.
04:36 People thought it was kind of a crazy point of view.
04:39 But in August 1982,
04:41 Mexico defaulted on its debt,
04:44 and a number of other countries followed.
04:46 And we had the greatest debt crisis since the Great Depression.
04:51 And because I had anticipated that,
04:53 I was asked to testify to Congress and appear on "Wall Street Week,"
04:58 which was the show of the time.
05:00 Just to give you a flavor of that, I've got a clip here,
05:03 and you'll see me in there.
05:06 (Video) Mr. Chairman, Mr. Mitchell,
05:08 it's a great pleasure and a great honor to be able to appear before you
05:11 in examination with what is going wrong with our economy.
05:15 The economy is now flat --
05:17 teetering on the brink of failure.
05:19 Martin Zweig: You were recently quoted in an article.
05:22 You said, "I can say this with absolute certainty
05:24 because I know how markets work."
05:26 Ray Dalio: I can say with absolute certainty
05:28 that if you look at the liquidity base
05:30 in the corporations and the world as a whole,
05:33 that there's such reduced level of liquidity
05:35 that you can't return to an era of stagflation."
05:39 I look at that now, I think, "What an arrogant jerk!"
05:42 (Laughter)
05:45 I was so arrogant, and I was so wrong.
05:48 I mean, while the debt crisis happened,
05:50 the stock market and the economy went up rather than going down,
05:54 and I lost so much money for myself and for my clients
05:59 that I had to shut down my operation pretty much,
06:03 I had to let almost everybody go.
06:05 And these were like extended family,
06:07 I was heartbroken.
06:09 And I had lost so much money
06:10 that I had to borrow 4,000 dollars from my dad
06:14 to help to pay my family bills.
06:16 It was one of the most painful experiences of my life ...
06:21 but it turned out to be one of the greatest experiences of my life
06:25 because it changed my attitude about decision-making.
06:28 Rather than thinking, "I'm right,"
06:31 I started to ask myself,
06:33 "How do I know I'm right?"
06:36 I gained a humility that I needed
06:38 in order to balance my audacity.
06:41 I wanted to find the smartest people who would disagree with me
06:46 to try to understand their perspective
06:48 or to have them stress test my perspective.
06:51 I wanted to make an idea meritocracy.
06:54 In other words,
06:55 not an autocracy in which I would lead and others would follow
06:59 and not a democracy in which everybody's points of view were equally valued,
07:02 but I wanted to have an idea meritocracy in which the best ideas would win out.
07:08 And in order to do that,
07:09 I realized that we would need radical truthfulness
07:12 and radical transparency.
07:14 What I mean by radical truthfulness and radical transparency
07:18 is people needed to say what they really believed
07:21 and to see everything.
07:23 And we literally tape almost all conversations
07:27 and let everybody see everything,
07:29 because if we didn't do that,
07:30 we couldn't really have an idea meritocracy.
07:34 In order to have an idea meritocracy,
07:38 we have to let people speak and say what they want.
07:40 Just to give you an example,
07:42 this is an email from Jim Haskel --
07:45 somebody who works for me --
07:46 and this was available to everybody in the company.
07:49 "Ray, you deserve a 'D-'
07:52 for your performance today in the meeting ...
07:54 you did not prepare at all well
07:56 because there is no way you could have been that disorganized."
08:01 Isn't that great?
08:02 (Laughter)
08:04 That's great.
08:05 It's great because, first of all, I needed feedback like that.
08:08 I need feedback like that.
08:09 And it's great because if I don't let Jim, and people like Jim,
08:13 to express their points of view,
08:14 our relationship wouldn't be the same.
08:17 And if I didn't make that public for everybody to see,
08:20 we wouldn't have an idea meritocracy.
08:23 So for the last 25 years that's how we've been operating.
08:27 We've been operating with this radical transparency
08:30 and then collecting these principles,
08:33 largely from making mistakes,
08:35 and then embedding those principles into algorithms.
08:39 And then those algorithms provide --
08:42 we're following the algorithms
08:44 in parallel with our thinking.
08:47 That has been how we've run the investment business,
08:50 and it's how we also deal with the people management.
08:53 In order to give you a glimmer into what this looks like,
08:57 I'd like to take you into a meeting
08:59 and introduce you to a tool of ours called the "Dot Collector"
09:02 that helps us do this.
09:07 A week after the US election,
09:09 our research team held a meeting
09:11 to discuss what a Trump presidency would mean for the US economy.
09:16 Naturally, people had different opinions on the matter
09:18 and how we were approaching the discussion.
09:21 The "Dot Collector" collects these views.
09:24 It has a list of a few dozen attributes,
09:26 so whenever somebody thinks something about another person's thinking,
09:31 it's easy for them to convey their assessment;
09:33 they simply note the attribute and provide a rating from one to 10.
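As a rough sketch of what such a tool collects, each "dot" can be modeled as a record pairing a rater, a subject, an attribute and a 1-to-10 score. The schema and names below are assumptions for illustration, not Bridgewater's actual data model.

```python
# Hypothetical shape of a "dot": who rated whom, on what, and how highly.
from dataclasses import dataclass

@dataclass
class Dot:
    rater: str       # who is giving the assessment
    subject: str     # whose thinking is being assessed
    attribute: str   # e.g. "balancing open-mindedness and assertiveness"
    rating: int      # 1 (poor) to 10 (excellent)

dots = [
    Dot("Jen", "Ray", "open-mindedness vs. assertiveness", 3),
    Dot("Larry", "Ray", "open-mindedness vs. assertiveness", 8),
]

# Averaging a subject's dots on one attribute gives one point in the
# emerging picture of how that person thinks.
scores = [d.rating for d in dots if d.subject == "Ray"]
print(sum(scores) / len(scores))   # -> 5.5
```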
09:39 For example, as the meeting began,
09:41 a researcher named Jen rated me a three --
09:45 in other words, badly --
09:47 (Laughter)
09:49 for not showing a good balance of open-mindedness and assertiveness.
09:54 As the meeting transpired,
09:55 Jen's assessments of people added up like this.
09:59 Others in the room have different opinions.
10:02 That's normal.
10:03 Different people are always going to have different opinions.
10:06 And who knows who's right?
10:09 Let's look at just what people thought about how I was doing.
10:13 Some people thought I did well,
10:15 others, poorly.
10:18 With each of these views,
10:19 we can explore the thinking behind the numbers.
10:22 Here's what Jen and Larry said.
10:25 Note that everyone gets to express their thinking,
10:28 including their critical thinking,
10:30 regardless of their position in the company.
10:33 Jen, who's 24 years old and right out of college,
10:36 can tell me, the CEO, that I'm approaching things terribly.
10:40 This tool helps people both express their opinions
10:44 and then separate themselves from their opinions
10:47 to see things from a higher level.
10:50 When Jen and others shift their attentions from inputting their own opinions
10:55 to looking down on the whole screen,
10:58 their perspective changes.
11:00 They see their own opinions as just one of many
11:03 and naturally start asking themselves,
11:06 "How do I know my opinion is right?"
11:09 That shift in perspective is like going from seeing in one dimension
11:13 to seeing in multiple dimensions.
11:15 And it shifts the conversation from arguing over our opinions
11:19 to figuring out objective criteria for determining which opinions are best.
11:24 Behind the "Dot Collector" is a computer that is watching.
11:29 It watches what all these people are thinking
11:31 and it correlates that with how they think.
11:33 And it communicates advice back to each of them based on that.
11:38 Then it draws the data from all the meetings
11:41 to create a pointillist painting of what people are like
11:45 and how they think.
11:47 And it does that guided by algorithms.
11:50 Knowing what people are like helps to match them better with their jobs.
11:55 For example,
11:56 a creative thinker who is unreliable
11:58 might be matched up with someone who's reliable but not creative.
12:02 Knowing what people are like also allows us to decide
12:05 what responsibilities to give them
12:07 and to weigh our decisions based on people's merits.
12:12 We call it their believability.
12:14 Here's an example of a vote that we took
12:16 where the majority of people felt one way ...
12:20 but when we weighed the views based on people's merits,
12:23 the answer was completely different.
12:26 This process allows us to make decisions not based on democracy,
12:31 not based on autocracy,
12:33 but based on algorithms that take people's believability into consideration.
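The talk doesn't give the underlying formula, but the essence of believability-weighting can be sketched as swapping an equal-weight vote count for a merit-weighted one. Everything below -- the voters, their weights, the threshold -- is hypothetical, chosen only to reproduce the effect Dalio describes, where the majority answer and the weighted answer differ.

```python
# A hedged illustration (not Bridgewater's actual algorithm): the same
# yes/no vote tallied two ways -- one person, one vote versus weights
# based on each voter's track record ("believability").

votes = {            # True = yes, False = no (all names hypothetical)
    "Jen": True, "Larry": True, "Bob": True,
    "Ray": False, "Giselle": False,
}
believability = {    # hypothetical merit weights per voter
    "Jen": 0.3, "Larry": 0.4, "Bob": 0.5,
    "Ray": 2.5, "Giselle": 1.8,
}

def yes_share(votes, weights):
    yes = sum(weights[v] for v, choice in votes.items() if choice)
    return yes / sum(weights.values())

equal = yes_share(votes, {v: 1.0 for v in votes})  # 0.6  -> majority says yes
weighted = yes_share(votes, believability)         # ~0.22 -> weighted answer is no
print(equal, weighted)
```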
12:41 Yup, we really do this.
12:43 (Laughter)
12:46 We do it because it eliminates
12:49 what I believe to be one of the greatest tragedies of mankind,
12:53 and that is people arrogantly,
12:56 naïvely holding opinions in their minds that are wrong,
13:01 and acting on them,
13:02 and not putting them out there to stress test them.
13:06 And that's a tragedy.
13:07 And we do it because it elevates ourselves above our own opinions
13:12 so that we start to see things through everybody's eyes,
13:15 and we see things collectively.
13:18 Collective decision-making is so much better than individual decision-making
13:22 if it's done well.
13:24 It's been the secret sauce behind our success.
13:27 It's why we've made more money for our clients
13:29 than any other hedge fund in existence
13:31 and made money 23 out of the last 26 years.
13:35 So what's the problem with being radically truthful
13:40 and radically transparent with each other?
13:45 People say it's emotionally difficult.
13:48 Critics say it's a formula for a brutal work environment.
13:53 Neuroscientists tell me it has to do with how our brains are prewired.
13:58 There's a part of our brain that would like to know our mistakes
14:01 and like to look at our weaknesses so we could do better.
14:06 I'm told that that's the prefrontal cortex.
14:09 And then there's a part of our brain which views all of this as attacks.
14:13 I'm told that that's the amygdala.
14:16 In other words, there are two you's inside you:
14:19 there's an emotional you
14:20 and there's an intellectual you,
14:22 and often they're at odds,
14:24 and often they work against you.
14:27 It's been our experience that we can win this battle.
14:30 We win it as a group.
14:33 It takes about 18 months typically
14:35 to find that most people prefer operating this way,
14:38 with this radical transparency
14:40 than to be operating in a more opaque environment.
14:43 There's not politics, there's not the brutality of --
14:48 you know, all of that hidden, behind-the-scenes --
14:50 there's an idea meritocracy where people can speak up.
14:53 And that's been great.
14:54 It's given us more effective work,
14:56 and it's given us more effective relationships.
14:59 But it's not for everybody.
15:01 We found something like 25 or 30 percent of the population
15:04 it's just not for.
15:06 And by the way,
15:07 when I say radical transparency,
15:09 I'm not saying transparency about everything.
15:11 I mean, you don't have to tell somebody that their bald spot is growing
15:15 or their baby's ugly.
15:17 So, I'm just talking about --
15:19 (Laughter)
15:20 talking about the important things.
15:22 So --
15:24 (Laughter)
15:28 So when you leave this room,
15:30 I'd like you to observe yourself in conversations with others.
15:35 Imagine if you knew what they were really thinking,
15:39 and imagine if you knew what they were really like ...
15:43 and imagine if they knew what you were really thinking
15:47 and what you were really like.
15:50 It would certainly clear things up a lot
15:52 and make your operations together more effective.
15:55 I think it will improve your relationships.
15:58 Now imagine that you can have algorithms
16:01 that will help you gather all of that information
16:05 and even help you make decisions in an idea-meritocratic way.
16:12 This sort of radical transparency is coming at you
16:17 and it is going to affect your life.
16:19 And in my opinion,
16:21 it's going to be wonderful.
16:23 So I hope it is as wonderful for you
16:25 as it is for me.
16:27 Thank you very much.
16:28 (Applause)