ABOUT THE SPEAKER
Thomas Goetz - Healthcare communicator
Thomas Goetz is the co-founder of Iodine and author of "The Decision Tree: Taking Control of Your Health in the New Era of Personalized Medicine."

Why you should listen

Thomas Goetz is the co-founder of Iodine, a new company that gives consumers better information -- and better visualizations -- of their health data. The former executive editor of Wired, Goetz has a Master of Public Health from UC Berkeley. In 2010 he published The Decision Tree, a fascinating look at modern medical decision-making and technology. Former FDA commissioner Dr. David Kessler called the book "a game changer." His next book, The Remedy, explores the germ theory of disease and the quest to cure tuberculosis.

TEDMED 2010

Thomas Goetz: It's time to redesign medical data

587,895 views

Your medical chart: it's hard to access, impossible to read -- and full of information that could make you healthier if you just knew how to use it. At TEDMED, Thomas Goetz looks at medical data, making a bold call to redesign it and get more insight from it.


00:15
I'm going to be talking to you about how we can tap a really underutilized resource in health care, which is the patient, or, as I like to use the scientific term, people. Because we are all patients, we are all people. Even doctors are patients at some point. So I want to talk about that as an opportunity that we really have failed to engage with very well in this country and, in fact, worldwide.

00:41
If you want to get at the big part -- I mean from a public health level, where my training is -- you're looking at behavioral issues. You're looking at things where people are actually given information, and they're not following through with it. It's a problem that manifests itself in diabetes, obesity, many forms of heart disease, even some forms of cancer -- when you think of smoking. Those are all behaviors where people know what they're supposed to do. They know what they're supposed to be doing, but they're not doing it.

01:09
Now behavior change is something that is a long-standing problem in medicine. It goes all the way back to Aristotle. And doctors hate it, right? I mean, they complain about it all the time. We talk about it in terms of engagement, or non-compliance. When people don't take their pills, when people don't follow doctors' orders -- these are behavior problems. But for as much as clinical medicine agonizes over behavior change, there's not a lot of work done in terms of trying to fix that problem.

01:37
So the crux of it comes down to this notion of decision-making -- giving information to people in a form that doesn't just educate them or inform them, but actually leads them to make better decisions, better choices in their lives.

01:51
One part of medicine, though, has faced the problem of behavior change pretty well, and that's dentistry. Dentistry might seem -- and I think it is -- many dentists would have to acknowledge it's somewhat of a mundane backwater of medicine. Not a lot of cool, sexy stuff happening in dentistry. But they have really taken this problem of behavior change and solved it. It's the one great preventive health success we have in our health care system. People brush and floss their teeth. They don't do it as much as they should, but they do it.

02:22
So I'm going to talk about one experiment that a few dentists in Connecticut cooked up about 30 years ago. So this is an old experiment, but it's a really good one, because it was very simple, so it's an easy story to tell. So these Connecticut dentists decided that they wanted to get people to brush their teeth and floss their teeth more often, and they were going to use one variable: they wanted to scare them. They wanted to tell them how bad it would be if they didn't brush and floss their teeth.

02:48
They had a big patient population. They divided them up into two groups. They had a low-fear population, where they basically gave them a 13-minute presentation, all based in science, but told them that, if you didn't brush and floss your teeth, you could get gum disease. If you get gum disease, you will lose your teeth, but you'll get dentures, and it won't be that bad. So that was the low-fear group. The high-fear group, they laid it on really thick. They showed bloody gums. They showed pus oozing out from between their teeth. They told them that their teeth were going to fall out. They said that they could have infections that would spread from their jaws to other parts of their bodies, and ultimately, yes, they would lose their teeth. They would get dentures, and if you got dentures, you weren't going to be able to eat corn-on-the-cob, you weren't going to be able to eat apples, you weren't going to be able to eat steak. You'll eat mush for the rest of your life. So go brush and floss your teeth. That was the message. That was the experiment.

03:41
Now they measured one other variable. They wanted to capture one other variable, which was the patients' sense of efficacy. This was the notion of whether the patients felt that they actually would go ahead and brush and floss their teeth. So they asked them at the beginning, "Do you think you'll actually be able to stick with this program?" And the people who said, "Yeah, yeah. I'm pretty good about that," they were characterized as high efficacy, and the people who said, "Eh, I never get around to brushing and flossing as much as I should," they were characterized as low efficacy.

04:07
So the upshot was this. The upshot of this experiment was that fear was not really a primary driver of the behavior at all. The people who brushed and flossed their teeth were not necessarily the people who were really scared about what would happen -- it's the people who simply felt that they had the capacity to change their behavior. So fear showed up as not really the driver. It was the sense of efficacy.

04:34
So I want to isolate this, because it was a great observation -- 30 years ago, right, 30 years ago -- and it's one that's laid fallow in research. It was a notion that really came out of Albert Bandura's work, who studied whether people could get a sense of empowerment. The notion of efficacy basically boils down to one thing: whether somebody believes that they have the capacity to change their behavior. In health care terms, you could characterize this as whether or not somebody feels that they see a path towards better health, that they can actually see their way towards getting better health, and that's a very important notion. It's an amazing notion. We don't really know how to manipulate it, though, that well. Except, maybe we do.

05:17
So fear doesn't work, right? Fear doesn't work. And this is a great example of how we haven't learned that lesson at all. This is a campaign from the American Diabetes Association. This is still the way we're communicating messages about health. I mean, I showed my three-year-old this slide last night, and he's like, "Papa, why is an ambulance in these people's homes?" And I had to explain, "They're trying to scare people." And I don't know if it works.

05:42
Now here's what does work: personalized information works. Again, Bandura recognized this years ago, decades ago. When you give people specific information about their health, where they stand, and where they want to get to, where they might get to, that path, that notion of a path -- that tends to work for behavior change.

06:00
So let me just spool it out a little bit. So you start with personalized data, personalized information that comes from an individual, and then you need to connect it to their lives. You need to connect it to their lives, hopefully not in a fear-based way, but one that they understand. Okay, I know where I sit. I know where I'm situated. And that doesn't just work for me in terms of abstract numbers -- this overload of health information that we're inundated with. But it actually hits home. It's not just hitting us in our heads; it's hitting us in our hearts. There's an emotional connection to information because it's from us. That information then needs to be connected to choices, needs to be connected to a range of options, directions that we might go to -- trade-offs, benefits. Finally, we need to be presented with a clear point of action. We need to connect the information always with the action, and then that action feeds back into different information, and it creates, of course, a feedback loop. Now this is a very well-observed and well-established notion for behavior change. But the problem is that things -- in the upper-right corner there -- personalized data, it's been pretty hard to come by. It's a difficult and expensive commodity, until now.

07:09
So I'm going to give you an example, a very simple example of how this works. So we've all seen these. These are the "your speed limit" signs. You've seen them all around, especially these days as radars are cheaper. And here's how they work in the feedback loop. So you start with the personalized data where the speed limit on the road that you are at that point is 25, and, of course, you're going faster than that. We always are. We're always going above the speed limit. The choice in this case is pretty simple. We either keep going fast, or we slow down. We should probably slow down, and that point of action is probably now. We should take our foot off the pedal right now, and generally we do. These things are shown to be pretty effective in terms of getting people to slow down. They reduce speeds by about five to 10 percent. They last for about five miles, in which case we put our foot back on the pedal. But it works, and it even has some health repercussions. Your blood pressure might drop a little bit. Maybe there's fewer accidents, so there's public health benefits. But by and large, this is a feedback loop that's so nifty and too rare. Because in health care, most health care, the data is very removed from the action. It's very difficult to line things up so neatly. But we have an opportunity.

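To make the structure of that loop concrete, here is a minimal sketch of the personalized data → choice → action → feedback cycle that a speed sign implements. This is my own illustration, not something from the talk; the function name, thresholds, and messages are all invented for the example.

```python
# Hypothetical sketch of the feedback loop behind a "your speed" sign:
# personalized data -> choice -> point of action -> new data. Values are illustrative.

def speed_sign_feedback(current_speed_mph: float, speed_limit_mph: float = 25) -> str:
    """Turn one personalized measurement into a clear, immediate point of action."""
    # 1. Personalized data: your speed, right now, on this road.
    over_by = current_speed_mph - speed_limit_mph

    # 2. Choice: keep going fast, or slow down.
    if over_by <= 0:
        return "You're at or under the limit -- keep it up."

    # 3. Point of action: the message is tied to something you can do right now.
    return f"You're {over_by:.0f} mph over the limit of {speed_limit_mph} -- ease off the pedal now."


if __name__ == "__main__":
    # 4. Feedback: each new measurement closes the loop.
    for speed in (38, 31, 24):
        print(speed_sign_feedback(speed))
```
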
08:16
So I want to talk about, I want to shift now to think about how we deliver health information in this country, how we actually get information. This is a pharmaceutical ad. Actually, it's a spoof. It's not a real pharmaceutical ad. Nobody's had the brilliant idea of calling their drug Havidol quite yet. But it looks completely right. So it's exactly the way we get health information and pharmaceutical information, and it just sounds perfect. And then we turn the page of the magazine, and we see this -- now this is the page the FDA requires pharmaceutical companies to put into their ads, or to follow their ads, and to me, this is one of the most cynical exercises in medicine. Because we know. Who among us would actually say that people read this? And who among us would actually say that people who do try to read this actually get anything out of it? This is a bankrupt effort at communicating health information. There is no good faith in this.

09:15
So this is a different approach. This is an approach that has been developed by a couple researchers at Dartmouth Medical School, Lisa Schwartz and Steven Woloshin. And they created this thing called the "drug facts box." They took inspiration from, of all things, Cap'n Crunch. They went to the nutritional information box and saw that what works for cereal, works for our food, actually helps people understand what's in their food. God forbid we should use that same standard that we make Cap'n Crunch live by and bring it to drug companies.

09:49
So let me just walk through this quickly. It says very clearly what the drug is for, specifically who it is good for, so you can start to personalize your understanding of whether the information is relevant to you or whether the drug is relevant to you. You can understand exactly what the benefits are. It isn't this kind of vague promise that it's going to work no matter what, but you get the statistics for how effective it is. And finally, you understand what those choices are. You can start to unpack the choices involved because of the side effects. Every time you take a drug, you're walking into a possible side effect. So it spells those out in very clean terms, and that works. So I love this. I love that drug facts box.

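As a rough illustration of the kind of structure a drug facts box imposes, you can think of it as a small, fixed schema rather than a page of fine print. This sketch is my own, not Schwartz and Woloshin's actual format; the field names, the drug, and the example numbers are invented.

```python
from dataclasses import dataclass

# Hypothetical schema inspired by the "drug facts box" idea: what the drug is for,
# who it's for, how well it works in numbers, and which side effects to weigh.
# The field names and example values below are illustrative, not a real drug label.

@dataclass
class DrugFactsBox:
    drug_name: str
    what_it_is_for: str
    who_it_is_for: str
    benefit_stats: dict[str, str]   # outcome -> "with drug vs. without drug"
    side_effects: dict[str, str]    # side effect -> how often it occurred

    def summary(self) -> str:
        lines = [f"{self.drug_name}: {self.what_it_is_for} ({self.who_it_is_for})"]
        lines += [f"  Benefit - {k}: {v}" for k, v in self.benefit_stats.items()]
        lines += [f"  Side effect - {k}: {v}" for k, v in self.side_effects.items()]
        return "\n".join(lines)


example = DrugFactsBox(
    drug_name="Examplia (hypothetical)",
    what_it_is_for="relief of seasonal allergy symptoms",
    who_it_is_for="adults with moderate symptoms",
    benefit_stats={"symptom relief at 2 weeks": "62% with drug vs. 41% with placebo"},
    side_effects={"drowsiness": "9% with drug vs. 6% with placebo"},
)
print(example.summary())
```
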
10:25
And so I was thinking about, what's an opportunity that I could have to help people understand information? What's another latent body of information that's out there that people are really not putting to use? And so I came up with this: lab test results. Blood test results are this great source of information. They're packed with information. They're just not for us. They're not for people. They're not for patients. They go right to doctors. And God forbid -- I think many doctors, if you really asked them, they don't really understand all this stuff either. This is the worst presented information. You ask Tufte, and he would say, "Yes, this is the absolute worst presentation of information possible."

11:07
What we did at Wired was we went, and I got our graphic design department to re-imagine these lab reports. So that's what I want to walk you through. So this is the general blood work before, and this is the after, this is what we came up with. The after takes what was four pages -- that previous slide was actually the first of four pages of data that's just the general blood work. It goes on and on and on, all these values, all these numbers you don't know. This is our one-page summary.

11:34
We use the notion of color. It's an amazing notion that color could be used. So on the top level you have your overall results, the things that might jump out at you from the fine print. Then you can drill down and understand how actually we put your level in context, and we use color to illustrate exactly where your value falls. In this case, this patient is slightly at risk of diabetes because of their glucose level. Likewise, you can go over your lipids and, again, understand what your overall cholesterol level is and then break down into the HDL and the LDL if you so choose. But again, always using color and personalized proximity to that information. All those other values, all those pages and pages of values that are full of nothing, we summarize. We tell you that you're okay, you're normal. But you don't have to wade through it. You don't have to go through the junk. And then we do two other very important things that kind of help fill in this feedback loop: we help people understand in a little more detail what these values are and what they might indicate. And then we go a further step -- we tell them what they can do. We give them some insight into what choices they can make, what actions they can take. So that's our general blood work test.

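A minimal sketch of that idea -- mapping a single lab value onto a color band so a patient can see where they fall -- follows. This is my own illustration, not the Wired design; the cutoffs shown are the commonly cited fasting-glucose thresholds of roughly 100 and 126 mg/dL, used here only as an example, and a real report would use the lab's own reference ranges.

```python
# Sketch: place a lab value on a color band so the reader sees context at a glance.
# Cutoffs are the widely used fasting-glucose thresholds (~100 and 126 mg/dL),
# included purely as an example; real reports should use the lab's reference ranges.

def color_band(value: float, bands: list[tuple[float, str]]) -> str:
    """Return the color for the first band whose upper bound the value stays under."""
    for upper_bound, color in bands:
        if value < upper_bound:
            return color
    return bands[-1][1]  # fall through to the highest-risk band

FASTING_GLUCOSE_BANDS = [
    (100.0, "green"),       # within the typical reference range
    (126.0, "yellow"),      # elevated -- "slightly at risk," as in the talk's example
    (float("inf"), "red"),  # well above the reference range
]

patient_glucose = 108  # mg/dL, an illustrative value
print(f"Glucose {patient_glucose} mg/dL -> {color_band(patient_glucose, FASTING_GLUCOSE_BANDS)}")
```
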
12:44
Then we went to the CRP test. In this case, it's a sin of omission. They have this huge amount of space, and they don't use it for anything, so we do. Now the CRP test is often done following a cholesterol test, or in conjunction with a cholesterol test. So we take the bold step of putting the cholesterol information on the same page, which is the way the doctor is going to evaluate it. So we thought the patient might actually want to know the context as well. It's a protein that shows up when your blood vessels might be inflamed, which might be a risk for heart disease. What you're actually measuring is spelled out in clean language. Then we use the information that's already in the lab report. We use the person's age and their gender to start to fill in the personalized risks. So we start to use the data we have to run a very simple calculation that's on all sorts of online calculators to get a sense of what the actual risk is.

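The talk doesn't say which calculator was used, so the sketch below only shows the shape of that step: pulling age and sex off the report and feeding them, together with the measured values, into a risk function. The risk function here is a placeholder with made-up cutoffs, not a clinical formula; a real report would call a validated calculator.

```python
# Sketch of the "use what's already on the report" step: the lab report carries
# age and sex, so those can feed a risk estimate alongside the measured values.
# risk_estimate() is a placeholder that only demonstrates the data flow -- it is
# NOT a clinical formula, and all cutoffs and values below are invented.

def risk_estimate(age: int, sex: str, crp_mg_l: float, total_chol_mg_dl: float) -> str:
    """Placeholder risk bucket combining demographics with lab values."""
    score = 0
    score += 1 if age >= 50 else 0                # illustrative cutoff
    score += 1 if crp_mg_l >= 3.0 else 0          # illustrative cutoff
    score += 1 if total_chol_mg_dl >= 240 else 0  # illustrative cutoff
    return ["lower", "moderate", "elevated", "high"][score]

# Invented example values pulled "from the report."
report = {"age": 54, "sex": "F", "crp_mg_l": 2.1, "total_chol_mg_dl": 215}
print("Estimated risk bucket:", risk_estimate(**report))
```
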
13:36
The last one I'll show you is a PSA test. Here's the before, and here's the after. Now a lot of our effort on this one -- as many of you probably know, a PSA test is a very controversial test. It's used to test for prostate cancer, but there are all sorts of reasons why your prostate might be enlarged. And so we spent a good deal of our time indicating that. We again personalized the risks. So this patient is in their 50s, so we can actually give them a very precise estimate of what their risk for prostate cancer is. In this case it's about 25 percent, based on that. And then again, the follow-up actions.

14:11
So our cost for this was less than 10,000 dollars, all right. That's what Wired magazine spent on this. Why is Wired magazine doing this? (Laughter) Quest Diagnostics and LabCorp, the two largest lab testing companies -- last year, they made profits of over 700 million dollars and over 500 million dollars respectively. Now this is not a problem of resources; this is a problem of incentives.

14:38
We need to recognize that the target of this information should not be the doctor, should not be the insurance company. It should be the patient. It's the person who actually, in the end, is going to be having to change their lives and then start adopting new behaviors. This is information that is incredibly powerful. It's an incredibly powerful catalyst to change. But we're not using it. It's just sitting there. It's being lost.

15:01
So I want to just offer four questions that every patient should ask, because I don't actually expect people to start developing these lab test reports. But you can create your own feedback loop. Anybody can create their feedback loop by asking these simple questions: Can I have my results? And the only acceptable answer is -- (Audience: Yes.) -- yes. What does this mean? Help me understand what the data is. What are my options? What choices are now on the table? And then, what's next? How do I integrate this information into the longer course of my life?

15:32
So I want to wind up by just showing that people have the capacity to understand this information. This is not beyond the grasp of ordinary people. You do not need to have the education level of people in this room. Ordinary people are capable of understanding this information, if we only go to the effort of presenting it to them in a form that they can engage with. And engagement is essential here, because it's not just giving them information; it's giving them an opportunity to act. That's what engagement is. It's different from compliance. It works totally different from the way we talk about behavior in medicine today. And this information is out there. I've been talking today about latent information, all this information that exists in the system that we're not putting to use. But there are all sorts of other bodies of information that are coming online, and we need to recognize the capacity of this information to engage people, to help people and to change the course of their lives. Thank you very much.

(Applause)
