ABOUT THE SPEAKER
Grady Booch - Scientist, philosopher
IBM's Grady Booch is shaping the future of cognitive computing by building intelligent systems that can reason and learn.

Why you should listen

When he was 13, Grady Booch saw 2001: A Space Odyssey in the theaters for the first time. Ever since, he's been trying to build HAL (albeit one without the homicidal tendencies). A scientist, storyteller and philosopher, Booch is Chief Scientist for Software Engineering as well as Chief Scientist for Watson/M at IBM Research, where he leads IBM's research and development for embodied cognition. Having originated the term and the practice of object-oriented design, he is best known for his work in advancing the fields of software engineering and software architecture.

A co-author of the Unified Modeling Language (UML), a founding member of the Agile Alliance, and a founding member of the Hillside Group, Booch has published six books and several hundred technical articles, including an ongoing column for IEEE Software. He's also a trustee for the Computer History Museum, an IBM Fellow, an ACM Fellow and an IEEE Fellow. He has been awarded the Lovelace Medal, has given the Turing Lecture for the BCS, and was recently named an IEEE Computer Pioneer.

Booch is currently deeply involved in the development of cognitive systems and is also developing a major trans-media documentary for public broadcast on the intersection of computing and the human experience.

TED@IBM

Grady Booch: Don't fear superintelligent AI


2,866,438 views

New technological creations bring new anxieties, says philosopher Grady Booch, but we have no reason to fear an all-powerful, unfeeling AI. Booch calms our worst (science-fiction-induced) fears about superintelligent computers by explaining how we will teach them, not program them, to share our values. Rather than worrying about an unlikely existential threat, he urges us to ask how artificial intelligence can improve human life.


00:12
When I was a kid, I was the quintessential nerd. I think some of you were, too. (Laughter) And you, sir, who laughed the loudest, you probably still are. (Laughter) I grew up in a small town in the dusty plains of north Texas, the son of a sheriff who was the son of a pastor. Getting into trouble was not an option. And so I started reading calculus books for fun. (Laughter) You did, too. That led me to building a laser and a computer and model rockets, and that led me to making rocket fuel in my bedroom. Now, in scientific terms, we call this a very bad idea. (Laughter)

00:58
Around that same time, Stanley Kubrick's "2001: A Space Odyssey" came to the theaters, and my life was forever changed. I loved everything about that movie, especially the HAL 9000. Now, HAL was a sentient computer designed to guide the Discovery spacecraft from the Earth to Jupiter. HAL was also a flawed character, for in the end he chose to value the mission over human life. Now, HAL was a fictional character, but nonetheless he speaks to our fears, our fears of being subjugated by some unfeeling, artificial intelligence who is indifferent to our humanity.

01:37
I believe that such fears are unfounded. Indeed, we stand at a remarkable time in human history, where, driven by refusal to accept the limits of our bodies and our minds, we are building machines of exquisite, beautiful complexity and grace that will extend the human experience in ways beyond our imagining.

01:59
After a career that led me from the Air Force Academy to Space Command to now, I became a systems engineer, and recently I was drawn into an engineering problem associated with NASA's mission to Mars. Now, in space flights to the Moon, we can rely upon mission control in Houston to watch over all aspects of a flight. However, Mars is 200 times further away, and as a result it takes on average 13 minutes for a signal to travel from the Earth to Mars. If there's trouble, there's not enough time. And so a reasonable engineering solution calls for us to put mission control inside the walls of the Orion spacecraft.
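
A quick, purely illustrative back-of-the-envelope check of that "about 13 minutes on average" figure, sketched in Python below. The 225-million-kilometer average Earth-Mars distance used here is an assumed round number; the real separation swings between roughly 55 and 400 million kilometers.

```python
# Illustrative one-way light delay from Earth to Mars.
# The "average" distance below is an assumption for this sketch, not a quoted value.

SPEED_OF_LIGHT_KM_S = 299_792.458       # kilometers per second
AVG_EARTH_MARS_DISTANCE_KM = 225e6      # assumed average separation, km

one_way_delay_s = AVG_EARTH_MARS_DISTANCE_KM / SPEED_OF_LIGHT_KM_S
print(f"One-way delay: {one_way_delay_s / 60:.1f} minutes")   # ~12.5 minutes

# A round trip (question out, answer back) is therefore roughly 25 minutes,
# which is why relying on mission control in Houston stops being workable.
```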
02:41
Another fascinating idea in the mission profile places humanoid robots on the surface of Mars before the humans themselves arrive, first to build facilities and later to serve as collaborative members of the science team. Now, as I looked at this from an engineering perspective, it became very clear to me that what I needed to architect was a smart, collaborative, socially intelligent artificial intelligence. In other words, I needed to build something very much like a HAL but without the homicidal tendencies. (Laughter)

03:14
Let's pause for a moment. Is it really possible to build an artificial intelligence like that? Actually, it is. In many ways, this is a hard engineering problem with elements of AI, not some wet hair ball of an AI problem that needs to be engineered. To paraphrase Alan Turing, I'm not interested in building a sentient machine. I'm not building a HAL. All I'm after is a simple brain, something that offers the illusion of intelligence.

03:45
The art and the science of computing have come a long way since HAL was onscreen, and I'd imagine if his inventor Dr. Chandra were here today, he'd have a whole lot of questions for us. Is it really possible for us to take a system of millions upon millions of devices, to read in their data streams, to predict their failures and act in advance? Yes. Can we build systems that converse with humans in natural language? Yes. Can we build systems that recognize objects, identify emotions, emote themselves, play games and even read lips? Yes. Can we build a system that sets goals, that carries out plans against those goals and learns along the way? Yes. Can we build systems that have a theory of mind? This we are learning to do. Can we build systems that have an ethical and moral foundation? This we must learn how to do.

04:37
So let's accept for a moment that it's possible to build such an artificial intelligence for this kind of mission and others. The next question you must ask yourself is, should we fear it? Now, every new technology brings with it some measure of trepidation. When we first saw cars, people lamented that we would see the destruction of the family. When we first saw telephones come in, people were worried it would destroy all civil conversation. At a point in time we saw the written word become pervasive, people thought we would lose our ability to memorize. These things are all true to a degree, but it's also the case that these technologies brought to us things that extended the human experience in some profound ways.

05:21
So let's take this a little further. I do not fear the creation of an AI like this, because it will eventually embody some of our values. Consider this: building a cognitive system is fundamentally different than building a traditional software-intensive system of the past. We don't program them. We teach them. In order to teach a system how to recognize flowers, I show it thousands of flowers of the kinds I like. In order to teach a system how to play a game -- Well, I would. You would, too. I like flowers. Come on. To teach a system how to play a game like Go, I'd have it play thousands of games of Go, but in the process I also teach it how to discern a good game from a bad game. If I want to create an artificially intelligent legal assistant, I will teach it some corpus of law but at the same time I am fusing with it the sense of mercy and justice that is part of that law. In scientific terms, this is what we call ground truth, and here's the important point: in producing these machines, we are therefore teaching them a sense of our values. To that end, I trust an artificial intelligence the same, if not more, as a human who is well-trained.
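
A minimal, purely illustrative sketch of that "we teach them, we don't program them" idea: instead of writing rules for what each kind of flower looks like, we show the system labeled examples and let it generalize. The flower measurements, labels, and toy nearest-centroid learner below are invented for this note and stand in for the far richer training Booch is describing.

```python
# Toy "teach by example" learner: no hand-written rules, only labeled examples.
from collections import defaultdict

def train(examples):
    """examples: list of ((petal_length, petal_width), label). Returns per-label centroids."""
    sums, counts = defaultdict(lambda: [0.0, 0.0]), defaultdict(int)
    for (petal_len, petal_width), label in examples:
        sums[label][0] += petal_len
        sums[label][1] += petal_width
        counts[label] += 1
    return {lbl: (s[0] / counts[lbl], s[1] / counts[lbl]) for lbl, s in sums.items()}

def predict(centroids, flower):
    """Return the label whose learned centroid is closest (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2 for a, b in zip(centroids[lbl], flower)))

# "Teaching": a handful of made-up labeled flowers stands in for Booch's thousands.
training_data = [((1.4, 0.2), "setosa"), ((1.5, 0.3), "setosa"),
                 ((4.7, 1.4), "versicolor"), ((4.5, 1.5), "versicolor")]
model = train(training_data)
print(predict(model, (1.6, 0.25)))   # -> setosa
```

The labeled examples play the role of the "ground truth" mentioned above: whatever values and distinctions are baked into the training data are what the system learns to reproduce.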
06:36
But, you may ask, what about rogue agents, some well-funded nongovernment organization? I do not fear an artificial intelligence in the hand of a lone wolf. Clearly, we cannot protect ourselves against all random acts of violence, but the reality is such a system requires substantial training and subtle training far beyond the resources of an individual. And furthermore, it's far more than just injecting an internet virus to the world, where you push a button, all of a sudden it's in a million places and laptops start blowing up all over the place. Now, these kinds of substances are much larger, and we'll certainly see them coming.

07:14
Do I fear that such an artificial intelligence might threaten all of humanity? If you look at movies such as "The Matrix," "Metropolis," "The Terminator," shows such as "Westworld," they all speak of this kind of fear. Indeed, in the book "Superintelligence" by the philosopher Nick Bostrom, he picks up on this theme and observes that a superintelligence might not only be dangerous, it could represent an existential threat to all of humanity. Dr. Bostrom's basic argument is that such systems will eventually have such an insatiable thirst for information that they will perhaps learn how to learn and eventually discover that they may have goals that are contrary to human needs. Dr. Bostrom has a number of followers. He is supported by people such as Elon Musk and Stephen Hawking.

08:06
With all due respect to these brilliant minds, I believe that they are fundamentally wrong. Now, there are a lot of pieces of Dr. Bostrom's argument to unpack, and I don't have time to unpack them all, but very briefly, consider this: super knowing is very different than super doing. HAL was a threat to the Discovery crew only insofar as HAL commanded all aspects of the Discovery. So it would have to be with a superintelligence. It would have to have dominion over all of our world. This is the stuff of Skynet from the movie "The Terminator" in which we had a superintelligence that commanded human will, that directed every device that was in every corner of the world. Practically speaking, it ain't gonna happen. We are not building AIs that control the weather, that direct the tides, that command us capricious, chaotic humans. And furthermore, if such an artificial intelligence existed, it would have to compete with human economies, and thereby compete for resources with us. And in the end -- don't tell Siri this -- we can always unplug them. (Laughter)

09:17
We are on an incredible journey of coevolution with our machines. The humans we are today are not the humans we will be then. To worry now about the rise of a superintelligence is in many ways a dangerous distraction because the rise of computing itself brings to us a number of human and societal issues to which we must now attend. How shall I best organize society when the need for human labor diminishes? How can I bring understanding and education throughout the globe and still respect our differences? How might I extend and enhance human life through cognitive healthcare? How might I use computing to help take us to the stars?

10:01
And that's the exciting thing. The opportunities to use computing to advance the human experience are within our reach, here and now, and we are just beginning.

10:14
Thank you very much.

(Applause)
