Eric Schmidt: Google | Lex Fridman Podcast #8

Eric Schmidt (https://karpathy.ai/lexicap/0008-large.html#00:14:05.900):
You have to have a bunch of belief systems, and one of them is that you have to have bottoms up and tops down. The bottoms up we call 20% time, and the idea is that people can spend 20% of their time on whatever they want. And the top down is that our founders in particular have a keen eye on technology, and they're reviewing things constantly. So an example would be: they'll hear about an idea, or I'll hear about something and it sounds interesting, let's go visit them. And then let's begin to assemble the pieces to see if that's possible. And if you do this long enough, you get pretty good at predicting what's likely to work.
Lex Fridman (https://karpathy.ai/lexicap/0008-large.html#00:14:39.780):
So that's a beautiful balance that's struck. Is this something that applies at all scales?
Eric Schmidt (https://karpathy.ai/lexicap/0008-large.html#00:14:44.420):
It seems to be. Sergey, again, 15 years ago, came up with the concept that 10% of the budget should be on things that are unrelated. It was called 70-20-10: 70% of our time on the core business, 20% on adjacent businesses, and 10% on other. And he proved mathematically, of course he's a brilliant mathematician, that you needed that 10% to make the sum of the growth work. And it turns out he was right.
Lex Fridman (https://karpathy.ai/lexicap/0008-large.html#00:15:18.620):
So, getting into the world of artificial intelligence: you've talked quite extensively, and effectively, about the near-term impact, the positive impact, of artificial intelligence, especially machine learning in medical applications and in education, and in just making information more accessible, right? In the AI community there is a kind of debate. There's this shroud of uncertainty as we face this new world with artificial intelligence in it. And there are some people, like Elon Musk, with whom you've disagreed, at least on the degree of emphasis he places on the existential threat of AI. I've spoken with Stuart Russell and Max Tegmark, who share Elon Musk's view, and with Yoshua Bengio and Steven Pinker, who do not. So there are a lot of very smart people thinking about this stuff and disagreeing, which is really healthy, of course. What do you think is the healthiest way for the AI community, and really for the general public, to think about AI and the concern of the technology being mismanaged in some way?
Eric Schmidt (https://karpathy.ai/lexicap/0008-large.html#00:16:32.920):
So the source of education for the general public has been robot killer movies.
Lex Fridman (https://karpathy.ai/lexicap/0008-large.html#00:16:37.380):
Right.
Eric Schmidt (https://karpathy.ai/lexicap/0008-large.html#00:16:38.220):
And Terminator, et cetera. And the one thing I can assure you we're not building are those kinds of solutions. Furthermore, if they were to show up, someone would notice and unplug them, right? So as exciting as those movies are, and they're great movies, were the killer robots to start, we would find a way to stop them. So I'm not concerned about that.

And much of this has to do with the timeframe of the conversation. You can imagine a situation 100 years from now, when the human brain is fully understood and the next generation and the next generation of brilliant MIT scientists have figured all this out, when we're going to have a large number of ethics questions around science and thinking and robots and computers and so forth and so on. So it depends on the question of the timeframe. In the next five to ten years, we're not facing those questions. What we're facing in the next five to ten years is: how do we spread this disruptive technology as broadly as possible to gain the maximum benefit of it?

The primary benefits should be in healthcare and in education. Healthcare because it's obvious: we're all the same, even though we somehow believe we're not. As a medical matter, the fact that we have big data about our health will save lives, allowing us to deal with skin cancer and other cancers, ophthalmological problems. There are people working on psychological diseases and so forth using these techniques. I can go on and on. The promise of AI in medicine is extraordinary. There are many, many companies and startups and funds and solutions, and we will all live much better for that.

The same argument in education. Can you imagine that for each generation of child, and even adult, you have a tutor educator that's AI-based,