episode
stringlengths
45
100
text
stringlengths
1
528
timestamp_link
stringlengths
56
56
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
maybe half or so of AI researchers
https://karpathy.ai/lexicap/0001-large.html#01:09:46.840
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
think we're going to get AGI within decades.
https://karpathy.ai/lexicap/0001-large.html#01:09:48.640
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So if that happens, of course,
https://karpathy.ai/lexicap/0001-large.html#01:09:50.920
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
then I think these things are all possible.
https://karpathy.ai/lexicap/0001-large.html#01:09:52.720
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
But in terms of whether it will happen,
https://karpathy.ai/lexicap/0001-large.html#01:09:55.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I think we shouldn't spend so much time asking
https://karpathy.ai/lexicap/0001-large.html#01:09:56.840
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
what do we think will happen in the future?
https://karpathy.ai/lexicap/0001-large.html#01:10:00.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
As if we are just some sort of pathetic,
https://karpathy.ai/lexicap/0001-large.html#01:10:03.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
your passive bystanders, you know,
https://karpathy.ai/lexicap/0001-large.html#01:10:05.160
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
waiting for the future to happen to us.
https://karpathy.ai/lexicap/0001-large.html#01:10:07.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Hey, we're the ones creating this future, right?
https://karpathy.ai/lexicap/0001-large.html#01:10:09.280
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So we should be proactive about it
https://karpathy.ai/lexicap/0001-large.html#01:10:11.640
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and ask ourselves what sort of future
https://karpathy.ai/lexicap/0001-large.html#01:10:15.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
we would like to have happen.
https://karpathy.ai/lexicap/0001-large.html#01:10:16.920
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
We're going to make it like that.
https://karpathy.ai/lexicap/0001-large.html#01:10:18.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Well, what I prefer is just some sort of incredibly boring,
https://karpathy.ai/lexicap/0001-large.html#01:10:19.920
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
zombie like future where there's all these
https://karpathy.ai/lexicap/0001-large.html#01:10:22.720
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
mechanical things happening and there's no passion,
https://karpathy.ai/lexicap/0001-large.html#01:10:24.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
no emotion, no experience, maybe even.
https://karpathy.ai/lexicap/0001-large.html#01:10:26.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
No, I would of course, much rather prefer it
https://karpathy.ai/lexicap/0001-large.html#01:10:29.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
if all the things that we find that we value the most
https://karpathy.ai/lexicap/0001-large.html#01:10:32.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
about humanity are our subjective experience,
https://karpathy.ai/lexicap/0001-large.html#01:10:36.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
passion, inspiration, love, you know.
https://karpathy.ai/lexicap/0001-large.html#01:10:40.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
If we can create a future where those things do happen,
https://karpathy.ai/lexicap/0001-large.html#01:10:43.000
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
where those things do exist, you know,
https://karpathy.ai/lexicap/0001-large.html#01:10:48.000
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I think ultimately it's not our universe
https://karpathy.ai/lexicap/0001-large.html#01:10:50.840
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
giving meaning to us, it's us giving meaning to our universe.
https://karpathy.ai/lexicap/0001-large.html#01:10:54.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And if we build more advanced intelligence,
https://karpathy.ai/lexicap/0001-large.html#01:10:57.960
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
let's make sure we build it in such a way
https://karpathy.ai/lexicap/0001-large.html#01:11:01.840
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that meaning is part of it.
https://karpathy.ai/lexicap/0001-large.html#01:11:03.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
A lot of people that seriously study this problem
https://karpathy.ai/lexicap/0001-large.html#01:11:09.120
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and think of it from different angles
https://karpathy.ai/lexicap/0001-large.html#01:11:11.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
have trouble in the majority of cases,
https://karpathy.ai/lexicap/0001-large.html#01:11:13.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
if they think through that happen,
https://karpathy.ai/lexicap/0001-large.html#01:11:16.880
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
are the ones that are not beneficial to humanity.
https://karpathy.ai/lexicap/0001-large.html#01:11:19.160
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And so, yeah, so what are your thoughts?
https://karpathy.ai/lexicap/0001-large.html#01:11:22.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
What's should people, you know,
https://karpathy.ai/lexicap/0001-large.html#01:11:25.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I really don't like people to be terrified.
https://karpathy.ai/lexicap/0001-large.html#01:11:29.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
What's a way for people to think about it
https://karpathy.ai/lexicap/0001-large.html#01:11:33.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
in a way we can solve it and we can make it better?
https://karpathy.ai/lexicap/0001-large.html#01:11:35.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
No, I don't think panicking is going to help in any way.
https://karpathy.ai/lexicap/0001-large.html#01:11:39.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
It's not going to increase chances
https://karpathy.ai/lexicap/0001-large.html#01:11:42.960
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
of things going well either.
https://karpathy.ai/lexicap/0001-large.html#01:11:44.840
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Even if you are in a situation where there is a real threat,
https://karpathy.ai/lexicap/0001-large.html#01:11:45.880
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
does it help if everybody just freaks out?
https://karpathy.ai/lexicap/0001-large.html#01:11:48.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
No, of course, of course not.
https://karpathy.ai/lexicap/0001-large.html#01:11:51.080
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I think, yeah, there are of course ways
https://karpathy.ai/lexicap/0001-large.html#01:11:53.640
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
in which things can go horribly wrong.
https://karpathy.ai/lexicap/0001-large.html#01:11:56.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
First of all, it's important when we think about this thing,
https://karpathy.ai/lexicap/0001-large.html#01:11:59.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
about the problems and risks,
https://karpathy.ai/lexicap/0001-large.html#01:12:03.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
to also remember how huge the upsides can be
https://karpathy.ai/lexicap/0001-large.html#01:12:05.280
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
if we get it right, right?
https://karpathy.ai/lexicap/0001-large.html#01:12:07.160
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Everything we love about society and civilization
https://karpathy.ai/lexicap/0001-large.html#01:12:08.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
is a product of intelligence.
https://karpathy.ai/lexicap/0001-large.html#01:12:12.360
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So if we can amplify our intelligence
https://karpathy.ai/lexicap/0001-large.html#01:12:13.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
with machine intelligence and not anymore lose our loved one
https://karpathy.ai/lexicap/0001-large.html#01:12:15.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
to what we're told is an incurable disease
https://karpathy.ai/lexicap/0001-large.html#01:12:18.760
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and things like this, of course, we should aspire to that.
https://karpathy.ai/lexicap/0001-large.html#01:12:21.080
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So that can be a motivator, I think,
https://karpathy.ai/lexicap/0001-large.html#01:12:24.800
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
reminding ourselves that the reason we try to solve problems
https://karpathy.ai/lexicap/0001-large.html#01:12:26.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
is not just because we're trying to avoid gloom,
https://karpathy.ai/lexicap/0001-large.html#01:12:29.120
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
but because we're trying to do something great.
https://karpathy.ai/lexicap/0001-large.html#01:12:33.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
But then in terms of the risks,
https://karpathy.ai/lexicap/0001-large.html#01:12:35.760
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I think the really important question is to ask,
https://karpathy.ai/lexicap/0001-large.html#01:12:37.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
what can we do today that will actually help
https://karpathy.ai/lexicap/0001-large.html#01:12:42.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
make the outcome good, right?
https://karpathy.ai/lexicap/0001-large.html#01:12:45.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And dismissing the risk is not one of them.
https://karpathy.ai/lexicap/0001-large.html#01:12:47.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I find it quite funny often when I'm in discussion panels
https://karpathy.ai/lexicap/0001-large.html#01:12:51.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
about these things,
https://karpathy.ai/lexicap/0001-large.html#01:12:54.800
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
how the people who work for companies,
https://karpathy.ai/lexicap/0001-large.html#01:12:55.960
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
always be like, oh, nothing to worry about,
https://karpathy.ai/lexicap/0001-large.html#01:13:01.200
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
nothing to worry about, nothing to worry about.
https://karpathy.ai/lexicap/0001-large.html#01:13:03.120
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And it's only academics sometimes express concerns.
https://karpathy.ai/lexicap/0001-large.html#01:13:04.760
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
That's not surprising at all if you think about it.
https://karpathy.ai/lexicap/0001-large.html#01:13:09.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Right.
https://karpathy.ai/lexicap/0001-large.html#01:13:11.880
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Upton Sinclair quipped, right,
https://karpathy.ai/lexicap/0001-large.html#01:13:12.880
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that it's hard to make a man believe in something
https://karpathy.ai/lexicap/0001-large.html#01:13:15.200
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
when his income depends on not believing in it.
https://karpathy.ai/lexicap/0001-large.html#01:13:18.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And frankly, we know a lot of these people in companies
https://karpathy.ai/lexicap/0001-large.html#01:13:20.120
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that they're just as concerned as anyone else.
https://karpathy.ai/lexicap/0001-large.html#01:13:24.080
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
But if you're the CEO of a company,
https://karpathy.ai/lexicap/0001-large.html#01:13:26.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that's not something you want to go on record saying
https://karpathy.ai/lexicap/0001-large.html#01:13:28.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
when you have silly journalists who are gonna put a picture
https://karpathy.ai/lexicap/0001-large.html#01:13:30.280
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
of a Terminator robot when they quote you.
https://karpathy.ai/lexicap/0001-large.html#01:13:33.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So the issues are real.
https://karpathy.ai/lexicap/0001-large.html#01:13:35.720
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And the way I think about what the issue is,
https://karpathy.ai/lexicap/0001-large.html#01:13:39.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
is basically the real choice we have is,
https://karpathy.ai/lexicap/0001-large.html#01:13:41.920
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
first of all, are we gonna just dismiss the risks
https://karpathy.ai/lexicap/0001-large.html#01:13:48.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and say, well, let's just go ahead and build machines
https://karpathy.ai/lexicap/0001-large.html#01:13:50.840
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that can do everything we can do better and cheaper.
https://karpathy.ai/lexicap/0001-large.html#01:13:54.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Let's just make ourselves obsolete as fast as possible.
https://karpathy.ai/lexicap/0001-large.html#01:13:57.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
What could possibly go wrong?
https://karpathy.ai/lexicap/0001-large.html#01:14:00.200
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
That's one attitude.
https://karpathy.ai/lexicap/0001-large.html#01:14:01.720
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
The opposite attitude, I think, is to say,
https://karpathy.ai/lexicap/0001-large.html#01:14:03.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
here's this incredible potential,
https://karpathy.ai/lexicap/0001-large.html#01:14:06.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
let's think about what kind of future
https://karpathy.ai/lexicap/0001-large.html#01:14:08.800
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
we're really, really excited about.
https://karpathy.ai/lexicap/0001-large.html#01:14:11.960
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
What are the shared goals that we can really aspire towards?
https://karpathy.ai/lexicap/0001-large.html#01:14:14.640
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And then let's think really hard
https://karpathy.ai/lexicap/0001-large.html#01:14:18.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
about how we can actually get there.
https://karpathy.ai/lexicap/0001-large.html#01:14:19.960