Max Tegmark: Life 3.0 | Lex Fridman Podcast #1

https://karpathy.ai/lexicap/0001-large.html#00:47:31.000
wants it to never under any circumstances fly into a building or a mountain. Yet the September 11 hijackers were able to do that. And even more embarrassingly, Andreas Lubitz, this depressed Germanwings pilot, when he flew his passenger jet into the Alps killing over 100 people, he just told the autopilot to do it. He told the freaking computer to change the altitude to 100 meters. And even though it had the GPS maps, everything, the computer was like, OK.

https://karpathy.ai/lexicap/0001-large.html#00:48:00.640
So we should take those very basic values, where the problem is not that we don't agree. The problem is just that we've been too lazy to try to put it into our machines and make sure that from now on, airplanes, which all have computers in them, will just refuse to do something like that. Go into safe mode, maybe lock the cockpit door, go over to the nearest airport.

https://karpathy.ai/lexicap/0001-large.html#00:48:24.480
And there's so much other technology in our world as well now, where it's really becoming quite timely to put in some sort of very basic values like this. Even in cars, we've had enough vehicle terrorism attacks by now, where people have driven trucks and vans into pedestrians, that it's not at all a crazy idea to just have that hardwired into the car. Because there are always going to be people who for some reason want to harm others, but most of those people don't have the technical expertise to figure out how to work around something like that. So if the car just won't do it, it helps. So let's start there.
https://karpathy.ai/lexicap/0001-large.html#00:49:02.840
That's a great point. So, not chasing perfect, there's a lot of things that most of the world agrees on. Yeah, let's start there.

https://karpathy.ai/lexicap/0001-large.html#00:49:11.840
Let's start there. And then once we start there, we'll also get into the habit of having these kinds of conversations about, okay, what else should we put in here, and have these discussions. This should be a gradual process then.

https://karpathy.ai/lexicap/0001-large.html#00:49:23.920
Great. But that also means describing these things, and describing them to a machine.
https://karpathy.ai/lexicap/0001-large.html#00:49:31.240
So one thing, we had a few conversations with Stephen Wolfram. I'm not sure if you're familiar with Stephen.

https://karpathy.ai/lexicap/0001-large.html#00:49:37.080
Oh yeah, I know him quite well.

https://karpathy.ai/lexicap/0001-large.html#00:49:38.360
So he works with a bunch of things, but cellular automata, these simple computable things, these computation systems. And he kind of mentioned that we probably already have, within these systems, something that's AGI, meaning we just don't know it because we can't talk to it. So, if you give me this chance to try to at least form a question out of this: I think it's an interesting idea to think that we can have intelligent systems, but we don't know how to describe something to them, and they can't communicate with us. I know you're doing a little bit of work in explainable AI, trying to get AI to explain itself. So what are your thoughts on natural language processing, or some kind of other communication? How does the AI explain something to us? How do we explain something to it, to machines? Or do you think of it differently?
So there are two separate parts to your question there.
https://karpathy.ai/lexicap/0001-large.html#00:50:35.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
One of them has to do with communication,
https://karpathy.ai/lexicap/0001-large.html#00:50:39.960
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
which is super interesting, I'll get to that in a sec.
https://karpathy.ai/lexicap/0001-large.html#00:50:42.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
The other is whether we already have AGI
https://karpathy.ai/lexicap/0001-large.html#00:50:44.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
but we just haven't noticed it there.
https://karpathy.ai/lexicap/0001-large.html#00:50:47.280
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Right.
https://karpathy.ai/lexicap/0001-large.html#00:50:49.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
There I beg to differ.
https://karpathy.ai/lexicap/0001-large.html#00:50:51.800
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I don't think there's anything in any cellular automaton
https://karpathy.ai/lexicap/0001-large.html#00:50:54.280
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
or anything or the internet itself or whatever
https://karpathy.ai/lexicap/0001-large.html#00:50:56.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that has artificial general intelligence
https://karpathy.ai/lexicap/0001-large.html#00:50:59.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and that it can really do exactly everything
https://karpathy.ai/lexicap/0001-large.html#00:51:03.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
we humans can do better.
https://karpathy.ai/lexicap/0001-large.html#00:51:05.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I think the day that happens, when that happens,
https://karpathy.ai/lexicap/0001-large.html#00:51:07.000
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
we will very soon notice, we'll probably notice even before
https://karpathy.ai/lexicap/0001-large.html#00:51:11.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
because in a very, very big way.
https://karpathy.ai/lexicap/0001-large.html#00:51:15.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
But for the second part, though.
https://karpathy.ai/lexicap/0001-large.html#00:51:17.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Wait, can I ask, sorry.
https://karpathy.ai/lexicap/0001-large.html#00:51:18.840
https://karpathy.ai/lexicap/0001-large.html#00:51:20.720
So, because you have this beautiful way of formulating consciousness as information processing, and you can think of intelligence as information processing, and you can think of the entire universe as these particles and these systems roaming around that have this information processing power. You don't think there is something with the power to process information in the way that we human beings do that's out there that needs to be sort of connected to? It seems a little bit philosophical, perhaps, but there's something compelling to the idea