Vladimir Vapnik: Statistical Learning | Lex Fridman Podcast #5
https://karpathy.ai/lexicap/0005-large.html#00:04:55.640

Where does it sit? And does math, for you, have limits in what it can describe?

Some people say that math is a language which God uses.

Speaks to God, or uses God?

Uses God. Yeah.
So I believe that what this article about the unreasonable effectiveness of mathematics says is that if you look at mathematical structures, they know something about reality. And most scientists in the natural sciences look at equations and try to understand reality. It is the same in machine learning: if you look very carefully at all the equations which define conditional probability, you can understand something about reality, more than from your fantasy.
So math can reveal the simple underlying principles of reality, perhaps.
You know what "simple" means? They are very hard to discover. But when you discover them and look at them, you see how beautiful they are, and it is surprising why people did not see that before. You look at the equations and derive it from the equations.
For example, I talked yesterday about the least squares method. People had a lot of fantasies about how to improve the least squares method. But if you go step by step, solving some equations, you suddenly get a term which, after some thinking, you understand describes the position of the observation points. In the least squares method we throw out a lot of information: we do not look at the configuration of the observation points, we look only at the residuals. But when you understand that... It is a very simple idea, but it is not too simple to come to. And you can derive it just from the equations.
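A minimal sketch of the point above, under an assumed standard one-dimensional setup (the data, variable names, and fitting routine here are illustrative, not from the conversation): ordinary least squares minimizes only the sum of squared residuals, so the objective sees the observation points x_i only through the predictions, never through their configuration.

```python
# Illustrative sketch: ordinary least squares fits y ~ a*x + b by
# minimizing the sum of squared residuals r_i = y_i - (a*x_i + b).
# Note that the loss below is computed from the residuals alone;
# the configuration of the observation points x_i appears nowhere else.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)                 # positions of observation points
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=50)  # noisy linear data

# Least squares solution of the overdetermined system A @ [a, b] = y
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

residuals = y - (a * x + b)
loss = np.sum(residuals ** 2)                    # the only quantity OLS minimizes
print(f"a = {a:.2f}, b = {b:.2f}, residual sum of squares = {loss:.2f}")
```

The recovered coefficients land near the generating values (a near 2, b near 1), which is exactly what "looking only at residuals" buys you; anything encoded in where the x_i sit is discarded by this objective.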
So some simple algebra, a few steps, will take you to something surprising that, when you think about it, you understand.

And that is proof that human intuition is not too rich, and is very primitive, and it does not see very simple situations.
So let me take a step back. In general, yes. But what about human ingenuity, as opposed to intuition? Moments of brilliance. Do you have to be so hard on human intuition? Are there moments of brilliance in human intuition that can leap ahead of math, and then the math will catch up?
I don't think so. I think the best human intuition is put into the axioms, and after that it is technical: see where the axioms take you.

But only if they choose the axioms correctly.

But the axioms are polished over generations of scientists, and this is integral wisdom.
That is beautifully put. But when you think of Einstein and special relativity, what is the role of imagination coming first there, in the moment of discovery of an idea? There is obviously a mix of math and out-of-the-box imagination there.
That I don't know. Whatever I did, I excluded any imagination, because whatever I saw in machine learning that comes from imagination, like features, like deep learning, is not relevant to the problem. When you look very carefully at the mathematical equations, you derive a very simple theory which goes theoretically far beyond whatever people can imagine, because fantasy is not good. It is just interpretation, it is just fantasy, but it is not what you need. You don't need any imagination to derive the main principle of machine learning.
When you think about learning and intelligence, maybe thinking about the human brain and trying to describe mathematically the process of learning, something like what happens in the human brain: do you think we currently have the tools, or will ever have the tools, to try to describe that process of learning?
It is not a description of what is going on; it is an interpretation. It is your interpretation, and your vision can be wrong.
You know, one man, Leeuwenhoek, invented the microscope for the first time. Only he had this instrument, and he kept the microscope secret, but he wrote reports to the Academy of Science in London. He looked at everything: at water, at blood, at sperm. But in his report he described blood like a fight between a queen and a king. He saw blood cells, red cells, and he imagined that it was armies fighting each other. That was his interpretation of the situation, and he sent this report to the Academy of Science. They looked very carefully, because they believed that he was right.

He saw something, yes.

But he gave the wrong interpretation.
And I believe the same can happen with the brain.

With the brain, yeah. The most important part.

You know, I believe in human language. In some proverbs there is so much wisdom. For example, people say that one day with a great teacher is better than a thousand days of diligent study.