video_id: rLlZpnT02ZU
And if you've heard of KL, you've probably heard of entropy. And that's what-- it's basically minus the entropy. And that's a quantity that just depends on theta star. But it's just a number. I could compute this number if I told you this is, say, N(theta star, 1). You could compute this.
So now I'm going to try to minimize the estimate of this function. And minimizing a function or a function plus a constant is the same thing. I'm just shifting the function here or here, but it's the same minimizer. OK. So the function that maps theta to KL of P theta star to P theta is of the form constant minus this expectation of a log of P theta.
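Written out explicitly (a sketch, assuming a family of densities p_theta and X drawn from P_theta star), the decomposition just described is:

```latex
\mathrm{KL}(P_{\theta^*}, P_\theta)
  = \mathbb{E}_{\theta^*}\!\left[\log\frac{p_{\theta^*}(X)}{p_\theta(X)}\right]
  = \underbrace{\mathbb{E}_{\theta^*}\!\left[\log p_{\theta^*}(X)\right]}_{\text{constant in }\theta}
    \;-\; \mathbb{E}_{\theta^*}\!\left[\log p_\theta(X)\right]
```

The first term is minus the entropy of P theta star: it depends only on theta star, not on theta, which is the "constant" in question.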
Everybody agrees? Are there any questions about this? Are there any remarks, including "I have no idea what's happening right now"? OK. We're good? Yeah.
AUDIENCE: So when you're actually employing this method, how do you know which theta to use as theta star and which isn't?
PHILIPPE RIGOLLET: So this is not a method just yet, right? I'm just describing to you what the KL divergence between two distributions is. If you really wanted to compute it, you would need to know what P theta star is and what P theta is.
AUDIENCE: Right.
PHILIPPE RIGOLLET: And so here, I'm just saying at some point, we still-- so here, you see-- so now let's move on to one step. I don't know the expectation with respect to theta star. But I have data that comes from the distribution P theta star. So the expectation, by the law of large numbers, should be close to the average. And so what I'm doing is I'm replacing any-- I can actually-- this is a very standard estimation method. You write something as an expectation, with respect to the data-generating process, of some function. And then you replace this by the average of this function. And the law of large numbers tells me that those two quantities should actually be close.
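This plug-in step can be sketched numerically. The following is an illustration only, assuming the N(theta star, 1) example from earlier; all names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: X ~ N(theta_star, 1); the function inside the
# expectation is log p_theta(X) for a unit-variance Gaussian.
theta_star, theta = 2.0, 1.5
x = rng.normal(theta_star, 1.0, size=100_000)

def log_p(theta, x):
    # Log density of N(theta, 1) evaluated at x.
    return -0.5 * np.log(2 * np.pi) - 0.5 * (x - theta) ** 2

# Plug-in step: replace E_{theta*}[log p_theta(X)] by the sample average.
estimate = log_p(theta, x).mean()

# Exact expectation, since E[(X - theta)^2] = 1 + (theta_star - theta)^2.
exact = -0.5 * np.log(2 * np.pi) - 0.5 * (1 + (theta_star - theta) ** 2)

print(estimate, exact)  # close, by the law of large numbers
```

The average uses only the data, never theta star itself, which is the whole point of the substitution.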
Now, it doesn't mean that's going to be the end of the day, right? When we did Xn bar, that was the end of the day. We had an expectation. We replaced it by an average. And then we were done. But here, we still have to do something, because this is not telling me what theta is. Now I still have to minimize this average.
So this is now my candidate estimator for KL, KL hat. And that's the one where I said, well, it's going to be of the form of a constant. And this constant, I don't know. You're right. I have no idea what this constant is. It depends on P theta star. But then I have minus something that I can completely compute. If you give me data and theta, I can compute this entire thing.
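A minimal sketch of minimizing the computable part of KL hat, again assuming the hypothetical N(theta, 1) family (where minus the average log-likelihood has a closed form):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: data from N(theta_star, 1); the estimator never
# uses theta_star, only the sample.
theta_star = 2.0
x = rng.normal(theta_star, 1.0, size=10_000)

# The computable part of KL hat: minus the average of log p_theta(x_i).
# The unknown constant, which depends on P_theta_star, is simply dropped.
def neg_avg_log_lik(theta):
    return 0.5 * np.log(2 * np.pi) + 0.5 * np.mean((x - theta) ** 2)

# Minimize over a grid of candidate values of theta.
grid = np.linspace(0.0, 4.0, 4001)
theta_hat = grid[np.argmin([neg_avg_log_lik(t) for t in grid])]

print(theta_hat)   # close to theta_star
print(x.mean())    # for N(theta, 1) the exact minimizer is the sample mean
```

For this family the minimizer is the sample mean, so the grid search just recovers it up to the grid spacing.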
And now what I claim is that the minimizer of f or f plus-- f of X or f of X plus 4 are the same thing, or say 4 plus f of X. I'm just shifting the plot of my function up and down, but the minimizer stays exactly where it is. If I have a function-- so now I have a function of theta. This is KL hat of P theta star, P theta. And it's of the form-- it's a function like this. I don't know where this function is. It might very well be this function or this function. Every time it's a translation on the y-axis of all these guys. And the value that I translated by depends on theta star. I don't know what it is. But what I claim is that the minimizer is always this guy, regardless of what the value is. OK?
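The claim that vertical shifts leave the minimizer unchanged can be checked numerically (a toy sketch; the quadratic f and the shift values are arbitrary choices):

```python
import numpy as np

# Toy check: shifting a function up or down does not move its argmin.
grid = np.linspace(-3.0, 3.0, 601)   # candidate thetas
f = (grid - 1.0) ** 2                # any function; its minimum is at 1.0

for c in (0.0, 4.0, 25.0):           # the unknown constant's value is irrelevant
    print(grid[np.argmin(f + c)])    # same minimizer every time, ~1.0
```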
So when I say constant, it's a constant with respect to theta. It's an unknown constant. But it's constant with respect to theta, so without loss of generality, I can assume that this constant is 0 for my purposes, or 25 if you prefer. All right. So we'll just keep going on this property next time. And we'll see how, from here, we can move on to-- the likelihood is actually going to come out of this formula. Thanks.