video_id: rLlZpnT02ZU
The KL divergence is non-negative. Who knows Jensen's inequality here? That should be a subset of the people who raised their hand when I asked what a convex function is. All right. So you know what Jensen's inequality is. The proof is just a one-step application of Jensen's inequality, which we will not go into in detail. But that's basically an inequality comparing the expectation of a convex function of a random variable to the convex function of the expectation of that random variable. If you know Jensen, have fun and prove it.
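For concreteness, here is a sketch of the one-step argument alluded to above, in my own notation (not taken verbatim from the slides), using Jensen's inequality in the form $g(\mathbb{E}[Z]) \le \mathbb{E}[g(Z)]$ for a convex function $g$:

```latex
% Apply Jensen's inequality with the convex function g(z) = -log(z)
% and the random variable Z = p_theta(X) / p_{theta*}(X), where X ~ P_{theta*}.
\[
\mathrm{KL}(P_{\theta^*}, P_\theta)
  = \mathbb{E}_{\theta^*}\!\left[-\log \frac{p_\theta(X)}{p_{\theta^*}(X)}\right]
  \;\ge\; -\log \mathbb{E}_{\theta^*}\!\left[\frac{p_\theta(X)}{p_{\theta^*}(X)}\right]
  = -\log \int p_\theta(x)\,dx
  = -\log 1
  = 0 .
\]
```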
What's really nice is that if the KL is equal to 0, then the two distributions are the same. And that's something we're looking for. Everything else we're happy to throw out. And actually, if you pay attention, we're actually really throwing out everything else.
So they're not symmetric. It does not satisfy the triangle inequality in general. But it's non-negative, and it's 0 if and only if the two distributions are the same. And that's all we care about. And that's what we call a divergence rather than a distance, and a divergence will be enough for our purposes.
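To make those properties concrete, here is a small numerical check with Bernoulli distributions (my own illustration; the helper `bernoulli_kl` is not from the lecture):

```python
import math

def bernoulli_kl(p, q):
    """Compute KL(Bern(p), Bern(q)) in nats, for p, q strictly between 0 and 1."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

# Not symmetric: swapping the two arguments changes the value.
print(bernoulli_kl(0.1, 0.5))  # ~0.368
print(bernoulli_kl(0.5, 0.1))  # ~0.511

# Non-negative, and 0 exactly when the two distributions coincide.
print(bernoulli_kl(0.3, 0.3))  # 0.0
```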
And actually, this asymmetry, the fact that the value changes when you flip the two arguments-- the first time I saw it, I was just annoyed. I was like, can we just, I don't know, take the average of the KL between P theta and P theta prime and the KL between P theta prime and P theta? You would think maybe you could do this.
You just symmetrize it by taking the average of the two possible values it can take. The problem is that this will still not satisfy the triangle inequality. And there's basically no way to turn it into something that is a distance.
But the divergence is doing a pretty good thing for us. And this is what will allow us to estimate it and basically overcome what we could not do with the total variation.
So the first thing that you want to notice is the total variation distance-- the KL divergence, sorry, is actually an expectation of something. Look at what it is here. It's the integral of some function against a density. That's exactly the definition of an expectation, right? So this is the expectation of this particular function with respect to this density f. So in particular, if I call this density f-- if I say I want the true distribution to be the first argument, this is an expectation, with respect to the true distribution from which my data is actually drawn, of the log of this ratio.
So ha ha. I'm a statistician. Now I have an expectation. I can replace it by an average, because I have data from this distribution. And I could actually replace the expectation by an average and try to minimize here.
The problem is that-- actually, the star here should be in front of the theta, not of the P, right? That's P theta star, not P star theta. But here, I still cannot compute it, because I have this P theta star that shows up. I don't know what it is.
And that's now where the log plays a role. If you actually pay attention, I said you can use Jensen to prove all this stuff. You could actually replace the log by any concave function. That's called an f-divergence. But the log itself has a very, very specific property, which allows us to say that the log of the ratio is the difference of the logs.
Now, this thing here does not depend on theta. If I think of this KL divergence as a function of theta, then the first part is actually a constant. If I change theta, this thing is never going to change. It depends only on theta star.
So if I look at this function KL-- so if I look at the function theta maps to KL of P theta star, P theta, it's of the form expectation with respect to theta star of log of P theta star of X, and then I have minus expectation with respect to theta star of log of P theta of X. Now, as I said, this thing here, this second expectation, is a function of theta. When theta changes, this thing is going to change. And that's a good thing. We want something that reflects how close theta and theta star are. But this thing is not going to change. This is a fixed value. Actually, it's the negative entropy of P theta star.
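Putting the pieces together (my paraphrase of where the argument is heading, not a verbatim statement from the lecture): because the first term is a constant in $\theta$, minimizing the KL divergence over $\theta$ only involves the second term, and that second term is exactly the expectation we can replace by an average.

```latex
\[
\operatorname*{argmin}_{\theta}\; \mathrm{KL}(P_{\theta^*}, P_\theta)
  = \operatorname*{argmax}_{\theta}\; \mathbb{E}_{\theta^*}\!\left[\log p_\theta(X)\right]
  \;\approx\;
  \operatorname*{argmax}_{\theta}\; \frac{1}{n}\sum_{i=1}^{n} \log p_\theta(X_i).
\]
```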