PHILIPPE RIGOLLET: So that's not Slutsky, right?

AUDIENCE: That's [INAUDIBLE].

PHILIPPE RIGOLLET: So Slutsky tells you that if you-- Slutsky's about combining two types of convergence. So Slutsky tells you that if you actually have one Xn that converges to X in distribution and Yn that converges to Y in probability, then you can actually multiply Xn and Yn and get that the limit in distribution is the product of X and Y, where Y is now a constant. And here we have the constant, which is 1. But I did that already, right? Using Slutsky to replace it for the-- to replace p by Xn bar, we've done that last time, maybe a couple of times ago, actually. Yeah.
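The statement above is easy to check numerically. Here is a minimal simulation sketch (my own illustration, not from the lecture): Xn is the CLT-standardized mean of Bernoulli(p) samples, and Yn is the variance-ratio term that tends to 1 in probability; Slutsky says the product still converges in distribution to a standard normal.

```python
import numpy as np

# Slutsky's theorem: if Xn -> X in distribution and Yn -> c (a constant)
# in probability, then Xn * Yn -> c * X in distribution.
rng = np.random.default_rng(0)
p, n, reps = 0.3, 2_000, 2_000

samples = rng.binomial(1, p, size=(reps, n))
pbar = samples.mean(axis=1)

# Xn: standardized mean, -> N(0, 1) by the central limit theorem.
xn = np.sqrt(n) * (pbar - p) / np.sqrt(p * (1 - p))
# Yn: ratio of true to estimated standard deviation, -> 1 in probability.
yn = np.sqrt(p * (1 - p)) / np.sqrt(pbar * (1 - pbar))

product = xn * yn  # by Slutsky, still approximately N(0, 1)
print(product.mean(), product.std())  # both should be close to (0, 1)
```

This is exactly the substitution used in class: replacing the true p in the variance by Xn bar costs nothing in the limit.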
AUDIENCE: So I guess these statements are [INAUDIBLE].

PHILIPPE RIGOLLET: That's correct.

AUDIENCE: So could we like figure out [INAUDIBLE] can we set a finite [INAUDIBLE]?
PHILIPPE RIGOLLET: So of course, the short answer is no. So here's how you would go about thinking about which method is better. So there's always the more conservative method. With the first one, the only thing you're losing is the rate of convergence of the central limit theorem. So if n is large enough so that the central limit theorem approximation is very good, then that's all you're going to be losing. Of course, the price you pay is that your confidence interval is wider than it would be if you were to use Slutsky for this particular problem, typically wider. Actually, it is always wider, because Xn bar times 1 minus Xn bar is always at most 1/4 as well. And so that's the first thing you-- so Slutsky basically means you're relying on the asymptotics again. Now of course, you don't want to be conservative, because you actually want to squeeze as much from your data as you can. So it depends on how comfortable you are and how critical it is for you to put valid error bars. If they're valid in the asymptotics, then maybe you're actually going to go with Slutsky, since it gives you slightly narrower confidence intervals, and so you feel like you have a more precise answer. Now, if you really need to be super-conservative, then you're actually going to go with the 1/4 bound on p times 1 minus p. Actually, if you need to be even more conservative, you are going to go with Hoeffding's, so you don't even have to rely on the asymptotics at all. But then your confidence interval becomes twice as wide, and it becomes wider and wider as you go. So it depends on-- I mean, there's a lot of the work in statistics which is gauging how critical it is for you to output valid error bounds, or if they're really just here to be indicative of the precision of the estimator you gave, from a more qualitative perspective.
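The ordering described above can be made concrete with a rough sketch (the numbers n = 100 and pbar = 0.3 are my own, not the lecture's), comparing the half-widths of the three 95% confidence intervals for a Bernoulli proportion:

```python
import math

n, pbar, alpha = 100, 0.3, 0.05
q = 1.96  # approximate N(0,1) quantile of order 1 - alpha/2

# Slutsky / plug-in: estimate the variance by pbar(1 - pbar).
plug_in = q * math.sqrt(pbar * (1 - pbar) / n)

# Conservative CLT: bound p(1 - p) by its maximum value, 1/4.
conservative = q * math.sqrt(0.25 / n)

# Hoeffding's inequality: non-asymptotic, no CLT needed at all.
hoeffding = math.sqrt(math.log(2 / alpha) / (2 * n))

# Half-widths increase in this order: plug-in < conservative < Hoeffding.
print(plug_in, conservative, hoeffding)
```

For these numbers the plug-in interval is the narrowest and Hoeffding's the widest, matching the trade-off between precision and validity discussed above.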
AUDIENCE: So the error there is [INAUDIBLE]?
PHILIPPE RIGOLLET: Yeah. So here, there's basically a bunch of errors. There's one that's-- so there's a theorem called Berry-Esseen that quantifies how far this probability is from 1 minus alpha, but the constants are terrible. So it's not very helpful, but it tells you, as n grows, how this thing becomes smaller. And then for Slutsky, again, you're multiplying something that converges by something that fluctuates around 1, so you need to understand how this thing fluctuates. Now, there's something that shows up. Basically, what is the slope of the function 1 over square root of x times 1 minus x around the value you're interested in? And so if this function is super-sharp, then small fluctuations of Xn bar around its expectation are going to lead to really high fluctuations of the function itself. So if you're looking at-- if you have f of Xn bar and f around, say, the true p, if f is really sharp like that, then if you move a little bit here, then you're going to move really a lot on the y-axis. So that's what the function here-- the function you're interested in is 1 over square root of x times 1 minus x. So what does this function look like around the point where you think p is the true parameter? Its derivative really is what matters. OK?
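The slope in question can be checked directly. Differentiating f(x) = 1/sqrt(x(1 - x)) gives f'(x) = (2x - 1) / (2 (x(1 - x))^(3/2)), which vanishes at p = 1/2 and blows up near 0 and 1 — a quick numeric sketch of my own:

```python
def f_prime(x):
    # Derivative of f(x) = 1 / sqrt(x * (1 - x)) for 0 < x < 1.
    return (2 * x - 1) / (2 * (x * (1 - x)) ** 1.5)

# The magnitude of the slope grows as p moves away from 1/2 toward 0 or 1,
# so small fluctuations of Xn bar matter far more for extreme p.
for p in (0.5, 0.3, 0.1, 0.01):
    print(p, f_prime(p))
```

So for p near 1/2 the plug-in is very stable, while for p near the endpoints small fluctuations of Xn bar translate into large fluctuations of f(Xn bar).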
Any other questions? OK. So it's important, because now we're going to switch to the real, let's-do-some-hardcore-computation type of things. All right.