Sentence | video_title |
|---|---|
So I'm gonna take out the part of the table that has the negative z-scores on it. And remember, we're looking for 10%, but we don't wanna go beyond 10%. We wanna be sure that that value is within the 10th percentile, and that any higher value would be out of the 10th percentile. So let's see, when we have these really negative z's, so far it doesn't even get to the first percentile yet. So let's scroll down a little bit. And let's remember as we do so that this is zero in the hundredths place, one, two, three, four, five, six, seven, eight, nine. So let's remember those columns. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
So let's see, if we are at a z-score of negative 1.28, remember, the hundredths place here is zero, one, two, three, four, five, six, seven, eight. So this right over here is a z-score of negative 1.28. And that's a little bit crossing the 10th percentile. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
But if we get a little bit more negative than that, we are in the 10th percentile. So this is negative 1.29. And this does seem to be the highest z-score for which we are within the 10th percentile. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
So negative 1.29 is our z-score. So this is z equals negative 1.29. And if we wanna figure out the actual value for that, we would start with the mean, which is 185. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
And then we would say, well, we wanna go 1.29 standard deviations below the mean. The negative says we're going below the mean. So we could say minus 1.29 times the standard deviation. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
And they tell us up here the standard deviation is 11 seconds. So it's going to be 1.29 times 11. And 1.29 times 11 is equal to 14.19. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
And then I'll make that negative and then add that to 185. Negative 14.19 plus 185 is equal to 170.81. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
Now they say round to the nearest whole second. There's a couple of ways to think about it. If you really wanna ensure that you're not gonna cross the 10th percentile, you might wanna round to the nearest second that is below this threshold. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
So you might say that this is approximately 170 seconds. If you were to just round normally, this would go to 171. But just by doing that, you might have crossed the threshold. | Threshold for low percentile Modeling data distributions AP Statistics Khan Academy.mp3 |
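The threshold computation walked through above is easy to check in code. This is a minimal sketch assuming only the values stated in the transcript: a mean of 185 seconds, a standard deviation of 11 seconds, and the table-based z-score of negative 1.29.

```python
import math
from statistics import NormalDist

# Values stated in the example: times are roughly Normal(mean=185 s, sd=11 s).
mean, sd = 185.0, 11.0

# The z table gave -1.29 as the highest z still within the 10th percentile;
# the exact inverse CDF value sits between -1.29 and -1.28.
z_table = -1.29
z_exact = NormalDist().inv_cdf(0.10)

value = mean + z_table * sd        # 185 - 14.19 = 170.81

# Round *down* so the answer stays below the 10th-percentile threshold.
threshold = math.floor(value)      # 170 seconds
```

Rounding with `math.floor` rather than `round` matches the transcript's point that normal rounding (to 171) could cross the threshold.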
We're told Della has over 500 songs on her mobile phone, and she wants to estimate what proportion of the songs are by a female artist. She takes a simple random sample, that's what SRS stands for, of 50 songs on her phone, and finds that 20 of the songs sampled are by a female artist. Based on this sample, which of the following is a 99% confidence interval for the proportion of songs on her phone that are by a female artist? So like always, pause this video and see if you can figure it out on your own. Della has a library of 500 songs right over here, and she's trying to figure out the proportion that are sung by a female artist. She doesn't have the time to go through all 500 songs to figure out the true population proportion, P, so instead she takes a sample of 50 songs, N is equal to 50, and from that she calculates a sample proportion, which we could denote with P hat, and she finds that 20 out of the 50 are sung by a female. 20 out of the 50 is the same thing as 0.4, and then she wants to construct a 99% confidence interval. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
So before we even go about constructing the confidence interval, you wanna check to make sure that we're making some valid assumptions and using a valid technique. So before we actually calculate the confidence interval, let's just make sure that our sampling distribution is not distorted in some way, so that we can, with confidence, make a confidence interval. So the first condition is to make sure that your sample is truly random, and they tell us that it's a simple random sample, so we'll take their word for it. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
The next condition is that your sampling distribution of the sample proportions is approximately normal, and there, you wanna see that your sample has at least 10 successes and at least 10 failures. Well, here, we have 20 successes, which means, well, 50 minus 20, we have 30 failures. So both of those are more than 10, and so it meets that condition. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
And then the last condition is sometimes called the independence test or the independence rule or the 10% rule. If you were doing this sample with replacement, so if she were to look at one song, test whether it's by a female artist or not, and then put it back in her pile and then look at another song, then each of those observations would truly be independent. But we don't know that. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
In fact, we'll assume that she didn't do it with replacement, and if you don't do it with replacement, you can assume rough independence for each observation of a song if this is no more than 10% of the population. And it looks like it is exactly 10% of the population, so Della just squeezes through on our independence test right over there. So with that out of the way, let's just think about what the confidence interval is going to be. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
Well, it's going to be her sample proportion, plus or minus some critical value, and this critical value is going to be dictated by the confidence level we wanna have, times the standard deviation of the sampling distribution of the sample proportions, which we don't know. And so instead of having that, we use the standard error of the sample proportion, which in this case is the square root of p hat times one minus p hat, all of that over n, our sample size, which is 50. So what's this going to be? | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
We're gonna get p hat, our sample proportion here, which is 0.4, plus or minus, I'll save the z star here, our critical value, for a little bit. We're gonna use a z table for that. And so we're gonna have 0.4 right over there, times one minus 0.4, which is 0.6, all of that over 50. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
So we can already look at some choices that look interesting here. This choice and this choice both look interesting, and the main thing we have to reason through is which one has the correct critical value. Do we wanna go 1.96 standard errors above and below our sample proportion, or do we wanna go 2.576 standard errors above and below our sample proportion? | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
And the key is the 99% confidence level. Now, if we have a 99% confidence level, one way to think about it is, let me just do my best shot at drawing a normal distribution here. And so if you want a 99% confidence level, that means you wanna contain the middle 99% under the curve right over here, that area. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
And so if this is 99%, then this right over here is going to be 0.5%, and this right over here is 0.5%. We want the z value that's going to leave 0.5% above it. And so 99.5% is actually what we wanna look up on the table. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
And that's because many z tables, including the one that you might see on something like an AP Stats exam, will have the area up to and including a certain value. And so they're not going to leave this free right over here. So let's just look up 99.5% on our z table. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
All right, so let me move this down so you can see it. All right, that's our z table. Let's see, we're at 99 point, okay, it's gonna be right in this area right over here. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
And so that is 2.5, looks like 2.57 or 2.58 around that. And so this right over here is about 2.57. It's between 2.57 and 2.58, which gives us enough information to answer this question. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
It's definitely not going to be this one right over here. We have 2.576, which is indeed between 2.57 and 2.58. So let's remind ourselves. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
We've been able to construct our confidence interval right over here. But what does that actually mean? It means that if we were to repeatedly take samples of size 50 and repeatedly use this technique to construct confidence intervals, roughly 99% of those intervals constructed this way are going to contain our true population parameter. | Example constructing and interpreting a confidence interval for p AP Statistics Khan Academy.mp3 |
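The interval built above can be sketched numerically. This assumes only the numbers from the problem (p hat of 0.4, n of 50) and uses the inverse normal CDF in place of a printed z table.

```python
from math import sqrt
from statistics import NormalDist

p_hat, n = 20 / 50, 50                  # sample proportion 0.4, sample size 50

# Standard error of the sample proportion: sqrt(p_hat * (1 - p_hat) / n).
se = sqrt(p_hat * (1 - p_hat) / n)

# For 99% confidence, look up the z that has 99.5% of the area below it,
# leaving 0.5% in each tail.
z_star = NormalDist().inv_cdf(0.995)    # about 2.576

margin = z_star * se
ci = (p_hat - margin, p_hat + margin)   # roughly (0.22, 0.58)
```

The same `inv_cdf(0.975)` lookup would give the 1.96 critical value for a 95% interval, which is why the 2.576 choice, not the 1.96 one, matches the 99% confidence level here.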
So the expected value of X, which I could also denote as the mean of our random variable X, let's say I expect to see three dogs a day, and similarly for the cats, the expected value of Y, which I could also denote as the mean of Y, is going to be equal to, and this is just for the sake of argument, let's say, four cats a day. And in previous videos, we defined how to take the mean of a random variable, or the expected value of a random variable. What we're going to think about now is, what would be the expected value of X plus Y? Or, another way of saying that, the mean of the sum of these two random variables. Well, it turns out, and I'm not proving it just yet, that the mean of the sum of random variables is equal to the sum of the means. So this is going to be equal to the mean of random variable X plus the mean of random variable Y. And so in this particular case, if I were to say, well, what's the expected number of dogs and cats that I would see in a given day? Well, I would add these two means. | Mean of sum and difference of random variables Random variables AP Statistics Khan Academy.mp3 |
It would be three plus four, which is equal to seven. And similarly, if I were to ask you the difference, if I were to say, well, how many more cats in a given day would I expect to see than dogs, so the expected value of Y minus X? | Mean of sum and difference of random variables Random variables AP Statistics Khan Academy.mp3 |
What would that be? Well, intuitively, you might say, well, hey, if the expected value of the sum is the sum of the expected values, then the expected value or the mean of the difference will be the difference of the means, and that is absolutely true. So this is the same thing as the mean of Y minus X, which is going to be equal to the mean of Y minus the mean of X, and in this particular case, it would be equal to four minus three, which is equal to one. | Mean of sum and difference of random variables Random variables AP Statistics Khan Academy.mp3 |
So another way of thinking about this intuitively is I would expect to see, on a given day, one more cat than dogs. Now, the example that I've just used involves discrete random variables; on a given day, I wouldn't see 2.2 dogs or pi dogs. The expected value itself does not have to be a whole number, because you could, of course, average it over many days, but this same idea, that the mean of a sum is the same thing as the sum of means and that the mean of a difference of random variables is the same as the difference of the means, still holds. | Mean of sum and difference of random variables Random variables AP Statistics Khan Academy.mp3 |
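The identities above are easy to check with a quick simulation. This sketch uses hypothetical dog and cat count distributions whose means are 3 and 4, matching the example; note that linearity of expectation holds whether or not the two variables are independent.

```python
import random
random.seed(42)

days = 100_000
# Hypothetical distributions with means 3 (dogs) and 4 (cats) per day.
dogs = [random.choice([2, 3, 4]) for _ in range(days)]
cats = [random.choice([3, 4, 5]) for _ in range(days)]

def mean(xs):
    return sum(xs) / len(xs)

# Mean of the sum equals the sum of the means, and the mean of the
# difference equals the difference of the means -- exactly, not just
# approximately, since these are arithmetic identities.
sum_mean  = mean([d + c for d, c in zip(dogs, cats)])   # close to 3 + 4 = 7
diff_mean = mean([c - d for d, c in zip(dogs, cats)])   # close to 4 - 3 = 1
```

The sample means themselves only approximate 3 and 4, but the relationship between the three averages is exact on any dataset.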
So just to review a little bit of the last video, we said we're trying to model the probability distribution of how many cars might pass in an hour. And the first thing we did is we sat at that intersection, and we found a pretty good expected value of our random variable, and this random variable, just to go back to the top, we defined as the number of cars that pass in an hour at a certain point on a certain road, and we said that we measured it a bunch. We sat out there a bunch of hours, and we got a pretty good estimate of this, and we say it's lambda. And we said, OK, we wanted to model it as a binomial distribution. So if this is a binomial distribution, then this lambda would be equal to the number of trials times the probability of success per trial. And so if we could view a trial as an interval of time, this is the total number of successes in an hour, this would be a success in a smaller interval, and this would be the probability of success in that smaller interval. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
And in the last video, we tried it out. We said, oh, well, what if we make this interval a minute, and this is the probability of success per minute? We'd have maybe a reasonable description of what we're describing, but what if more than one car passes in a minute? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
And we said, oh, let's make this per second, and this is the probability of success per second. But then we still have the problem. More than one car could pass in a second very easily. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So what we want to do is take the limit as this approaches infinity, and then see what kind of formula we get from the math gods. So if we describe this as a binomial distribution in the limit as n approaches infinity, we could say that the probability that x is equal to some number, so the probability that our random variable is equal to, I don't know, exactly three cars in a particular hour, is equal to the limit as n approaches infinity of n choose k. We're going to have k moments in time, right? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
Because as n approaches infinity, these intervals become super, super, duper small, right? So these become moments in time. So we're going to have close to an infinite number of moments, and this is the number of successful moments where cars pass. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
If we have three moments where there was a success, where a car passed, then we had a total of three cars pass, right? Or seven cars: if there were seven moments where it was true that a car passed, then we would have a total of seven cars pass in the hour. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So just finishing up with our binomial distribution, it's n moments choose k successes, times the probability of success. What's the probability of success? We said this would be n, so what's p equal to? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
p is equal to lambda divided by n, right? n times p is lambda, so let me just write that down. p is equal to lambda divided by n. I just rearranged this up here, right? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So our probability of success is lambda divided by n. And we're saying, what's the probability that we have k successes? And then what's the probability that we have a failure? Well, it's 1 minus the probability of success. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
And how many failures are we going to have? How many moments will not have a car pass? Well, we have a total of n moments, and k of them were successes. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So we'll have n minus k failures. Let's see what we can do with this. So this is equal to, let me rewrite it all, and I'll change colors. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
The limit as n approaches infinity. Let me write out this binomial coefficient. That's n factorial over n minus k factorial times k factorial, normally I write these the other way around, but it's the same thing, times, let's see, lambda to the k, I'm just using my exponent properties, over n to the k. And then this expression right here, I can actually separate out the exponents. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
This is the same thing as 1 minus lambda over n to the n times 1 minus lambda over n to the minus k. You have the same base, you could add the exponents and you would get this up here. And let me simplify a little bit more. Let me swap spots with these two. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
You can kind of view them both as being in the denominator. So you can change the order of division or multiplication depending on how you view it. So this is equal to the limit, let me switch colors, the limit as n approaches infinity, I don't like that color. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
Let me swap spots with these two. You can kind of view them both as being in the denominator. So you can change the order of division or multiplication depending how you view it. So this is equal to the limit, let me switch colors, the limit as n approaches infinity, I don't like that color. Actually, let me just rewrite what we did in the last video. What is this thing right here? And we showed it at the end of the last video. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So this is equal to the limit, let me switch colors, the limit as n approaches infinity, I don't like that color. Actually, let me just rewrite what we did in the last video. What is this thing right here? And we showed it at the end of the last video. n factorial divided by n minus k factorial. It was n times n minus 1 times n minus 2, all the way to n minus k plus 1. If this was 7 factorial over 7 minus 2 factorial, we would have 7 times 6, right? | Poisson process 2 Probability and Statistics Khan Academy.mp3
And we showed it at the end of the last video. n factorial divided by n minus k factorial. It was n times n minus 1 times n minus 2, all the way to n minus k plus 1. If this was 7 factorial over 7 minus 2 factorial, we would have 7 times 6, right? And 6 is 1 more than 7 minus 2. So that's where we got that. And we did that in the last video, if you're getting confused. | Poisson process 2 Probability and Statistics Khan Academy.mp3
If this was 7 factorial over 7 minus 2 factorial, we would have 7 times 6, right? And 6 is 1 more than 7 minus 2. So that's where we got that. And we did that in the last video, if you're getting confused. And we also said that there's going to be exactly k terms here. So if you counted these, there's 1, 2, 3, all the way there's going to be k terms here. And so we took care of that, we just rewrote that. | Poisson process 2 Probability and Statistics Khan Academy.mp3
And we did that in the last video, if you're getting confused. And we also said that there's going to be exactly k terms here. So if you counted these, there's 1, 2, 3, all the way there's going to be k terms here. And so we took care of that, we just rewrote that. And I said I would switch these two things around. So that's divided by n to the k times lambda to the k over k factorial. And then what do we have here? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
And so we took care of that, we just rewrote that. And I said I would switch these two things around. So that's divided by n to the k times lambda to the k over k factorial. And then what do we have here? I can just rewrite that. This is continuing the same line. 1 minus lambda over n to the n times 1 minus lambda over n to the minus k. Now we can take the limit. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
And then what do we have here? I can just rewrite that. This is continuing the same line. 1 minus lambda over n to the n times 1 minus lambda over n to the minus k. Now we can take the limit. So what happens when we take the limit? So just so you know, if you take the limit, this is another property, just so you don't get too overwhelmed. Another property of limits, if I take the limit as x approaches anything, a of f of x times g of x, that's equal to the limit as x approaches a of f of x times the limit as x approaches a of g of x. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
1 minus lambda over n to the n times 1 minus lambda over n to the minus k. Now we can take the limit. So what happens when we take the limit? So just so you know, if you take the limit, this is another property, just so you don't get too overwhelmed. Another property of limits, if I take the limit as x approaches anything, a of f of x times g of x, that's equal to the limit as x approaches a of f of x times the limit as x approaches a of g of x. So we could take each of these limits in the product and then multiply them and then we'll be all set. So let's do that. And I want to leave this stuff up here. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
Another property of limits, if I take the limit as x approaches anything, a of f of x times g of x, that's equal to the limit as x approaches a of f of x times the limit as x approaches a of g of x. So we could take each of these limits in the product and then multiply them and then we'll be all set. So let's do that. And I want to leave this stuff up here. So first of all, what's this limit? Let me write this out. And let me pick a good color. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
And I want to leave this stuff up here. So first of all, what's this limit? Let me write this out. And let me pick a good color. Yellow. So we have the limit as n approaches infinity. So this thing up here, this n times n minus 1 times n minus 2, all the way down to n minus k plus 1, what's it going to look like? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
And let me pick a good color. Yellow. So we have the limit as n approaches infinity. So this thing up here, this n times n minus 1 times n minus 2, all the way down to n minus k plus 1, what's it going to look like? It's going to be a polynomial, right? We're multiplying a bunch of binomials. We're doing it k times. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So this thing up here, this n times n minus 1 times n minus 2, all the way down to n minus k plus 1, what's it going to look like? It's going to be a polynomial, right? We're multiplying a bunch of binomials. We're doing it k times. So the largest degree term is going to be n to the k, right? It's going to be n to the k plus something times n to the k minus 1. It's going to be this big kind of binomial, this big polynomial, kth degree polynomial. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
We're doing it k times. So the largest degree term is going to be n to the k, right? It's going to be n to the k plus something times n to the k minus 1. It's going to be this big kind of binomial, this big polynomial, kth degree polynomial. And that's really all we need to know for this derivation. So it's going to be n to the k plus blah, blah, blah, blah, blah, blah, blah, a bunch of other stuff. This thing, when you multiply it out. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
It's going to be this big kind of binomial, this big polynomial, kth degree polynomial. And that's really all we need to know for this derivation. So it's going to be n to the k plus blah, blah, blah, blah, blah, blah, blah, a bunch of other stuff. This thing, when you multiply it out. Over, we have this n to the k, right? So we just, that's this part of it. Times the limit as, well actually, we don't have to worry, this is a constant. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
This thing, when you multiply it out. Over, we have this n to the k, right? So we just, that's this part of it. Times the limit as, well actually, we don't have to worry, this is a constant. So we could actually bring this out front. So we don't even have to write a limit. So times lambda to the k over k factorial, right? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
Times the limit as, well actually, we don't have to worry, this is a constant. So we could actually bring this out front. So we don't even have to write a limit. So times lambda to the k over k factorial, right? There's no n here, so this is a constant with respect to n. Times the limit as n approaches infinity of 1 minus lambda over n to the n times 1 minus lambda over n to the minus k. All right. I know you can barely read this. So first of all, what's this limit? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So times lambda to the k over k factorial, right? There's no n here, so this is a constant with respect to n. Times the limit as n approaches infinity of 1 minus lambda over n to the n times 1 minus lambda over n to the minus k. All right. I know you can barely read this. So first of all, what's this limit? The limit as n approaches infinity of some polynomial where it's n to the kth power plus blah, blah, blah, blah, where all of these other terms have a lower degree. This is the highest degree term. So you have n to the k in the numerator and you have n to the k in the denominator. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So first of all, what's this limit? The limit as n approaches infinity of some polynomial where it's n to the kth power plus blah, blah, blah, blah, where all of these other terms have a lower degree. This is the highest degree term. So you have n to the k in the numerator and you have n to the k in the denominator. So the highest degrees are the same, the coefficients are 1. So this limit is 1. Another way you could do it, you could divide the numerator and the denominator by n to the k and you would get, this would be 1 plus 1 over n plus 1 over everything else would have a 1 over n in it. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So you have n to the k in the numerator and you have n to the k in the denominator. So the highest degrees are the same, the coefficients are 1. So this limit is 1. Another way you could do it, you could divide the numerator and the denominator by n to the k, and you would get 1 plus a bunch of other terms that each have a 1 over n in them. | Poisson process 2 Probability and Statistics Khan Academy.mp3
Another way you could do it, you could divide the numerator and the denominator by n to the k, and you would get 1 plus a bunch of other terms that each have a 1 over n in them. And this would just be a 1. And if you take the limit as you approach infinity, then all of these other terms would be 0 and you'd get left with 1 over 1. But either way, you have the same degree in the top and the bottom and their coefficients are the same. | Poisson process 2 Probability and Statistics Khan Academy.mp3
But either way, you have the same degree in the top and the bottom and their coefficients are the same. So the limit as n approaches infinity of this is 1, which is a nice simplification. So you end up with 1 times lambda to the k over k factorial. Now what's the limit as n approaches infinity of this thing right here? | Poisson process 2 Probability and Statistics Khan Academy.mp3
Now what's the limit as n approaches infinity of this thing right here? 1 minus lambda over n to the n. Well, in the last video we showed that it would be, I'll write it right here, that the limit as n approaches infinity of 1 plus a over n to the n is equal to e to the a, right? That's exactly what we have here, but instead of an a, we have a minus lambda, right? Minus lambda, so this is going to be equal to e to the minus lambda, right? We have a minus lambda instead of an a. And then finally, what's the limit as n approaches infinity? Let me write it a little bit neater. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
Minus lambda, so this is going to be equal to e to the minus lambda, right? We have a minus lambda instead of an a. And then finally, what's the limit as n approaches infinity? Let me write it a little bit neater. I'm just rewriting this term. 1 minus lambda over n to the minus k power. What happens as n approaches infinity? | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
Let me write it a little bit neater. I'm just rewriting this term. 1 minus lambda over n to the minus k power. What happens as n approaches infinity? Well, this term, right, lambda's a constant. As this approaches infinity, this term's going to approach 0, so you have 1 to the minus k. 1 to any power is 1, so that term becomes 1. So we have another 1 there. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
What happens as n approaches infinity? Well, this term, right, lambda's a constant. As this approaches infinity, this term's going to approach 0, so you have 1 to the minus k. 1 to any power is 1, so that term becomes 1. So we have another 1 there. So there you have it. We're done. The probability that our random variable, the number of cars that pass in an hour, is equal to a particular number, you know, it's equal to 7 cars pass in an hour, is equal to the limit as n approaches infinity of n choose k times, well, we said it was lambda over n to the k successes, times 1 minus lambda over n to the n minus k failures. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
So we have another 1 there. So there you have it. We're done. The probability that our random variable, the number of cars that pass in an hour, is equal to a particular number, you know, it's equal to 7 cars pass in an hour, is equal to the limit as n approaches infinity of n choose k times, well, we said it was lambda over n to the k successes, times 1 minus lambda over n to the n minus k failures. And we just showed that this is equal to lambda to the kth power over k factorial times e to the minus lambda. And that's pretty neat, because when you just see it in kind of a vacuum, if you have no context for it, you wouldn't guess that this is in any way related to the binomial theorem. I mean, it's got an e in there. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
The probability that our random variable, the number of cars that pass in an hour, is equal to a particular number, you know, it's equal to 7 cars pass in an hour, is equal to the limit as n approaches infinity of n choose k times, well, we said it was lambda over n to the k successes, times 1 minus lambda over n to the n minus k failures. And we just showed that this is equal to lambda to the kth power over k factorial times e to the minus lambda. And that's pretty neat, because when you just see it in kind of a vacuum, if you have no context for it, you wouldn't guess that this is in any way related to the binomial theorem. I mean, it's got an e in there. It's got a factorial, but you know, a lot of things have factorials in life, so not clear that that would make it a binomial theorem. But this is just the limit as you take smaller and smaller and smaller intervals, and the probability of success in each interval becomes smaller. But as you take the limit, you end up with e. And if you think about it, it makes sense, because one of our derivations of e actually came out of compound interest, and we kind of did something similar there. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
I mean, it's got an e in there. It's got a factorial, but you know, a lot of things have factorials in life, so not clear that that would make it a binomial theorem. But this is just the limit as you take smaller and smaller and smaller intervals, and the probability of success in each interval becomes smaller. But as you take the limit, you end up with e. And if you think about it, it makes sense, because one of our derivations of e actually came out of compound interest, and we kind of did something similar there. We took smaller and smaller intervals of compounding, and over each interval we compounded by a much smaller number, and when you took the limit, you got e again. And that's actually where that whole formula up here came from to begin with. But anyway, just so that you know how to use this thing, so let's say that I were to go out, I'm the traffic engineer, and I figure out that on average, 9 cars pass per hour. | Poisson process 2 Probability and Statistics Khan Academy.mp3 |
But as you take the limit, you end up with e. And if you think about it, it makes sense, because one of our derivations of e actually came out of compound interest, and we kind of did something similar there. We took smaller and smaller intervals of compounding, and over each interval we compounded by a much smaller number, and when you took the limit, you got e again. And that's actually where that whole formula up here came from to begin with. But anyway, just so that you know how to use this thing, so let's say that I were to go out, I'm the traffic engineer, and I figure out that on average, 9 cars pass per hour. So I want to know the probability that 2 cars pass in a given hour, exactly 2 cars pass, it's going to be equal to 9 to the second power, or squared, divided by 2 factorial times e to the minus 9 power. So it's equal to 81 over 2 times e to the minus 9 power. And let's see, maybe I should just get the graphing calculator out there. | Poisson process 2 Probability and Statistics Khan Academy.mp3
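The worked example just above (an average of lambda = 9 cars per hour, asking for exactly k = 2 cars) can be checked with a short Python sketch. This is not from the video; the function names are my own, and the binomial comparison simply illustrates the limit derived above, with n = 100,000 small intervals standing in for "n approaches infinity".

```python
import math

def poisson_pmf(k, lam):
    # the limit we derived: lambda^k / k! * e^(-lambda)
    return lam ** k / math.factorial(k) * math.exp(-lam)

def binomial_pmf(k, n, p):
    # (n choose k) * p^k * (1 - p)^(n - k), the expression we took the limit of
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

lam = 9   # average cars per hour
k = 2     # probability of exactly 2 cars passing in an hour
exact = poisson_pmf(k, lam)   # 81/2 * e^-9, roughly 0.005
# chop the hour into n tiny intervals, each a success with probability lam/n
approx = binomial_pmf(k, 100_000, lam / 100_000)
print(exact, approx)
```

With that many intervals the binomial value agrees with 81 over 2 times e to the minus 9 to several decimal places, which is the whole point of the derivation.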
I think now is as good a time as any to play around a little bit with the formula for variance and see where it goes. And I think just by doing this, we'll also get a little bit better intuition of just manipulating sigma notation or even what it means. So we learned several times that the formula for variance, and let's just do variance of a population. It's almost the same thing as variance of a sample. You just divide by n instead of n minus 1. Variance of a population is equal to, when you take each of the data points, x sub i, you subtract from that the mean, you square it, and then you take the average of all of these. So you add the squared distance for each of these points from i equals 1 to i is equal to n, and you divide it by n. So let's see what happens if we can, I don't know, maybe we want to multiply out the squared term and see where it takes us. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
It's almost the same thing as variance of a sample. You just divide by n instead of n minus 1. Variance of a population is equal to, when you take each of the data points, x sub i, you subtract from that the mean, you square it, and then you take the average of all of these. So you add the squared distance for each of these points from i equals 1 to i is equal to n, and you divide it by n. So let's see what happens if we can, I don't know, maybe we want to multiply out the squared term and see where it takes us. So let's see. And I think it'll take us someplace interesting. So this is the same thing as the sum from i is equal to 1 to n. Let's see, this we just multiply it out. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
So you add the squared distance for each of these points from i equals 1 to i is equal to n, and you divide it by n. So let's see what happens if we can, I don't know, maybe we want to multiply out the squared term and see where it takes us. So let's see. And I think it'll take us someplace interesting. So this is the same thing as the sum from i is equal to 1 to n. Let's see, this we just multiply it out. This is the same thing as x sub i squared minus, this is your little algebra going on here. So when you square it, I mean, we could multiply it out. We could write it x sub i minus mu times x sub i minus mu, so we have x sub i times x sub i, that's x sub i squared. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
So this is the same thing as the sum from i is equal to 1 to n. Let's see, this we just multiply it out. This is the same thing as x sub i squared minus, this is your little algebra going on here. So when you square it, I mean, we could multiply it out. We could write it x sub i minus mu times x sub i minus mu, so we have x sub i times x sub i, that's x sub i squared. Then you have x sub i times mu, times minus mu, and then you have minus u times x sub i. So when you add those two together, you get minus 2 x sub i mu, because you have it twice. x sub i times mu, that's 1 minus x sub i mu. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
We could write it x sub i minus mu times x sub i minus mu, so we have x sub i times x sub i, that's x sub i squared. Then you have x sub i times minus mu, and then you have minus mu times x sub i. So when you add those two together, you get minus 2 x sub i mu, because you have it twice. x sub i times minus mu, that's one minus x sub i mu term. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
x sub i times minus mu, that's one minus x sub i mu term. And then you have another one, minus mu x sub i. When you add them together, you get minus 2 x sub i mu. I know it's confusing with me saying sub i and all of that, but it's really no different than when you did a minus b squared, just the variables look a little bit more complicated. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
I know it's confusing with me saying sub i and all of that, but it's really no different than when you did a minus b squared, just the variables look a little bit more complicated. And then the last term is minus mu times minus mu, which is plus mu squared. Fair enough. Let me switch colors, just to keep it interesting. Let me cordon that off. OK, so how can we, well, the sum of this is the same thing as the sum of, because you think about it, we're going to take each x sub i, for each of the numbers in our population, we're going to perform this thing, and we're going to sum it up. But if you think about it, this is the same thing as, if you're not familiar with sigmation, this is a good kind of thing to know in general, just a little bit of intuition, that this is the same thing as, I'll do it here to have space, as the sum from i is equal to 1 to n of the first term, x sub i squared minus, and actually we can bring out the constant terms. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
Let me switch colors, just to keep it interesting. Let me cordon that off. OK, so how can we, well, the sum of this is the same thing as the sum of, because you think about it, we're going to take each x sub i, for each of the numbers in our population, we're going to perform this thing, and we're going to sum it up. But if you think about it, this is the same thing as, if you're not familiar with sigma notation, this is a good kind of thing to know in general, just a little bit of intuition, that this is the same thing as, I'll do it here to have space, as the sum from i is equal to 1 to n of the first term, x sub i squared minus, and actually we can bring out the constant terms. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
But if you think about it, this is the same thing as, if you're not familiar with sigma notation, this is a good kind of thing to know in general, just a little bit of intuition, that this is the same thing as, I'll do it here to have space, as the sum from i is equal to 1 to n of the first term, x sub i squared minus, and actually we can bring out the constant terms. You just can't take, when you're summing, the only thing that matters is the thing that has the ith term. So in this case, it's x sub i, so x sub 1, x sub 2. So that's the thing that you have to leave on the right hand side of the sigma notation. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
So that's the thing that you have to leave on the right hand side of the sigma notation. And if you've done the calculus playlist already, sigma notation is really, it's kind of like a discrete integral on some level. Because in an integral, you're summing up a bunch of things, you're multiplying them times dx, which is a really small interval, but here you're just taking a sum. And that's what we showed in the calculus playlist, that an integral actually is kind of this infinite sum of infinitely small things, but I don't want to digress too much, but this was just a long way of saying that the sum from i equals 1 to n of the second term is the same thing as minus 2 times mu of the sum from i is equal to 1 to n of x sub i. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
And that's what we showed in the calculus playlist, that an integral actually is kind of this infinite sum of infinitely small things, but I don't want to digress too much, but this was just a long way of saying that the sum from i equals 1 to n of the second term is the same thing as minus 2 times mu of the sum from i is equal to 1 to n of x sub i. And then finally, you have plus, well this is just a constant term, right? This is just a constant term, so you can take it out, times mu squared times the sum from i equals 1 to n. And what's going to be here? It's going to be a 1. We just divided a 1, we just divided this by 1, took it out of the sigma sign, out of the sum, and you're just left with a 1 there. And actually, we could have just left the mu squared there. But either way, let's just keep simplifying it. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
It's going to be a 1. We just divided a 1, we just divided this by 1, took it out of the sigma sign, out of the sum, and you're just left with a 1 there. And actually, we could have just left the mu squared there. But either way, let's just keep simplifying it. So this we can't really do. Well actually, we could. Well no, we don't know what the x sub i's are, so we just have to leave that the same. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
But either way, let's just keep simplifying it. So this we can't really do. Well actually, we could. Well no, we don't know what the x sub i's are, so we just have to leave that the same. So that's the sum. Oh sorry, this is just the numerator, right? This whole simplification, we're just simplifying the numerator, and later we're just going to divide by n. So that is equal to that divided by n, which is equal to this thing divided by n. I'll divide by n at the end, because it's the numerator that's the confusing part, right? | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
Well no, we don't know what the x sub i's are, so we just have to leave that the same. So that's the sum. Oh sorry, this is just the numerator, right? This whole simplification, we're just simplifying the numerator, and later we're just going to divide by n. So that is equal to that divided by n, which is equal to this thing divided by n. I'll divide by n at the end, because it's the numerator that's the confusing part, right? We just want to simplify this term up here. So let's keep doing this. So this is equal to the sum from i equals 1 to n of x sub i squared, and let's see, minus 2 times mu. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
This whole simplification, we're just simplifying the numerator, and later we're just going to divide by n. So that is equal to that divided by n, which is equal to this thing divided by n. I'll divide by n at the end, because it's the numerator that's the confusing part, right? We just want to simplify this term up here. So let's keep doing this. So this is equal to the sum from i equals 1 to n of x sub i squared, and let's see, minus 2 times mu. Sorry, that mu doesn't look good. Edit, undo. Minus 2 times mu times the sum from i is equal to 1 to n of x i, and then what is this? | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
So this is equal to the sum from i equals 1 to n of x sub i squared, and let's see, minus 2 times mu. Sorry, that mu doesn't look good. Edit, undo. Minus 2 times mu times the sum from i is equal to 1 to n of x i, and then what is this? What is another way to write this, right? Essentially we're going to add 1 to itself n times, right? This is kind of saying just look, whatever you have here, just iterate through it n times. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
Minus 2 times mu times the sum from i is equal to 1 to n of x i, and then what is this? What is another way to write this, right? Essentially we're going to add 1 to itself n times, right? This is kind of saying just look, whatever you have here, just iterate through it n times. If you had an x sub i here, you would use the first x term and then the second x term. When you have a 1 here, this is just essentially saying add 1 to itself n times, right? Which is the same thing as n. So this is going to be plus mu squared times n. All right. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
This is kind of saying just look, whatever you have here, just iterate through it n times. If you had an x sub i here, you would use the first x term and then the second x term. When you have a 1 here, this is just essentially saying add 1 to itself n times, right? Which is the same thing as n. So this is going to be plus mu squared times n. All right. Then let's see if there's anything else we can do here. Remember, this was just the numerator. So this looks fine. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
Which is the same thing as n. So this is going to be plus mu squared times n. All right. Then let's see if there's anything else we can do here. Remember, this was just the numerator. So this looks fine. We add up each of those terms. We have minus 2 mu, right? From i equals 1 to, oh, well think about this. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
So this looks fine. We add up each of those terms. We have minus 2 mu, right? From i equals 1 to, oh, well think about this. What is this? What is this thing right here? Well, actually, let's bring back that n. So this simplified to that divided by n, which simplifies to that whole thing, which simplifies to this whole thing divided by n, which simplifies to this whole thing divided by n, which is the same thing as each of the terms divided by n, which is the same thing as that, which is the same thing as that, which is the same thing as that, right? | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
From i equals 1 to, oh, well think about this. What is this? What is this thing right here? Well, actually, let's bring back that n. The whole numerator is divided by n, and dividing the sum by n is the same thing as dividing each term by n. And now, how this simplifies is the interesting part. Well, nothing much I can do here. So that just becomes the sum from i is equal to 1 to n of x sub i squared divided by big N. Now, this is interesting. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
Well, actually, let's bring back that n. The whole numerator is divided by n, and dividing the sum by n is the same thing as dividing each term by n. And now, how this simplifies is the interesting part. Well, nothing much I can do here. So that just becomes the sum from i is equal to 1 to n of x sub i squared divided by big N. Now, this is interesting. What is, if I take each of the terms in my population and I add them up and then I divide it by n, what is that? This thing right here. If I sum up all the terms in my population and divide by the number of terms there are. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
So that just becomes the sum from i is equal to 1 to n of x sub i squared divided by big N. Now, this is interesting. What is, if I take each of the terms in my population and I add them up and then I divide it by n, what is that? This thing right here. If I sum up all the terms in my population and divide by the number of terms there are. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
If I sum up all the terms in my population and divide by the number of terms there are. That's the mean, right? That's the mean of my population. So this thing right here is also mu. So this thing simplifies to what? Minus 2 times what? This whole thing is mu too, so times mu squared. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3 |
So this thing right here is also mu. So this thing simplifies to what? Minus 2 times what? This whole thing is mu, too, so times mu squared. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
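The identity being used here — the sum of x sub i divided by n is itself the population mean mu, so the middle term collapses to minus 2 mu squared — also checks out numerically. A small sketch with hypothetical data (nothing here comes from the video):

```python
# Hypothetical population, for illustration only.
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(xs)
mu = sum(xs) / n  # population mean

# Middle term of the expansion, after dividing through by n:
#   -2 * mu * (sum of x_i) / n
# Since sum(xs) / n is itself mu, this is just -2 * mu ** 2.
middle = -2 * mu * (sum(xs) / n)
assert abs(middle - (-2 * mu ** 2)) < 1e-12
print(middle)  # -50.0 for this data, since mu is 5.0
```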
This whole thing is mu, too, so times mu squared. Mu times mu. This is the mean of the population. So that was a nice simplification. And then plus, what do you have here? Let's see. You have mu squared times n over n. The n's cancel out. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
So that was a nice simplification. And then plus, what do you have here? Let's see. You have mu squared times n over n. The n's cancel out. So you have plus mu squared. So that was a very nice simplification. And then this simplifies to, can't do much on this side, so the sum from i is equal to 1 to n of x sub i squared over n. And then see, we have minus 2 mu squared plus mu squared. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
You have mu squared times n over n. The n's cancel out. So you have plus mu squared. So that was a very nice simplification. And then this simplifies to, can't do much on this side, so the sum from i is equal to 1 to n of x sub i squared over n. And then see, we have minus 2 mu squared plus mu squared. Well, that's the same thing as minus mu squared. Minus the mean squared. So this, already, we've kind of come up with a neat way of writing the variance. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
And then this simplifies to, can't do much on this side, so the sum from i is equal to 1 to n of x sub i squared over n. And then see, we have minus 2 mu squared plus mu squared. Well, that's the same thing as minus mu squared. Minus the mean squared. So this, already, we've kind of come up with a neat way of writing the variance. You can essentially take the average of the squares of all of the numbers, in this case a population, and then subtract from that the mean squared of your population. So this could be, depending on how you're calculating things, maybe a slightly faster way of calculating the variance. So just playing with a little algebra, we got from this thing, where you have to, each time, take each of your data points, subtract the mean from it, and then square it, to this simpler expression. | Statistics Alternate variance formulas Probability and Statistics Khan Academy.mp3
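The shortcut the transcript arrives at — variance equals the mean of the squares minus the square of the mean — can be compared directly against the definitional two-pass formula. A sketch with made-up data (not from the video):

```python
# Hypothetical population, chosen for illustration.
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(xs)
mu = sum(xs) / n  # population mean

# Definition: the average squared deviation from the mean.
var_definition = sum((x - mu) ** 2 for x in xs) / n

# Shortcut from the derivation: mean of the squares, minus the mean squared.
var_shortcut = sum(x * x for x in xs) / n - mu ** 2

print(var_definition, var_shortcut)  # both 4.0 for this data
assert abs(var_definition - var_shortcut) < 1e-12
```

One caveat worth noting: while the shortcut can be computed in a single pass over the data, subtracting two large, nearly equal quantities (mean of squares and mean squared) can lose precision in floating point when the mean is large relative to the spread, so the two-pass definitional formula is often preferred numerically.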