heorem, 18 Exact differential, 196 Exact sequence of sheaves, 83 Exceptional curves of the first kind, see Minus one curve Exceptional divisor, 119 Exceptional locus, 261, 72 Exceptional subvariety, 119 Existence of inflexion, 71 Existence of zeros, 71 Exterior power of a sheaf Exterior product ∧, 195 p G F , 58 F Factor... |
ry X ×S Y , 40 of irreducibles, 35 of schemes over S X ×S Y , 40 of varieties X × Y , 25, 26, 54, 252, 52 Projection, 6, 33, 39, 52, 53, 135 Projection formula, 195 Projective algebraic plane curve, 18 closure, 68 completion, 45 embedding, 134, 212, 230, 205, 209, 216 embedding of curve, 109 limit lim←− Eα , 18 line, 2... |
faces 4 or 5 turn up you win $8. What is a fair price for one roll? Essentially the same arithmetic and ideas as we gave above for the parsimonious innkeeper reveal the fair price (or value) of one roll to be $11, because 13 × (3/5) + 8 × (2/5) = 11. We will meet this concept again under the name "expectation." 0.6 Introsp...
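As a quick numerical check, here is a minimal sketch of ours (not from the text); the payouts 13 and 8 and the probabilities 3/5 and 2/5 are read off the example above:

```python
from fractions import Fraction

# Expected winnings: each payout weighted by its probability.
payouts = {13: Fraction(3, 5), 8: Fraction(2, 5)}
fair_price = sum(value * prob for value, prob in payouts.items())
print(fair_price)  # 11 -- the fair price for one roll
```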
ut we are imperfect instruments, and not infrequently rather confused. Most people's intuition about problems in chance will often lead them grossly astray, even with very simple concepts. Although many examples appear later, we mention a few here: (a) The base rate fallacy. This appears in many con...
the more likely eventualities have their probabilities nearer to one. The following chapters use similar arguments to develop more complicated rules and properties of probability. Appendix: Review of Elementary Mathematical Prerequisites It is difficult to make progress in any branch of mathematics without using the id... |
lim_{n→∞} sn = s. Notice that sn need never actually take the value s; it must just get closer to it in the long run (e.g., let xn = n^{−1}). Infinite Series Let (ar; r ≥ 1) be a sequence of terms, with partial sums sn = Σ_{r=1}^n ar, n ≥ 1. If sn has a finite limit s as n → ∞, then the series is said to converge, with sum s. Otherwise, it di...
represents an event A. The point ω represents an outcome in the event A^c. The diagram clearly illustrates the identities A^c ∪ A = Ω and Ω \ A = A^c. These methods of combining events give rise to many equivalent ways of denoting an event. Some of the more useful identities for any events A and B are: (1) (2) (3) (4) (5) (6) ...
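These set identities are easy to confirm by brute force. A minimal sketch in Python (the sample space Ω = {1, 2, 3, 4} and the helper `subsets` are illustrative choices of ours), which also checks de Morgan's laws:

```python
from itertools import chain, combinations

omega = {1, 2, 3, 4}  # a small sample space standing in for Ω

def subsets(s):
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

# Check A^c ∪ A = Ω and Ω \ A = A^c for every event A,
# and de Morgan's laws for every pair of events A, B.
for a in map(set, subsets(omega)):
    ac = omega - a
    assert ac | a == omega and omega - a == ac
    for b in map(set, subsets(omega)):
        assert omega - (a | b) == (omega - a) & (omega - b)  # (A ∪ B)^c = A^c ∩ B^c
        assert omega - (a & b) == (omega - a) | (omega - b)  # (A ∩ B)^c = A^c ∪ B^c
print("all identities hold")
```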
gets $1. The value of this offer to you is $P(A ∪ B); the value to your left hand is $P(A); and the value to your right hand is $P(B). Obviously, it does not matter in which hand you get the money, so P(A ∪ B) = P(A) + P(B). Finally, consider the case where we imagine a point is picked at random anywhere in some plane ... |
ualities:
1.1 experiment; outcome; sample space; event; probability; Venn diagram
1.2 certain event; impossible event
1.3 addition rules; probability distribution; countable additivity
1.4 axioms of probability; Boole's inequalities
1.5 continuous set function
1.6 de Morgan's Laws
1.8 Example...
was first used by Pascal and Fermat in the seventeenth century. In fact, we find easier methods for evaluating this probability in Chapter 2, using new concepts. 1.12 Example: Family Planning A woman planning her family considers the following schemes on the assumption that boys and girls are equally likely at each deli... |
ball if (a) You select a ball at random from the first urn? (b) You select an urn at random and then select a ball from it at random? (c) You discard two balls from the second urn and select the last ball? Four fair dice are rolled and the four numbers shown are multiplied together. What is the probability that this pr... |
note that conditional probability is a probability function in the sense defined in Section 1.4. Thus, P(Ω|B) = 1 and, if Ai ∩ Aj = ∅ for i ≠ j, we have P(∪_i Ai|B) = Σ_i P(Ai|B). From these, we may deduce various useful identities (as we did in Section 1.4); for example: P(A ∩ B ∩ C) = P(A|B ∩ C)P(B|C)P(C), and P(∩_{i=1}^n Ai) = P(A...
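The chain rule P(A ∩ B ∩ C) = P(A|B ∩ C)P(B|C)P(C) can be checked numerically on any small probability space. A sketch of ours, with a toy space of three fair coin flips and hypothetical events A, B, C of our own choosing:

```python
from itertools import product

# A toy probability space: three fair coin flips, equally likely outcomes.
outcomes = list(product((0, 1), repeat=3))
p = {w: 1 / len(outcomes) for w in outcomes}

def prob(event):
    return sum(p[w] for w in outcomes if event(w))

A = lambda w: w[0] == 1          # first flip heads
B = lambda w: sum(w) >= 2        # at least two heads
C = lambda w: w[2] == 1          # last flip heads

def cond(e1, e2):  # P(e1 | e2)
    return prob(lambda w: e1(w) and e2(w)) / prob(e2)

lhs = prob(lambda w: A(w) and B(w) and C(w))
rhs = cond(A, lambda w: B(w) and C(w)) * cond(B, C) * prob(C)
assert abs(lhs - rhs) < 1e-12    # P(A∩B∩C) = P(A|B∩C) P(B|C) P(C)
print(lhs, rhs)
```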
ling. Let Fk denote the event that it is on the floor after k moves. What is fk = P(Fk)? Solution Let Ck denote the event that it is on the ceiling after k moves, and Nk, Ek, Wk, Sk denote the corresponding event for the four walls. Set ck = P(Ck), and so on. Then by Theorem 2.1.3, (3) P(Fk) = P(Fk|Fk−1)P(Fk−1) + P(Fk|C... |
hat is the limit of this probability as n → ∞? Show that for any j, k, P(Ck|C j ) = P(C j |Ck). Exercise Show that in m + n drawings, the probability that m cyan balls are followed by n blue Exercise balls is the same as the probability that n blue balls are followed by m cyan balls. Generalize this result. 2.8 Example... |
ly chosen driver makes a claim in each of the first and second years. Solution Then conditioning on the sex of the driver (M or F) yields P(A1) = P(A1|M)P(M) + P(A1|F)P(F) = (1/2)(µ + λ), because P(F) = P(M) = 1/2. (b) Likewise, P(A1 ∩ A2) = P(A1 ∩ A2|M)P(M) + P(A1 ∩ A2|F)P(F) = (1/2)(µ² + λ²). (c) By definition, P(A2|A1) ...
that C did claim that D is a liar, given SA? Exercise What is the probability that both C and D lied, given SA? Exercise Prove the result of the example more laboriously, rather than by listing outcomes. Exercise Eddington himself gave the answer to this problem as 25/71. Can you reconstruct the argument that led him to this answer?
eads shown (including the first coin). If you tell me only that your score is two, what is the probability that you rolled a die? Three fair dice labelled A, B, and C are rolled on to a sheet of paper. If a pair show the same number, a straight line is drawn joining them. Show that the event that the line AB is drawn is ...
be the event that Mn reports that Mn−1 reports that . . . that M2 reports that M1 is a liar. If each reporter lies independently with probability p, find pn, the probability that M1 told the truth given Rn. Show that as n → ∞, pn → 1 − p. Suppose that for events S, A, and B, P(S|A) ≥ P(S), P(A|S ∩ B) ≥ P(A|S), and P(A|Sc) ≥ ...
times, the number of ways of choosing a set of size r is (n + r − 1 choose r). Proof A proof of Theorem (4) may be found in Example 3.12 or Theorem 3.7.5. 3.4 Inclusion–Exclusion (5) Example: Ark The wyvern is an endangered species in the wild. You want to form a captive breeding colony, and you estimate that a viable colon...
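The count of size-r multisets can be confirmed by direct enumeration; a small sketch of ours (n = 5 and r = 3 are arbitrary choices):

```python
from itertools import combinations_with_replacement
from math import comb

# Count size-r multisets drawn from n types directly, and via C(n+r-1, r).
n, r = 5, 3
direct = sum(1 for _ in combinations_with_replacement(range(n), r))
assert direct == comb(n + r - 1, r) == 35
print(direct)
```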
sets of n coupons. 3.7 Techniques 93 Then exactly similar arguments show that exp 2n sr r ! p2(n, r ), and so on for more sets. In conclusion, it is worth remarking that multivariate generating functions are often useful, although they will not appear much at this early stage. We give one example. (15) Multinomial Theo... |
−1 ) ways. (c) If neither the locomotives nor wagons have numbers and each train must contain m wagons at least, then we require the number of partitions of n into at most r integers, all of which are not less than m. This is the same as the number of partitions of n − mr into at most r integers, that is, pr (n − mr )...
le has a principal seat (head) or not? Exercise Suppose that n pairs of twins are seated randomly at a round table. What is the probability that no pair of twins sit next to each other? What is the limit of this probability as n → ∞? Exercise What is the limit of (2) as n → ∞? The problem was first discussed by E. Lucas...
1 and using Theorem 3.6.10 gives y/(1 − y) = e^y − 1 + e^y Σ_{n=1}^∞ y^n p(n, 0), so that Σ_{n=1}^∞ y^n p(n, 0) = e^{−y}/(1 − y) − 1 = Σ_{n=1}^∞ y^n Σ_{k=0}^n (−1)^k/k!. Hence, by (4) and (3), we get (1). Exercise Find the probability that exactly r + s matches occur given that at least r ... occur. Show that for large n, it is approximately ...
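The coefficients p(n, 0) here are the derangement probabilities, and the closing sum can be checked against a brute-force count; a short sketch of ours:

```python
from itertools import permutations
from math import factorial

# p(n, 0): probability a random permutation of n objects has no fixed point.
for n in range(1, 8):
    count = sum(1 for perm in permutations(range(n))
                if all(perm[i] != i for i in range(n)))
    brute = count / factorial(n)
    series = sum((-1) ** k / factorial(k) for k in range(n + 1))
    assert abs(brute - series) < 1e-12
print("p(n,0) matches the alternating series for n = 1..7")
```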
I decide that I will replace all the dead bulbs at the end of the year only if at least two are adjacent. Find the probability that this will happen. If it does, what is the probability that I will need more than two bulbs? A biased coin is tossed 2n times. Show that the probability that the number of heads is the same... |
ed simply the mass function of X, or even more briefly the p.m.f. The p.m.f., f (x) = fX (x), has the following properties: first, f (x) ≥ 0 for x ∈ {xi : i ∈ Z}, and f (x) = 0 elsewhere. That is to say, it is positive for a countable number of values of x and zero elsewhere. Second, if X (ω) is finite with probability one, ...
orem 7 holds explicitly. Solution The mass function of X is P(X = k) = 1/n. (Because it distributes probability evenly over the values of X, it is called the uniform distribution.) Hence, E(X) = Σ_{x=1}^n x · (1/n) = (1/n) Σ_{x=1}^n (1/2)(x(x + 1) − x(x − 1)) = (n + 1)/2, by successive cancellation. Likewise, using Theorems 4 and 6(iv), E(X²) + E(...
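The telescoping sum is easy to verify exactly; a minimal sketch of ours using exact rational arithmetic:

```python
from fractions import Fraction

# The identity x = (x(x+1) - x(x-1))/2 collapses the sum by cancellation:
# E(X) = (1/n) * sum_{x=1}^{n} x = (n+1)/2 for X uniform on {1, ..., n}.
for n in range(1, 20):
    ex = sum(Fraction(x, n) for x in range(1, n + 1))
    assert ex == Fraction(n + 1, 2)
print("E(X) = (n+1)/2 verified for n = 1..19")
```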
s that as n → ∞ the number of matches has a Poisson distribution (with parameter 1) in the limit. (3) Example: Ménages Revisited In Problem 3.38, we found the probability that exactly m couples were adjacent when seated randomly at a circular table (alternating the sexes) is pm = (2/m!) Σ_{k=0}^{n−m} (−)^k...
.) Remark The Master of the Ball was exploiting the fact that the median of the distribution of T is less than its mean. See Problem 4.51 for bounds on this difference. Note that T has a geometric distribution. (1) Exercise Give an example of a distribution for which the median is larger than th...
n p = 1 2 . Exercise When the first game is over they redivide the a + b coins as follows. All the coins are tossed, one player gets those showing a head, the other gets all those showing a tail. Now they play a series of games as before. What is the expected number to be played until one or other player again has all t... |
he kth bout. Then, E(X) = E(X|A1)p + E(X|A1^c)q, by conditioning on the outcome of the first bout. Now if Pascal is awarded the first bout but not the second, the state of the duel in respect of the final outcome is exactly the same as if he had lost the first bout, except of course that one bout extra has been fought...
ber of red balls in urn I in the long run. A monkey has a bag with four apples, three bananas, and two pears. He eats fruit at random until he takes a fruit of a kind he has eaten already. He throws that away, together with the bag and the rest. What is the mass function of the number of fruit eaten, and what is its expectation?...
called the bull. The archer is as likely to miss the target as she is to hit it. When the 156 4 Random Variables archer does hit the target, she is as likely to hit any one point on the target as any other. What is the probability that the archer will hit the bull? What is the probability that the archer will hit k bul... |
and Dependence Then, by (5), fX (x) = c λ^x Σ_{y=1}^∞ (x + y − 1 choose x) µ^{y−1} = (1 − λ − µ)λ^x/(1 − µ)^{x+1}, x ≥ 0. Likewise, fY (y) = c µ^{y−1} Σ_{x=0}^∞ (x + y − 1 choose x) λ^x = (1 − λ − µ)µ^{y−1}/(1 − λ)^y, y ≥ 1. Thus, X + 1 and Y are both geometric, with parameters λ/(1 − µ) and µ/(1 − λ), respectively, X taking values in the nonnegative integers and Y in the positive integers.
Y) = 5/12, cov (U, V) = 1/2, cov (W, ...) = 1/8 + 3/8 + 5/12 − ... = 1/24 ... = 0. (7) Example 5.1.8 Revisited Recall that Art and Bart are cutting for the deal. Find cov (X, Y) and cov (V, W). Solution E(X) = E(Y) = Σ_{x=2}^{14} x/13 = 8. Also, using (1), E(XY) = (1/(13 · 12))(105 × 104 − 1118) = ...
(14) P(Xi = k) = (1 − p)^k p, k ≥ 0. Show that the mass function of X1 + X2 is f (z) = (z + 1)(1 − p)^z p², z ≥ 0. (b) If (Xi ; i ≥ 1) are independent random variables, each having the geometric distribution (14), find the mass function of Z = Σ_{i=1}^n Xi. Solution (a) Using (12), f (z) = Σ_{k=0}^z (1 − p)^k p (1 − p)^{z−k} p = ...
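The convolution in part (a) can be checked numerically; a sketch of ours (p = 0.3 is an arbitrary choice):

```python
# Convolve two geometric mass functions f(k) = (1-p)^k p on {0, 1, ...} and
# compare with the stated sum f(z) = (z+1)(1-p)^z p^2 (a negative binomial).
p = 0.3
geom = lambda k: (1 - p) ** k * p
for z in range(20):
    conv = sum(geom(k) * geom(z - k) for k in range(z + 1))
    assert abs(conv - (z + 1) * (1 - p) ** z * p ** 2) < 1e-12
print("convolution matches (z+1)(1-p)^z p^2")
```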
ction, and suppose that X, Y, and Z are jointly distributed. Then (assuming all the expectations exist):
(i) E(a|Y) = a;
(ii) E(aX + bZ|Y) = aE(X|Y) + bE(Z|Y);
(iii) E(X|Y) ≥ 0 if X ≥ 0;
(iv) E(X|Y) = E(X), if X and Y are independent;
(v) E(Xg(Y)|Y) = g(Y)E(X|Y);
(vi) E(X|Y, g(Y)) = E(X|Y);
(vii) E(E(X...
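Property (vii), truncated above, is the tower property E(E(X|Y)) = E(X); a small numerical sketch of ours, with a made-up dependent joint p.m.f.:

```python
# A small dependent joint p.m.f. for (X, Y); check the tower property
# E(E(X|Y)) = E(X), i.e., property (vii) in the list above.
pxy = {(x, y): 1 / 9 for x in (0, 1, 2) for y in (0, 1, 2)}
pxy[(2, 2)] += pxy.pop((0, 0))  # shift mass so X and Y are dependent

py = {y: sum(p for (_, yy), p in pxy.items() if yy == y) for y in (0, 1, 2)}
e_x_given = {y: sum(x * p for (x, yy), p in pxy.items() if yy == y) / py[y]
             for y in (0, 1, 2)}

ex = sum(x * p for (x, _), p in pxy.items())
tower = sum(e_x_given[y] * py[y] for y in (0, 1, 2))
assert abs(ex - tower) < 1e-12
print(ex, tower)
```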
of paths from ... Proof Let π be a path that visits b on its journey from (0, 0) to (n − 1, b − 1). Let L be the occasion of its last visit to b. Now reflect that part of the walk after L in the line y = b. This yields a path π′ from (0, 0) to (n − 1, b + 1). Conversely, for any path from (0, 0) to (n − 1, b + 1), we may reflect ...
act that a martingale {Xn; n ≥ 0} that is stopped at a random time T is still a martingale, provided that T is a stopping time for {Xn; n ≥ 0}. That is to say, formally: (10) Theorem Let T be a stopping time for the martingale {Xn; n ≥ 0}, and let Zn = X_{T∧n} = { Xn, n ≤ T; X_T, n > T }. Then Zn is a martingale, and E...
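Theorem (10) is easy to probe by simulation. A sketch of ours (the helper `stopped_value` is hypothetical, not from the text): a simple symmetric random walk started at 0 and stopped on first exit from (−3, 3); by the theorem, E(Zn) = E(X0) = 0 for every n.

```python
import random

random.seed(42)
# Simple symmetric random walk X_n stopped on first exit from (-3, 3).
# Theorem (10): Z_n = X_{T∧n} is again a martingale, so E(Z_n) = E(X_0) = 0.
def stopped_value(n_steps):
    x = 0
    for _ in range(n_steps):
        if abs(x) == 3:          # T has occurred: freeze the walk at X_T
            return x
        x += random.choice((-1, 1))
    return x

for n in (1, 5, 25):
    est = sum(stopped_value(n) for _ in range(200_000)) / 200_000
    print(n, round(est, 3))      # all close to 0
```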
les. If there is a random variable X such that lim_{n→∞} E(Xn − X)² = 0, then Xn is said to converge in mean square to X. We sometimes write this as Xn →(m.s.) X. 5.9 Convergence Although important, this section may be omitted at a first reading. In the preceding section and at various earlier times, we introduced sev...
ess that it is not in general true that var X = E(var (X|Y)). Conditional independence: X and Y are conditionally independent given Z = z if, for all x and y, P(X = x, Y = y|Z = z) = fX|Z (x|z) fY|Z (y|z). Checklist of Terms for Chapter 5:
5.1 joint probability mass function; marginal mass function
5.2 i...
size of the chosen congregation is (1/n) Σ_{i=1}^n E(Xi). The fact that a member picked at random was in a larger expected congregation is a form of sampling "paradox." Exercise For what distributions of Xi, if any, does the expected size of a randomly selected individual's group actually equal the mean size of groups? Family ...
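The sampling "paradox" shows up immediately in simulation; a sketch of ours (the group sizes are drawn arbitrarily). Picking a person at random weights each group by its size, giving E(X²)/E(X), which is never below the plain mean:

```python
import random

random.seed(0)
# Congregation sizes X_1..X_n; picking a congregation at random gives the plain
# mean, but picking a PERSON at random size-biases toward the big groups.
sizes = [random.randint(1, 100) for _ in range(50)]

mean_group = sum(sizes) / len(sizes)
total = sum(sizes)
# A random individual lands in a group of size s with probability s / total.
mean_seen_by_person = sum(s * (s / total) for s in sizes)  # = E(X^2)/E(X)

print(mean_group, mean_seen_by_person)
assert mean_seen_by_person >= mean_group  # equality only if all sizes agree
```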
write down the distribution of the number of marked animals captured, and then evaluate the mean by a method similar to the first method of (a). It is easier to let I j be the indicator of the event that the jth captured animal is marked. Then the required expectation is (2) E [E(Ym )] 1 I j = [E(Ym)]E(I j ) = [E(Ym)] ... |
The point of this example is that you cannot turn a fair game in your favour. Exercise Show that using any of the following systems, the gambler's fortune is a martingale: (a) Optional skipping. At each play, the gambler skips the round or wagers a unit stake. (b) Optional starting. The gambler does not joi...
that P(|X − Y| ≤ M) = 1, where M is finite. Show that if E(X) < ∞, then E(Y) < ∞ and |E(X) − E(Y)| ≤ M. Show that the following are joint p.m.f.s and find the marginal distributions. (a) f (x1, . . . , xk) = (x!/(x1! · · · xk!)) p1^{x1} · · · pk^{xk}, where Σ_{j=1}^k pj = 1 and Σ_{j=1}^k xj = x. (b) f (x, x1, . . . , ...
G(s) of the integer valued random variable X: G(s) = Σ_k P(X = k)s^k. Because all random variables in this chapter are integer valued, this is not again mentioned explicitly. (2) Example Let X be uniformly distributed in {−a, −a + 1, . . . , b − 1, b}, where a, b > 0. Then, provided s ≠ 1, G(s) = Σ_{k=−a}^b s^k/(a + b + 1) = (s^{−a} − s^{b+1})/((a...
probability mass functions. If X is a nonnegative random variable such that Σ_{k=0}^∞ f (k) < 1, then it still makes sense to define the generating function G(s) = Σ_{k=0}^∞ s^k f (k). If Σ_k k f (k) < ∞, then G′(1) = Σ_k k f (k). However, this is not now the expectation E(X), but rather the "defective" expectation. Furthermore, if ∞ ...
1 + t)^Z) = GX (1 + t)GY (1 + t). Finally, we record the existence of yet another function that generates the moments of X, albeit indirectly. (11) Definition If the function κ(t) = log(E(e^{Xt})) = log(MX (t)) can be expanded in powers of t, in the form (12) κ(t) = Σ_{r=1}^∞ κr t^r/r!...
+ 1)) = k! E(S) = k! (n choose k) E(I_{j1} · · · I_{jk}) = k! (n choose k) P(a given set of k all match), so that µ(k) = { 1, k ≤ n; 0, k > n } → 1 for all k as n → ∞. But these are the factorial moments of the Poisson distribution with parameter 1, and so as n → ∞ (9) P(X = k) → 1/(e k!). We conclude with an example tha...
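The Poisson limit (9) can be watched numerically; a simulation sketch of ours (n = 100 and the trial count are arbitrary):

```python
import random
from math import e, factorial

random.seed(7)
# Matching problem: X = number of fixed points of a random permutation of n.
# As n grows, P(X = k) approaches the Poisson(1) mass e^{-1}/k!.
n, trials = 100, 200_000
counts = [0] * (n + 1)
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)
    counts[sum(i == perm[i] for i in range(n))] += 1

for k in range(5):
    print(k, counts[k] / trials, 1 / (e * factorial(k)))
```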
ocess in which E(s^{X2}) = G(s) and E(s^{X1}) = H(s). Show that for all n (10) P(Hn) = 1/G′(1). Solution From (6.1.6), we have that H(s) is a power series with nonnegative coefficients. Furthermore, by L'Hôpital's rule, H(1) = lim_{s↑1} (−G′(s))/(−E(X)) = 1. Hence, H(s) is a p.g.f. Finally, if D(s) = H(s) in (6), then 6...
ccordingly, we define: Continuity The function f (x) is continuous in (α, β) if, for all a ∈ (α, β), lim_{x→a} f (x) = f (a). Now, given a continuous function f (x), we are often interested in two principal questions about f (x). (i) What is the slope (or gradient) of f (x) at the point x = a? (ii) What is the area under ...
(r, n + 1) are independent and identically distributed, with mean µ, variance σ², and cumulant generating function κ(t) = log(E[exp(t X (1, 1))]). Show that E(Zn) = µ^n. Show also that var (Zn) = var (Z1)(E(Z1))^{n−1} + (E(Z1))² var (Zn−1), and hence find an expression for var (Zn) in terms of µ and σ, when µ ≠ 1. Solutio...
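The mean identity can be probed by simulation. A sketch of ours assuming Poisson(µ) family sizes (so a generation of size k produces a Poisson(kµ) next generation, letting each step be drawn in a single call):

```python
import numpy as np

rng = np.random.default_rng(3)
# Branching process Z_n with Poisson(mu) family sizes. A sum of k independent
# Poisson(mu) variables is Poisson(k*mu), so each generation is one draw.
mu, n, reps = 1.2, 6, 200_000
z = np.ones(reps, dtype=np.int64)
for _ in range(n):
    z = rng.poisson(mu * z)
print(z.mean(), mu ** n)  # Monte Carlo mean vs. the exact E(Z_n) = mu^n
```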
E(Ib s^T) + E(Ia^c Ib^c s^T). Now on the event Ia, X1 = T because the first bulb failed before a and was identified as unusual at X1. So E(Ia s^T) = E(Ia s^{X1}). On the event Ia^c ∩ Ib^c, the process regenerates at the first replacement X1 ∈ [a, b], and so T = X1 + T′, where T′ is independent of X1 and has the same...
. A biased coin is tossed N times, where N is a random variable with finite mean. Show that if the numbers of heads and tails are independent, then N is Poisson. [You may want to use the fact that all continuous solutions of f (x + y) = f (x) f (y) take the form f (x) = e^{λx} for some λ.] Let Xn have a negative binomial ...
lection F (the event space) of subsets of Ω (the sample space). Then we think of a random variable X as a real valued function X (ω) defined for each ω. Our first requirement (as in the discrete case) is a function that tells us about the relative likelihoods of possible values of X. Happily, we already have such a functi...
es; the one in Example 21 is called the standard normal density, denoted by N (0, 1), with φ(x) = (2π)^{−1/2} exp(−x²/2). Its distribution is Φ(x), given by Φ(x) = ∫_{−∞}^x φ(v) dv. (22) (23) (24) (25) Example: Gamma Distribution Show that for α, λ, x > 0, the function f (x) = cλ^α x^{α−1} e^{−λx} can be a density. When α is a pos...
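For the gamma density, the normalizing constant is c = 1/Γ(α) (this holds for general α > 0; the example goes on to the positive-integer case). A numerical sketch of ours with arbitrary α and λ:

```python
from math import gamma, exp

# The gamma density f(x) = c * lam**a * x**(a-1) * exp(-lam*x) integrates
# to 1 when c = 1/Gamma(a); check with a simple Riemann sum.
a, lam = 2.5, 1.5
c = 1 / gamma(a)
f = lambda x: c * lam ** a * x ** (a - 1) * exp(-lam * x)

h = 0.001
total = h * sum(f(h * i) for i in range(1, 40_000))  # integrate out to x = 40
print(total)  # ≈ 1.0
```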
void substantial queues? Once again an experiment is impractical. However, simple apparatus can provide us with the rates and properties of traffic on equivalent roads. If we then simulate the workings of the booth and test it with the actual traffic flows, we should obtain reasonable estimates of the chances of congestio... |
e that the word “exponential” is always omitted in this context, and that the required interchange at (8) is permissible if MX (t) exists in an interval that includes the origin. You may also ask, do we always know the density f X (x), if we know MX (t)? After all, the probability generating function uniquely determine... |
(iii) If for all t ≥ 0, H(t)/t increases, then T is (or has) increasing failure rate average, denoted by IFRA. If for all s ≥ 0, t ≥ 0, H(s + t) ≥ H(s) + H(t), then T is new better than used, denoted by NBU. (iv) If for all t ≥ 0, E(T) ≥ E(T − t|At), then T is new better than used in expectation, denote...
), when the derivative exists, with the Key Rule: P(X ∈ A|B) = ∫_{x∈A} f (x|B) dx. Such conditioned random variables may have an expectation if ∫_{−∞}^∞ |x| fX|B(x|B) dx < ∞. Table 7.1. Continuous random variables and their associated characteristics (columns: X, f (x), E X, var X, m.g.f.; rows: Uniform, Exponenti...
ave a setup cost, so that posting y bits costs F(x) = { 1 − e^{−λ(x−a)}, x ≥ a; 0, x < a }; find the optimal delivery policy. (4) (5) (6) 7.15 Example: Obtaining Your Visa A certain consular clerk will answer the telephone only on weekdays at about 10.00 a.m. On any such morning, it is an evens chance whether he is at his desk ...
p , 0 ≤ x < n. √n → y, then → x, j/ −x y² 1 − x , 0 < x < 1. Remark The result of Exercise 6, which is known as Stirling's formula, was established by de Moivre in 1730. The formula actually proved by Stirling in 1730 was n! e^{n+1/2} (n + 1/2)^{−(n+1/2)} → (2π)^{1/2}, as n → ∞. (11) Exercise Prove (10). 10 Let f (x) = c...
tly Continuous Random Variables (c) Show that it is possible to construct a triangle with sides X, Y, 2 − X − Y, with probability one. (d) Show that the angle opposite to the side of length Y is obtuse with probability p0 = c ∫_0^1 (x^{a+1} − x^{a+2})/(2 − x) dx. (e) When a = 0, show that p0 = 3 − 4 log 2. Solution (a) Becau...
< α < π/2, find the probability that b < (X² + Y²)^{1/2} < a and π/4 < tan^{−1}(Y/X) < π/2, given that (X² + Y²)^{1/2} < a and Y > 0. Figure 8.1 Bertrand's paradox. 8.3 Independence Solution Because X and Y are independent, they have joint density f (x, y) = k² exp(−(x² + y²)/2). Make the change of vari...
ize for independent discrete random variables, it is the case that joint m.g.f.s factorize for independent continuous random variables. That is to say, MX,Y (s, t) = MX (s)MY (t) if and only if X and Y are independent. We offer no proofs for the above statements, as a proper account would require a wealth of analytical...
t) = (1 − 2ρστt − σ²τ²(1 − ρ²)t²)^{−1} = [(1 − στ(1 + ρ)t)(1 + στ(1 − ρ)t)]^{−1}. Hence, Z = X1Y1 + X2Y2 has an asymmetric bilateral exponential density, f (z) = { ((1 + ρ)/2) exp(−στ(1 + ρ)z), z > 0; ((1 − ρ)/2) exp(στ(1 − ρ)z), z < 0 }. We note without proof that ψ(Y) has the useful properties that we recorded in...
inear and invertible with |J| = 1. Hence, by Theorem 8.7.2, the random variables Tn = Σ_{i=1}^n Xi, 1 ≤ n ≤ k + 1, have joint density (9) f (t1, . . . , tk+1) = λ^{k+1} e^{−λt_{k+1}}, 0 < t1 < . . . < tk+1. Now P(0 < T1 < t1 < T2 < . . . < Tk < tk; N (t) = k) = P(0 < T1 < t1 < . . . < Tk < tk < t < Tk+1) = λ^k t1(t2 − t1) . . . (tk − t...
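This computation underlies the conditional uniformity of Poisson arrival times: given N(t) = k, the arrival epochs behave like the order statistics of k uniforms on (0, t). A simulation sketch of ours (λ, t, k are arbitrary; the mean of the first of k = 3 uniform order statistics on (0, 4) is t/(k + 1) = 1):

```python
import numpy as np

rng = np.random.default_rng(11)
# Condition a rate-lam Poisson process on having exactly k arrivals in (0, t],
# then compare E(first arrival) with t/(k+1), the uniform order-statistic value.
lam, t, k, reps = 1.0, 4.0, 3, 100_000
first_arrivals = []
for _ in range(reps):
    gaps = rng.exponential(1 / lam, size=k + 1)
    times = np.cumsum(gaps)
    if times[k - 1] < t <= times[k]:        # exactly k arrivals in (0, t]
        first_arrivals.append(times[0])

print(np.mean(first_arrivals), t / (k + 1))  # both ≈ 1.0
```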
r martingales and the optional stopping theorem, and prove simple forms of the weak law of large numbers and the central limit theorem. We summarize most of these principal properties for the bivariate case (X, Y ). The extension to larger collections of random variables (X 1, X 2, X 3, . . . ; the multivariate case) i... |
r) of a semicircle with radius 1. Show that the expected area of the resulting triangle they make with the midpoint of the diameter is 1/(2 + π). Exercise Write down the joint density of U and W ; then integrate to derive (1) by a fourth method. 8.13 Example: Buffon’s Needle An infinite horizontal table is marked with a... |
= P(B(t) > y)P(C(t) > z), by the independence of increments. Furthermore, we showed that N (t) − N (t − z) has the same distribution as N (z), for t and t − z both nonnegative. Hence, (1) P(C(t) > z) = e^{−λz}. (2) Likewise, P(B(t) > y) = { 1, y < 0; e^{−λy}, y > 0 }. Hence, E(B + C) = 1 ...
is uniform in x on finite intervals including 0.] Solution Let Sn = Σ_{r=1}^n Xr. Then Sn is Poisson with parameter n, mean n, and variance n. Thus, P(Sn = n) = e^{−n}n^n/n!, and we may write √n e^{−n}n^n/n! = √n P(Sn = n) = √n P(n − 1 < Sn ≤ n) = √n P(−1/√n < (Sn − n)/√n ≤ 0) = Fn(0) − Fn(−1/√n)...
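The limiting value extracted from this computation is 1/√(2π); a quick numerical sketch of ours, done in logs to keep the arithmetic stable:

```python
from math import exp, log, pi, sqrt

# sqrt(n) e^{-n} n^n / n! -> 1/sqrt(2 pi): the limit obtained above from the
# central limit theorem applied to Poisson(n) sums.
for n in (10, 100, 1000):
    log_val = 0.5 * log(n) - n + n * log(n) - sum(log(k) for k in range(1, n + 1))
    print(n, exp(log_val), 1 / sqrt(2 * pi))
```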
= max {U1, . . . , Un}. Show that, as n → ∞, the distribution of Zn = n(1 − Mn) converges to an exponential distribution. Let (Xi ; i ≥ 1) be independent exponential random variables each with parameter µ. Let N be independent of the Xi having mass function fN (n) = (1 − p) p^{n−1}, n ≥ 1. What is the density of Y = Σ_{i=1}^N Xi? Let...
egers Z+. When S is a finite set, X is known as a finite Markov chain. Until further notice, we consider finite chains (unless it is specifically stated otherwise) and write |S| = d. (5) Example: Information Source A basic concern of telecommunications engineers is the transmission of signals along a channel. Signals ...
p transition probabilities (pik(n); 1 ≤ i ≤ d, 1 ≤ k ≤ d) can be regarded as a matrix Pn, and the absolute probabilities (αi(n); 1 ≤ i ≤ d) as a row vector αn. It follows from Theorem 12 and (18) that Pm+n = Pm Pn = P^{m+n} and αn = αP^n, where α = (α1, . . . , αd ). (20) Example: Two State Chain The ...
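The relation αn = αP^n is a one-liner to check numerically; a sketch of ours for a generic two-state chain with P = [[1−a, a], [b, 1−b]], whose stationary distribution is (b, a)/(a + b):

```python
import numpy as np

# Absolute probabilities as a row vector: alpha_n = alpha P^n.
a, b = 0.3, 0.1
P = np.array([[1 - a, a], [b, 1 - b]])
alpha = np.array([1.0, 0.0])     # start in state 1

for n in range(60):
    alpha = alpha @ P            # one more step: alpha_{n+1} = alpha_n P
print(alpha, np.array([b, a]) / (a + b))  # both ≈ (0.25, 0.75)
```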
at a first passage time T , the future of the chain is independent of the past. The following example makes this more precise. (17) Example: Preservation of Markov Property at First Passage Times Let X be a regular Markov chain with transition matrix P, and let T be the first passage time of the chain to d. Show that fo... |
+ Σ_i πi δik µk = 1 + Σ_i Σ_j πi pij µjk = 1 + Σ_j πj µjk, on using the fact that π = πP. Hence, using (19) in the second sum, we have πk µk = 1. Because µk is uniquely determined and finite, the required results follow. (21) Example: Cube Suppose that a particle performs a random walk on the vertices of a cube in such a way...
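The identity πk µk = 1 can be tested directly on the cube example: the stationary distribution is uniform on the 8 vertices, so the mean recurrence time of each vertex should be 8. A simulation sketch of ours (`return_time` is a hypothetical helper):

```python
import random

random.seed(5)
# Random walk on the cube's 8 vertices, encoded as 3-bit integers; each step
# flips one coordinate at random. Stationary pi is uniform, so mu_k = 1/pi_k = 8.
def return_time(start=0):
    v, steps = start, 0
    while True:
        v ^= 1 << random.randrange(3)   # flip one of the three coordinates
        steps += 1
        if v == start:
            return steps

print(sum(return_time() for _ in range(200_000)) / 200_000)  # ≈ 8
```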
Σ_{k∈S, g(k)≠0} P( |Vk(n)/(n + 1) − πk| > ε/(d g(k)) ) → 0, as n → ∞, by Theorem 9 (using the fact that S is finite). We can give an immediate application of these results. (18) Example: Asymptotic Equipartition for a Markov Source Let the regular Markov chain X with transition matrix P and stationary distribution π repr...
eviously obtained solution satisfies (9.6.3). A pressing question is, can we solve (9.6.3) without already knowing the answer? We therefore develop a technique for tackling the Chapman–Kolmogorov equations in this section. First, we observe that for the Poisson process, as t → 0 (1) (2) (3) (4) pk,k+1(t) = P(N (t) = 1) ... |
) for all t. However, if αβ > 0, then the chain is irreducible and has stationary distribution π = (β/(α + β), α/(α + β)). We can check that for all t, π0 = β/(α + β) = π0 p00(t) + π1 p10(t), the terms in αe^{−(α+β)t} and βe^{−(α+β)t} cancelling. In practice, the state space is often countably infinite, and of course we ...
but does not get too far away from zero. These path properties are not so easy to verify so we turn our attention to the other kind, that is, properties of the joint distributions of W (t), which are again best illustrated by examples. It is useful to recall that the N (0, σ 2) normal density is denoted by φσ 2(x) = 1... |
), 0 ≤ s ≤ t}. We note these two useful facts. First, (22) {M(t) ≥ c} ⊇ {W (t) ≥ c}. Second, for c > 0, denoting the first passage time to c by Tc, (23) {M(t) ≥ c} ≡ {Tc ≤ t}, and after Tc the process has a symmetric distribution about c that is independent of {W (t), t ≤ Tc}. Therefore, P(W (t) ≤ c|M(t) ≥ c) = P(W (t) ≥ c|M(t) ≥ c) = ...
ndent of the past. We make this idea precise for Markov processes in discrete and continuous time. For those with discrete state space, we derive the Chapman–Kolmogorov equations and use them to examine the evolution of the chain over time. We consider first passage times and recurrence times, and examine the link to st... |
f it is in C2, then it is transferred with probability β. Otherwise, the particles remain where they are. Show that X has stationary distribution (1) πi = (m choose i) α^{m−i} β^i/(α + β)^m. Solution (a) Given X0, . . . , Xn = j, the probability that a particle in C1 is selected for transfer is j/m, and the probability that a par...
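The claimed stationary distribution can be verified through detailed balance; a sketch of ours, reading X as the number of particles in C1, so that p(i, i+1) = ((m − i)/m)β and p(i, i−1) = (i/m)α (exact rational arithmetic makes the check conclusive for a fixed m):

```python
from fractions import Fraction
from math import comb

# Claimed stationary distribution: pi_i = C(m,i) alpha^(m-i) beta^i / (alpha+beta)^m.
m = 6
alpha, beta = Fraction(2, 5), Fraction(1, 3)
pi = [Fraction(comb(m, i)) * alpha ** (m - i) * beta ** i / (alpha + beta) ** m
      for i in range(m + 1)]

# Detailed balance pi_i p(i,i+1) = pi_{i+1} p(i+1,i) implies stationarity.
for i in range(m):
    up = Fraction(m - i, m) * beta       # a C2 particle is chosen and moves in
    down = Fraction(i + 1, m) * alpha    # a C1 particle is chosen and moves out
    assert pi[i] * up == pi[i + 1] * down
print("detailed balance holds; pi sums to", sum(pi))  # sums to exactly 1
```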
ikewise, pji(1) = pji(2) = 0 and µji = (1 + 1/2 + 1/4) · 8 = 14. (b) Let Dj denote the event that the chain first enters D at j. Then µsi = E(Tsi) = E(Tsi − TsD + TsD) = E(Tsi − TsD) + µsD = Σ_{j∈D} E(Tsi − Tsj|Dj)φsj + µsD. However, given Dj, the chain continues its journey to i independe...
oisson with parameter νx, so that using conditional expectation again, the generating function of the descendants at t of the arrivals in [t − x, t] is (6) exp( νx [ (1/(λx)) log(1 − s + se^{−λx}) − 1 ] ) = e^{−νx} (1 − s + se^{−λx})^{ν/λ}. Now we recall from Example 8.17 that the current life (or age) of a Poisson process has density...
rcise Let Tb be the first passage time of W (t) to b ≠ 0. Show that ETb = ∞. Use the martingales 9.9.29–9.9.31 to show that 3ET² = 3a²b² − ab(a² + b²) and 3 var T = −ab(a² + b²). (4) Exercise Use the martingale e^{θW(t) − θ²t/2} = Mθ to show that, when a = −b, Ee^{−sT} = [cosh(a√(2s))]^{−1}. (Hint: Show that Mθ + M−θ is a mar...
s with parameter pλ(t). Let X (t) be a Markov chain with transition probabilities pi j (t) and stationary distribution π. Let (Tn; n ≥ 0) be the jump times of a Poisson process independent of X (t). Show that the sequence Yn = X (Tn) is a Markov chain with the same stationary distribution as X (t). Find the mean and va... |
p1 3 p1 + p2(1 − p1) . (b) ; and p2 = p1 1 − p1 (b) a = b = c. . (a) a = 1, b = 1, c = 2; 16 41 20 41 2.14.1 2.14.2 482 1 3 4 5 6 7 8 9 Appendix Problems (a) 0.12; (b) 0.61; (c) 0.4758; (d) ; 1 2 respectively; (ii) 7 9 ; ; (i) 1 1 6 3 (b) No. (a) 0.36; (b) 0.06; (c) 0.7; (d) (iii) 18 41 . 7 13 1 14 ; 2 7 ; 9 14 , respe... |
e, E(T) = 14. To find E(U), consider the event that a sequence of n tosses including no HTH is followed by HTH. Hence, either U = n + 1 or U = n + 3, and so (1/8)P(U > n) = (1/4)P(U = n + 1) + P(U = n + 3). Summing over n gives (1/8)E(U) = 1/4 + 1; E(U) = 10. 9 E(X) = ∞. 14 15 (i) 1/(b − a + 1); e.g., f (−2) = 1/2 fo...
c. E(s N ) + s2 4 ; ; E(N ) = 6. 1 − 1 ps Hence, (1 − s)G R(s) = cd(1 − p)s + cs . ER = 1 2 15 By the independence var(H − T ) = var(H ) + var(T ) = λ = var(N ). 16 With the notation of Problem 15, log(1 − ps), cp/(1 − p). where d = log(1 − p) 1 p P(R = r |X = x)P(X = x) = ∞ x=r cpx x(x + 1) . E(s H t T ) = E(s H t N −... |
e same as X (1), namely n(1 − x)n−1; 0 ≤ x ≤ 1. 8.12.7 By symmetry, this is the same as the joint density of X (1) and 1 − X (n). Now P(X (1) > x, 1 − X (n) > y) = (1 − x − y)n, so f = n(n − 1)(1 − x − y)n−2. 8.12.8 Given neither point is on the diameter, the density of the angle they make at the midpoint of the diamet... |
; n ≥ 0). 9.14.10 If Bn > 0, then Bn+1 = Bn − 1, and if Bn = 0, then Bn+1 is the time to the next event, less the elapsed unit of time. Hence, B is a Markov chain with pi,i−1 = 1, i > 0, and p0j = fX(j + 1) = P(X = j + 1). Hence, for a stationary distribution π with π(s) = Σ_i s^i πi, ... whence ... and so if Σ_i πi = 1, ... Henc...
lity with martingales, Cambridge University Press. Markov Chains and Other Random Processes Most of the above books contain much material on Markov chains and other random processes at their own levels. However, mention should be made of the classic text: Doob, J.L. (1953) Stochastic processes, John Wiley, New York. Fi... |
nce, 199–200 in counting, 83–84 cumulative, 117–118 current life, 382–383 defective, 239, 244 discrete, 114 excess life, 382–383 expectation, 302–306 exponential, 292–294, 297–298, 303, 311–312, 314, 320 fns, 117–118 gamma, 297 Gaussian. See normal distn geometric, 124–125, 134, 137, 217, 292 hypergeometric, 134 joint,... |
e times, 410 jointly continuous rv, 383 lumping states and, 399, 474 messages, 397 Poisson pr and, 427–431 simple rw, 397 strong, 410, 444 Wiener pr, 437 Markov sources, 398, 405–406, 423–424 Markov time, 474 martingales, 190–196 backward, 195–196 bounded, 225 branching, 278 coin toss, 190–191 conditional expectation, ... |
ing, optional, 194, 201, 223, 231, 446–447 stopping times, 192–194, 367–368, 444 strange but true, 215–216 strong laws of large numbers, 198 strong Markov property, 410, 444 subjective probability, 9–10 submartingales, 195–196, 202 substochastic matrices, 400 sudden death, 65–66 sums arithmetic, 21 binomial, 176–177 de... |