These are Problems 1-3 from the May 2014 theory exam.
For $X_2$,
$$P(X_2 = 0|X_1 = 0) = \frac{b+1}{b+r+1},\qquad P(X_2 = 1|X_1 = 0) = \frac{r}{b+r+1},$$
$$P(X_2 = 0|X_1 = 1) = \frac{b}{b+r+1},\qquad P(X_2 = 1|X_1 = 1) = \frac{r+1}{b+r+1}.$$
Notice $Y \in \{0,1,2\}$.
$$P(Y = 0) = P(X_1 = 0, X_2 = 0) = P(X_2 = 0|X_1 = 0)P(X_1 = 0) = \frac{b(b+1)}{(b+r+1)(b+r)}$$
$$P(Y = 1) = P(X_2 = 1|X_1 = 0)P(X_1 = 0) + P(X_2 = 0|X_1 = 1)P(X_1 = 1) = \frac{rb}{(b+r+1)(b+r)} + \frac{br}{(b+r+1)(b+r)} = \frac{2br}{(b+r+1)(b+r)}$$
$$P(Y = 2) = P(X_2 = 1|X_1 = 1)P(X_1 = 1) = \frac{r(r+1)}{(b+r+1)(b+r)}$$
So
$$EY = \frac{2br}{(b+r+1)(b+r)} + \frac{2r(r+1)}{(b+r+1)(b+r)} = \frac{2r}{b+r}$$
$$EY^2 = \frac{2br}{(b+r+1)(b+r)} + \frac{4r(r+1)}{(b+r+1)(b+r)} = \frac{4r^2 + 2br + 4r}{(b+r+1)(b+r)}$$
$$\mathrm{Var}\,Y = EY^2 - (EY)^2 = \frac{4r^2 + 2br + 4r}{(b+r+1)(b+r)} - \frac{4r^2}{(b+r)^2} = \frac{4r^2b + 2b^2r + 4br + 4r^3 + 2br^2 + 4r^2 - 4r^2b - 4r^3 - 4r^2}{(b+r+1)(b+r)^2} = \frac{2br(b+r+2)}{(b+r+1)(b+r)^2}$$
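As a quick sanity check, here is a minimal Monte Carlo sketch that draws $(X_1, X_2)$ from the conditional probabilities above and compares the empirical mean and variance of $Y = X_1 + X_2$ with $2r/(b+r)$ and $2br(b+r+2)/\big((b+r+1)(b+r)^2\big)$; the values $b = 3$ and $r = 2$ are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo sanity check of E[Y] and Var(Y); b = 3, r = 2 are arbitrary
# illustrative values, and X2 | X1 is drawn from the conditional table above.
rng = np.random.default_rng(0)
b, r, n_sim = 3, 2, 200_000

x1 = (rng.random(n_sim) < r / (b + r)).astype(int)           # P(X1 = 1) = r/(b+r)
p_x2 = np.where(x1 == 1, (r + 1) / (b + r + 1), r / (b + r + 1))
x2 = (rng.random(n_sim) < p_x2).astype(int)                   # P(X2 = 1 | X1)
y = x1 + x2

print(y.mean(), 2 * r / (b + r))                                        # both ~0.8
print(y.var(), 2 * b * r * (b + r + 2) / ((b + r + 1) * (b + r) ** 2))  # both ~0.56
```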
The statement of this problem is incorrect: the coefficient $1/n$ in front of $\tau_Y$ is missing.
Since $\mu_Y = 0$, the kurtosis is $\tau_Y = \frac{EY^4}{\sigma_Y^4} \Rightarrow EY^4 = \sigma_Y^4 \tau_Y$.
For $T = \sum_{i=1}^n Y_i$,
$$ET^2 = E\Big(\sum_{i=1}^n Y_i\Big)^2 = E\sum_{i=1}^n Y_i^2 + \sum_{i \ne j} EY_iEY_j = n\sigma_Y^2$$
since $EY_i = 0$. Similarly,
$$ET^4 = E\Big(\sum_{i=1}^n Y_i\Big)^4 = E\Big(\sum_{i=1}^n Y_i^2 + \sum_{i \ne j} Y_iY_j\Big)^2 = E\Big(\sum_{i=1}^n Y_i^2\Big)^2 + 2E\Big(\sum_{i=1}^n Y_i^2\Big)\Big(\sum_{i \ne j} Y_iY_j\Big) + E\Big(\sum_{i \ne j} Y_iY_j\Big)^2$$
Every term containing some $Y_k$ to the first power has zero expectation (by independence and $EY_k = 0$), so only the $Y_i^4$ terms and the $Y_i^2Y_j^2$ terms (with total coefficient 3) survive:
$$ET^4 = E\sum_{i=1}^n Y_i^4 + 3E\sum_{i \ne j} Y_i^2Y_j^2 = n\sigma_Y^4\tau_Y + 3n(n-1)\sigma_Y^4$$
Since $ET = 0$, $\sigma_T^4 = (ET^2)^2 = n^2\sigma_Y^4$, so
$$\tau_T = \frac{ET^4}{(ET^2)^2} = \frac{1}{n}\tau_Y + \frac{3(n-1)}{n}$$
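A small simulation can be used to check this formula. The sketch below assumes $Y \sim \mathrm{Laplace}(0,1)$, an arbitrary mean-zero choice with known kurtosis $\tau_Y = 6$, and compares the empirical kurtosis of $T$ with $\tau_Y/n + 3(n-1)/n$.

```python
import numpy as np

# Monte Carlo check of tau_T = tau_Y/n + 3(n-1)/n for a mean-zero Y.
# Y ~ Laplace(0, 1) is an arbitrary illustrative choice with kurtosis tau_Y = 6.
rng = np.random.default_rng(0)
n, n_sim, tau_y = 5, 500_000, 6.0

y = rng.laplace(0.0, 1.0, size=(n_sim, n))
t = y.sum(axis=1)
tau_t_hat = np.mean(t**4) / np.mean(t**2) ** 2   # E[T^4] / (E[T^2])^2, valid since E[T] = 0

print(tau_t_hat, tau_y / n + 3 * (n - 1) / n)    # both ~3.6
```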
Part a The expectation of $X$ is
$$EX = \int_0^{\infty} \frac{x}{\theta}e^{-x/\theta}\,dx = -xe^{-x/\theta}\Big|_0^{\infty} + \int_0^{\infty} e^{-x/\theta}\,dx = \theta$$
$$E\hat{\theta}_a = aE\bar{X}_n = \frac{a}{n}\sum_{i=1}^n EX_i = a\theta$$
The second moment of $X$ is
$$EX^2 = \int_0^{\infty} \frac{x^2}{\theta}e^{-x/\theta}\,dx = -x^2e^{-x/\theta}\Big|_0^{\infty} + 2\int_0^{\infty} xe^{-x/\theta}\,dx = 2\theta^2$$
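Both moments can be verified symbolically; this minimal sketch uses sympy with the density $f(x) = e^{-x/\theta}/\theta$ taken directly from the integrals above.

```python
import sympy as sp

# Symbolic check of E[X] = theta and E[X^2] = 2*theta^2 for the density
# f(x) = exp(-x/theta)/theta on (0, infinity).
x, theta = sp.symbols('x theta', positive=True)
f = sp.exp(-x / theta) / theta

EX = sp.integrate(x * f, (x, 0, sp.oo))
EX2 = sp.integrate(x**2 * f, (x, 0, sp.oo))
print(EX, EX2)   # theta, 2*theta**2
```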
$$E\hat{\theta}_a^2 = \frac{a^2}{n^2}E\Big(\sum_{i=1}^n X_i\Big)^2 = \frac{a^2}{n^2}\sum_{i=1}^n EX_i^2 + \frac{a^2}{n^2}\sum_{i\ne j} EX_iX_j = \frac{2na^2\theta^2 + n(n-1)a^2\theta^2}{n^2} = \frac{n+1}{n}a^2\theta^2$$
Now consider
$$E(\hat{\theta}_a - \theta)^2 = E\hat{\theta}_a^2 - 2\theta E\hat{\theta}_a + \theta^2 = \Big(\frac{(n+1)a^2}{n} - 2a + 1\Big)\theta^2$$
To make $E(\hat{\theta}_a - \theta)^2$ as small as possible, set its derivative with respect to $a$ to zero, $\frac{2(n+1)a}{n} - 2 = 0$, which gives $a = \frac{n}{n+1}$.
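A quick symbolic check of this minimization (the names `mse` and `a_star` are just illustrative):

```python
import sympy as sp

# Symbolic check that a = n/(n+1) minimizes ((n+1)*a**2/n - 2*a + 1)*theta**2.
a, n, theta = sp.symbols('a n theta', positive=True)
mse = ((n + 1) * a**2 / n - 2 * a + 1) * theta**2

a_star = sp.solve(sp.diff(mse, a), a)[0]
print(sp.simplify(a_star))               # n/(n + 1)
print(sp.simplify(mse.subs(a, a_star)))  # theta**2/(n + 1)
```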
Part b By the LLN, $\bar{X}_n \to_p EX = \theta$, so $\bar{X}_n$ is consistent. Since $\mathrm{Var}\,X = EX^2 - (EX)^2 = \theta^2$, the CLT gives
$$\frac{\bar{X}_n - \theta}{\theta/\sqrt{n}} \to_d N(0,1)$$
So the asymptotic variance is $\theta^2/n$.
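A minimal simulation sketch of this, assuming $X_i$ is exponential with mean $\theta$ and using the arbitrary illustrative values $\theta = 2$, $n = 50$:

```python
import numpy as np

# Simulation check that Var(X_bar_n) is roughly theta^2/n; theta = 2 and n = 50
# are arbitrary illustrative values.
rng = np.random.default_rng(0)
theta, n, n_sim = 2.0, 50, 200_000

xbar = rng.exponential(scale=theta, size=(n_sim, n)).mean(axis=1)
print(xbar.var(), theta**2 / n)   # both ~0.08
```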
Part c By the LLN, $\bar{Y}_n \to_p EY = 1/\theta$. By the continuous mapping property of convergence in probability, $1/\bar{Y}_n \to_p \theta$, so $1/\bar{Y}_n$ is consistent. Now apply the delta method to the CLT (see UA MATH564 Probability Theory V, the delta method for the CLT): with $g(y) = 1/y$ we have $g'(1/\theta) = -\theta^2$, so
$$\frac{1/\bar{Y}_n - \theta}{\theta/\sqrt{n}} \to_d N(0,1)$$
So the asymptotic variance is $\theta^2/n$.
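A minimal simulation sketch of this, assuming $Y_i$ is exponential with mean $1/\theta$ (consistent with $EY = 1/\theta$ above) and using the arbitrary illustrative values $\theta = 2$, $n = 200$:

```python
import numpy as np

# Simulation check of the delta-method variance: Var(1/Y_bar_n) is close to
# theta^2/n for large n. Y_i is taken to be exponential with mean 1/theta;
# the small remaining gap reflects the finite-n bias of 1/Y_bar_n.
rng = np.random.default_rng(0)
theta, n, n_sim = 2.0, 200, 200_000

est = 1.0 / rng.exponential(scale=1 / theta, size=(n_sim, n)).mean(axis=1)
print(est.var(), theta**2 / n)   # ~0.0204 vs 0.02
```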