This is Problem 5 from the May 2014 exam.
Part (a) The joint density of the bivariate normal distribution is
$$f_{Y_1,Y_2}(y_1,y_2) = \frac{1}{2\pi \sigma^2\sqrt{1-\rho^2}}\exp \left(-\frac{1}{2(1-\rho^2)} \left[ \frac{(y_1-\mu)^2}{\sigma^2} + \frac{(y_2-\mu)^2}{\sigma^2} - \frac{2\rho(y_1-\mu)(y_2-\mu)}{\sigma^2} \right]\right)$$
The joint likelihood for the random sample, with $\mu$ replaced by $\alpha+\beta x_i$ in the $i$-th factor, is
$$L(\alpha,\beta,\sigma^2,\rho) = \prod_{i=1}^n f_{Y_1,Y_2}(Y_{1i},Y_{2i}) = \frac{1}{(2\pi)^{n}\sigma^{2n}(1-\rho^2)^{n/2}} \times \\ \exp \left( -\frac{1}{2\sigma^2(1-\rho^2)} \left[ \sum_{i=1}^n (Y_{1i}-\alpha-\beta x_i)^2 + \sum_{i=1}^n (Y_{2i}-\alpha-\beta x_i)^2 - 2\rho\sum_{i=1}^n (Y_{1i}-\alpha-\beta x_i)(Y_{2i}-\alpha-\beta x_i) \right] \right)$$
(Note the normalizing constant belongs in the denominator.)
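As a quick sanity check of the density above (an illustration, not part of the exam solution; all numeric values are made up), we can compare it against the conditional factorization $f(y_1,y_2)=f(y_1)\,f(y_2\mid y_1)$, using the standard fact that $Y_2\mid Y_1=y_1 \sim N(\mu+\rho(y_1-\mu),\,\sigma^2(1-\rho^2))$:

```python
import math

def norm_logpdf(y, mu, var):
    """Univariate normal log-density."""
    return -0.5 * math.log(2.0 * math.pi * var) - (y - mu) ** 2 / (2.0 * var)

def biv_logpdf(y1, y2, mu, sigma2, rho):
    """Joint log-density from Part (a): equal means, equal variances, correlation rho."""
    q = ((y1 - mu) ** 2 + (y2 - mu) ** 2
         - 2.0 * rho * (y1 - mu) * (y2 - mu)) / sigma2
    return (-math.log(2.0 * math.pi * sigma2 * math.sqrt(1.0 - rho ** 2))
            - q / (2.0 * (1.0 - rho ** 2)))

# Hypothetical values: mu plays the role of alpha + beta * x_i for one pair.
mu, sigma2, rho = 1.85, 2.0, 0.3
y1, y2 = 0.4, -1.1

lhs = biv_logpdf(y1, y2, mu, sigma2, rho)
# f(y1, y2) = f(y1) * f(y2 | y1), with Y2 | Y1=y1 ~ N(mu + rho*(y1-mu), sigma2*(1-rho^2))
rhs = (norm_logpdf(y1, mu, sigma2)
       + norm_logpdf(y2, mu + rho * (y1 - mu), sigma2 * (1.0 - rho ** 2)))
assert abs(lhs - rhs) < 1e-9
```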
Let us expand the bracket:
$$\begin{aligned} &\sum_{i=1}^n (Y_{1i}-\alpha-\beta x_i)^2 + \sum_{i=1}^n (Y_{2i}-\alpha-\beta x_i)^2 - 2\rho\sum_{i=1}^n (Y_{1i}-\alpha-\beta x_i)(Y_{2i}-\alpha-\beta x_i) \\ &= \sum_{i=1}^n Y_{1i}^2 - 2\sum_{i=1}^n Y_{1i}(\alpha+\beta x_i) + \sum_{i=1}^n (\alpha+\beta x_i)^2 + \sum_{i=1}^n Y_{2i}^2 - 2\sum_{i=1}^n Y_{2i}(\alpha+\beta x_i) + \sum_{i=1}^n (\alpha+\beta x_i)^2 \\ &\quad - 2\rho \left[ \sum_{i=1}^n Y_{1i}Y_{2i} - \sum_{i=1}^n(\alpha+\beta x_i)Y_{1i} - \sum_{i=1}^n (\alpha+\beta x_i)Y_{2i} + \sum_{i=1}^n (\alpha+\beta x_i)^2 \right] \\ &= \sum_{i=1}^n Y_{1i}^2 + \sum_{i=1}^n Y_{2i}^2 + 2n\alpha^2 - 2\alpha\sum_{i=1}^n (Y_{1i}+Y_{2i}) - 2\beta\sum_{i=1}^n x_i(Y_{1i}+Y_{2i}) + 4\alpha\beta\sum_{i=1}^n x_i + 2\beta^2\sum_{i=1}^n x_i^2 \\ &\quad - 2\rho \left[ \sum_{i=1}^n Y_{1i}Y_{2i} - \alpha\sum_{i=1}^n (Y_{1i}+Y_{2i}) - \beta\sum_{i=1}^n x_i(Y_{1i}+Y_{2i}) + n\alpha^2 + 2\alpha\beta\sum_{i=1}^n x_i + \beta^2\sum_{i=1}^n x_i^2 \right] \end{aligned}$$
Define
$$T_0(Y) = \sum_{i=1}^n (Y_{1i}^2 + Y_{2i}^2),\quad T_1(Y) = \sum_{i=1}^n (Y_{1i}+Y_{2i}),\quad T_2(Y) = \sum_{i=1}^n x_i(Y_{1i}+Y_{2i}),\quad T_3(Y) = \sum_{i=1}^n Y_{1i}Y_{2i}.$$
By the Neyman–Fisher factorization theorem, $(T_0,T_1,T_2,T_3)$ is sufficient. ($T_0$ must be included: it is multiplied by the parameter-dependent coefficient $-\frac{1}{2\sigma^2(1-\rho^2)}$, so it cannot be absorbed into the data-only factor.) To show minimality, consider another random sample $\{(Z_{1i},Z_{2i})\}$ and form the likelihood ratio:
$$\frac{L(\alpha,\beta,\sigma^2,\rho\mid \mathbf{Y})}{L(\alpha,\beta,\sigma^2,\rho\mid \mathbf{Z})} = \exp \left( -\frac{1}{2\sigma^2(1-\rho^2)} \Big[ T_0(Y)-T_0(Z) - 2\alpha\big(T_1(Y)-T_1(Z)\big) - 2\beta\big(T_2(Y)-T_2(Z)\big) \\ - 2\rho\big(T_3(Y)-T_3(Z) - \alpha(T_1(Y)-T_1(Z)) - \beta(T_2(Y)-T_2(Z))\big) \Big] \right)$$
This ratio is free of the parameters for every $(\alpha,\beta,\sigma^2,\rho)$ if and only if $T_0(Y)=T_0(Z)$, $T_1(Y)=T_1(Z)$, $T_2(Y)=T_2(Z)$, and $T_3(Y)=T_3(Z)$. Hence $(T_0,T_1,T_2,T_3)$ is a minimal sufficient statistic.
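The sufficiency argument rests on the likelihood depending on the data only through the statistics $T_0(Y)=\sum_i(Y_{1i}^2+Y_{2i}^2)$, $T_1$, $T_2$, $T_3$. A numerical sketch with simulated data (all parameter values hypothetical) checks that the quadratic form in the exponent, computed directly from the residuals, equals the expression written through those statistics:

```python
import math
import random

random.seed(0)
n = 8
x = [random.uniform(-1.0, 1.0) for _ in range(n)]
Y1 = [random.gauss(0.0, 1.0) for _ in range(n)]
Y2 = [random.gauss(0.0, 1.0) for _ in range(n)]

alpha, beta, rho = 0.7, -0.4, 0.25  # hypothetical parameter values

# Quadratic form computed directly from the residuals
r1 = [Y1[i] - alpha - beta * x[i] for i in range(n)]
r2 = [Y2[i] - alpha - beta * x[i] for i in range(n)]
direct = (sum(a * a for a in r1) + sum(b * b for b in r2)
          - 2.0 * rho * sum(r1[i] * r2[i] for i in range(n)))

# The same quantity, written through the statistics T0..T3 and the fixed x-sums
T0 = sum(Y1[i] ** 2 + Y2[i] ** 2 for i in range(n))
T1 = sum(Y1[i] + Y2[i] for i in range(n))
T2 = sum(x[i] * (Y1[i] + Y2[i]) for i in range(n))
T3 = sum(Y1[i] * Y2[i] for i in range(n))
Sx, Sxx = sum(x), sum(v * v for v in x)
via_T = (T0 + 2 * n * alpha ** 2 - 2 * alpha * T1 - 2 * beta * T2
         + 4 * alpha * beta * Sx + 2 * beta ** 2 * Sxx
         - 2 * rho * (T3 - alpha * T1 - beta * T2
                      + n * alpha ** 2 + 2 * alpha * beta * Sx + beta ** 2 * Sxx))
assert abs(direct - via_T) < 1e-9
```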
Part (b) The log-likelihood function is
$$\ell(\alpha,\beta,\sigma^2,\rho) = -n\log (2\pi) - n\log(\sigma^2) - \frac{n}{2}\log(1-\rho^2) \\ - \frac{1}{2\sigma^2(1-\rho^2)} \left[ \sum_{i=1}^n (Y_{1i}-\alpha-\beta x_i)^2 + \sum_{i=1}^n (Y_{2i}-\alpha-\beta x_i)^2 - 2\rho\sum_{i=1}^n (Y_{1i}-\alpha-\beta x_i)(Y_{2i}-\alpha-\beta x_i) \right]$$
Differentiating with respect to $\beta$ (each factor of the cross term contributes $-x_i$):
$$\frac{\partial \ell}{\partial \beta} = -\frac{1}{2\sigma^2(1-\rho^2)} \left[ -2\sum_{i=1}^n x_i(Y_{1i}-\alpha-\beta x_i) - 2\sum_{i=1}^n x_i(Y_{2i}-\alpha-\beta x_i) + 2\rho\sum_{i=1}^n x_i\big((Y_{1i}-\alpha-\beta x_i)+(Y_{2i}-\alpha-\beta x_i)\big) \right] \\ = \frac{1}{\sigma^2(1+\rho)}\sum_{i=1}^n x_i\big((Y_{1i}-\alpha-\beta x_i)+(Y_{2i}-\alpha-\beta x_i)\big)$$
I really don't feel like grinding through the rest of the algebra for this one, so I'm giving up here; the answer follows below. My personal advice: with only four hours for the exam, skip problems like this where the idea isn't hard but the computation is tedious — you only need to answer five of the six anyway.
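Even without finishing the algebra, the $\alpha$ and $\beta$ score equations can be checked numerically: both are proportional to sums of the averaged residuals $W_i - \alpha - \beta x_i$ with $W_i = (Y_{1i}+Y_{2i})/2$, so ordinary least squares of $W_i$ on $x_i$ should zero them for any $\sigma^2$ and $\rho$. A sketch with simulated data (true values $\alpha=1$, $\beta=0.5$ chosen arbitrarily):

```python
import random

random.seed(1)
n = 20
x = [random.uniform(0.0, 3.0) for _ in range(n)]
Y1 = [1.0 + 0.5 * x[i] + random.gauss(0.0, 1.0) for i in range(n)]
Y2 = [1.0 + 0.5 * x[i] + random.gauss(0.0, 1.0) for i in range(n)]
W = [(Y1[i] + Y2[i]) / 2.0 for i in range(n)]

# OLS of W on x (with intercept)
xbar = sum(x) / n
wbar = sum(W) / n
beta_hat = (sum((x[i] - xbar) * (W[i] - wbar) for i in range(n))
            / sum((v - xbar) ** 2 for v in x))
alpha_hat = wbar - beta_hat * xbar

# Score equations for alpha and beta, with the common positive
# factor 1/(sigma2*(1+rho)) dropped; both should vanish at the OLS fit.
s_alpha = sum((Y1[i] - alpha_hat - beta_hat * x[i])
              + (Y2[i] - alpha_hat - beta_hat * x[i]) for i in range(n))
s_beta = sum(x[i] * ((Y1[i] - alpha_hat - beta_hat * x[i])
                     + (Y2[i] - alpha_hat - beta_hat * x[i])) for i in range(n))
assert abs(s_alpha) < 1e-9 and abs(s_beta) < 1e-9
```

This confirms the closed-form $(\hat\alpha,\hat\beta)$ without wading through the full four-parameter system.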