
Title: Stat100c var cov operations
Author: Peter Sam
Course: Statistics
Institution: University of California, Los Angeles



University of California, Los Angeles
Department of Statistics
Statistics 100C
Instructor: Nicolas Christou

Variance and covariance operations in simple regression

Let X and Y be random variables with means \mu_X and \mu_Y respectively. The covariance, denoted cov(X, Y), is a measure of the association between X and Y.

Definition: \sigma_{XY} = cov(X, Y) = E[(X - \mu_X)(Y - \mu_Y)].

Note: if X and Y are independent, then E(XY) = E(X)E(Y), and therefore cov(X, Y) = 0. The converse is NOT always true!

Let W, X, Y, Z be random variables, and a, b, c, d be constants. Then

• cov(a + X, Y) = cov(X, Y)
• cov(aX, bY) = ab cov(X, Y)
• cov(X, Y + Z) = cov(X, Y) + cov(X, Z)
• cov(aW + bX, cY + dZ) = ac cov(W, Y) + ad cov(W, Z) + bc cov(X, Y) + bd cov(X, Z)

• Variance of the sum of two random variables: var(X + Y) = var(X) + var(Y) + 2 cov(X, Y)
• Variance of a linear combination of two random variables: var(aX + bY) = a^2 var(X) + b^2 var(Y) + 2ab cov(X, Y)
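These identities hold for sample covariances as well, since the sample covariance is bilinear, so they can be checked numerically. A minimal sketch with NumPy, using hypothetical simulated data and arbitrary constants (the data and the values of a, b, c are not from the handout):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical paired data standing in for draws of (X, Y).
x = rng.normal(size=1000)
y = 0.5 * x + rng.normal(size=1000)

def cov(u, v):
    # Sample covariance (np.cov uses ddof=1, matching np.var(..., ddof=1)).
    return np.cov(u, v)[0, 1]

a, b, c = 2.0, -3.0, 5.0
# cov(c + X, Y) = cov(X, Y): shifting by a constant does not change covariance.
assert np.isclose(cov(c + x, y), cov(x, y))
# cov(aX, bY) = ab cov(X, Y)
assert np.isclose(cov(a * x, b * y), a * b * cov(x, y))
# var(aX + bY) = a^2 var(X) + b^2 var(Y) + 2ab cov(X, Y)
lhs = np.var(a * x + b * y, ddof=1)
rhs = a**2 * np.var(x, ddof=1) + b**2 * np.var(y, ddof=1) + 2 * a * b * cov(x, y)
assert np.isclose(lhs, rhs)
print("all identities hold")
```

The assertions pass to floating-point precision because each identity is exact algebra, not an approximation.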


A general result: the covariance operation is additive.

  cov(\sum_{i=1}^n X_i, \sum_{j=1}^m Y_j) = \sum_{i=1}^n \sum_{j=1}^m cov(X_i, Y_j).

Proof: Let E(X_i) = \mu_i and E(Y_j) = v_j. Then

  E(\sum_{i=1}^n X_i) = \sum_{i=1}^n \mu_i  and  E(\sum_{j=1}^m Y_j) = \sum_{j=1}^m v_j.

Therefore, using the definition of covariance,

  cov(\sum_{i=1}^n X_i, \sum_{j=1}^m Y_j)
    = E[(\sum_{i=1}^n X_i - \sum_{i=1}^n \mu_i)(\sum_{j=1}^m Y_j - \sum_{j=1}^m v_j)]
    = E[\sum_{i=1}^n (X_i - \mu_i) \sum_{j=1}^m (Y_j - v_j)]
    = E[\sum_{i=1}^n \sum_{j=1}^m (X_i - \mu_i)(Y_j - v_j)]
    = \sum_{i=1}^n \sum_{j=1}^m E[(X_i - \mu_i)(Y_j - v_j)]
    = \sum_{i=1}^n \sum_{j=1}^m cov(X_i, Y_j).

Similarly, one can find the covariance between \sum_{i=1}^n a_i Y_i and \sum_{j=1}^n b_j Y_j:

  cov(\sum_{i=1}^n a_i Y_i, \sum_{j=1}^n b_j Y_j) = \sum_{i=1}^n \sum_{j=1}^n a_i b_j cov(Y_i, Y_j).
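The double-sum formula can be checked numerically, since sample covariances are bilinear too. A minimal sketch with NumPy, using hypothetical correlated series and arbitrary weights (none of these values come from the handout); setting all weights to 1 recovers the plain additivity result:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: n = 3 series Y_i with 500 observations each.
Y = rng.normal(size=(3, 500))
Y[1] += 0.6 * Y[0]                 # make the series correlated
a = np.array([1.0, -2.0, 0.5])     # arbitrary weights a_i
b = np.array([0.5, 1.0, 3.0])      # arbitrary weights b_j

# Left side: sample covariance of the two linear combinations.
lhs = np.cov(a @ Y, b @ Y)[0, 1]
# Right side: double sum of a_i b_j cov(Y_i, Y_j) over all pairs.
S = np.cov(Y)                      # 3x3 sample covariance matrix
rhs = sum(a[i] * b[j] * S[i, j] for i in range(3) for j in range(3))
assert np.isclose(lhs, rhs)
print("double-sum formula holds")
```

In matrix form the right side is a^T S b, which is often the quickest way to evaluate it.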

Result: If Y_1, ..., Y_n are independent (e.g. one of the Gauss-Markov conditions in regression), then for i \ne j we have cov(Y_i, Y_j) = 0 and therefore:

  cov(\sum_{i=1}^n a_i Y_i, \sum_{j=1}^n b_j Y_j)
    = a_1 b_1 cov(Y_1, Y_1) + a_1 b_2 cov(Y_1, Y_2) + ... + a_1 b_n cov(Y_1, Y_n)
    + a_2 b_1 cov(Y_2, Y_1) + a_2 b_2 cov(Y_2, Y_2) + ... + a_2 b_n cov(Y_2, Y_n)
    + ...
    + a_n b_1 cov(Y_n, Y_1) + a_n b_2 cov(Y_n, Y_2) + ... + a_n b_n cov(Y_n, Y_n)
    = \sum_{i=1}^n a_i b_i var(Y_i).

(Because when i = j we have cov(Y_i, Y_i) = var(Y_i).)
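Independence makes the covariance matrix diagonal, so only the i = j terms of the double sum survive. A small check of this reduction with hypothetical variances and weights (values chosen arbitrarily for illustration):

```python
import numpy as np

# Hypothetical var(Y_i) for four independent random variables.
sigma2 = np.array([1.0, 4.0, 0.25, 9.0])
S = np.diag(sigma2)                 # independence => zero off-diagonal entries
a = np.array([1.0, -2.0, 0.5, 3.0])
b = np.array([2.0, 1.0, -1.0, 0.5])

full = a @ S @ b                    # sum_i sum_j a_i b_j cov(Y_i, Y_j)
reduced = np.sum(a * b * sigma2)    # sum_i a_i b_i var(Y_i)
assert np.isclose(full, reduced)
print(full)                         # → 7.375
```

The full double sum and the reduced single sum agree exactly, since every off-diagonal term is multiplied by a zero covariance.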


Application in simple regression: Let Y_i = \beta_0 + \beta_1 x_i + \epsilon_i, where the Gauss-Markov conditions hold. Use the previous result to find cov(\bar{Y}, \hat{\beta}_1), cov(\hat{\beta}_0, \hat{\beta}_1), and cov(\hat{Y}_i, \hat{Y}_j).

Hint: Express \bar{Y}, \hat{\beta}_0, \hat{\beta}_1, \hat{Y}_i as linear combinations of Y_1, ..., Y_n.

  \bar{Y} = (1/n) \sum_{i=1}^n Y_i = \sum_{i=1}^n a_i Y_i, where a_i = 1/n.
  \hat{\beta}_1 = \sum_{i=1}^n k_i Y_i. What is k_i?
  \hat{\beta}_0 = \sum_{i=1}^n l_i Y_i. What is l_i?
  \hat{Y}_i = \sum_{l=1}^n c_l Y_l and \hat{Y}_j = \sum_{r=1}^n d_r Y_r. What are c_l, d_r?

Therefore,

1. cov(\bar{Y}, \hat{\beta}_1) = cov(\sum_{i=1}^n a_i Y_i, \sum_{j=1}^n k_j Y_j) =

2. cov(\hat{\beta}_0, \hat{\beta}_1) = cov(\sum_{i=1}^n l_i Y_i, \sum_{j=1}^n k_j Y_j) =

3. cov(\hat{Y}_i, \hat{Y}_j) = cov(\sum_{l=1}^n c_l Y_l, \sum_{r=1}^n d_r Y_r) =
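As one worked instance, consider item 1. If we assume the standard least-squares weights (the answer the handout asks for: k_i = (x_i - \bar{x})/S_{xx} with S_{xx} = \sum (x_i - \bar{x})^2), then under Gauss-Markov the previous result gives cov(\bar{Y}, \hat{\beta}_1) = \sigma^2 \sum_i a_i k_i = (\sigma^2/n) \sum_i k_i = 0, because the k_i sum to zero. A sketch with hypothetical design points and error variance (both invented for illustration):

```python
import numpy as np

# Assumed answer to "what is k_i?": the standard least-squares weight
# k_i = (x_i - xbar) / Sxx, so that bhat1 = sum_i k_i Y_i.
# Under Gauss-Markov, cov(Y_i, Y_j) = sigma^2 if i = j and 0 otherwise, so
# cov(Ybar, bhat1) = sigma^2 * sum_i a_i k_i = (sigma^2 / n) * sum_i k_i.
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])   # hypothetical design points
n = len(x)
sigma2 = 2.5                                # hypothetical error variance
a = np.full(n, 1.0 / n)                     # Ybar = sum_i a_i Y_i
k = (x - x.mean()) / np.sum((x - x.mean())**2)

cov_ybar_b1 = sigma2 * np.sum(a * k)        # sum of the k_i is 0, so this vanishes
assert abs(cov_ybar_b1) < 1e-12
print("cov(Ybar, bhat1) =", cov_ybar_b1)
```

Items 2 and 3 follow the same pattern once l_i, c_l, and d_r are written out.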


Variance of the sum of n random variables: Using the result above, find var(Y_1 + ... + Y_n):

  var(\sum_{i=1}^n Y_i)
    = cov(\sum_{i=1}^n Y_i, \sum_{j=1}^n Y_j)
    = \sum_{i=1}^n \sum_{j=1}^n cov(Y_i, Y_j)
    = \sum_{i=1}^n var(Y_i) + \sum_{i=1}^n \sum_{j \ne i} cov(Y_i, Y_j)
    = \sum_{i=1}^n var(Y_i) + 2 \sum_{i=1}^{n-1} \sum_{j > i} cov(Y_i, Y_j)
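This decomposition also holds exactly for sample variances and covariances, which makes it easy to verify. A sketch with hypothetical correlated series (data invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical data: n = 4 correlated series, 400 observations each.
Y = rng.normal(size=(4, 400))
Y[1] += 0.5 * Y[0]                 # introduce correlation
n = 4
S = np.cov(Y)                      # 4x4 sample covariance matrix

total = np.var(Y.sum(axis=0), ddof=1)          # var of the sum
diag = np.trace(S)                              # sum_i var(Y_i)
upper = sum(S[i, j] for i in range(n) for j in range(i + 1, n))
assert np.isclose(total, diag + 2 * upper)      # var + 2 * pairwise covariances
print("variance decomposition holds")
```

Note that if the Y_i were uncorrelated, the `upper` term would be (near) zero and the variance of the sum would just be the sum of the variances.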


The previous result can be extended to the more general case. Let Y_1, Y_2, ..., Y_n be random variables, and a_1, a_2, ..., a_n be constants. Find the variance of the linear combination Q = a_1 Y_1 + a_2 Y_2 + ... + a_n Y_n:

  var(\sum_{i=1}^n a_i Y_i)
    = cov(\sum_{i=1}^n a_i Y_i, \sum_{j=1}^n a_j Y_j)
    = \sum_{i=1}^n \sum_{j=1}^n a_i a_j cov(Y_i, Y_j)
    = \sum_{i=1}^n a_i^2 var(Y_i) + \sum_{i=1}^n \sum_{j \ne i} a_i a_j cov(Y_i, Y_j)
    = \sum_{i=1}^n a_i^2 var(Y_i) + 2 \sum_{i=1}^{n-1} \sum_{j > i} a_i a_j cov(Y_i, Y_j)
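In matrix form this is var(a^T Y) = a^T S a, where S is the covariance matrix of Y. A quick numerical check with hypothetical data and coefficients (both invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical data: 3 correlated series, 300 observations each.
Y = rng.normal(size=(3, 300))
Y[2] += 0.4 * Y[0] - 0.2 * Y[1]
a = np.array([2.0, -1.0, 0.5])         # hypothetical coefficients a_i
S = np.cov(Y)                          # 3x3 sample covariance matrix

lhs = np.var(a @ Y, ddof=1)            # var(sum_i a_i Y_i) computed from the data
rhs = a @ S @ a                        # sum_i sum_j a_i a_j cov(Y_i, Y_j)
assert np.isclose(lhs, rhs)
print("quadratic-form identity holds")
```

The quadratic form a^T S a is how this result is usually written in the multiple-regression (matrix) setting.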

Application in simple regression: When the Gauss-Markov condition of independence holds, the previous expression simplifies considerably, because cov(Y_i, Y_j) = 0 for i \ne j. Let Y_i = \beta_0 + \beta_1 x_i + \epsilon_i. Use the previous result to find var(\hat{Y}_i).
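If we assume the standard fitted-value weights (the answer to the earlier "what is c_l?" question: \hat{Y}_i = \sum_l c_l Y_l with c_l = 1/n + (x_i - \bar{x})(x_l - \bar{x})/S_{xx}), then independence gives var(\hat{Y}_i) = \sigma^2 \sum_l c_l^2, and the algebra collapses this to the familiar \sigma^2 (1/n + (x_i - \bar{x})^2 / S_{xx}). A sketch checking that collapse numerically, with hypothetical design points and error variance:

```python
import numpy as np

# Assumed weights for the fitted value at x_i (not derived in the handout):
# c_l = 1/n + (x_i - xbar)(x_l - xbar)/Sxx, so Yhat_i = sum_l c_l Y_l.
# With independent errors, var(Yhat_i) = sigma^2 * sum_l c_l^2.
x = np.array([1.0, 3.0, 4.0, 6.0, 10.0])   # hypothetical design points
n, sigma2, i = len(x), 1.7, 2              # evaluate at the point x_i = x[2]
Sxx = np.sum((x - x.mean())**2)

c = 1.0 / n + (x[i] - x.mean()) * (x - x.mean()) / Sxx
var_from_weights = sigma2 * np.sum(c**2)
var_closed_form = sigma2 * (1.0 / n + (x[i] - x.mean())**2 / Sxx)
assert np.isclose(var_from_weights, var_closed_form)
print("var(Yhat_i) =", var_from_weights)
```

The cross term vanishes because the deviations x_l - \bar{x} sum to zero, which is exactly why the two expressions agree.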


