Covariance

Definition:

Given random variables X and Y with joint density p(x, y) and means E(X) = µ_1 and E(Y) = µ_2, the covariance of X and Y is

cov(X, Y) = E[(X − µ_1)(Y − µ_2)].
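To make the definition concrete, here is a minimal numerical sketch, assuming NumPy is available (the joint distribution below is made up purely for illustration). It estimates cov(X, Y) directly from the definition, with the means replaced by sample means, and compares the result with NumPy's built-in estimator.

```python
# Illustrative sketch: estimate cov(X, Y) by Monte Carlo.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

x = rng.normal(loc=1.0, scale=2.0, size=n)            # X with E(X) = 1, var(X) = 4
y = 0.5 * x + rng.normal(loc=3.0, scale=1.0, size=n)  # Y built to depend on X

# Direct use of the definition: average of (X - mu_1)(Y - mu_2),
# with the true means replaced by sample means.
cov_def = np.mean((x - x.mean()) * (y - y.mean()))

# np.cov returns the 2x2 sample covariance matrix;
# the off-diagonal entry is the covariance of X and Y.
cov_np = np.cov(x, y)[0, 1]

print(cov_def, cov_np)  # both should be close to 0.5 * var(X) = 2.0
```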

Properties

1) cov(X, Y) = E(XY) − E(X) E(Y).

Proof: expand the product inside the expectation: E[(X − µ_1)(Y − µ_2)] = E(XY) − µ_2 E(X) − µ_1 E(Y) + µ_1 µ_2 = E(XY) − µ_1 µ_2 = E(XY) − E(X) E(Y).

2) cov(X, X) = var(X).

3) If X and Y are independent then cov(X, Y) = 0. Proof: Independence of X and Y implies that E(XY) = E(X)E(Y), so by Property 1, cov(X, Y) = E(XY) − E(X)E(Y) = 0.

Note: The converse is NOT true in general. It can happen that the covariance is 0 but X and Y are highly dependent (try to think of an example; use the intuition that covariance is a linear property but dependence is not). However, for the bivariate normal case the converse does hold.
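If you want to check your example numerically (or see one standard construction), here is a small sketch, again assuming NumPy: take X standard normal and Y = X². Then Y is a deterministic function of X, yet cov(X, Y) = E(X³) − E(X)E(X²) = 0, because the odd moments of a standard normal vanish.

```python
# One standard uncorrelated-but-dependent pair: X ~ N(0, 1), Y = X^2.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1_000_000)
y = x ** 2                      # Y is completely determined by X

print(np.cov(x, y)[0, 1])       # near 0, despite total dependence
```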

For any two random variables,

var(X + Y) = var(X) + var(Y) + 2 cov(X, Y).

If the variables are uncorrelated (i.e., cov(X, Y) = 0), then

var(X + Y) = var(X) + var(Y).

Since independence of X and Y implies that their covariance is 0, the variances of independent random variables add.
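As a quick sanity check of the variance identity, here is a simulation sketch (assuming NumPy; the correlated pair below is an arbitrary construction). Using the sample covariance matrix for every estimate keeps the left and right sides on the same footing.

```python
# Simulation check of var(X + Y) = var(X) + var(Y) + 2 cov(X, Y).
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

x = rng.normal(size=n)
y = x + rng.normal(size=n)   # correlated with X by construction

c = np.cov(x, y)             # 2x2 sample covariance matrix
lhs = np.var(x + y, ddof=1)
rhs = c[0, 0] + c[1, 1] + 2 * c[0, 1]

print(lhs, rhs)              # the two numbers should nearly agree
```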