Math: Vectors and matrices


Idempotent matrices appear throughout regression analysis, so they can’t be ignored. Here are two commonly used properties, with a quick numerical check below:

Properties of idempotent matrices

  • Every eigenvalue of an idempotent matrix is either 0 or 1.
  • The rank of an idempotent matrix is equal to its trace (sum of its diagonal elements).
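Both properties are easy to verify numerically. Here’s a minimal sketch (assuming NumPy is available; the idempotent matrix is built as a projection onto the column space of a random \mathbf{X}):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3
X = rng.standard_normal((n, k))

# An idempotent matrix: the projection onto the column space of X
H = X @ np.linalg.inv(X.T @ X) @ X.T

# Idempotency: H @ H == H (up to floating-point error)
assert np.allclose(H @ H, H)

# Every eigenvalue is (numerically) 0 or 1
eigvals = np.linalg.eigvalsh(H)
assert np.allclose(np.round(eigvals), eigvals, atol=1e-8)

# Rank equals trace (both equal k = 3 here)
print(np.linalg.matrix_rank(H), np.trace(H))  # 3, 3.0 (approximately)
```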

The following result is useful for deriving F-tests.

Independence of quadratic forms for orthogonal projection matrices

Let \boldsymbol{\epsilon} \sim N(\mathbf{0}, \sigma^2 \mathbf{I_n}) , and let the n \times n matrices \mathbf{A} and \mathbf{B} be orthogonal projections (as such, they must be symmetric and idempotent).
Then the quadratic forms \mathbf{\epsilon'A\epsilon} and \mathbf{\epsilon'B\epsilon} are independently distributed if and only if


\mathbf{AB} = \mathbf{BA} = \mathbf{0}
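
As a quick sanity check of the condition, here’s a sketch (my own construction, assuming NumPy) with two projections onto complementary subspaces:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 3

# Orthonormal basis of R^n, split into two complementary subspaces
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q[:, :k] @ Q[:, :k].T   # projection onto the first k basis vectors
B = Q[:, k:] @ Q[:, k:].T   # projection onto the remaining n - k

# Both are symmetric and idempotent, and they annihilate each other
assert np.allclose(A @ B, 0) and np.allclose(B @ A, 0)

# So eps'A eps and eps'B eps are independent; with sigma^2 = 1,
# eps'A eps ~ chi^2(k) and eps'B eps ~ chi^2(n - k).
```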

An example of an orthogonal projection matrix in regression is the hat matrix, typically denoted in econometrics by \mathbf{P} or \mathbf{H}, where \mathbf{H = X(X'X)^{-1}X'}. It’s called a hat matrix because it puts a hat on the dependent variable, like this:


\mathbf{HY} = \mathbf{\hat{Y}}
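
Here’s a minimal sketch (assuming NumPy; the variable names are mine) confirming that \mathbf{H} is an orthogonal projection and that \mathbf{HY} reproduces the OLS fitted values:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 10, 2
X = rng.standard_normal((n, k))
Y = rng.standard_normal(n)

H = X @ np.linalg.inv(X.T @ X) @ X.T

# H is symmetric and idempotent, i.e., an orthogonal projection
assert np.allclose(H, H.T) and np.allclose(H @ H, H)

# HY equals the OLS fitted values X @ beta_hat
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
assert np.allclose(H @ Y, X @ beta_hat)
```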

Let f be a scalar function of an n \times 1 vector \mathbf{x}=(x_1,x_2,\cdots,x_n)'. Then partially differentiating f(\mathbf{x}) with respect to each x_i and arranging the derivatives into a column vector gives us


\frac{\partial f}{\partial \mathbf{x}} =   \begin{pmatrix}  \frac{\partial f}{\partial x_1} \\  \frac{\partial f}{\partial x_2} \\  \vdots \\  \frac{\partial f}{\partial x_n}   \end{pmatrix}
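
For example, with n = 2 and f(\mathbf{x}) = x_1^2 + 3x_2, this gives


\frac{\partial f}{\partial \mathbf{x}} =   \begin{pmatrix}  2x_1 \\  3   \end{pmatrix}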

Vector differentiation
Suppose f(\mathbf{x}) is a linear function of the column vector \mathbf{x}


f(\mathbf{x}) = \mathbf{a'x}

where \mathbf{a} is an n \times 1 vector of constants. Then


\frac{\partial f}{\partial \mathbf{x}} = \frac{\partial (\mathbf{a'x})}{\partial \mathbf{x}}=\frac{\partial \mathbf{x'a}}{\partial \mathbf{x}} = \mathbf{a}
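
A finite-difference check of this rule (a sketch of my own, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
a = rng.standard_normal(n)
x = rng.standard_normal(n)

f = lambda x: a @ x  # f(x) = a'x

# Central-difference gradient, one coordinate at a time
h = 1e-6
grad = np.array([
    (f(x + h * e) - f(x - h * e)) / (2 * h)
    for e in np.eye(n)
])

assert np.allclose(grad, a, atol=1e-6)  # matches the analytic result a
```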

Derivative of a quadratic form
Denote by \mathbf{A} an n \times n matrix of constants. Let f(\mathbf{x}) be the quadratic form


f(\mathbf{x}) = \mathbf{x'Ax}

Then


\frac{\partial (\mathbf{x'Ax})}{\partial \mathbf{x}} = (\mathbf{A} + \mathbf{A'})\mathbf{x}

If \mathbf{A} is symmetric (as is commonly the case in econometrics and statistics), we have


\frac{\partial (\mathbf{x'Ax})}{\partial \mathbf{x}} = 2\mathbf{Ax}
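
The same finite-difference check works for the quadratic form, covering both the general rule and the symmetric special case (again a sketch of my own, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n))   # a general (non-symmetric) matrix
x = rng.standard_normal(n)

f = lambda x: x @ A @ x  # f(x) = x'Ax

h = 1e-6
grad = np.array([
    (f(x + h * e) - f(x - h * e)) / (2 * h)
    for e in np.eye(n)
])

# General rule: the gradient is (A + A') x
assert np.allclose(grad, (A + A.T) @ x, atol=1e-5)

# Symmetric special case: the gradient is 2 A x
S = (A + A.T) / 2
g = lambda x: x @ S @ x
grad_s = np.array([
    (g(x + h * e) - g(x - h * e)) / (2 * h)
    for e in np.eye(n)
])
assert np.allclose(grad_s, 2 * S @ x, atol=1e-5)
```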

Basic results of matrix differentiation with proofs

Derivative of a quadratic form with proof