Appendix B — Matrix rules

B.1 Rules for multiplication, transposes, and inverses

If \mathbf x is a column vector with n elements, then

  • \mathbf x' \mathbf x = x_1^2+x_2^2+\ldots+x_n^2 = \sum_{i=1}^n x_i^2 \, is the inner (scalar) product and

  • \mathbf x \, \mathbf x' is the outer product, an (n \times n) matrix with the typical element [x_i x_j], where i is the row index and j the column index of the matrix

\mathbf x \, \mathbf x' = \left[\begin{array}{cccc} x_{1}^2 & x_{1}x_2 & \cdots & x_{1}x_n \\ x_{2}x_1 & x_{2}^2 & \cdots & x_{2}x_n \\ \vdots & \vdots & \ddots & \vdots \\ x_{n}x_1 & x_{n}x_2 & \cdots & x_{n}^2 \end{array}\right] \tag{B.1}
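A quick numerical check of the inner and outer product (a minimal sketch, assuming Python with numpy; the example vector is arbitrary):

    import numpy as np

    x = np.array([1.0, 2.0, 3.0])

    inner = x @ x                      # x'x = sum of squared elements
    outer = np.outer(x, x)             # x x' = (n x n) matrix with elements x_i x_j

    assert inner == np.sum(x**2)
    assert outer[0, 2] == x[0] * x[2]  # typical element [x_i x_j]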

Let \mathbf A, \mathbf B, \mathbf C be matrices:

\mathbf A \, \mathbf B \ne \mathbf B \, \mathbf A \\ \text { (in general) } \tag{B.2}

(\mathbf A + \mathbf B) \,\mathbf C = \mathbf {A \, C} + \mathbf {B \, C}

(\mathbf {A B C})^{\prime}=\mathbf {C}^{\prime} \mathbf {B}^{\prime} \mathbf {A}^{\prime} \\ \text { (if the dimensions are compatible for multiplication) } \tag{B.3}

(\mathbf {A}+\mathbf {B})^{\prime}=\mathbf {A}^{\prime}+\mathbf {B}^{\prime}

(\mathbf {A} \mathbf {B} \mathbf {C})^{-1}=\mathbf {C}^{-1} \mathbf {B}^{-1} \mathbf {A}^{-1} \\ \text { (if all inverses exist) } \tag{B.4}

\left((\mathbf {A B})^{\prime}\right)^{-1}=\left((\mathbf {A B})^{-1}\right)^{\prime} \tag{B.5}
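The rules (B.2) to (B.5) can be verified numerically; a minimal sketch, assuming Python with numpy (the random matrices are illustrative, and random square matrices are invertible with probability one):

    import numpy as np

    rng = np.random.default_rng(0)
    A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

    # (B.2): matrix multiplication is not commutative in general
    assert not np.allclose(A @ B, B @ A)

    # (B.3): distributivity and the reversal rule for the transpose
    assert np.allclose((A + B) @ C, A @ C + B @ C)
    assert np.allclose((A @ B @ C).T, C.T @ B.T @ A.T)

    # (B.4): reversal rule for the inverse
    inv = np.linalg.inv
    assert np.allclose(inv(A @ B @ C), inv(C) @ inv(B) @ inv(A))

    # (B.5): transposition and inversion can be interchanged
    assert np.allclose(inv((A @ B).T), inv(A @ B).T)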

B.2 Rules for the trace of a matrix

\operatorname{tr}(\mathbf {A}) \equiv \sum \nolimits_i a_{ii}

\operatorname{tr}(\mathbf {A B C})=\operatorname{tr}(\mathbf {C A B})=\operatorname{tr}(\mathbf {B C A}) \\ \text { (if the dimensions are compatible for multiplication) } \tag{B.6}

\operatorname{tr}\left(\mathbf {I}_N\right) = N \tag{B.7}

E(\operatorname{tr}(\mathbf {A}))=\operatorname{tr}(E(\mathbf {A})) \\ \quad \text { (the trace and the expectation operator are both linear and can therefore be interchanged) } \tag{B.8}

\operatorname{tr}(k \cdot \mathbf {A}+\mathbf {B})=k \cdot \operatorname{tr}(\mathbf {A})+\operatorname{tr}(\mathbf {B}) \tag{B.9}
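A numerical sketch of the trace rules, again assuming Python with numpy (the dimensions are chosen so that all three cyclic products are square):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((4, 3))
    B = rng.standard_normal((3, 5))
    C = rng.standard_normal((5, 4))

    # (B.6): the trace is invariant under cyclic permutations
    t = np.trace(A @ B @ C)
    assert np.isclose(t, np.trace(C @ A @ B))
    assert np.isclose(t, np.trace(B @ C @ A))

    # (B.7) and (B.9): trace of the identity and linearity of the trace
    assert np.trace(np.eye(4)) == 4
    D, E = (rng.standard_normal((4, 4)) for _ in range(2))
    assert np.isclose(np.trace(2.0 * D + E), 2.0 * np.trace(D) + np.trace(E))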


B.3 Rank(\mathbf A)

  • The (column) rank of a matrix \mathbf A corresponds to the maximal number of linearly independent columns of \mathbf A

  • If a square matrix \mathbf A (n \times n) has full rank n, then it is regular (as opposed to singular), \det(\mathbf A) \neq 0, and \mathbf A has an inverse with \mathbf A \mathbf A^{-1} = \mathbf A^{-1} \mathbf A = \mathbf I
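A minimal sketch of the rank statements, assuming Python with numpy (the matrices are illustrative):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])            # second column = 2 * first column
    assert np.linalg.matrix_rank(A) == 1  # rank deficient, hence singular
    assert np.isclose(np.linalg.det(A), 0.0)

    B = np.array([[1.0, 2.0],
                  [0.0, 1.0]])            # full rank: det(B) = 1 != 0
    assert np.allclose(B @ np.linalg.inv(B), np.eye(2))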


B.4 Eigenvalues and eigenvectors

A nontrivial solution of the homogeneous system of equations (\mathbf{A}-\lambda_i \mathbf{I})\, \mathbf{x}_i = \mathbf{0}, i.e. \mathbf{A} \mathbf{x}_i=\lambda_i \mathbf{x}_i, exists iff \operatorname{det}\left(\mathbf{A}-\lambda_i \mathbf{I}\right)=0.

  • \lambda_i are the eigenvalues and \mathbf{x}_i the corresponding eigenvectors.

  • An n \times n matrix has n eigenvalues (counted with multiplicity) and corresponding eigenvectors \mathbf{x}_i.

  • In general we have \mathbf{A Q}=\mathbf{Q} \mathbf{D}_\lambda, where \mathbf{Q} is the matrix whose columns are the eigenvectors \mathbf x_i and \mathbf{D}_\lambda is a diagonal matrix with d_{i i}=\lambda_i. (This is simply \mathbf{A} \mathbf{x}_i=\lambda_i \mathbf{x}_i stacked over all eigenpairs (\lambda_i, \mathbf{x}_i), as the sketch below illustrates.)
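A minimal numerical sketch of the stacked eigenvalue equation, assuming Python with numpy (note that a general real matrix may have complex eigenpairs, which numpy handles transparently):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((4, 4))

    lam, Q = np.linalg.eig(A)     # eigenvalues lam_i, eigenvectors as columns of Q
    D = np.diag(lam)

    assert np.allclose(A @ Q, Q @ D)                   # all A x_i = lam_i x_i at once
    assert np.allclose(A @ Q[:, 0], lam[0] * Q[:, 0])  # a single eigenpair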


B.5 Symmetric matrices

A matrix \mathbf A is symmetric if \mathbf A = \mathbf A'

  • If \mathbf{A} is symm., then \mathbf{B} \mathbf{A} \mathbf{B}' and \mathbf{B}' \mathbf{A} \mathbf{B} are symmetric (\mathbf{B} itself need not be symmetric)

    • Hence, \mathbf{B} \mathbf{B}' and \mathbf{B}' \mathbf{B} are always symmetric (set \mathbf{A} = \mathbf{I})
  • If \mathbf{A} is symm. and invertible, then \mathbf{A}^{-1} is symmetric

  • If \mathbf{A} is symm., then all eigenvalues \lambda_i are real.

  • If \mathbf{A} is symm., then \mathbf Q from above can be chosen orthogonal, with \mathbf{Q}^{\prime} \mathbf{Q}=\mathbf{Q} \mathbf{Q}^{\prime}=\mathbf{I}, such that
    \mathbf{A}=\mathbf{Q} \mathbf{D}_\lambda \mathbf{Q}^{\prime} \; \text{and} \; \mathbf{Q}^{\prime} \mathbf{A} \mathbf{Q}=\mathbf{D}_\lambda.
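A minimal numerical sketch of this spectral decomposition, assuming Python with numpy (np.linalg.eigh is numpy's routine for symmetric matrices and returns real eigenvalues and an orthogonal \mathbf Q):

    import numpy as np

    rng = np.random.default_rng(3)
    B = rng.standard_normal((4, 4))
    A = B + B.T                    # symmetric by construction

    lam, Q = np.linalg.eigh(A)
    assert np.allclose(Q.T @ Q, np.eye(4))         # Q'Q = I (orthogonal)
    assert np.allclose(A, Q @ np.diag(lam) @ Q.T)  # A = Q D_lambda Q'
    assert np.allclose(Q.T @ A @ Q, np.diag(lam))  # Q'AQ = D_lambda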


B.6 Definite matrices

A symmetric matrix \mathbf {A} is positive definite (p.d.) iff \mathbf {x}^{\prime} \mathbf {A} \mathbf {x}=\sum_j \sum_i a_{ij} x_i x_j > 0, \ \forall \mathbf {x} \neq \mathbf{0}.

  • If \mathbf {A} is p.d. and \mathbf {B} is s \times k with rank(\mathbf {B})=k, then \mathbf {B}^{\prime} \mathbf {A} \mathbf {B} is p.d.

  • If \mathbf {A} is p.d., then its main-diagonal elements and its determinant are positive

  • For any matrix \mathbf {A}, the products \mathbf {A} \mathbf {A}' and \mathbf {A}' \mathbf {A} are p.s.d.

  • If \mathbf {A} is p.d., then -\mathbf {A} is n.d.

  • If \mathbf {A} is p.d., then \mathbf {A}^{-1} exists and is p.d.

  • \mathbf {A} is p.d. iff all eigenvalues \lambda_i are positive

  • If \mathbf {A} is p.d., then there exists a unique, non-singular, symmetric, p.d. factor (square-root matrix) \mathbf {A}^{\frac {1}{2}} (which is \mathbf Q \sqrt{\mathbf{D}_\lambda} \mathbf Q' from above) with \mathbf {A} = \mathbf A^{\frac {1}{2}} \mathbf {A^{\frac {1}{2}}} and {\mathbf {A}^{-\frac {1}{2}}} \mathbf {A} \, {\mathbf {A}^{-\frac {1}{2}}} = {\mathbf {A}^{\frac {1}{2}}} \mathbf {A}^{-1} {\mathbf {A}^{\frac {1}{2}}} = \mathbf {I} (verified numerically below)

  • Every regular (non-singular) covariance matrix \mathbf {A} is p.d.
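A minimal numerical sketch of the square-root factor, assuming Python with numpy (the shift by 4 \mathbf I simply forces the random matrix to be p.d.):

    import numpy as np

    rng = np.random.default_rng(4)
    B = rng.standard_normal((4, 4))
    A = B @ B.T + 4.0 * np.eye(4)  # p.d. by construction

    lam, Q = np.linalg.eigh(A)
    assert np.all(lam > 0)         # p.d. iff all eigenvalues positive

    A_half = Q @ np.diag(np.sqrt(lam)) @ Q.T  # A^{1/2} = Q sqrt(D_lambda) Q'
    assert np.allclose(A_half, A_half.T)      # symmetric
    assert np.allclose(A_half @ A_half, A)    # A = A^{1/2} A^{1/2}

    A_mhalf = np.linalg.inv(A_half)           # A^{-1/2}
    assert np.allclose(A_mhalf @ A @ A_mhalf, np.eye(4))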


B.7 Idempotent matrices

\mathbf {M} is symmetric and idempotent iff \, \mathbf {M}=\mathbf {M}^{\prime}=\mathbf {M} \mathbf {M}.

  • If \mathbf {M} is idempotent, then \operatorname{tr}(\mathbf {M})=rank(\mathbf {M}) and all eigenvalues are either 1 or 0

    • Hence, an idempotent matrix is p.s.d.
  • If \mathbf {M} is k \times k and idempotent with rank(\mathbf {M})=r<k, then

    \mathbf {Q}^{\prime} \mathbf {M} \mathbf {Q}=\mathbf {D}_\lambda=\left[\begin{array}{cc}\mathbf {I}_r & \mathbf{0} \\ \mathbf {0} & \mathbf{0}\end{array}\right].
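A minimal numerical sketch for idempotent matrices, assuming Python with numpy and using the residual maker \mathbf M = \mathbf I - \mathbf X(\mathbf X'\mathbf X)^{-1}\mathbf X' as a standard example of a symmetric idempotent matrix:

    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.standard_normal((10, 3))
    M = np.eye(10) - X @ np.linalg.inv(X.T @ X) @ X.T

    assert np.allclose(M, M.T)     # symmetric
    assert np.allclose(M, M @ M)   # idempotent

    # tr(M) = rank(M) = 10 - 3 = 7, and all eigenvalues are 0 or 1
    assert np.isclose(np.trace(M), np.linalg.matrix_rank(M))
    lam = np.linalg.eigvalsh(M)    # ascending: three zeros, then seven ones
    assert np.allclose(lam, np.concatenate([np.zeros(3), np.ones(7)]))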

B.8 Kronecker product

  • \mathbf A \otimes \mathbf B means that every element a_{ij} of \mathbf A is multiplied by the matrix \mathbf B, yielding a block matrix with (i,j)-block a_{ij} \mathbf B

(\mathbf{A} \otimes \mathbf{B})(\mathbf{C} \otimes \mathbf{D}) = (\mathbf{A C}) \otimes(\mathbf{B D}) \tag{B.10}

(\mathbf A \otimes \mathbf B)' = \mathbf A' \otimes \mathbf B' \tag{B.11}

(\mathbf A \otimes \mathbf B)^{-1} = \mathbf A^{-1} \otimes \mathbf B^{-1} \\ \text { (if both inverses exist) } \tag{B.12}
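The Kronecker rules (B.10) to (B.12) can be checked with numpy's np.kron; a minimal sketch, assuming Python with numpy (square matrices of matching dimensions, so all products and inverses exist):

    import numpy as np

    rng = np.random.default_rng(6)
    A, B, C, D = (rng.standard_normal((2, 2)) for _ in range(4))

    # (B.10): mixed-product rule
    assert np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D))

    # (B.11) and (B.12): transpose and inverse distribute over the factors
    assert np.allclose(np.kron(A, B).T, np.kron(A.T, B.T))
    assert np.allclose(np.linalg.inv(np.kron(A, B)),
                       np.kron(np.linalg.inv(A), np.linalg.inv(B)))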