Linear Algebra Review

Notation:

• Vectors are column vectors

• An $m \times n$ matrix $A$ has $m$ rows, $n$ columns and entries $a_{ij}$, $i = 1, \ldots, m$, $j = 1, \ldots, n$.

• Matrix and vector addition are defined componentwise: $(A + B)_{ij} = a_{ij} + b_{ij}$.

• If $A$ is $m \times n$ and $B$ is $n \times p$ then $AB$ is the $m \times p$ matrix with entries $(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}$.

• The matrix $I$, or sometimes $I_n$, which is an $n \times n$ matrix with $a_{ii} = 1$ for all $i$ and $a_{ij} = 0$ for any pair $i \neq j$, is called the identity matrix.

• The span of a set of vectors $\{v_1, \ldots, v_k\}$ is the set of all vectors of the form $c_1 v_1 + \cdots + c_k v_k$ for scalars $c_1, \ldots, c_k$. It is a vector space. The column space of a matrix $A$ is the span of the set of columns of $A$. The row space is the span of the set of rows.

• A set of vectors $\{v_1, \ldots, v_k\}$ is linearly independent if $c_1 v_1 + \cdots + c_k v_k = 0$ implies $c_i = 0$ for all $i$. The dimension of a vector space is the cardinality of the largest possible set of linearly independent vectors.
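The definitions above can be checked numerically. This is a quick sketch using NumPy with small, arbitrarily chosen example matrices; it verifies that the entrywise product formula agrees with the built-in matrix product and that the identity matrix behaves as stated.

```python
import numpy as np

# Arbitrary small matrices chosen for illustration.
A = np.array([[1.0, 2.0], [3.0, 4.0]])   # 2 x 2
B = np.array([[5.0, 6.0], [7.0, 8.0]])   # 2 x 2

# Componentwise addition: (A + B)_ij = a_ij + b_ij.
S = A + B

# Matrix product from the definition: (AB)_ij = sum_k a_ik b_kj.
AB = np.array([[sum(A[i, k] * B[k, j] for k in range(2))
                for j in range(2)] for i in range(2)])

# The identity matrix leaves any matrix unchanged: AI = IA = A.
I = np.eye(2)

print(np.allclose(AB, A @ B))   # definition matches the built-in product
print(np.allclose(A @ I, A))
```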

Defn: The transpose, $A^T$, of an $m \times n$ matrix $A$ is the matrix whose entries are given by

$(A^T)_{ij} = a_{ji}$

so that $A^T$ is $n \times m$. We have

$(A + B)^T = A^T + B^T$

and

$(AB)^T = B^T A^T.$

Defn: rank of matrix $A$, rank$(A)$: # of linearly independent columns of $A$. We have

rank$(A)$ = dim(column space of $A$) = dim(row space of $A$) = rank$(A^T)$.

If $A$ is $m \times n$ then rank$(A) \le \min(m, n)$.
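As a numerical illustration (a sketch with a hypothetical matrix whose third column is the sum of the first two), NumPy's `matrix_rank` confirms that the rank counts independent columns and that rank$(A)$ = rank$(A^T)$:

```python
import numpy as np

# Third column equals the sum of the first two, so only
# 2 columns are linearly independent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])

r = np.linalg.matrix_rank(A)
rT = np.linalg.matrix_rank(A.T)
print(r, rT)   # rank(A) = rank(A^T)
```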

Matrix inverses

For now: all matrices square, $n \times n$.

If there is a matrix $B$ such that $AB = I$ then we call $B$ the inverse of $A$. If $B$ exists it is unique, $BA = I$, and we write $B = A^{-1}$. The matrix $A$ has an inverse if and only if rank$(A) = n$.

Inverses have the following properties:

$(AB)^{-1} = B^{-1} A^{-1}$

(if one side exists then so does the other) and

$(A^T)^{-1} = (A^{-1})^T.$
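Both properties can be verified numerically; this sketch uses random matrices (which are invertible with probability one) as hypothetical examples:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

inv_AB = np.linalg.inv(A @ B)
# (AB)^{-1} = B^{-1} A^{-1}: note the reversed order.
check1 = np.allclose(inv_AB, np.linalg.inv(B) @ np.linalg.inv(A))
# The inverse of the transpose is the transpose of the inverse.
check2 = np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)
print(check1, check2)
```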

Determinants

Again $A$ is $n \times n$. The determinant is a function on the set of $n \times n$ matrices such that:

1. det$(I) = 1$.

2. If $\tilde{A}$ is the matrix $A$ with two columns interchanged then

det$(\tilde{A}) = -$det$(A)$.

(So: two equal columns implies det$(A) = 0$.)

3. det$(A)$ is a linear function of each column of $A$. If $a_i$ denotes the $i$th column of the matrix and $a_i = b + c$ then

det$[a_1, \ldots, b + c, \ldots, a_n]$ = det$[a_1, \ldots, b, \ldots, a_n]$ + det$[a_1, \ldots, c, \ldots, a_n]$.

Here are some properties of the determinant:

1. det$(A)$ = det$(A^T)$.

2. det$(AB)$ = det$(A)$ det$(B)$.

3. det$(A^{-1}) = 1/$det$(A)$.

4. $A$ is invertible if and only if det$(A) \neq 0$ if and only if rank$(A) = n$.

5. Determinants can be computed (slowly) by expansion by minors.
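The listed properties are easy to spot-check with NumPy; this sketch uses arbitrary random matrices and also verifies the sign flip under a column interchange:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# det(A^T) = det(A)
p1 = np.isclose(np.linalg.det(A.T), np.linalg.det(A))
# det(AB) = det(A) det(B)
p2 = np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
# det(A^{-1}) = 1 / det(A)
p3 = np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A))
# Interchanging two columns flips the sign of the determinant.
A_swapped = A[:, [1, 0, 2, 3]]
p4 = np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A))
print(p1, p2, p3, p4)
```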

Special Kinds of Matrices

1. $A$ is symmetric if $A^T = A$.

2. $A$ is orthogonal if $A^T A = I$ (or $A A^T = I$).

3. $A$ is idempotent if $A^2 = A$.

4. $A$ is diagonal if $i \neq j$ implies $a_{ij} = 0$.
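Two quick numerical examples of these types (a sketch; the rotation angle and the averaging matrix are arbitrary choices): a 2-d rotation matrix is orthogonal, and the matrix that averages the two coordinates is idempotent.

```python
import numpy as np

theta = 0.7  # an arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
# Rotations are orthogonal: Q^T Q = I.
orth = np.allclose(Q.T @ Q, np.eye(2))

# Averaging matrix: applying it twice is the same as once, so P^2 = P.
P = np.full((2, 2), 0.5)
idem = np.allclose(P @ P, P)
print(orth, idem)
```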

Inner Products, orthogonal, orthonormal vectors

Defn: The inner product or dot product of $x$ and $y$ is

$\langle x, y \rangle = x^T y = \sum_i x_i y_i.$

Defn: $x$ and $y$ are orthogonal if $x^T y = 0$.

Defn: The norm (or length) of $x$ is

$\|x\| = (x^T x)^{1/2} = \Bigl(\sum_i x_i^2\Bigr)^{1/2}.$

$A$ is orthogonal if each column of $A$ has length 1 and is orthogonal to each other column of $A$.

Suppose $A$ is an $n \times n$ matrix. The function

$q(x) = x^T A x = \sum_{i,j} a_{ij} x_i x_j$

is called a quadratic form. Now $x^T A x$ is a scalar, so

$x^T A x = (x^T A x)^T = x^T A^T x,$

so that $q(x)$ depends only on the total $a_{ij} + a_{ji}$. In fact

$x^T A x = x^T \left(\frac{A + A^T}{2}\right) x.$

Thus we will assume that $A$ is symmetric.
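This symmetrization step can be checked directly; the sketch below uses an arbitrary (non-symmetric) random matrix and vector, and confirms that the quadratic form only sees the symmetric part of $A$:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))   # not symmetric in general
x = rng.standard_normal(3)

q = x @ A @ x                     # the quadratic form x^T A x
A_sym = (A + A.T) / 2             # symmetric part of A
same = np.isclose(q, x @ A_sym @ x)
print(same)
```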

Eigenvalues and eigenvectors

If $A$ is $n \times n$ and there are $\lambda$ and $v \neq 0$ such that

$Av = \lambda v$

then $\lambda$ is an eigenvalue (or characteristic or latent value) of $A$; $v$ is a corresponding eigenvector. Since $(A - \lambda I)v = 0$ with $v \neq 0$, the matrix $A - \lambda I$ is singular.

Therefore det$(A - \lambda I) = 0$.

Conversely: if $A - \lambda I$ is singular then there is $v \neq 0$ such that $(A - \lambda I)v = 0$.

Fact: det$(A - \lambda I)$ is a polynomial in $\lambda$ of degree $n$.

Each root is an eigenvalue.

In general the roots could be multiple roots or complex valued.
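A numerical sketch with a small example matrix: `np.linalg.eigvals` returns the roots of det$(A - \lambda I)$, and substituting each root back makes $A - \lambda I$ singular (determinant zero up to rounding).

```python
import numpy as np

# An arbitrary symmetric example; its eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lams = np.linalg.eigvals(A)       # roots of det(A - lambda I)
# Each eigenvalue makes A - lambda I singular:
dets = [np.linalg.det(A - lam * np.eye(2)) for lam in lams]
print(sorted(lams.real), dets)
```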

Diagonalization

Matrix $A$ is diagonalized by a non-singular matrix $P$ if $P^{-1} A P = D$ is diagonal.

If so then $AP = PD$, so each column of $P$ is an eigenvector of $A$, with the $i$th column having eigenvalue $d_{ii}$, the $i$th diagonal entry of $D$.

Thus to be diagonalizable $A$ must have $n$ linearly independent eigenvectors.
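A sketch of diagonalization in NumPy, with an arbitrary example matrix that has distinct eigenvalues (so it is diagonalizable): `eig` returns the eigenvalues and a matrix $P$ whose columns are eigenvectors, and $P^{-1} A P$ comes out diagonal.

```python
import numpy as np

# Arbitrary example with distinct eigenvalues (trace 7, det 10).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lams, P = np.linalg.eig(A)        # columns of P are eigenvectors
D = np.linalg.inv(P) @ A @ P      # P^{-1} A P should be diagonal
off_diag = D - np.diag(np.diag(D))
print(np.allclose(off_diag, 0.0), np.allclose(np.diag(D), lams))
```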

Symmetric Matrices

If $A$ is symmetric then

1. Every eigenvalue of $A$ is real (not complex).

2. $A$ is diagonalizable; the columns of $P$ may be taken to be unit length and mutually orthogonal: $A$ is diagonalizable by an orthogonal matrix $P$; in symbols $P^T A P = D$.

3. The diagonal entries in $D$ are the eigenvalues of $A$.

4. If $\lambda_1 \neq \lambda_2$ are two eigenvalues of $A$ and $v_1$ and $v_2$ are corresponding eigenvectors then

$v_1^T A v_2 = \lambda_2 v_1^T v_2$

and

$v_2^T A v_1 = \lambda_1 v_2^T v_1.$

Since $v_1^T A v_2 = (v_1^T A v_2)^T = v_2^T A^T v_1 = v_2^T A v_1$ and $\lambda_1 \neq \lambda_2$ we see $v_1^T v_2 = 0$. Eigenvectors corresponding to distinct eigenvalues are orthogonal.
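These facts can be checked with `np.linalg.eigh`, NumPy's routine for symmetric matrices; the sketch below uses an arbitrary symmetric example and verifies that the eigenvalues are real, $P$ is orthogonal, and $P^T A P$ is diagonal.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])    # an arbitrary symmetric example

lams, P = np.linalg.eigh(A)       # eigh is specialized to symmetric input
real_ok = np.all(np.isreal(lams))            # eigenvalues are real
orth_ok = np.allclose(P.T @ P, np.eye(3))    # P is orthogonal
diag_ok = np.allclose(P.T @ A @ P, np.diag(lams))  # P^T A P = D
print(real_ok, orth_ok, diag_ok)
```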

Positive Definite Matrices

Defn: A symmetric matrix $A$ is non-negative definite if $x^T A x \ge 0$ for all $x$. It is positive definite if in addition $x^T A x = 0$ implies $x = 0$.

$A$ is non-negative definite iff all its eigenvalues are non-negative.

$A$ is positive definite iff all its eigenvalues are positive.
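A quick sketch of the eigenvalue criterion: any matrix of the form $B^T B$ is non-negative definite, and when $B$ is invertible (as in this arbitrary example) it is positive definite, so all eigenvalues come out positive.

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [0.0, 1.0]])        # an arbitrary invertible matrix
A = B.T @ B                       # B^T B is non-negative definite

lams = np.linalg.eigvalsh(A)      # eigenvalues of the symmetric matrix A
nnd = np.all(lams >= -1e-12)      # all eigenvalues non-negative
pd = np.all(lams > 0)             # B invertible, so A is positive definite
print(nnd, pd)
```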

A non-negative definite matrix $A$ has a symmetric non-negative definite square root. If

$A = P D P^T$

for $P$ orthogonal and $D$ diagonal then

$A^{1/2} = P D^{1/2} P^T$

is symmetric, non-negative definite and

$A^{1/2} A^{1/2} = P D^{1/2} P^T P D^{1/2} P^T = P D P^T = A.$

Here $D^{1/2}$ is diagonal with

$(D^{1/2})_{ii} = (d_{ii})^{1/2}.$

Many other square roots are possible. If $B^T B = A$ and $Q$ is orthogonal and $C = QB$ then $C^T C = B^T Q^T Q B = A$.
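The construction above translates directly into code; this sketch builds the symmetric square root of an arbitrary positive definite example via its eigendecomposition and checks both claimed properties.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # arbitrary symmetric positive definite

lams, P = np.linalg.eigh(A)       # A = P D P^T with P orthogonal
root = P @ np.diag(np.sqrt(lams)) @ P.T   # A^{1/2} = P D^{1/2} P^T

sym_ok = np.allclose(root, root.T)    # square root is symmetric
square_ok = np.allclose(root @ root, A)   # and squares back to A
print(sym_ok, square_ok)
```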

Orthogonal Projections

Suppose $S$ is a vector subspace of $\mathbb{R}^n$ and $x_1, \ldots, x_p$ is a basis for $S$. Given any $y \in \mathbb{R}^n$ there is a unique $\hat{y} \in S$ which is closest to $y$; $\hat{y}$ minimizes

$\|y - z\|^2$

over $z \in S$. Any $z$ in $S$ is of the form

$z = Xb = b_1 x_1 + \cdots + b_p x_p,$

where $X$ is the $n \times p$ matrix with columns $x_1, \ldots, x_p$ and $b$ is the column vector with $i$th entry $b_i$. Define

$\hat{b} = (X^T X)^{-1} X^T y.$

($X$ has rank $p$ so $X^T X$ is invertible.) Then

$y - Xb = (y - X\hat{b}) + X(\hat{b} - b).$

Note that

$X^T (y - X\hat{b}) = X^T y - X^T X (X^T X)^{-1} X^T y = 0$

and that

$\bigl(X(\hat{b} - b)\bigr)^T (y - X\hat{b}) = (\hat{b} - b)^T X^T (y - X\hat{b}) = 0,$

so that the cross term in

$\|y - Xb\|^2 = \|y - X\hat{b}\|^2 + 2\bigl(X(\hat{b} - b)\bigr)^T (y - X\hat{b}) + \|X(\hat{b} - b)\|^2$

vanishes. This shows that

$\|y - Xb\|^2 = \|y - X\hat{b}\|^2 + \|X(\hat{b} - b)\|^2.$

Choose $b$ to minimize: minimize the second term, $\|X(\hat{b} - b)\|^2$.

Achieved by making $Xb = X\hat{b}$.

Since rank$(X) = p$ we can take $b = \hat{b}$.

Summary: the closest point in $S$ is

$\hat{y} = X\hat{b} = X(X^T X)^{-1} X^T y;$

call $\hat{y}$ the orthogonal projection of $y$ onto $S$.

Notice that the matrix $H = X(X^T X)^{-1} X^T$ is idempotent:

$H^2 = X(X^T X)^{-1} X^T X (X^T X)^{-1} X^T = X(X^T X)^{-1} X^T = H.$

We call $\hat{y}$ the orthogonal projection of $y$ on $S$ because $\hat{y}$ is perpendicular to the residual $y - \hat{y}$.
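The whole construction is a few lines of NumPy; this sketch uses a hypothetical rank-2 basis matrix $X$ and vector $y$ and checks that the projection matrix is idempotent and that the fitted value is orthogonal to the residual.

```python
import numpy as np

# Basis for a 2-dimensional subspace of R^4, as the columns of X (rank 2).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 2.0, 0.0, 4.0])   # an arbitrary vector to project

H = X @ np.linalg.inv(X.T @ X) @ X.T   # projection matrix X (X^T X)^{-1} X^T
y_hat = H @ y                          # closest point in the column space
resid = y - y_hat

idem = np.allclose(H @ H, H)           # H is idempotent
perp = np.isclose(y_hat @ resid, 0.0)  # projection is orthogonal to residual
print(idem, perp)
```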

Partitioned Matrices

Suppose $A_{11}$ is a $p \times q$ matrix, $A_{12}$ is $p \times r$, $A_{21}$ is $s \times q$ and $A_{22}$ is $s \times r$. Make a $(p+s) \times (q+r)$ matrix $A$ by putting the $A_{ij}$ in a 2 by 2 matrix:

$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}.$

For instance if

$A_{11} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad A_{12} = \begin{bmatrix} 5 \\ 6 \end{bmatrix}, \quad A_{21} = \begin{bmatrix} 7 & 8 \end{bmatrix}$

and

$A_{22} = \begin{bmatrix} 9 \end{bmatrix}$

then

$A = \left[\begin{array}{cc|c} 1 & 2 & 5 \\ 3 & 4 & 6 \\ \hline 7 & 8 & 9 \end{array}\right].$

Lines indicate the partitioning.

We can work with partitioned matrices just like ordinary matrices always making sure that in products we never change the order of multiplication of things.

$A + B = \begin{bmatrix} A_{11} + B_{11} & A_{12} + B_{12} \\ A_{21} + B_{21} & A_{22} + B_{22} \end{bmatrix}$

and

$AB = \begin{bmatrix} A_{11}B_{11} + A_{12}B_{21} & A_{11}B_{12} + A_{12}B_{22} \\ A_{21}B_{11} + A_{22}B_{21} & A_{21}B_{12} + A_{22}B_{22} \end{bmatrix}.$

Note the partitioning of $A$ and $B$ must match.

Addition: the dimensions of $A_{ij}$ and $B_{ij}$ must be the same.

Multiplication formula: $A_{11}$ must have as many columns as $B_{11}$ has rows, etc.

In general: $A_{ik} B_{kj}$ needs to make sense for each $i$, $j$, $k$.

This works with more than a 2 by 2 partitioning.

Defn: block diagonal matrix: a partitioned matrix $A$ for which $A_{ij} = 0$ if $i \neq j$. If

$A = \begin{bmatrix} A_{11} & 0 \\ 0 & A_{22} \end{bmatrix}$

then $A$ is invertible iff each $A_{ii}$ is invertible, and then

$A^{-1} = \begin{bmatrix} A_{11}^{-1} & 0 \\ 0 & A_{22}^{-1} \end{bmatrix}.$

Moreover det$(A)$ = det$(A_{11})$ det$(A_{22})$. Similar formulas work for larger partitioned matrices.
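The block diagonal facts can be checked numerically; this sketch assembles a block diagonal matrix from two arbitrary invertible blocks with `np.block` and verifies the determinant and inverse formulas.

```python
import numpy as np

A11 = np.array([[2.0, 1.0], [1.0, 2.0]])   # arbitrary invertible 2x2 block
A22 = np.array([[3.0]])                    # arbitrary invertible 1x1 block
A = np.block([[A11, np.zeros((2, 1))],
              [np.zeros((1, 2)), A22]])

# det(A) = det(A11) det(A22)
det_ok = np.isclose(np.linalg.det(A),
                    np.linalg.det(A11) * np.linalg.det(A22))
# A^{-1} is block diagonal with blocks A11^{-1}, A22^{-1}.
A_inv = np.block([[np.linalg.inv(A11), np.zeros((2, 1))],
                  [np.zeros((1, 2)), np.linalg.inv(A22)]])
inv_ok = np.allclose(A_inv, np.linalg.inv(A))
print(det_ok, inv_ok)
```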

Partitioned inverses. Suppose $A_{11}$, $A_{22}$ are symmetric positive definite. Look for an inverse of

$\begin{bmatrix} A_{11} & A_{12} \\ A_{12}^T & A_{22} \end{bmatrix}$

of the form

$\begin{bmatrix} B_{11} & B_{12} \\ B_{12}^T & B_{22} \end{bmatrix}.$

Multiply to get the equations

$A_{11} B_{11} + A_{12} B_{12}^T = I$

$A_{11} B_{12} + A_{12} B_{22} = 0$

$A_{12}^T B_{11} + A_{22} B_{12}^T = 0$

$A_{12}^T B_{12} + A_{22} B_{22} = I.$

Solve to get

$B_{22} = (A_{22} - A_{12}^T A_{11}^{-1} A_{12})^{-1}$

$B_{12} = -A_{11}^{-1} A_{12} B_{22}$

$B_{11} = A_{11}^{-1} + A_{11}^{-1} A_{12} B_{22} A_{12}^T A_{11}^{-1}.$
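The Schur-complement solution can be verified numerically. This sketch builds a partitioned symmetric positive definite matrix from arbitrary example blocks, computes $B_{22}$, $B_{12}$, $B_{11}$ from the formulas above, and checks the assembled matrix against the directly computed inverse.

```python
import numpy as np

A11 = np.array([[2.0, 0.5], [0.5, 1.0]])   # symmetric positive definite
A22 = np.array([[3.0]])                    # symmetric positive definite
A12 = np.array([[1.0], [0.5]])             # arbitrary off-diagonal block

A = np.block([[A11, A12],
              [A12.T, A22]])

A11_inv = np.linalg.inv(A11)
# B22 is the inverse of the Schur complement of A11:
B22 = np.linalg.inv(A22 - A12.T @ A11_inv @ A12)
B12 = -A11_inv @ A12 @ B22
B11 = A11_inv + A11_inv @ A12 @ B22 @ A12.T @ A11_inv

B = np.block([[B11, B12],
              [B12.T, B22]])
ok = np.allclose(B, np.linalg.inv(A))
print(ok)
```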

Richard Lockhart
2002-09-24