Notation: vectors $x$ are column vectors, that is, $n \times 1$ matrices. If $A$ is an $m \times n$ matrix and $B$ is $n \times p$ then $AB$ is the $m \times p$ matrix with entries $(AB)_{ij} = \sum_k A_{ik} B_{kj}$. The matrix $I$, which is an $n \times n$ matrix with $I_{ii} = 1$ for all $i$ and $I_{ij} = 0$ for any pair $i \neq j$, is called the $n \times n$ identity matrix. $R^n$ is the set of all vectors $x = (x_1, \ldots, x_n)^T$. It is a vector space.
The column space of a matrix $A$, $\{Ax : x \in R^n\}$, is a vector subspace. A set of vectors $x_1, \ldots, x_k$ is linearly independent if $\sum_i a_i x_i = 0$ implies $a_i = 0$ for all $i$.
Defn: The transpose, $A^T$, of an $m \times n$ matrix $A$ is the $n \times m$ matrix whose entries are given by $(A^T)_{ij} = A_{ji}$. We have $(AB)^T = B^T A^T$.
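For example, with the small matrices (chosen just for illustration)
\[ A = \left[\begin{array}{cc} 1 & 2 \\ 3 & 4 \end{array}\right], \qquad B = \left[\begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array}\right], \]
we get
\[ AB = \left[\begin{array}{cc} 2 & 1 \\ 4 & 3 \end{array}\right], \qquad (AB)^T = \left[\begin{array}{cc} 2 & 4 \\ 1 & 3 \end{array}\right] = \left[\begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array}\right]\left[\begin{array}{cc} 1 & 3 \\ 2 & 4 \end{array}\right] = B^T A^T . \]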
Defn: The rank of a matrix $A$, rank$(A)$, is the # of linearly independent columns of $A$. We have
\[ \mbox{rank}(A) = \mbox{rank}(A^T) . \]
If $A$ is $m \times n$ then rank$(A) \le \min(m,n)$.
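For instance (an illustrative example), the $2 \times 3$ matrix
\[ A = \left[\begin{array}{ccc} 1 & 2 & 3 \\ 2 & 4 & 6 \end{array}\right] \]
has every column a multiple of $(1,2)^T$, so rank$(A) = \mbox{rank}(A^T) = 1 \le \min(2,3) = 2$.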
For now: all matrices square, $n \times n$.
If there is a matrix $B$ such that $AB = BA = I$ then we call $B$ the inverse of $A$. If such a $B$ exists it is unique and we write $B = A^{-1}$. The matrix $A$ has an inverse if and only if rank$(A) = n$.
Inverses have the following properties: $(AB)^{-1} = B^{-1}A^{-1}$ and $(A^T)^{-1} = (A^{-1})^T$.
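As a small check (matrix chosen for illustration),
\[ A = \left[\begin{array}{cc} 2 & 1 \\ 1 & 1 \end{array}\right], \qquad A^{-1} = \left[\begin{array}{cc} 1 & -1 \\ -1 & 2 \end{array}\right], \]
since
\[ \left[\begin{array}{cc} 2 & 1 \\ 1 & 1 \end{array}\right]\left[\begin{array}{cc} 1 & -1 \\ -1 & 2 \end{array}\right] = \left[\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right] ; \]
here rank$(A) = 2$, as required.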
Again $A$ is $n \times n$. The determinant is a function on the set of $n \times n$ matrices such that:
$\det(I) = 1$.
$\det(A)$ is a linear function of each column of $A$, holding the other columns fixed.
If two columns of $A$ are equal then $\det(A) = 0$; interchanging two columns changes the sign of the determinant.
Here are some properties of the determinant:
$\det(A^T) = \det(A)$.
$\det(AB) = \det(A)\det(B)$.
$\det(A^{-1}) = 1/\det(A)$.
$\det(A) = 0$ if and only if rank$(A) < n$.
$Ax = 0$ for some $x \neq 0$ implies $\det(A) = 0$.
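To illustrate with small matrices (chosen for the example): with $A = \left[\begin{array}{cc} 2 & 1 \\ 1 & 1 \end{array}\right]$ and $A^{-1}$ as above, $\det(A) = 2 \cdot 1 - 1 \cdot 1 = 1$ and $\det(A^{-1}) = 1 \cdot 2 - (-1)(-1) = 1 = 1/\det(A)$. Taking $B = \left[\begin{array}{cc} 1 & 2 \\ 0 & 3 \end{array}\right]$ with $\det(B) = 3$,
\[ AB = \left[\begin{array}{cc} 2 & 7 \\ 1 & 5 \end{array}\right], \qquad \det(AB) = 10 - 7 = 3 = \det(A)\det(B) . \]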
Defn: Two vectors $x$ and $y$ are orthogonal if $\sum_i x_i y_i = 0$.
Defn: The inner product or dot product of $x$ and $y$ is
\[ \langle x, y \rangle = x^T y = \sum_i x_i y_i . \]
Defn: $x$ and $y$ are orthogonal if $\langle x, y \rangle = x^T y = 0$.
Defn: The norm (or length) of $x$ is
\[ \|x\| = (x^T x)^{1/2} = \left( \sum_i x_i^2 \right)^{1/2} . \]
A matrix $P$ is orthogonal if each column of $P$ has length 1 and is orthogonal to each other column of $P$.
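For example, the rotation matrix
\[ P = \frac{1}{\sqrt 2}\left[\begin{array}{cc} 1 & -1 \\ 1 & 1 \end{array}\right] \]
is orthogonal: each column has length $\sqrt{1/2 + 1/2} = 1$ and the two columns have inner product $\frac{1}{2}\left( 1 \cdot (-1) + 1 \cdot 1 \right) = 0$; equivalently $P^T P = I$.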
Suppose $A$ is an $n \times n$ matrix. The function
\[ x \mapsto x^T A x = \sum_{i,j} A_{ij} x_i x_j \]
depends only on the total $A_{ij} + A_{ji}$. In fact
\[ x^T A x = x^T \left( \frac{A + A^T}{2} \right) x , \]
so when studying such quadratic forms we may assume $A$ is symmetric.
If $A$ is $n \times n$ and there are a number $\lambda$ and a vector $x \neq 0$ such that $Ax = \lambda x$, then the matrix $A - \lambda I$ is singular. Therefore
$\det(A - \lambda I) = 0$.
Conversely: if $A - \lambda I$ is singular then there is an $x \neq 0$ such that $(A - \lambda I)x = 0$, that is, $Ax = \lambda x$. We call $\lambda$ an eigenvalue and $x$ an eigenvector of $A$.
Fact: $\det(A - \lambda I)$ is a polynomial in $\lambda$ of degree $n$.
Each root is an eigenvalue.
For a general matrix $A$ the roots could be
multiple roots or complex valued.
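For example (a small symmetric case that is reused below), for
\[ A = \left[\begin{array}{cc} 2 & 1 \\ 1 & 2 \end{array}\right], \qquad \det(A - \lambda I) = (2-\lambda)^2 - 1 = (\lambda - 1)(\lambda - 3), \]
so the eigenvalues are $1$ and $3$, with eigenvectors $(1,-1)^T$ and $(1,1)^T$ since $A(1,-1)^T = (1,-1)^T$ and $A(1,1)^T = (3,3)^T = 3\,(1,1)^T$.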
Matrix $A$ is diagonalized by a non-singular matrix $P$ if $P^{-1}AP$ is diagonal. If so then $AP = PD$, where $D = P^{-1}AP$, so each column of $P$ is an eigenvector of $A$, with
the $i$th column having eigenvalue $D_{ii}$.
Thus to be diagonalizable $A$ must have $n$ linearly independent eigenvectors.
Suppose $A$ is symmetric and $\lambda_1 \neq \lambda_2$ are two eigenvalues of $A$, with eigenvectors $x_1$ and $x_2$. Then
\[ \lambda_1 x_1^T x_2 = (A x_1)^T x_2 = x_1^T A x_2 = \lambda_2 x_1^T x_2 , \]
and since $\lambda_1 \neq \lambda_2$ we see $x_1^T x_2 = 0$. Eigenvectors corresponding
to distinct eigenvalues are orthogonal.
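Continuing the illustrative example above: the eigenvectors $(1,1)^T$ and $(1,-1)^T$ of $A = \left[\begin{array}{cc} 2 & 1 \\ 1 & 2 \end{array}\right]$ are orthogonal, and the orthogonal matrix
\[ P = \frac{1}{\sqrt 2}\left[\begin{array}{cc} 1 & 1 \\ 1 & -1 \end{array}\right] \]
built from them diagonalizes $A$:
\[ P^T A P = \left[\begin{array}{cc} 3 & 0 \\ 0 & 1 \end{array}\right] . \]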
Defn: A symmetric matrix $A$ is non-negative definite if $x^T A x \ge 0$ for all $x$. It is positive definite if in addition $x^T A x = 0$ implies $x = 0$.
$A$ is non-negative definite iff all its eigenvalues are
non-negative.
$A$ is positive definite iff all its eigenvalues are positive.
A non-negative definite matrix has a symmetric non-negative definite square root. If $A = P D P^T$ with $P$ orthogonal and $D$ diagonal with non-negative entries, then $A^{1/2} = P D^{1/2} P^T$, where $D^{1/2}$ is the diagonal matrix whose entries are the square roots of those of $D$; then $A^{1/2} A^{1/2} = A$.
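With $A = \left[\begin{array}{cc} 2 & 1 \\ 1 & 2 \end{array}\right]$ and the orthogonal $P$ from the example above, $A = P\,\mbox{diag}(3,1)\,P^T$, so
\[ A^{1/2} = P \left[\begin{array}{cc} \sqrt 3 & 0 \\ 0 & 1 \end{array}\right] P^T = \frac{1}{2}\left[\begin{array}{cc} \sqrt 3 + 1 & \sqrt 3 - 1 \\ \sqrt 3 - 1 & \sqrt 3 + 1 \end{array}\right] , \]
and squaring this matrix does recover $A$.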
Suppose $S$ is a vector subspace of $R^n$, with $x_1, \ldots, x_p$ a basis for $S$. Given any $y \in R^n$ there is a unique $\hat y \in S$ which is closest to $y$; $\hat y$ minimizes $\|y - z\|^2$ over $z \in S$. Any $z \in S$ is of the form $z = Xa$, where $X$ is the $n \times p$ matrix with columns $x_1, \ldots, x_p$ and $a \in R^p$; then
\[ \|y - Xa\|^2 = (y - Xa)^T (y - Xa) = y^T y - 2 a^T X^T y + a^T X^T X a . \]
Note that $X^T X$ is a $p \times p$ invertible matrix, because the columns of $X$ are linearly independent. Completing the square in $a$ we see that
\[ \|y - Xa\|^2 = \left[ y^T y - y^T X (X^T X)^{-1} X^T y \right] + \left[ a - (X^T X)^{-1} X^T y \right]^T X^T X \left[ a - (X^T X)^{-1} X^T y \right] . \]
Choose $a$ to minimize: the first term does not involve $a$, so we need only minimize the second term, which is non-negative.
Achieved by making $a - (X^T X)^{-1} X^T y = 0$.
Since $X^T X$ is invertible we can
take $a = (X^T X)^{-1} X^T y$.
Summary: the closest point $\hat y$ to $y$
in $S$ is
\[ \hat y = X (X^T X)^{-1} X^T y . \]
Notice that the matrix $X (X^T X)^{-1} X^T$ is idempotent:
\[ \left[ X (X^T X)^{-1} X^T \right] \left[ X (X^T X)^{-1} X^T \right] = X (X^T X)^{-1} X^T . \]
$\hat y$, the orthogonal projection of $y$ onto $S$, is perpendicular to the residual $y - \hat y$.
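A simple illustrative case: take $S$ to be the span of the single vector $x_1 = (1,1,1)^T$ in $R^3$, so $X = (1,1,1)^T$, $X^T X = 3$ and
\[ X (X^T X)^{-1} X^T = \frac{1}{3}\left[\begin{array}{ccc} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{array}\right] . \]
For $y = (1,2,6)^T$ the projection is $\hat y = (3,3,3)^T$ (each entry is the average of the $y_i$), the residual is $y - \hat y = (-2,-1,3)^T$, and indeed $\hat y^T (y - \hat y) = 3(-2) + 3(-1) + 3(3) = 0$; squaring the matrix above reproduces it, confirming idempotence.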
Suppose $A_{11}$ is a $p \times p$ matrix, $A_{12}$ is $p \times q$, $A_{21}$ is $q \times p$ and $A_{22}$ is $q \times q$. Make the $(p+q) \times (p+q)$ matrix $A$ by putting the $A_{ij}$ in a 2 by 2 matrix:
\[ A = \left[ \begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \end{array}\right] . \]
We can work with partitioned matrices just like ordinary matrices, always making sure that in products we never change the order of multiplication.
\[ \left[ \begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \end{array}\right] + \left[ \begin{array}{cc} B_{11} & B_{12} \\ B_{21} & B_{22} \end{array}\right] = \left[ \begin{array}{cc} A_{11}+B_{11} & A_{12}+B_{12} \\ A_{21}+B_{21} & A_{22}+B_{22} \end{array}\right] \]
\[ \left[ \begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \end{array}\right] \left[ \begin{array}{cc} B_{11} & B_{12} \\ B_{21} & B_{22} \end{array}\right] = \left[ \begin{array}{cc} A_{11}B_{11}+A_{12}B_{21} & A_{11}B_{12}+A_{12}B_{22} \\ A_{21}B_{11}+A_{22}B_{21} & A_{21}B_{12}+ A_{22}B_{22} \end{array}\right] \]
Note the partitioning of $A$ and $B$ must match.
Addition: the dimensions of $A_{ij}$ and $B_{ij}$ must be the same.
Multiplication formula: $A_{ij}$ must
have as many columns as $B_{jk}$ has rows, etc.
In general: we need $A_{ik} B_{kj}$ to make sense for each $i$, $k$ and $j$.
This works with more than a 2 by 2 partitioning; see the worked example below.
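As an example of the multiplication rule (with identity and zero blocks chosen for simplicity), if $B$ is $p \times q$ then
\[ \left[\begin{array}{cc} I & B \\ 0 & I \end{array}\right] \left[\begin{array}{cc} I & -B \\ 0 & I \end{array}\right] = \left[\begin{array}{cc} I \cdot I + B \cdot 0 & I(-B) + B \cdot I \\ 0 \cdot I + I \cdot 0 & 0 \cdot (-B) + I \cdot I \end{array}\right] = \left[\begin{array}{cc} I & 0 \\ 0 & I \end{array}\right] , \]
so the two block triangular matrices are inverses of each other.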
Defn: block diagonal matrix: a partitioned matrix $A$ for which $A_{ij} = 0$ if $i \neq j$. If
\[ A = \left[\begin{array}{cc} A_{11} & 0 \\ 0 & A_{22} \end{array}\right] \]
and each $A_{ii}$ is invertible, then
\[ A^{-1} = \left[\begin{array}{cc} A_{11}^{-1} & 0 \\ 0 & A_{22}^{-1} \end{array}\right] \quad\mbox{and}\quad \det(A) = \det(A_{11})\,\det(A_{22}) . \]
Similar formulas work for larger matrices.
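For instance, the block diagonal matrix
\[ A = \left[\begin{array}{ccc} 2 & 1 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 3 \end{array}\right], \qquad A_{11} = \left[\begin{array}{cc} 2 & 1 \\ 1 & 1 \end{array}\right], \quad A_{22} = [3], \]
has $\det(A) = \det(A_{11})\det(A_{22}) = 1 \cdot 3 = 3$ and
\[ A^{-1} = \left[\begin{array}{ccc} 1 & -1 & 0 \\ -1 & 2 & 0 \\ 0 & 0 & 1/3 \end{array}\right] . \]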
Partitioned inverses. Suppose $A_{11}$, $A_{22}$ are symmetric positive
definite. Look for the inverse of
\[ A = \left[\begin{array}{cc} A_{11} & A_{12} \\ A_{12}^T & A_{22} \end{array}\right] \]
in the form
\[ A^{-1} = \left[\begin{array}{cc} B_{11} & B_{12} \\ B_{12}^T & B_{22} \end{array}\right] . \]
Multiplying out $A A^{-1} = I$ gives the equations
\[ A_{11}B_{11} + A_{12}B_{12}^T = I \qquad A_{11}B_{12} + A_{12}B_{22} = 0 \]
\[ A_{12}^T B_{11} + A_{22}B_{12}^T = 0 \qquad A_{12}^T B_{12} + A_{22}B_{22} = I . \]
Solve to get
\[ B_{11} = \left( A_{11} - A_{12} A_{22}^{-1} A_{12}^T \right)^{-1} \]
\[ B_{12} = - B_{11} A_{12} A_{22}^{-1} \]
\[ B_{22} = A_{22}^{-1} + A_{22}^{-1} A_{12}^T B_{11} A_{12} A_{22}^{-1} . \]
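A quick scalar check of these formulas (taking all blocks $1 \times 1$ just for illustration): with $A_{11} = 2$, $A_{12} = 1$, $A_{22} = 1$ we get $B_{11} = (2 - 1 \cdot 1 \cdot 1)^{-1} = 1$, $B_{12} = -1 \cdot 1 \cdot 1 = -1$ and $B_{22} = 1 + 1 \cdot 1 \cdot 1 \cdot 1 \cdot 1 = 2$, so
\[ \left[\begin{array}{cc} 2 & 1 \\ 1 & 1 \end{array}\right]^{-1} = \left[\begin{array}{cc} 1 & -1 \\ -1 & 2 \end{array}\right] , \]
agreeing with the $2 \times 2$ inverse computed earlier.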