Matrix Operations
Assume $A$, $B$, and $C$ are matrices and $c$, $k$ are scalars.
Matrix Times Vector
Given an $m \times n$ matrix $A$ and a vector $\vec{x} \in \mathbb{R}^n$, the product $A\vec{x}$ is the linear combination of the columns of $A$ weighted by the entries of $\vec{x}$: $$A\vec{x} = x_1\vec{a}_1 + x_2\vec{a}_2 + \dots + x_n\vec{a}_n$$
Matrix Times Matrix
Dot Product Rule
Given two matrices $A$ ($m \times n$) and $B$ ($n \times p$), the product $AB$ is the $m \times p$ matrix whose $(i, j)$ entry is the dot product of the $i$-th row of $A$ with the $j$-th column of $B$: $$(AB)_{ij} = \sum_{k=1}^{n} a_{ik}b_{kj}$$
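The dot-product rule can be sketched in Python with NumPy (the matrices below are made-up examples):

```python
import numpy as np

# Hypothetical example matrices: A is 2x3, B is 3x2, so AB is 2x2.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])

m, n = A.shape
n2, p = B.shape
assert n == n2, "inner dimensions must match"

# Dot product rule: entry (i, j) of AB is the dot product of
# row i of A with column j of B.
C = np.zeros((m, p))
for i in range(m):
    for j in range(p):
        C[i, j] = A[i, :] @ B[:, j]

print(C)                       # same result as the built-in A @ B
print(np.allclose(C, A @ B))   # True
```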
Determinants of a Matrix
Definition 1: If $A$ is a $1 \times 1$ matrix, then $\det(A) = a_{11}$.
Definition 2: If $A$ is a $2 \times 2$ matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, then $\det(A) = ad - bc$.
Definition 3*: If $A$ is an $n \times n$ matrix ($n \geq 3$), then $\det(A)$ is given by cofactor expansion: $$\det(A) = \sum_{j=1}^{n} (-1)^{1+j} a_{1j} \det(A_{1j})$$ where $A_{1j}$ is $A$ with row $1$ and column $j$ removed.
Determinants are recursive: you split a larger matrix into smaller ones, find the determinant of each, and combine them accordingly.
- To compute $\det(A)$, we can expand across any row or column
- It is better to simplify (using row operations) before computing $\det(A)$
This works because adding a multiple of one row to another does not change the determinant (note that scaling a row or swapping rows does change it; see the theorems below).
There are 2 easy ways to tell if the determinant of a matrix is 0:
- An entire row or column is zero
- Two rows (or two columns) are equal
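A minimal sketch of the recursive definition in Python (`det` is a hypothetical helper, not a library function; the $3 \times 3$ matrix is a made-up example):

```python
import numpy as np

def det(A):
    """Recursive determinant via cofactor expansion along the first row."""
    n = len(A)
    if n == 1:                      # Definition 1: det of a 1x1 matrix
        return A[0][0]
    if n == 2:                      # Definition 2: ad - bc
        return A[0][0] * A[1][1] - A[0][1] * A[1][0]
    total = 0
    for j in range(n):              # Definition 3*: expand across row 1
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

A = [[2, 0, 1],
     [1, 3, 2],
     [4, 1, 5]]
print(det(A))                       # 15, matches np.linalg.det

# Quick zero-determinant checks:
print(det([[1, 2], [1, 2]]))        # equal rows -> 0
print(det([[0, 0], [3, 4]]))        # zero row   -> 0
```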
Invertible Matrices
If $A$ is a square matrix, a matrix $B$ is the inverse of $A$ if and only if $$AB = BA = I$$
i.e. if matrices $A$ and $B$ commute and their product is the identity matrix, then $B$ is the inverse of $A$ and vice versa.
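A quick NumPy check of the defining property $AB = BA = I$, using a made-up invertible matrix:

```python
import numpy as np

# Hypothetical invertible matrix A.
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
B = np.linalg.inv(A)           # candidate inverse

I = np.eye(2)
# B is the inverse of A iff multiplication in either order gives I.
print(np.allclose(A @ B, I))   # True
print(np.allclose(B @ A, I))   # True
```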
Elementary Matrices
- A matrix $E$ is elementary if it can be obtained from $I$ using a single row operation.
e.g. if $E$ is obtained from $I$ by the row operation $R_2 \to R_2 + 2R_1$, then $$E = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix}$$
- The inverse of an elementary matrix is the elementary matrix of the inverse row operation (apply the opposite operation to $I$).
- Products and powers of elementary matrices can be computed by repeating the row operation: $E^n$ corresponds to applying the operation $n$ times.
e.g. if $E$ is obtained from $I$ by $R_2 \to R_2 + kR_1$, then $E^2$ corresponds to $R_2 \to R_2 + 2kR_1$, or more generally $E^n$ corresponds to $R_2 \to R_2 + nkR_1$.
Importance:
Suppose $A \to B$ takes one row operation. If $E$ is obtained from $I$ using this same row operation, then $$B = EA$$
i.e. performing a row operation is just matrix multiplication by an elementary matrix in disguise.
This can be extended to express a sequence of row operations as a product of elementary matrices.
e.g. assuming we get $B$ from $A$ using row operations corresponding to $E_1, E_2, \dots, E_k$ (in that order), then $$B = E_k \cdots E_2 E_1 A$$
We can also factor invertible matrices as products of elementary matrices.
e.g. assuming $A$ is an invertible matrix and $E_k \cdots E_2 E_1 A = I$ (row-reducing $A$ to $I$), then $$A = E_1^{-1} E_2^{-1} \cdots E_k^{-1}$$
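The "row operations are matrix multiplications in disguise" idea can be sketched in NumPy (the matrix and the row operation $R_2 \to R_2 - 3R_1$ are made-up examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Row operation R2 -> R2 - 3*R1 applied directly to A:
B = A.copy()
B[1] = B[1] - 3 * B[0]

# The same row operation applied to I gives an elementary matrix E:
E = np.eye(2)
E[1] = E[1] - 3 * E[0]

# The row operation is multiplication by E in disguise: B = E A.
print(np.allclose(B, E @ A))              # True

# The inverse of E is the inverse row operation (R2 -> R2 + 3*R1 on I):
E_inv = np.eye(2)
E_inv[1] = E_inv[1] + 3 * E_inv[0]
print(np.allclose(E @ E_inv, np.eye(2)))  # True
```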
Eigenvalues
Eigenvalues are paired with eigenvectors: a scalar $\lambda$ is an eigenvalue of $A$ if $A\vec{v} = \lambda\vec{v}$ for some nonzero vector $\vec{v}$.
To find an eigenvalue we use the formula: $$\det(A-\lambda I)=0$$
This comes from the equivalence $$A\vec{v} = \lambda\vec{v} \iff (A - \lambda I)\vec{v} = \vec{0}$$
Removing the trivial solution $\vec{v} = \vec{0}$ forces $A - \lambda I$ to be non-invertible, which happens exactly when $\det(A - \lambda I) = 0$.
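A sketch in NumPy, assuming a made-up $2 \times 2$ matrix; for the $2 \times 2$ case the characteristic equation $\det(A - \lambda I) = 0$ expands to $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A) = 0$:

```python
import numpy as np

# Hypothetical 2x2 example with easy eigenvalues.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic polynomial coefficients for a 2x2 matrix:
# lambda^2 - trace(A)*lambda + det(A) = 0
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
eigenvalues = np.roots(coeffs)
print(sorted(eigenvalues))     # [2.0, 5.0], same as np.linalg.eigvals(A)

# Each eigenvalue makes A - lambda*I singular:
for lam in eigenvalues:
    print(np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0))  # True
```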
Eigenvectors
Eigenvectors are paired with eigenvalues
To find an eigenvector we first have to find its corresponding eigenvalue $\lambda$.
From there we substitute the value of the eigenvalue into the equation $$(A - \lambda I)\vec{v} = \vec{0}$$ and solve for the nonzero solutions $\vec{v}$.
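A sketch of the substitution step in NumPy, assuming a made-up matrix whose eigenvalue $\lambda = 5$ has already been found:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0                          # an eigenvalue of A, found beforehand

# Substitute lambda into (A - lambda*I) v = 0 and find a nonzero solution.
M = A - lam * np.eye(2)
# For a singular 2x2 M with a nonzero first row, v = (-M[0,1], M[0,0])
# works: it is orthogonal to the first row, and the second row is a
# multiple of the first.
v = np.array([-M[0, 1], M[0, 0]])
print(v)                            # an eigenvector for lambda = 5

# Check the defining equation A v = lambda v:
print(np.allclose(A @ v, lam * v))  # True
```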
Theorems
#determinants
You cannot distribute determinants over sums of matrices, i.e. $\det(A + B) \neq \det(A) + \det(B)$ in general.
If $A$ is an $n \times n$ matrix and $c$ is a scalar, then $\det(cA) = c^n \det(A)$.
#determinants
If $A$, $B$ are both $n \times n$ matrices, then $$\det(AB) = \det(A)\det(B)$$
Prof. Hernandez calls it the "beautiful theorem"
This has a few consequences:
- If $B = A^{-1}$, the previous formula becomes $\det(A)\det(A^{-1}) = \det(I) = 1$, so $$\det(A^{-1}) = \frac{1}{\det(A)}$$
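The product rule and its consequence can be spot-checked numerically (random made-up matrices, which are invertible almost surely):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# The "beautiful theorem": det(AB) = det(A) det(B).
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))   # True

# Consequence with B = A^{-1}: det(A^{-1}) = 1 / det(A).
print(np.isclose(np.linalg.det(np.linalg.inv(A)),
                 1.0 / np.linalg.det(A)))                # True

# But det does NOT distribute over sums:
print(np.linalg.det(A + B), np.linalg.det(A) + np.linalg.det(B))
```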
#determinants
Suppose that $A$ is an $n \times n$ (square) matrix. Then the following statements are equivalent:
- A is invertible
- The only solution to $A\vec{x} = \vec{0}$ is $\vec{x} = \vec{0}$
- RREF of $A$ is $I$
- A is a product of elementary matrices
- The determinant of $A$ is nonzero, $\det(A) \neq 0$
#determinants
If $A$ is a square matrix:
- If a multiple of a row is added to another row, then $\det(A)$ is unchanged
- If a row is multiplied by a scalar $k$, then the determinant is also multiplied by $k$
- If a row is swapped with another, then the determinant changes sign
- If any two rows or columns are identical, then $\det(A) = 0$
- If there exists a row or column of zeroes, then $\det(A) = 0$
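These rules can be spot-checked in NumPy on a made-up $2 \times 2$ matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 3.0]])
d = np.linalg.det(A)                             # det(A) = 2

# Adding a multiple of one row to another leaves det unchanged:
B = A.copy(); B[1] += 5 * B[0]
print(np.isclose(np.linalg.det(B), d))           # True

# Scaling a row by k multiplies det by k:
C = A.copy(); C[0] *= 3
print(np.isclose(np.linalg.det(C), 3 * d))       # True

# Swapping two rows flips the sign:
S = A[[1, 0]]
print(np.isclose(np.linalg.det(S), -d))          # True
```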
#diagonalizeable
A square matrix $A$ is diagonalizable if $A = PDP^{-1}$ for some invertible matrix $P$ and diagonal matrix $D$.
Ideas:
- Diagonal matrices are easy to deal with (compute det, eigenvalue, eigenvector, etc.)
- If $A$ is diagonalizable, then $A$ inherits those properties from $D$ (e.g. the eigenvalues of $A$ are the diagonal entries of $D$)
#diagonalizeable
A square matrix $A$ of size $n \times n$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors (which happens, in particular, whenever it has $n$ distinct eigenvalues).
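A sketch of diagonalization and its payoff, on a made-up matrix (the eigenvector matrix $P$ and diagonal $D$ come from NumPy's `eig`):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are eigenvectors; D holds the eigenvalues on its diagonal.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Diagonalization: A = P D P^{-1}
print(np.allclose(A, P @ D @ np.linalg.inv(P)))          # True

# Why it's useful: A^5 = P D^5 P^{-1}, and D^5 just raises each
# diagonal entry to the 5th power.
A5 = P @ np.diag(eigvals ** 5) @ np.linalg.inv(P)
print(np.allclose(A5, np.linalg.matrix_power(A, 5)))     # True
```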
Properties
$c$ and $k$ are both constants while $A$, $B$, and $C$ are all matrices.
#commutative-property - A + B = B + A
#associative-property - (A + B) + C = A + (B + C)
#distributive-property - $c(A + B) = cA + cB$
#distributive-property - $(c + k)A = cA + kA$
- $(ck)A = c(kA)$
- $AB \neq BA$ in general: order matters in matrix multiplication
- $I$ is the identity matrix and has the same properties as the constant 1 for all matrices: $AI = IA = A$
- Inverted Matrices: $A^{-1}$ exists exactly when $\det(A) \neq 0$
- $(AB)^{-1} = B^{-1}A^{-1}$, which should be noted: order matters
- $(A^k)^{-1} = (A^{-1})^k$, and $k$ can be a transpose, i.e. $(A^T)^{-1} = (A^{-1})^T$
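The inverse-related properties can be spot-checked numerically (random made-up matrices, invertible almost surely):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
inv = np.linalg.inv

# (AB)^{-1} = B^{-1} A^{-1}: the order reverses.
print(np.allclose(inv(A @ B), inv(B) @ inv(A)))   # True

# The transpose plays along: (A^T)^{-1} = (A^{-1})^T.
print(np.allclose(inv(A.T), inv(A).T))            # True

# Matrix multiplication does not commute in general:
print(A @ B - B @ A)                              # generally nonzero
```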