Math 371 - Alexiades
Some Linear Algebra
Concepts
  • matrix operations, elementary row operations, REF, RREF, rank
  • transpose, diagonal, triangular, symmetric matrices
  • invertible matrix; Fundamental Thm for Square Matrices
  • determinant, cofactors, adjoint of A
  • vectors in ℝⁿ: operations, norm, unit vectors, dot product
  • parallel vectors, orthogonal vectors

    Core Methods:
  • write a linear m×n system in matrix form: Ax=b
  • Gauss elimination: reduce m×n matrix A to row-echelon form; find rank  (a sketch follows this list)
  • Gauss-Jordan elimination: reduce m×n A to reduced row-echelon form; find rank
  • decide consistency of an m×n system Ax = b ; solve Ax = b by Gauss elimination and by LU factorization of A
      ( a linear system has either no solution, exactly one solution, or infinitely many solutions )
  • find A⁻¹ by Gauss elimination (on [ A | I ]), and by LU factorization of A
  • find determinant by cofactor expansion
  • find eigenvalues and eigenvectors
  • vector operations; find length and direction (unit vector) of a vector, projection of a vector onto a unit vector
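
    A minimal Matlab sketch of Gauss elimination with back substitution (naive version, no pivoting; assumes a square nonsingular A with nonzero pivots; the 3×3 system is just a hypothetical example):
      % naive Gauss elimination + back substitution (illustration only -- no pivoting!)
      A = [2 1 -1; -3 -1 2; -2 1 2];   b = [8; -11; -3];   % example: solution is x = [2; 3; -1]
      n = length(b);
      for k = 1:n-1                          % eliminate below the pivot A(k,k)
          for i = k+1:n
              m = A(i,k) / A(k,k);           % multiplier (assumes pivot is nonzero)
              A(i,k:n) = A(i,k:n) - m*A(k,k:n);
              b(i) = b(i) - m*b(k);
          end
      end
      x = zeros(n,1);                        % back substitution on the triangular system
      for i = n:-1:1
          x(i) = ( b(i) - A(i,i+1:n)*x(i+1:n) ) / A(i,i);
      end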

  • Fundamental Theorem for square matrices:  
            For a square n×n matrix A, the following are equivalent:
          1. A is invertible
          2. A ∼ Iₙ   (i.e. A is row equivalent to the Identity)
          3. A is expressible as a product of elementary matrices
          4. rank(A) = n
          5. the n columns (and rows) of A are linearly independent vectors in ℝⁿ
          6. the homogeneous system Ax = 0 has only the trivial solution
          7. the nonhomogeneous system Ax = b has a unique solution for any vector b ∈ ℝⁿ
          8. det(A) ≠ 0
          9. λ = 0 is not an eigenvalue of A
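
    These equivalences are easy to test numerically in Matlab for any given square matrix (the 2×2 below is just a hypothetical example; in finite precision, compare against a tolerance rather than exact zero):
      A = [4 2; 1 3];   n = size(A,1);
      rank(A) == n                  % condition 4: full rank
      det(A) ~= 0                   % condition 8: nonzero determinant
      all( abs(eig(A)) > 1e-12 )    % condition 9: 0 is not an eigenvalue (with tolerance)
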
    Numerical aspects:
  • Gauss elimination is EXACT in infinite precision arithmetic, but NOT in finite precision arithmetic !!!
  • to preserve accuracy, one must use partial pivoting (use as pivot the largest |entry| in the column), which slows it down...
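
    A classic 2×2 illustration of why pivoting matters (hypothetical example): a tiny pivot produces a huge multiplier, and naive elimination loses the answer entirely, while backslash (which pivots) does not:
      e0 = 1e-20;
      A = [e0 1; 1 1];   b = [1; 2];     % exact solution is very nearly [1; 1]
      m   = A(2,1)/A(1,1);               % multiplier 1e20 -- huge!
      u22 = A(2,2) - m*A(1,2);           % fl(1 - 1e20) = -1e20 : the "1" is lost
      x2  = (b(2) - m*b(1)) / u22;       % gives x2 = 1  (fine)
      x1  = (b(1) - x2) / A(1,1);        % gives x1 = 0  -- completely wrong!
      x   = A \ b;                       % with partial pivoting: x is approx [1; 1]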

    Operation counts:
  • for LU factorization:   ∼ n³/3 ,   and   ∼ n²   for forward/backward substitutions
  • so for solving via LU:   ∼ n³/3 + 2n²
  • for finding A⁻¹:   ∼ n³ at best,   and   ∼ n² for A⁻¹b
  • Numerically, always use LU !!! it is about 3 times faster ( ∼n³ vs ∼n³/3 )
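
    A quick (hypothetical) experiment to see this in Matlab -- timings vary by machine, but backslash should win clearly:
      n = 2000;   A = rand(n) + n*eye(n);   b = rand(n,1);   % a well-conditioned test matrix
      tic;  x1 = A \ b;        toc         % LU with pivoting: ~n^3/3 operations
      tic;  x2 = inv(A) * b;   toc         % explicit inverse: ~n^3 operations, and less accurate too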

    Matlab functions:
  • x = A \ b;   solves Ax=b via LU with partial pivoting for square A ( least-squares via QR for rectangular m×n A )
  • [L , U] = lu(A);   for LU=A   or   [L, U, P] = lu(A);   for LU=PA
  • Aref = rref(A);   = reduced row echelon form of A
  • Ainv = inv(A);   = A⁻¹
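
    When the same A must be solved with many right-hand sides, factor once and reuse the factors; a minimal sketch:
      [L, U, P] = lu(A);            % one-time factorization, LU = PA : ~n^3/3 operations
      x = U \ ( L \ (P*b) );        % each solve is just two triangular solves: ~2n^2
      % any new right-hand side b reuses L, U, P -- no refactorization needed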

    Variants of Gauss elimination: ... there are many...
  • Compact methods:   there exist explicit formulas for [lᵢⱼ], [uᵢⱼ] involving dot products, which can be computed efficiently
  • Cholesky method   for SPD matrices ( Symmetric Positive Definite:   A = Aᵀ and xᵀAx > 0 for all nonzero x ∈ ℝⁿ )
                  gives an efficient factorization A = LLᵀ , in ∼ n³/6 operations
  • Band systems:   have only a few nonzero bands; they arise from discretization of PDEs; only the entries of the nonzero bands need be stored, and L, U can be found directly
      e.g. a tridiagonal system can be solved by the tridiagonal (Thomas) algorithm in only 5n−4 operations!
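
    A minimal sketch of the tridiagonal (Thomas) algorithm, assuming the three bands are stored as vectors a (sub-diagonal, a(1) unused), d (diagonal), c (super-diagonal), and that no pivoting is needed (e.g. A diagonally dominant):
      n = length(d);
      for i = 2:n                            % forward elimination sweep
          m = a(i) / d(i-1);                 % multiplier for row i
          d(i) = d(i) - m*c(i-1);
          b(i) = b(i) - m*b(i-1);
      end
      x = zeros(n,1);                        % back substitution
      x(n) = b(n) / d(n);
      for i = n-1:-1:1
          x(i) = ( b(i) - c(i)*x(i+1) ) / d(i);
      end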

    Numerical Linear Algebra types of methods:
  • Direct methods:
        • based on Gauss elimination (LU and variants)
        • destroy sparsity (elimination causes fill-in), so preferred for dense A (but take up lots of memory...)
  • Iterative methods:
        • start with a guess and try to improve it by iteration; there are many such methods...
              simplest:   Jacobi, Gauss-Seidel, SOR ;   more involved:   CG (Conjugate Gradient), SVD, Krylov, ...
        • strongly preferred for sparse systems
        • Jacobi: solve the i-th equation for the i-th unknown and iterate (a fixed-point iteration)
        • Gauss-Seidel: same, but use the latest available xⱼ. Easier to implement but harder to parallelize than Jacobi, and about 2 times faster
        • SOR:   x^(k+1)_SOR = (1−ω) x^(k) + ω x^(k+1)_GS ,   with 0 < ω < 2
              if ω = 1: same as GS ;  if 0 < ω < 1: under-relaxation (slower, but more stable) ;  if 1 < ω < 2: over-relaxation (faster convergence, but can become unstable)
              An optimal ω exists but is hard to find (usually chosen by trial and error)
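
    A minimal sketch of one sweep of each iteration in Matlab (assuming all A(i,i) ≠ 0, a given relaxation parameter omega, and a current iterate x; sweeps are repeated until the residual norm(b - A*x) is small):
      xold = x;                              % one Jacobi sweep: uses only old values
      for i = 1:n
          j = [1:i-1, i+1:n];                % all indices except i
          x(i) = ( b(i) - A(i,j)*xold(j) ) / A(i,i);
      end

      for i = 1:n                            % one Gauss-Seidel / SOR sweep (in place)
          j = [1:i-1, i+1:n];
          xgs  = ( b(i) - A(i,j)*x(j) ) / A(i,i);   % uses latest values automatically
          x(i) = (1-omega)*x(i) + omega*xgs;        % omega = 1 recovers Gauss-Seidel
      end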