
Linear Algebra

Linear algebra is perhaps the most important tool in scientific computing. We will start with a quick review of linear algebra. Notation:

Given an $n \times n$ real matrix $A$, we denote its inverse by $A^{-1}$ and its determinant by $\det(A)$. If the determinant is non-zero, the matrix is non-singular. For an $n \times n$ matrix $A$, the following statements are equivalent:

  1. $A$ is non-singular.
  2. $\det(A) \neq 0$.
  3. The linear system $Ax = 0$ has the only solution $x = 0$.
  4. For any vector $b$, the linear system $Ax = b$ has a unique solution.
  5. The columns (rows) of $A$ are linearly independent; that is, if $a_1, a_2, \ldots, a_n$ are the columns of $A$ and

     $c_1 a_1 + c_2 a_2 + \cdots + c_n a_n = 0,$

     then all the scalars $c_i$ are necessarily zero.

  6. $A$ has rank $n$, where the rank of a matrix is the number of linearly independent rows or columns.
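These equivalent conditions can be checked numerically. The following sketch (not part of the notes; the $2 \times 2$ matrix is a made-up example) verifies three of them with NumPy:

```python
import numpy as np

# Made-up non-singular test matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

det_A = np.linalg.det(A)           # condition 2: det(A) != 0
rank_A = np.linalg.matrix_rank(A)  # condition 6: rank equals n
A_inv = np.linalg.inv(A)           # exists precisely when A is non-singular

print(det_A)                               # 5.0 (up to rounding)
print(rank_A)                              # 2
print(np.allclose(A @ A_inv, np.eye(2)))   # True
```

In floating-point arithmetic, testing `det_A != 0` exactly is unreliable; the rank (or a condition-number estimate) is the more robust check.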

The transpose of $A$ is $A^T$. A matrix is symmetric if $A = A^T$. Furthermore, if $x^T A x > 0$ for all non-zero vectors $x$, then $A$ is positive definite.
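A practical way to test positive definiteness of a symmetric matrix is to attempt a Cholesky factorization, which succeeds exactly when the matrix is symmetric positive definite. A minimal sketch, assuming NumPy and a made-up test matrix:

```python
import numpy as np

def is_positive_definite(A):
    """Test x^T A x > 0 for all x != 0 (A symmetric) via Cholesky."""
    try:
        np.linalg.cholesky(A)
        return True
    except np.linalg.LinAlgError:
        return False

# Made-up symmetric test matrix.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

print(np.allclose(A, A.T))       # True: A is symmetric
print(is_positive_definite(A))   # True
print(is_positive_definite(-A))  # False: -A is negative definite
```

This avoids computing all eigenvalues just to check their signs.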

A submatrix of $A$ is obtained by deleting rows and columns of $A$. A principal submatrix results from deleting corresponding rows and columns (the same index set for both). The leading principal submatrix of size $k$ is obtained by deleting rows and columns $k+1, \ldots, n$.
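In array terms these definitions are simple index selections. A short sketch (not from the notes; the $4 \times 4$ matrix is a made-up example) using NumPy's zero-based indexing:

```python
import numpy as np

# Made-up 4x4 example matrix with entries 0..15.
A = np.arange(16, dtype=float).reshape(4, 4)

# Leading principal submatrix of size k: keep rows/columns 0..k-1
# (i.e., delete rows and columns k..n-1).
k = 2
leading = A[:k, :k]

# A principal submatrix: delete the SAME rows and columns
# (here row 1 and column 1, keeping indices 0, 2, 3).
keep = [0, 2, 3]
principal = A[np.ix_(keep, keep)]
```

`np.ix_` builds the cross-product of row and column indices, so the same index set is applied to both dimensions, matching the definition of a principal submatrix.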

Eigenvalues and Eigenvectors: The eigenvalues and eigenvectors of a matrix $A$ are the solutions of the matrix equation

$$A x = \lambda x, \qquad x \neq 0,$$

where $\lambda$ is the eigenvalue and $x$ is the eigenvector. The eigenvalues are the roots of the polynomial equation

$$\det(A - \lambda I) = 0.$$

This is the characteristic equation of $A$; it is a polynomial of degree $n$ in $\lambda$, so $A$ has precisely $n$ eigenvalues, counting multiplicities. Notice that the problem of computing the eigenvalues is well-conditioned in most cases, and stable algorithms are known for it. On the other hand, the problem of finding polynomial roots can be ill-conditioned. Consequently, the stable algorithms for eigenvalue computation DO NOT reduce the problem to finding the roots of the characteristic polynomial.
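The contrast between the two routes can be seen in NumPy: `numpy.linalg.eigvals` iterates on the matrix itself, while forming the characteristic polynomial and calling a root finder is the route the notes warn against. A sketch with a made-up diagonal matrix whose eigenvalues are known to be $1, 2, \ldots, 10$:

```python
import numpy as np

# Made-up test matrix with known eigenvalues 1, 2, ..., 10.
A = np.diag(np.arange(1.0, 11.0))

# Stable route: work on the matrix directly (QR-type iteration);
# the characteristic polynomial is never formed explicitly.
eigs = np.sort(np.linalg.eigvals(A).real)

# Unstable route: build the coefficients of det(A - lambda I) and
# find their roots.  Acceptable at this size, but for larger n tiny
# coefficient errors can perturb the computed roots badly.
coeffs = np.poly(A)                 # characteristic polynomial coefficients
roots = np.sort(np.roots(coeffs).real)

print(np.max(np.abs(eigs - np.arange(1.0, 11.0))))  # essentially zero
```

Already at modest degrees (Wilkinson's classic example uses roots $1, \ldots, 20$), the root-finding route loses accuracy dramatically, while the direct eigenvalue computation does not.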






Dinesh Manocha
Mon Jan 27 13:16:15 EST 1997