
Normal Equations

Consider the function

\[ \phi(x) = \tfrac{1}{2}\,\|Ax - b\|_2^2 , \]

where $A$ is an $m \times n$ matrix with $m \ge n$, and look for where the gradient of the function vanishes. Let $x + \epsilon y$ be the set of vectors in the local neighbourhood of $x$. For a minimum, we want, for every direction $y$,

\[ \lim_{\epsilon \to 0} \frac{\phi(x + \epsilon y) - \phi(x)}{\epsilon} = 0 . \]

Expanding $\phi(x + \epsilon y)$ gives

\[ \frac{\phi(x + \epsilon y) - \phi(x)}{\epsilon} = y^T A^T (Ax - b) + \frac{\epsilon}{2}\, y^T A^T A\, y . \]

The second term in the above formula converges to zero (as shown below):

\[ \frac{\epsilon}{2}\, y^T A^T A\, y = \frac{\epsilon}{2}\,\|Ay\|_2^2 \;\longrightarrow\; 0 \quad \text{as } \epsilon \to 0 . \]

Therefore, the gradient is zero at the solution of

\[ A^T (Ax - b) = 0 , \]

or, equivalently,

\[ A^T A\, x = A^T b . \]
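As a quick numerical illustration of this condition (an added sketch, not part of the original notes), the Python snippet below builds a random overdetermined system, solves it with numpy.linalg.lstsq, and checks that the gradient $A^T(Ax - b)$ vanishes at the computed solution; the matrix sizes, random seed, and tolerance are arbitrary choices.

    import numpy as np

    # Arbitrary overdetermined system with m > n.
    m, n = 50, 5
    rng = np.random.default_rng(0)
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)

    # Least-squares solution computed by NumPy's built-in solver.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)

    # The gradient of phi(x) = 0.5 * ||Ax - b||^2 is A^T (Ax - b);
    # at the minimizer it should be zero up to rounding error.
    grad = A.T @ (A @ x - b)
    print(np.linalg.norm(grad))              # tiny, e.g. on the order of 1e-13
    assert np.allclose(grad, 0.0, atol=1e-8)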

This system of $n$ linear equations in $n$ unknowns, $A^T A\, x = A^T b$, is known as the normal equations. The matrix $(A^T A)^{-1} A^T$ is also known as the pseudo-inverse of $A$. The minimum of the least squares system corresponds to $x = (A^T A)^{-1} A^T b$.

NOTE: Since $A^T A$ is s.p.d. (assuming $A$ has full column rank), we can use Cholesky decomposition ($A^T A = L L^T$). The overall cost would be about $m n^2 + n^3/3$ flops: roughly $m n^2$ to form $A^T A$ and $A^T b$, and $n^3/3$ for the Cholesky factorization.
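A minimal sketch of this approach, assuming NumPy/SciPy and a full-column-rank $A$ (so that $A^T A$ is s.p.d.): scipy.linalg.cho_factor and scipy.linalg.cho_solve perform the Cholesky factorization and the triangular solves, and the final comparison against numpy.linalg.lstsq is only a sanity check. The function name normal_equations_solve is invented here for illustration.

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    def normal_equations_solve(A, b):
        """Solve min_x ||Ax - b||_2 via the normal equations A^T A x = A^T b.

        Assumes A (m x n, m >= n) has full column rank, so A^T A is s.p.d.
        and admits a Cholesky factorization.
        """
        AtA = A.T @ A                 # forming A^T A: about m*n^2 flops
        Atb = A.T @ b
        c, low = cho_factor(AtA)      # Cholesky factorization: about n^3/3 flops
        return cho_solve((c, low), Atb)

    # Sanity check against NumPy's built-in least-squares solver.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 8))
    b = rng.standard_normal(100)
    x_ne = normal_equations_solve(A, b)
    x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.allclose(x_ne, x_ls))    # True for this well-conditioned example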

Why is $x = (A^T A)^{-1} A^T b$ the minimum?

  1. Note that the Hessian $A^T A$ is positive definite (which, together with the vanishing gradient, is a sufficient condition for a minimum).
  2. Let's consider any other candidate solution to the least squares problem, expressed as $x' = x + z$ for some $z \neq 0$. After we substitute this into the objective and simplify (the cross term vanishes because $A^T(Ax - b) = 0$), we obtain:

     \[ \|A x' - b\|_2^2 = \|Ax - b\|_2^2 + 2\, z^T A^T (Ax - b) + \|Az\|_2^2 = \|Ax - b\|_2^2 + \|Az\|_2^2 \;\ge\; \|Ax - b\|_2^2 . \]

     So clearly $x$ attains a minimum (a numerical check of this identity is sketched after this list).
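As that numerical check (again an added sketch with arbitrary sizes and seed, not part of the original notes), the snippet below perturbs the normal-equations solution by a nonzero $z$ and confirms that the squared residual grows by exactly $\|Az\|_2^2$:

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((60, 6))
    b = rng.standard_normal(60)

    # Solve the normal equations directly (fine for this small, well-conditioned example).
    x = np.linalg.solve(A.T @ A, A.T @ b)

    # Perturb the minimizer by an arbitrary nonzero z.
    z = rng.standard_normal(6)
    lhs = np.linalg.norm(A @ (x + z) - b) ** 2
    rhs = np.linalg.norm(A @ x - b) ** 2 + np.linalg.norm(A @ z) ** 2

    # The cross term 2 z^T A^T (Ax - b) vanishes at the minimizer, so the two
    # sides agree up to rounding and the perturbed residual is never smaller.
    print(np.isclose(lhs, rhs))                       # True
    print(lhs >= np.linalg.norm(A @ x - b) ** 2)      # True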



Dinesh Manocha
Tue Feb 3 23:49:47 EST 1998