Definition

The Gauss-Newton method is an iterative algorithm used to solve non-linear least squares problems. The method approximates the Hessian matrix using the Jacobian matrix.
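To see why the Jacobian suffices, consider the sum-of-squares objective $S(\boldsymbol\beta) = \sum_{i=1}^n r_i^2$ with residuals $r_i = y_i - f(x_i, \boldsymbol\beta)$, in the notation of the Algorithm section below. The exact Hessian is

$$\nabla^2 S(\boldsymbol\beta) = 2\,\mathbf{J}^\top \mathbf{J} - 2\sum_{i=1}^n r_i\, \nabla^2 f(x_i, \boldsymbol\beta) \;\approx\; 2\,\mathbf{J}^\top \mathbf{J},$$

where $\mathbf{J}$ is the Jacobian of $f$ with respect to $\boldsymbol\beta$. Dropping the second-order term, which is small when the residuals are small near the solution, gives the Gauss-Newton approximation.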

Algorithm

Consider a non-linear regression model

$$y_i = f(x_i, \boldsymbol\beta) + \epsilon_i, \qquad i = 1, \dots, n,$$

with $p$-dimensional explanatory variable $x_i$ and $m$-dimensional parameter vector $\boldsymbol\beta$. The first-order Taylor expansion gives the linear approximation of the model:

$$f(x_i, \boldsymbol\beta) \approx f(x_i, \boldsymbol\beta^{(t)}) + \mathbf{J}^{(t)}_i \left(\boldsymbol\beta - \boldsymbol\beta^{(t)}\right),$$

where $\boldsymbol\beta^{(0)}$ is the initial vector for $\boldsymbol\beta$ given by domain knowledge, $\boldsymbol\beta^{(t)}$ is the estimation vector at the $t$-th iteration with $t = 0, 1, 2, \dots$, and $\mathbf{J}^{(t)}$ is the Jacobian matrix of the function $f$ at $\boldsymbol\beta^{(t)}$, with entries $J^{(t)}_{ij} = \partial f(x_i, \boldsymbol\beta) / \partial \beta_j$ evaluated at $\boldsymbol\beta^{(t)}$.
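When analytic derivatives are unavailable, $\mathbf{J}^{(t)}$ can be approximated numerically. Below is a minimal forward-difference sketch; the helper name `jacobian`, the model signature `f(x, beta)`, and the step size `eps` are illustrative assumptions, not part of the original text:

```python
import numpy as np

def jacobian(f, x, beta, eps=1e-7):
    """Forward-difference approximation of the n x m Jacobian
    J_ij = d f(x_i, beta) / d beta_j at the current iterate beta."""
    beta = np.asarray(beta, dtype=float)
    f0 = f(x, beta)                        # model values at beta^(t)
    J = np.empty((f0.size, beta.size))
    for j in range(beta.size):
        step = np.zeros_like(beta)
        step[j] = eps                      # perturb one parameter at a time
        J[:, j] = (f(x, beta + step) - f0) / eps
    return J
```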

Now the approximated model takes the form of a multiple linear regression. Since $f(x_i, \boldsymbol\beta^{(t)})$ and $\mathbf{J}^{(t)}$ are constants given $\boldsymbol\beta^{(t)}$, we can find the LSE of the increment $\boldsymbol\delta = \boldsymbol\beta - \boldsymbol\beta^{(t)}$:

$$\hat{\boldsymbol\delta}^{(t)} = \left(\mathbf{J}^{(t)\top} \mathbf{J}^{(t)}\right)^{-1} \mathbf{J}^{(t)\top} \left(\mathbf{y} - \mathbf{f}(\boldsymbol\beta^{(t)})\right),$$

and the updating formula is defined as

$$\boldsymbol\beta^{(t+1)} = \boldsymbol\beta^{(t)} + \hat{\boldsymbol\delta}^{(t)}.$$

The update is repeated until it converges: $\boldsymbol\beta^{(t)}$ converges to the $\hat{\boldsymbol\beta}$ minimizing the sum of squared errors loss, given a proper initial value $\boldsymbol\beta^{(0)}$.
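Putting the pieces together, here is a minimal Python sketch of the iteration, assuming the `jacobian` helper above; `tol` and `max_iter` are illustrative stopping-rule parameters. It uses `np.linalg.lstsq` to solve the normal equations $(\mathbf{J}^\top \mathbf{J})\,\hat{\boldsymbol\delta} = \mathbf{J}^\top \mathbf{r}$ without forming the inverse explicitly:

```python
import numpy as np

def gauss_newton(f, x, y, beta0, tol=1e-8, max_iter=100):
    """Gauss-Newton iteration for min_beta ||y - f(x, beta)||^2."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        r = y - f(x, beta)                    # residuals at beta^(t)
        J = jacobian(f, x, beta)              # Jacobian J^(t)
        # LSE of the increment: delta minimizing ||r - J delta||^2
        delta, *_ = np.linalg.lstsq(J, r, rcond=None)
        beta = beta + delta                   # beta^(t+1) = beta^(t) + delta
        if np.linalg.norm(delta) < tol:       # stop once the updates converge
            break
    return beta

# Hypothetical usage: fit an exponential decay y = b0 * exp(-b1 * x)
f = lambda x, b: b[0] * np.exp(-b[1] * x)
x = np.linspace(0.0, 4.0, 50)
rng = np.random.default_rng(0)
y = f(x, np.array([2.0, 1.5])) + 0.01 * rng.standard_normal(x.size)
print(gauss_newton(f, x, y, beta0=[1.0, 1.0]))  # approaches [2.0, 1.5]
```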