**3.1 The Method of Least Squares: Regression Analysis**

The method of least squares minimizes the squares of the residuals; there is formal justification for doing so, as well as a long history and a vast literature (e.g. Williams 1959, Linnik 1961, Montgomery & Peck 1992).

For our particular example of fitting the ``regression line'', or a
straight line *y* = *ax* + *b*, through *N* pairs of
(*x _{i}*, *y _{i}*), minimizing the sum of the squared residuals gives the familiar closed-form estimates

$$a = \frac{N\sum_{i} x_{i} y_{i} - \sum_{i} x_{i} \sum_{i} y_{i}}{N\sum_{i} x_{i}^{2} - \left(\sum_{i} x_{i}\right)^{2}}, \qquad b = \frac{\sum_{i} y_{i} - a\sum_{i} x_{i}}{N}.$$

In the absence of knowledge of the how and why of a relation between
the *x _{i}* and the *y _{i}*, other simple forms may be fitted with the same expressions by transforming the data:
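The closed-form straight-line fit can be sketched in a few lines of plain Python; the function name `fit_line` and the pure-Python style are choices made here for illustration, not part of the text:

```python
# Closed-form least-squares fit of the straight line y = a*x + b
# through N data pairs, using only the standard library.

def fit_line(xs, ys):
    """Return (a, b) minimizing the sum of squared residuals (y_i - a*x_i - b)**2."""
    n = len(xs)
    sx = sum(xs)                                 # sum of x_i
    sy = sum(ys)                                 # sum of y_i
    sxy = sum(x * y for x, y in zip(xs, ys))     # sum of x_i * y_i
    sxx = sum(x * x for x in xs)                 # sum of x_i**2
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

On data lying exactly on a line, the estimates recover the slope and intercept to rounding error.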

- an exponential, *y* = *b* exp(*ax*): requires *y _{i}* to be changed to ln *y _{i}* in the above expressions;
- a power-law, *y* = *bx*^{a}: change *y _{i}* to ln *y _{i}* and *x _{i}* to ln *x _{i}*;
- a parabola, *y* = *b* + *ax*^{2}: change *x _{i}* to *x _{i}*^{2}.
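The log-transform recipes above can be sketched as follows; `fit_exponential` and `fit_power_law` are helper names chosen here, and the straight-line fit is repeated so the sketch is self-contained:

```python
import math

# Fitting an exponential or power-law by the substitutions described in
# the text: transform the data, then apply the ordinary straight-line fit.

def fit_line(xs, ys):
    """Closed-form least-squares fit of y = a*x + b; returns (a, b)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

def fit_exponential(xs, ys):
    """Fit y = b*exp(a*x): line fit of ln(y_i) against x_i."""
    a, ln_b = fit_line(xs, [math.log(y) for y in ys])
    return a, math.exp(ln_b)

def fit_power_law(xs, ys):
    """Fit y = b*x**a: line fit of ln(y_i) against ln(x_i)."""
    a, ln_b = fit_line([math.log(x) for x in xs],
                       [math.log(y) for y in ys])
    return a, math.exp(ln_b)
```

Note that the transformed fits minimize squared residuals in the log variables, not in the original *y*, which is one reason the Gaussian justification may not carry over.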

(Note that the residuals cannot be Gaussian for *all* of these
transformations: of course it is always possible to minimize the
squares of the residuals, but it may well not be possible to retain
the formal justification for doing so.)

There are many further variations. Algebra provides analogous expressions for weighted data-pairs and for the fitting of polynomials of any order. For all of these, the residuals can be examined to determine which model best describes the data.
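As one illustration (not from the text), `numpy.polyfit` fits polynomials of any order and accepts per-point weights via its `w` argument; comparing the summed squared residuals of a line and a parabola on the same data shows the kind of model comparison described:

```python
import numpy as np

# Compare candidate models by their residuals: a straight line (degree 1)
# versus a parabola (degree 2) fitted to parabolic data with small noise.
# The data and noise level here are invented for illustration.

rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 40)
y = 1.0 + 0.5 * x ** 2 + rng.normal(scale=0.05, size=x.size)

def sum_sq_residuals(deg):
    """Least-squares polynomial fit of the given degree; return sum of squared residuals."""
    coeffs = np.polyfit(x, y, deg)
    return float(np.sum((y - np.polyval(coeffs, x)) ** 2))

line_ss = sum_sq_residuals(1)    # straight-line fit
parab_ss = sum_sq_residuals(2)   # parabolic fit
# On these data the parabola leaves far smaller residuals than the line.
```

The same comparison extends to weighted fits by passing `w=1/sigma` (inverse standard deviations) to `np.polyfit`.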