Title: Linear regression
1 Linear regression: fitting a straight line to observations
2 Equation for a straight line
Difference between an observation and the line
e_i is the residual (error)
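In standard notation, the line and the residual at the i-th data point are:
y = a_0 + a_1 x + e
e_i = y_i - a_0 - a_1 x_i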
3 The goal in linear regression is to minimize the sum of the squared residuals, S_r.
To find the minimum, take the derivatives with respect to a_0 and a_1
and set them to zero.
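The quantity being minimized and the resulting conditions are:
S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2
\frac{\partial S_r}{\partial a_0} = -2 \sum (y_i - a_0 - a_1 x_i) = 0
\frac{\partial S_r}{\partial a_1} = -2 \sum \left[ (y_i - a_0 - a_1 x_i) x_i \right] = 0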
4 Some algebra yields the normal equations:
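n a_0 + \left( \sum x_i \right) a_1 = \sum y_i
\left( \sum x_i \right) a_0 + \left( \sum x_i^2 \right) a_1 = \sum x_i y_i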
5 Solve these two equations simultaneously.
The solutions are the least-squares linear regression coefficients:
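a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2}
a_0 = \bar{y} - a_1 \bar{x}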
6 Example
9 Error in linear regression
a_0 and a_1 are maximum-likelihood estimates.
The standard error of the estimate quantifies the spread around the regression line:
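s_{y/x} = \sqrt{\frac{S_r}{n - 2}}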
10 Another measure of goodness of fit is the coefficient of determination r^2 (or the correlation coefficient r).
The correlation coefficient can also be written directly in terms of the data:
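With S_t = \sum (y_i - \bar{y})^2 the total sum of squares about the mean, the coefficient of determination is:
r^2 = \frac{S_t - S_r}{S_t}
and the correlation coefficient can be computed directly from the data as:
r = \frac{n \sum x_i y_i - \left( \sum x_i \right) \left( \sum y_i \right)}{\sqrt{n \sum x_i^2 - \left( \sum x_i \right)^2} \; \sqrt{n \sum y_i^2 - \left( \sum y_i \right)^2}}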
11 For our example
12 Linearization of nonlinear relationships
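Common examples of such linearizations include:
Exponential model: y = \alpha_1 e^{\beta_1 x} \;\Rightarrow\; \ln y = \ln \alpha_1 + \beta_1 x
Power model: y = \alpha_2 x^{\beta_2} \;\Rightarrow\; \log y = \log \alpha_2 + \beta_2 \log x
Saturation-growth-rate model: y = \alpha_3 \frac{x}{\beta_3 + x} \;\Rightarrow\; \frac{1}{y} = \frac{1}{\alpha_3} + \frac{\beta_3}{\alpha_3} \frac{1}{x}
Each transformed form is linear in the transformed variables, so the straight-line formulas above apply.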
13 Polynomial regression extends linear regression to higher-order polynomials.
The sum of squared residuals becomes:
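For an m-th-order polynomial y = a_0 + a_1 x + a_2 x^2 + \cdots + a_m x^m, the sum of squared residuals is:
S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right)^2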
14 Take the derivatives of S_r with respect to each coefficient to minimize S_r,
and set them equal to zero.
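For the second-order case, the conditions are:
\frac{\partial S_r}{\partial a_0} = -2 \sum \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right) = 0
\frac{\partial S_r}{\partial a_1} = -2 \sum x_i \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right) = 0
\frac{\partial S_r}{\partial a_2} = -2 \sum x_i^2 \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right) = 0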
15 These can be written in matrix form as
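For the second-order polynomial, this linear system is:
\begin{bmatrix} n & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}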
16 We can solve this system with any number of matrix methods.
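As a minimal sketch of one such approach, the snippet below builds the normal-equation matrix for a quadratic fit and solves it with NumPy; the x and y values are hypothetical and only illustrate the procedure, not the example from these slides.

import numpy as np

# Hypothetical data, for illustration only (not the slides' example).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

m = 2          # order of the polynomial
n = len(x)

# Normal-equation matrix A[j][k] = sum(x_i^(j+k)) and right-hand side b[j] = sum(x_i^j * y_i)
A = np.array([[np.sum(x**(j + k)) for k in range(m + 1)] for j in range(m + 1)])
b = np.array([np.sum(x**j * y) for j in range(m + 1)])

a = np.linalg.solve(A, b)      # least-squares coefficients a_0, a_1, a_2

# Goodness of fit: sum of squared residuals and standard error of the estimate
Sr = np.sum((y - np.polyval(a[::-1], x))**2)
syx = np.sqrt(Sr / (n - (m + 1)))
print(a, syx)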
Example
17 After Gauss elimination, we obtain the coefficients.
18 Best-fit curve
19 Standard error for polynomial regression
s_{y/x} = \sqrt{\frac{S_r}{n - (m + 1)}}
- where
- n is the number of observations
- m is the order of the polynomial
(We start with n degrees of freedom and use up m + 1 of them for an m-th-order polynomial.)