Engineering Computation Curve Fitting 1 (PowerPoint transcript)

Transcript and Presenter's Notes
1
Engineering Computation Curve Fitting 1
Curve Fitting by Least-Squares Regression and Spline Interpolation (Part 7)
2
Engineering Computation Curve Fitting
Curve Fitting: Given a set of points (experimental data, tabular data, etc.), fit a curve (or surface) to the points so that we can easily evaluate f(x) at any x of interest.
If x is within the data range → interpolating (generally safe).
If x is outside the data range → extrapolating (often dangerous).
3
Engineering Computation Curve Fitting
• Curve Fitting
• Two main methods will be covered:
• 1. Least-Squares Regression
• The function is "best fit" to the data.
• Does not necessarily pass through the points.
• Used for scattered (experimental) data.
• Can develop models for analysis/design.
• 2. Interpolation
• The function passes through all (or most) points.
• Interpolates values of well-behaved (precise) data, or is used for geometric design.

4
Engineering Computation Curve Fitting: Interpolation
Curve Fitting
1. We have discussed Least-Squares Regression, where the function is "best fit" to the points but does not necessarily pass through them.
2. We now discuss Interpolation & Extrapolation, where the function passes through all (or at least most) points.
5
Engineering Computation Least Squares Regression: General Procedure
6
Engineering Computation Least Squares Regression
• Curve Fitting by Least-Squares Regression
• Objective:
• Obtain a low-order approximation (curve or surface) that "best fits" the data.
• Note:
• Because the order of the approximation is less than the number of data points, the curve or surface cannot pass through all points.
• We will need a consistent criterion for determining the "best fit."
• Typical Usage:
• Scattered (experimental) data
• Develop empirical models for analysis/design.

7
Engineering Computation Least Squares Regression
Least-Squares Regression
1. In the laboratory, apply x, measure y, tabulate the data.
2. Plot the data and examine the relationship.
9
Engineering Computation Least Squares Regression
• Least-Squares Regression
• 3. Develop a "model": an approximate relationship between y and x, e.g.
•   y ≈ m x + b
• 4. Use the model to predict or estimate y for any given x.
• 5. A "best fit" of the data requires:
• An optimal way of finding the parameters (e.g., the slope and intercept of a straight line).
• Perhaps optimizing the selection of the model form (i.e., linear, quadratic, exponential, ...).
• That the magnitudes of the residual errors do not vary in any systematic fashion. In statistical applications, the residual errors should be independent and identically distributed.
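The modeling steps above can be sketched numerically. This is a NumPy illustration (not from the slides); the "lab" data are invented for the example:

```python
import numpy as np

# Hypothetical measured data (steps 1-2: apply x, measure y, tabulate).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Step 3: model y ~ m*x + b, fitted by least squares.
m, b = np.polyfit(x, y, 1)

# Step 4: use the model to estimate y at a new x.
y_at_2p5 = m * 2.5 + b
```

For these points the fitted slope and intercept come out near m ≈ 1.99 and b ≈ 1.04.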

10
Engineering Computation Least Squares Regression
Least-Squares Regression
Given n data points (x1,y1), (x2,y2), ..., (xn,yn), obtain a "best fit" curve
  f(x) = a0 Z0(x) + a1 Z1(x) + a2 Z2(x) + ... + am Zm(x)
where the ai's are unknown parameters of the model and the Zi's are known functions of x.
We will focus on two of the many possible types of regression models:
Simple Linear Regression: Z0(x) = 1, Z1(x) = x
General Polynomial Regression: Z0(x) = 1, Z1(x) = x, Z2(x) = x^2, ..., Zm(x) = x^m
11
From MATLAB's help for REGRESS:
b = REGRESS(y,X) returns the vector of regression coefficients, b, in the linear model y = X*b (X is an n-by-p matrix, y is the n-by-1 vector of observations).
[B,BINT,R,RINT,STATS] = REGRESS(y,X,ALPHA) uses the input ALPHA to calculate 100(1 - ALPHA)% confidence intervals for B and the residual vector R, in BINT and RINT respectively. The vector STATS contains the R-square statistic along with the F and p values for the regression.

>> x = linspace(0,1,20)';
>> y = 2*x + 1 + 0.1*randn(20,1);
>> plot(x,y,'.')
>> xx = [ones(20,1), x];
>> b = regress(y,xx)
b =
    1.0115
    1.9941
>> yy = xx*b;
>> hold on
>> plot(x,yy,'k-')
12
Engineering Computation Least Squares Regression: General Procedure
Least Squares Regression (cont'd)
General Procedure: For the ith data point (xi, yi), we find the set of coefficients for which
  yi = a0 Z0(xi) + a1 Z1(xi) + ... + am Zm(xi) + ei
where ei is the residual error, the difference between the reported value and the model:
  ei = yi - a0 Z0(xi) - a1 Z1(xi) - ... - am Zm(xi)
Our "best fit" will minimize the total sum of the squares of the residuals.
13
Engineering Computation Least Squares Regression: General Procedure
Our "best fit" will be the function that minimizes the sum of squares of the residuals:
  Sr = Σ (i = 1 to n) ei^2
14
Engineering Computation Least Squares Regression: General Procedure
Least Squares Regression (cont'd)
To minimize this expression with respect to the unknowns a0, a1, ..., am, take the derivatives of Sr and set them to zero:
  ∂Sr/∂ai = 0, for i = 0, 1, ..., m
15
Engineering Computation Least Squares: Linear Algebra
Least Squares Regression (cont'd)
In linear-algebra form:
  Y = Z A + E, or E = Y - Z A
where
  E and Y are n x 1
  Z is n x (m+1)
  A is (m+1) x 1
(n points, (m+1) unknowns)
  E^T = [e1 e2 ... en], Y^T = [y1 y2 ... yn], A^T = [a0 a1 a2 ... am]
16
Engineering Computation Least Squares: Sum of Squared Errors
Least Squares Regression (cont'd)
With E = Y - Z A:
  Sr = E^T E = (Y - ZA)^T (Y - ZA)
     = Y^T Y - A^T Z^T Y - Y^T Z A + A^T Z^T Z A
     = Y^T Y - 2 A^T Z^T Y + A^T Z^T Z A
Setting ∂Sr/∂ai = 0 for i = 0, 1, ..., m yields
  0 = -2 Z^T Y + 2 Z^T Z A, or Z^T Z A = Z^T Y
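The result of this derivation can be checked numerically. Here is a small NumPy sketch (data invented for illustration) that builds Z for a quadratic model and solves the normal equations directly:

```python
import numpy as np

# Invented data lying exactly on y = 1 + 2x + 0.5x^2, so the fit
# should recover A = [a0, a1, a2] = [1, 2, 0.5].
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + 2.0 * x + 0.5 * x**2

# Z has columns Z0(x) = 1, Z1(x) = x, Z2(x) = x^2.
Z = np.column_stack([np.ones_like(x), x, x**2])

# Normal equations: (Z^T Z) A = Z^T Y.
A = np.linalg.solve(Z.T @ Z, Z.T @ y)
```

In practice a QR-based solver (e.g., numpy.linalg.lstsq) is preferred over forming Z^T Z explicitly, since Z^T Z can be badly conditioned; the normal-equations form is shown here because it mirrors the slide's derivation.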
17
Engineering Computation Least Squares: Normal Equations
Least Squares Regression (cont'd)
  Z^T Z A = Z^T Y      (CC Eq. 17.25)
This is the general form of the Normal Equations. They provide (m+1) equations in (m+1) unknowns. (Note that we end up with a system of linear equations.)
18
Engineering Computation Least Squares: Simple Linear Regression
Simple Linear Regression (m = 1)
Given n data points (x1,y1), (x2,y2), ..., (xn,yn) with n > 2, obtain the "best fit" curve f(x) = a0 + a1 x from the n equations:
  y1 = a0 + a1 x1 + e1
  y2 = a0 + a1 x2 + e2
  ...
  yn = a0 + a1 xn + en
Or, in matrix form: Z^T Z A = Z^T Y
19
Engineering Computation Least Squares: Simple Linear Regression
Simple Linear Regression (m = 1)
The Normal Equations Z^T Z A = Z^T Y, upon multiplying the matrices, become:
  n a0 + (Σ xi) a1 = Σ yi
  (Σ xi) a0 + (Σ xi^2) a1 = Σ xi yi
These are the Normal Equations for Linear Regression, CC Eqs. (17.4-5). (This form works well for spreadsheets.)
20
Engineering Computation Least Squares: Simple Linear Regression
Simple Linear Regression (m = 1): Z^T Z A = Z^T Y
Solving for A:
  a1 = (n Σ xi yi - Σ xi Σ yi) / (n Σ xi^2 - (Σ xi)^2)
  a0 = ȳ - a1 x̄
These are CC equations (17.6) and (17.7).
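These closed-form expressions are easy to evaluate from running sums, which is what makes them convenient in a spreadsheet. A NumPy sketch with invented data, checked against a library fit:

```python
import numpy as np

# Invented data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.8, 5.1, 7.0, 9.2, 10.9])

n = len(x)
Sx, Sy = x.sum(), y.sum()
Sxx, Sxy = (x * x).sum(), (x * y).sum()

# Slope (CC Eq. 17.6) and intercept (CC Eq. 17.7).
a1 = (n * Sxy - Sx * Sy) / (n * Sxx - Sx**2)
a0 = y.mean() - a1 * x.mean()
```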
21
Engineering Computation Least Squares: Simple Linear Regression
Simple Linear Regression (m = 1): Z^T Z A = Z^T Y
A better version of the first normal equation is obtained by dividing it by n:
  a0 + a1 x̄ = ȳ, i.e., a0 = ȳ - a1 x̄
which is easier and numerically more stable; the second equation remains the same.
22
ENGRD 241 / CEE 241 Engineering Computation Curve Fitting
Common Nonlinear Relations
Objective: Use linear equations for simplicity.
Remedy: Transform the data into linear form and perform the regression.
Given data which appears as:
• (1) Exponential-like curve: y = a1 e^(b1 x)
• (e.g., population growth, radioactive decay, attenuation of a transmission line)
• Can use the transformation ln(y) = ln(a1) + b1 x

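The exponential linearization above can be sketched in NumPy (values invented and noise-free, so the transform recovers the parameters essentially exactly):

```python
import numpy as np

# Exponential model y = a1 * exp(b1 * x); taking logs gives
# ln(y) = ln(a1) + b1*x, a straight line in x.
x = np.linspace(0.0, 4.0, 9)
y = 2.0 * np.exp(0.5 * x)          # true a1 = 2.0, b1 = 0.5

# Fit a straight line to (x, ln y): slope = b1, intercept = ln(a1).
b1, ln_a1 = np.polyfit(x, np.log(y), 1)
a1 = np.exp(ln_a1)
```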
23
ENGRD 241 / CEE 241 Engineering Computation Curve Fitting
• Common Nonlinear Relations
• (2) Power-like curve: y = a2 x^b2
• Use the transformation ln(y) = ln(a2) + b2 ln(x)
• (3) Saturation growth-rate curve: y = a3 x / (b3 + x)
• (e.g., population growth under limiting conditions)
• Be careful about the implied distribution of the errors. Always use the untransformed values for error analysis.

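Both transformations reduce to a straight-line fit on transformed coordinates. A NumPy sketch with invented, noise-free data:

```python
import numpy as np

# (2) Power model y = a2 * x^b2: fit a line to (ln x, ln y).
x = np.array([1.0, 2.0, 4.0, 8.0])
y = 3.0 * x**1.5                   # true a2 = 3.0, b2 = 1.5
b2, ln_a2 = np.polyfit(np.log(x), np.log(y), 1)
a2 = np.exp(ln_a2)

# (3) Saturation-growth model y = a3*x/(b3 + x): since
# 1/y = 1/a3 + (b3/a3)*(1/x), fit a line to (1/x, 1/y).
ys = 5.0 * x / (2.0 + x)           # true a3 = 5.0, b3 = 2.0
slope, intercept = np.polyfit(1.0 / x, 1.0 / ys, 1)
a3 = 1.0 / intercept
b3 = slope * a3
```

With noisy data the reciprocal transform in (3) heavily weights small-x points, which is exactly the caution on this slide: do the error analysis on the untransformed values.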
24
Engineering Computation Goodness of Fit
• Major Points in Least-Squares Regression
• In all regression models one is solving an overdetermined system of equations, i.e., more equations than unknowns.
• How good is the fit?
• Often based on the coefficient of determination, r^2.

25
Engineering Computation Goodness of Fit
r^2 compares the spread of the data about the regression line to the spread of the data about the mean:
  St = Σ (yi - ȳ)^2      (spread about the mean)
  Sr = Σ (yi - f(xi))^2   (spread about the regression line)
26
Engineering Computation Goodness of Fit
The coefficient of determination describes how much of the variance is explained by the regression equation:
  r^2 = (St - Sr) / St
• Want r^2 close to 1.0.
• Doesn't work for comparing models with different numbers of parameters.
• Be careful when using different transformations: always do the analysis on the untransformed data.

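Computing r^2 from St and Sr is direct. A NumPy sketch with invented, nearly linear data:

```python
import numpy as np

# Invented data that is close to a straight line.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.8, 3.1, 4.9, 7.2, 8.8, 11.1])

a1, a0 = np.polyfit(x, y, 1)
St = ((y - y.mean())**2).sum()        # spread about the mean
Sr = ((y - (a0 + a1 * x))**2).sum()   # spread about the regression line
r2 = (St - Sr) / St
```

Because the data are nearly linear, r^2 comes out very close to 1.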
27
Engineering Computation Standard Error of the Estimate
Precision: If the spread of the points around the line is of similar magnitude along the entire range of the data, then one can use the standard error of the estimate (the standard deviation in y),
  s_y/x = sqrt( Sr / (n - (m+1)) )
to describe the precision of the regression estimate (in which m+1 is the number of coefficients calculated for the fit, e.g., m+1 = 2 for linear regression).
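The standard error of the estimate follows directly from Sr and the degrees of freedom. A NumPy sketch with invented data:

```python
import numpy as np

# Invented data; for a straight line, m + 1 = 2 coefficients.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.8, 3.1, 4.9, 7.2, 8.8, 11.1])

a1, a0 = np.polyfit(x, y, 1)
Sr = ((y - (a0 + a1 * x))**2).sum()   # residual sum of squares
n = len(x)
s_yx = (Sr / (n - 2))**0.5            # s_y/x = sqrt(Sr / (n - (m+1)))
```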
28
Engineering Computation Standard Error of the Estimate
Statistics: Chapra and Canale, in sections PT5.2, 17.1.3 and 17.4.3, discuss the statistical interpretation of least-squares regression and some of the associated statistical concepts. The statistical theory of least-squares regression is elegant, powerful, and widely used in the analysis of real data throughout the sciences. See Lecture Notes pages X-14 through X-16.