1
Statistics 350 Lecture 2
2
Today
  • Last day: Sections 1.1-1.3
  • Today: Section 1.6
  • Homework #1:
  • Chapter 1 problems (pages 33-38): 2, 5, 6, 7, 22,
    26, 33, 34, 35
  • Due: January 19
  • Read Sections 1.1-1.3 and 1.6

3
Simple Linear Regression
  • Last day, introduced the simple linear regression
    model
  • Yi = β0 + β1Xi + εi for i = 1, 2, ..., n
  • In practice, we do not know the values of the βs
    nor of σ²
  • Use the data to estimate the model parameters,
    giving the estimated regression equation
  • Want to get the line of best fit; what does this
    mean?
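To make the model concrete, here is a minimal NumPy sketch that simulates data from it; the parameter values (β0 = 2, β1 = 0.5, σ = 1) and sample size are made up for illustration, not taken from the lecture.

```python
import numpy as np

# Simulate the simple linear regression model Yi = beta0 + beta1*Xi + eps_i,
# where the eps_i are iid N(0, sigma^2) errors.
# All parameter values below are illustrative assumptions.
rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 0.5, 1.0
n = 25

X = rng.uniform(0, 10, size=n)          # predictor values
eps = rng.normal(0, sigma, size=n)      # iid normal errors
Y = beta0 + beta1 * X + eps             # observed responses
```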

4
Apartment Example
5
Least Squares
  • Would like to get estimates b0 and b1 to obtain the
    estimated regression function
  • Would like the estimated line to come as close as
    possible to the data
  • Coming close to one point may push the line
    further from others
  • Would like all points to be close on average

6
Least Squares
  • The single criterion that is commonly used is the
    least squares criterion
  • Q = Σi (Yi - b0 - b1Xi)², summing over i = 1, ..., n
  • Want to select the values of b0 and b1 that minimize
    Q
  • How to minimize Q?
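The criterion Q can be evaluated directly for any candidate line; a minimal sketch (the toy data values here are hypothetical, chosen only to illustrate):

```python
import numpy as np

def Q(b0, b1, X, Y):
    """Least squares criterion: sum of squared vertical deviations
    of the Yi from the candidate line b0 + b1*Xi."""
    return np.sum((Y - b0 - b1 * X) ** 2)

# Hypothetical toy data for illustration
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 2.9, 4.2, 4.8])

# A line close to the points gives a small Q; a poor line gives a larger Q
q_good = Q(1.0, 1.0, X, Y)   # roughly 0.10 for these data
q_bad = Q(0.0, 0.0, X, Y)
```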

7
Least Squares
  • Partial derivatives:
  • ∂Q/∂b0 = -2 Σ (Yi - b0 - b1Xi)
  • ∂Q/∂b1 = -2 Σ Xi(Yi - b0 - b1Xi)
  • Setting both equal to zero gives the normal
    equations:
  • Σ Yi = n·b0 + b1 Σ Xi
  • Σ XiYi = b0 Σ Xi + b1 Σ Xi²

8
Least Squares
  • Solving the normal equations:
  • b1 = Σ (Xi - X̄)(Yi - Ȳ) / Σ (Xi - X̄)²
  • b0 = Ȳ - b1X̄
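The closed-form solutions above translate directly into code; a minimal sketch with hypothetical toy data (the same estimates are returned by `np.polyfit(X, Y, 1)`):

```python
import numpy as np

def least_squares(X, Y):
    """Closed-form least squares estimates from the normal equations:
    b1 = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2),
    b0 = Ybar - b1 * Xbar."""
    Xbar, Ybar = X.mean(), Y.mean()
    b1 = np.sum((X - Xbar) * (Y - Ybar)) / np.sum((X - Xbar) ** 2)
    b0 = Ybar - b1 * Xbar
    return b0, b1

# Hypothetical toy data for illustration
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 2.9, 4.2, 4.8])
b0, b1 = least_squares(X, Y)   # b0 = 1.15, b1 = 0.94 for these data
```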

9
Comments and Properties
  • b1 can be re-written as a linear combination of the
    Yi's: b1 = Σ kiYi, where ki = (Xi - X̄) / Σ (Xj - X̄)²
  • Therefore:
  • It is a statistic (a function of the data)
  • It is a random variable with its own distribution
  • It is a linear combination of independent normal
    random variables and thus will have a normal
    distribution
  • The same is true for b0
  • As we shall see, both are unbiased estimators of
    β0 and β1, respectively, so E(b0) = β0 and E(b1) = β1

10
Comments and Properties
  • The resulting estimated regression line is
    Ŷ = b0 + b1X
  • It gives an estimate of E(Y) for a given X
  • For the Xi's in the sample, can compute the
    predicted (or fitted) values Ŷi = b0 + b1Xi
  • The difference between the actual observed data
    and the predicted value is the residual ei = Yi - Ŷi
  • The residuals should resemble the εi's (Chapter 3)
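The fitted values and residuals defined above can be computed in a few lines; a minimal sketch (toy data values are hypothetical):

```python
import numpy as np

# Hypothetical toy data for illustration
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 2.9, 4.2, 4.8])

# Least squares estimates from the closed-form solutions
b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()

Y_hat = b0 + b1 * X    # predicted (fitted) values
e = Y - Y_hat          # residuals: observed minus fitted
```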

11
Comments and Properties
  • Residuals sum to 0: Σ ei = 0
  • The sum of the squared residuals, Σ ei², is
    minimized for these data (i.e., a property of least
    squares)

12
Comments and Properties
  • Σ Xiei = 0
  • The predicted response at the mean of the
    observed X's is Ȳ; that is, the fitted line passes
    through the point (X̄, Ȳ)
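These algebraic properties can be verified numerically; a minimal sketch using simulated data (the true parameter values are made-up assumptions):

```python
import numpy as np

# Numerical check of the least squares properties:
# residuals sum to 0, sum(Xi * ei) = 0, and the fitted line
# passes through the mean point (Xbar, Ybar).
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, 30)
Y = 1.0 + 2.0 * X + rng.normal(0, 1, 30)   # illustrative true line

b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()
e = Y - (b0 + b1 * X)

assert abs(e.sum()) < 1e-8                          # residuals sum to zero
assert abs((X * e).sum()) < 1e-8                    # weighted by Xi, also zero
assert abs(b0 + b1 * X.mean() - Y.mean()) < 1e-12   # line passes through the mean
```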

13
Comments and Properties
  • The sum of the residuals, weighted by the
    predicted responses, is also zero: Σ Ŷiei = 0
  • The mean square error is
    MSE = SSE / (n - 2) = Σ ei² / (n - 2)
  • It is useful because it is an unbiased estimator
    of σ²
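The MSE computation above can be sketched directly; the toy data values are hypothetical, and the divisor n - 2 reflects the two estimated parameters:

```python
import numpy as np

# Hypothetical toy data for illustration
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 2.9, 4.2, 4.8])
n = len(X)

# Least squares estimates and residuals
b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()
e = Y - (b0 + b1 * X)

SSE = np.sum(e ** 2)    # sum of squared residuals
MSE = SSE / (n - 2)     # divide by n - 2: two parameters were estimated
```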