Title: Evaluating Forecasting Models
Evaluating Forecasting Models
- In-sample versus out-of-sample forecasts
- Ex post versus ex ante forecasts
- Forecast error from forecasts of independent variables
7 Error Measures (Diebold, pp. 260-263)
- Mean (Average) Error (ME): the average of the error for each period
- Mean Absolute Deviation (MAD): the average of the absolute values of the error for each period
- Mean Squared Error (MSE): the average of the squared error for each period
- Standard Deviation (SD): the square root of the MSE
7 Error Measures (cont'd)
- Signed Squared Error (SSE): the average of the squared error, with the sign of the error retained
- Mean Percentage Error (MPE): the average of the percentage error for each period, where the percentage error is the error divided by the actual sales
- Mean Absolute Percentage Error (MAPE): the average of the absolute value of the percentage error for each period, where the percentage error is the error divided by the actual sales (all seven measures are written out in symbols after this list)
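In symbols (the slides do not fix notation, so the following is an assumed convention): let A_t be actual sales in period t, F_t the forecast for that period, e_t = A_t - F_t the forecast error, and n the number of periods forecast. The formulas given with each measure below use this notation.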
Error Measure Example
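The numerical example from the original slide is not reproduced here. The following Python sketch uses made-up actual and forecast values, together with the assumed convention e_t = A_t - F_t from above, purely to show how each of the seven averaged measures is computed:

import math

actual   = [100, 110, 120, 130, 140]   # hypothetical actual sales by period
forecast = [ 95, 115, 118, 135, 138]   # hypothetical forecasts for the same periods

errors = [a - f for a, f in zip(actual, forecast)]   # e_t = A_t - F_t
n = len(errors)

me   = sum(errors) / n                                    # Mean Error
mad  = sum(abs(e) for e in errors) / n                    # Mean Absolute Deviation
mse  = sum(e * e for e in errors) / n                     # Mean Squared Error
sd   = math.sqrt(mse)                                     # Standard Deviation (RMSE)
sse  = sum(math.copysign(e * e, e) for e in errors) / n   # Signed Squared Error
mpe  = 100 * sum(e / a for e, a in zip(errors, actual)) / n        # Mean Percentage Error
mape = 100 * sum(abs(e) / a for e, a in zip(errors, actual)) / n   # Mean Absolute Percentage Error

print(f"ME={me}, MAD={mad}, MSE={mse}, SD={sd:.3f}, SSE={sse}, MPE={mpe:.2f}%, MAPE={mape:.2f}%")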
Mean Error (ME)
- Also called the average error; this measure averages the errors for all periods forecast.
- shows direction of error
- does not penalize extreme errors
- errors cancel out (no idea of how much)
- in original units
Mean Absolute Deviation (MAD)
- This measure averages the absolute values of the errors for all periods forecast.
- shows magnitude of overall error
- does not penalize extreme errors
- errors do not cancel out
- no idea of direction of error
- in original units
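In the same notation:

\mathrm{MAD} = \frac{1}{n} \sum_{t=1}^{n} |e_t|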
Mean Squared Error (MSE)
- This measure averages the squares of the errors for all periods forecast.
- penalizes extreme errors
- errors do not offset one another
- not in original units
- does not show direction of error
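In symbols:

\mathrm{MSE} = \frac{1}{n} \sum_{t=1}^{n} e_t^2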
Standard Deviation (SD) or Root Mean Squared Error (RMSE)
- This measure is the square root of the mean squared error (MSE).
- penalizes extreme errors
- errors do not offset one another
- in original units
- does not show direction of error
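Equivalently, in symbols:

\mathrm{SD} = \mathrm{RMSE} = \sqrt{\mathrm{MSE}} = \sqrt{\frac{1}{n} \sum_{t=1}^{n} e_t^2}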
Signed Squared Error (SSE)
- This measure is similar to the mean squared error, except that the original sign of the error is kept for each period's squared error.
- penalizes extreme errors
- errors can offset one another
- shows direction of error
- not in original units
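The slide gives no formula; one way to render the verbal definition in symbols, using a sign function as an assumed device, is

\mathrm{SSE} = \frac{1}{n} \sum_{t=1}^{n} \operatorname{sign}(e_t)\, e_t^2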
Mean Percentage Error (MPE)
- This measure divides the error by the actual value; an average is then taken over all periods forecast.
- takes percentage of actual sales
- does not penalize extreme errors
- errors can offset one another
- shows direction of error
- assumes more sales can absorb more error in units
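In symbols (often quoted as a percentage, i.e. multiplied by 100):

\mathrm{MPE} = \frac{1}{n} \sum_{t=1}^{n} \frac{e_t}{A_t}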
Mean Absolute Percentage Error (MAPE)
- This measure averages the absolute values of the error divided by the actual value over all periods forecast.
- takes percentage of actual sales
- does not penalize extreme deviations
- does not cancel offsetting errors
- assumes more sales can absorb more error in units
- does not show direction of error
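In symbols:

\mathrm{MAPE} = \frac{1}{n} \sum_{t=1}^{n} \frac{|e_t|}{A_t}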
Theil's U-Statistic
- This measure takes the ratio of the MPE (defined above) to what the MPE would be if using the naïve method of letting the last actual value be the next forecast.
- If U = 1, the naïve method is as good as the forecasting technique being evaluated.
- If U < 1, the forecasting technique being used is better than the naïve method. The smaller the U-statistic, the better the forecasting technique relative to the naïve method.
- If U > 1, there is no point in using a formal forecasting method, since the naïve method will produce better results.
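Written out as the slide defines it (other texts build U from squared or absolute errors rather than the MPE, so treat this form as specific to these notes):

U = \frac{\mathrm{MPE}_{\text{forecast}}}{\mathrm{MPE}_{\text{naive}}}, \qquad \text{where the naïve forecast is } F_t^{\mathrm{naive}} = A_{t-1}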
McLaughlin's Batting Average
- This measure is a conversion of Theil's U-Statistic to something more intuitive. If the batting average is over .300, the forecasting method being evaluated is doing better than a naïve technique.
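The conversion itself is not spelled out on the slide; the form usually attributed to McLaughlin is

\text{Batting Average} = (4 - U) \times 100

so that U = 1 maps to .300, and any U below 1 pushes the average above .300, matching the interpretation above.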