1
Pertti Nurmi, Juha Kilpinen, Sigbritt Näsman, Annakaisa Sarkanen (Finnish Meteorological Institute)
Verification of Probabilistic Forecasts to be used as Decision-Making Guidance for Warnings against Near-Gale Force Winds
ECAM-7 / EMS-5, Utrecht, 12-16 September 2005
2
Example of Decision-Making using ROC Verification
[ROC diagram, Hit Rate H vs. False Alarm Rate F: at F = 16%, forecast method x gives H = 87%, forecast method y gives H = 78%]
H = a / (a + c), F = b / (b + d)
Note the difference: FAR = b / (a + b). For rare events, FAR >>> F
ROCA(x) = 0.93, ROCA(y) = 0.89
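The distinction between the False Alarm Rate F and the False Alarm Ratio FAR on the slide above can be made concrete with a small sketch; the contingency-table counts below are invented purely to illustrate a rare event, where FAR >>> F:

```python
# Hedged sketch: H, F and FAR from a 2x2 contingency table
# (a = hits, b = false alarms, c = misses, d = correct rejections).

def verification_rates(a, b, c, d):
    """Return Hit Rate H, False Alarm Rate F and False Alarm Ratio FAR."""
    H = a / (a + c)      # fraction of observed events that were forecast
    F = b / (b + d)      # fraction of non-events that were (falsely) forecast
    FAR = b / (a + b)    # fraction of "yes" forecasts that were wrong
    return H, F, FAR

# Invented rare event: only 30 occurrences out of 1000 cases
H, F, FAR = verification_rates(a=20, b=60, c=10, d=910)
print(H, F, FAR)   # H ~ 0.67, F ~ 0.06, FAR = 0.75  ->  FAR >>> F
```

Because d (correct rejections) dominates for rare events, F stays tiny even when most warnings are wrong, which is exactly why the slides warn not to confuse the two.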
3
ROC Curve Generation: 2 hypothetical forecast systems
[ROC diagram, Hit Rate H vs. False Alarm Rate F]
H = a / (a + c), F = b / (b + d)
30% threshold with F = 42%: FC System1 H = 74%, FC System2 H = 93%
Note the difference: FAR = b / (a + b). For rare events, FAR >>> F
Note: The two forecasting systems don't have to have the same F values for each probability threshold. The shown curves result from my lack of skill in using Excel spreadsheets!
4
ROC Curve Generation: 2 hypothetical forecast systems
[ROC diagram, Hit Rate H vs. False Alarm Rate F]
40% threshold with F = 23%: FC System1 H = 57%, FC System2 H = 86%
5
ROC Curve Generation: 2 hypothetical forecast systems
[ROC diagram, Hit Rate H vs. False Alarm Rate F]
50% threshold with F = 12%: FC System1 H = 38%, FC System2 H = 75%
6
ROC Curve Generation: 2 hypothetical forecast systems
[ROC diagram, Hit Rate H vs. False Alarm Rate F]
80% HIT TARGET: FC System1 ≥ 25% trigger threshold with F = 50%; FC System2 ≥ 45% trigger threshold with F = 16%
7
ROC Curve Generation: 2 hypothetical forecast systems
[ROC diagram, Hit Rate H vs. False Alarm Rate F]
10% F TARGET (not FAR): FC System1 ≥ 50% trigger threshold with H = 35%; FC System2 ≥ 50% trigger threshold with H = 70%
8
ROC Curve Generation
Example contingency table:
a = 19   b = 20
c = 53   d = 51
H = a / (a + c), F = b / (b + d)
To learn more about ROC and Signal Detection Theory, check http://wise.cgu.edu/
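The curve-generation procedure walked through on the preceding slides can be sketched in a few lines: sweep the trigger threshold over the forecast probabilities, compute (F, H) at each step, and integrate the points with the trapezoidal rule to get the ROC area. The forecast/observation pairs below are invented for illustration:

```python
# Minimal sketch of ROC curve generation for a probability forecast system.

def roc_points(probs, events, thresholds):
    """(F, H) operating points for each trigger threshold."""
    pts = [(1.0, 1.0)]               # threshold 0: warn always
    for t in sorted(thresholds):
        a = sum(p >= t and e for p, e in zip(probs, events))      # hits
        b = sum(p >= t and not e for p, e in zip(probs, events))  # false alarms
        c = sum(p < t and e for p, e in zip(probs, events))       # misses
        d = sum(p < t and not e for p, e in zip(probs, events))   # correct rejections
        pts.append((b / (b + d), a / (a + c)))
    pts.append((0.0, 0.0))           # threshold 1: never warn
    return pts

def roc_area(pts):
    """Trapezoidal ROC area from the (F, H) operating points."""
    pts = sorted(pts)                # ascending in F
    return sum((f2 - f1) * (h1 + h2) / 2
               for (f1, h1), (f2, h2) in zip(pts, pts[1:]))

probs = [0.9, 0.8, 0.7, 0.6, 0.3, 0.2, 0.1, 0.4]
events = [True] * 4 + [False] * 4
print(roc_area(roc_points(probs, events, [0.25, 0.5, 0.75])))
```

As the slides note, two systems need not share the same F values at each probability threshold; only the resulting (F, H) points, and the area under them, are compared.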
9
Introduction
  • Develop warning criteria / guidance methods to forecast the probability of near-gale force winds in the Baltic → Joint Scandinavian research undertaking
  • e.g. Finland and Sweden issue near-gale / storm force wind warnings for the same areas using different criteria → Homogenize!
  • Coastal stations of Finland, Sweden, Denmark, Norway
  • Probabilistic vs. deterministic approach
  • HIRLAM & ECMWF models
  • Different calibration methods, e.g. Kalman filtering
  • Goal: Common Scandinavian operational warning practice

10
Data
  • HIRLAM (Limited Area Model)
  • RCR: 22 km version (reference version)
  • MBE: 9 km version (operational since 2004)
  • Data coverage: Nov 2004 - Mar 2005 → 140 cases
  • ECMWF
  • Data interpolated to 0.5° × 0.5° → nearest grid point
  • Data coverage: Oct 2004 - Apr 2005 → 210 cases
  • Forecasts & obs: mean wind speed at 10 m height
  • Near-Gale → wind speed ≥ 14 m/s
  • Forecast lead times: 6-144 hrs
  • Special emphasis on early warning time range

11
Potential problems
  • with height of instrumentation ?
  • with observing site surroundings and obstacles ?
  • with the coast ?
  • with nearby islands ?
  • with barriers ?
  • with installations ?
  • with low-level stability ?

Statistical correction scheme available at FMI
12
Height of the instrumentation
[Map, color scale 5-55 m: large filled dots = the 6 Finnish stations being used; yellow dots = stations whose results are presented]
13
Methods for producing probabilistic forecasts (1)
  • ECMWF EPS (51 members) → P(wind speed ≥ 14 m/s)
  • ECMWF + Kalman filtering
  • Various approaches → no details given here
  • Deterministic forecast, dressed with an a posteriori description of the observed error distribution of the past, dependent sample → P(wind speed ≥ 14 m/s)
  • Simplistic reference!
  • Deterministic forecasts
  • Error distribution of the original sample (140 or 210 cases)
  • Approximation of the error distribution with a Gaussian fit (μ, σ)
  • "Dressing" method
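The dressing reference described above can be sketched as follows, under the stated assumptions: past forecast errors (forecast minus observation) are summarized by a Gaussian fit (μ, σ), and the deterministic forecast is converted into P(wind speed ≥ 14 m/s) by integrating that error distribution past the threshold. The numbers in the example are invented:

```python
# Sketch of the "dressing" of a deterministic forecast with a Gaussian
# error fit (mu = mean error fc - obs, sigma = error standard deviation).
import math

def dressed_probability(fc, mu, sigma, threshold=14.0):
    """P(obs >= threshold) assuming obs ~ N(fc - mu, sigma)."""
    z = (threshold - (fc - mu)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))   # 1 - Phi(z)

# Invented example: forecast 15 m/s, mean error +1 m/s, sigma 2 m/s
print(round(dressed_probability(15.0, 1.0, 2.0), 3))  # -> 0.5
```

Because μ and σ are estimated from the same (dependent) sample that is later verified, this is, as the slide says, a simplistic reference rather than an independent method.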

14
Methods for producing probabilistic forecasts (2)
  • Deterministic forecast, adjusted with a Gaussian fit to model forecasted stability
  • (Temperature forecasts from 2 adjacent model levels)
  • → P(wind speed ≥ 14 m/s) → Stability method
  • Scheme used at SMHI (H. Hultberg)
  • Uncertainty area method
  • (aka Neighborhood method)
  • (aka Probabilistic upscaling)
  • Spatial (Fig.) and/or temporal neighboring grid points
  • Size of uncertainty area? Size of time window?
  • c. 50-500 members
  • RCR: 3 points ≈ 150 × 150 km²
  • MBE: 6 points ≈ 120 × 120 km²
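A minimal sketch of the spatial part of the uncertainty-area (neighborhood) method just listed: the probability of near-gale at a point is taken as the fraction of wind speeds at neighboring grid points that exceed the threshold. The grid values and the neighborhood radius below are invented for illustration:

```python
# Sketch of probabilistic upscaling over a spatial neighborhood:
# each grid point in the window acts as one "member".

def neighborhood_probability(wind_grid, i, j, radius, threshold=14.0):
    """Fraction of grid points within `radius` of (i, j) exceeding threshold."""
    members = [wind_grid[r][c]
               for r in range(max(0, i - radius),
                              min(len(wind_grid), i + radius + 1))
               for c in range(max(0, j - radius),
                              min(len(wind_grid[0]), j + radius + 1))]
    return sum(w >= threshold for w in members) / len(members)

# Invented 3x3 window of 10 m wind speeds (m/s)
grid = [[12.0, 15.1, 16.2],
        [13.5, 14.8, 13.9],
        [11.0, 14.2, 15.5]]
print(neighborhood_probability(grid, 1, 1, radius=1))  # 5 of 9 points >= 14 m/s
```

Extending the member pool over a time window as well (the temporal variant on the slide) would simply add the same points at adjacent forecast times, giving the c. 50-500 members mentioned.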

15
Relative Operating Characteristic
Probabilistic FCs → ROC
  • To determine the ability of a forecasting system to discriminate situations when a signal is present (here, occurrence of near-gale) from no-signal cases (noise)
  • To test model performance (Hit Rate vs. False Alarm Rate) relative to a given threshold
  • Applicable to probability forecasts and also to categorical deterministic forecasts
  • Allows for their comparison
  • R statistical package used for ROC computation/presentation

16
ROC curve/area: Station_981, 24 hrs
Simple reference (dependent sample): ECMWF_Dressing
ROCA (fit) = 0.96
17
ROC curve/area: Station_981, 24 hrs
Simple reference (dependent sample): ECMWF_Dressing
ECMWF_EPS
ROCA (fit) = 0.96; ROCA (fit, EPS) = 0.92
18
ROC curve/area: Station_981, 24 hrs
Simple reference (dependent sample): ECMWF_Dressing
ECMWF_EPS
ECMWF_EPS_Kal
ROCA (fit) = 0.96; ROCA (fit, EPS) = 0.92; ROCA (fit, KAL) = 0.965
19
Comparison of methods: Station_981, 24 hrs
Brier Score: BS = (1/n) Σ (pᵢ - oᵢ)²
Brier Skill Score: BSS = 1 - BS / BS_ref
  • MSE in probability space
  • Sensitive to large forecast errors!
  • Careful with limited datasets!
  • Influenced by sample climatology
  • Different samples not to be compared

Range (BSS): -∞ to 1; Perfect score: 1
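The two scores defined above can be sketched directly from their formulas; the probability forecasts pᵢ, outcomes oᵢ, and the sample-climatology reference below are invented for illustration:

```python
# Sketch of the Brier Score BS = (1/n) * sum((p_i - o_i)^2) and the
# Brier Skill Score BSS = 1 - BS / BS_ref against a reference forecast.

def brier_score(p, o):
    return sum((pi - oi) ** 2 for pi, oi in zip(p, o)) / len(p)

def brier_skill_score(p, o, p_ref):
    return 1.0 - brier_score(p, o) / brier_score(p_ref, o)

p = [0.9, 0.1, 0.8, 0.3, 0.0]       # probability forecasts
o = [1,   0,   1,   0,   0]         # observed occurrences
clim = [sum(o) / len(o)] * len(o)   # sample climatology as reference
print(brier_score(p, o), brier_skill_score(p, o, clim))
```

Using the sample climatology as BS_ref is what makes the score sensitive to the sample: the same forecasts verified against a different climatology give a different BSS, which is why the slide warns against comparing different samples.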
20
Comparison of methods: Station_981, 24 hrs
21
ROC Area & BSS w.r.t. FC lead time: Station_981 (ECMWF)
[Panels: ROC Area; BSS]
22
ROC Area & BSS w.r.t. FC lead time: Station_987 (ECMWF)
[Panels: ROC Area; BSS]
23
Conclusions → Future
  • We've only scratched the (sea) surface
  • Need (much) more experimentation with various methods & models
  • Different methods for different time/space scales & models?
  • Apply to data of other Scandinavian counterparts (here, only 1-2 stations)
  • Scores depend on station properties (e.g. observation height; not dealt with here)
  • (Statistical) adjustment of original observations required! Finland has an operational scheme for this!
  • "Dressing" of dependent sample: quality level hard to reach
  • Higher resolution HIRLAM version produces higher scores; not necessarily a trivial result!
  • Kalman filtering reduces biases & improves ECMWF EPS
  • Reach the goal, i.e. common operational practice!!!
