Predictability
Transcript and Presenter's Notes

Title: Predictability


1
Predictability & Prediction of Seasonal Climate
over North America
  • Lisa Goddard, Simon Mason, Ben Kirtman, Kelly
    Redmond, Randy Koster, Wayne Higgins, Marty
    Hoerling, Alex Hall, Jerry Meehl, Tom Delworth,
    Nate Mantua, Gavin Schmidt (US CLIVAR PPAI Panel)

2
Time Series of Prediction Skill
(Courtesy of Arun Kumar & Ants Leetmaa)
  • (1) Understand the limit of predictability
  • (2) Identify conditional predictability (e.g.
    state of ENSO or the Indian Ocean)
  • (3) Document the expected skill, to judge the
    potential utility of the information for
    decision support
  • (4) Set a baseline for testing improvements to
    prediction tools and methodologies
  • (5) Set a target for real-time predictions
3
Real-time prediction skill: North America,
1-month lead, seasonal terrestrial climate
  • Provide a template for verification
  • - What are the best metrics? Best for whom?
  • - Pros & cons of current metrics
  • - Can we capture important aspects of
    variability (e.g. trends, drought periods)?
  • Estimate skill of real-time forecasts
  • - How predictable is North American climate?
  • - Benefit of multi-model ensembling?
  • Provide a baseline against which we can judge
    future advances
  • - How best to archive/document for future
    comparison?
  • - Are we missing something? (e.g. statistical
    models)

4
Forecast Data
  • Dynamical models (single)
  • CCCma - Canadian Centre for Climate Modelling
    and Analysis
  • KMA - Korea Meteorological Administration
  • MGO - Main Geophysical Observatory, Russia
  • NASA/GMAO - National Aeronautics and Space
    Administration, USA
  • RPN - Canadian Meteorological Centre
  • ECHAM4.5 - MPI (run at IRI)
  • CCM3.6 - NCAR (run at IRI)
  • ECMWF - European Centre for Medium-Range
    Weather Forecasts
  • Meteo-France - Meteorological Service, France
  • LODYC - Laboratoire d'Océanographie Dynamique
    et de Climatologie, France
  • Met Office - UK Meteorological Office
  • MPI - Max Planck Institute for Meteorology,
    Germany
  • CERFACS - European Centre for Research and
    Advanced Training in Scientific
    Computing, France
  • INGV - Istituto Nazionale di Geofisica e
    Vulcanologia, Italy
  • NOAA-CFS - National Oceanic and Atmospheric
    Administration, USA
  • Multi-model of dynamical models (simple average)
  • Statistical models (from CPC): CCA, OCN (others?)
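The "simple average" multi-model combination listed above can be sketched in a few lines: each model contributes its ensemble-mean anomaly on a common grid, and the multi-model forecast is the unweighted mean across models. Grid sizes and data here are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_models, ny, nx = 4, 48, 96  # illustrative model count and grid size

# Per-model ensemble-mean anomalies, already regridded to a common grid
model_means = rng.normal(size=(n_models, ny, nx))

# Unweighted "simple average" multi-model forecast
multi_model = model_means.mean(axis=0)
print(multi_model.shape)  # (48, 96)
```

Unweighted averaging is the simplest combination; skill-weighted schemes exist but are outside the scope of this slide.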

5
Forecast Data
(NX x NY grid; NM ensemble members; L = lead range,
months; S = hindcast start dates)

Model         NX    NY   NM   L        S
CCCma-GCM2     96   48   10   0.5-3.5  Mar 1969 - Dec 2003, every 3 months
CCCma-GCM3    128   64   10   0.5-3.5  Mar 1969 - Dec 2003, every 3 months
KMA           144   73    6   2.5-8.5  Jan 1979 - Dec 2002
MGO           144   73    6   0.5-3.5  Nov 1978 - Nov 2000, every 3 months
NASA-GMAO     144   90    6   1.5-3.5  Feb 1993 - Nov 2002, every 3 months
RPN           192   96   10   0.5-3.5  Mar 1969 - Dec 2000, every 3 months
ECHAM4.5      128   64   24   0.5-6.5  Jan 1958 - Dec 2002
CCM3.6        128   64   24   0.5-6.5  Jan 1958 - Dec 2002
ECMWF         144   71    9   0.5-5.5  Feb 1958 - Nov 2001, every 3 months
Meteo-France  144   71    9   0.5-5.5  Feb 1958 - Nov 2001, every 3 months
LODYC         144   71    9   0.5-5.5  Feb 1974 - Nov 2001, every 3 months
MetOffice     144   71    9   0.5-5.5  Feb 1959 - Nov 2001, every 3 months
MPI           144   71    9   0.5-5.5  Feb 1969 - Nov 2001, every 3 months
CERFACS       144   71    9   0.5-5.5  Feb 1980 - Nov 2001, every 3 months
INGV          144   71    9   0.5-5.5  Feb 1973 - Nov 2001, every 3 months
CFS           192   94   15   0.5-8.5  Jan 1981 - Dec 2003
6
Forecast Data: JJA & DJF (1981-2001)
(Model table identical to the previous slide.)
7
Verification Data & Metrics
  • OBSERVATIONAL DATA (2.5° x 2.5°)
  • 2 m temperature: CRU TS v2.0 (1901-2002)
  • Precipitation: CMAP (1979-2004)
  • VERIFICATION MEASURES
  • Metrics consistent with the WMO SVS for LRF
    (Standardised Verification System for Long-Range
    Forecasts)
  • Deterministic information
  • - MSE and its decomposition: correlation, mean
    bias, variance ratio
  • Probabilistic information
  • - Reliability diagrams, regionally accumulated
  • - ROC areas for individual grid boxes

8
Mean Squared Error
9
Mean Squared Error
  • Pro
  • Gives some estimate of the uncertainty in the
    forecast (i.e. RMSE).
  • Con
  • Cannot infer the frequency of large errors unless
    precise distributional assumptions are met.
  • Recommendation
  • Perhaps a simple graph or table showing the
    frequency of errors of different magnitudes would
    be appropriate.
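The decomposition referred to on the previous slide (correlation, mean bias, variance ratio) can be checked numerically. The synthetic forecast/observation series below are illustrative only; the identity is MSE = bias^2 + (s_f - s_o)^2 + 2 s_f s_o (1 - r).

```python
import numpy as np

rng = np.random.default_rng(1)
obs = rng.normal(size=200)                                # "observations"
fcst = 0.6 * obs + rng.normal(scale=0.8, size=200) + 0.2  # biased forecast

mse = np.mean((fcst - obs) ** 2)
bias = fcst.mean() - obs.mean()
sf, so = fcst.std(), obs.std()
r = np.corrcoef(fcst, obs)[0, 1]

# MSE decomposition into mean bias, variance mismatch, and association
decomp = bias**2 + (sf - so) ** 2 + 2 * sf * so * (1 - r)
assert np.isclose(mse, decomp)
print(f"RMSE = {np.sqrt(mse):.3f}")
```

Each term isolates one failure mode: a forecast can have low bias and the right variance yet still score a large MSE through poor correlation alone.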

10
Correlation: Temperature, DJF 1981-2001
11
Correlation: Temperature, JJA 1981-2001
12
Correlation: Precipitation, DJF 1981-2001
13
Correlation: Precipitation, JJA 1981-2001
14
Correlation
  • Pros
  • Commonly used and familiar
  • Gives a simple overview of where models are
    likely to have skill
  • Con
  • Merely a measure of association, not of forecast
    accuracy
  • Recommendation
  • Avoid deterministic metrics
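The con noted above (correlation measures association, not accuracy) is easy to demonstrate with synthetic data: a forecast with the wrong amplitude and a large bias can still correlate perfectly with the observations.

```python
import numpy as np

rng = np.random.default_rng(2)
obs = rng.normal(size=100)       # "observations"
fcst = 2.0 * obs + 5.0           # wrong amplitude and a large bias

r = np.corrcoef(fcst, obs)[0, 1]
rmse = np.sqrt(np.mean((fcst - obs) ** 2))
print(round(r, 2))   # 1.0 -- perfect association
print(rmse)          # large errors nonetheless
```

Any positive linear transform of the observations scores r = 1, which is why correlation alone says nothing about calibration or amplitude errors.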

15
(No Transcript)
16
Example
Ensemble forecasts of above-median March-May
rainfall over north-eastern Brazil
17
(No Transcript)
18
ROC Areas: DJF Temperature, Below-Normal
19
ROC Areas: DJF Temperature, Above-Normal
20
ROC Areas: JJA Temperature, Above-Normal
21
ROC Areas: JJA Temperature, Below-Normal
22
ROC Areas: DJF Precipitation, Above-Normal
23
ROC Areas: DJF Precipitation, Below-Normal
24
ROC Areas: JJA Precipitation, Above-Normal
25
ROC Areas: JJA Precipitation, Below-Normal
26
ROC Areas
  • Pros
  • Can treat probabilistic forecasts
  • Can be provided point-wise
  • Can distinguish asymmetric skill
  • Cons
  • Fails to address reliability
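A point-wise ROC area can be computed without any plotting via the rank-sum identity: the ROC area equals the probability that a randomly chosen event case received a higher forecast probability than a randomly chosen non-event case. The toy probabilities below are illustrative, and this sketch ignores ties.

```python
import numpy as np

def roc_area(prob, event):
    """ROC area via the rank-sum (Mann-Whitney) formulation.

    No tie handling -- a sketch, adequate for distinct probabilities.
    """
    prob = np.asarray(prob, dtype=float)
    event = np.asarray(event, dtype=bool)
    ranks = np.argsort(np.argsort(prob)) + 1   # ranks 1..n, ascending
    n1, n0 = event.sum(), (~event).sum()
    return (ranks[event].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

# Illustrative forecasts of the below-normal tercile vs. binary outcomes
prob  = [0.1, 0.4, 0.35, 0.8, 0.65, 0.2, 0.9, 0.5]
event = [0,   0,   1,    1,   1,    0,   1,   0  ]
print(roc_area(prob, event))  # 0.875
```

An area of 0.5 means no discrimination, 1.0 perfect discrimination; computing it separately for the below- and above-normal categories exposes the asymmetric skill mentioned above.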

27
RELIABILITY
28
RELIABILITY
29
Reliability
  • Pros
  • Treats probabilistic forecasts
  • Relatively easy to interpret
  • Provides the most relevant information on the
    usability of forecast information over time
  • Cons
  • Difficult to provide for individual grid
    points, especially for short time samples
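A minimal reliability-diagram tabulation, assuming binary outcomes and scalar forecast probabilities (synthetic, well-calibrated data for illustration): in each probability bin, the observed relative frequency should track the mean forecast probability.

```python
import numpy as np

rng = np.random.default_rng(3)
prob = rng.uniform(size=2000)           # forecast probabilities
obs = rng.uniform(size=2000) < prob     # outcomes consistent with prob

edges = np.linspace(0.0, 1.0, 6)        # five equal-width probability bins
idx = np.digitize(prob, edges[1:-1])    # bin index for each forecast
for k in range(5):
    sel = idx == k
    print(f"bin {k}: mean prob {prob[sel].mean():.2f}, "
          f"obs freq {obs[sel].mean():.2f}")
```

The con above shows up directly here: with short samples, several hundred forecasts per bin are needed before the observed frequencies stabilise, which is why the slide recommends regional accumulation rather than per-grid-point diagrams.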

30
Temperature Trends over North America:
Area Covered by Above-Normal
31
Temperature Trends over North America:
Area Covered by Above-Normal
32
Observed Precipitation over North America,
1998-2001
(JJA and DJF panels: anomalies relative to
1981-1997; percent difference relative to
1981-1997; frequency, in years out of 4, of
precipitation in the below-normal (BN) category)
33
Frequency of Below-Normal Precipitation: JJA
1998-2001 (Observations)
34
Frequency of Below-Normal Precipitation: DJF
1998-2001 (Observations)
35
Summary
  • What's an appropriate template?
  • - Skill metrics should be flexible (i.e.
    user-defined events, categories, thresholds)
  • - Probabilistic forecasts must be treated
    probabilistically!
  • How are we doing?
  • - Could be better. Encouraging performance
    estimates by some measures, but inadequate
    performance on important aspects of climate
    variability.
  • - Are we missing elements necessary for
    seasonal prediction?
  • Baseline?