Transcript and Presenter's Notes

Title: Role of CMC and Regions in Environmental Prediction.


1
A NAEFS Status Report from CMC (Fall 2005)
Yves Pelletier, Operations Branch, Canadian Meteorological Center
Sept 19, 2005
2
Some CMC-NCEP comments on the development of NAEFS
3
Brief review of NAEFS accomplishments (from CMC
perspective)
  • SHARED VARIABLES
  • Winter 2004: common set of variables/levels agreed (see Appendix 5)
  • April 2004: 3D humidity variable added to the Appendix 5 list
  • August 2004: additional variables added to the CMC dataset
  • CAPE, Tmin, Tmax, QPF types (rain, snow, freezing rain and ice pellets)
  • September 2004:
  • CMC GRIB dataset complete as per the Appendix 5 variables (except wave model)
  • NCEP freeze limits further changes to the datasets
  • EPS PRODUCTION AT CMC
  • Start of CMC Ensemble Kalman Filter (EnKF) parallel run
  • CMC data available sooner: from 07Z down to 5:45Z daily
  • Fall 2005: start of CMC EPS parallel run with improved members and more data
  • MEETINGS
  • Nov 2004: NAEFS IOC workshop
  • TELECOM
  • Sept 2005: NCEP pushes its data to CMC to accelerate CMC retrieval of NCEP data

4
Some lessons learned from NAEFS: Inter-Center Coordination
  • Coordinating several centers requires long-term thinking and planning

5
Some lessons learned from NAEFS: Defining the list of shared variables and formats
  • Any change to the shared system requires extensive planning and coordination
  • Get the list of shared variables right as early as possible; changing it later requires effort
  • Ensure a common data representation (GRIB-1, GRIB-2)
  • Consider post-processing data needs when finalizing the shared variable list

6
Some lessons learned from NAEFS: Planning the post-processing of variables and products
  • Post-processing of variables requires coordination
  • Will there be uniformity of post-processed shared variables and products?
  • Will post-processed shared variables and products use the same code for consistency?
  • Initial set: NCEP and CMC to use their own internal codes and share data as it is produced by their respective centers
  • Can/should centers work in a programming environment that can be easily shared?
  • Will post-processing tasks be done by code shared by all participants, or will each center use its own proprietary tools and code?

7
Some lessons learned from NAEFS: Datasets are large
  • Datasets are large and getting larger
  • CMC's next update to the EPS will deliver:
  • 3.4 Gb/run x 2 runs/day = 6.8 Gb/day
  • Variables at 6-hr intervals from 000-360 hr
  • Each CMC file contains:
  • OPS: all variables / 2 (12-hr) timesteps / 1 member per file
  • PAR: all variables / 1 (6-hr) timestep / 1 member per file
  • This type of packaging facilitates post-processing
  • Presently in GRIB-1 format

8
Some lessons learned from NAEFS: GRIB-1 to GRIB-2 transition plan
  • Use of locally defined tables in GRIB-1 became necessary (a small sketch of such a table follows this list)
  • Locally defined tables require coordination and are subject to future changes
  • A GRIB-1 to GRIB-2 transition plan is being developed
  • Target dates: early 2006
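To make the point about locally defined tables concrete, here is a minimal sketch of how such a GRIB-1 local parameter table could be represented in a processing script. The table number and parameter numbers below are hypothetical placeholders, not the actual CMC or NCEP assignments.

    # Minimal sketch of a locally defined GRIB-1 parameter table as a lookup
    # structure. The parameter numbers are hypothetical placeholders, not the
    # actual CMC or NCEP local-table assignments.
    LOCAL_TABLE_VERSION = 133  # hypothetical local table number

    LOCAL_PARAMETERS = {
        # local parameter number: (abbreviation, description, units)
        192: ("ARAIN", "Accumulated precipitation falling as rain", "kg m-2"),
        193: ("ASNOW", "Accumulated precipitation falling as snow", "kg m-2"),
        194: ("AFRZR", "Accumulated precipitation falling as freezing rain", "kg m-2"),
        195: ("AICEP", "Accumulated precipitation falling as ice pellets", "kg m-2"),
    }

    def describe(parameter_number: int) -> str:
        """Return a human-readable label for a locally defined parameter."""
        abbrev, desc, units = LOCAL_PARAMETERS[parameter_number]
        return f"{abbrev}: {desc} [{units}] (local table {LOCAL_TABLE_VERSION})"

    print(describe(194))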

9
CMC's 23 shared variables for NAEFS (NAEFS Appendix 5)
(5 (3D) + 10 (2D) + 7 (2D post-processed) + 1 (missing) = 23 variables)

  3D model output variables (5 variables, each on 7 pressure levels)
  GRIB    CMC    Level(s)
  HGT     GZ     200, 250, 500, 700, 850, 925, 1000 hPa
  TMP     TT     200, 250, 500, 700, 850, 925, 1000 hPa
  UGRD    UU     200, 250, 500, 700, 850, 925, 1000 hPa
  VGRD    VV     200, 250, 500, 700, 850, 925, 1000 hPa
  DEPR    ES     200, 250, 500, 700, 850, 925, 1000 hPa

  2D surface-based model output variables (10 variables)
  GRIB    CMC    Level(s)
  TCDC    NT     SFC (0)
  PWAT    IH     SFC (0)
  UGRD    UU     10 m (12000)
  VGRD    VV     10 m (12000)
  TMP     TT     2 m (12000)
  DEPR    ES     2 m (12000)
  HGT     GZ     Eta=1 (12000)
  PRES    P0     SFC (0)
  PRMSL   PN     MSL (0)
  APCP    PR     SFC (0)   Cumulative total precipitation
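A minimal sketch of how the Appendix 5 list above might be held in a post-processing script, using the GRIB and CMC names from the slide; the data structure and the field-count check are illustrative assumptions, not CMC or NCEP code.

    # Sketch: the shared-variable list above as lookup tables that a
    # post-processing script could iterate over. Names follow the slide;
    # the structure itself is an illustrative assumption.
    PRESSURE_LEVELS_HPA = [200, 250, 500, 700, 850, 925, 1000]

    # 3D model output variables: GRIB name -> CMC name (one field per pressure level)
    VARS_3D = {"HGT": "GZ", "TMP": "TT", "UGRD": "UU", "VGRD": "VV", "DEPR": "ES"}

    # 2D surface-based model output variables: GRIB name -> (CMC name, level label)
    VARS_2D = {
        "TCDC":  ("NT", "SFC"),
        "PWAT":  ("IH", "SFC"),
        "UGRD":  ("UU", "10 m"),
        "VGRD":  ("VV", "10 m"),
        "TMP":   ("TT", "2 m"),
        "DEPR":  ("ES", "2 m"),
        "HGT":   ("GZ", "Eta=1"),
        "PRES":  ("P0", "SFC"),
        "PRMSL": ("PN", "MSL"),
        "APCP":  ("PR", "SFC"),  # cumulative total precipitation
    }

    def raw_fields_per_member_per_valid_time() -> int:
        """Count of raw (non post-processed) 2D and 3D fields."""
        return len(VARS_3D) * len(PRESSURE_LEVELS_HPA) + len(VARS_2D)

    print(raw_fields_per_member_per_valid_time())  # 5*7 + 10 = 45 fields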

10
CMC EPS GRIB-1 format files for NCEP (comparing CMC's Ops vs. Par runs)

               Operational                        Parallel (fall 2005)
  Ensemble     1 control, 8 SEF, 8 GEM            1 control, 8 SEF, 8 GEM
  Grid         1.2° x 1.2° (300 x 151 lat-lon)    1.2° x 1.2° (300 x 151 lat-lon)
  Domain       Global                             Global
  Format       WMO GRIB-1                         WMO GRIB-1
  Cycles       Presently 00Z cycle                00Z and 12Z cycles by Jan 2006
  Hours        000-240 hrs @ 12-hr intervals      000-360 hrs @ 6-hr intervals
  Files        187 (17 mbrs x 11 files)           1037 (17 mbrs x 61 files)
  1 member     Approx. 70 Mb/mbr/run              205.4 Mb/mbr/run
  17 members   Approx. 1.2 Gb/run/day             Approx. 3.41 Gb/run (6.82 Gb/day)
  Location     ftp.cmc.ec.gc.ca/pub/cmc/ncep      (same as Ops)

  • Content of CMC GRIB-1 files: all variables / 2 (12-hr) timesteps / 1 mbr (Ops) vs. all variables / 1 (6-hr) timestep / 1 mbr (Par)
  • (similar to NCEP's standard pressure format)
  • Offers efficient data packaging for post-processing
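A quick arithmetic sketch reproducing the file counts and data volumes quoted in the comparison above, using only the figures from the slide.

    # Arithmetic check of the file counts and data volumes quoted above,
    # using the per-member figures from the slide.
    def run_totals(members, files_per_member, mb_per_member, runs_per_day):
        files_per_run = members * files_per_member
        gb_per_run = members * mb_per_member / 1024.0
        return files_per_run, gb_per_run, gb_per_run * runs_per_day

    # Operational: 17 members x 11 files, ~70 Mb per member, one 00Z run per day
    print(run_totals(17, 11, 70.0, 1))    # -> (187, ~1.16 Gb/run, ~1.16 Gb/day)

    # Parallel (fall 2005): 17 members x 61 files, 205.4 Mb per member, 00Z and 12Z runs
    print(run_totals(17, 61, 205.4, 2))   # -> (1037, ~3.41 Gb/run, ~6.82 Gb/day)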

11
Overview of NAEFS Telecom issues
  • Sample retrieval times
  • NCEP pulls CMC ensemble data via the internet from CMC's FTP site: very good access times for NCEP (on the order of 30 minutes for 1.2 Gb/run); a minimal sketch of this kind of pull follows the list
  • Before Oct 2005:
  • CMC pulled NCEP ensemble data via the internet from the NWS FTP site
  • Irregular and long access times due to server load problems at NWS
  • (on the order of 4 to 6 hours for 1.8 Gb)
  • Oct 2005:
  • NCEP pushes its GRIB data onto a CMC server (1-hour transfer time)
  • CMC has access to NCEP GRIB data in a more timely fashion
  • Longer term:
  • Use of the internet could be acceptable, but reliability and timeliness will need to be high in full operational mode
  • Increasing data volumes will require more stable access to bandwidth
  • Need to consider a dedicated telecom link or an upgrade to the current GTS link (T1)
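As referenced above, here is a minimal sketch of this kind of internet pull against the CMC FTP location given on the previous slide. Anonymous login, the exact directory layout, and the file-name suffix are assumptions for illustration; this is not the operational transfer script.

    # Minimal sketch of pulling CMC ensemble GRIB-1 files over the internet.
    # Anonymous login, the directory layout and the ".grib" suffix filter are
    # illustrative assumptions.
    import os
    from ftplib import FTP

    def pull_cmc_ensemble(local_dir="cmc_eps"):
        os.makedirs(local_dir, exist_ok=True)
        ftp = FTP("ftp.cmc.ec.gc.ca")          # FTP host named on the previous slide
        ftp.login()                            # anonymous login assumed
        ftp.cwd("/pub/cmc/ncep")
        for name in ftp.nlst():
            if not name.endswith(".grib"):     # hypothetical file suffix
                continue
            with open(os.path.join(local_dir, name), "wb") as fh:
                ftp.retrbinary(f"RETR {name}", fh.write)
        ftp.quit()

    if __name__ == "__main__":
        pull_cmc_ensemble()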

12
Some lessons learned from NAEFS: Telecom issues
  • Data retrieval rates are not consistent between centers
  • Clearly identify data-exchange timeliness and delivery targets
  • Plans to achieve targets take time to implement
  • Some solutions are expensive (e.g., dedicated lines)
  • Temporary work-around solutions can help (e.g., NCEP pushing its data onto a CMC server)
  • Telecom issues can take a lot of planning and a long time to fully resolve

13
Some lessons learned from NAEFS: Operational considerations
  • Implementing expanded EPS features in parallel while supporting a center's daily operational run
  • Coordination of future improvements (scientific and operational) with other participants

14
Some lessons learned from NAEFS: Online/Archival considerations
  • How long should data be kept online?
  • Do centers archive each other's data?
  • What retention time is expected in the archive? (5 years?)
  • Do we archive only Appendix 5 variables?
  • Do we archive raw model outputs (all eta-level data?) so that post-processed shared variables (e.g., CAPE) can be re-calculated in the future?

15
Brief overview of future milestones for NAEFS
  • Reasonable data transfer mechanisms and
    timeliness now in place
  • Basic production mechanism in place
  • CMC and NCEP can exchange ensemble GRIB data
  • Create first generation of post-processed products
  • Feb 2006 or sooner:
  • NCEP re-packaging its GRIB datasets to put all variables / 1 (6-hr) timestep / 1 member per file
  • Science and data content coordination
  • CMC-NCEP converge on NAEFS science content issues
    to ensure consistency in grids, production of
    variables and post-processed products
  • Telecom
  • Ongoing evaluation of telecom issues to make data
    transfer faster and more operationally reliable
    (dedicated line?)
  • Target for actual operational exchanges and
    production of products?

16
More detailed outlook of NAEFS Telecom
Issues (CMC-NCEP)
  • CMC EPS data for NCEP (short-term issues)
  • CMC evaluates its ensemble parallel run, Sept-Dec 2005
  • Testing of updated packaging of CMC data (all parameters / 6-hr / 1 mbr per file, 000-360 hr)
  • Testing of transmitting larger datasets (from 1.2 Gb/run to 3.4 Gb/run)
  • CMC implements the ensemble parallel run in operations (Jan 2006)
  • (delivers 0-360 hr forecasts, data @ 6-hr intervals, 00Z and 12Z runs, more data)
  • NCEP data for CMC (short-term issues)
  • CMC receives pushed NCEP "enspost"-style data operationally (Sept 2005)
  • NCEP working towards a common file packaging format for optimum post-processing
  • NCEP pushes its new common file packaging format to CMC (Fall 2005, Winter 2006)
  • CMC works on post-processing of the new NCEP data packaging (Fall 2005, Winter 2006)
  • Longer-term telecom issues:
  • Telecommunications solution to ensure a high level of timeliness and reliability
  • Plan for transition to GRIB-2 (2006?)
  • Other participants added to the project

17
The following slides provide additional background information.
18
Update of the CMC Environmental Prediction System
19
  • CMC EPS started with 8 members (SEF T95) in
    quasi-operational mode in March 1996.
  • became operational in February 1998.
  • 8 new models (GEM) were added in August 1999.
  • products made available on an external Web page in October 1999.
  • increased horizontal resolution to T149 in June
    2001.

Current operational configuration
  • 16 members, 10-day forecasts done once a day (00Z run)
  • perturbed analyses obtained from perturbed assimilation cycles (Ensemble Kalman Filter, EnKF)
  • multi-model approach: SEF T150 and GEM 1.2º (130 km)
  • different model options used for both models

Fall 2005 EPS Parallel run
  • Surface info (ISBA), member configuration, and production to 360 hrs now running in parallel
  • New system to become operational, possibly Jan 2006

20
Canadian EPS set-up
[Flowchart: Canadian EPS set-up. Data assimilation cycle: observations plus random numbers give perturbed observations; data assimilation and a 6-h integration with models i = 1 to 8 give perturbed trial fields and perturbed analyses ai, i = 1 to 8; random numbers; doubling of analyses, i = 1 to 8. Forecast cycle: perturbed surface fields (Z0, SST, AL); analyses ai split over i = 1 to 4 and i = 5 to 8; medium-range 10-day integrations with 16 members, models i (SEF), i = 1 to 8, at T149 (about 150 km) and models j (GEM), j = 1 to 8, at 1.2° (about 135 km); each model i and j has its own physics parameterization.]
21
ENSEMBLE (PAR) SET-UP (including SFC info)
[Flowchart: observations; 4D-Var data assimilation; trials; 6-hour integration of the GEM model (FC); analysis; 6-hour integration of the GEM model (ISBA); trials (ISBA); dynamic fields; pseudo-analysis; new surface fields; surface fields from the ISBA scheme.]
22
Combination of model perturbations (OPS, Sept 2005)

  SEF (T149)
  Member    Add ops analysis   Convection/Radiation version   GWD      Orography       (?)     Number of levels   Time level
  1         yes                Kuo/Garand                     Strong   High altitude   0.3     23                 3
  2         no                 Manabe/Sasamori                Strong   Low altitude    0.3     41                 3
  3         no                 Kuo/Garand                     Weak     Low altitude    Mean    23                 3
  4         yes                Manabe/Sasamori                Weak     High altitude   Mean    41                 3
  5         yes                Manabe/Sasamori                Strong   Low altitude    Mean    23                 2
  6         no                 Kuo/Garand                     Strong   High altitude   Mean    41                 2
  7         no                 Manabe/Sasamori                Weak     High altitude   0.3     23                 2
  8         yes                Kuo/Garand                     Weak     Low altitude    0.3     41                 2
  control   mean               Kuo/Garand                     Mean     Low altitude    0.15    41                 3

  GEM (1.2°)
  Member    Add ops analysis   Deep convection   Shallow convection   Soil moisture   Sponge          Number of levels   Coriolis
  9         no                 Kuosym            new                  Less            20 global       28                 Implicit
  10        yes                RAS               old                  Less            20 equatorial   28                 Implicit
  11        yes                RAS               old                  Less            20 global       28                 Implicit
  12        no                 Kuosym            old                  More            20 global       28                 Implicit
  13        no                 Kuosym            new                  More            20 global       28                 Implicit
  14        yes                Kuosym            new                  Less            20 global       28                 Implicit
  15        yes                Kuosym            old                  Less            20 global       28                 Implicit
  16        no                 OldKuo            new                  More            20 global       28                 Implicit
23
Combination of model perturbations (PAR, Sept 2005)

  SEF (T149)
  Member    GWD taufac   Convection scheme (deep)   Convection scheme (shallow)   Surface scheme   Number of levels   Time level
  Control   8.0e-6       Kuo                        conres                        Fcrest           27                 3
  1         1.2e-5       Kuo                        conres                        ISBA             27                 3
  2         1.2e-5       Ras                        turwet                        Fcrest           27                 3
  3         4.0e-6       Kuo                        conres                        Fcrest           27                 3
  4         4.0e-6       Ras                        turwet                        ISBA             27                 3
  5         1.2e-5       Ras                        turwet                        Fcrest           27                 2
  6         1.2e-5       Kuo                        conres                        ISBA             27                 2
  7         4.0e-6       Ras                        turwet                        ISBA             27                 2
  8         4.0e-6       Kuo                        conres                        Fcrest           27                 2

  GEM (1.2°)
  Member    GWD taufac   Convection scheme (deep)   Convection scheme (shallow)   Surface scheme   Number of levels   Time level
  9         8.0e-6       Kuosym                     ktrsnt                        Fcrest           28                 2
  10        8.0e-6       Ras                        conres                        ISBA             28                 2
  11        8.0e-6       Ras                        conres                        Fcrest           28                 2
  12        8.0e-6       Kuosym                     ktrsnt                        ISBA             28                 2
  13        8.0e-6       Kuostd                     ktrsnt                        Fcrest           28                 2
  14        8.0e-6       Kuostd                     ktrsnt                        ISBA             28                 2
  15        8.0e-6       Kuosym                     conres                        ISBA             28                 2
  16        8.0e-6       Kuo                        conres                        Fcrest           28                 2
24
Summary of EPS developments at CMC
  • Fall 2005 parallel run:
  • forecasts extending to 15 days
  • increase robustness of schemes
  • reconfigure certain members
  • validate post-processing routines
  • January 2006 (or sooner):
  • Parallel run becomes operational
  • Daily 00Z and 12Z forecast runs out to 360 hrs
  • Ongoing:
  • strive to use state-of-the-art parameterizations
  • eventually add stochastic physics
  • develop week-two weather products
  • replace SEF members with GEM
  • increase the number of members to 20
  • unification of EPS members

25
NAEFS: Details of data exchange between CMC and NCEP
26
Past NAEFS milestones (CMC perspective)
  • Winter 2004: confirmed common set of variables/levels (see Appendix 5)
  • April 2004: addition of the 3D humidity variable to the Appendix 5 list
  • August 2004: addition of new variables to the CMC dataset
  • CAPE, Tmin, Tmax
  • quantitative precipitation types (rain, snow, freezing rain and ice pellets)
  • September 2004:
  • CMC GRIB dataset is complete as per the Appendix 5 variables (except wave model)
  • final testing and validation of CMC GRIB encoding with NCEP (before their freeze)
  • October 2004:
  • Start of full Ensemble Kalman Filter (EnKF) parallel run
  • Improvements to ensemble production runs: products available more than one hour earlier
  • Before: around 07Z; now: around 5:45Z
  • Sept 2005: NCEP pushes its dataset to a CMC server to accelerate CMC retrieval of NCEP data
  • Fall 2005: start of CMC EPS parallel run
  • Improved SEF member configuration
  • Addition of high-quality surface information in the assimilation cycle
  • All Appendix 5 variables (except WAM data) included

27
Some lessons learned to date from NAEFS
experience
  • Any change to the system requires extensive planning and coordination
  • Try to get the list of shared variables correct as early as possible
  • Making changes to the content of operational files requires parallel runs, often followed by coordination between centers
  • Strive to ensure post-processing data needs will be met by the variable list
  • Effort is required between centers to encode parameters when standard GRIB-1 tables are not designed for those variables, e.g., handling precipitation-type accumulation with GRIB-1 (GRIB-2 may help with this)
  • Transitioning from GRIB-1 to GRIB-2 will require development and coordination
  • Post-processing of variables requires extensive coordination
  • Will there be uniformity of post-processed products?
  • Will post-processed products use the same code for inter-center consistency?
  • Can/should centers easily share post-processing code?
  • Will post-processing tasks be done by code shared by all participants, or will each center use its own proprietary tools and code?
  • Telecom issues take a long time to resolve
  • Datasets are very large and getting larger (CMC now at 3.4 Gb/run x 2 runs/day)
  • Data retrieval times are not consistent between centers
  • Dedicated lines could help but are very costly
  • Very clear mid- and long-term plans are important to help plan activities

28
CMC CAPE calculation
  • CAPE is calculated in post-production mode
  • CAPE is calculated only for gridpoints that have a lifted index < 3
  • Gridpoints with a value of -1 indicate that no CAPE value has been calculated for that gridpoint (a masking sketch follows this list)
  • Some 0-h members (001, 004, 005, 008, 010, 011, 014 and 015) may not be at equilibrium at initialization, so their CAPE values may be unreliable
  • A new approach to initialization will correct this shortly
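A minimal sketch of how a downstream user might respect the -1 convention above, treating those gridpoints as missing rather than as physical CAPE values; the array below is synthetic, for illustration only.

    # Sketch: mask the -1 sentinel so statistics ignore gridpoints where
    # CAPE was not computed. The array is synthetic.
    import numpy as np

    cape = np.array([[850.0, -1.0, 120.0],
                     [-1.0,  40.0, -1.0]])

    cape_masked = np.ma.masked_equal(cape, -1.0)

    print(cape_masked.mean())    # mean over gridpoints where CAPE was computed
    print(cape_masked.count())   # number of gridpoints with a CAPE value (3)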

29
CMC Tmin/Tmax calculation
  • Tmin and Tmax are calculated in post-processing mode
  • Max/min surface temperatures are presently calculated over a 12-hour period (a small sketch follows)
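A small sketch of deriving 12-hour max/min surface temperature fields from a time series of temperature fields, consistent with the 12-hour window described above; the synthetic array and its hourly stepping are illustrative assumptions.

    # Sketch: 12-hour Tmax/Tmin fields from a (time, lat, lon) series of
    # surface temperatures. The synthetic array assumes hourly steps 0-12 h.
    import numpy as np

    t_sfc = np.random.default_rng(0).normal(15.0, 5.0, size=(13, 151, 300))

    def tmax_tmin_over_window(temps, start, end):
        """Max and min over the time steps falling in [start, end]."""
        window = temps[start:end + 1]
        return window.max(axis=0), window.min(axis=0)

    tmax_12h, tmin_12h = tmax_tmin_over_window(t_sfc, 0, 12)
    print(tmax_12h.shape, tmin_12h.shape)   # (151, 300) fields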

30
CMC Precipitation Type Information
  • The CMC dataset offers quantitative precipitation amounts for 4 types of precipitation: rain, snow, freezing rain and ice pellets
  • The algorithm used is the Bourgouin method
  • Bourgouin, Pierre, 2000: A Method to Determine Precipitation Types. Weather and Forecasting, Vol. 15, No. 5, pp. 583-592 (http://ams.allenpress.com/amsonline/?request=get-document&issn=1520-0434&volume=015&issue=05&page=0583)
  • For each 12-hour period, each member of the Canadian EPS forecasts the precipitation amounts expected to fall as rain, snow, freezing rain and ice pellets separately (i.e., four fields)
  • We could also include the dominant precipitation type per time interval, but this would have to be further discussed and agreed upon
  • NCEP only provides the categorical occurrence (0/1) of each precipitation type (a conversion sketch follows this list)
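A small sketch of reducing the four quantitative precipitation-type amounts to NCEP-style categorical occurrence flags, one way to compare the two datasets on a common footing; the 0.2 mm occurrence threshold is an assumption, not a NAEFS-agreed value.

    # Sketch: convert 12-h accumulated precipitation-type amounts (mm) into
    # categorical occurrence flags (0/1). The threshold is an assumption.
    import numpy as np

    def categorical_occurrence(amount_mm, threshold_mm=0.2):
        """Return 1 where the accumulated amount meets the threshold, else 0."""
        return (amount_mm >= threshold_mm).astype(np.int8)

    # Synthetic 12-h accumulations at three gridpoints, one array per type.
    rain     = np.array([0.0, 3.5, 0.1])
    snow     = np.array([1.2, 0.0, 0.0])
    frz_rain = np.array([0.0, 0.0, 0.4])
    pellets  = np.array([0.0, 0.0, 0.0])

    for name, field in [("rain", rain), ("snow", snow),
                        ("freezing rain", frz_rain), ("ice pellets", pellets)]:
        print(name, categorical_occurrence(field))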