1
Polar Optimized WRF for Arctic System Reanalysis of Arctic Meteorology over Recent Decades
David H. Bromwich (1,2) and Keith M. Hines (1)
1 - Polar Meteorology Group, Byrd Polar Research Center, The Ohio State University, Columbus, Ohio
2 - Atmospheric Sciences Program, Department of Geography, The Ohio State University, Columbus, Ohio
2
SEARCH (Study of Environmental Arctic Change) Project
http://psc.apl.washington.edu/search/
A. Motivated by observations of significant climate change in the Arctic within recent decades
B. Catch-word "Unaami" adopted ... meaning "tomorrow"
C. Need for broad, interdisciplinary, multi-scale study of complicated phenomena and change of the Arctic
D. A multi-agency, international effort
E. Arctic CHAMP (Community-wide Hydrological Analysis and Monitoring Program)
F. Strategy for SEARCH:
   (1) Long-term observations
   (2) Modeling
   (3) Process studies
   (4) Application
SEARCH Plan includes an Arctic System Reanalysis
3
Polar MM5
A. The Pennsylvania State University (PSU)-National Center for Atmospheric Research (NCAR) Fifth-Generation Mesoscale Model (MM5) has been adapted for polar applications:
   (1) Real-time forecasting for Antarctica (AMPS) in collaboration with NCAR
   (2) Forecasting for the Arctic Rivers (RIMS) Project
   (3) Contemporary climate studies (e.g., ENSO impacts on Antarctica)
   (4) Paleoclimate studies (e.g., Laurentide Ice Sheet climate at the LGM)
B. Polar modifications to MM5:
   (1) Revised cloud/radiation interaction
   (2) Modified explicit ice-phase microphysics
   (3) Optimized turbulence (boundary layer) parameterization
   (4) Implementation of a sea ice surface type
   (5) Improved treatment of heat transfer through snow/ice surfaces
   (6) Improved upper boundary treatment
C. Future modifications for Polar MM5:
   (1) Improved treatment of the horizontal pressure gradient force
   (2) Additional testing and development of upper boundary treatments
   (3) Improved snow/ice albedo parameterization
D. Future after Polar MM5?
4
Arctic RIMS Polar MM5 forecast for Barrow, Alaska: surface temperature during the snowmelt season. [Figure: time series of surface temperature vs. day.]
5
Arctic RIMS Polar MM5 forecast for Barrow, Alaska: low-level wind speed. [Figure: time series of wind speed vs. day.]
6
Arctic System Reanalysis (ASR)
A. Initiation of an Arctic reanalysis activity in SEARCH:
   (1) David Bromwich, Mark Serreze, Jeff Tilley, and John Walsh
   (2) Project supported by NOAA
B. New reanalysis of all available data:
   (1) Recent Arctic field programs
   (2) Data rescue from the former Soviet Union
C. Reanalyses combine short-term model forecasts with observations from the ground, rawinsondes, aircraft, satellites, etc. (see the analysis equation sketched after this list)
D. Reanalyses provide quantitative measures of fields that are difficult to observe directly
E. An optimum combination of observations and modeling
F. Fixed data assimilation system
G. ASR will take advantage of the lessons learned in earlier reanalyses (NCEP, NCEP-2, ERA-15, ERA-40, NARR):
   (1) temporal resolution: 6 hours
   (2) horizontal resolution: 80-210 km
   (3) there are still biases in the earlier reanalyses
   (4) difficulties in incorporating TOVS data over sea ice
H. Will include sea ice, ocean, land surface, and hydrology components
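To make item C concrete: each analysis x_a corrects a short-term forecast (the background) x_b with observations y. In the standard textbook linear-analysis form (the generic equation of data assimilation, not a statement of ASR's particular scheme):

\[ x_a = x_b + \mathbf{K}\,(y - H x_b), \qquad \mathbf{K} = \mathbf{B} H^{\mathsf{T}} \left( H \mathbf{B} H^{\mathsf{T}} + \mathbf{R} \right)^{-1} \]

Here H maps the model state to the observed quantities, and B and R are the background- and observation-error covariances. The "fixed data assimilation system" of item F means this machinery is held constant over the whole reanalysis period, so that apparent trends reflect the data and the climate rather than changes to the system.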
7
  • WRF will be the atmospheric model for ASR
  • http://wrf-model.org/
  • The Weather Research and Forecasting Model is being produced by multiple agencies and development groups.
  • The "next generation" model after MM5, Eta, etc.; it will provide an advanced mesoscale forecast and assimilation system.
  • WRF, a community model, will provide closer ties between research and applications.
  • Up to 1 km resolution
  • Portable, efficient, parallel-friendly
  • Currently in development; Version 1.3 is available to the public.

8
WRF Development Teams
9
Projected Timeline for WRF Project
10
Model Physics in High-Resolution NWP
Physics "No Man's Land"
11
International Polar Year - 2007
http://dels.nas.edu/prb/ipy/
A. Follows up on the earlier IPYs (1882/1883 and 1932/1933) and the IGY (1957/1958)
B. Scientific rationale, from http://ipy.gsfc.nasa.gov/about.shtml: "It is clear that a complex suite of significant, interrelated, atmospheric, oceanic and terrestrial changes has occurred in the polar regions in recent decades. These events are affecting every part of the polar environment and are having repercussions on society. Solar variations are also believed to influence climate through the formation of high altitude clouds, influencing the ionosphere of Earth, variations in solar output as recorded in the Holocene sediments and other effects. Polar contributions to and the effect of global climate change are still a matter of conjecture, and to a large extent so are the extraterrestrial contributions."
C. Some goals, from http://ipy.gsfc.nasa.gov/about.shtml:
   (1) Separation of the profound changes in the polar regions between anthropogenic effects and natural fluctuations
   (2) The environmental paleohistory/tectonics of the high latitudes
   (3) Reconstruction of the detailed history of polar ice sheets
   (4) Oceanic and terrestrial ecosystem studies of the structure and processes of the environment from sea floor to space and their relationships with climate change
   (5) Take advantage of the exceptional opportunities to integrate educational outreach into research projects by communicating the unique results
12
Plan for including WRF in ASR
A. Incorporate the polar modifications from MM5 into WRF
B. Identify Arctic strengths and weaknesses of previous reanalyses
C. Apply the lessons learned from previous reanalyses
D. Add non-atmospheric components
E. Compile additional data not incorporated into previous reanalyses
F. Test/improve data assimilation
13
3rd Annual WRF Users Workshop
Overall WRF Goals
Design Priorities:
  • 1-10 km horizontal grids
  • Portable and efficient on parallel computers
  • Advanced data assimilation and model physics
  • Well suited for a broad range of applications
  • Community model with direct path to operations
14
WRF Project Collaborators
  • Signatory Partners
  • NCAR Mesoscale and Microscale Meteorology
    Division
  • NOAA National Centers for Environmental
    Prediction
  • NOAA Forecast Systems Laboratory
  • Air Force Weather Agency
  • Federal Aviation Administration
  • Navy NRL Marine Meteorology Division
  • Additional Collaborators
  • OU Center for Analysis and Prediction of Storms (CAPS)
  • Department of Defense HPCMO
  • NOAA Geophysical Fluid Dynamics Laboratory
  • NASA GSFC Atmospheric Sciences Division
  • NOAA National Severe Storms Laboratory
  • EPA Atmospheric Modeling Division
  • University Community

15
Registered WRF Users (6/21/02)

WRF Principal Partners      75
   NCAR                     33
   NCEP                     16
   FSL                      13
   OU/CAPS                   4
   AFWA                      9
U.S. Universities          141
U.S. Government Labs        91
Private Sector              78
Foreign                    342
                          ----
Total                      727

WRF Web site: http://wrf-model.org
16
WRF Software
  • Goals
  • Community Model
  • Good performance
  • Portable across a range of architectures
  • Flexible, maintainable, understandable
  • Facilitate code reuse
  • Multiple dynamics/physics options
  • Run-time configurable
  • Nested
  • Aspects of Design
  • Single-source code
  • Fortran90 modules, dynamic memory, structures, recursion (see the sketch after this list)
  • Hierarchical software architecture
  • Multi-level parallelism
  • CASE Registry
  • Package-neutral APIs
  • I/O, data formats
  • Communication
  • IKJ Storage Order
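
As a rough illustration of the design points just listed (Fortran90 modules, dynamic memory, derived-type structures, IKJ storage order), here is a minimal sketch; all names are hypothetical stand-ins, not WRF's Registry-generated data structures.

! Minimal sketch of the Fortran90 style listed above: a module holding a
! derived type whose fields are dynamically allocated in IKJ storage order.
! Hypothetical names; not WRF's actual data structures.
module domain_state
  implicit none

  type grid_type
     integer :: nx, ny, nz        ! grid dimensions
     real, pointer :: u(:,:,:)    ! wind component, stored (i,k,j)
     real, pointer :: t(:,:,:)    ! temperature,    stored (i,k,j)
  end type grid_type

contains

  subroutine alloc_grid(grid, nx, ny, nz)
    type(grid_type), intent(inout) :: grid
    integer, intent(in) :: nx, ny, nz
    grid%nx = nx; grid%ny = ny; grid%nz = nz
    ! IKJ storage order: i fastest, then k, then j (see list above)
    allocate(grid%u(nx, nz, ny))
    allocate(grid%t(nx, nz, ny))
  end subroutine alloc_grid

end module domain_state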

17
Software Architecture
[Diagram: hierarchical software architecture. The package-independent side comprises the driver layer (driver, config inquiry, I/O API, distributed-memory communication), the mediation layer (solve, config module), and the model layer (WRF tile-callable subroutines). The package-dependent side holds the external packages: data formats and parallel I/O, message passing, and threads (e.g., OMP).]
  • Driver layer: I/O, communication, multi-nests, state data
  • Model routines: computational, tile-callable, thread-safe (see the sketch below)
  • Mediation layer: interface between model and driver
  • Interfaces to external packages
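
To illustrate "tile-callable, thread-safe": a model-layer routine receives both the memory (allocation) bounds and the tile (compute) bounds of its arrays, loops only over its tile, and keeps no global state, so the driver can safely run many tiles concurrently. A minimal sketch of that calling convention follows; the physics here is a toy relaxation, not a real WRF routine, and the names are illustrative.

! A tile-callable model-layer routine: it computes only over the tile
! bounds handed to it (its:ite, kts:kte, jts:jte) and touches no global
! state, so it is safe to call concurrently on different tiles.
subroutine relax_t(t, t_ref, tau, dt,            &
                   ims, ime, kms, kme, jms, jme, &  ! memory (allocation) bounds
                   its, ite, kts, kte, jts, jte)    ! tile (compute) bounds
  implicit none
  integer, intent(in) :: ims, ime, kms, kme, jms, jme
  integer, intent(in) :: its, ite, kts, kte, jts, jte
  real, intent(in)    :: tau, dt                    ! relaxation time, time step
  real, intent(in)    :: t_ref(ims:ime, kms:kme, jms:jme)
  real, intent(inout) :: t(ims:ime, kms:kme, jms:jme)
  integer :: i, k, j

  do j = jts, jte            ! IKJ order: i innermost, j outermost
     do k = kts, kte
        do i = its, ite
           t(i,k,j) = t(i,k,j) - dt / tau * (t(i,k,j) - t_ref(i,k,j))
        end do
     end do
  end do
end subroutine relax_t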

18
WRF Multi-Layer Domain Decomposition
[Diagram: a logical domain decomposed into patches, one per distributed-memory node; each patch is divided into multiple tiles, with inter-processor communication across patch boundaries.]
  • Single version of code for efficient execution on:
  • Distributed-memory machines
  • Shared-memory machines
  • Clusters of SMPs
  • Vector machines and microprocessors
  • Model domains are decomposed for parallelism on two levels:
  • Patch: section of the model domain allocated to a distributed-memory node
  • Tile: section of a patch allocated to a shared-memory processor within a node; this is also the scope of a model-layer subroutine
  • Distributed-memory parallelism is over patches; shared-memory parallelism is over tiles within patches (see the driver-side sketch below)

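On the driver side, the two-level decomposition might look schematically like this: MPI owns the patch (outside this routine), while an OpenMP loop hands each tile of the local patch to the tile-callable routine relax_t from the previous sketch. Illustrative only; WRF's actual driver loops are generated code.

! Schematic of shared-memory parallelism over tiles within one rank's
! patch; distributed-memory parallelism (over patches) lives outside.
subroutine solve_tiles(t, t_ref, tau, dt, num_tiles,       &
                       i_start, i_end, j_start, j_end,     &
                       ims, ime, kms, kme, jms, jme, kts, kte)
  implicit none
  integer, intent(in) :: num_tiles
  integer, intent(in) :: i_start(num_tiles), i_end(num_tiles)
  integer, intent(in) :: j_start(num_tiles), j_end(num_tiles)
  integer, intent(in) :: ims, ime, kms, kme, jms, jme, kts, kte
  real, intent(in)    :: tau, dt
  real, intent(in)    :: t_ref(ims:ime, kms:kme, jms:jme)
  real, intent(inout) :: t(ims:ime, kms:kme, jms:jme)
  integer :: ij

  !$omp parallel do private(ij)
  do ij = 1, num_tiles       ! threads pick up disjoint tiles concurrently
     call relax_t(t, t_ref, tau, dt,                &
                  ims, ime, kms, kme, jms, jme,     &
                  i_start(ij), i_end(ij), kts, kte, &
                  j_start(ij), j_end(ij))
  end do
  !$omp end parallel do
end subroutine solve_tiles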
19
I/O Architecture
  • Requirements of the I/O infrastructure:
  • Efficiency: the key concern for operations
  • Flexibility: the key concern in research
  • Both types of user institution are already heavily invested in I/O infrastructure:
  • Operations: GRIB, BUFR
  • Research: NetCDF, HDF
  • Portable I/O, adaptable to a range of uses and installations without affecting WRF and other programs that use the I/O infrastructure

20
I/O Architecture
  • WRF I/O API:
  • Package-independent interface to NetCDF, fast binary, and HDF (planned); a dispatch sketch follows this list
  • Random access of fields by timestamp/name
  • Full transposition to arbitrary memory order
  • Built-in support for read/write of parallel file systems (planned)
  • Data-set centric, not file-centric (planned); grid computing
  • Additional WRF model functionality:
  • Collection/distribution of decomposed data to serial datasets
  • Fast, asynchronous, quilt-server I/O from the NCEP Eta model
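
A sketch of what "package-independent interface" can mean in practice: callers request a field by variable name and timestamp through one entry point, and the format selected at run time dispatches to a backend. All names here are invented for illustration; this is not WRF's real I/O API.

! Hypothetical sketch of a package-neutral I/O layer: one entry point,
! random access by variable name and timestamp, backend chosen at run
! time. Invented names; NOT the actual WRF I/O API.
module io_api
  implicit none
  integer, parameter :: IO_NETCDF = 1, IO_BINARY = 2

contains

  subroutine read_field(io_form, filename, varname, timestamp, buf, ierr)
    integer, intent(in)      :: io_form
    character(*), intent(in) :: filename, varname, timestamp
    real, intent(out)        :: buf(:,:,:)
    integer, intent(out)     :: ierr
    select case (io_form)
    case (IO_NETCDF)
       call read_field_netcdf(filename, varname, timestamp, buf, ierr)
    case (IO_BINARY)
       call read_field_binary(filename, varname, timestamp, buf, ierr)
    case default
       ierr = -1              ! unknown format
    end select
  end subroutine read_field

  subroutine read_field_netcdf(filename, varname, timestamp, buf, ierr)
    character(*), intent(in) :: filename, varname, timestamp
    real, intent(out)        :: buf(:,:,:)
    integer, intent(out)     :: ierr
    ! Stub: a real backend would locate (varname, timestamp) in the
    ! NetCDF file and transpose into the caller's memory order.
    buf = 0.0
    ierr = 0
  end subroutine read_field_netcdf

  subroutine read_field_binary(filename, varname, timestamp, buf, ierr)
    character(*), intent(in) :: filename, varname, timestamp
    real, intent(out)        :: buf(:,:,:)
    integer, intent(out)     :: ierr
    ! Stub: a real backend would seek the record by name/timestamp.
    buf = 0.0
    ierr = 0
  end subroutine read_field_binary

end module io_api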

21
Model Performance
  • Efficiency with respect to other models:
  • WRF is about 2x the cost of the NCEP Eta model (mid-2001)
  • Complexity: WRF performs about 1.6 times more operations for a given period of integration
  • Code efficiency: WRF runs at about 0.78 the efficiency of Eta
  • (The figures are consistent: 1.6 / 0.78 ≈ 2.05, i.e., roughly the 2x overall cost quoted above.)
  • Scientific or forecast efficiency?

22
Mass Coordinate WRF Dynamical Core
WRF dynamical core development efforts (Working Group 1): four cores have been or are being developed:
- Eulerian geometric-height coordinate (z) core (in framework, parallel, tested in idealized and NWP applications)
- Eulerian mass coordinate (hydrostatic pressure) core (in framework, parallel, tested in idealized and NWP applications)
- Semi-Lagrangian hybrid coordinate core (under development; see Jim Purser's talk)
- Eulerian hybrid coordinate core (under development)
23
Mass and Height Cores - Comparison
  • Both cores solve the unapproximated, fully compressible, nonhydrostatic Euler equations.
  • Both cores use the same numerical integration methods:
  • 3rd-order Runge-Kutta time integration (sketched after this list)
  • 2nd-5th order advection operators
  • C-grid
  • Split-explicit acoustic/gravity wave integration
  • The two cores produce nearly identical (and equally accurate) solutions in the idealized test cases and for several months of daily 48-h forecasts (using the same physics). Both are equally efficient.
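
The 3rd-order Runge-Kutta integration named above, in the Wicker-Skamarock form commonly described for WRF (treat the exact variant as an assumption to check against the WRF documentation), advances the state \(\Phi\) in three stages, each re-evaluating the slow tendencies \(R(\Phi)\):

\[
\begin{aligned}
\Phi^{*} &= \Phi^{t} + \tfrac{\Delta t}{3}\, R(\Phi^{t}) \\
\Phi^{**} &= \Phi^{t} + \tfrac{\Delta t}{2}\, R(\Phi^{*}) \\
\Phi^{t+\Delta t} &= \Phi^{t} + \Delta t\, R(\Phi^{**})
\end{aligned}
\]

The split-explicit part means the fast acoustic/gravity-wave terms are integrated on shorter sub-steps within each stage, so the length of the RK3 step is set by the advective time scale rather than by the sound speed.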

24
Mass and Height Cores - Differences
Upper boundary condition:
- The height-coordinate core uses a rigid-lid (w = 0) condition or a radiation condition (w ≠ 0).
- The mass-coordinate core uses a constant or specified pressure; in both cases this upper surface is a material surface.
Consequences: when the atmosphere is heated, the pressure increases in the height-coordinate model, while the atmosphere expands in the mass-coordinate model; the latter is physically more realistic.
Caveat: a radiation condition is still needed to prevent reflection of vertically propagating gravity waves.
25
The Mass Coordinate WRF Core: It's Here!!!
There is little difference between the cores with respect to running them and using the results. We are supporting the mass coordinate core as the dynamics solver for community use.
Nesting and 3DVAR will appear for the mass-coordinate solver; these capabilities will not be developed for the height-coordinate solver.
Message: USE THE MASS COORDINATE MODEL