1
Peter Miller, Jason Holt¹ and Dave Storkey²
Validation of multiple ocean shelf models against
EO data using automated front detection
  • NCOF Science Workshop, Croyde Bay, 21-23 Oct. 2008

1. Proudman Oceanographic Laboratory
2. UK Meteorological Office
2
Model validation using fronts
  • Rationale
  • Models to validate
  • Validation method
  • Composite front maps
  • Local regional comparison
  • Model cloudiness
  • Visual and quantitative results
  • Applications and future work

3
Rationale for model validation using fronts
  • There are many different ocean models:
  • FOAM, ROMS, NEMO, OCCAM, POLCOMS, ...
  • Increasing usage of and reliance on ocean models:
  • Coupled ocean-atmosphere: met-ocean forecasts and
    climate predictions; coupled physics-ecosystem:
    pollution trajectories, water quality and algal
    predictions.
  • Realistic ecosystem modelling requires accurate
    physical forcing to control the supply of
    nutrients to the surface mixed layer.
  • Existing validation uses point comparisons
  • Average difference is of questionable value
  • Earth Observation (EO) surface fields give
    greater coverage but do not improve understanding
  • Need better validation methods to improve models

4
Operational NW European shelf domains
  • Atlantic Margin model (AMM), 12 km (32 levels)
  • Medium Resolution Continental Shelf (MRCS), 7 km
    (18 levels), including sediments and ecosystem
    (ERSEM).
  • Irish Sea model, 1 nm (18 levels)

[Maps: AMM (12 km) and MRCS (7 km) domains, to be
transitioned to the NEMO framework]
5
Model validation using fronts
  • Rationale
  • Models to validate
  • Validation method
  • Composite front maps
  • Local regional comparison
  • Model cloudiness
  • Visual and quantitative results
  • Applications and future work

6
Conventional image composites
  • Mean SST at each location during the week
  • Dynamic and transient features are blurred
  • Spurious features introduced

Miller, P.I. (2004). Multispectral front maps for
automatic detection of ocean colour features from
SeaWiFS. International Journal of Remote Sensing,
25(7-8), 1437-1442.
7
Composite front maps
  • Does not blur dynamic features.
  • Highlights persistent or strong gradient fronts.
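
To make the contrast with the conventional composite
on the previous slide concrete, here is a minimal
NumPy sketch. The function names and the simple
per-pixel frequency statistic are mine, not Miller's
exact formulation; the published method also weights
each detection (see the weighting factors slide).

```python
import numpy as np

def mean_composite(scenes):
    """Conventional composite: per-pixel mean SST over the period.
    A front that moves between scenes is smeared into a broad ramp."""
    stack = np.ma.stack(scenes)           # (time, y, x); clouds are masked
    return stack.mean(axis=0)

def front_frequency_composite(front_maps):
    """Composite front map, reduced to its simplest term: the fraction
    of cloud-free observations in which a front was detected at each
    pixel, so moving fronts stay sharp instead of being averaged away."""
    det = np.ma.stack(front_maps)         # boolean detections, clouds masked
    n_obs = det.count(axis=0)             # cloud-free looks per pixel
    hits = det.filled(False).sum(axis=0)
    return np.where(n_obs > 0, hits / np.maximum(n_obs, 1), np.nan)
```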

Miller, P.I. (2004). Multispectral front maps for
automatic detection of ocean colour features from
SeaWiFS. International Journal of Remote Sensing,
25(7-8), 1437-1442.
8
Model validation using fronts
  • Rationale
  • Models to validate
  • Validation method
  • Composite front maps
  • Local regional comparison
  • Model cloudiness
  • Visual and quantitative results
  • Applications and future work

9
Front detection on model SST
  • POLCOMS 3D hydrodynamic model
  • HRCS 2 km resolution
  • Horizontal: latitude-longitude Arakawa B-grid
  • Vertical: S-coordinates

10
Local regional comparison
  • Summarise by subsampling or filtering
  • Properties of gradient magnitude, persistence,
    direction, etc.
  • Compare by differencing maps or checking for
    matches
  • Robust method, can be automated.

Miller, P., J. Holt, and D. Storkey (in press).
Validation of multiple ocean shelf models against
EO data using automated front detection: initial
results. EuroGOOS Conference, Exeter.
11
Model cloudiness
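
This slide is figure-only in the transcript. A
minimal sketch of the likely operation, assuming
"model cloudiness" means imposing the EO cloud mask
on the model field so that both sides of the
comparison are built from identically sampled pixels
(the function name is hypothetical):

```python
import numpy as np

def impose_eo_clouds(model_sst, eo_sst):
    """Mask model SST wherever the co-located EO scene is cloudy, so the
    model and EO front composites are built from the same pixels."""
    eo = np.ma.masked_invalid(eo_sst)     # assume clouds stored as NaN
    return np.ma.masked_array(model_sst, mask=np.ma.getmaskarray(eo))
```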
12
Regional comparison
Remapping and resampling (e.g. 8 x 8 window)
[Maps: model thermal fronts | EO thermal fronts]
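
A minimal sketch of the resampling step, assuming the
two front maps are already remapped to a common grid;
names are hypothetical:

```python
import numpy as np

def block_front_counts(front_map, win=8):
    """Resample a binary front map onto win x win blocks, returning the
    number of front pixels per block (edge pixels that do not fill a
    whole block are dropped, a simplification)."""
    f = np.asarray(front_map, dtype=int)
    ny, nx = (f.shape[0] // win) * win, (f.shape[1] // win) * win
    blocks = f[:ny, :nx].reshape(ny // win, win, nx // win, win)
    return blocks.sum(axis=(1, 3))
```

A block is then treated as frontal when its count
reaches a minimum value, which is how I read the
"min = 4" style parameters quoted on later slides.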
13
Regional comparison
Validation measures, with EO as truth
Hits of EO fronts by model fronts
EO front min = 4, model front min = 4, window size =
24 x 24, Aug. 2001
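
Given blocks from the resampling sketch above, the
"hits of EO fronts by model fronts" measure can be
sketched as a simple contingency count with EO as
truth; the deck does not spell out its exact scores,
so this is an assumption:

```python
def hit_and_false_alarm_rates(model_frontal, eo_frontal):
    """Contingency scores with EO as truth (boolean block arrays):
    hit rate = fraction of EO-frontal blocks also flagged by the model;
    false alarm rate = fraction of EO-front-free blocks flagged anyway."""
    hits = (model_frontal & eo_frontal).sum()
    misses = (~model_frontal & eo_frontal).sum()
    false_alarms = (model_frontal & ~eo_frontal).sum()
    correct_neg = (~model_frontal & ~eo_frontal).sum()
    hit_rate = hits / max(hits + misses, 1)
    far = false_alarms / max(false_alarms + correct_neg, 1)
    return hit_rate, far
```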
14
Fronts explain biological errors
[Map: Chl-a model skill, low to high]
15
Model validation using fronts
  • Rationale
  • Models to validate
  • Validation method
  • Composite front maps
  • Local regional comparison
  • Model cloudiness
  • Visual and quantitative results
  • Applications and future work

16
HRCS model vs AVHRR SST fronts
[Panels, May 2001: HRCS model 2 km SST fronts |
cloud-masked AVHRR HRPT 1 km EO SST fronts]
17
[Panels, Aug. 2005: FOAM model 12 km SST fronts |
cloud-masked AVHRR Pathfinder EO SST fronts,
4 km → 12 km]
18
FOAM-NEMO fronts vs EO AVHRR
[Panels, Aug. 2007: FOAM-NEMO model 7 km SST fronts |
cloud-masked AVHRR HRPT 1 km EO SST fronts]
EO front min = 4, model front min = 2, window size =
4 x 4
19
ROC validation of HRCS 2km fronts
Varying model front minimum value
[ROC curve: model front minimum value from 1 (lax
threshold) to 20 (strict threshold)]
EO front min = 4, model min = 1..20, mean Jan-Aug.
2001
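
A sketch of how such a ROC curve could be traced,
reusing the hypothetical helpers from slides 12-13:
sweep the model front minimum from 1 (lax) to 20
(strict) and record one (false alarm rate, hit rate)
point per threshold.

```python
def roc_points(model_counts, eo_frontal, thresholds=range(1, 21)):
    """One ROC point per model front minimum: model_counts are the
    per-block front-pixel counts from the resampling sketch; raising
    the threshold trades hits for fewer false alarms."""
    pts = []
    for t in thresholds:
        hr, far = hit_and_false_alarm_rates(model_counts >= t, eo_frontal)
        pts.append((far, hr))
    return pts
```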
20
ROC comparison of model fronts
EO front min = 4, window size = 48 x 48 km, model
min = 1..20 (top to bottom)
21
Initial results: MRCS 7 km SST fronts
[Panels, Jul/Aug/Sep 2007: MRCS model 7 km SST fronts
| cloud-masked AVHRR HRPT 1 km EO SST fronts]
22
Initial results: MRCS 7 km Chl-a fronts
[Panels: MRCS model 7 km Chl fronts | cloud-masked
Aqua-MODIS 1 km EO Chl fronts]
23
Potential applications
  • Analyse and improve models
  • E.g. persistence of eddies at sea surface,
    boundary effects.
  • Assess improvement in ecosystem model.
  • Data assimilation method?
  • Compare alternative models or versions
  • E.g. the UK Met Office moving from FOAM to NEMO.

24
AlgaRisk: UK algal bloom risk
  • Provide satellite and model information to the
    Environment Agency (EA)
  • Help focus monitoring for bloom events
  • Enable EA to advise local authorities
  • Demonstrate potential to assist with EU
    directives

[Map: chlorophyll-a, 18 July 2006]
25
Further work
  • Further model comparisons
  • POLCOMS-MRCS vs. FOAM-NEMO, both at 7 km.
  • Optimise EO/model front detections for
    validation.
  • Detailed analysis over annual sequence
  • Indicate consistently good and bad regions.
  • Confirm genuine time-series changes, and
    interpret significant deviations of model from
    obs.
  • Front contours by simplifying clusters
  • Model location errors for particular fronts and
    overall.

26
Model validation using fronts
  • Rationale
  • Models to validate
  • Validation method
  • Composite front maps
  • Local regional comparison
  • Model cloudiness
  • Visual and quantitative results
  • Applications and future work

28
Front detection method
[Diagram: SST map with local sliding window]
Cayula, J.-F., and Cornillon, P. (1992). Edge
detection algorithm for SST images. Journal of
Atmospheric and Oceanic Technology, 9, 67-80.
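
The cited detector tests each local window for a
bimodal SST histogram. A much-simplified sketch of
that core test (the full algorithm's tuned criterion
value and cluster-cohesion checks are omitted; the
0.7 here is illustrative only):

```python
import numpy as np

def window_is_frontal(window_sst, criterion=0.7):
    """Simplified core of the Cayula-Cornillon histogram test: find the
    SST split that maximises between-population variance; accept the
    window as containing a front if that split explains a large enough
    share of the window's total variance."""
    vals = np.sort(window_sst[np.isfinite(window_sst)].ravel())
    if vals.size < 2 or vals[0] == vals[-1]:
        return False
    total_var = vals.var()
    best = 0.0
    for k in range(1, vals.size):
        a, b = vals[:k], vals[k:]
        # between-population variance for this candidate split
        between = a.size * b.size / vals.size**2 * (a.mean() - b.mean())**2
        best = max(best, between / total_var)
    return best >= criterion
```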
29
Weighting factors
  • Mean gradient
  • Persistence P(front)
  • Advection proximity
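
The slide lists the three factors without their
combination rule; one plausible reading, an
assumption rather than Miller's published formula,
is a per-detection product of normalised weights:

```python
def detection_weight(mean_gradient, persistence, proximity, grad_scale=1.0):
    """Hypothetical combination of the three factors above: strong,
    persistent fronts detected close (in advection terms) to the
    composite time contribute most. persistence and proximity are
    assumed already normalised to [0, 1]."""
    w_grad = min(mean_gradient / grad_scale, 1.0)
    return w_grad * persistence * proximity
```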

Miller, P.I. (in press). Composite front maps for
improved visibility of dynamic oceanic fronts on
cloudy AVHRR and SeaWiFS data. Journal of Marine
Systems.
30
Example thermal front maps
Eddies off NW Spain, 29-31 Mar. 1997
Miller, P.I. (in press). Composite front maps for
improved visibility of dynamic oceanic fronts on
cloudy AVHRR and SeaWiFS data. Journal of Marine
Systems.
31
Capabilities of PML RSG and NEODAAS
www.neodaas.ac.uk | info@neodaas.ac.uk
33
EO fronts without cloud?
AMSR-E passive microwave SST thermal fronts, 25 km
resolution, 01-31 Aug. 2001
35
FOAM-NEMO fronts vs EO AVHRR
36
ROC validation of FOAM 12km fronts
Varying model front minimum value
EO front min = 4, model min = 1..20, mean Jan-Dec.
2005
37
ROC validation of NEMO 7km fronts
Varying model front minimum value
EO front min = 4, model min = 1..20, mean Jul-Sep.
2007