Radar Calibration, Radar Snowfall Estimation, Automated Precipitation Detection, Amount and Typing
1
Radar Calibration, Radar Snowfall Estimation
Automated Precipitation Detection, Amount and Typing
  • Paul Joe, Brian Sheppard, Nick Kouwen
  • Isztar Zawadzki, Norman Donaldson

2
Outline
  • General Comments Wx Radar/Precip Measurements
  • Radar Calibration Workshop
  • Automated Snowfall Amounts/Typing

3
Conclusions
  • Radar for QPE not there yet
  • Can work well in certain situations
  • Long-range and large-area problems are being
    solved
  • But not demonstrated to the hydrology community

4
General Comments
  • Since 1948/52
  • Z related to Drop Size Distributions
  • Z ∝ n D⁶, R ∝ n D^4.5
  • Z = 200 R^1.6 (Marshall-Palmer)
  • valid for rain and snow
  • many more relationships since then (±3 dB)
  • Z related to R in the ideal situation
  • short range
  • Z related to snow water equivalent
  • DSD is only one of the factors
  • long range
  • Joss and Waldvogel 1990
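As a quick illustration of how such a Z-R relationship is applied, here is a minimal sketch (not from the slides) that inverts Z = 200 R^1.6 to recover a rain rate from measured reflectivity; the function name and example values are illustrative:

```python
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert Z = a * R**b for the rain rate R (mm/h).

    dbz is the measured reflectivity in dBZ; Z = 10**(dbz/10) in mm^6 m^-3.
    Defaults are the slide's Z = 200 R^1.6 (Marshall-Palmer) coefficients.
    """
    z = 10.0 ** (dbz / 10.0)        # dBZ -> linear reflectivity factor
    return (z / a) ** (1.0 / b)     # R = (Z/a)^(1/b)

# A +/-3 dB spread in Z (the slide's DSD uncertainty) gives a sizeable spread in R:
for dbz in (27.0, 30.0, 33.0):
    print(f"{dbz:.0f} dBZ -> {rain_rate_from_dbz(dbz):.2f} mm/h")
```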

5
Snow/Rainfall Estimation Factors
  • Space-time sampling
  • Ground Clutter/Anomalous Propagation
  • Vertical Profile Correction/Bright Band
  • Calibration
  • Attenuation
  • Partial Beamfilling
  • DSD Z-R/Rain Classification and type
  • convective vs stratiform
  • wet bulb technique (rain vs snow)
  • Drop Size Distribution
  • Z-R convective, Z-R stratiform, Z-A
  • Wind drift problem for snow (application
    dependent)

6
Ideal Situation, Close Range: Difference due to DSD
(Joss)
7
Snowfall
  • 3 seasons of very good comparisons
  • well calibrated radar
  • well sited gauge
  • consistent snow type
  • low wind speed
  • better than rain results
  • effective climatological Z-S relationships
  • low snow gradients

8
Low Level Scanning
0° elevation
Ground clutter (GC) removal
10
Network Processing (QA)
Using two radars goes a long way toward filling
gaps
11
Vertical Profile of Reflectivity and Range
Convective Stratiform Snow
12
Vertical Profile of Reflectivity and Range
13
The Swiss experience
14
The Swiss experience
15
Polarization
16
Radar Calibration
17
Radar Calibration Workshop Summary
  • Complex system: stable electronic calibration,
    but is it consistent?
  • Rainfall as a success criterion
  • Expansive definition of calibration
  • various criteria of success
  • calibration, performance measurements,
    validation, adjustment
  • Inter-radar/network comparisons
  • cross-radar comparisons including TRMM (GPM)
  • Validation
  • problematic standards
  • trend to use basins
  • Successful absolute external target
    calibrations
  • Prevalent use of the Sun

18
Calibration and Validation (Andy White, NEXRAD ROC)
  • Calibration is the process of quantitatively
    defining system responses to known, controlled
    signal inputs.
  • Validation is the process of assessing by
    independent means the quality of the data
    products derived from system inputs.
  • Validation is a natural adjunct to calibration.
  • Calibration - validation inseparable

Mean reflectivity error of -1.47 dB, σ = 1.34 dB,
before solar calibration
19
Calibration with a DSD gauge (POSS), using the Z-R
relationship for each day or by event (Zawadzki,
MRO; Sheppard and Joe, MSC)
By event and Z-R coefficients:
34% to 7.5% error reduction
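The per-event adjustment above can be sketched as refitting the Z-R coefficients from coincident gauge (Z, R) pairs. This is a minimal illustration (least squares in log-log space), not the MRO/MSC implementation; all names and values are made up:

```python
import math

def fit_zr(z_values, r_values):
    """Fit Z = a * R**b by least squares in log-log space.

    z_values: linear reflectivity factors (mm^6 m^-3) from the DSD gauge.
    r_values: coincident precipitation rates (mm/h).
    Returns (a, b) for this day or event.
    """
    xs = [math.log10(r) for r in r_values]   # log R
    ys = [math.log10(z) for z in z_values]   # log Z = log a + b log R
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = 10.0 ** (my - b * mx)
    return a, b

# A noiseless synthetic event drawn from Z = 300 * R**1.5 recovers its own coefficients:
rates = [0.5, 1.0, 2.0, 4.0, 8.0]
zs = [300.0 * r ** 1.5 for r in rates]
a, b = fit_zr(zs, rates)
print(round(a), round(b, 2))  # -> 300 1.5
```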
20
Range Correction with Raingauges: Approach to
Resolve a Heterogeneous Network (Daniel
Michelson, SMHI)
Range correction; see also USBR studies
21
KDP/ZDR Self-Consistency for Calibration (Vivek,
NCAR)
  • Procedure
  • Select radial radar data in rain that are free
    from any ice and ground clutter.
  • Compute KDP at each range gate using Z and ZDR.
  • Sum the KDP along the radial to obtain the
    estimated ΦDP.
  • Compute the reflectivity bias as estimated
    ΦDP / measured ΦDP.
  • Process a sufficient number of rays to obtain a
    mean value of the bias.

3.13 dBZ bias
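The procedure above can be sketched as follows. The self-consistency power law and its coefficients here are placeholders (real ones depend on wavelength and DSD assumptions), so treat this as the shape of the computation rather than a working calibration:

```python
import math

def estimated_kdp(dbz, zdr_db, c=1.8e-5, alpha=0.95, beta=-1.3):
    """Per-gate KDP (deg/km) from an assumed self-consistency power law.

    KDP = c * Z**alpha * Zdr**beta, with Z and Zdr in linear units.
    c, alpha, beta are hypothetical placeholder coefficients.
    """
    z = 10.0 ** (dbz / 10.0)
    zdr = 10.0 ** (zdr_db / 10.0)
    return c * z ** alpha * zdr ** beta

def reflectivity_bias_db(dbz_gates, zdr_gates, phidp_measured_deg,
                         gate_km=0.25, alpha=0.95):
    """Reflectivity bias (dB) from one rain-only, clutter-free radial.

    Sum the per-gate KDP to get the estimated differential phase
    (Phi_DP ~ 2 * sum(KDP) * dr) and compare with the measured Phi_DP.
    Because the estimate scales as Z**alpha, the ratio maps to a dB bias.
    """
    kdp = [estimated_kdp(z, zdr, alpha=alpha)
           for z, zdr in zip(dbz_gates, zdr_gates)]
    phidp_est = 2.0 * sum(kdp) * gate_km
    return 10.0 * math.log10(phidp_est / phidp_measured_deg) / alpha
```

In practice this is repeated over many rays and the mean bias is reported, as on the slide.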
22
Calibration Measurements
  • Transmitter
  • Frequency, PRF, Average RF power
  • Antenna
  • Gain, Beamwidths
  • Receiver
  • Output vs. RF input, IF filter loss, Range

Uncertainty Analysis
23
Inter-radar Comparisons: TRMM vs WSR-88D
(Meneghini, NASA)
TRMM PR stability 0.5 dB; a consistent device to
compare with the WSR-88D
Inter-radar Comparisons (Asko Huuskonen, FMI)
24
Needs
  • Consistency of calibration
  • Redundant measurements
  • Error estimates
  • Use small basins to integrate the measurements
    (USWRP QPE workshop) - overcome scale and time
    sampling

(Uijlenhoet, Princeton)
25
Z-S
26
High Temporal Snow Rate Measurements
  • Experiment to assess our ability to provide 1
    minute information on precipitation type and
    amount.

27
Analysis
  • Compare snow amounts by event
  • integrate 1 minute results
  • No standard for 1 minute rate measurements
  • Analysis is necessary but not sufficient

28
  • Vaisala FD12P: fwd scattering, capacitance
  • Biral VPF730: fwd/back scattering, pulse
    detection of terminal velocity
  • POSS: microwave CW Doppler
  • WiVis: IR scintillation, spectral analysis of
    fluctuations, fwd scattering
  • Fischer-Porter: manual obs
30
Fischer-Porter vs Manual Observations
31
POSS vs Manual Obs
32
POSS Snow Amounts by Temperature
Bright Band
33
POSS and Snowfall
1 minute data
34
Z-S Crystal Type
35
Automated Precipitation Typing
36
Reference Data Set
  • Regular manual observations
  • hourlies and specials
  • Special clinical observations - not done
  • minutely manual observations
  • usually made during precipitation events
  • false alarms not computable since observations
    are not made
  • expensive
  • however, optimal if done all the time!

37
Window Matching vs Minutely Matching
  • Minute-by-minute matching: too many ways to
    mismatch
  • Develop window matching to try to mimic how an
    observation is made
  • a sensor observation time window is defined
  • to match the uncertainty in the manual
    observation period
  • a percentage of the window is defined
  • the predominant type is determined
  • to match the persistence criteria
  • observation window determined by an observer
    report
  • one event, one observation
  • no report, no event, no comparison (even if the
    sensor reports)
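The matching rules above can be sketched in a few lines; the window half-width, the predominance threshold, and the data layout are illustrative choices, not values from the study:

```python
from collections import Counter

def window_match(sensor_reports, obs_time, obs_type,
                 half_window_min=5, min_fraction=0.5):
    """Compare one manual observation against sensor reports via window matching.

    sensor_reports: list of (minute, precip_type) tuples from the sensor.
    obs_time, obs_type: one manual observation (one event, one observation).
    half_window_min and min_fraction stand in for the "uncertainty in the
    manual observation period" and "percentage of window" on the slide.
    Returns True/False for a match, or None when there is no observer report
    (no report, no event, no comparison -- even if the sensor reports).
    """
    if obs_type is None:
        return None
    window = [t for m, t in sensor_reports
              if abs(m - obs_time) <= half_window_min and t is not None]
    if not window:
        return False
    # Predominant type within the window, mimicking the persistence criterion
    ptype, count = Counter(window).most_common(1)[0]
    if count / len(window) < min_fraction:
        return False
    return ptype == obs_type
```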

38
Performance Metric: Heidke Skill Score
HSS = 1: perfect; HSS = 0: no skill; HSS = -1:
always wrong
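For reference, the score is computed from a 2×2 contingency table of sensor vs observer reports; a small sketch (the counts below are invented for illustration):

```python
def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    """Heidke Skill Score: fraction correct, adjusted for chance agreement.

    HSS = 1 perfect, 0 no better than chance, negative worse than chance
    (-1 meaning always wrong, as on the slide).
    """
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    # Number of correct forecasts expected from chance alone
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    return (a + d - expected) / (n - expected)

# e.g. 30 hits, 10 false alarms, 5 misses, 55 correct negatives:
print(round(heidke_skill_score(30, 10, 5, 55), 3))  # -> 0.681
```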
39
Optimal Detection Results/Setup
40
Identification: Heidke Skill Score
All show skill!
41
Snow vs Drizzle
Doppler Spectra Mode Analysis
42
Conclusions
  • All sensors show skill in detection and typing
  • Not all sensors report the same types
  • Main sensor differences are in light and very
    light conditions, which dominate the statistical
    results
  • Window analysis presented to better represent
    the capability of the sensor (10-15%
    improvements)
  • Analysis is affected by instrument
    detectability/sensitivity
  • Microwave estimates of snowfall water equivalent
    amounts superior to optical. However, bright
    band effect experienced by radar near 0 C
    causes large overestimation of amounts
  • Room for improvement - must determine the
    requirement and criteria for success
  • Mixed types not studied; they require algorithm
    development

43
How are we doing with the various steps?
(Zawadzki)
  • Removing non-meteorological data: filtering at
    signal processing, morphology of reflectivity
    and Doppler, polarization diversity
  • VPR correction for beam height and shadows:
    morphology of reflectivity
  • Adjustments with gauges: N-N, optimization,
    probability matching
  • Correction for attenuation: networks of radars?
    polarization diversity?
  • Z-R relations: polarization diversity?
    morphology of reflectivity?
  • Validation: not enough
44
Summary
  • Progress on many fronts
  • user success criteria
  • network consistency needed
  • radar adjustments
  • calibration
  • automated high temporal measurements
  • Much more to do
  • complete correction system needed
  • need to demonstrate
  • Need a standard for high temporal rate
    measurements
  • necessary but not sufficient demonstration

45
Then ID the target: r = 120 km (h = 2.6 km)