1
Summary of Experimental Uncertainty Assessment
Methodology with Example
  • F. Stern, M. Muste, M-L. Beninati, and W.E.
    Eichinger

2
Table of Contents
  • Introduction
  • Terminology
  • Uncertainty Propagation Equation
  • UA for Multiple and Single Tests
  • Recommendations for Implementation
  • Example

3
Introduction
  • Experiments are an essential and integral tool
    for engineering and science
  • Uncertainty estimates are imperative for risk assessments in design, both when using data directly and when calibrating and/or validating simulation methods
  • True values are seldom known and experiments have
    errors due to instruments, data acquisition, data
    reduction, and environmental effects
  • Determination of truth requires estimates for
    experimental errors, i.e., uncertainties

4
Introduction
  • Uncertainty analysis (UA): a rigorous methodology for uncertainty assessment using statistical and engineering concepts
  • ASME and AIAA standards (e.g., ASME, 1998; AIAA, 1995) are the most recent updates of UA methodologies, which are internationally recognized
  • Presentation purpose: to provide a summary of the EFD (experimental fluid dynamics) UA methodology that is accessible and suitable for student and faculty use, both in the classroom and in research laboratories

5
Terminology
  • Accuracy: closeness of agreement between a measured value and the true value
  • Error: difference between the measured value and the true value
  • Uncertainty (U): estimate of the errors in measurements of individual variables Xi (UXi) or results (Ur)
  • Estimates of U are made at the 95% confidence level, using large data samples (at least 10 readings per measurement)

6
Terminology
  • Bias error (β): fixed, systematic
  • Bias limit (B): estimate of β
  • Precision error (ε): random
  • Precision limit (P): estimate of ε
  • Total error: δ = β + ε

7
Terminology
  • Measurement systems for individual variables Xi: instrumentation, data acquisition and reduction procedures, and operational environment (laboratory, large-scale facility, in situ)
  • Results are expressed through data-reduction equations (DRE): r = r(X1, X2, X3, …, Xj)
  • Estimates of errors are meaningful only when considered in the context of the process leading to the value of the quantity under consideration
  • Identification and quantification of error sources require consideration of:
  • the steps used in the process to obtain the measurement of the quantity
  • the environment in which the steps were accomplished

8
Terminology
  • Block diagram: elemental error sources, individual measurement systems, measurement of individual variables, data reduction equations, and experimental results

9
Uncertainty propagation equation
  • One variable, one measurement
  • DRE: r = r(x)
10
Uncertainty propagation equation
  • Two variables, the kth set of measurements (xk,
    yk)

The total error in the kth determination of r
(1)
11
Uncertainty propagation equation
  • A measure of δr is its variance, σ²δr (2)

Substituting (2) in (1), and assuming that the bias/precision errors are correlated, gives (3).
The σ's are not known, so estimates are used for the variances and covariances of the distributions of the total, bias, and precision errors.
The total uncertainty of the result at a specified level of confidence is then Ur = K σδr, with the σ's replaced by their estimates (K = 2 for a 95% confidence level); see the sketch below.
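Following the derivation in Coleman and Steele (1999), with sensitivity coefficients θx = ∂r/∂x and θy = ∂r/∂y, equations (1)-(3) can be sketched as

\delta_{r_k} = \theta_x \beta_{x_k} + \theta_x \epsilon_{x_k} + \theta_y \beta_{y_k} + \theta_y \epsilon_{y_k}    (1)

\sigma_{\delta_r}^2 = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} \delta_{r_k}^2    (2)

\sigma_{\delta_r}^2 = \theta_x^2 \sigma_{\beta_x}^2 + \theta_y^2 \sigma_{\beta_y}^2 + 2 \theta_x \theta_y \sigma_{\beta_x \beta_y} + \theta_x^2 \sigma_{\epsilon_x}^2 + \theta_y^2 \sigma_{\epsilon_y}^2 + 2 \theta_x \theta_y \sigma_{\epsilon_x \epsilon_y}    (3)

with the total uncertainty estimated as Ur = K σδr, the variances and covariances being replaced by the bias and precision limit estimates.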
12
Uncertainty propagation equation
  • Generalizing (3) for J variables, with sensitivity coefficients θi = ∂r/∂Xi
  • Example (numerical sketch below)
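As a numerical illustration of this propagation step, the Python sketch below combines the bias limits Bi and precision limits Pi of the individual variables into Ur; the function name is illustrative, the sensitivity coefficients are approximated by central finite differences, and correlated bias limits are neglected.

import math

def total_uncertainty(dre, X, B, P, rel_step=1e-6):
    """Propagate bias limits B and precision limits P of the variables X
    through the data reduction equation r = dre(X)."""
    r0 = dre(X)
    Br2, Pr2 = 0.0, 0.0
    for i, xi in enumerate(X):
        h = rel_step * (abs(xi) if xi != 0.0 else 1.0)
        xp, xm = list(X), list(X)
        xp[i] += h
        xm[i] -= h
        theta = (dre(xp) - dre(xm)) / (2.0 * h)   # sensitivity coefficient dr/dXi
        Br2 += (theta * B[i]) ** 2                # bias-limit contribution
        Pr2 += (theta * P[i]) ** 2                # precision-limit contribution
    return r0, math.sqrt(Br2), math.sqrt(Pr2), math.sqrt(Br2 + Pr2)

# Example: r = X1 * X2
r, Br, Pr, Ur = total_uncertainty(lambda x: x[0] * x[1], [2.0, 3.0], [0.02, 0.03], [0.01, 0.01])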
13
Single and multiple tests
  • Single test: one set of measurements (X1, X2, …, Xj) for r
  • Multiple tests: many sets of measurements (X1, X2, …, Xj) for r
  • The total uncertainty of the result (single and multiple tests):

Ur² = Br² + Pr²    (4)

  • Br is determined in the same manner for single and multiple tests
  • Pr is determined differently for single and multiple tests

14
Bias limits (single and multiple tests)
  • Br is given by Br² = Σ (θi Bi)² + 2 Σ θi θk Bik, the second sum running over the correlated pairs i < k (see the sketch below)
  • Sensitivity coefficients: θi = ∂r/∂Xi
  • Bi: estimate of calibration, data acquisition, data reduction, and conceptual bias errors for Xi
  • Bik: estimate of correlated bias limits for Xi and Xk
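A minimal sketch of this bias-limit combination; the function name and the dictionary used for the correlated bias limits Bik are illustrative, and the sensitivity coefficients are assumed to have been evaluated already.

import math

def bias_limit(theta, B, B_corr=None):
    """Bias limit of the result.

    theta  : sensitivity coefficients theta_i = dr/dXi
    B      : bias limits Bi of the individual variables Xi
    B_corr : optional dict {(i, k): Bik} of correlated bias limits, i < k
    """
    Br2 = sum((theta[i] * B[i]) ** 2 for i in range(len(B)))
    if B_corr:
        for (i, k), Bik in B_corr.items():
            Br2 += 2.0 * theta[i] * theta[k] * Bik   # correlated-bias contribution
    return math.sqrt(Br2)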

15
Precision limits (multiple tests)
  • Precision limit of the result (end to end): Pr = t Sr / √M (see the sketch below)

t: coverage factor (t = 2 for N > 10)
Sr: standard deviation of the M readings of the result
  • The average result: r̄ = (1/M) Σ rk
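A sketch of this end-to-end precision estimate for M repeat determinations of the result, assuming a coverage factor t = 2 and reporting the precision limit of the average result; the function name is illustrative.

import statistics

def precision_limit_multiple(results, t=2.0):
    """Average result and precision limit from M end-to-end repeat tests."""
    M = len(results)
    r_bar = statistics.mean(results)
    S_r = statistics.stdev(results)      # standard deviation of the M readings
    return r_bar, t * S_r / M ** 0.5     # Pr = t * Sr / sqrt(M)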

16
Precision limits (single test)
  • Precision limit of the result (end to end): Pr = t Sr

t: coverage factor (t = 2 for N > 10)
Sr: the standard deviation for N readings of the result; it is not available for a single test, so use of the best available information (literature, inter-laboratory comparisons, etc.) is needed.
17
EFD Validation
  • Conduct uncertainty analysis for the results
  • EFD result: A ± UA
  • Benchmark or EFD data: B ± UB
  • E = B − A
  • UE² = UA² + UB²
  • Validation:
  • |E| < UE (see the sketch below)
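The validation comparison translates directly into a short check; the names below are illustrative.

import math

def validate(A, U_A, B, U_B):
    """Compare an EFD result A +/- U_A against benchmark data B +/- U_B."""
    E = B - A                             # comparison error
    U_E = math.sqrt(U_A**2 + U_B**2)      # uncertainty of the comparison error
    return abs(E) < U_E                   # validated if |E| < U_E

For example, with A = 1.22 ± 0.05 and B = 1.26 ± 0.02, |E| = 0.04 and UE ≈ 0.054, so the result would be validated.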

18
Recommendations for implementation
  • Determine the data reduction equation r = r(X1, X2, …, Xj)
  • Construct the block diagram
  • Identify and estimate sources of errors
  • Establish relative significance of the bias
    limits for the individual variables
  • Estimate precision limits (end-to-end procedure
    recommended)
  • Calculate total uncertainty using equation (4)
  • Report total error, bias and precision limits for
    the final result

19
Recommendations for implementation
  • Recognition of the importance of uncertainty analysis (UA)
  • Full integration of UA into all phases of the
    testing process
  • Simplified UA
  • dominant error sources only
  • use of previous data
  • end-to-end calibration and estimation of errors
  • Full documentation
  • Test design, measurement systems, data-stream in
    block diagrams
  • Equipment and procedure
  • Error sources considered
  • Estimates for bias and precision limits and
    estimating procedures
  • Detailed UA methodology and actual data
    uncertainty estimates

20
Experimental Uncertainty Assessment Methodology
Example for Measurement of Density and Kinematic
Viscosity
21
Test Design
  • A sphere of diameter D falls a distance l at terminal velocity V (fall time t) through a cylinder filled with a 99.7% aqueous glycerin solution of density ρ, viscosity μ, and kinematic viscosity ν (= μ/ρ).
  • Flow situations:
  • - Re = VD/ν << 1 (Stokes law)
  • - Re > 1 (asymmetric wake)
  • - Re > 20 (flow separates)

22
Test Design
  • Assumption: Re = VD/ν << 1
  • Forces acting on the sphere (force balance sketched below)
  • Apparent weight
  • Drag force (Stokes law)
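Under the Stokes-flow assumption the force balance takes the textbook form (cf. White, 1994); the expressions below are standard results rather than values specific to this test.

W = \frac{\pi}{6} D^3 (\rho_{sphere} - \rho) g        (apparent weight)

F_d = 3 \pi \mu V D                                   (Stokes drag)

W = F_d  \Rightarrow  V = \frac{g D^2 (\rho_{sphere} - \rho)}{18 \mu}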

23
Test design
  • Terminal velocity (given by the force balance above)
  • Solving for ν and substituting l/t for V gives (5)
  • Evaluating ν for two different spheres (e.g., teflon and steel) and solving for ρ gives (6)
  • Equations (5) and (6) are the data reduction equations for ν and ρ in terms of measurements of the individual variables Dt, Ds, tt, ts, and l (see the sketch below)
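Carrying the terminal-velocity result through gives the usual falling-sphere forms of the DREs, ν = g D² (ρsphere/ρ − 1) t / (18 l) for (5) and ρ = (ρt Dt² tt − ρs Ds² ts) / (Dt² tt − Ds² ts) for (6). A minimal sketch, assuming the sphere densities ρt and ρs are known material properties and all quantities are in SI units:

G = 9.81  # m/s^2

def density_dre(rho_t, D_t, t_t, rho_s, D_s, t_s):
    """Equation (6): fluid density from teflon (t) and steel (s) sphere data."""
    return (rho_t * D_t**2 * t_t - rho_s * D_s**2 * t_s) / (D_t**2 * t_t - D_s**2 * t_s)

def viscosity_dre(rho_sphere, rho, D, t, l):
    """Equation (5): kinematic viscosity from one sphere's diameter D,
    fall time t over fall distance l, and the fluid density rho."""
    return G * D**2 * (rho_sphere / rho - 1.0) * t / (18.0 * l)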

24
Measurement Systems and Procedures
  • Individual measurement systems:
  • Dt and Ds: micrometer, resolution 0.01 mm
  • l: scale, resolution 1/16 inch
  • tt and ts: stopwatch, last significant digit 0.01 s
  • T (temperature): digital thermometer, last significant digit 0.1 °F
  • Data acquisition procedure:
  • measure T and l
  • measure diameters Dt and fall times tt for 10 teflon spheres
  • measure diameters Ds and fall times ts for 10 steel spheres
  • Data reduction is done with equations (5) and (6) by substituting the measurements for each test into data reduction equation (6) to evaluate ρ, and then, along with this result, into data reduction equation (5) to evaluate ν

25
Block-diagram
26
Test results
27
Uncertainty assessment (multiple tests)
  • Density ρ (DRE (6))
  • Bias limit Bρ
  • Sensitivity coefficients, e.g., θDt = ∂ρ/∂Dt
  • Precision limit Pρ
  • Total uncertainty: Uρ² = Bρ² + Pρ² (end-to-end sketch below)
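An end-to-end sketch of this assessment for ρ, reusing the propagation approach above; the measurement values and sphere densities are placeholders rather than the actual test data, and the instrument resolutions are used as stand-ins for the bias limits.

import math
import statistics

RHO_T, RHO_S = 2170.0, 7990.0   # assumed teflon and steel sphere densities, kg/m^3

def rho_dre(x):
    """Equation (6); x = (D_t, t_t, D_s, t_s) in SI units."""
    D_t, t_t, D_s, t_s = x
    return (RHO_T * D_t**2 * t_t - RHO_S * D_s**2 * t_s) / (D_t**2 * t_t - D_s**2 * t_s)

# Per-test measurements for 10 teflon and 10 steel spheres (placeholder values)
D_t = [6.35e-3] * 10
t_t = [5.75, 5.78, 5.72, 5.80, 5.74, 5.77, 5.73, 5.76, 5.75, 5.74]
D_s = [3.18e-3] * 10
t_s = [3.10, 3.12, 3.09, 3.11, 3.10, 3.13, 3.08, 3.11, 3.10, 3.12]

# Bias limits taken as the instrument resolutions (micrometer 0.01 mm, stopwatch 0.01 s)
B = [1.0e-5, 0.01, 1.0e-5, 0.01]

# Bias limit via finite-difference sensitivity coefficients at the mean measurements
x0 = [statistics.mean(v) for v in (D_t, t_t, D_s, t_s)]
Br2 = 0.0
for i, xi in enumerate(x0):
    h = 1e-6 * xi
    xp, xm = list(x0), list(x0)
    xp[i] += h
    xm[i] -= h
    theta = (rho_dre(xp) - rho_dre(xm)) / (2.0 * h)
    Br2 += (theta * B[i]) ** 2
B_rho = math.sqrt(Br2)

# Precision limit from the spread of the 10 per-test density results (end to end)
rho_k = [rho_dre([D_t[k], t_t[k], D_s[k], t_s[k]]) for k in range(10)]
P_rho = 2.0 * statistics.stdev(rho_k) / math.sqrt(len(rho_k))

U_rho = math.sqrt(B_rho**2 + P_rho**2)   # equation (4)
print(f"rho = {statistics.mean(rho_k):.1f} +/- {U_rho:.1f} kg/m^3")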

28
Uncertainty assessment (multiple tests)
  • Density ρ

29
Uncertainty assessment (multiple tests)
  • Viscosity ν (DRE (5))
  • Calculations for the teflon sphere
  • Bias limit Bν
  • Precision limit Pν
  • Total uncertainty Uν

30
Comparison with benchmark data
  • Density ρ

E = 4.9% (reference data) and E = 5.4% (ErTco hydrometer)
Neglecting correlated bias errors
Data not validated
31
Comparison with benchmark data
  • Viscosity ν

E = 3.95% (reference data) and E = 40.6% (Cannon capillary viscometer)
Neglecting correlated bias errors
Data not validated (unaccounted bias error)
32
References
  • AIAA, 1995, Assessment of Wind Tunnel Data
    Uncertainty, AIAA S-071-1995.
  • ASME, 1998, Test Uncertainty, ASME PTC
    19.1-1998.
  • ANSI/ASME, 1985, Measurement Uncertainty: Part 1, Instruments and Apparatus, ANSI/ASME PTC 19.1-1985.
  • Coleman, H.W. and Steele, W.G., 1999, Experimentation and Uncertainty Analysis for Engineers, 2nd Edition, John Wiley & Sons, Inc., New York, NY.
  • Coleman, H.W. and Steele, W.G., 1995, Engineering Application of Experimental Uncertainty Analysis, AIAA Journal, Vol. 33, No. 10, pp. 1888-1896.
  • ISO, 1993, Guide to the Expression of Uncertainty in Measurement, 1st edition, ISBN 92-67-10188-9.
  • ITTC, 1999, Proceedings of the 22nd International Towing Tank Conference, Resistance Committee Report, Seoul, Korea and Shanghai, China.

33
References
  • Granger, R.A., 1988, Experiments in Fluid
    Mechanics, Holt, Rinehart and Winston, Inc., New
    York, NY.
  • Procter & Gamble, 1995, private communication.
  • Roberson, J.A. and Crowe, C.T., 1997, Engineering
    Fluid Mechanics, 6th Edition, Houghton Mifflin
    Company, Boston, MA.
  • Small Part Inc., 1998, Product Catalog, Miami
    Lakes, FL.
  • Stern, F., Muste, M., Beninati, M.-L., and Eichinger, W.E., 1999, Summary of Experimental Uncertainty Assessment Methodology with Example, IIHR Technical Report No. 406.
  • White, F.M., 1994, Fluid Mechanics, 3rd edition,
    McGraw-Hill, Inc., New York, NY.
