1. Isaac Newton Institute for Mathematical Sciences
DEMA 2008 Workshop
Cambridge, 11-15 August 2008
A Sequential Methodology for Integrated Physical and Simulation Experiments
Daniele Romano
joint with Alessandra Giovagnoli
Dept. of Mechanical Engineering, University of Cagliari, Piazza d'Armi, Cagliari, Italy
e-mail: romano@dimeca.unica.it
2. The problem
Design or improve a physical system by combined use of physical and simulation experiments
Assumptions
- Physical observations are more reliable (closer to reality)
- Simulation runs cost less
Schematisation
Two-treatment sequential experiment:
T0 = physical experiment, T1 = simulation experiment
T1 → T0 → T1 → T1 → … → T0 → Stop
The stopping rule is an essential part of the approach
Additional task: design of the experiments at each stage (as in the choice of doses in clinical trials)
3. Questions
- Is this problem relevant to applications?
- Has it already been investigated?
Partially, in the calibration of computer models, but the objective there is different and sometimes the field data are not designed (Kennedy and O'Hagan, 2001; Bayarri et al., 2007). Calibration could be part of the method.
4. Analogies with other statistical problems
- George Box's sequential experimentation (Box and Wilson, 1951). However, there are no simulation experiments in that methodology and experiments are decided mainly on expert judgement.
- Sample surveys by questionnaires. Information can be obtained directly or by proxy, and a main question is how many resources to allocate to direct observations and how many to proxy ones. We are not aware, however, of a sequential approach.
- Computer models with different levels of accuracy (Qian et al., 2004)
- Sequential experiments in clinical trials
5. Two motivating applications
Design of a robotic device (Manuello et al., 2003)
Improvement of a manufacturing process (Masala et al., 2008)
The sequence of experiments is based on judgement
6. Climbing robot
Simulation model developed in Working Model
21 factors investigated
7. [Diagram: sequence of experimental stages for the climbing robot - extensive exploration, feasibility check on the prototype, exploration of the feasible region, computer model modification, optimization, confirmation; run counts shown: 88 and 12]
Note the efficient allotment of experimental effort
8. Benefits
The robot can climb steadily with a speed seven times higher than in the initial design configuration, and virtually on any post surface (robustness) → better design
We built just one physical prototype, instead of tens, to investigate 21 factors → cost saving
Computer exploration let us discover that the robot can descend passively, using gravity → innovation
The comparison of physical vs. numerical results gave the designer the idea of how to modify the code, improving the model → simulation model improved
9. Operating modes of the robot
[Diagram: climb steadily, fall in control (the innovation), fall, no move, climb and then fall]
10. Improvement of the flocking process
Car component covered by flock fabric
Flock yarns
Two simulation models were developed: one for the electric field inside the chamber (FEMLAB), one for the motion of the flock (MATLAB)
9 factors investigated
11. [Diagram: thread and flock; run counts shown: 63 and 37]
12. Benefits
Operating conditions considered potentially unsafe were tried on the simulator, obtaining golden information for improving the process. These conditions would never have been tried in the field. → process efficiency increases
Increased process efficiency can be exploited to raise productivity (up to 50%) → process improvement, or to produce yarns with new characteristics → product innovation
Results from physical and simulation experiments were used for tuning some unknown parameters of the simulator → computer model calibration
A mechanistic model of the whole process was developed by combining the simulation models with the results of a physical experiment (determining the rate of lifted flock) → new process design tool
13. Response models
Reality: true response y(x)
Physical trials: y(x) observed with additive error ε ~ N(0, σ²), independent errors; x ∈ D (a hyper-rectangle)
Simulation: the simulator differs from the true response by a bias function b(x), estimated from the data
Objective: locate a sufficiently high value of the true response over the domain D by using simulation as much as possible, provided that simulation is found reliable enough.
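A compact restatement of the two response models, as a sketch: the additive form and the sign of the bias term are assumptions consistent with the slide text, and the estimator of b(x) is not reproduced here.
\[ y_P(x) = y(x) + \varepsilon, \qquad \varepsilon \sim N(0,\sigma^2)\ \text{i.i.d.}, \qquad x \in D \]
\[ y_S(x) = y(x) + b(x), \qquad b(x) = \text{simulator bias, estimated from the data} \]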
14. Sequential procedure
At each step k of the procedure we must decide on:
- the type of the experiment
  d_k = 0: the experiment is physical → P_k
  d_k = 1: the experiment is numerical → S_k
- the region where the experiment is run, R_k
- the run size, n_k
- the design, D_k
We make simple choices on the type of the region and the design throughout:
- R_k is a hypercube of fixed size (centre C_k)
- D_k is a Latin Hypercube Design
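As a sketch (not the authors' Matlab code), a Latin hypercube design of n_k runs can be generated inside the hypercube region R_k of edge length d centred at C_k; the function name lhd_in_region and its interface are illustrative:

import numpy as np

def lhd_in_region(center, edge, n_runs, rng=None):
    """Latin hypercube design of n_runs points in the hypercube of edge
    length `edge` centred at `center` (one stratum per run along each axis)."""
    rng = np.random.default_rng(rng)
    center = np.asarray(center, dtype=float)
    dim = center.size
    # Stratified uniform draws in [0, 1): one point in each of n_runs strata per axis
    u = (rng.random((n_runs, dim)) + np.arange(n_runs)[:, None]) / n_runs
    # Independently permute the strata along each axis
    for j in range(dim):
        u[:, j] = u[rng.permutation(n_runs), j]
    # Map [0, 1)^dim onto the region R_k = center +/- edge/2
    return center - edge / 2.0 + edge * u

# Example: 8-run design in a 2-factor region of edge 0.2 centred at (0.5, 0.5)
design = lhd_in_region(center=[0.5, 0.5], edge=0.2, n_runs=8, rng=0)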
15. Rationale of the procedure
We want to use a physical experiment only in two particular situations:
- A satisfactory response level has been found by simulation and it is worth checking it
- There is a need to update the measure of the unreliability of simulation, in order to check whether it is worth going on or stopping
A physical experiment is always run in the region of the preceding simulation experiment:
d_k = 0 ⇒ R_k = R_{k-1}
We want to stop the procedure in two particular situations:
- A satisfactory response level has been found by simulation and it has been confirmed in the physical set-up → SUCCESS
- The simulator is found too unreliable after a check by a physical experiment
In all other circumstances we use simulation experiments
16. Allowed transitions
[State diagram: START → S_1; from S_k the procedure may move to S_{k+1} or to P_{k+1}; from P_k it may move to S_{k+1} or Stop]
17. Performance measures at stage k
Satisfaction: FSAT(k)
Increment in satisfaction with respect to the last experiment of the same kind:
ΔSAT(k) = FSAT(k) − FSAT(k−l)
Expected improvement in the next simulation experiment, FIMPR(k), based on the estimated gradient at the frontier of region R_k
Total cost: c(k)
18. (continued)
Unreliability of the simulation, FUNREL(k):
- after P_k: based on the error variance estimated at step k
- after S_k: based on m_k, the number of regions where both kinds of experiment have been done up to step k, and on d, the length of the hypercube edge
19. How are transitions ruled?
After S_k: move to P_{k+1} if a physical check is called for (see the rationale), otherwise continue with S_{k+1}
After P_k: Stop if any of (r1, r2, r3, r4) equals 1, otherwise continue with S_{k+1}
r1: FSAT(k) > s_C (satisfaction after a physical experiment is high)
r2: FUNREL(k) > u_C (simulation is too unreliable)
r3: too many stages without any actual improvement
r4: allowable cost exceeded
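A minimal sketch of these checks in Python (not the authors' Matlab code); the thresholds s_C and u_C, the patience limit, the budget, and in particular the S → P trigger are illustrative assumptions:

def check_stop(fsat, funrel, idle_stages, cost,
               s_C=0.9, u_C=0.3, max_idle_stages=3, budget=100.0):
    """Stopping rule evaluated after a physical experiment P_k."""
    r1 = fsat > s_C                       # satisfaction is high
    r2 = funrel > u_C                     # simulation is too unreliable
    r3 = idle_stages > max_idle_stages    # too many stages without improvement
    r4 = cost > budget                    # allowable cost exceeded
    return r1 or r2 or r3 or r4

def check_s_to_p(fsat, steps_since_physical, s_C=0.9, max_gap=2):
    """After S_k: ask for a physical check when simulation reports a satisfactory
    level, or when the unreliability measure needs updating (schematic trigger)."""
    return (fsat > s_C) or (steps_since_physical >= max_gap)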
20. High-level flow diagram of the procedure
[Flow diagram: k = 1, run S_1; then loop: check S → P (if no, set k = k+1 and run S_k; if yes, set k = k+1 and run P_k, then check stop: if yes, STOP, otherwise set k = k+1 and run S_k)]
Only the high-level decision is made explicit here
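The same flow, written schematically in Python; run_S, run_P, s_to_p_needed and stop_needed are placeholders (e.g. wrappers around the checks sketched above and the block of the next slide):

def sequential_procedure(run_S, run_P, s_to_p_needed, stop_needed):
    """High-level loop of the procedure (schematic). run_S / run_P execute a
    simulation / physical stage and return its performance measures;
    s_to_p_needed and stop_needed encode the decision rules of slide 19."""
    k = 1
    measures = run_S(k)                  # start with simulation experiment S1
    while True:
        if s_to_p_needed(measures):      # S_k -> P_{k+1}: physical check needed
            k += 1
            measures = run_P(k)          # run in the region of the preceding S
            if stop_needed(measures):    # success, unreliability, idle stages, cost
                return k, measures
        k += 1
        measures = run_S(k)              # otherwise: next simulation experiment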
21. Block S_k or P_k
Select R_k, n_k, D_k
Run D_k
Estimate y(x)
Compute the performance measures FSAT(k), ΔSAT(k), FUNREL(k), FIMPR(k), c(k)
22. Low-level decisions: select region, run size and design
Region
d_k = 0 ⇒ R_k = R_{k-1}
d_k = 1 ⇒ R_k ≠ R_{k-1}:
  if FIMPR(k) > 0, compute C_k (R_k adjacent to R_{k-1});
  otherwise, draw C_k at random
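An illustrative sketch of the region-selection step; moving one edge length along the estimated improvement direction is only one possible way to make R_k adjacent to R_{k-1}:

import numpy as np

def select_region_centre(d_k, prev_centre, edge, fimpr, direction,
                         domain_lo, domain_hi, rng=None):
    """Choose the centre C_k of the hypercube region R_k (fixed edge length).
    d_k = 0 (physical): reuse the region of the preceding simulation experiment.
    d_k = 1 (simulation): if an improvement is expected (FIMPR(k) > 0), move to
    an adjacent region along the estimated improvement direction; otherwise
    draw C_k at random inside the domain D."""
    rng = np.random.default_rng(rng)
    prev_centre = np.asarray(prev_centre, dtype=float)
    lo = np.asarray(domain_lo, dtype=float) + edge / 2.0
    hi = np.asarray(domain_hi, dtype=float) - edge / 2.0
    if d_k == 0:
        return prev_centre                            # R_k = R_{k-1}
    if fimpr > 0:
        step = edge * np.asarray(direction, float) / np.linalg.norm(direction)
        return np.clip(prev_centre + step, lo, hi)    # R_k adjacent to R_{k-1}
    return rng.uniform(lo, hi)                        # C_k drawn at random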
23. [Illustration: regions R_1-R_6 with centres C_1-C_6 (and points A, B, D) in the (x_1, x_2) sample space; R_1 = R_2]
24. Run size
The run size of each physical experiment is such that it costs as much as the simulation experiment preceding it:
d_k = 0 (P_k):  n_k = r · n_{k-1},  with r = c_S / c_P,  0 < r < 1
The run size of simulation experiments is proportional to the expected increase of the response (if any) at the centre C_k of the next region:
d_k = 1 (S_k):  n_k proportional, through a parameter h, to the expected increase at C_k
Parameter h can be tuned by setting the willingness to pay for obtaining an improvement Δy, for example.
When region R_k is drawn at random, we set n_k = n_1.
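A sketch of the run-size rules; writing the simulation run size as ceil(h · Δŷ) is an assumption about the exact form of the proportionality stated above:

import math

def run_size_physical(n_prev_sim, c_S, c_P):
    """P_k: spend as much as the preceding simulation experiment,
    n_k = r * n_{k-1} with r = c_S / c_P (0 < r < 1)."""
    r = c_S / c_P
    return max(1, round(r * n_prev_sim))

def run_size_simulation(expected_increase, h, n_1):
    """S_k: run size proportional (through the tunable parameter h, the
    willingness to pay for an improvement) to the expected increase of the
    response at the new centre C_k; when the region is drawn at random,
    fall back to the initial run size n_1."""
    if expected_increase is None or expected_increase <= 0:
        return n_1
    return max(1, math.ceil(h * expected_increase))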
25. [Illustration: the same regions R_1-R_6 and centres C_1-C_6 in the (x_1, x_2) sample space, with C_1 = C_2]
26. Demonstrative case
A computer code implementing the procedure has been developed in Matlab
[Plot: reality vs. simulation]
27. Case 1
Sequence of experiments: S1 → P2 → S3 → S4 → P5 → Stop
Active stopping rule: FUNREL(k) > u_C
[Plot: reality, physical experiment and prediction, simulation experiment and prediction; regions R1 = R2, R3, R4 = R5]
28. Case 2
Sequence of experiments: S1 → P2 → S3 → P4 → Stop
Active stopping rule: FSAT(4) > s_C
[Plot: reality, physical experiment and prediction, simulation experiment and prediction; regions R1 = R2, R3 = R4]
29. Case 3
Sequence of experiments: S1 → P2 → S3 → S4 → P5 → Stop
Active stopping rule: FSAT(5) > s_C
[Plot: simulation vs. reality, physical experiment and prediction, simulation experiment and prediction; regions R1 = R2, R3, R4 = R5]
30. Case 4
Sequence of experiments: S1 → P2 → S3 → S4 → S5 → S6 → P7 → Stop
Active stopping rule: FSAT(7) > s_C
[Plot: simulation vs. reality, physical experiment and prediction, simulation experiment and prediction; regions R1 = R2, R3, R4, R5, R6 = R7]
31. Conclusions
The scope of the approach is wide. In general, it can deal with any situation where the response can be measured by two (or more) instruments realising different quality-cost trade-offs.
The method is aimed at performance optimisation (maximisation of a distance measure in the output space). However, the basic sequential mechanism can be applied to different goals.
Testing and validation in real applications are needed.
32. George Box, commenting on the sequential experimentation method
"The reader should notice the degree to which informed human judgement decides the final outcome."
Box, G.E.P., Hunter, W.G., Hunter, J.S. (1978) Statistics for Experimenters, p. 537
Human judgement or automation?
Shall we ask Newton?
33. References
Box, G.E.P., Wilson, K.B.: On the Experimental Attainment of Optimum Conditions. Journal of the Royal Statistical Society, Series B, 13, 1-45 (1951)
Kennedy, M.C., O'Hagan, A.: Bayesian Calibration of Computer Models. Journal of the Royal Statistical Society, Series B, 63(3), 425-464 (2001)
Bayarri, M.J., Berger, J.O., Paulo, R., Sacks, J., Cafeo, J.A., Cavendish, J., Lin, C.-H., Tu, J.: A Framework for Validation of Computer Models. Technometrics, 49(2), 138-154 (2007)
Qian, Z., Seepersad, C.C., Joseph, V.R., Allen, J.K., Wu, C.F.J.: Building Surrogate Models Based on Detailed and Approximate Simulations. ASME 30th Conf. of Design Automation, Salt Lake City, USA. Chen, W. (Ed.), ASME Paper no. DETC2004/DAC-57486 (2004)
Manuello, A., Romano, D., Ruggiu, M.: Development of a Pneumatic Climbing Robot by Computer Experiments. 12th Int. Workshop on Robotics in Alpe-Adria-Danube Region, Cassino, Italy. Ceccarelli, M. (Ed.), available on CD-ROM (2003)
Masala, S., Pedone, P., Sandigliano, M., Romano, D.: Improvement of a Manufacturing Process by Integrated Physical and Simulation Experiments: A Case Study in the Textile Industry. Quality and Reliability Engineering International, to appear