1
  • A Survey of Neo-Schumpeterian Simulation Models
  • Review and Prospects
  • Paul Windrum
  • presented at DIME-ETIC
  • The Economy As A Complex Evolving System,
  • UNU-MERIT, Maastricht, 15 - 19 October 2007

2
  • References
  • Birchenhall, C., Fagiolo, G., and Windrum, P. (eds.), 2007, Empirical Validation of Agent-Based Models, Special Issue of Computational Economics, 30(3).
  • Windrum, P., 2007, Neo-Schumpeterian simulation models, in The Edward Elgar Companion to Neo-Schumpeterian Economics, H. Hanusch and A. Pyka (eds.), Cheltenham: Edward Elgar.
  • Windrum, P., Fagiolo, G., and Moneta, A., 2007, Empirical validation of agent-based models: alternatives and prospects, Journal of Artificial Societies and Social Simulation, 10(2), 8, <http://jasss.soc.surrey.ac.uk/10/2/8.html>.
  • Windrum, P., 1999, Simulation models of technological innovation: a review, American Behavioral Scientist, 42(10), pp. 1531-1550. ISSN 0002-7642.

3
  • Neo-Classical (Type 1) and Schumpeterian (Type 2)
    Models
  • Different views about the world in which real
    economic agents operate.
  • The Type 1 world can, in principle, be known and understood.
  • In the Type 2 world the set of possible objects is unknown, and agents must engage in an open-ended search for new objects.
  • The primary interests of the models differ:
  • Type 1 models: learning that leads to improvements in allocative efficiency.
  • Type 2 models: open-ended search of dynamically changing environments, due to (i) the ongoing introduction of novelty and the generation of new patterns of behaviour (Knightian uncertainty), and (ii) the complexity of interactions between heterogeneous agents.

4
  • Equilibrium versus non-equilibrium
  • Type 1 models view the underlying structure of
    the economic system as an equilibrium structure.
  • In Type 2 models, aggregate regularities are not equilibrium properties but emergent properties that arise from an evolutionary process in which variety generation and selection interact.

5
  • A bottom-up perspective
  • Heterogeneity
  • The evolving complex system (ECS) approach
  • Non-reductionism
  • Non-linearity
  • Direct (endogenous) interactions
  • Bounded rationality
  • The nature of learning
  • True dynamics (irreversibility)
  • Endogenous and persistent novelty
  • Selection-based market mechanisms
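To make the features listed above concrete, here is a minimal sketch under deliberately toy assumptions (the Agent class, its productivity attribute, and all numeric values are invented for illustration, not taken from any surveyed model): heterogeneous starting conditions, boundedly rational rules rather than optimisation, direct interaction with a randomly chosen neighbour, and an aggregate outcome computed bottom-up from the population.

```python
import random

class Agent:
    def __init__(self, rng):
        # Heterogeneity: each agent starts from its own productivity draw.
        self.productivity = rng.uniform(0.5, 1.5)

    def step(self, neighbour, rng):
        # Bounded rationality: no optimisation, only noisy local search or
        # imperfect imitation of a directly observed neighbour.
        if rng.random() < 0.5:
            self.productivity += rng.gauss(0.0, 0.05)          # local search
        elif neighbour.productivity > self.productivity:
            self.productivity = neighbour.productivity * 0.9   # imperfect imitation

rng = random.Random(0)
agents = [Agent(rng) for _ in range(100)]
for t in range(50):
    for a in agents:
        a.step(rng.choice(agents), rng)   # direct (endogenous) interactions

# Aggregate regularity computed bottom-up from the individual agents.
print(sum(a.productivity for a in agents) / len(agents))
```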

6
  • Early Neo-Schumpeterian (Type 2) Models
  • The 'Stylised Man' of the early models (Nelson & Winter, Dosi et al., Silverberg-Verspagen).
  • Focus: develop models containing key evolutionary mechanisms that can generate the stylised facts observed at the industry/macro level.
  • Key algorithms: variety generation (search) and selection (see the sketch below).
  • Key elements: heterogeneity of agents, stochastic processes (notably of innovation), interaction between agents, feedbacks between decision making and emergent properties (path dependency), absence of perfect foresight, and learning as a process of open-ended search (constrained rationality/myopia).
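A compressed sketch of how these two key algorithms fit together, assuming a toy industry rather than any of the models cited above (firm count, innovation probability, and the replicator coefficient are all invented): stochastic search generates variety in firm productivity, and a replicator-style selection rule shifts market share towards the more competitive firms, yielding an emergent industry-level outcome (here, concentration).

```python
import random

rng = random.Random(1)
n = 20
productivity = [rng.uniform(0.8, 1.2) for _ in range(n)]   # heterogeneous firms
share = [1.0 / n] * n                                       # equal initial market shares

for t in range(200):
    # Variety generation (search): innovation is a rare stochastic draw whose
    # size is itself random.
    for i in range(n):
        if rng.random() < 0.1:
            productivity[i] += abs(rng.gauss(0.0, 0.05))
    # Selection: a replicator rule moves market share towards firms whose
    # productivity exceeds the share-weighted industry average.
    avg = sum(s * p for s, p in zip(share, productivity))
    share = [s * (1 + 0.5 * (p - avg) / avg) for s, p in zip(share, productivity)]
    total = sum(share)
    share = [s / total for s in share]

# Emergent industry-level regularity arising from search plus selection.
print("Herfindahl index:", round(sum(s * s for s in share), 3))
```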

7
  • Success of Early Models
  • Agenda: setting out a viable neo-Schumpeterian alternative.
  • Method: Nelson & Winter's two-step approach to the empirical validation of simulation models.
  • Demonstration: the feasibility of the new approach was demonstrated using simulation.
  • Models generated outputs that accorded with empirically observed phenomena and with evolutionary, neo-Schumpeterian explanations for these phenomena.
  • Innovation: open-ended search as the basis for learning in worlds with Knightian uncertainty.

8
  • Limitations of Early Models
  • Generality of the studies
  • A very limited range of agents was considered, and the agent representations were highly stylised.
  • Reports were based on very few simulation runs (illustrations from a handful of individual runs).
  • High dimensionality of the models (random walk?)
  • Lack of sensitivity analysis on key variables and parameters (what's really driving the model?)
  • Lack of rigorous testing procedures for model
    outputs.
  • No comparison of alternative theories or models.

9
  • More Recent Neo-Schumpeterian Models
  • Motivations
  • To address the limitations of the early models.
  • To exploit new algorithms and procedures developed in computer science, statistics, etc., and to make use of improved software/hardware.
  • Will consider two examples:
  • Malerba-Nelson-Orsenigo-Winter, and
  • Windrum-Birchenhall models.

10
  • History-friendly modelling: methodology
  • Suggests tying down simulation models to carefully specified empirical histories of individual industries.
  • Detailed empirical data inform the simulation work:
  • Act as a guide when specifying the
    representations of agents (their behaviour,
    decision rules, and interactions), and the
    environment in which they operate.
  • Assist in identifying the particular parameter settings on key variables likely to generate the observed history.
  • Enable more demanding tests on model outputs to be specified: evaluate a model by comparing its output (a simulated trace history) with the actual history of an industry (see the sketch below).
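A minimal sketch of the kind of output test this implies, assuming a hypothetical empirical series (the concentration figures and the mean-squared distance measure below are placeholders, not data or metrics from the history-friendly literature): the simulated trace history is scored against the actual history of the industry.

```python
# Placeholder observed history and one simulated trace of the same variable.
observed_concentration = [0.20, 0.35, 0.55, 0.70, 0.75]
simulated_concentration = [0.22, 0.30, 0.50, 0.66, 0.78]

def mean_squared_distance(observed, simulated):
    """Simple distance between the actual and simulated trace histories."""
    return sum((o - s) ** 2 for o, s in zip(observed, simulated)) / len(observed)

score = mean_squared_distance(observed_concentration, simulated_concentration)
print("distance between simulated and actual history:", round(score, 4))
# A parameterisation passes this kind of test only if the distance is small,
# and the comparison is repeated across many runs.
```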

11
  • Issues Regarding the History-friendly Method
  • Category 1: implementation issues.
  • Modelling a special case: Malerba et al. (1999, 2001) is informed not by a history of the computer industry as a whole, but by that of one particular company, IBM. The research questions addressed are not relevant to other firms in the industry.
  • The IBM account is itself highly stylised and subjective.
  • Lack of data on the industry as a whole: empirical data regarding R&D spend, market shares, and profitability of computer firms. No empirical data on key variables: the relative sizes of network externalities and branding in the mainframe and PC markets.
  • Question: is the empirical data required by the method readily obtainable in practice?

12
  • Other factors influencing the modelling choices
  • Modelling is informed by theory as well as by the available empirical data.
  • Empirical data is itself informed by theory.
  • Malerba et al. do not present a rigorous sensitivity analysis of the initial seedings or the random parameter values used in the 50 simulation runs reported.

13
[Figure: attractor states in cheapness-quality characteristics space, with separate attractors for the PC and mainframe markets.]
14
  • Compare with
  • Windrum, P. and Birchenhall, C., 2005, Structural change in the presence of network externalities: a co-evolutionary model of technological successions, Journal of Evolutionary Economics, 15(2), pp. 123-148.
  • This paper opens up:
  • quality is unpacked as a complex, multi-dimensional variable in its own right
  • the relationship between heterogeneous consumer demand and the different sets of characteristics offered by old and new technology products
  • some performance characteristics of old
    technology are better than those of new
    technology
  • some performance characteristics of new
    technology are better than those of old technology

15
  • some performance characteristics offered by the new technology are NOT offered by the old technology
  • some performance characteristics offered by the old technology are NOT offered by the new technology
  • The question of co-evolution can now be addressed: new consumer groups with different preferences demanding new technology products with new characteristics (a succession).
  • Can distinguish between successions and
    substitutions
  • A rigorous sensitivity analysis is conducted over 1,000 runs using different parameter values and different seedings.
  • A statistical model is then used to test which variables affect the probability of a succession occurring, and to predict that probability (see the sketch below).
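A sketch of that sensitivity-analysis design under stated assumptions: run_model below is a trivial stand-in for the actual Windrum-Birchenhall simulation, the two parameters are invented, and a logistic regression (via scikit-learn) stands in for the statistical model; the paper's exact specification may differ.

```python
import random
from sklearn.linear_model import LogisticRegression

rng = random.Random(42)

def run_model(network_strength, new_tech_quality, seed):
    """Stand-in for one simulation run: returns 1 if a succession occurred."""
    local = random.Random(seed)
    score = 2.0 * new_tech_quality - 1.5 * network_strength + local.gauss(0.0, 0.5)
    return 1 if score > 0 else 0

# Monte Carlo over parameter values and seedings (1,000 runs, as in the paper).
X, y = [], []
for i in range(1000):
    params = [rng.random(), rng.random()]
    X.append(params)
    y.append(run_model(params[0], params[1], seed=i))

# Run-level statistical model: which parameters raise the probability of a
# succession, and what probability is predicted at given parameter values?
logit = LogisticRegression().fit(X, y)
print("coefficients (network strength, new-tech quality):", logit.coef_[0])
print("P(succession | mid-range parameters):", logit.predict_proba([[0.5, 0.5]])[0][1])
```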

16
  • Results: successions occur if
  • direct utility of the new tech product > that of the old tech product
  • (product characteristics)
  • indirect utility of the new tech product > that of the old tech product
  • (price, production economies, efficiency of production techniques)
  • Important: the initial new design(s) needs to be highly competitive and have a supporting new customer group(s)
  • rate of innovative improvements of new tech firms > rate of innovative improvements of old tech firms (a sail-ship effect is possible).
  • This complex interplay between quality, price and cost is NOT open to investigation in the Malerba et al. model.
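The conditions above can be restated as a simple predicate. This is an illustrative paraphrase of the listed results, not the paper's formal model, and the example values are invented.

```python
def succession_occurs(direct_new, direct_old, indirect_new, indirect_old,
                      improvement_rate_new, improvement_rate_old):
    """True only if the new technology satisfies all three conditions above."""
    return (direct_new > direct_old and
            indirect_new > indirect_old and
            improvement_rate_new > improvement_rate_old)

# Example: the new technology wins on characteristics and improvement rate but
# not on indirect utility, so no succession occurs.
print(succession_occurs(0.8, 0.6, 0.4, 0.5, 0.06, 0.04))
```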

17
  • Results: the role of time
  • The time old technology firms have to improve the performance, price and cost of their designs prior to new technology firms arriving in the market (entry).
  • The time the new firms have to undertake innovation and improve their product and process performance (the degree of competition in the market / strength of the replicator, OR the degree to which new consumer preferences are distinguished from old consumer preferences).

18
  • Issues comparing the models
  • Both Malerba et al. and Windrum-Birchenhall consider sequential competitions, and the conditions under which old technology and established firms may be replaced by new technologies and new firms.
  • But the models are very different in terms of
  • what they are trying to explain: Malerba et al. want to understand the conditions under which established firms can survive by switching production from the old to the new technology
  • the elements used in each model
  • the empirical data they draw upon when building
    their models (input)
  • the empirical data that is selected as the
    stylised facts that each model is expected to
    reproduce

19
  • Category 2: methodological issues.
  • Can history be the final arbiter in theoretical and modelling debates (as suggested by Malerba et al. and Brenner)?
  • E.H. Carr (1961): history itself is neither simple nor uncontested.
  • Records that exist are fortuitously bequeathed.
  • In-built biases: Yin (1994) notes that verbal reports are subject to problems of bias, poor recall and inaccurate articulation.
  • Missing data
  • Contestability of events, current and past.
  • The process of writing academic history is an open-ended one in which many pieces of data, bequeathed from the past, are filtered by the historian. Some data are accorded the status of facts by the community of historians, but this status is open to review.

20
  • Upshot
  • Need to develop high quality accounts, open to
    critical scrutiny.
  • On the basis of these accounts, guidance is taken
    on particular modelling choices, on parameter
    testing, and output evaluation.
  • In recognising the limitations of any historical
    account, we simultaneously recognise the
    limitations of decisions based on that account.
  • Applies to all historical / empirical
    approaches to modelling

21
  • Further Issues.
  • goal of modelling: what is the advantage of having a very accurate description of one case? (Silverberg's example: a perfect description of the fall of an individual leaf from a tree versus the equations of Brownian motion)
  • unconditional objects and alternative model
    testing (Brock, 1999)
  • alternative methods of sensitivity analysis
  • counterfactuals (Cowan and Foray, 2002)
  • ergodicity (what if the system is non-ergodic?)

22
  • Further Issues (cont.)
  • structural change: the relationship of statistical data to evolutionary models, timing effects and lag structures in simulation models
  • calibration: alternative ways to calibrate initial conditions and parameters
  • Indirect calibration
  • First validate, then indirectly calibrate the model by focusing on the parameter values that are consistent with output validation (see the sketch below).
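A minimal sketch of indirect calibration, assuming a stand-in model and an invented empirical target and tolerance: candidate parameter sets are sampled, each run's output is validated against the target, and only the parameter values consistent with that output validation are retained.

```python
import random

rng = random.Random(7)

def simulate(growth_rate, noise, seed):
    """Stand-in for the real model: returns a simulated industry concentration."""
    local = random.Random(seed)
    return min(1.0, 0.3 + growth_rate + local.gauss(0.0, noise))

EMPIRICAL_CONCENTRATION = 0.65   # hypothetical stylised fact
TOLERANCE = 0.05                 # how close an output must be to pass validation

accepted = []
for i in range(500):
    params = {"growth_rate": rng.uniform(0.0, 0.5), "noise": rng.uniform(0.01, 0.2)}
    output = simulate(params["growth_rate"], params["noise"], seed=i)
    if abs(output - EMPIRICAL_CONCENTRATION) < TOLERANCE:   # output validation
        accepted.append(params)                             # indirectly calibrated set

print(len(accepted), "of 500 parameter sets are consistent with the validated output")
```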

23
  • Werker-Brenner approach
  • Step 1: use existing empirical knowledge to calibrate initial conditions and the ranges of model parameters.
  • Step 2: empirically validate the outputs of each of the model specifications derived from Step 1. This reduces the plausible set of models still further.
  • Step 3: a further round of calibration on the surviving set of models and, where helpful, recourse to expert testimony from historians (so-called methodological abduction). A schematic of the three steps follows below.
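A schematic of the three Werker-Brenner steps under strong simplifying assumptions: the simulate function, parameter range, observed target, and the final plausibility filter (standing in for expert historian testimony) are all invented for illustration.

```python
import random

rng = random.Random(11)

def simulate(entry_rate, seed):
    """Stand-in for one model specification; returns a simulated survival rate."""
    return max(0.0, min(1.0, entry_rate + random.Random(seed).gauss(0.0, 0.1)))

# Step 1: empirical knowledge fixes initial conditions and parameter ranges.
candidates = [{"entry_rate": rng.uniform(0.1, 0.9)} for _ in range(200)]

# Step 2: empirically validate outputs; keep only specifications whose outputs
# match the observed stylised fact (hypothetical target used here).
OBSERVED_SURVIVAL = 0.4
survivors = [c for i, c in enumerate(candidates)
             if abs(simulate(c["entry_rate"], seed=i) - OBSERVED_SURVIVAL) < 0.05]

# Step 3: a further round of narrowing, here standing in for recourse to expert
# (historian) testimony about which surviving specifications are plausible.
plausible = [c for c in survivors if c["entry_rate"] < 0.6]

print(len(candidates), "->", len(survivors), "->", len(plausible))
```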

24
  • Conclusions and Forward Look
  • Rapid development: consistent reappraisal of the boundaries of research, both with respect to the range of phenomena studied and to model content.
  • Development of distinctive features which set neo-Schumpeterian models apart from other models, and which give them a collective coherence.
  • A distinctive view about the type of world in
    which real economic agents operate.
  • An identifiable set of algorithms that make up a neo-Schumpeterian simulation model: a search algorithm, a selection algorithm, and a population of objects in which variation is expressed and on which selection operates.

25
  • Limitations of the early models are starting to
    be addressed in various ways.
  • Malerba et al. have put forward a new methodology
    and a new model structure.
  • BUT key issues remain relating to the use of data, applicable to all historical / empirically based modelling:
  • unconditional objects and alternative model
    testing
  • alternative methods of sensitivity analysis
  • counterfactuals
  • ergodicity and structural change - the
    relationship of statistical data to evolutionary
    models, timing effects and lag structures in
    simulation models, and calibration.