1
Template knowledge models
  • Reusing knowledge model elements

2
Lessons
  • Knowledge models are partially reused in new
    applications
  • Type of task is the main guide for reuse
  • Catalog of task templates
  • small set in this book
  • see also other repositories

3
The need for reuse
  • prevent "re-inventing the wheel"
  • cost/time efficient
  • decreases complexity
  • quality-assurance

4
Task template
  • reusable combination of model elements
  • (provisional) inference structure
  • typical control structure
  • typical domain schema from task point-of-view
  • specific for a task type
  • supports top-down knowledge modeling

5
A typology of tasks
  • range of task types is limited
  • advantage of KE compared to general SE
  • background cognitive science/psychology
  • several task typologies have been proposed in the
    literature
  • typology is based on the notion of system

6
The term system
  • abstract term for the object to which a task is
    applied
  • in technical diagnosis: the artifact or device
    being diagnosed
  • in elevator configuration: the elevator to be
    designed
  • does not need to exist (yet)

7
Analytic versus synthetic tasks
  • analytic tasks
  • system pre-exists
  • it is typically not completely "known"
  • input: some data about the system
  • output: some characterization of the system
  • synthetic tasks
  • system does not yet exist
  • input: requirements about the system to be
    constructed
  • output: constructed system description

8
Task hierarchy
9
Structure of template description in catalog
  • General characterization
  • typical features of a task
  • Default method
  • roles, sub-functions, control structure,
    inference structure
  • Typical variations
  • frequently occurring refinements/changes
  • Typical domain-knowledge schema
  • assumptions about underlying domain-knowledge
    structure

10
Classification
  • establish the correct class for an object
  • object should be available for inspection
  • "natural" objects
  • examples: rock classification, apple
    classification
  • terminology: object, class, attribute, feature
  • one of the simplest analytic tasks; many methods
    exist
  • other analytic tasks are sometimes reduced to a
    classification problem, especially diagnosis

11
Classification pruning method
  • generate all classes to which the object may
    belong
  • specify an object attribute
  • obtain the value of the attribute
  • remove all classes that are inconsistent with
    this value

12
Classification: inference structure
13
Classification method control
  while new-solution generate(object -> candidate) do
    candidate-classes := candidate union candidate-classes
  end while
  while new-solution specify(candidate-classes -> attribute)
        and length candidate-classes > 1 do
    obtain(attribute -> new-feature)
    current-feature-set := new-feature union current-feature-set
    for-each candidate in candidate-classes do
      match(candidate + current-feature-set -> truth-value)
      if truth-value = false
      then candidate-classes := candidate-classes subtract candidate
    end for-each
  end while
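A minimal executable sketch of this pruning loop in Python (illustrative only: the names classify, get_value, and class_features are hypothetical, and attribute selection is simplified to taking the next attribute in line, whereas a real system would use the selection knowledge discussed under the variations below):

    def classify(candidates, attributes, get_value, class_features):
        # Pruning classification: obtain attribute values one by one and
        # remove candidate classes whose stored features contradict them.
        features = {}
        while len(candidates) > 1 and attributes:
            attr = attributes.pop(0)           # 'specify': pick an attribute
            features[attr] = get_value(attr)   # 'obtain': ask user or sensor
            # 'match': keep only classes consistent with all features so far
            candidates = [c for c in candidates
                          if all(class_features[c].get(a, v) == v
                                 for a, v in features.items())]
        return candidates

For instance, with candidates ["granite", "basalt"], the single attribute "colour", and an obtained value "dark", only the class whose stored colour is "dark" survives.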

14
Classification method variations
  • Limited candidate generation
  • Different forms of attribute selection
  • decision tree
  • information theory
  • user control
  • Hierarchical search through class structure

15
Classification domain schema
16
Rock classification
17
Nested classification
18
Rock classification prototype
19
Assessment
  • find decision category for a case based on
    domain-specific norms
  • typical domains: financial applications (loan
    application), community service
  • terminology: case, decision, norms
  • some similarities with monitoring
  • differences:
  • timing: assessment is more static
  • different output: decision versus discrepancy

20
Assessment abstract match method
  • Abstract the case data
  • Specify the norms applicable to the case
  • e.g. rent-fits-income, correct-household-size
  • Select a single norm
  • Compute a truth value for the norm with respect
    to the case
  • See whether this leads to a decision
  • Repeat norm selection and evaluation until a
    decision is reached

21
Assessment: inference structure
  [Figure: inference structure with the inferences
  abstract, specify, select, evaluate, and match over
  the roles case, abstracted case, norms, norm, norm
  value, and decision]
22
Assessment method control
  while new-solution abstract(case-description -> abstracted-case) do
    case-description := abstracted-case
  end while
  specify(abstracted-case -> norms)
  repeat
    select(norms -> norm)
    evaluate(abstracted-case + norm -> norm-value)
    evaluation-results := norm-value union evaluation-results
  until has-solution match(evaluation-results -> decision)
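The same control structure as a compact Python sketch (all names are illustrative stand-ins: abstraction rules are assumed to be functions from case to case, norms boolean functions of the abstracted case, and decide the decision knowledge mapping norm values to a decision):

    def assess(case, abstraction_rules, norms, decide):
        # 'abstract': apply rules until no rule changes the case any more
        changed = True
        while changed:
            changed = False
            for rule in abstraction_rules:
                new_case = rule(case)
                if new_case != case:
                    case, changed = new_case, True
        # 'select' / 'evaluate' / 'match': evaluate norms one by one until
        # the decision knowledge can map the results onto a decision
        results = {}
        for name, norm in norms.items():
            results[name] = norm(case)   # truth value of one norm
            decision = decide(results)   # assumed to return None while undecided
            if decision is not None:
                return decision
        return decide(results)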

23
Assessment control UML notation
24
Assessment method variations
  • norms might be case-specific
  • cf. housing application
  • case abstraction may not be needed
  • knowledge-intensive norm selection
  • random, heuristic, statistical
  • can be key to efficiency
  • sometimes dictated by human expertise
  • only acceptable if done in a way understandable
    to experts

25
Assessment domain schema
26
Claim handling for unemployment benefits
27
Decision rules for claim handling
28
Diagnosis
  • find the fault that causes a system to malfunction
  • example: diagnosis of a copier
  • terminology:
  • complaint/symptom, hypothesis, differential,
    finding(s)/evidence, fault
  • nature of the fault varies:
  • state, chain, component
  • should have some model of system behavior
  • default method: simple causal model
  • sometimes reduced to a classification task:
  • direct associations between symptoms and faults
  • automation feasible in technical domains

29
Diagnosis causal covering method
  • Find candidate causes (hypotheses) for the
    complaint using a causal network
  • Select a hypothesis
  • Specify an observable for this hypothesis and
    obtain its value
  • Verify each hypothesis to see whether it is
    consistent with the new finding
  • Continue this process until a single hypothesis
    is left or no more observables are available

30
Diagnosis: inference structure
31
Diagnosis method control
  while new-solution cover(complaint -> hypothesis) do
    differential := hypothesis add differential
  end while
  repeat
    select(differential -> hypothesis)
    specify(hypothesis -> observable)
    obtain(observable -> finding)
    evidence := finding add evidence
    for-each hypothesis in differential do
      verify(hypothesis + evidence -> result)
      if result = false
      then differential := differential subtract hypothesis
    end for-each
  until length differential =< 1 or no observables left
  faults := differential
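A toy Python version of causal covering (causes, predict, and obtain are hypothetical hooks onto the causal network, the behavioral model, and data acquisition; they are not part of the template itself):

    def diagnose(complaint, causes, observables, obtain, predict):
        # 'cover': every cause of the complaint in the causal network
        differential = set(causes(complaint))
        evidence = []
        while len(differential) > 1 and observables:
            obs = observables.pop(0)              # 'specify' an observable
            evidence.append((obs, obtain(obs)))   # 'obtain' its value
            # 'verify': drop hypotheses that contradict the evidence
            differential = {h for h in differential
                            if all(predict(h, o) == v for o, v in evidence)}
        return differential                       # the fault(s); ideally one left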

32
Diagnosis method variations
  • inclusion of abstractions
  • simulation methods
  • see literature on model-based diagnosis
  • library of Benjamins

33
Diagnosis domain schema
34
Monitoring
  • analyze an ongoing process to find out whether it
    behaves according to expectations
  • terminology:
  • parameter, norm, discrepancy, historical data
  • main features:
  • dynamic nature of the system
  • cyclic task execution
  • output is "just" a discrepancy -> no explanation
  • often coupling of monitoring and diagnosis:
  • output of monitoring is input to diagnosis

35
Monitoring: data-driven method
  • Starts when new findings are received
  • For a finding, a parameter and a norm value are
    specified
  • Comparison of the finding with the norm generates
    a difference description
  • This difference is classified as a discrepancy
    using data from previous monitoring cycles

36
Monitoring inference structure
37
Monitoring method control
  receive(new-finding)
  select(new-finding -> parameter)
  specify(parameter -> norm)
  compare(norm + finding -> difference)
  classify(difference + historical-data -> discrepancy)
  historical-data := finding add historical-data
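One cycle of this method could look as follows in Python; note that the persistence window and the 10%-of-norm threshold are invented assumptions standing in for real classification knowledge:

    def monitor_cycle(new_finding, norms, historical_data):
        # One data-driven monitoring cycle for a numeric parameter.
        parameter, value = new_finding             # 'select' the parameter
        norm = norms[parameter]                    # 'specify' its norm value
        difference = value - norm                  # 'compare'
        # 'classify' using previous cycles: a deviation only counts as a
        # discrepancy if it persists (assumed rule, not from the source)
        recent = [d for p, d in historical_data if p == parameter][-2:]
        discrepancy = all(abs(d) > 0.1 * abs(norm)
                          for d in recent + [difference])
        historical_data.append((parameter, difference))
        return discrepancy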

38
Monitoring method variations
  • model-driven monitoring
  • system has the initiative
  • typically executed at regular points in time
  • example: software project management
  • classification function treated as a task in its
    own right
  • apply classification method
  • add data abstraction inference

39
Prediction
  • analytic task with some synthetic features
  • analyzes current system behavior to construct a
    description of a system state at a future point in
    time
  • example: weather forecasting
  • often a sub-task in diagnosis
  • also found in knowledge-intensive modules of
    teaching systems, e.g. for physics
  • inverse: retrodiction (e.g. big-bang theory)

40
Synthesis
  • Given a set of requirements, construct a system
    description that fulfills these requirements

41
Ideal synthesis method
  • Operationalize requirements
  • preferences and constraints
  • Generate all possible system structures
  • Select sub-set of valid system structures
  • obey constraints
  • Order valid system structures
  • based on preferences
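The ideal method is rarely tractable, since the space of system structures explodes combinatorially, but its generate-select-order skeleton is easy to state. A toy Python sketch, with hypothetical inputs (constraints as predicates, preferences as a scoring function):

    from itertools import product

    def ideal_synthesis(components, n_slots, constraints, preference_score):
        # 'generate' every possible assembly of components over the slots
        structures = product(components, repeat=n_slots)
        # 'select' the valid ones: all hard constraints must hold
        valid = [s for s in structures if all(c(s) for c in constraints)]
        # 'order' the valid structures by preference, best first
        return sorted(valid, key=preference_score, reverse=True)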

42
Synthesis: inference structure
43
Design
  • synthetic task
  • system to be constructed is a physical artifact
  • example: design of a car
  • can include creative design of components
  • creative design is too hard a nut to crack for
    current knowledge technology
  • sub-type of design which excludes creative design
    -> configuration design

44
Configuration design
  • given predefined components, find an assembly
    that satisfies the requirements and obeys the
    constraints
  • example: configuration of an elevator or a PC
  • terminology: component, parameter, constraint,
    preference, requirement (hard/soft)
  • form of design that is well suited for automation
  • computationally demanding

45
Elevator configuration knowledge base reuse
46
Configuration: propose & revise method
  • Simple basic loop:
  • Propose a design extension
  • Verify the new design
  • If verification fails, revise the design
  • Specific domain-knowledge requirements:
  • revise strategies
  • Method can also be used for other synthetic tasks:
  • assignment with backtracking
  • skeletal planning

47
Configuration method decomposition
48
Configuration method control
  operationalize(requirements -> hard-reqs + soft-reqs)
  specify(requirements -> skeletal-design)
  while new-solution propose(skeletal-design + design + soft-reqs -> extension) do
    design := extension union design
    verify(design + hard-reqs -> truth-value + violation)
    if truth-value = false then
      critique(violation + design -> action-list)
      repeat
        select(action-list -> action)
        modify(design + action -> design)
        verify(design + hard-reqs -> truth-value + violation)
      until truth-value = true
  end while
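A condensed Python sketch of propose & revise (propose, verify, and pick_fix are placeholders for the domain-specific propose, verify, and critique/fix knowledge; requires Python 3.8+ for the walrus operator):

    def propose_and_revise(skeletal_design, propose, verify, pick_fix):
        design = {}
        # 'propose' returns the next design extension, or None when complete
        while (extension := propose(skeletal_design, design)) is not None:
            design.update(extension)
            ok, violation = verify(design)        # check hard requirements
            while not ok:                         # 'critique' and 'modify'
                fix = pick_fix(violation, design)
                design = fix(design)
                ok, violation = verify(design)
        return design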

49
Configuration method variations
  • Perform verification plus revision only when a
    value has been proposed for all design elements
  • can have a large impact on the competence of the
    method
  • Avoid the use of fix knowledge
  • Fixes are search heuristics to navigate the
    potentially extensive space of alternative
    designs
  • alternative: chronological backtracking

50
Configuration domain schema
51
Types of configuration may require different
methods
  • Parametric design
  • Assembly is largely fixed
  • Emphasis on finding parameter values that obey
    global constraints and adhere to preferences
  • Example: elevator design
  • Layout
  • Component parameters are fixed
  • Emphasis on constructing the assembly (topological
    relations)
  • Example: mould configuration
  • Literature: Motta (1999), Chandrasekaran (1992)

52
Assignment
  • create a mapping between two sets of objects
  • allocation of offices to employees
  • allocation of airplanes to gates
  • mapping has to satisfy requirements and be
    consistent with constraints
  • terminology:
  • subject, resource, allocation
  • can be seen as a degenerate form of
    configuration design

53
Assignment: method without backtracking
  • Order subject allocation to resources by first
    selecting a sub-set of subjects
  • If necessary, group the subjects into
    subject-groups for joint resource assignment
  • requires a special type of constraints and
    preferences
  • Take a subject(-group) and assign a resource to
    it
  • Repeat this process until all subjects have a
    resource

54
Assignment: inference structure
55
Assignment: method control
  while not empty subjects do
    select-subset(subjects -> subject-set)
    while not empty subject-set do
      group(subject-set -> subject-group)
      assign(subject-group + resources + current-allocations -> resource)
      current-allocations := <subject-group, resource> union current-allocations
      subject-set := subject-set subtract subject-group
      resources := resources subtract resource
    end while
    subjects := subjects subtract subject-set
  end while
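A Python rendering of this loop; select_subset, make_group, and choose_resource are hypothetical carriers of the ordering, grouping, and assignment knowledge that make backtracking unnecessary:

    def assign_all(subjects, resources, select_subset, make_group,
                   choose_resource):
        allocations = {}
        subjects, resources = set(subjects), set(resources)
        while subjects:
            subset = select_subset(subjects)    # pick subjects to handle next
            subjects -= subset
            while subset:
                grp = make_group(subset)        # frozenset assigned jointly
                res = choose_resource(grp, resources, allocations)
                allocations[grp] = res          # commit; no backtracking
                subset -= grp
                resources -= {res}
        return allocations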

56
Assignment: method variations
  • Existing allocations
  • additional input
  • subject-specific constraints and preferences
  • see synthesis and configuration-design

57
Planning
  • shares many features with design
  • main difference: the "system" consists of
    activities plus time dependencies
  • examples: travel planning; planning of building
    activities
  • automation is only feasible if the basic plan
    elements are predefined
  • consider use of the general synthesis method (e.g.
    therapy planning) or the configuration-design
    method

58
Planning method
59
Scheduling
  • Given a set of predefined jobs, each of which
    consists of temporally sequenced activities
    called units, assign all the units to resources
    at time slots
  • example: production scheduling on plant floors
  • Terminology: job, unit, resource, schedule
  • Often done after planning (= specification of
    jobs)
  • Take care: use of the terms "planning" and
    "scheduling" differs

60
Scheduling: temporal dispatching method
  • Specify an initial schedule
  • Select a candidate unit to be assigned
  • Select a target resource for this unit
  • Assign unit to the target resource
  • Evaluate the current schedule
  • Modify the schedule, if needed

61
Scheduling: inference structure
62
Scheduling: method control
  specify(jobs -> schedule)
  while new-solution select(schedule -> candidate-unit) do
    select(candidate-unit + schedule -> target-resource)
    assign(candidate-unit + target-resource -> schedule)
    evaluate(schedule -> truth-value)
    if truth-value = false then
      modify(schedule -> schedule)
  end while
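In Python the dispatching loop might look like this (pick_unit, pick_resource, valid, and repair are assumed hooks onto the scheduling knowledge, not part of the template):

    def dispatch(jobs, resources, pick_unit, pick_resource, valid, repair):
        schedule = {}                               # unit -> resource
        units = [u for job in jobs for u in job]    # 'specify' initial state
        while units:
            unit = pick_unit(units, schedule)       # 'select' candidate unit
            res = pick_resource(unit, resources, schedule)
            schedule[unit] = res                    # 'assign'
            if not valid(schedule):                 # 'evaluate'
                schedule = repair(schedule)         # 'modify'
            units.remove(unit)
        return schedule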

63
Scheduling method variations
  • Constructive versus repair method
  • Refinement often necessary
  • see scheduling literature
  • catalog of Hori (IBM Japan)

64
Scheduling typical domain schema
65
Modeling
  • included for completeness
  • "construction of an abstract description of a
    system in order to explain or predict certain
    system properties or phenomena"
  • examples:
  • construction of a simulation model of a nuclear
    accident
  • knowledge modeling itself
  • seldom automated -> creative steps
  • exception: chip modeling

66
In applications: typical task combinations
  • monitoring + diagnosis
  • Production process
  • monitoring + assessment
  • Nursing task
  • diagnosis + planning
  • Troubleshooting devices
  • classification + planning
  • Military applications

67
Example: apple-pest management
68
Comparison with O-O analysis
  • Reuse of functional descriptions is not common in
    O-O analysis
  • notion of functional object
  • But see work on design patterns
  • strategy patterns
  • templates are patterns of knowledge-intensive
    tasks
  • Only real leverage from reuse if the patterns are
    limited to restricted task types