1
University of Florida
Dept. of Computer & Information Science & Engineering
COT 3100: Applications of Discrete Structures
Dr. Michael P. Frank
  • Slides for a Course Based on the Text Discrete
    Mathematics & Its Applications (5th Edition) by
    Kenneth H. Rosen

2
Module 24: Models of Computation
  • Rosen 5th ed., ch. 11
  • 19 slides

3
Modeling Computation
  • We learned earlier the concept of an algorithm.
  • A description of a computational procedure.
  • Now, how can we model the computer itself, and
    what it is doing when it carries out an
    algorithm?
  • For this, we want to model the abstract process
    of computation itself.

4
Early Models of Computation
  • Recursive Function Theory: Kleene, Church,
    Turing, Post, 1930s
  • Turing Machines: Turing, 1940s
  • RAM Machines: von Neumann, 1940s
  • Cellular Automata: von Neumann, 1950s
  • Finite-state machines, pushdown automata:
    various people, 1950s
  • VLSI models: 1970s
  • Parallel RAMs, etc.: 1980s

5
Concerns of the Earliest Models
  • Computability, Universality
  • How does the model affect the class of problems
    that can be solved at all (given unlimited time)?
  • The model should not restrict this class more
    than needed.
  • It was found that a wide variety of models are
    equally powerful from this perspective ⇒ they
    are universal.
  • Programmability
  • It should be relatively straightforward to
    express any desired algorithm within the model,
  • as instructions for the model machine, in some
    form.
  • The model machine then just follows the
    instructions.

6
Concerns of Some Later Models
  • Computational Complexity
  • The order of growth (OOG) of the time and space
    complexity needed to run a desired algorithm on
    the model should be no greater than is physically
    necessary.
  • Congruence (Physical Realism)
  • The OOG of the time and space complexity to run a
    desired algorithm should also not be modeled as
    being less than is actually physically possible!

If the model is as powerful as is physically
possible, but no more powerful, we dub it UMS
(Universally Maximally Scalable).
7
Real-World Computer Modeling
Modeling areas needed for real Computer Systems
Engineering
  • Logic Devices
  • Technology Scaling
  • Interconnections
  • Synchronization
  • Processor Architecture
  • Capacity Scaling
  • Energy Transfer
  • Programming
  • Error Handling
  • Performance
  • Cost

An efficient and physically realistic model of
computing must accurately address all of these
areas!
8
11.1: Languages & Grammars
  • Phrase-Structure Grammars
  • Types of Phrase-Structure Grammars
  • Derivation Trees
  • Backus-Naur Form

9
Computers as Transition Functions
  • A computer (or really any physical system) can be
    modeled as having, at any given time, a specific
    state s ∈ S drawn from some (finite or infinite)
    state space S.
  • Also, at any time, the computer receives an input
    symbol i ∈ I and produces an output symbol o ∈ O.
  • Where I and O are sets of symbols.
  • Each symbol can encode an arbitrary amount of
    data.
  • A computer can then be modeled as simply being a
    transition function T: S×I → S×O.
  • Given the old state and the input, this tells us
    what the computer's new state and its output will
    be a moment later.
  • Every model of computing we'll discuss can be
    viewed as just a special case of this general
    picture.
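
To make this concrete, here is a minimal Python sketch (my own illustration with assumed names, not from the slides) that represents a transition function T: S×I → S×O as a lookup table for a hypothetical one-bit toggle machine:

# States, inputs, and outputs for a toy one-bit "toggle" machine.
S = {0, 1}
I = {"toggle", "hold"}
O = {"low", "high"}

# T maps (old state, input symbol) -> (new state, output symbol).
T = {
    (0, "toggle"): (1, "high"),
    (0, "hold"):   (0, "low"),
    (1, "toggle"): (0, "low"),
    (1, "hold"):   (1, "high"),
}

def run(T, s0, inputs):
    """Apply the transition function step by step, collecting outputs."""
    s, outputs = s0, []
    for i in inputs:
        s, o = T[(s, i)]
        outputs.append(o)
    return s, outputs

print(run(T, 0, ["toggle", "hold", "toggle"]))   # (0, ['high', 'high', 'low'])

The finite-state machine examples later in this module are just compact, structured ways of specifying tables like this one.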

10
Language Recognition Problem
  • Let a language L be any set of arbitrary
    objects s, which will be dubbed sentences.
  • That is, the "legal" or "grammatically correct"
    sentences of the language.
  • Let the language recognition problem for L be:
  • Given a sentence s, is it a legal sentence of the
    language L?
  • That is, is s ∈ L?
  • Surprisingly, this simple problem is as general
    as our very notion of computation itself!
  • See next slide

11
Computation as Language Recognition
  • Assume the states, input symbols, and output
    symbols of the computation can all be described
    with bit strings.
  • This is true so long as each of the sets S, I, O
    is countable.
  • Then there is a bijection between states/symbols
    and bit strings.
  • Thus, the transition function T can be modeled as
    a Boolean function T: B* → B*, from input bit
    strings that represent (s, i) to output bit
    strings that represent (s, o).
  • Where B = {0, 1}, and B* means the union of Bⁿ
    over all n.
  • Then, each bit b_i of an output string b = T(a)
    can be viewed as the yes/no answer to the language
    recognition problem:

Is a a legal sentence of the language L_i
consisting of all those bit strings whose images
under T have a 1 in the ith position?  (I.e.,
L_i = {a | T(a)_i = 1}.)
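
As a small illustration of this reduction (my own example; the names T and in_L are assumptions, not the authors' code), take T to be bitwise NOT on bit strings and define L_i exactly as above:

def T(a: str) -> str:
    """A toy computation on bit strings: bitwise NOT."""
    return "".join("1" if bit == "0" else "0" for bit in a)

def in_L(i: int, a: str) -> bool:
    """Language-recognition view: is a in L_i, i.e. is bit i of T(a) a 1?"""
    out = T(a)
    return i < len(out) and out[i] == "1"

print(in_L(0, "011"))   # True:  T("011") = "100", and bit 0 is 1
print(in_L(2, "011"))   # False: bit 2 of "100" is 0

Deciding membership in the languages L_i is thus exactly as hard as computing T itself.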
12
Vocabularies and Sentences
  • Remember the concept of strings w of symbols s
    chosen from an alphabet Σ?
  • An alternative terminology for this concept:
  • Sentences s of words w chosen from a vocabulary
    V.
  • No essential difference in concept or notation!
  • Empty sentence (or string): λ (length 0).
  • Set of all sentences over V: denoted V*.

13
Grammars
  • A formal grammar G is any compact, precise
    mathematical definition of a language L.
  • As opposed to just a raw listing of all of the
    language's legal sentences, or just examples of
    them.
  • A grammar implies an algorithm that would
    generate all legal sentences of the language.
  • Often, it takes the form of a set of recursive
    definitions.
  • A popular way to specify a grammar recursively is
    to specify it as a phrase-structure grammar.

14
Phrase-Structure Grammars
  • A phrase-structure grammar (abbr. PSG)
    G = (V, T, S, P) is a 4-tuple, in which:
  • V is a vocabulary (set of words).
  • The template vocabulary of the language.
  • T ⊆ V is a set of words called terminals.
  • Actual words of the language.
  • Also, N = V − T is a set of special words
    called nonterminals.  (Representing concepts like
    "noun".)
  • S ∈ N is a special nonterminal, the start symbol.
  • P is a set of productions (to be defined).
  • Rules for substituting one sentence fragment for
    another.

A phrase-structure grammar is a special case of
the more general concept of a string-rewriting
system, due to Post.
15
Productions
  • A production p ∈ P is a pair p = (b, a) of
    sentence fragments b, a (not necessarily in L),
    which may generally contain a mix of both
    terminals and nonterminals.
  • We often denote the production as b → a.
  • Read "b goes to a" (like a directed graph edge).
  • Call b the "before" string, a the "after" string.
  • It is a kind of recursive definition meaning that
    if lbr ∈ L_T, then lar ∈ L_T.  (L_T = the set of
    sentence templates.)
  • That is, if lbr is a legal sentence template,
    then so is lar.
  • That is, we can substitute a in place of b in any
    sentence template.
  • A phrase-structure grammar imposes the constraint
    that each before string b must contain a
    nonterminal symbol.

16
Languages from PSGs
  • The recursive definition of the language L
    defined by the PSG G = (V, T, S, P):
  • Rule 1: S ∈ L_T.  (L_T is L's template language.)
  • The start symbol is a sentence template (member
    of L_T).
  • Rule 2: ∀(b→a)∈P, ∀l,r∈V*: lbr ∈ L_T → lar ∈ L_T.
  • Any production, after substituting into any
    fragment of any sentence template, yields another
    sentence template.
  • Rule 3: ∀s ∈ L_T: (¬∃n∈N: n ∈ s) → s ∈ L.
  • All sentence templates that contain no
    nonterminal symbols are sentences in L.

Abbreviate the substitution in Rule 2 as lbr ⇒ lar
(read: "lar is directly derivable from lbr").
17
PSG Example English Fragment
  • We have G = (V, T, S, P), where:
  • V = {(sentence), (noun phrase), (verb phrase),
    (article), (adjective), (noun), (verb), (adverb),
    a, the, large, hungry, rabbit, mathematician,
    eats, hops, quickly, wildly}
  • T = {a, the, large, hungry, rabbit,
    mathematician, eats, hops, quickly, wildly}
  • S = (sentence)
  • P = (see next slide)

18
Productions for our Language
  • P = { (sentence) → (noun phrase) (verb phrase),
    (noun phrase) → (article) (adjective) (noun),
    (noun phrase) → (article) (noun),
    (verb phrase) → (verb) (adverb),
    (verb phrase) → (verb),
    (article) → a,           (article) → the,
    (adjective) → large,     (adjective) → hungry,
    (noun) → rabbit,         (noun) → mathematician,
    (verb) → eats,           (verb) → hops,
    (adverb) → quickly,      (adverb) → wildly }
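
As a hedged sketch (my own representation and helper names, not part of the original slides), the same grammar can be written as Python data and used to generate random sentences; nonterminals are tagged with angle brackets purely for convenience:

import random

productions = {
    "<sentence>":    [["<noun phrase>", "<verb phrase>"]],
    "<noun phrase>": [["<article>", "<adjective>", "<noun>"],
                      ["<article>", "<noun>"]],
    "<verb phrase>": [["<verb>", "<adverb>"], ["<verb>"]],
    "<article>":     [["a"], ["the"]],
    "<adjective>":   [["large"], ["hungry"]],
    "<noun>":        [["rabbit"], ["mathematician"]],
    "<verb>":        [["eats"], ["hops"]],
    "<adverb>":      [["quickly"], ["wildly"]],
}

def generate(symbol="<sentence>"):
    """Expand a symbol into a list of terminal words."""
    if symbol not in productions:            # terminal: already a word
        return [symbol]
    body = random.choice(productions[symbol])
    return [word for part in body for word in generate(part)]

print(" ".join(generate()))   # e.g. "the large rabbit hops quickly"

Every sentence this prints is derivable from (sentence) using the productions in P, and hence belongs to the language defined by G.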

19
Backus-Naur Form
  • ⟨sentence⟩ ::= ⟨noun phrase⟩ ⟨verb phrase⟩
  • ⟨noun phrase⟩ ::= ⟨article⟩ [⟨adjective⟩] ⟨noun⟩
  • ⟨verb phrase⟩ ::= ⟨verb⟩ [⟨adverb⟩]
  • ⟨article⟩ ::= a | the
  • ⟨adjective⟩ ::= large | hungry
  • ⟨noun⟩ ::= rabbit | mathematician
  • ⟨verb⟩ ::= eats | hops
  • ⟨adverb⟩ ::= quickly | wildly

Square brackets [ ] mean "optional".
Vertical bars | mean alternatives.
20
A Sample Sentence Derivation
  • (sentence) ⇒ (noun phrase) (verb phrase)
  • ⇒ (article) (adj.) (noun) (verb phrase)
  • ⇒ (art.) (adj.) (noun) (verb) (adverb)
  • ⇒ the (adj.) (noun) (verb) (adverb)
  • ⇒ the large (noun) (verb) (adverb)
  • ⇒ the large rabbit (verb) (adverb)
  • ⇒ the large rabbit hops (adverb)
  • ⇒ the large rabbit hops quickly

On each step, we apply a production to a fragment
of the previous sentence template to get a new
sentence template. Finally, we end up with a
sequence of terminals (real words), that is, a
sentence of our language L.
21
Another Example
  • Let G = ({a, b, A, B, S}, {a, b}, S,
    {S → ABa, A → BB, B → ab, AB → b}),
    with V = {a, b, A, B, S}, T = {a, b}, and the
    productions P listed in the last component.
  • One possible derivation in this grammar is
    S ⇒ ABa ⇒ Aaba ⇒ BBaba ⇒ Bababa ⇒ abababa.

22
Derivability
  • Recall that the notation w0 ⇒ w1 means that
    ∃(b→a)∈P, ∃l,r∈V*: w0 = lbr ∧ w1 = lar.
  • The template w1 is directly derivable from w0.
  • If w0 ⇒ w1 ⇒ w2 ⇒ … ⇒ wn (for some intermediate
    templates w2, …, wn-1), then we write w0 ⇒* wn,
    and say that wn is derivable from w0.
  • The sequence of steps wi ⇒ wi+1 is called a
    derivation of wn from w0.
  • Note that the relation ⇒* is just the transitive
    closure of the relation ⇒.
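
A minimal sketch of derivability as this transitive closure, explored breadth-first (the helper names one_step and derivable are my own; templates are plain strings and productions are (before, after) pairs):

from collections import deque

def one_step(template, productions):
    """All templates directly derivable (in one step) from this one."""
    results = set()
    for before, after in productions:
        pos = template.find(before)
        while pos != -1:
            results.add(template[:pos] + after + template[pos + len(before):])
            pos = template.find(before, pos + 1)
    return results

def derivable(w0, wn, productions, max_steps=10):
    """Is wn derivable from w0 in at most max_steps rewriting steps?"""
    frontier, seen = deque([(w0, 0)]), {w0}
    while frontier:
        w, depth = frontier.popleft()
        if w == wn:
            return True
        if depth < max_steps:
            for nxt in one_step(w, productions):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, depth + 1))
    return False

P = [("S", "aA"), ("S", "b"), ("A", "aa")]   # the example grammar used on an upcoming slide
print(derivable("S", "aaa", P))   # True:  S => aA => aaa
print(derivable("S", "ab", P))    # False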

23
A Simple Definition of L(G)
  • The language L(G) (or just L) that is generated
    by a given phrase-structure grammar G = (V,T,S,P)
    can be defined by L(G) = {w ∈ T* | S ⇒* w}.
  • That is, L is simply the set of strings of
    terminals that are derivable from the start
    symbol.

24
Language Generated by a Grammar
  • Example: Let G = ({S, A, a, b}, {a, b}, S,
    {S → aA, S → b, A → aa}). What is L(G)?
  • Easy! We can just draw a tree of all possible
    derivations.
  • We have S ⇒ aA ⇒ aaa
  • and S ⇒ b.
  • Answer: L = {aaa, b}.

[Derivation tree (also called a parse tree or
sentence diagram): S branches to aA and to b;
aA expands to aaa.]
25
Generating Infinite Languages
  • A simple PSG can easily generate an infinite
    language.
  • Example: S → 11S, S → 0.  (T = {0, 1}.)
  • The derivations are:
  • S ⇒ 0
  • S ⇒ 11S ⇒ 110
  • S ⇒ 11S ⇒ 1111S ⇒ 11110
  • and so on.

L = (11)*0: the set of all strings consisting of
some number of concatenations of 11 with itself,
followed by 0.
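
As a quick sanity check (my own snippet, not from the slides), simulating n applications of S → 11S followed by S → 0 and matching the result against the regular expression (11)*0:

import re

def derive(n):
    """Apply S -> 11S exactly n times, then S -> 0."""
    template = "S"
    for _ in range(n):
        template = template.replace("S", "11S", 1)
    return template.replace("S", "0", 1)

for n in range(4):
    s = derive(n)
    assert re.fullmatch(r"(11)*0", s)   # each derived string is in (11)*0
    print(s)                            # 0, 110, 11110, 1111110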
26
Another example
  • Construct a PSG that generates the language
    L = {0ⁿ1ⁿ | n ∈ ℕ}.
  • 0 and 1 here represent symbols being concatenated
    n times, not integers being raised to the nth
    power.
  • Solution strategy: Each step of the derivation
    should preserve the invariant that the number of
    0s = the number of 1s in the template so far,
    and all 0s come before all 1s.
  • Solution: S → 0S1, S → λ.
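
The same kind of simulation (again my own sketch) checks the solution by walking through the derivation template by template and verifying the stated invariant at every step:

def templates(n):
    """Yield every template in the derivation S => 0S1 => ... => 0^n 1^n."""
    t = "S"
    yield t
    for _ in range(n):
        t = t.replace("S", "0S1", 1)    # apply S -> 0S1
        yield t
    yield t.replace("S", "", 1)         # apply S -> lambda

for t in templates(3):
    body = t.replace("S", "")
    # Invariant: equal numbers of 0s and 1s, and no 1 ever precedes a 0.
    assert body.count("0") == body.count("1") and "10" not in body
    print(t)   # S, 0S1, 00S11, 000S111, 000111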

27
Types of Grammars
  • Venn Diagram of Grammar Types:

[Venn diagram: nested grammar classes, outermost to
innermost: Type 0 (Phrase-structure) ⊃ Type 1
(Context-Sensitive) ⊃ Type 2 (Context-Free) ⊃
Type 3 (Regular).]
28
Defining the PSG Types
  • Type 1: Context-Sensitive PSG:
  • All "after" fragments are either at least as long
    as the corresponding "before" fragments, or
    empty:  |b| ≤ |a|  ∨  a = λ.
  • Type 2: Context-Free PSG:
  • Type 1, and all "before" fragments have length 1
    (a single nonterminal):  |b| = 1  (b ∈ N).
  • Type 3: Regular PSGs:
  • Type 2, and all "after" fragments are either
    single terminals, or a pair of a terminal followed
    by a nonterminal:  a ∈ T  ∨  a ∈ TN.
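
A rough classifier (my own helper, implementing exactly the conditions listed above, with productions given as (before, after) strings and N the set of nonterminal characters):

def grammar_type(productions, nonterminals):
    """Return 3, 2, 1, or 0: the most restrictive type whose condition holds."""
    def type1(b, a):   # context-sensitive: |b| <= |a|, or a is empty
        return len(b) <= len(a) or a == ""
    def type2(b, a):   # context-free: type 1, and b is a single nonterminal
        return type1(b, a) and len(b) == 1 and b in nonterminals
    def type3(b, a):   # regular: type 2, and a is a terminal or terminal+nonterminal
        return type2(b, a) and (
            (len(a) == 1 and a not in nonterminals) or
            (len(a) == 2 and a[0] not in nonterminals and a[1] in nonterminals))
    for t, ok in ((3, type3), (2, type2), (1, type1)):
        if all(ok(b, a) for b, a in productions):
            return t
    return 0

print(grammar_type([("S", "0S1"), ("S", "")], {"S"}))                   # 2 (context-free)
print(grammar_type([("S", "1A"), ("A", "1"), ("S", "0")], {"S", "A"}))  # 3 (regular)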

29
11.2 Finite State Machines with Output
  • Remember our general picture of a computer as
    being a transition function T: S×I → S×O?
  • If the state set S is finite (not infinite), we
    call this system a finite-state machine.
  • If the domain S×I is reasonably small, then we
    can specify T explicitly by writing out its
    complete graph.
  • However, this is practical only for machines that
    have a very small information capacity.

30
Size of FSMs
  • The information capacity of an FSM is C = I(S) =
    log |S|.
  • Thus, if we represent a machine having an
    information capacity of C bits as an FSM, then
    its state transition graph will have |S| = 2^C
    nodes.
  • E.g., suppose your desktop computer has a 512 MB
    memory and a 60 GB hard drive.
  • Its information capacity, including the hard
    drive and memory (and ignoring the CPU's internal
    state), is then roughly 512·2^23 + 60·2^33 =
    519,691,042,816 b.
  • How many states would be needed to write out the
    machine's entire transition function graph?

2^519,691,042,816: a number having more than 150
billion decimal digits!
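
The arithmetic, for reference (binary units assumed, as on the slide):

import math

memory_bits = 512 * 2**23          # 512 MB = 512 * 2^20 bytes * 8 bits/byte
disk_bits   = 60 * 2**33           # 60 GB  = 60  * 2^30 bytes * 8 bits/byte
C = memory_bits + disk_bits
print(C)                           # 519691042816 bits, as on the slide

# Number of decimal digits of |S| = 2^C is floor(C * log10(2)) + 1:
print(int(C * math.log10(2)) + 1)  # about 1.56 * 10^11 digits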
31
One Problem with FSMs as Models
  • The FSM diagram of a reasonably-sized computer is
    more than astronomically huge.
  • Yet, we are able to design and build these
    computers using only a modest amount of
    industrial resources.
  • Why is this possible?
  • Answer: A real computer has regularities in its
    transition function that are not captured if we
    just write out its FSM transition function
    explicitly.
  • I.e., a transition function can have a small,
    simple, regular description, even if its domain
    is enormous.

32
Other Problems with FSM Model
  • It ignores many important physical realities:
  • How is the transition function's structure to be
    encoded in physical hardware?
  • How much hardware complexity is required to do
    this?
  • How close in physical space is one bit's worth of
    the machine's information capacity to another?
  • How long does it take to communicate information
    from one part of the machine to another?
  • How much energy gets dissipated to heat when the
    machine updates its state?
  • How fast can the heat be removed, and how much
    does this impact the machine's performance?

33
Vending Machine Example
  • Suppose a certain vending machine accepts
    nickels, dimes, and quarters.
  • If >30¢ is deposited, change is immediately
    returned.
  • If the "coke" button is pressed, the machine
    drops a coke.
  • Can then accept a new payment.

(Ignore any other buttons, bills, "out of change,"
etc.)
34
Modeling the Machine
  • Input symbol set: I = {nickel, dime, quarter,
    button}.
  • We could add "nothing" or λ as an additional
    input symbol if we want.
  • Representing no input at a given time.
  • Output symbol set: O = {λ, 5¢, 10¢, 15¢, 20¢,
    25¢, coke}.
  • State set: S = {0, 5, 10, 15, 20, 25, 30}.
  • Representing how much money has been taken.
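
Here is a hedged sketch of the machine in code (the helper name step, the use of None for the empty output λ, and the behavior when the button is pressed early are my own assumptions where the slides are silent):

COIN_VALUE = {"nickel": 5, "dime": 10, "quarter": 25}
PRICE = 30

def step(state, symbol):
    """Transition: (state, input symbol) -> (new state, output symbol)."""
    if symbol == "button":
        return (0, "coke") if state == PRICE else (state, None)
    total = state + COIN_VALUE[symbol]
    change = max(0, total - PRICE)        # over 30 cents: return change immediately
    return min(total, PRICE), (change or None)

state = 0
for sym in ["quarter", "dime", "button"]:
    state, out = step(state, sym)
    print(sym, "->", state, out)
# quarter -> 25 None
# dime    -> 30 5       (5 cents in change is returned)
# button  -> 0  coke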

35
Transition Function Table
36
Transition Function Table cont.
37
Another Format: State Table
Each entry shows: new state, output symbol.
38
Directed-Graph State Diagram
  • As you can see, these can get kind of busy.

[State diagram: states 0, 5, 10, 15, 20, 25, 30,
with directed edges labeled by input and output,
e.g., n, d, and q edges for coin inputs (with change
outputs such as q,5; d,5; q,20 where the total would
exceed 30¢), b edges for the button, and b,coke from
state 30.]
39
Formalizing FSMs
  • Just like the general transition-function
    definition from earlier, but with the output
    function separated from the transition function,
    and with the various sets added in, along with an
    initial state.
  • A finite-state machine is a 6-tuple
    M = (S, I, O, f, g, s0), where:
  • S is the state set.
  • I is the alphabet (vocabulary) of input symbols.
  • O is the alphabet (vocabulary) of output symbols.
  • f is the state transition function.
  • g is the output function.
  • s0 is the initial state.
  • Our transition function from before is T = (f, g).
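
A minimal sketch of this 6-tuple in code (the field names and the example parity machine are my own), with f and g kept separate exactly as in the definition:

from dataclasses import dataclass
from typing import Any, Callable, Iterable, List

@dataclass
class FSM:
    S: frozenset                   # state set
    I: frozenset                   # input alphabet
    O: frozenset                   # output alphabet
    f: Callable[[Any, Any], Any]   # state transition function f(s, i) -> s'
    g: Callable[[Any, Any], Any]   # output function g(s, i) -> o
    s0: Any                        # initial state

    def run(self, inputs: Iterable) -> List:
        """Fold an input sequence through the machine, collecting outputs."""
        s, outputs = self.s0, []
        for i in inputs:
            outputs.append(self.g(s, i))
            s = self.f(s, i)
        return outputs

# Example: a machine that outputs the running parity of the 1s seen so far.
parity = FSM(frozenset({0, 1}), frozenset({0, 1}), frozenset({0, 1}),
             f=lambda s, i: s ^ i, g=lambda s, i: s ^ i, s0=0)
print(parity.run([1, 0, 1, 1]))   # [1, 1, 0, 1]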

40
11.3 - Finite-State Machines with No Output
41
11.4 Language Recognition
42
11.5 Turing Machines