EEL 4930-6 / 5930-5, Spring '06: Physical Limits of Computing (PowerPoint Presentation)

1
EEL 4930-6 / 5930-5, Spring '06: Physical Limits of Computing
http://www.eng.fsu.edu/mpf
  • Slides for a course taught by Michael P. Frank in the Department of Electrical & Computer Engineering

2
Physical Limits of Computing: Course Outline
Currently I am working on writing up a set of course notes based on this outline, intended to someday evolve into a textbook.
  • I. Course Introduction
  • Moore's Law vs. Modern Physics
  • II. Foundations
  • Required Background Material in Computing & Physics
  • III. Fundamentals
  • The Deep Relationships between Physics and Computation
  • IV. Core Principles
  • The Two Revolutionary Paradigms of Physical Computation
  • V. Technologies
  • Present and Future Physical Mechanisms for the Practical Realization of Information Processing
  • VI. Conclusion

3
Part II. Foundations
  • This part of the course quickly reviews some key background knowledge that you will need to be familiar with in order to follow the later material.
  • You may have seen some of this material before.
  • Part II is divided into two chapters:
  • Chapter II.A. The Theory of Information and Computation
  • Chapter II.B. Required Physics Background

4
Chapter II.A. The Theory of Information and Computation
  • In this chapter of the course, we review a few important things that you need to know about:
  • II.A.1. Combinatorics, Probability, & Statistics
  • II.A.2. Information & Communication Theory
  • II.A.3. The Theory of Computation

5
Section II.A.1 Basic Elements of Combinatorics,
Probability, and Statistics
  • Topics covered in this section:
  • Basic Combinatorial Laws
  • Sum and product rules
  • Rules for counting sequences, permutations, and combinations
  • Basic Probability Theory
  • Events, Probabilities, Conditional & Mutual Probabilities
  • Basic Statistical Quantities
  • Expected Value, Variance, Standard Deviation

6
Subsection II.A.1.a Basic Laws of Combinatorics
  • Sum and Product Rules
  • Rules for Counting Sequences, Permutations, and
    Combinations

7
Combinatorics
  • Combinatorics is the mathematical study of how to quickly count the number of ways to combine entities together in a specified fashion.
  • In combinatorics, we are always (explicitly or implicitly) counting the cardinality, or number of elements |X|, of some set X of possibilities, where each possibility is a particular way of combining entities together in the designated fashion.
  • Example problem: How many ways are there to deal out a hand of 5 standard playing cards that are all of the suit clubs (♣)? (Order doesn't matter.)
  • Mathematically, the problem can be interpreted as saying that we are supposed to find the value of |X|, where
  • X = {hands H | H is a set of 5 different cards, all of suit ♣}
  • We'll see how to solve this problem shortly.

8
Sum Rule for Disjoint Unions
  • Theorem (Sum rule): Suppose each possible arrangement is of one of two distinct kinds, and there are y possibilities of the first kind and z of the second kind. Then there are x = y + z total arrangements.
  • Mathematically: Let X = Y ∪ Z and let Y ∩ Z = ∅. Let x = |X|, y = |Y|, z = |Z|. Then x = y + z.
  • Example: My home movie collection consists entirely of comedies and action movies. I own 3 comedies and 4 action movies. None of my movies are action-comedies. How many movies do I have?
  • Answer: 3 + 4 = 7.
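The sum rule can be sanity-checked in a few lines of Python; the movie titles below are invented purely for illustration.

```python
# Sum rule: if X = Y ∪ Z and Y ∩ Z = ∅, then |X| = |Y| + |Z|.
comedies = {"Airplane!", "Clue", "Spaceballs"}         # y = 3 possibilities
action = {"Speed", "Heat", "Ronin", "Taken"}           # z = 4 possibilities
assert comedies.isdisjoint(action)                     # no action-comedies
collection = comedies | action                         # X = Y ∪ Z
print(len(collection))  # 7  (= 3 + 4, by the sum rule)
```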

9
Product Rule for Ordered Pairs
  • Theorem (Product rule): Suppose there is a one-to-one correspondence between the possible arrangements and ordered pairs of entities of two kinds (possibly the same kind), where there are y entities of the first kind and z of the second kind. (The two kinds of entities do not need to be disjoint.) Then there are x = yz total arrangements.
  • Mathematically: Let there be a one-to-one map f: X → Y×Z, where Y×Z = {(a,b) | a ∈ Y, b ∈ Z}. Then |X| = |Y|·|Z|.
  • Example: A meal deal at a certain restaurant consists of a choice of one appetizer and one entrée. The restaurant has 6 different appetizers, 3 entrées, and a bowl of chili which can be served as either an appetizer or an entrée (or as both). How many different meal deals could one order?
  • Answer: (6+1)(3+1) = 7·4 = 28. Note that the sets Y and Z did not need to be disjoint (unlike in the case with the sum rule).
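The meal-deal example can be enumerated directly in Python; the dish names are made up, with "chili" deliberately appearing in both sets to show that Y and Z need not be disjoint.

```python
from itertools import product

# Product rule: arrangements correspond to ordered pairs, so |X| = |Y| * |Z|.
appetizers = {"soup", "salad", "wings", "rolls", "bread", "nachos", "chili"}  # 6 + 1
entrees = {"steak", "pasta", "fish", "chili"}                                 # 3 + 1
meal_deals = set(product(appetizers, entrees))  # ordered pairs (appetizer, entree)
print(len(meal_deals))  # 28 = 7 * 4
```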

10
Exponential Rule for Sequences
This will lead to the logarithmic measure of information (entropy).
  • Theorem (Exponential rule): Suppose the arrangements correspond to sequences of n items, where any of y items could go at each position in the sequence. (Repetition of items is allowed, and the order of items matters.) Then there are x = y^n possible arrangements.
  • Mathematically: |{(a1, a2, …, an) | ∀i: ai ∈ Y}| = |Y|^n.
  • Proof: By repeated application of the product rule.
  • Example: How many different 4-digit PIN numbers are there?
  • Answer: 10^4 = 10,000.
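The PIN example can be checked by generating every length-4 sequence over the 10 digits:

```python
from itertools import product

# Exponential rule: sequences of n positions over y symbols -> y**n arrangements.
digits = "0123456789"                    # y = 10 possible items per position
pins = list(product(digits, repeat=4))   # all 4-digit PINs; repetition allowed
print(len(pins))                         # 10000 = 10**4
```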

11
Rule for Permutations
  • Definition: A k-permutation of a set Y is a sequence of k elements of Y in which no element appears more than once.
  • Theorem (Permutation rule): If |Y| = y, then there are P(y,k) = y!/(y−k)! k-permutations of the set Y.
  • Proof: Using the product rule on a sequence of items drawn from sets of size y, y−1, …, down to y−k+1.
  • Example: A railroad yard has 20 different cars in it. How many ways are there to assemble a train of 5 cars to take away? (If the order of the cars matters.)
  • Answer: 20!/(20−5)! = 20·19·18·17·16 = 1,860,480.
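The permutation rule translates directly into code; the sketch below also cross-checks the factorial formula against the standard library's own counter.

```python
import math

# Permutation rule: P(y, k) = y! / (y - k)!
def permutations_count(y: int, k: int) -> int:
    return math.factorial(y) // math.factorial(y - k)

print(permutations_count(20, 5))  # 1860480 ordered 5-car trains from 20 cars
assert permutations_count(20, 5) == math.perm(20, 5)  # matches stdlib (Python 3.8+)
```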

12
Rule for Combinations
  • Definition: A k-combination of a set Y is a subset consisting of k elements of Y.
  • Theorem (Combination rule): If |Y| = y, then there are C(y,k) = P(y,k)/k! = y!/(k!(y−k)!) k-combinations of Y.
  • Proof: The set of all k-permutations can be partitioned into disjoint subsets, each consisting of the k! different k-permutations of each k-combination.
  • Example: In the previous example, what if the order of cars in the train does not matter?
  • Answer: 20!/(15!·5!) = 15,504.
  • Example: How many hands of 5 clubs are there?
  • Answer: C(13,5) = 13!/(8!·5!) = 1,287.
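Both answers, including the clubs-hand problem posed earlier, follow from the combination rule and can be confirmed with the standard library:

```python
import math

# Combination rule: C(y, k) = P(y, k) / k! = y! / (k! * (y - k)!)
print(math.comb(20, 5))   # 15504 unordered 5-car trains
print(math.comb(13, 5))   # 1287 five-card hands that are all clubs
# Each k-combination accounts for exactly k! of the k-permutations:
assert math.perm(20, 5) == math.comb(20, 5) * math.factorial(5)
```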

13
Subsection II.A.1.b Basic Probability Theory
  • Events, Probabilities, Conditional and Mutual
    Probabilities

14
Events & Probabilities
  • In statistics, an event E is any possible situation (occurrence, state of affairs) that might or might not be the actual situation.
  • The proposition P = "the event E occurred (or will occur)" could turn out to be either true or false.
  • The probability of an event E is a real number p in the range [0,1] which gives our degree of belief in the truth of proposition P, i.e., the proposition that E will/did occur, where:
  • The value p = 0 means that P is false with complete certainty,
  • The value p = 1 means that P is true with complete certainty, and
  • The value p = ½ means that the truth value of P is completely unknown.
  • That is, as far as we know, it is equally likely to be either true or false.
  • The probability p(E) is also the fraction of times that we would expect the event E to occur in a repeated experiment.
  • That is, on average, if the experiment could be repeated infinitely often, and if each repetition was independent of the others.
  • If the probability of E is p, then we would expect E to occur once for every 1/p independent repetitions of the experiment, on average.
  • We'll call 1/p the improbability i of E, and write i(E) = 1/p(E).
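The frequency interpretation can be illustrated by simulation; the probability p = 0.2 and the seed below are arbitrary choices for the sketch, not values from the slides.

```python
import random

# An event with probability p occurs in about a fraction p of many independent
# trials, i.e. about once per i(E) = 1/p trials.
random.seed(42)
p = 0.2
trials = 100_000
occurrences = sum(random.random() < p for _ in range(trials))
print(occurrences / trials)   # close to p = 0.2
print(trials / occurrences)   # close to the improbability i(E) = 1/p = 5
```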

15
Joint Probability
  • Let X and Y be events, and let XY denote the event that X and Y both occur together (that is, jointly).
  • Then p(XY) is called the joint probability of X and Y.
  • Product rule: If X and Y are independent events, then p(XY) = p(X)·p(Y).
  • This follows from basic combinatorics.
  • It can also be considered a definition of what it means for X and Y to be independent.
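The product rule for independent events can be verified by exact enumeration; the two-dice events below are our own illustrative choices, not from the slides.

```python
from fractions import Fraction
from itertools import product

# Two fair dice: 36 equally likely outcomes (d1, d2).
outcomes = list(product(range(1, 7), repeat=2))
def prob(event):  # event: predicate on an outcome
    return Fraction(sum(event(o) for o in outcomes), len(outcomes))

X = lambda o: o[0] == 6          # first die shows 6
Y = lambda o: o[1] % 2 == 0      # second die is even
XY = lambda o: X(o) and Y(o)     # joint event
assert prob(XY) == prob(X) * prob(Y)   # independent, so p(XY) = p(X) p(Y)
print(prob(XY))   # 1/12
```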

16
Event Complements, Mutual Exclusivity,
Exhaustiveness
  • For any event E, its complement Ē is the event that E does not occur.
  • Complement rule: p(E) + p(Ē) = 1.
  • Two events E and F are called mutually exclusive if it is impossible for E and F to occur together.
  • That is, p(EF) = 0.
  • Note that E and Ē are always mutually exclusive.
  • A set S = {E1, E2, …} of events is exhaustive if the event that some event in S occurs has probability 1.
  • Note that S = {E, Ē} is always an exhaustive set.
  • Theorem: The sum of the probabilities of any exhaustive set S of mutually exclusive events is 1.
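A fair die (our illustrative example, not the slides') makes both the complement rule and the theorem concrete: its six face events are mutually exclusive and exhaustive.

```python
from fractions import Fraction

# Six mutually exclusive, exhaustive face events: probabilities sum to 1.
faces = {face: Fraction(1, 6) for face in range(1, 7)}
assert sum(faces.values()) == 1     # theorem for an exhaustive, exclusive set
p_six = faces[6]
p_not_six = 1 - p_six               # complement rule: p(E) + p(Ē) = 1
assert p_six + p_not_six == 1
print(p_not_six)   # 5/6
```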

17
Conditional Probability
  • Let XY be the event that X and Y occur jointly.
  • Then the conditional probability of X given Y is defined by p(X|Y) = p(XY) / p(Y).
  • It is the probability that, if we are given that Y occurs, X also occurs.
  • Bayes' rule: p(X|Y) = p(X) p(Y|X) / p(Y).

[Figure: Venn diagram of the space of possible outcomes, showing event X, event Y, their joint event XY, and the ratio r(XY).]
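Both the definition of conditional probability and Bayes' rule can be checked exactly on a small sample space; the two-dice events here are illustrative choices of ours.

```python
from fractions import Fraction
from itertools import product

# Conditional probability and Bayes' rule on two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
def prob(event):
    return Fraction(sum(event(o) for o in outcomes), len(outcomes))

X = lambda o: o[0] + o[1] == 7        # the total is 7
Y = lambda o: o[0] == 3               # the first die shows 3
p_XY = prob(lambda o: X(o) and Y(o))  # joint probability p(XY)
p_X_given_Y = p_XY / prob(Y)          # definition: p(X|Y) = p(XY) / p(Y)
print(p_X_given_Y)                    # 1/6
p_Y_given_X = p_XY / prob(X)
assert p_X_given_Y == prob(X) * p_Y_given_X / prob(Y)   # Bayes' rule
```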
18
Mutual Probability Ratio
  • The mutual probability ratio of X and Y is defined as r(XY) = p(XY)/(p(X)·p(Y)).
  • Note that r(XY) = p(X|Y)/p(X) = p(Y|X)/p(Y).
  • I.e., r is the factor by which the probability of either X or Y gets boosted upon learning that the other event occurs.
  • WARNING: Many authors define the term "mutual probability" to be the reciprocal of our quantity r.
  • Don't get confused! I call that the mutual improbability ratio.
  • Note that for independent events, r = 1.
  • Whereas for dependent, positively correlated events, r > 1.
  • And for dependent, anti-correlated events, r < 1.
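A pair of positively correlated dice events (again our own illustrative choice) shows r > 1 in action:

```python
from fractions import Fraction
from itertools import product

# Mutual probability ratio r(XY) = p(XY) / (p(X) p(Y)) for two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
def prob(event):
    return Fraction(sum(event(o) for o in outcomes), len(outcomes))

X = lambda o: o[0] + o[1] >= 10   # high total
Y = lambda o: o[0] == 6           # first die is 6
r = prob(lambda o: X(o) and Y(o)) / (prob(X) * prob(Y))
print(r)   # 3  (> 1: learning Y boosts the probability of X threefold)
```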

19
Subsection II.A.1.c Basic Statistical Quantities
  • Norm, Variance, Standard Deviation

20
Expectation Values
  • Let S be an exhaustive set of mutually exclusive events Ei.
  • This is sometimes known as a sample space.
  • Let f(Ei) be any function of the events in S.
  • This is sometimes called a random variable.
  • The expectation value (or expected value, or norm) of f, written Ex[f] or even just ⟨f⟩, is just the mean or average value of f(Ei), as weighted by the probability of each event Ei.
  • WARNING: The "expected" value may actually be quite unexpected, or even impossible to occur!
  • It's not the ordinary English meaning of the word "expected."
  • Expected values combine linearly: ⟨af + g⟩ = a⟨f⟩ + ⟨g⟩.

21
Variance & Standard Deviation
  • The variance of a random variable f is σ²(f) = ⟨(f − ⟨f⟩)²⟩:
  • the expected value of the squared deviation of f from the norm. (The squaring makes it positive.)
  • The standard deviation or root-mean-square (RMS) difference of f is σ(f) = [σ²(f)]^(1/2).
  • This is comparable, in absolute magnitude, to a typical value of |f − ⟨f⟩|.
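Continuing the fair-die illustration from above (our own example, not the slides'), the variance and standard deviation follow directly from the definitions:

```python
from fractions import Fraction

# Variance sigma^2(f) = <(f - <f>)^2> and standard deviation for a fair die roll.
probs = {face: Fraction(1, 6) for face in range(1, 7)}
mean = sum(p * e for e, p in probs.items())                      # <f> = 7/2
variance = sum(p * (e - mean) ** 2 for e, p in probs.items())    # <(f - <f>)^2>
std_dev = float(variance) ** 0.5                                 # sigma = variance**(1/2)
print(variance)              # 35/12
print(round(std_dev, 3))     # about 1.708, a typical size of |f - <f>|
```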