Transcript and Presenter's Notes

Title: CSE 221: Probabilistic Analysis of Computer Systems


1
CSE 221: Probabilistic Analysis of Computer Systems
Topics covered: Course outline and schedule, Introduction, Event algebra (Sec. 1.1-1.4)
2
General information
CSE 221: Probabilistic Analysis of Computer Systems
Instructor: Swapna S. Gokhale
Phone: 6-2772
Email: ssg@engr.uconn.edu
Office: ITEB 237
Lecture time: Mon/Fri 11:00 - 12:15 pm
Office hours: By appointment (I will hang around for a few minutes at the end of each class).
Web page: http://www.engr.uconn.edu/ssg/cse221.html (Lecture notes, homeworks, and general announcements will be posted on the web page)
TA: Narasimha Shashidhar
3
Course goals
  • Appreciation and motivation for the study of
    probability theory.
  • Definition of a probability model
  • Application of discrete and continuous random
    variables
  • Computation of expectation and moments
  • Application of discrete and continuous time
    Markov chains.
  • Estimation of parameters of a distribution.
  • Testing hypotheses about distribution parameters

4
Expected learning outcomes
  • Sample space and events
  • Define a sample space (outcomes) of a random
    experiment and identify events of interest and
    independent events on the sample space.
  • Compute conditional and posterior probabilities
    using Bayes rule (see the sketch after this list).
  • Identify and compute probabilities for a sequence
    of Bernoulli trials.
  • Discrete random variables
  • Define a discrete random variable on a sample
    space along with the associated probability mass
    function.
  • Compute the distribution function of a discrete
    random variable.
  • Apply special discrete random variables to
    real-life problems.
  • Compute the probability generating function of a
    discrete random variable.
  • Compute joint pmf of a vector of discrete random
    variables.
  • Determine whether a set of random variables is
    independent.
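
A minimal sketch of the Bayes rule computation named above, in Python. This is not from the course materials; the fault and false-alarm probabilities are assumptions made up for illustration.

  # Hypothetical example: a diagnostic check flags a possibly faulty disk.
  # All numbers are assumed for illustration only.
  p_fault = 0.01              # prior P(fault)
  p_flag_given_fault = 0.95   # P(flag | fault)
  p_flag_given_ok = 0.10      # P(flag | no fault), false-alarm rate

  # Total probability of a flag, then Bayes rule for the posterior.
  p_flag = p_flag_given_fault * p_fault + p_flag_given_ok * (1 - p_fault)
  p_fault_given_flag = p_flag_given_fault * p_fault / p_flag
  print(p_fault_given_flag)   # about 0.088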

5
Expected learning outcomes (contd..)
  • Continuous random variables
  • Define general distribution and density
    functions.
  • Apply special continuous random variables to real
    problems.
  • Define and apply the concepts of reliability,
    conditional failure rate, hazard rate and inverse
    bath-tub curve (see the sketch after this list).
  • Expectation and moments
  • Obtain the expectation, moments and transforms of
    special and general random variables.
  • Stochastic processes
  • Define and classify stochastic processes.
  • Derive the metrics for Bernoulli and Poisson
    processes.
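
A minimal sketch of the reliability and hazard-rate concepts listed above, assuming an exponential lifetime with a made-up failure rate (Python):

  import math

  lam = 0.5   # assumed failure rate per hour, illustration only

  def reliability(t):
      # R(t) = P(lifetime > t) = exp(-lam * t) for an exponential lifetime
      return math.exp(-lam * t)

  def hazard(t, dt=1e-6):
      # h(t) = f(t) / R(t); constant and equal to lam for the exponential
      density = (reliability(t) - reliability(t + dt)) / dt
      return density / reliability(t)

  print(reliability(2.0))   # about 0.368
  print(hazard(2.0))        # about 0.5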

6
Expected learning outcomes (contd..)
  • Discrete time Markov chains
  • Define the state space, state transitions and
    transition probability matrix
  • Compute the steady state probabilities (see the
    sketch after this list).
  • Analyze the performance and reliability of a
    software application based on its architecture.
  • Statistical inference
  • Understand the role of statistical inference in
    applying probability theory.
  • Derive the maximum likelihood estimators for
    general and special random variables.
  • Test two-sided hypotheses concerning the mean of
    a random variable.
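
A minimal sketch of computing DTMC steady-state probabilities by repeated multiplication with the transition probability matrix; the two-state chain below is an assumed example, not from the course (Python):

  # Two-state DTMC, e.g. a link that is up (state 0) or down (state 1).
  # Transition probabilities are assumed for illustration.
  P = [[0.9, 0.1],
       [0.4, 0.6]]

  pi = [1.0, 0.0]          # start in state 0
  for _ in range(200):     # power iteration: pi <- pi * P
      pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

  print(pi)   # approaches the steady state [0.8, 0.2]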

7
Expected learning outcomes (contd..)
  • Continuous time Markov chains
  • Define the state space, state transitions and
    generator matrix.
  • Compute the steady state or limiting
    probabilities.
  • Model real-world phenomena as birth-death
    processes and compute limiting probabilities.
  • Model real-world phenomena as pure birth and
    pure death processes.
  • Model and compute system availability (see the
    sketch after this list).
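
A minimal sketch of the availability computation mentioned above, for an assumed two-state (up/down) CTMC with failure rate lam and repair rate mu; its steady-state availability is mu / (lam + mu):

  # Rates are assumed for illustration only.
  lam = 0.001   # failure rate (per hour)
  mu = 0.1      # repair rate (per hour)

  # Balance equation for the two-state chain: lam * pi_up = mu * pi_down,
  # together with pi_up + pi_down = 1.
  availability = mu / (lam + mu)   # = pi_up, the long-run fraction of up time
  print(availability)              # about 0.990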

8
Textbooks
  • Required textbook
  • K. S. Trivedi, Probability and Statistics with
    Reliability, Queuing and Computer Science
    Applications, Second Edition, John Wiley.

9
Course topics
  • Introduction (Ch. 1, Sec. 1.1-1.5, 1.7-1.11)
  • Sample space and events, Event algebra,
    Probability axioms, Combinatorial problems,
    Independent events, Bayes rule, Bernoulli trials
  • Discrete random variables (Ch. 2, Sec. 2.1-2.4,
    2.5.1-2.5.3, 2.5.5,2.5.7,2.7-2.9)
  • Definition of a discrete random variable,
    Probability mass and distribution functions,
    Bernoulli, Binomial, Geometric, Modified
    Geometric, Poisson, and Uniform pmfs, Probability
    generating function, Discrete random vectors,
    Independent events.
  • Continuous random variables (Ch. 3, Sec. 3.1-3.3,
    3.4.6,3.4.7)
  • Probability density function and cumulative
    distribution functions, Exponential and uniform
    distributions, Reliability and failure rate,
    Normal distribution

10
Course topics (contd..)
  • Expectation (Ch. 4, Sec. 4.1-4.4, 4.5.2-4.5.7)
  • Expectation of single and multiple random
    variables, Moments and transforms
  • Stochastic processes (Ch. 6, Sec. 6.1, 6.3 and
    6.4)
  • Definition and classification of stochastic
    processes, Bernoulli and Poisson processes.
  • Discrete time Markov chains (Ch. 7, Sec.
    7.1-7.3)
  • Definition, transition probabilities, steady
    state concept. Application of discrete time
    Markov chains to software performance and
    reliability analysis
  • Statistical inference (Ch. 10, Sec. 10.1, 10.2.2,
    10.3.1)
  • Motivation, Maximum likelihood estimates for the
    parameters of Bernoulli, Binomial, Geometric,
    Poisson, Exponential and Normal distributions,
    Parameter estimation of Discrete Time Markov
    Chains (DTMCs), Hypothesis testing (an MLE sketch
    follows this list).
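
A minimal sketch of a maximum likelihood estimate for assumed exponential data; the MLE of the exponential rate is the reciprocal of the sample mean (Python):

  import random

  random.seed(1)
  true_rate = 2.0                                   # assumed, for illustration
  data = [random.expovariate(true_rate) for _ in range(10000)]

  # MLE for the exponential rate: lambda_hat = n / sum(x_i)
  rate_hat = len(data) / sum(data)
  print(rate_hat)   # close to 2.0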

11
Course topics (contd..)
  • Continuous time Markov chains (Ch. 8, Sec.
    8.1-8.3, 8.4.1)
  • Definition, Generator matrix, Computation of
    steady state/limiting probabilities, Birth-death
    process, M/M/1 and M/M/m queues, Pure birth and
    pure death process, Availability analysis (an
    M/M/1 sketch follows this list).
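
A minimal sketch of the standard M/M/1 steady-state formulas; the arrival and service rates below are assumed for illustration:

  lam = 4.0   # arrival rate (assumed)
  mu = 5.0    # service rate (assumed); stability requires lam < mu

  rho = lam / mu         # server utilization
  p0 = 1 - rho           # probability the system is empty
  L = rho / (1 - rho)    # mean number in the system
  W = 1 / (mu - lam)     # mean time in the system
  print(rho, p0, L, W)   # approximately 0.8 0.2 4.0 1.0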

12
Course topics and exams calendar
Week 1 (Jan. 21)
  1. Jan. 25: Logistics, Introduction, Sample space, Events, Event algebra
Week 2 (Jan. 28)
  2. Jan. 28: Probability axioms, combinatorial problems
  3. Feb. 1: Conditional probability, Independent events, Bayes rule, Bernoulli trials
Week 3 (Feb. 4)
  4. Feb. 4: Discrete random variables, Probability mass and distribution functions
  5. Feb. 8: Special discrete distributions
Week 4 (Feb. 11)
  6. Feb. 11: Poisson pmf, Uniform pmf, Probability generating function
  7. Feb. 15: Discrete random vectors, Independent random variables
Week 5 (Feb. 18)
  8. Feb. 18: Continuous random variables, Uniform and Normal distributions
  9. Feb. 22: Exponential distribution, reliability and failure rate

13
Course topics and exams calendar (contd..)
Week 6 (Feb. 25)
  10. Feb. 25: Expectations of random variables, moments
  11. Feb. 29: Multiple random variables, transform methods
Week 7 (Mar. 3)
  12. Mar. 3: Moments and transforms of special distributions
  13. Mar. 7: Stochastic processes, Bernoulli and Poisson processes
Week 8 (Mar. 10)
  Spring break, no class
Week 9 (Mar. 17)
  14. Mar. 17: Discrete time Markov chains
  15. Mar. 21: Discrete time Markov chains (contd..)
Week 10 (Mar. 24)
  16. Mar. 24: Analysis of software reliability and performance
  17. Mar. 28: Statistical inference
Week 11 (Mar. 31)
  18. Mar. 31: Statistical inference (contd..)
  19. Apr. 4: Confidence intervals

14
Course topics and exams calendar (contd..)
Week 12 (Apr. 7)
  20. Apr. 7: Hypothesis testing
  21. Apr. 11: Hypothesis testing (contd..)
Week 13 (Apr. 13)
  Apr. 14: No class
  22. Apr. 18: Continuous time Markov chains
Week 14 (Apr. 20)
  23. Apr. 21: Simple queuing models
  24. Apr. 25: Pure death processes, availability models
Week 15 (Apr. 27)
  Apr. 27: Make-up class
  May 2: Final exam handed out

15
Assignment/Homework logistics
  • There will be one homework based on each topic
    (approximately)
  • One week will be allocated to complete each
    homework
  • Homeworks will not be graded, but I encourage you
    to do them since the exam problems will be similar
    to the homework problems.
  • Solutions to each homework will be provided after
    a week.
  • The homework schedule is as follows:
  • HW 1 (Handed Feb. 1, Lectures 1-3 )
  • HW 2 (Handed Feb. 15, Lectures 4 - 7)
  • HW 3 (Handed Feb. 22, Lectures 8 - 9)
  • HW 4 (Handed Mar 2, Lectures 10 - 12 )
  • HW 5 (Handed Mar. 24, Lectures 13 - 16)
  • HW 6 (Handed Apr. 11, Lectures 17 - 21)
  • HW 7 (Handed Apr. 25, Lectures 22 - 24)

16
Exam logistics
  • Exams will have problems similar to those in the
    homeworks.
  • Exam I (Feb. 29)
  • Lectures 1 through 9
  • Exam II (Apr. 11)
  • Lectures 10 through 19
  • Exams will be take-home.

17
Project logistics
  • The project will be handed out in the first week
    of April and will be due in the last week of
    classes.
  • 2-3 problems
  • Experimenting with design options to explore
    tradeoffs and to determine which system has
    better performance/reliability etc.
  • Parameter estimation, hypothesis testing with
    real data.
  • May involve some programming (can be done using
    Java, Matlab etc.)
  • Project report must describe
  • Approach used to solve the problem.
  • Results and analysis.

18
Grading system
Homeworks: 0 - Ungraded homeworks.
Midterms: 30 - Two midterms, 15 per midterm.
Project: 25 - Two to three problems.
Final: 45 - Heavy emphasis on the final.
19
Attendance policy
  • Attendance not mandatory.
  • Attending classes helps!
  • Many examples and derivations (not in the book)
    are covered in class.
  • Problems and examples covered in class are fair
    game for the exams.
  • Not everything is in the lecture notes.

20
Feedback
Please provide informal feedback early and often,
before the formal review process.
21
Introduction and motivation
  • Why study probability theory?
  • Answer questions such as

22
Probability model
  • Examples of random/chance phenomena
  • What is a probability model?

23
Sample space
  • Definition
  • Example: Status of a computer system
  • Example: Status of two components (CPU, Memory)
  • Example: Outcomes of three coin tosses (see the
    sketch below)
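
A minimal sketch enumerating the eight-outcome sample space for the three-coin-toss example, in Python:

  from itertools import product

  # Sample space of three coin tosses: every length-3 sequence of H and T.
  sample_space = [''.join(seq) for seq in product('HT', repeat=3)]
  print(sample_space)        # ['HHH', 'HHT', 'HTH', 'HTT', 'THH', 'THT', 'TTH', 'TTT']
  print(len(sample_space))   # 8 outcomes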

24
Types of sample space
  • Based on the number of elements in the sample
    space
  • Example: Coin toss
  • Countably finite/infinite
  • Countably infinite

25
Events
  • Definition of an event
  • Example: Sequence of three coin tosses
  • Example: System up

26
Events (contd..)
  • Universal event
  • Null event
  • Elementary event

27
Example
  • Sequence of three coin tosses
  • Event E1: at least two heads
  • Complement of event E1: at most one head (zero
    or one head)
  • Event E2: at most two heads

28
Example (contd..)
  • Event E3: Intersection of events E1 and E2
  • Event E4: First coin toss is a head
  • Event E5: Union of events E1 and E4
  • Mutually exclusive events (see the sketch below)
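
A minimal sketch of the coin-toss events above, represented as subsets of the eight outcomes, in Python:

  from itertools import product

  omega = {''.join(s) for s in product('HT', repeat=3)}

  E1 = {w for w in omega if w.count('H') >= 2}   # at least two heads
  E2 = {w for w in omega if w.count('H') <= 2}   # at most two heads
  E4 = {w for w in omega if w[0] == 'H'}         # first toss is a head

  E1_complement = omega - E1    # at most one head
  E3 = E1 & E2                  # intersection of E1 and E2: exactly two heads
  E5 = E1 | E4                  # union of E1 and E4

  # E1 and its complement are mutually exclusive: their intersection is empty.
  print(E1 & E1_complement)     # set()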

29
Example (contd..)
  • Collectively exhaustive events
  • Defining each sample point to be an event