Randomized Computation - PowerPoint PPT Presentation

Transcript and Presenter's Notes



1
Randomized Computation
  • Roni Parshani 025529199
  • Orly Margalit 037616638
  • Eran Mantzur 028015329
  • Avi Mintz 017629262

2
RP - Random Polynomial Time
  • Notation
  • L is a language
  • M is a probabilistic polynomial-time Turing
    machine
  • Definition
  • L ∈ RP if ∃M such that
  • x ∈ L ⇒ Prob[M(x) = 1] ≥ ½
  • x ∉ L ⇒ Prob[M(x) = 0] = 1

3
RP - Random Polynomial Time
  • The disadvantage of RP (coRP) is that when the
    input doesn't belong to the language (does belong
    to the language) the machine must return the
    correct answer at all times.
  • Definition (the characteristic function of L)
  • x ∈ L ⇒ χL(x) = 1
  • x ∉ L ⇒ χL(x) = 0

4
RP ⊆ NP
  • Proof
  • Given L ∈ RP
  • Aim L ∈ NP
  • L ∈ RP
  • ⇒ ∀x ∈ L ∃M such that more than half of the coin
    strings y give M(x,y) = 1 ⇒ ∃y M(x,y) = 1
  • ⇒ ∀x ∉ L ∀y M(x,y) = 0
  • ⇒ L ∈ NP (the coin string y serves as the witness)
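The containment can be illustrated in code: the same machine table is read once as an RP machine (majority of coin strings) and once as an NP verifier (existence of a witness). The machine below is a hypothetical toy for a trivial language (even numbers), not from the slides; it is a minimal sketch of the argument.

```python
from itertools import product

M_COINS = 3  # number of coin tosses, doubling as witness bits

def M(x, y):
    """Toy RP machine (hypothetical): decides the language of even
    numbers. For even x it answers 1 unless all coins are 0, so
    Prob[M(x) = 1] = 1 - 2**-M_COINS >= 1/2; for odd x it always
    answers 0 (one-sided error)."""
    return 1 if x % 2 == 0 and any(y) else 0

def rp_accepts(x):
    # RP view: at least half of the coin strings answer 1
    ones = sum(M(x, y) for y in product([0, 1], repeat=M_COINS))
    return 2 * ones >= 2 ** M_COINS

def np_accepts(x):
    # NP view: accept iff some coin string (witness) makes M answer 1
    return any(M(x, y) for y in product([0, 1], repeat=M_COINS))

for x in range(6):
    assert rp_accepts(x) == np_accepts(x) == (x % 2 == 0)
```

Since an RP machine accepts x ∈ L on more than half of the coin strings, at least one accepting string exists, which is exactly the NP condition.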

5
coRP - Complementary Random Polynomial Time
  • Definition
  • L ∈ coRP if ∃M such that
  • x ∈ L ⇒ Prob[M(x) = 1] = 1
  • x ∉ L ⇒ Prob[M(x) = 0] ≥ ½
  • An alternative way to define coRP is
  • coRP = { L : the complement of L is in RP }
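A natural example of this kind of one-sided error is the Fermat primality test (not from the slides; a sketch for illustration): a prime always passes, while a composite that is not a Carmichael number fails for at least half of the bases, so repetition drives the error down. The bases are passed in explicitly to keep the sketch deterministic.

```python
def fermat_test(n, bases):
    """coRP-style one-sided test: if n is prime, every base passes and
    we always answer True; if n is composite (and not a Carmichael
    number), at least half of the bases a satisfy pow(a, n-1, n) != 1,
    so each random base catches the compositeness with prob >= 1/2."""
    if n < 3:
        return n == 2
    return all(pow(a, n - 1, n) == 1 for a in bases if a % n != 0)
```

For example, `fermat_test(97, [2, 3, 5])` returns `True` (97 is prime, so no base can fail), while `fermat_test(15, [2])` returns `False` since 2^14 mod 15 = 4 ≠ 1.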

6
coRP ⊆ co-NP
  • Proof
  • Given L ∈ coRP
  • Aim L ∈ co-NP
  • L ∈ coRP ⇒ L̄ ∈ RP ⊆ NP ⇒ L ∈ co-NP

7
RP1
  • p(.) is a positive polynomial
  • Definition
  • L ∈ RP1 if ∃M, p(.) such that
  • x ∈ L ⇒ Prob[M(x,r) = 1] ≥ 1/p(|x|)
  • x ∉ L ⇒ Prob[M(x,r) = 0] = 1

8
RP2
  • p(.) is a positive polynomial
  • Definition
  • L ∈ RP2 if ∃M, p(.) such that
  • x ∈ L ⇒ Prob[M(x,r) = 1] ≥ 1 − 2^(−p(|x|))
  • x ∉ L ⇒ Prob[M(x,r) = 0] = 1

9
RP1 = RP2 = RP
  • Aim RP1 = RP2
  • RP2 ⊆ RP1
  • ⇒ for every large enough x
  • 1/p(|x|) < 1 − 2^(−p(|x|))
  • so an RP2 machine already meets the RP1 threshold.

10
RP1 = RP2 = RP
  • RP1 ⊆ RP2
  • L ∈ RP1 ⇒ ∃M, p(.) such that
  • ∀x ∈ L  Prob[M(x,r) = 1] ≥ 1/p(|x|)
  • We run M(x,r) t(|x|) times
  • If in any of the runs M(x,r) = 1 ⇒ output is 1
  • If in all of the runs M(x,r) = 0 ⇒ output is 0

11
RP1 ⊆ RP2
  • Select t(|x|) = p²(|x|)
  • Then if x ∉ L ⇒ output is 0
  • If x ∈ L, the output is 0 only if
    M(x,r) = 0 in all t(|x|) runs
  • (Prob[M(x,r) = 0])^t(|x|) ≤ (1 − 1/p(|x|))^(p²(|x|))
  • ≤ e^(−p(|x|)) < 2^(−p(|x|))

12
RP1 ⊆ RP2
  • ⇒ So the probability of outputting 1 is larger
    than 1 − 2^(−p(|x|))
  • ⇒ L ∈ RP2
  • Conclusion
  • RP1 ⊆ RP2 ⊆ RP ⊆ RP1
  • Therefore RP1 = RP = RP2
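The amplification step above can be checked with exact rational arithmetic: t(|x|) = p²(|x|) repetitions push the failure probability of a success-probability-1/p machine below 2^(−p). This is a numeric sketch of the bound, not part of the original slides.

```python
from fractions import Fraction

def failure_prob(p, t):
    """Exact probability that a machine succeeding with probability 1/p
    answers 0 in all t independent runs: (1 - 1/p)**t."""
    return (1 - Fraction(1, p)) ** t

# t = p*p repetitions suffice: (1 - 1/p)**(p*p) < 2**-p for p >= 2
for p in range(2, 30):
    assert failure_prob(p, p * p) < Fraction(1, 2 ** p)
```

Using `Fraction` avoids floating-point rounding, so the inequality is verified exactly rather than approximately.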

13
BPP - Bounded Probability Polynomial Time
  • Definition
  • L ∈ BPP if ∃M such that
  • x ∈ L ⇒ Prob[M(x) = 1] ≥ ⅔
  • x ∉ L ⇒ Prob[M(x) = 1] < ⅓
  • In other words
  • ∀x Prob[M(x) = χL(x)] ≥ ⅔

14
coBPP = BPP
  • coBPP = { L : the complement of L is in BPP }
  • L ∈ BPP ⇒ ∃M  Prob[M(x) = χL(x)] ≥ ⅔
  • ⇒ Prob[M̄(x) = χL̄(x)] ≥ ⅔
  • where M̄(.) = 1 − M(.)
  • ⇒ L̄ ∈ BPP
  • (M(.) exists iff M̄(.) exists)

15
BPP1
  • Previously we defined stricter and weaker
    definitions for RP; in a similar way we will
    do so for BPP.
  • Notation
  • p(.) - positive polynomial
  • f - polynomial-time computable function
  • Definition
  • L ∈ BPP1 if ∃M, p(.), f such that
  • x ∈ L ⇒ Prob[M(x) = 1] ≥ f(x) + 1/p(|x|)
  • x ∉ L ⇒ Prob[M(x) = 1] ≤ f(x) − 1/p(|x|)

16
BPP = BPP1
  • Proof
  • Aim BPP ⊆ BPP1
  • Take f(x) = ½ and p(x) = 6
  • (½ + 1/6 = ⅔ and ½ − 1/6 = ⅓)
  • This recovers the original definition of BPP.

17
BPP = BPP1
  • Proof
  • Aim BPP1 ⊆ BPP
  • L ∈ BPP1 ⇒ ∃M such that
  • ∀x ∈ L  Prob[M(x) = 1] ≥ f(x) + 1/p(|x|)
  • ∀x ∉ L  Prob[M(x) = 1] ≤ f(x) − 1/p(|x|)

18
BPP1 ⊆ BPP
  • Let q = Prob[M(x) = 1]. We want to decide, with
    probability > ⅔,
  • whether 0 ≤ q ≤ f(x) − 1/p(|x|)
  • or f(x) + 1/p(|x|) ≤ q ≤ 1
  • Define M′: run M(x) n times and take the average
    of the returned answers
  • If the average > f(x), M′ returns YES, else NO

19
BPP1 ⊆ BPP
  • Calculation of n
  • The n runs are independent Bernoulli variables
    with expectation q, so by Chernoff's bound
  • Prob[ |average − q| > 1/p(|x|) ] < 2e^(−2n/p²(|x|))

20
BPP1 ⊆ BPP
  • Choose n = 2p²(|x|); then the error probability
    is at most 2e^(−4) < ⅓
  • Result: M′ decides L with Prob > ⅔

21
BPP2
  • Notation
  • p(.) - positive polynomial
  • Definition
  • L ∈ BPP2 if ∃M, p(.) such that
  • ∀x Prob[M(x) = χL(x)] ≥ 1 − 2^(−p(|x|))

22
BPP2 ⊆ BPP
  • Proof
  • Aim BPP2 ⊆ BPP
  • For p(.) = 2 we get 1 − 2^(−2) = ¾ ≥ ⅔
  • so the machine satisfies the original definition
    of BPP.

23
BPP ⊆ BPP2
  • Proof
  • Aim BPP ⊆ BPP2
  • L ∈ BPP ⇒ ∃M ∀x Prob[M(x) = χL(x)] ≥ ⅔
  • Define M′: run M(x) n times and take the average
    of the returned answers
  • If the average > ½, M′ returns YES, else NO
  • We know Exp[M(x)] ≥ ⅔ if x ∈ L
  • Exp[M(x)] < ⅓ if x ∉ L

24
BPP ⊆ BPP2
  • Chernoff's Bound
  • Let X1, X2, ..., Xn be independent Bernoulli
    variables with the same expectation p ≤ ½,
    and let ε satisfy 0 < ε ≤ p(1 − p)
  • (for p > ½ apply the bound to 1 − Xi)
  • Then
  • Prob[ |ΣXi/n − p| > ε ] < 2e^(−ε²n / (2p(1−p)))

25
BPP ⊆ BPP2
  • From Chernoff's bound, with ε = 1/6 and
    p(1−p) ≤ ¼
  • ⇒ Prob[ |M′(x) − Exp[M(x)]| > 1/6 ] < 2e^(−n/18)
  • But if |M′(x) − Exp[M(x)]| ≤ 1/6
  • ⇒ then M′ returns a correct answer

26
BPP ⊆ BPP2
  • ⇒ Prob[M′(x) = χL(x)] ≥ 1 − 2e^(−n/18)
  • ⇒ for every polynomial p(.) we can choose
    n = 18(p(|x|) + 1), so that
  • ⇒ Prob[M′(x) = χL(x)] ≥ 1 − 2^(−p(|x|))
  • ⇒ L ∈ BPP2
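The majority-vote amplification used in this proof can be computed exactly rather than bounded: the error of the majority of n independent runs of a ⅔-correct machine decays exponentially in n. This numeric check is an illustration added here, not part of the slides.

```python
from fractions import Fraction
from math import comb

def majority_correct(n, p):
    """Exact probability (as a Fraction) that the majority of n
    independent runs of a machine that is correct with probability p
    gives the right answer, for odd n."""
    p = Fraction(p)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# One run is correct with probability exactly 2/3 ...
assert majority_correct(1, Fraction(2, 3)) == Fraction(2, 3)
# ... while ~120 runs already push the error well below 2**-8.
assert 1 - majority_correct(121, Fraction(2, 3)) < Fraction(1, 2 ** 8)
```

Because the computation is exact, the comparison against 2^(−8) is a genuine verification of the amplification claim for this n, not a floating-point estimate.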

27
RP ⊆ BPP
  • Proof
  • L ∈ RP if ∃M such that
  • x ∈ L ⇒ Prob[M(x) = 1] ≥ ½
  • x ∉ L ⇒ Prob[M(x) = 0] = 1
  • We previously proved BPP = BPP1
  • If we instantiate the BPP1 definition with
  • f(.) = ¼ and p(.) = 4
  • (¼ + ¼ = ½ and ¼ − ¼ = 0)
  • this gives the original definition of RP.

28
P ⊆ BPP
  • Proof
  • L ∈ P ⇒ ∃M such that M(x) = χL(x)
  • ⇒ ∀x Prob[M(x) = χL(x)] = 1 ≥ ⅔
  • ⇒ L ∈ BPP

29
PSPACE
  • Definition
  • L ∈ PSPACE if ∃M such that M(x) = χL(x)
  • and ∃p such that M uses p(|x|) space.
  • (No time restriction)

30
PP - Probability Polynomial Time
  • Definition
  • L ∈ PP if ∃M such that
  • x ∈ L ⇒ Prob[M(x) = 1] > ½
  • x ∉ L ⇒ Prob[M(x) = 0] > ½
  • In other words
  • ∀x Prob[M(x) = χL(x)] > ½

31
PP ⊆ PSPACE
  • Definition (reminder)
  • L ∈ PP if ∃M such that
  • ∀x Prob[M(x) = χL(x)] > ½
  • Proof
  • L ∈ PP ⇒ ∃M, p(.) such that
  • ∀x Prob[M(x,r) = χL(x)] > ½
  • and M is polynomial time.
  • If we run M on every r, M is correct more than
    50% of the time.

32
PP ⊆ PSPACE
  • Aim L ∈ PSPACE
  • Run M on every single r.
  • Count the number of 1 and 0 answers received.
  • The correct answer is the majority result.

33
PP ⊆ PSPACE
  • By the definition of PP, for every L ∈ PP this
    algorithm is always correct.
  • M(x,r) is polynomial in space, and the space of
    each run can be reused, keeping only the counter
    and the current r
  • ⇒ the new algorithm is polynomial in space
  • ⇒ L ∈ PSPACE
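The enumerate-and-count derandomization above can be sketched directly. The toy machine below is hypothetical (it answers 1 with probability (x+1)/8 over 3 coins, which exceeds ½ exactly when x ≥ 4); only the `pp_decide` loop mirrors the proof.

```python
from itertools import product

def pp_decide(M, x, m):
    """PSPACE-style decision for a PP machine M with m coin tosses:
    enumerate every coin string r and take the majority vote.  The
    memory needed is one run of M plus a counter, not 2**m runs."""
    ones = sum(M(x, r) for r in product([0, 1], repeat=m))
    return 1 if 2 * ones > 2 ** m else 0

def toy_M(x, r):
    """Hypothetical PP-style machine: accepts iff the 3 coin bits,
    read as a binary number, are below x + 1, so
    Prob[toy_M(x) = 1] = (x + 1) / 8."""
    return 1 if int("".join(map(str, r)), 2) < x + 1 else 0

# Majority over all coin strings recovers the underlying language.
assert [pp_decide(toy_M, x, 3) for x in range(8)] == [0, 0, 0, 0, 1, 1, 1, 1]
```

The exponential cost shows up as time (2^m runs), which is exactly why the result lands in PSPACE rather than P.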

34
Claim PP = PP1, where PP1 relaxes the PP condition
for x ∉ L to Prob[M(x) = 0] ≥ ½ (for x ∈ L it still
demands Prob[M(x) = 1] > ½). If we have a machine
that satisfies PP, it also satisfies PP1 (since PP
is stricter than PP1: PP demands strictly greater
than ½ on both sides, while PP1 demands only
greater than or equal to ½ on the NO side), so
clearly
PP ⊆ PP1
35
Let L be a language in PP1, decided by a machine M.
Motivation: the trick is to build a machine that
shifts the answer of M towards the NO direction
with a very small probability, smaller than the
smallest probability difference that M could have.
So if M is biased towards YES, our shift will not
change the direction of the bias; but if there is
no bias (or a bias towards NO), our shift will give
us a bias towards the NO answer.
36
Proof: let M′ be defined as follows.
37

M′ chooses one of two moves.
  • With probability δ = 2^(−(p(|x|)+1)) return NO
  • With probability 1 − δ invoke M
  • (M tosses at most p(|x|) coins, so δ is smaller
    than 2^(−p(|x|)), the smallest possible bias of M)

38
If x ∈ L then Prob[M(x) = 1] > ½, and since every
such probability is a multiple of 2^(−p(|x|)), in
fact Prob[M(x) = 1] ≥ ½ + 2^(−p(|x|)). Hence
Prob[M′(x) = 1] = (1 − δ)·Prob[M(x) = 1]
≥ (1 − 2^(−(p(|x|)+1)))·(½ + 2^(−p(|x|))) > ½.
39
If x ∉ L then Prob[M(x) = 0] ≥ ½, so
Prob[M′(x) = 0] = δ + (1 − δ)·Prob[M(x) = 0]
≥ δ + (1 − δ)·½ = ½ + δ/2 > ½. So M′ satisfies the
PP condition, hence PP1 ⊆ PP and PP = PP1.
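Both cases of the shift argument can be verified with exact rationals; the bound p(|x|) = 5 below is an arbitrary stand-in for the coin-toss bound, chosen only for the demonstration.

```python
from fractions import Fraction

def shifted_yes_prob(q, p):
    """Prob[M'(x) = 1] for the shifted machine: with probability
    delta = 2**-(p+1) answer NO outright, otherwise run M, where
    q = Prob[M(x) = 1] and p bounds M's coin tosses."""
    delta = Fraction(1, 2 ** (p + 1))
    return (1 - delta) * q

p = 5
half = Fraction(1, 2)
grain = Fraction(1, 2 ** p)   # smallest possible bias of M

# x in L: q > 1/2 forces q >= 1/2 + 2**-p; the YES bias survives.
assert shifted_yes_prob(half + grain, p) > half
# x not in L: q <= 1/2 ("no bias"); the shift creates a strict NO bias.
assert shifted_yes_prob(half, p) < half
```

The key design point, mirrored from the proof, is that δ is strictly below the probability granularity 2^(−p) of M, so the shift can never flip a genuine YES bias.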
40
Claim NP ⊆ PP. Suppose that L ∈ NP is decided by a
non-deterministic machine MNP with a running time
that is bounded by the polynomial p(|x|). The
following machine M′ will then decide L, according
to the following definition.
41
M′ uses its random coin tosses as a witness for
MNP, with only one toss that it does not pass on.
This toss is used to choose its move: one of the
two possible moves leads to the ordinary
computation of MNP on the same input (with the
witness being the random input).
42
The other choice leads to a computation that always
accepts. Consider a string x. If MNP has no
accepting computation, then the probability that M′
answers 1 is exactly ½. On the other hand, if MNP
has at least one accepting computation, the
probability that M′ answers 1 is greater than ½.
43
So we get that x ∈ L ⇒ Prob[M′(x) = 1] > ½ and
x ∉ L ⇒ Prob[M′(x) = 1] = ½, i.e.
Prob[M′(x) = 0] ≥ ½. Meaning L ∈ PP1, and by the
previous claim (PP = PP1) we get that NP ⊆ PP.
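The acceptance probability of M′ has a closed form: with one extra coin, half the probability mass goes to the always-accept branch and half to running MNP on a uniformly random m-bit witness. The helper below (an illustration, not from the slides) computes it exactly in terms of the number A of accepting witnesses.

```python
from fractions import Fraction

def m_prime_accept_prob(num_accepting_witnesses, m):
    """Prob[M'(x) = 1] when one coin picks the always-accepting branch
    and the other branch runs the NP machine on a random m-bit
    witness: 1/2 * 1 + 1/2 * (A / 2**m) = 1/2 + A / 2**(m+1)."""
    A = num_accepting_witnesses
    return Fraction(1, 2) + Fraction(A, 2 ** (m + 1))

m = 10
# x not in L: no accepting witness, so exactly 1/2 -- the PP1 boundary.
assert m_prime_accept_prob(0, m) == Fraction(1, 2)
# x in L: even a single accepting witness tips it strictly above 1/2.
assert m_prime_accept_prob(1, m) > Fraction(1, 2)
```

This makes the fragility of the construction visible: the YES-side advantage can be as small as 2^(−(m+1)), which is why the PP = PP1 claim (tolerating a NO-side probability of exactly ½) is needed.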
44
ZPP - Zero Error Probability
  • We define a probabilistic Turing machine which
    is allowed to reply I Don't Know, symbolized
    by ⊥.
  • Definition
  • L ∈ ZPP if ∃M such that
  • ∀x Prob[M(x) = ⊥] ≤ ½
  • ∀x Prob[M(x) = χL(x) or M(x) = ⊥] = 1

45
Claim ZPP = RP ∩ coRP. Take L ∈ ZPP and let M be a
ZPP machine for L. We will build a machine M′ that
decides L according to the definition of RP.
46
Define M′(x) = 0 when M(x) = ⊥, and M′(x) = M(x)
otherwise. If x ∉ L, then by returning 0 when
M(x) = ⊥ we will always answer correctly, because
in this case Prob[M′(x) = 0] = 1.
47
If x ∈ L, the probability of getting the right
answer from M′ is at least ½, since M returns a
definite answer with probability at least ½ and
M's definite answers are always correct. So
L ∈ RP.
48
In the same way it can be seen that by defining
M″(x) = 1 when M(x) = ⊥ and M″(x) = M(x)
otherwise, we get that L ∈ coRP. Hence
ZPP ⊆ RP ∩ coRP.
49
For the other direction, take L ∈ RP ∩ coRP, with
an RP machine MRP and a coRP machine McoRP for L.
Define M(x): if MRP(x) = 1 return YES; if
McoRP(x) = 0 return NO; otherwise return ⊥. M's
definite answers are always correct.
50
If x ∈ L then we will get a YES answer from MRP,
and hence from M, with probability at least ½. If
x ∉ L then we will get a NO answer from McoRP, and
hence from M, with probability at least ½. So
Prob[M(x) = ⊥] ≤ ½, giving L ∈ ZPP and
ZPP = RP ∩ coRP.
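One round of the combined machine can be sketched directly; the two lambdas below are deterministic stand-ins for a single run of the RP and coRP machines, used only to exercise the three cases.

```python
def zpp_from_rp_corp(x, m_rp, m_corp):
    """One round of the combined ZPP-style machine built from an RP
    machine and a coRP machine for the same language: a YES from the
    RP side and a NO from the coRP side are both error-free, so we
    only say "don't know" (None) when neither side is definite."""
    if m_rp(x) == 1:
        return 1      # an RP machine never accepts an x outside L
    if m_corp(x) == 0:
        return 0      # a coRP machine never rejects an x inside L
    return None       # both answers were the inconclusive ones

# Stand-in single runs (deterministic for the demo):
assert zpp_from_rp_corp("x", lambda x: 1, lambda x: 1) == 1
assert zpp_from_rp_corp("x", lambda x: 0, lambda x: 0) == 0
assert zpp_from_rp_corp("x", lambda x: 0, lambda x: 1) is None
```

Since each side is definite with probability at least ½ on its own kind of input, the ⊥ outcome has probability at most ½, matching the ZPP definition.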
51
RSPACE - Randomized Space Complexity
  • Definition
  • L ∈ RSPACE(s) if L is decided by an RP-type
    machine M that uses at most s(|x|) space and
    exp(s(|x|)) time.
  • badRSPACE(s) = RSPACE(s) without the time
    restriction.

52
Claim badRSPACE = NSPACE.
badRSPACE ⊆ NSPACE: let L ∈ badRSPACE. If x ∈ L,
there is at least one witness (a random string r
with M(x,r) = 1), and the non-deterministic machine
of NSPACE will guess it.
53
If x ∉ L, there are no witnesses at all, therefore
the non-deterministic machine of NSPACE will not
find a solution either.
54
NSPACE ⊆ badRSPACE: let L ∈ NSPACE, and let MN be
the non-deterministic Turing machine which decides
L in space S(|x|). If x ∈ L, there exists r of
length exp(S(|x|)) such that M(x,r) = 1, where r is
the non-deterministic guess used by MN. Therefore
the probability of selecting an r such that
M(x,r) = 1 is at least 2^(−exp(S(|x|))).
55
So if we repeatedly invoke M(x,.) on random r's,
we can expect that after 2^exp(S(|x|)) tries we
will see an accepting computation. So what we want
our machine M′ to do is run M on x and a newly
randomly selected r (of length exp(S(|x|))) about
2^exp(S(|x|)) times, and accept iff M accepts in
one of these tries.
56
Problem: in order to count to 2^exp(S(|x|)) we
need a counter that uses space exp(S(|x|)), and
we only have S(|x|).
57
Solution: we will use a randomized counter that
uses only O(S(|x|)) space. On each try we flip
k = exp(S(|x|)) coins; if all come up heads we
stop, else we go on. The expected number of tries
is 2^k = 2^exp(S(|x|)). But the real counter only
needs to count to k, and therefore only needs
space log k = O(S(|x|)).
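The space saving rests on two facts that can be checked exactly: a round of k fair coins succeeds with probability 2^(−k), so the number of rounds is geometric with expectation 2^k, while counting the k flips within a round needs only about log k bits. A small numeric sketch:

```python
from fractions import Fraction

def expected_rounds(k):
    """A round succeeds (all k coins come up heads) with probability
    2**-k, so the number of rounds until the first success is
    geometric with expectation 2**k."""
    success = Fraction(1, 2 ** k)
    return 1 / success

k = 1024                       # stand-in for k = exp(S(|x|))
assert expected_rounds(k) == 2 ** k
assert k.bit_length() == 11    # counting to k needs only ~log k bits
```

So the machine effectively "counts" to 2^k in expectation while storing nothing larger than a counter up to k, which is exactly the budget S(|x|) allows.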