1
Cryptography
  • Lecture 2
  • Stefan Dziembowski
  • www.dziembowski.net
  • stefan_at_dziembowski.net

2
Plan
  1. Information-theoretic cryptography
  2. Introduction to cryptography based on
    computational assumptions
  3. Provable security
  4. Pseudorandom generators

3
The scenario from the previous lecture
Alice
Bob
Eve
Shannon's theorem ⇒ perfect secrecy is possible
only if the key is as long as the plaintext.
In real life this is completely impractical.
4
What to do?
  • Idea: limit the power of the adversary.
  • How?
  • Classical (computationally-secure) cryptography:
    bound the adversary's computational power.
  • Alternative options exist
    (but are not very practical).

5
Quantum cryptography
  • Stephen Wiesner (1970s), Charles H. Bennett and
    Gilles Brassard (1984)

quantum link
Alice
Bob
Quantum indeterminacy: quantum states cannot be
measured without disturbing the original state.
Hence Eve cannot read the bits without being
noticed.
Eve
6
Quantum cryptography
  • Advantage:
  • security is based on the laws of quantum physics.
  • Disadvantage:
  • needs dedicated equipment.
  • Practicality?
  • Successful transmissions have currently been
    demonstrated over distances of around 150 km.
  • Commercial products are available.

7
A satellite scenario
A third party (a satellite) is broadcasting
random bits.
000110100111010010011010111001110111 1110100111010
10101010010010100111100 00100111111110001010100100
0101010010 001010010100101011010101001010010101
Alice
Bob
Eve
8
Ueli Maurer (1993): the noisy channel model.
[Figure: the broadcast bit string as received by
Alice, Bob, and Eve; some bits get flipped (because
of the noise).]
Assumption: the data that the adversary receives
is noisy. (The data that Alice and Bob receive
may be even noisier.)
9
Bounded-Storage Model
  • Another idea: bound the size of the adversary's memory.

000110100111010010011010111001110111 1110100111010
10101010010010100111100 00100111111110001010100100
0101010010 001010010100101011010101001010010101
too large to fit in Eve's memory
10
Real (computationally-secure) cryptography starts
here
Eve is computationally-bounded
  • But what does it mean?
  • Ideas:
  • She can use at most 1000 Intel Core 2
    Extreme X6800 Dual Core Processors for at most
    100 years...
  • She can buy equipment worth 1 million euro and
    use it for 30 years...

it is hard to reason formally about it
11
A better idea
  • The adversary has access to a Turing Machine
    that can make at most 10^30 steps.
  • More generally, we could have definitions of the
    type:
  • a system X is (t, ε)-secure if every Turing
    Machine
  • that operates in time t
  • can break it with probability at most ε.
  • This would be quite precise, but...
  • We would need to specify exactly what we mean by
    a Turing Machine:
  • how many tapes does it have?
  • how does it access these tapes (maybe random-
    access memory is a more realistic model...)?
  • ...
  • Moreover, this approach often leads to ugly
    formulas...

12
What to do?
(t, ε)-security
  • Idea:
  • "t steps of a Turing Machine" ↔ "efficient
    computation"
  • ε ↔ "a value very close to zero".

How to formalize it? Use asymptotics!
13
Efficiently computable?
"efficiently computable" = "polynomial-time
computable on a Turing Machine",
that is, running in time O(n^c) (for some constant c).
Here we assume that Turing Machines are the right
model for real-life computation. This is not true
if a quantum computer is built...
14
Very small?
"very small" = "negligible" = "approaches 0 faster
than the inverse of any polynomial".
Formally: a function f is negligible if for every
constant c there exists N such that f(n) < n^(-c)
for all n > N.
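
For instance (illustrative examples of my own, not
from the slides; in LaTeX notation), 2^(-n) is
negligible while 1/n^3 is not:

% 2^{-n} is negligible: for every constant c,
\[
  \lim_{n\to\infty} n^{c}\,2^{-n} = 0,
  \qquad\text{so}\qquad
  2^{-n} < n^{-c}\ \text{for all sufficiently large } n.
\]
% 1/n^{3} is not negligible: already for c = 3 the required inequality
\[
  \frac{1}{n^{3}} < n^{-3}
\]
% never holds.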
15
Negligible or not?
[The slide lists six example functions and asks
whether each is negligible; only the answers
(no, yes, yes, yes, yes, no) survive in this
transcript.]
16
Security parameter
Typically, we will say that a scheme X is secure if:

∀ polynomial-time Turing Machine M:
  P(M breaks the scheme X) is negligible.
  • The terms "negligible" and "polynomial" make
    sense only if X (and the adversary) take an
    additional input n, called
  • a security parameter.
  • In other words we consider an infinite sequence
  • X(1),X(2),...
  • of schemes.

17
Example
  • Consider the authentication scheme from last
    week.

18
Nice properties of these notions
  • A sum of two polynomials is a polynomial:
  • poly + poly = poly
  • A product of two polynomials is a polynomial:
  • poly · poly = poly
  • A sum of two negligible functions is a negligible
    function:
  • negl + negl = negl
  • Moreover:
  • A negligible function multiplied by a polynomial
    is negligible (see the sketch after this list):
  • negl · poly = negl

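A short argument for the last property (a sketch in
LaTeX notation, using the definition of "negligible"
from slide 14):

% Claim: if f is negligible and p is a polynomial, then f·p is negligible.
\[
  \text{Fix any constant } c \text{ and pick } d \text{ with }
  p(n) \le n^{d} \text{ for all large } n.
\]
\[
  \text{Since } f \text{ is negligible, } f(n) < n^{-(c+d)} \text{ for all large } n,
  \ \text{hence}\ f(n)\,p(n) < n^{-(c+d)} \cdot n^{d} = n^{-c}.
\]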
19
A new definition of an encryption scheme
20
Is this the right approach?
  • Advantages
  • All types of Turing Machines are equivalent up
    to a polynomial reduction. Therefore we do not
    need to specify the details of the model.
  • The formulas get much simpler.
  • Disadvantage
  • Asymptotic results don't tell us anything about
    the security of concrete systems.
  • However
  • Usually one can prove formally an asymptotic
    result and then argue informally that the
    constants are reasonable
  • (and can be calculated if one really wants).

21
Provable security
  • We want to construct schemes that are
  • provably secure.
  • But...
  • why do we want to do it?
  • how to define it?
  • and is it possible to achieve it?

22
Provable security: the motivation
  • In many areas of computer science formal proofs
    are not essential.
  • For example, instead of proving that an algorithm
    is efficient, we can just simulate it on a
    "typical" input.
  • In cryptography this is not true, because
  • there cannot exist an experimental proof that a
    scheme is secure.
  • Why?
  • Because the notion of a
  • "typical adversary"
  • does not make sense.

23
How did we define perfect secrecy?
  • Experiment (m is a message):
  • the key k is chosen randomly,
  • message m is encrypted using k: c := Enc_k(m),
  • c is given to the adversary.

Idea 1: The adversary should not be able to
compute k.
Idea 2: The adversary should not be able to
compute m.
Idea 3: The adversary should not be able to
compute any information about m.
Idea 4: The adversary should not be able to
compute any additional information about m.
(this one makes more sense)
24
Idea: The adversary should not be able to compute
any additional information about m.
25
Towards the definition of computational secrecy...
∀ m, c:
  P(C = c) = P(C = c | M = m)

∀ m0, m1, c:
  P(C = c | M = m0) = P(C = c | M = m1)

∀ m0, m1, c:
  P(Enc(K, M) = c | M = m0) = P(Enc(K, M) = c | M = m1)

∀ m0, m1, c:
  P(Enc(K, m0) = c | M = m0) = P(Enc(K, m1) = c | M = m1)

∀ m0, m1, c:
  P(Enc(K, m0) = c) = P(Enc(K, m1) = c)
26
Indistinguishability
∀ m0, m1, c:
  P(Enc(K, m0) = c) = P(Enc(K, m1) = c)

In other words: the distributions of Enc(K, m0) and
Enc(K, m1) are identical.
IDEA: change "are identical" to "are indistinguishable
by a polynomial-time adversary".
27
A game
(Gen, Enc, Dec) -- an encryption scheme,
security parameter 1^n.

The adversary (a polynomial-time Turing machine)
chooses m0, m1 such that |m0| = |m1| and sends them
to the oracle.
The oracle:
  1. selects k := Gen(1^n),
  2. chooses a random bit b in {0,1},
  3. calculates c := Enc(k, m_b) and sends it back.
The adversary has to guess b.

Alternative name: semantically-secure (sometimes we
will just say "computationally-secure", if the
context is clear).
Security definition: We say that (Gen, Enc, Dec) has
indistinguishable encryptions if every polynomial-time
adversary guesses b correctly with probability at most
0.5 + ε(n), where ε is negligible.
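
A minimal Python sketch of this game (my own
illustration: the one-time pad below merely stands in
for an arbitrary (Gen, Enc, Dec), and all function and
class names are assumptions, not part of the lecture):

import secrets

# Toy stand-in for (Gen, Enc, Dec): a one-time pad on n-byte messages.
def Gen(n: int) -> bytes:
    return secrets.token_bytes(n)                        # k <- {0,1}^(8n)

def Enc(k: bytes, m: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(k, m))            # c := k XOR m

def indistinguishability_game(adversary, n: int) -> bool:
    """One run of the game; returns True iff the adversary guesses b correctly."""
    m0, m1 = adversary.choose_messages(n)                # |m0| = |m1| = n
    k = Gen(n)                                           # 1. oracle selects k := Gen(1^n)
    b = secrets.randbits(1)                              # 2. oracle picks a random bit b
    c = Enc(k, m1 if b else m0)                          # 3. oracle computes c = Enc(k, m_b)
    return adversary.guess(c) == b                       # adversary has to guess b

class RandomGuesser:
    """Trivial adversary: wins with probability exactly 1/2."""
    def choose_messages(self, n):
        return bytes(n), b"\xff" * n                     # all-zero vs. all-one message
    def guess(self, c):
        return secrets.randbits(1)

if __name__ == "__main__":
    runs = 10_000
    wins = sum(indistinguishability_game(RandomGuesser(), 16) for _ in range(runs))
    print(f"guessed b correctly in {wins}/{runs} runs")  # about 50% for a secure scheme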
28
Testing the definition
  • Suppose the adversary can compute k from some
    Enc(k,m). Can he win the game?
  • Suppose the adversary can compute some bit of m
    from Enc(k,m). Can he win the game?

YES!
YES!
29
Is it possible to prove security?
  • (Gen,Enc,Dec) -- an encryption scheme.
  • For simplicity suppose that:
  • for security parameter n the key is of length n,
  • Enc is deterministic.
  • Consider the following language:
  • L = { (m, c) : there exists a key k of length n
    such that Enc(k, m) = c }.
Q: What if L is polynomial-time decidable?
A: Then the scheme is broken (exercise).
Is it really true?
On the other hand, L is in NP
(k is the NP-witness).
So, if P = NP, then any semantically-secure
encryption is broken.
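
To make "k is the NP-witness" concrete, here is a
small Python sketch (my own illustration, assuming the
language L defined above and a toy deterministic Enc):
verifying a candidate key takes a single call to Enc,
i.e. polynomial time, whereas deciding L without a
witness by brute force takes 2^n calls.

def Enc(k: int, m: int) -> int:
    # toy deterministic cipher on n-bit blocks: XOR with the key
    # (degenerate: for this Enc every (m, c) is in L; a real scheme is not so trivial)
    return k ^ m

def verify(m: int, c: int, k: int) -> bool:
    # Checking a candidate witness k takes one call to Enc: polynomial time.
    return Enc(k, m) == c

def decide_L(m: int, c: int, n: int) -> bool:
    # Deciding L without a witness: brute force over all 2^n keys,
    # i.e. exponential time. The question on the slide is what happens
    # if L could be decided in polynomial time instead.
    return any(verify(m, c, k) for k in range(2 ** n))

if __name__ == "__main__":
    n, k, m = 16, 0x1234, 0x00FF
    c = Enc(k, m)
    print(verify(m, c, k))        # True: k is an NP-witness for (m, c)
    print(decide_L(m, c, n))      # True, but found only by exponential search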
30
"If P = NP, then semantically-secure encryption
is broken."
  • Is it 100% true?
  • Not really...
  • This is because even if P = NP, we do not know
    what the constants are.
  • Maybe P = NP "in a very inefficient way"...

31
  • In any case, to prove security of a cryptographic
    scheme we would need to show
  • a lower bound on the computational complexity of
    some problem.
  • In the asymptotic setting that would mean that,
  • at least,
  • we show that P ≠ NP.
  • Does the implication in the other direction hold?
  • (that is, does P ≠ NP imply anything for
    cryptography?)
  • No! (at least as far as we know)
  • Intuitively, this is because NP is a notion of
    worst-case complexity, whereas cryptography
    concerns average-case complexity.
  • Therefore
  • proving that an encryption scheme is secure is
    probably much harder than proving that P ≠ NP.

32
What can we prove?
  • We can prove conditional results.
  • That is, we can show theorems of the type:

"Suppose that some computational assumption A holds;
then scheme X is secure."
or
"Suppose that some scheme Y is secure;
then scheme X is secure."
33
Research program in cryptography
  • Base the security of cryptographic schemes on a
    small number of well-specified computational
    assumptions.

Examples of A: the decisional Diffie-Hellman
assumption, the strong RSA assumption.

"Some computational assumption A holds"
  (this we have to believe in)
"then scheme X is secure"
  (the rest is provable).
34
Example
  • We are now going to show an example of such
    reasoning:

"Suppose that some computational assumption A holds;
then scheme X is secure"
becomes
"Suppose that G is a cryptographic pseudorandom
generator;
then using G we can construct a secure encryption
scheme".
35
Pseudorandom generators
A pseudorandom generator G takes a short seed s and
stretches it into a much longer string G(s).
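
The transcript does not reproduce the encryption
scheme that the following slides refer to as
"constructed before"; the Python sketch below assumes
it is the standard construction in which the short
seed s is the key and c := G(s) XOR m (SHAKE-256 is
only a placeholder for G, not fixed by the lecture):

import hashlib

def G(seed: bytes, out_len: int) -> bytes:
    # stand-in PRG: stretch the seed into out_len pseudorandom bytes
    return hashlib.shake_256(seed).digest(out_len)

def Enc(seed: bytes, m: bytes) -> bytes:
    pad = G(seed, len(m))
    return bytes(p ^ b for p, b in zip(pad, m))      # c := G(s) XOR m

def Dec(seed: bytes, c: bytes) -> bytes:
    return Enc(seed, c)                              # XOR with the same pad again

if __name__ == "__main__":
    s = b"short 16B seed.."                          # the key is much shorter than m
    msg = b"a message considerably longer than the seed"
    assert Dec(s, Enc(s, msg)) == msg
    # As with the one-time pad, a seed must never be reused for two messages.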
36
  • If we use a "normal" PRG this idea doesn't work
    (exercise).
  • It works only with cryptographic PRGs.

37
"Looks random"
  • What does it mean?
  • Non-cryptographic applications:
  • should pass some statistical tests.
  • Cryptography:
  • should pass all polynomial-time tests.

38
Cryptographic PRG
A polynomial-time distinguisher D gets a string and
outputs:
  0 if it thinks the string is truly random (R),
  1 if it thinks the string is G(S).
No such D should be able to distinguish the two cases.
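
As an illustration of this requirement (my own, not
from the slides), one can empirically estimate a
distinguisher's advantage |P(D(G(S)) = 1) - P(D(R) = 1)|;
the weak statistical test below plays the role of D
and, as expected, gains essentially no advantage
against the placeholder G from the previous sketch:

import hashlib, secrets, statistics

def G(seed: bytes, out_len: int) -> bytes:
    # same stand-in PRG as in the earlier sketch
    return hashlib.shake_256(seed).digest(out_len)

def D(x: bytes) -> int:
    """A weak statistical test: output 1 if the fraction of 1-bits looks balanced."""
    ones = sum(bin(byte).count("1") for byte in x)
    return 1 if abs(ones / (8 * len(x)) - 0.5) < 0.01 else 0

def estimated_advantage(runs: int = 2000, seed_len: int = 16, out_len: int = 1024) -> float:
    p_prg = statistics.mean(D(G(secrets.token_bytes(seed_len), out_len)) for _ in range(runs))
    p_rnd = statistics.mean(D(secrets.token_bytes(out_len)) for _ in range(runs))
    return abs(p_prg - p_rnd)   # close to 0: this D does not distinguish G(S) from R

if __name__ == "__main__":
    print(f"estimated advantage of D: {estimated_advantage():.4f}")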
39
Constructions
  • There exist constructions of cryptographic
    pseudorandom generators that are conjectured to
    be secure.
  • Some of them are extremely efficient and widely
    used in practice.
  • They are called stream ciphers (we will
    discuss them later).

40
Theorem
  • If G is a cryptographic PRG then the encryption
    scheme constructed before is semantically-secure
    (i.e. it has indistinguishable encryptions).

cryptographic PRGs ⇒ computationally-secure encryption
41
The distinguisher D simulates the game for the
adversary, using its input x in place of G(s).
  • If the adversary guessed b correctly, then output
    1 ("x is pseudorandom").
  • Otherwise output 0 ("x is random").

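A Python sketch of this reduction (names are my own;
the adversary interface matches the game sketch from
slide 27): the distinguisher receives a string x,
which is either truly random or equal to G(S), and
uses it as the pad when simulating the game.

import secrets

def D(adversary, x: bytes) -> int:
    """Simulate the indistinguishability game for the adversary, using x as the pad."""
    m0, m1 = adversary.choose_messages(len(x))      # |m0| = |m1| = |x|
    b = secrets.randbits(1)                         # D plays the oracle's role
    c = bytes(p ^ byte for p, byte in zip(x, m1 if b else m0))   # "encrypt" m_b with pad x
    guess = adversary.guess(c)
    return 1 if guess == b else 0   # 1 = "x is pseudorandom", 0 = "x is random"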
42
Case 1: x is a truly random string R. Then the
adversary is playing against a one-time pad, so it
guesses b correctly with probability 0.5, and D
outputs 1 with probability 0.5 (and 0 with
probability 0.5).
Case 2: x = G(S). Then the adversary is attacking the
real scheme, so it guesses b correctly with
probability 0.5 + δ(n), and D outputs 1 with
probability 0.5 + δ(n) (and 0 with probability
0.5 - δ(n)).
Hence D distinguishes G(S) from R with advantage
δ(n), so δ(n) must be negligible.
QED
43
Moral
cryptographic PRGs ⇒ semantically-secure encryption
  • To construct secure encryption it suffices to
    construct a secure PRG.

44
Outlook
Cryptography splits into two branches:
  • information-theoretic (also called
    "unconditional"):
  • the one-time pad,
  • quantum cryptography,
  • the satellite scenario;
  • computationally-secure, based on 2 assumptions:
  • some problems are computationally difficult,
  • our understanding of what "computational
    difficulty" means is correct.