1
Optimal Lower Bounds for 2-Query Locally
Decodable Linear Codes
  • Kenji Obata

2
Codes
  • Error-correcting code
  • C : {0,1}^n → {0,1}^m
  • with decoding procedure A s.t.
  • for y ∈ {0,1}^m with d(y, C(x)) ≤ δm,
  • A(y) = x

3
Locally Decodable Codes
  • Weaken the power of A: it can only look at a constant
    number q of input bits
  • Weaken the requirements:
  • A need only recover a single given bit of x
  • A can fail with some probability bounded away from ½
  • Study initiated by Katz and Trevisan [KT00]

4
Locally Decodable Codes
  • Define a (q, δ, ε)-locally decodable code:
  • A can make q queries (w.l.o.g. exactly q queries)
  • For all x ∈ {0,1}^n, all y ∈ {0,1}^m with
    d(y, C(x)) ≤ δm, and all input bits i ∈ {1,…,n}:
  • A(y, i) = x_i with probability ≥ ½ + ε

5
LDC Applications
  • Direct: scalable fault-tolerant information storage
  • Indirect: lower bounds for certain classes of
    private information retrieval schemes (more on
    this later)

6
Lower Bounds for LDCs
  • [KT00] proved a general lower bound:
  • m = Ω(n^{q/(q−1)})
  • (at best n², but known codes are exponential)
  • For 2-query linear LDCs, Goldreich, Karloff,
    Schulman, and Trevisan [GKST02] proved an exponential
    bound:
  • m ≥ 2^{Ω(εδn)}

7
Lower Bounds for LDCs
  • The restriction to linear codes is interesting, since
    known LDC constructions are linear
  • But 2^{Ω(εδn)} is not quite right:
  • The lower bound should increase arbitrarily as the
    decoding probability → 1 (ε → ½)
  • No matching construction

8
Lower Bounds for LDCs
  • In this work, we prove that for 2-query linear
    LDCs,
  • m ≥ 2^{Ω(δ/(1−2ε) · n)}
  • Optimal: there is an LDC construction matching
    this within a constant factor in the exponent

9
Techniques from KT00
  • Fact: an LDC is also a smooth code (A queries
    each position w/ roughly the same probability),
  • so we can study smooth codes
  • Connects LDCs to information-theoretic PIR
    schemes:
  • q queries ↔ q servers
  • smoothness ↔ statistical indistinguishability

10
Techniques from KT00
  • For i ∈ {1,…,n}, define the recovery graph G_i
    associated with C:
  • Vertex set {1,…,m} (bits of the codeword)
  • Edges are pairs (q1, q2) such that, conditioned
    on A querying {q1, q2},
  • A(C(x), i) outputs x_i with probability > ½
  • Call these edges good edges (their endpoints contain
    information about x_i)

11
Techniques from KT00/GKST02
  • Theorem: if C is (2, c, ε)-smooth, then G_i
    contains a matching of size εm/c.
  • Better to work with non-degenerate codes:
  • Each bit of the encoding depends on more than one
    bit of the message
  • For linear codes, good edges are non-trivial linear
    combinations
  • Fact: any smooth code can be made non-degenerate
    (with constant loss in parameters).

12
Core Lemma GKST02
  • Let q_1, …, q_m be linear functions on {0,1}^n s.t.
    for every i ∈ {1,…,n} there is a set M_i of at least
    γm disjoint pairs of indices {j1, j2} such that
  • x_i = q_{j1}(x) ⊕ q_{j2}(x).
  • Then m ≥ 2^{γn}.

13
Putting it all together
  • If C is a (2, c, ε)-smooth linear code, then (by
    reduction to a non-degenerate code + existence of
    large matchings + the core lemma),
  • m ≥ 2^{εn/4c}.
  • If C is a (2, δ, ε)-locally decodable linear
    code, then (by the LDC → smooth reduction),
  • m ≥ 2^{εδn/8}.

14
Putting it all together
  • Summary: locally decodable ⇒ smooth ⇒ big
    matchings ⇒ exponential size
  • This work: locally decodable ⇒ big matchings
  • (skip the smoothness reduction; argue directly about
    LDCs)

15
The Blocking Game
  • Let G(V, E) be a graph on n vertices, w a probability
    distribution on E, X_w an edge sampled according
    to w, and S a subset of V
  • Define the blocking probability β_δ(G) as
  • min_w ( max_{|S| ≤ δn} Pr(X_w intersects S) )

16
The Blocking Game
  • Want to characterize β_δ(G) in terms of the size of a
    maximum matching M(G), or equivalently the defect
    d(G) = n − 2M(G)
  • Theorem: let G be a graph with defect ≥ an. Then
  • β_δ(G) ≥ min(δ/(1−a), 1).

17
The Blocking Game
  • Define K(n,a) to be the edge-maximal graph on n
    vertices with defect an
  • [Diagram: K(n,a) drawn as two parts, K1 (an vertices)
    and K2 (a clique on (1−a)n vertices), with edges
    between K1 and K2 and inside the clique K2]
18
The Blocking Game
  • Optimization on K(n,a) is a relaxation of
    optimization on any graph with defect an
  • If d(G) ≥ an then
  • β_δ(G) ≥ β_δ(K(n,a))
  • So, it is enough to think about K(n,a).

19
The Blocking Game
  • Intuitively, the best strategy for player 1 is to
    spread the distribution as uniformly as possible
  • A (λ1, λ2)-symmetric distribution:
  • all edges in (K1, K2) have weight λ1
  • all edges in (K2, K2) have weight λ2
  • Lemma: ∃ a (λ1, λ2)-symmetric distribution w s.t.
  • β_δ(K(n,a)) = max_{|S| ≤ δn} Pr(X_w intersects S).

20
The Blocking Game
  • Claim: let w_1, …, w_k be distributions s.t.
  • max_{|S| ≤ δn} Pr(X_{w_i} intersects S) = β_δ(G).
  • Then for any convex combination w = Σ α_i w_i,
  • max_{|S| ≤ δn} Pr(X_w intersects S) = β_δ(G).
  • Proof: for any S ⊆ V with |S| ≤ δn, the intersection
    probability is Σ α_i Pr(X_{w_i} intersects S)
    ≤ Σ α_i β_δ(G) = β_δ(G). So
  • max_{|S| ≤ δn} Pr(X_w intersects S) ≤ β_δ(G).
  • But by the definition of β_δ(G), this must also be
    ≥ β_δ(G).

21
The Blocking Game
  • Proof: let w be any distribution optimizing
    β_δ(G). If w does, then so does π(w) for π ∈
    Γ = Aut(G). By the prior claim, so does
  • w̄ = (1/|Γ|) Σ_{π ∈ Γ} π(w).
  • For e ∈ E, σ ∈ Γ,
  • w̄(e) = (1/|Γ|) Σ_{π ∈ Γ} w(π(e))
  •       = (1/|Γ|) Σ_{π ∈ Γ} w(πσ(e))
  •       = w̄(σ(e)).
  • So, if e, e′ are in the same Γ-orbit, they have
    the same weight in w̄ ⇒ w̄ is (λ1, λ2)-symmetric.

22
The Blocking Game
  • Claim: if w is (λ1, λ2)-symmetric then ∃ S ⊆ V,
    |S| ≤ δn s.t.
  • Pr(X_w intersects S) ≥ min(δ/(1−a), 1).
  • Proof: if δ ≥ 1 − a then S can cover every edge.
    Otherwise, set S = any δn vertices of K2. Then
  • Pr ≥ δ (1/(1 − a) + ½ n² (1 − a − δ) λ2)
  • which, for δ < 1 − a, is at least
  • δ/(1 − a)
  • (optimized by player 1 when λ2 = 0).

23
The Blocking Game
  • Theorem: let G be a graph with defect ≥ an. Then
  • β_δ(G) ≥ min(δ/(1−a), 1).
  • Proof: β_δ(G) ≥ β_δ(K(n,a)). The blocking probability
    on K(n,a) is optimized by some (λ1, λ2)-symmetric
    distribution. For any such distribution w, ∃ δn
    vertices blocking w with Pr ≥ min(δ/(1−a), 1).

24
Lower Bound for LDLCs
  • Still need a degenerate → non-degenerate
    reduction (this time for LDCs instead of smooth
    codes)
  • Theorem: let C be a (2, δ, ε)-locally decodable
    linear code. Then, for large enough n, there
    exists a non-degenerate (2, δ/2.01, ε)-locally
    decodable linear code
  • C′ : {0,1}^n → {0,1}^{2m}.

25
Lower Bound for LDLCs
  • Theorem: let C be a (2, δ, ε)-LDLC. Then, for
    large enough n,
  • m ≥ 2^{(1/4.03) · δ/(1−2ε) · n}.
  • Proof:
  • Make C non-degenerate
  • Local decodability
  • ⇒ low blocking probability (at most ¼ − ε/2)
  • ⇒ low defect (a ≤ 1 − (δ/2.01)/(1−2ε))
  • ⇒ big matching (size ≥ ½ · (δ/2.01)/(1−2ε) · 2m)
  • ⇒ exponentially long encoding
    (m ≥ 2^{(1/4.02) · δ/(1−2ε) · n − 1})

26
Matching Upper Bound
  • Hadamard code on {0,1}^n:
  • y_i = a_i · x (a_i runs through {0,1}^n)
  • 2-query locally decodable
  • Recovery graphs are perfect matchings on the n-dim
    hypercube
  • Success parameter ε = ½ − 2δ
  • Can use concatenated Hadamard codes (Trevisan)
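The 2-query decoder can be sketched end to end (the encoder/decoder names are illustrative): corrupting a δ fraction of positions spoils at most a 2δ fraction of the query pairs {a, a⊕e_i}, so each bit is recovered with probability at least 1 − 2δ = ½ + ε.

```python
import random
from itertools import product

def hadamard_encode(x):
    return {a: sum(p * q for p, q in zip(a, x)) % 2
            for a in product((0, 1), repeat=len(x))}

def decode_bit(y, i, a):
    """Two queries: positions a and a XOR e_i; their XOR is x_i if both are clean."""
    b = tuple(v ^ (j == i) for j, v in enumerate(a))
    return y[a] ^ y[b]

n, delta = 4, 0.125                # corrupt a delta fraction of the 2^n positions
x = (1, 0, 1, 1)
y = hadamard_encode(x)
rng = random.Random(0)
for a in rng.sample(sorted(y), int(delta * len(y))):
    y[a] ^= 1                      # flip delta * m codeword bits

# Each corruption ruins at most 2 of the 2^n query pairs for a given i,
# so success probability over a random pair is >= 1 - 2*delta = 1/2 + eps.
for i in range(n):
    ok = sum(decode_bit(y, i, a) == x[i] for a in product((0, 1), repeat=n))
    assert ok / 2 ** n >= 1 - 2 * delta
```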

27
Matching Upper Bound
  • Set c = (1−2ε)/4δ (it can be shown that for feasible
    values of δ, ε, c ≥ 1).
  • Divide the input into c blocks of n/c bits; encode
    each block with the Hadamard code on {0,1}^{n/c}.
  • Each block has at most a fraction cδ of corrupt
    entries, so the code has recovery parameter
  • ½ − 2 · ((1−2ε)/4δ) · δ = ε
  • The code has length
  • ((1−2ε)/4δ) · 2^{4δ/(1−2ε) · n}

28
Conclusions
  • There is a matching upper bound (concatenated
    Hadamard code)
  • New results for 2-query non-linear codes (but
    using apparently completely different techniques)
  • q > 2?
  • No analog of the core lemma for more queries
  • But the blocking-game analysis might generalize to
    useful properties other than matching size