1
Context-Based Adaptive Entropy Coding
  • Xiaolin Wu
  • McMaster University
  • Hamilton, Ontario, Canada

2
Data Compression System
[Block diagram: a block of n samples → Transform → transform coefficients → Quantization → quantized coefficients (Q-indices) → Entropy Coding → bit patterns for the Q-indices]
  • lossless compression (entropy coding) of the
    quantized data
  • entropy coding removes redundancy and achieves
    compression

3
Entropy Coding Techniques
  • Huffman code
  • Golomb-Rice code
  • Arithmetic code:
  • Optimal in the sense that it can approach the source entropy
  • Easily adapts to non-stationary source statistics via context modeling (context selection and conditional probability estimation)
  • Context modeling governs the performance of arithmetic coding

4
Entropy (Shannon 1948)
  • For a random variable $X$ with alphabet $\mathcal{A} = \{x_0, \dots, x_{N-1}\}$ and pmf $P(x_i)$: $H(X) = -\sum_i P(x_i) \log_2 P(x_i)$ bits/symbol
  • $H(X)$ lower-bounds the average code length of any lossless code for $X$
5
Conditional Entropy
  • Consider two random variables $X$ and $Y$
  • Alphabet of $X$: $\mathcal{A}_X = \{x_0, \dots, x_{N-1}\}$
  • Alphabet of $Y$: $\mathcal{A}_Y = \{y_0, \dots, y_{M-1}\}$
  • Conditional self-information of the event $X = x_i$ given $Y = y_j$ is $I(x_i \mid y_j) = -\log_2 P(x_i \mid y_j)$
  • Conditional entropy is the average value of the conditional self-information: $H(X \mid Y) = -\sum_{i,j} P(x_i, y_j) \log_2 P(x_i \mid y_j)$

6
Entropy and Conditional Entropy
  • The conditional entropy $H(X \mid Y)$ can be interpreted as the amount of uncertainty remaining about $X$, given that we know the random variable $Y$.
  • The additional knowledge of $Y$ should reduce the uncertainty about $X$: $H(X \mid Y) \le H(X)$, with equality iff $X$ and $Y$ are independent.
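
(As an illustration, not part of the original slides: a minimal Python sketch computing $H(X)$ and $H(X \mid Y)$ from a joint pmf; the example numbers are made up.)

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a pmf given as a 1-D numpy array."""
    p = p[p > 0]                                 # treat 0*log(0) as 0
    return float(-np.sum(p * np.log2(p)))

def conditional_entropy(joint):
    """H(X|Y) from a joint pmf; rows index X, columns index Y."""
    h = 0.0
    for j, py in enumerate(joint.sum(axis=0)):   # marginal of Y
        if py > 0:
            h += py * entropy(joint[:, j] / py)  # py * H(X | Y = y_j)
    return h

joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(entropy(joint.sum(axis=1)))    # H(X)   = 1.0 bit
print(conditional_entropy(joint))    # H(X|Y) ≈ 0.722 bits <= H(X)
```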

7
Context Based Entropy Coders
  • Consider a sequence of symbols $x_1, x_2, \dots, x_n$
  • Code each symbol with a probability conditioned on previously coded symbols (its context); the ideal code length is $-\sum_t \log_2 P(x_t \mid \text{context}_t)$ bits

8
Context model: estimated conditional probability
  • Variable-length coding schemes need an estimate of the probability of each symbol - the model
  • The model can be:
  • Static - a fixed global model for all inputs (e.g. English text)
  • Semi-adaptive - computed for the specific data being coded and transmitted as side information (e.g. C programs)
  • Adaptive - constructed on the fly (any source!)

9
Adaptive vs. Semi-adaptive
  • Advantages of semi-adaptive
  • Simple decoder
  • Disadvantages of semi-adaptive
  • Overhead of specifying model can be high
  • Two-passes of data required
  • Advantages of adaptive
  • One pass → universal → as good, if not better
  • Disadvantages of adaptive
  • Decoder as complex as encoder
  • Errors propagate

10
Adaptation with Arithmetic and Huffman Coding
  • Huffman Coding - manipulate the Huffman tree on the fly. Efficient algorithms are known, but they remain complex.
  • Arithmetic Coding - update the cumulative probability distribution table. Efficient data structures / algorithms are known; the rest stays essentially the same.
  • Main advantage of arithmetic over Huffman is the
    ease by which the former can be used in
    conjunction with adaptive modeling techniques.
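
(A minimal sketch, assumed rather than taken from the slides, of the count-based adaptive model that pairs naturally with arithmetic coding; probabilities are re-derived from counts after every coded symbol.)

```python
import math
from collections import Counter

class AdaptiveModel:
    """Adaptive frequency model: probabilities come from counts seen so far.

    Starting every count at 1 (add-one prior) keeps all probabilities
    nonzero, so encoder and decoder stay in lockstep.
    """
    def __init__(self, alphabet):
        self.counts = Counter({s: 1 for s in alphabet})
        self.total = len(alphabet)

    def prob(self, symbol):
        return self.counts[symbol] / self.total

    def update(self, symbol):
        self.counts[symbol] += 1
        self.total += 1

# Ideal code length an arithmetic coder would spend on a short sequence:
model = AdaptiveModel("ab")
bits = 0.0
for s in "aababaaab":
    bits += -math.log2(model.prob(s))   # cost of coding s under current model
    model.update(s)                     # decoder performs the same update
print(f"{bits:.2f} bits")
```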

11
Context models
  • If the source is not i.i.d., there are complex dependencies between the symbols in the sequence
  • In most practical situations, the pdf of a symbol depends on neighboring symbol values - i.e., its context
  • Hence we condition the encoding of the current symbol on its context
  • How to select contexts? A rigorous answer is beyond our scope
  • Practical schemes use a fixed neighborhood, as in the sketch below
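
(A sketch of gathering per-context symbol counts over an image; the 4-pixel causal template W, NW, N, NE is an assumed choice for illustration.)

```python
import numpy as np

def context_counts(img, levels):
    """Count symbol occurrences per causal context (4 causal neighbors)."""
    h, w = img.shape
    counts = {}                      # context tuple -> histogram of symbols
    for i in range(1, h):
        for j in range(1, w - 1):
            ctx = (img[i, j-1], img[i-1, j-1], img[i-1, j], img[i-1, j+1])
            hist = counts.setdefault(ctx, np.zeros(levels, dtype=np.int64))
            hist[img[i, j]] += 1
    return counts

img = np.random.randint(0, 16, size=(64, 64))    # 16-level toy image
counts = context_counts(img, levels=16)
print(len(counts), "contexts seen out of", 16**4, "possible")
```

With far fewer samples than possible contexts, most histograms stay nearly empty - exactly the context dilution problem of the next slide.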

12
Context dilution problem
  • The minimum code length of a sequence $x^n = x_1 \cdots x_n$ achievable by arithmetic coding is $-\sum_t \log_2 P(x_t \mid x^{t-1})$ bits, if the conditional probability model $P(x_t \mid x^{t-1})$ is known
  • The difficulty is in estimating $P(x_t \mid x^{t-1})$: sample statistics become insufficient as the number of conditioning states grows, preventing the use of high-order Markov models

13
Estimating probabilities in different contexts
  • Two approaches:
  • Maintain symbol occurrence counts within each context
  • The number of contexts needs to be modest to avoid context dilution
  • Assume the pdf shape within each context is the same (e.g. Laplacian), with only the parameters (e.g. mean and variance) differing
  • Estimation may not be as accurate, but a much larger number of contexts can be used

14
Context Explosion
  • Consider an image quantized to 16 levels
  • The causal context space grows exponentially with the template size (e.g. a 6-pixel causal template already gives $16^6 \approx 1.7 \times 10^7$ contexts)
  • Not enough data to learn the histograms - the data dilution problem
  • Solution: quantize the context space

15
Current Solutions
  • Non-binary sources:
  • JPEG: simple entropy coding without context
  • J2K: ad-hoc context quantization strategy
  • Binary sources:
  • JBIG: uses suboptimal context templates of modest sizes

16
Overview
  • Introduction
  • Context Quantization for Entropy Coding of
    Non-Binary Source
  • Context Quantization
  • Context Quantizer Description
  • Context Quantization for Entropy Coding of Binary
    Source
  • Minimum Description Length Context Quantizer
    Design
  • Image Dependent Context Quantizer Design with
    Efficient Side Info.
  • Context Quantization for Minimum Adaptive Code
    Length
  • Context-Based Classification and Quantization
  • Conclusions and Future Work

17
Context Quantization
  • Group the contexts with similar histograms

18
Distortion Measure
  • Kullback-Leibler distance between pmf histograms: $D(p \| q) = \sum_i p_i \log_2 (p_i / q_i)$
  • Always non-negative, and zero iff $p = q$
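
(A minimal grouping sketch; the Lloyd-style iteration and the averaged-pmf centroid are assumed details, chosen because the average pmf minimizes the within-group sum of KL distances.)

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler distance D(p||q) in bits between two pmfs."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log2(p / q)))

def group_contexts(hists, M, iters=20, seed=0):
    """Group context histograms into M clusters by nearest centroid in KL."""
    rng = np.random.default_rng(seed)
    pmfs = hists / hists.sum(axis=1, keepdims=True)
    centroids = pmfs[rng.choice(len(pmfs), M, replace=False)]
    for _ in range(iters):
        labels = np.array([np.argmin([kl(p, c) for c in centroids])
                           for p in pmfs])
        for m in range(M):                 # centroid = average member pmf
            if np.any(labels == m):
                centroids[m] = pmfs[labels == m].mean(axis=0)
    return labels
```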

19
Context Quantization
20
Experimental Results
  • 1st experiment: source
  • Use a controlled source with memory: a 1st-order Gauss-Markov process
  • Quantize into 32 levels
  • Flip the sign of every sample with probability 0.5 (sketched below)
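
(A sketch of how such a source could be generated; the correlation coefficient 0.9 and the quantizer step are assumed values, not stated on the slides.)

```python
import numpy as np

def gauss_markov(n, rho=0.9, seed=0):
    """1st-order Gauss-Markov source: x[t] = rho*x[t-1] + w[t], unit variance."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, np.sqrt(1.0 - rho**2), n)    # keeps Var(x[t]) = 1
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t-1] + w[t]
    return x

rng = np.random.default_rng(1)
x = gauss_markov(100_000)
x = np.where(rng.random(len(x)) < 0.5, -x, x)         # random sign flips
q = np.clip(np.floor(x * 4).astype(int) + 16, 0, 31)  # uniform 32-level quantizer
```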

21
  • 1st experiment: setup
  • Context space: the preceding quantized samples
  • How many groups do we expect?
  • A maximum of 16 is needed
  • All will be bimodal

22
  • 1st experiment: results
  • Vary M, the number of groups

    M     Rate (bits/symbol)
    1     4.062
    2     3.709
    4     3.563
    8     3.510
    16    3.504
    774   3.493
23
  • Histograms for M = 16

24
  • 2nd experiment: source
  • A 512×512 natural image - barb
  • Wavelet transform, 9-7 filters, 3 levels
  • Each subband scalar quantized to 24 levels

25
  • 2nd experiment: setup
  • Context space
  • Group histogram structure? Unknown

26
  • 2nd experiment: results
  • Subband S0 (low pass)
  • We need our method to quantize the context space!

27
Overview
  • Introduction
  • Context Quantization for Entropy Coding of
    Non-Binary Source
  • Context Quantization
  • Context Quantizer Description
  • Context Quantization for Entropy Coding of Binary
    Source
  • Minimum Description Length Context Quantizer
    Design
  • Image Dependent Context Quantizer Design with
    Efficient Side Info.
  • Context Quantization for Minimum Adaptive Code
    Length
  • Context-Based Classification and Quantization
  • Conclusions and Future Work

28
Quantizer description
[Block diagram: define context → quantize context → estimate probabilities → adaptive arithmetic coders 1 … N; the context book (the quantizer mapping) must be described to the decoder]
29
Coarse Context Quantization
[Figure: number of contexts under the full-resolution quantizer vs. low-resolution context quantizers (999729 vs. 65390)]
30
State Sequence
[Diagram: the context book maps the state sequence's context indices to group indices]
31
Experimental Results
  • Experiment: source
  • A 512×512 natural image - barb
  • Wavelet transform, 9-7 filters, 3 levels
  • Each subband scalar quantized to 24 levels

32
Experimental Results
33
Experimental Results
[Figures: results for subbands LH2 and LH3]
34
Overview
  • Introduction
  • Context Quantization for Entropy Coding of
    Non-Binary Source
  • Context Quantization
  • Context Quantizer Description
  • Context-Based Classification and Quantization
  • Context Quantization for Entropy Coding of Binary
    Source
  • Minimum Description Length Context Quantizer
    Design
  • Image Dependent Context Quantizer Design with
    Efficient Side Info.
  • Context Quantization for Minimum Adaptive Code
    Length
  • Conclusions and Future Work

35
MDL-Based Context Quantizer
[Block diagram: the same pipeline - define context → quantize context → estimate probabilities → adaptive arithmetic coders 1 … N, with the context book described to the decoder]
36
Distortion Measure
37
Context Quantization
  • Vector quantization (local optima)
  • Scalar quantization (global optimum)
38
Proposed Method
  • Context mapping description:
  • Use the training set to obtain the pmfs of common contexts
  • Apply the classification-map method for rare contexts
  • Minimum Description Length context quantizer:
  • Minimize the objective function (the description length)
  • Dynamic programming is applied to reach the global minimum (see the sketch below)
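
(A sketch of the dynamic program; the assumptions here are a binary source, contexts pre-sorted by their estimated P(0 | context), and a group cost equal to its empirical code length in bits. The full MDL objective would add a model-cost term.)

```python
import math

def group_cost(n0, n1):
    """Bits to code a group's symbols with the group's own empirical pmf."""
    n = n0 + n1
    if n0 == 0 or n1 == 0:
        return 0.0
    return -(n0 * math.log2(n0 / n) + n1 * math.log2(n1 / n))

def optimal_partition(counts, M):
    """Minimum total code length over all splits of the (sorted) contexts
    into M contiguous groups. counts[i] = (n0, n1) for context i."""
    K = len(counts)
    pre0 = [0] * (K + 1)
    pre1 = [0] * (K + 1)
    for i, (a, b) in enumerate(counts):
        pre0[i + 1] = pre0[i] + a
        pre1[i + 1] = pre1[i] + b
    def cost(i, j):      # cost of one group covering contexts i..j-1
        return group_cost(pre0[j] - pre0[i], pre1[j] - pre1[i])
    INF = float("inf")
    dp = [[INF] * (K + 1) for _ in range(M + 1)]
    dp[0][0] = 0.0       # dp[m][k]: best cost of m groups over first k contexts
    for m in range(1, M + 1):
        for k in range(1, K + 1):
            dp[m][k] = min(dp[m - 1][i] + cost(i, k) for i in range(k))
    return dp[M][K]
```

Sorting the contexts by P(0 | context) is what makes contiguous groups sufficient, so the search over interval boundaries reaches the global minimum rather than a local one.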

39
Contributions
  • A new context quantizer design approach based on
    the principle of minimum description length.
  • An input-dependent context quantizer design
    algorithm with efficient handling of rare
    contexts
  • Context quantization for minimum adaptive code
    length.
  • A novel technique to handle the mismatch of
    training statistics and source statistics.

40
Experimental Results
[Table: bit rates (bpp) of dithered halftone images]
[Table: bit rates (bpp) of error-diffusion halftone images]
41
Overview
  • Introduction
  • Context Quantization for Entropy Coding of
    Non-Binary Source
  • Context Quantization
  • Context Quantizer Description
  • Context Quantization for Entropy Coding of Binary
    Source
  • Minimum Description Length Context Quantizer
    Design
  • Image Dependent Context Quantizer Design with
    Efficient Side Info.
  • Context Quantization for Minimum Adaptive Code
    Length
  • Context-Based Classification and Quantization
  • Conclusions and Future Work

42
Motivation
  • The MDL-based context quantizer is designed mainly from the training-set statistics.
  • If there is any mismatch in statistics between the input and the training set, the optimality of the predesigned context quantizer can be compromised.

43
Input-dependent context quantization
  • Context quantizer design:
  • Raster-scan the input image to obtain the conditional probabilities and the number of occurrences of each context instance
  • Minimize the objective function by dynamic programming
  • The reproduction pmfs are sent as side information (see the sketch below)
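
(A compact sketch of this two-pass flow; context_counts and optimal_partition refer to the earlier sketches, and the binary-source framing is an assumption.)

```python
def design_and_code(img, M):
    # Pass 1: gather per-context counts over the input image itself.
    counts = context_counts(img, levels=2)           # binary source
    items = sorted(counts.items(),                   # order by P(0 | ctx)
                   key=lambda kv: kv[1][0] / kv[1].sum())
    pairs = [(int(h[0]), int(h[1])) for _, h in items]
    # Quantizer design: dynamic programming over the sorted contexts.
    bits = optimal_partition(pairs, M)
    # The M reproduction pmfs are the side information; pass 2 would
    # arithmetic-code each pixel against its group's pmf.
    return bits
```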

44
Handling of Rare Context Instances
  • Context template definition
  • Estimation of conditional probabilities from the training set
  • Rare contexts are estimated with a reduced-size context template
  • These estimates serve only as initial values and are updated while coding the input image

45
Coding Process
  • The context quantizer output may change along the way, following increasingly accurate estimates of the conditional probabilities

46
Experimental Results
[Table: bit rates (bpp) of error-diffusion halftone images]
47
Overview
  • Introduction
  • Context Quantization for Entropy Coding of
    Non-Binary Source
  • Context Quantization
  • Context Quantizer Description
  • Context Quantization for Entropy Coding of Binary
    Source
  • Minimum Description Length Context Quantizer
    Design
  • Image Dependent Context Quantizer Design with
    Efficient Side Info.
  • Context Quantization for Minimum Adaptive Code
    Length
  • Context-Based Classification and Quantization
  • Conclusions and Future Work

48
Motivation
  • Minimize the actual adaptive code length instead of the static code length
  • Minimize the effect of mismatch between the input and training-set statistics

49
Context Quantization for Minimum Adaptive Code
Length
  • The adaptive code length can be calculated in closed form from the symbol counts: with $n_0$ zeros and $n_1$ ones coded under an add-one (Laplace) adaptive estimator, $L(n_0, n_1) = \log_2 \frac{(n_0 + n_1 + 1)!}{n_0!\, n_1!}$
  • The order in which the 0s and 1s appear does not change the adaptive code length (checked below)
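
(A small numerical check of the order-independence claim; the add-one/Laplace estimator is an assumed choice, since the slides do not name the estimator.)

```python
import math
from itertools import permutations

def adaptive_bits(seq):
    """Adaptive code length with add-one (Laplace) probability estimates."""
    counts = [1, 1]                 # virtual initial counts for 0 and 1
    bits = 0.0
    for s in seq:
        bits += -math.log2(counts[s] / sum(counts))
        counts[s] += 1
    return bits

def closed_form(n0, n1):
    return math.log2(math.factorial(n0 + n1 + 1)
                     / (math.factorial(n0) * math.factorial(n1)))

for seq in set(permutations((0, 0, 0, 1, 1))):  # every order of 3 zeros, 2 ones
    assert abs(adaptive_bits(seq) - closed_form(3, 2)) < 1e-9
print("all orderings give the same adaptive code length")
```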

50
Context Quantization for Minimum Adaptive Code
Length
  • The objective function to minimize the effect of
    mismatch between the input and the training set

51
Experimental Results
[Table: bit rates (bpp) of error-diffusion halftone images]
52
Overview
  • Introduction
  • Context Quantization for Entropy Coding of
    Non-Binary Source
  • Context Quantization
  • Context Quantizer Description
  • Context Quantization for Entropy Coding of Binary
    Source
  • Minimum Description Length Context Quantizer
    Design
  • Image Dependent Context Quantizer Design with
    Efficient Side Info.
  • Context Quantization for Minimum Adaptive Code
    Length
  • Context-Based Classification and Quantization
  • Conclusions and Future Work

53
Motivation
[Figures: quantizer designed after classification vs. a single quantizer]
54
Single Quantizer
[Figure: the single-quantizer baseline]
55
Classification and Quantization
[Diagram labels: TC, EC, IQ]

    Transform coef.: 20.5  40.5  31  9  45  38  28  12  25  39.5  47  19  23.5
    Initial Q index:    2     4   3  0   4   3   2   1   2     3   4   1     2
    Group index:        0     1   1  0   1   1   0   0   0     1   1   0     1
56
Experimental Results
57
Overview
  • Introduction
  • Context Quantization for Entropy Coding of
    Non-Binary Source
  • Context Quantization
  • Context Quantizer Description
  • Context Quantization for Entropy Coding of Binary
    Source
  • Minimum Description Length Context Quantizer
    Design
  • Image Dependent Context Quantizer Design with
    Efficient Side Info.
  • Context Quantization for Minimum Adaptive Code
    Length
  • Context-Based Classification and Quantization
  • Conclusions and Future Work

58
Conclusions
  • Non-binary sources:
  • A new context quantization method is proposed
  • Efficient context description strategies
  • Binary sources:
  • Globally optimal partition of the context space
  • MDL-based context quantizer
  • Context-based classification and quantization

59
Future Work
  • Context-based entropy coder
  • Context shape optimization
  • Mismatch between the training set and test data
  • Classification and Quantization
  • Classification tree among the wavelet subbands
  • Apply these techniques to video codecs
