1
SWE 423: Multimedia Systems
  • Chapter 7: Data Compression (2)

2
Outline
  • Entropy Encoding
  • Arithmetic Coding
  • Predictive Coding
  • Lossless Predictive Coding
  • Differential Coding
  • Lossy Predictive Coding
  • Differential Pulse Code Modulation Coding (DPCM)
  • Delta Modulation (DM)

3
Entropy Encoding: Arithmetic Coding
  • Initial idea introduced in 1948 by Shannon
  • Many researchers worked on this idea
  • Modern arithmetic coding can be attributed to
    Pasco (1976) and Rissanen and Langdon (1979)
  • Arithmetic coding treats the whole message as one
    unit
  • In practice, the input data is usually broken up
    into chunks to avoid error propagation

4
Entropy Encoding: Arithmetic Coding
  • A message is represented by a half-open interval
    [a, b), where a, b ∈ ℝ.
  • General idea of encoding
  • Map the message into a half-open interval [a, b)
  • Find a binary fractional number with minimum
    length that belongs to the above interval. This
    will be the encoded message
  • Initially, [a, b) = [0, 1)
  • As the message becomes longer, the length of the
    interval shortens and the number of bits needed
    to represent the interval increases

5
Entropy Encoding: Arithmetic Coding
  • Coding Algorithm
  • Algorithm ArithmeticCoding
  • // Input: symbol = input stream of the message;
  //          terminator = terminator symbol
  • // Low and High: all symbols' ranges
  • // Output: binary fractional code of the message
  • low = 0; high = 1; range = 1
  • do
  •   get(symbol)
  •   high = low + range * High(symbol)
  •   low = low + range * Low(symbol)
  •   range = high - low
  • while (symbol != terminator)
  • return CodeWord(low, high)
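A runnable sketch of this loop in Python. The RANGES table encodes each symbol's [Low(s), High(s)) range, built from the probabilities on slide 7 under an assumed A-to-$ ordering; the table, the ordering, and the '$' terminator are illustrative choices, not fixed by the slides.

    # Cumulative ranges (Low(s), High(s)) per symbol; any table whose
    # intervals partition [0, 1) works. The ordering A..$ is an assumption.
    RANGES = {
        'A': (0.00, 0.20), 'B': (0.20, 0.30), 'C': (0.30, 0.50),
        'D': (0.50, 0.55), 'E': (0.55, 0.85), 'F': (0.85, 0.90),
        '$': (0.90, 1.00),   # '$' is the terminator symbol
    }

    def arithmetic_encode(message):
        """Narrow [low, high) once per symbol; message must end with '$'."""
        low, high = 0.0, 1.0
        for symbol in message:
            rng = high - low
            high = low + rng * RANGES[symbol][1]
            low = low + rng * RANGES[symbol][0]
        return low, high   # any number in [low, high) encodes the message

For example, arithmetic_encode("CAEE$") returns approximately (0.33184, 0.3322), matching the worked trace on slide 7 below.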

6
Entropy Encoding: Arithmetic Coding
  • Binary code generation
  • Algorithm CodeWord
  • // Input: low and high
  • // Output: binary fractional code
  • code = 0
  • k = 1
  • while (value(code) < low)
  •   assign 1 to the kth binary fraction bit
  •   if (value(code) ≥ high)
  •     replace the kth bit by 0
  •   k = k + 1
  • return code
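The same greedy bit construction in Python (a sketch; value tracks the numeric value of the bits kept so far):

    def codeword(low, high):
        """Shortest binary fraction 0.b1b2... with value in [low, high)."""
        bits = []
        value = 0.0
        k = 1
        while value < low:
            bit = 2.0 ** -k             # weight of the k-th fraction bit
            if value + bit >= high:     # would overshoot: keep this bit 0
                bits.append('0')
            else:                       # keep this bit 1
                bits.append('1')
                value += bit
            k += 1
        return ''.join(bits)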

7
Entropy Encoding: Arithmetic Coding
  • Example: Assume S = {A, B, C, D, E, F, $}, where $ is
    the terminator symbol. In addition, assume the
    following probabilities for each character
  • Pr(A) = 0.2
  • Pr(B) = 0.1
  • Pr(C) = 0.2
  • Pr(D) = 0.05
  • Pr(E) = 0.3
  • Pr(F) = 0.05
  • Pr($) = 0.1
  • Generate the fractional binary code of the
    message CAEE$
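A worked trace, assuming the symbols keep the order A…$ so that the cumulative ranges are A: [0, 0.2), B: [0.2, 0.3), C: [0.3, 0.5), D: [0.5, 0.55), E: [0.55, 0.85), F: [0.85, 0.9), $: [0.9, 1.0) (the ordering is a convention the encoder and decoder must share):

    symbol   low                              high                             range
    (start)  0                                1                                1
    C        0 + 1(0.3) = 0.3                 0 + 1(0.5) = 0.5                 0.2
    A        0.3 + 0.2(0) = 0.3               0.3 + 0.2(0.2) = 0.34            0.04
    E        0.3 + 0.04(0.55) = 0.322         0.3 + 0.04(0.85) = 0.334         0.012
    E        0.322 + 0.012(0.55) = 0.3286     0.322 + 0.012(0.85) = 0.3322     0.0036
    $        0.3286 + 0.0036(0.9) = 0.33184   0.3286 + 0.0036(1.0) = 0.3322    0.00036

CodeWord(0.33184, 0.3322) then yields 01010101 (value 0.33203125, which lies in [0.33184, 0.3322)), so CAEE$ encodes in 8 bits.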

8
Entropy Encoding: Arithmetic Coding
  • It can be proven that ⌈log2(1 / ∏i Pi)⌉ is the
    upper bound on the number of bits needed to
    encode a message
  • In our case, the maximum is equal to 12.
  • When the length of the message increases, the
    range decreases and the upper bound value increases
  • Generally, arithmetic coding outperforms Huffman
    coding
  • It treats the whole message as one unit, whereas
    Huffman coding uses an integral number of bits to
    code each character
  • Redo the previous example CAEE$ using Huffman
    coding and notice how many bits are required to
    code this message (see the comparison below)
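To make the comparison concrete: for CAEE$, ∏ Pi = 0.2 · 0.2 · 0.3 · 0.3 · 0.1 = 0.00036, so the bound is ⌈log2(1/0.00036)⌉ = ⌈11.44⌉ = 12 bits, while the arithmetic code found above needs only 8. One possible Huffman tree for these probabilities (individual code lengths depend on tie-breaking) assigns 2-bit codes to A, C, and E and 4-bit codes to B, D, F, and $, so CAEE$ costs 2 + 2 + 2 + 2 + 4 = 12 bits.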

9
Entropy Encoding: Arithmetic Coding
  • Decoding Algorithm
  • Algorithm ArithmeticDecoding
  • // Input: code = binary code of the message
  • // Low and High: all symbols' ranges
  • // Output: the decoded message
  • value = convert2decimal(code)
  • do
  •   find the symbol s such that
  •     Low(s) ≤ value < High(s)
  •   output s
  •   low = Low(s); high = High(s); range = high - low
  •   value = (value - low) / range
  • while s is not the terminator symbol
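A matching Python sketch, reusing the assumed RANGES table from the encoder sketch above:

    def arithmetic_decode(value):
        """Recover the message from the numeric value of the code."""
        message = []
        while True:
            # Find the unique symbol whose range contains value
            # (assumes the ranges partition [0, 1)).
            for symbol, (low, high) in RANGES.items():
                if low <= value < high:
                    break
            message.append(symbol)
            if symbol == '$':                       # terminator: stop
                return ''.join(message)
            value = (value - low) / (high - low)    # rescale into [0, 1)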

10
Entropy Encoding: Arithmetic Coding
  • Example
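The example itself is not reproduced in this transcript; a plausible reconstruction decodes the 8-bit code 01010101 from the earlier CAEE$ example:

    value = 0.01010101 (binary) = 0.33203125
    0.33203125 in [0.3, 0.5)   -> C ; value = (0.33203125 - 0.3) / 0.2  = 0.16015625
    0.16015625 in [0, 0.2)     -> A ; value = 0.16015625 / 0.2          = 0.80078125
    0.80078125 in [0.55, 0.85) -> E ; value = (0.80078125 - 0.55) / 0.3 = 0.8359375
    0.8359375  in [0.55, 0.85) -> E ; value = (0.8359375 - 0.55) / 0.3  = 0.953125
    0.953125   in [0.9, 1.0)   -> $ ; terminator reached, stop

Decoded message: CAEE$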

11
Predictive Coding
  • Predictive coding simply means transmitting
    differences
  • The simplest scheme predicts the next sample as
    being equal to the current sample
  • More complex prediction schemes can be used
  • Instead of sending the current sample, send the
    prediction error, i.e., the difference between the
    actual sample and its predicted value

12
Predictive Coding: Why?
  • The idea of forming differences is to make the
    histogram of sample values more peaked.
  • In this case, what happens to the entropy?
  • As a result, which is better to compress?
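A quick way to see both answers (a sketch; the smooth sine "signal" is an illustrative assumption): compare the empirical entropy of a signal's values with that of its first differences.

    import math
    from collections import Counter

    def entropy(values):
        """Empirical entropy in bits per sample."""
        counts = Counter(values)
        n = len(values)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    # Slowly varying signal: ~100 distinct values, flat-ish histogram.
    signal = [round(100 + 50 * math.sin(i / 10)) for i in range(500)]
    diffs = [signal[i] - signal[i - 1] for i in range(1, len(signal))]

    print(entropy(signal))   # higher: values spread over many levels
    print(entropy(diffs))    # lower: differences cluster near 0

The peaked histogram of the differences has lower entropy, so the difference signal is the one an entropy coder can compress better.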

13
Predictive Coding: Why?
14
Lossless Predictive Coding
  • Formally, define the integer signal as the set of
    values fn. Then, we predict values f̂n and
    compute the error en as follows:
  •   f̂n = a1·fn-1 + a2·fn-2 + … + at·fn-t
  •   en = fn - f̂n
  • When t = 1 (with a1 = 1), we get f̂n = fn-1,
    i.e., simple differential coding
  • Usually, t is between 2 and 4 (in this case it is
    called a linear predictor)
  • We might need to have a truncating or rounding
    operation following the prediction computation,
    since the prediction must be an integer (see the
    sketch below)
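A sketch of a lossless encoder/decoder pair, assuming the common two-tap averaging predictor f̂n = ⌊(fn-1 + fn-2) / 2⌋ and sending the first two samples verbatim (both choices are assumptions for illustration):

    def predict(prev1, prev2):
        """Two-tap averaging predictor with truncation."""
        return (prev1 + prev2) // 2

    def pc_encode(samples):
        """Emit the first two samples as-is, then prediction errors."""
        out = list(samples[:2])
        for n in range(2, len(samples)):
            out.append(samples[n] - predict(samples[n - 1], samples[n - 2]))
        return out

    def pc_decode(stream):
        """Invert pc_encode(): rebuild each sample from its error."""
        f = list(stream[:2])
        for n in range(2, len(stream)):
            f.append(stream[n] + predict(f[n - 1], f[n - 2]))
        return f

Because no quantization is involved, pc_decode(pc_encode(f)) reproduces f exactly, which is what makes the scheme lossless.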

15
Lossless Predictive Coding
16
Lossless Predictive Coding: Example
  • Consider the following predictor
  • Show how to code the following sequence
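The slide's actual predictor and sequence are not reproduced in this transcript; an illustrative trace, assuming the predictor f̂n = ⌊(fn-1 + fn-2) / 2⌋ and the input sequence 21, 22, 27, 25, 22 (both assumed):

    n    fn    f̂n                 en = fn - f̂n
    1    21    (sent verbatim)     -
    2    22    (sent verbatim)     -
    3    27    (22 + 21)/2 = 21    6
    4    25    (27 + 22)/2 = 24    1
    5    22    (25 + 27)/2 = 26    -4

The transmitted stream 21, 22, 6, 1, -4 has small-magnitude errors after the first two samples, exactly the peaked histogram that entropy coding exploits.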

17
Lossless Predictive Coding
  • Examples in the Image Compression Domain
  • Differential Coding
  • Lossless JPEG
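For reference, lossless JPEG chooses one of seven fixed predictors, each estimating a pixel X from its left (A), above (B), and upper-left (C) neighbors:

    P1 = A    P2 = B    P3 = C    P4 = A + B - C
    P5 = A + (B - C)/2    P6 = B + (A - C)/2    P7 = (A + B)/2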

18
Lossy Predictive Coding: DPCM
  • DPCM = Differential Pulse Code Modulation
  • Form the prediction f̂n
  • Form the error en = fn - f̂n
  • Quantize the error: ẽn = Q(en), transmit it, and
    reconstruct f̃n = f̂n + ẽn (see the sketch below)
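A minimal DPCM encoder sketch, assuming the previous reconstructed sample as the predictor and a uniform quantizer with step size step (both assumptions; real codecs tune both):

    def quantize(e, step):
        """Uniform quantizer: round e to the nearest multiple of step."""
        return step * round(e / step)

    def dpcm_encode(samples, step):
        """Predict from the *reconstructed* signal so the decoder,
        which only ever sees quantized errors, stays in sync."""
        out = [samples[0]]                        # first sample as-is (assumption)
        recon = samples[0]
        for f in samples[1:]:
            e_quant = quantize(f - recon, step)   # f̂n = previous recon value
            out.append(e_quant)
            recon = recon + e_quant               # f̃n = f̂n + ẽn
        return out

Predicting from reconstructed rather than original samples is the key design choice: it keeps encoder and decoder state identical, so quantization error never accumulates at the decoder.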

19
Lossy Predictive Coding: DPCM
  • The distortion is the average squared error (see
    the formula below)
  • To illustrate the quality of a compression
    scheme, diagrams of distortion vs. the number of
    quantizer bit levels used are usually shown
  • Quantization used
  • Uniform
  • Lloyd-Max
  • Does better than Uniform, since its quantization
    levels are chosen to minimize the mean squared
    error for the signal's distribution
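Written out, for N samples with reconstructed values f̃n, the distortion is the mean squared error:

    D = (1/N) · Σ n=1..N (f̃n - fn)²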

20
Lossy Predictive Coding: DPCM
21
Lossy Predictive Coding: DPCM
  • Example
  • Show how to code the following sequence

22
Lossy Predictive Coding
  • DM (Delta Modulation) is a simplified version of
    DPCM that is used as a quick analog-to-digital
    converter.
  • Note that the prediction simply involves a delay:
    f̂n = f̃n-1 (see the sketch below)
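A one-bit-per-sample DM sketch (the fixed step delta and the convention that the first sample is known to both sides are assumptions):

    def dm_encode(samples, delta):
        """Delta modulation: one bit per sample, step of +/- delta."""
        recon = samples[0]                       # shared starting state
        bits = []
        for f in samples[1:]:
            bit = 1 if f > recon else 0          # sign of the prediction error
            bits.append(bit)
            recon += delta if bit else -delta    # decoder mirrors this update
        return bits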