Huffman Encoding - PowerPoint PPT Presentation
Provided by: davidma

Transcript and Presenter's Notes

1
Huffman Encoding
2
Entropy
  • Entropy is a measure of information content: the
    number of bits actually required to store data
  • Entropy is sometimes called a measure of surprise
  • A highly predictable sequence contains little
    actual information
  • Example: 11011011011011011011011011 (what's
    next?)
  • Example: "I didn't win the lottery this week"
  • A completely unpredictable sequence of n bits
    contains n bits of information
  • Example: 01000001110110011010010000 (what's
    next?)
  • Example: "I just won 10 million in the
    lottery!!!!"
  • Note that nothing says the information has to
    have any meaning (whatever that is)

3
Actual information content
  • A partially predictable sequence of n bits
    carries less than n bits of information
  • Example 1: 111110101111111100101111101100
  • Blocks of 3: 111 110 101 111 111 100 101 111 101 100
  • Example 2: 101111011111110111111011111100
  • Unequal probabilities: p(1) = 0.75, p(0) = 0.25
  • Example 3: "We, the people, in order to form
    a..."
  • Unequal character probabilities: e and t are
    common, j and q are uncommon
  • Example 4: we, the, people, in, order, to, ...
  • Unequal word probabilities: "the" is very common
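The unequal-probability case can be quantified with Shannon's formula, H = -Σ p(c) · log2 p(c). A minimal sketch in Python (this is the zeroth-order entropy, which counts symbol frequencies and ignores ordering, so it fits Example 2 above rather than the repeating patterns):

```python
import math
from collections import Counter

def entropy_bits_per_symbol(s):
    """Zeroth-order Shannon entropy: H = -sum of p(c) * log2 p(c).
    Treats symbols as independent draws; ordering is ignored."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

# A string with p(1) = 0.75, p(0) = 0.25, as in Example 2
bits = "1110" * 8
print(round(entropy_bits_per_symbol(bits), 3))   # 0.811
```

So at these probabilities each bit carries only about 0.81 bits of information, which is why the sequence is compressible.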

4
Fixed and variable bit widths
  • To encode English text, we need 26 lower case
    letters, 26 upper case letters, and a handful of
    punctuation marks
  • We can get by with 64 characters (6 bits) in all
  • Each character is therefore 6 bits wide
  • We can do better, provided:
  • Some characters are more frequent than others
  • Characters may be different bit widths, so that,
    for example, e uses only one or two bits, while x
    uses several
  • We have a way of decoding the bit stream, that
    is, of telling where each character begins and ends

5
Example Huffman encoding
  • A = 0, B = 100, C = 1010, D = 1011, R = 11
  • ABRACADABRA = 01001101010010110100110
  • This is eleven letters in 23 bits
  • A fixed-width encoding would require 3 bits for
    five different letters, or 33 bits for 11 letters
  • Notice that the encoded bit string can be decoded!
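The table from this slide can be exercised directly. A small sketch (the `codes` dict mirrors the slide's table; function names are illustrative):

```python
# The code table from the slide
codes = {"A": "0", "B": "100", "C": "1010", "D": "1011", "R": "11"}

def encode(text):
    return "".join(codes[ch] for ch in text)

def decode(bits):
    # Because no code is a prefix of another, we can scan left to
    # right, emitting a symbol as soon as the accumulated bits match.
    inverse = {v: k for k, v in codes.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inverse:
            out.append(inverse[cur])
            cur = ""
    return "".join(out)

bits = encode("ABRACADABRA")
print(bits, len(bits))   # 01001101010010110100110 23
print(decode(bits))      # ABRACADABRA
```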

6
Why it works
  • In this example, A was the most common letter
  • In ABRACADABRA:
  • 5 A's: the code for A is 1 bit long
  • 2 R's: the code for R is 2 bits long
  • 2 B's: the code for B is 3 bits long
  • 1 C: the code for C is 4 bits long
  • 1 D: the code for D is 4 bits long

7
Creating a Huffman encoding
  • For each encoding unit (letter, in this example),
    associate a frequency (number of times it occurs)
  • You can also use a percentage or a probability
  • Create a binary tree whose children are the two
    encoding units with the smallest frequencies
  • The frequency of the root is the sum of the
    frequencies of the leaves
  • Repeat this procedure, treating each new tree as
    a single unit, until all the encoding units are
    in the binary tree

8
Example, step I
  • Assume that the relative frequencies are:
  • A: 40
  • B: 20
  • C: 10
  • D: 10
  • R: 20
  • (I chose simpler numbers than the real
    frequencies)
  • The smallest numbers are 10 and 10 (C and D), so
    connect those

9
Example, step II
  • C and D have already been used, and the new node
    above them (call it CD) has value 20
  • The smallest values are B, CD, and R, all of
    which have value 20
  • Connect any two of these

10
Example, step III
  • The smallest value is now R (20), while A and
    BCD both have value 40
  • Connect R to either of the others

11
Example, step IV
  • Connect the final two nodes

12
Example, step V
  • Assign 0 to left branches, 1 to right branches
  • Each encoding is a path from the root
  • A = 0, B = 100, C = 1010, D = 1011, R = 11
  • Each path terminates at a leaf
  • Do you see why encoded strings are decodable?
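The path-from-root rule can be expressed as a short recursive walk. The hand-built tuple tree below is an assumed representation (left child, right child), arranged to reproduce the slides' codes:

```python
def assign_codes(node, prefix="", table=None):
    """Walk the tree: 0 for a left branch, 1 for a right branch.
    Each leaf's code is the path from the root down to it."""
    if table is None:
        table = {}
    if isinstance(node, str):         # leaf: a single symbol
        table[node] = prefix or "0"   # degenerate one-symbol tree gets "0"
    else:
        left, right = node
        assign_codes(left, prefix + "0", table)
        assign_codes(right, prefix + "1", table)
    return table

# Hand-built tree matching the slides: A alone on the left,
# B, C, D under "10" and R under "11" on the right
tree = ("A", (("B", ("C", "D")), "R"))
print(assign_codes(tree))
# {'A': '0', 'B': '100', 'C': '1010', 'D': '1011', 'R': '11'}
```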

13
Unique prefix property
  • A = 0, B = 100, C = 1010, D = 1011, R = 11
  • No bit string is a prefix of any other bit string
  • For example, if we added E = 01, then A (0) would
    be a prefix of E
  • Similarly, if we added F = 10, then it would be a
    prefix of three other encodings (B = 100,
    C = 1010, and D = 1011)
  • The unique prefix property holds because, in a
    binary tree, a leaf is not on a path to any other
    node

14
Practical considerations
  • It is not practical to create a Huffman encoding
    for a single short string, such as ABRACADABRA
  • To decode it, you would need the code table
  • If you include the code table along with the
    message, the whole thing is bigger than just the
    ASCII message
  • Huffman encoding is practical if:
  • The encoded string is large relative to the code
    table, OR
  • We agree on the code table beforehand
  • For example, it's easy to find a table of letter
    frequencies for English (or any other
    alphabet-based language)

15
About the example
  • My example gave a nice, good-looking binary tree,
    with no lines crossing other lines
  • That's because I chose my example and numbers
    carefully
  • If you do this for real data, you can expect your
    drawing will be a lot messier; that's OK

16
Data compression
  • Huffman encoding is a simple example of data
    compression: representing data in fewer bits than
    it would otherwise need
  • A more sophisticated method is GIF (Graphics
    Interchange Format) compression, for .gif files
  • Another is JPEG (Joint Photographic Experts
    Group), for .jpg files
  • Unlike the others, JPEG is lossy: it loses
    information
  • Generally OK for photographs (if you don't
    compress them too much), because decompression
    adds fake data very similar to the original

17
The End