Compression & Huffman Codes
Compression
Definition
Reduce size of data
(number of bits needed to represent data)
Benefits
Reduce storage needed
Reduce transmission cost / latency / bandwidth
Sources of Compressibility
Redundancy
Recognize repeating patterns
Exploit using
Dictionary
Variable length encoding
Human perception
Less sensitive to some information
Can discard less important data
Types of Compression
Lossless
Preserves all information
Exploits redundancy in data
Applied to general data
Lossy
May lose some information
Exploits redundancy & human perception
Applied to audio, image, video
Effectiveness of Compression
Metrics
Bits per byte (8 bits)
2 bits / byte → ¼ original size
8 bits / byte → no compression
Percentage
75% compression → ¼ original size
Effectiveness of Compression
Depends on data
Random data → hard
Example: 1001110100 → ?
Organized data → easy
Example: 1111111111 → 110
Corollary
No universally best compression algorithm
Effectiveness of Compression
Lossless compression is not always possible
Proof sketch (by contradiction): suppose every file could be compressed
Compress the file (reducing its size by at least 1 bit)
Recompress the output
Repeat until the data is stored in 0 bits, which is absurd
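As an aside (not in the original slides), both points can be checked empirically with Python's zlib module, a DEFLATE implementation that combines LZ77 with Huffman coding: organized data shrinks dramatically, while random data does not shrink at all.

```python
import os
import zlib

repetitive = b"1" * 1000        # highly organized: a single repeating symbol
random_data = os.urandom(1000)  # effectively incompressible

small = len(zlib.compress(repetitive))
large = len(zlib.compress(random_data))
print(small, large)  # the repetitive input shrinks to a few bytes;
                     # the random input stays near 1000 bytes or grows slightly
```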
Lossless Compression Techniques
LZW (Lempel-Ziv-Welch) compression
Build pattern dictionary
Replace patterns with index into dictionary
Run length encoding
Find & compress repetitive sequences
Huffman codes
Use variable length codes based on frequency
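To make the run length encoding idea concrete, here is a minimal Python sketch (the pair-list output format is illustrative, not a standard):

```python
import itertools

def run_length_encode(data):
    """Collapse each run of identical symbols into a (symbol, count) pair."""
    return [(sym, len(list(run))) for sym, run in itertools.groupby(data)]

print(run_length_encode("1111111111"))  # [('1', 10)]
print(run_length_encode("AAAABBBCC"))   # [('A', 4), ('B', 3), ('C', 2)]
```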
Huffman Code
Approach
Variable length encoding of symbols
Exploit statistical frequency of symbols
Efficient when symbol probabilities vary widely
Principle
Use fewer bits to represent frequent symbols
Use more bits to represent infrequent symbols
Example string: A A B A A A A B (A is frequent, B is infrequent)
Huffman Code Example
Expected size
Original → 1/8·2 + 1/4·2 + 1/2·2 + 1/8·2 = 2 bits / symbol
Huffman → 1/8·3 + 1/4·2 + 1/2·1 + 1/8·3 = 1.75 bits / symbol

Symbol             A       B       C       D
Frequency          12.5%   25%     50%     12.5%
Original encoding  00      01      10      11
                   2 bits  2 bits  2 bits  2 bits
Huffman encoding   110     10      0       111
                   3 bits  2 bits  1 bit   3 bits
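The expected sizes above are frequency-weighted averages of the code lengths, which a few lines of Python can verify (expected_bits is a name chosen here, not from the slides):

```python
freq = {"A": 0.125, "B": 0.25, "C": 0.5, "D": 0.125}  # symbol probabilities
fixed_lengths = {s: 2 for s in freq}                  # original 2-bit codes
huffman_lengths = {"A": 3, "B": 2, "C": 1, "D": 3}    # lengths from the table

def expected_bits(freq, lengths):
    """Expected code length = sum of probability x code length."""
    return sum(p * lengths[s] for s, p in freq.items())

print(expected_bits(freq, fixed_lengths))    # 2.0
print(expected_bits(freq, huffman_lengths))  # 1.75
```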
Huffman Code Data Structures
Binary (Huffman) tree
Represents Huffman code
Edge → code (0 or 1)
Leaf → symbol
Path to leaf → encoding
Example
A = “110”, B = “10”, C = “0”
Priority queue
To efficiently build the binary tree
[Tree diagram: edges labeled 0 and 1; leaves D, C, B, A]
Huffman Code Algorithm Overview
Encoding
Calculate frequency of symbols in file
Create binary tree representing “best” encoding
Use binary tree to encode compressed file
For each symbol, output path from root to leaf
Size of encoding = length of path
Save binary tree
Huffman Code – Creating Tree
Algorithm
Place each symbol in leaf
Weight of leaf = symbol frequency
Select two trees L and R (initially leaves)
such that L and R have the lowest frequencies
Create new (internal) node
Left child → L
Right child → R
New frequency → frequency( L ) + frequency( R )
Repeat until all nodes merged into one tree
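A minimal sketch of the algorithm above, using Python's heapq as the priority queue (the node layout and names are choices made here; a counter breaks frequency ties so heapq never compares node tuples):

```python
import heapq
import itertools

def build_tree(freqs):
    """Repeatedly merge the two lowest-frequency trees until one remains.
    Nodes are (symbol, left, right); symbol is None for internal nodes."""
    tiebreak = itertools.count()
    heap = [(f, next(tiebreak), (s, None, None)) for s, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # lowest-frequency tree
        f2, _, right = heapq.heappop(heap)   # second lowest
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (None, left, right)))
    return heap[0][2]

def code_lengths(tree, depth=0):
    """Depth of each leaf = length of that symbol's code."""
    symbol, left, right = tree
    if symbol is not None:
        return {symbol: depth}
    return {**code_lengths(left, depth + 1), **code_lengths(right, depth + 1)}

lengths = code_lengths(build_tree({"A": 3, "C": 5, "E": 8, "H": 2, "I": 7}))
print(lengths)  # A and H get 3-bit codes; C, E, I get 2-bit codes
```

The exact 0/1 assignments depend on tie-breaking order, but the code lengths match the construction worked through on the following slides.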
Huffman Tree Construction 1
One leaf per symbol, weighted by frequency: A (3), C (5), E (8), H (2), I (7)
Huffman Tree Construction 2
Merge the two lowest-weight trees, H (2) and A (3), into a tree of weight 5; forest: {H,A} (5), C (5), E (8), I (7)
Huffman Tree Construction 3
Merge {H,A} (5) and C (5) into a tree of weight 10; forest: {C,H,A} (10), E (8), I (7)
Huffman Tree Construction 4
Merge I (7) and E (8) into a tree of weight 15; forest: {C,H,A} (10), {I,E} (15)
Huffman Tree Construction 5
Merge the last two trees into the root, weight 10 + 15 = 25
Label the two edges of each internal node 0 and 1; each code is the path from root to leaf:
E = 01
I = 00
C = 10
A = 111
H = 110
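Reading codes off the finished tree is a simple traversal. Below, the final tree from this construction is written out literally as nested tuples (a representation chosen here, not from the slides), and assign_codes recovers exactly the table above:

```python
# Leaves are (symbol, None, None); internal nodes are (None, left, right).
leaf = lambda s: (s, None, None)
node = lambda l, r: (None, l, r)

# The finished tree: edge to the left child is 0, to the right child is 1.
root = node(node(leaf("I"), leaf("E")),        # 0-subtree: I = 00, E = 01
            node(leaf("C"),                    # 1-subtree: C = 10
                 node(leaf("H"), leaf("A"))))  #            H = 110, A = 111

def assign_codes(tree, prefix=""):
    """Each symbol's code is the 0/1 path from the root to its leaf."""
    symbol, left, right = tree
    if symbol is not None:
        return {symbol: prefix}
    return {**assign_codes(left, prefix + "0"),
            **assign_codes(right, prefix + "1")}

print(assign_codes(root))
# {'I': '00', 'E': '01', 'C': '10', 'H': '110', 'A': '111'}
```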
Huffman Coding Example
Huffman code
Input
ACE
Output
(111)(10)(01) = 1111001
E = 01
I = 00
C = 10
A = 111
H = 110
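Encoding is then one table lookup per symbol; a short Python sketch using the code table above:

```python
codes = {"E": "01", "I": "00", "C": "10", "A": "111", "H": "110"}

def encode(text, codes):
    """Concatenate each symbol's code; the prefix property keeps it decodable."""
    return "".join(codes[sym] for sym in text)

print(encode("ACE", codes))  # 1111001
```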
Huffman Code Algorithm Overview
Decoding
Read compressed file & binary tree
Use binary tree to decode file
Follow path from root to leaf
Huffman Decoding 1
Input: 1111001. Start at the root of the tree built above; output so far: (empty)
Huffman Decoding 2
Read 1: follow edge 1 to an internal node
Huffman Decoding 3
Read 1, 1: path 111 reaches leaf A. Output: A; return to the root
Huffman Decoding 4
Read 1: follow edge 1. Output so far: A
Huffman Decoding 5
Read 0: path 10 reaches leaf C. Output: AC; return to the root
Huffman Decoding 6
Read 0: follow edge 0. Output so far: AC
Huffman Decoding 7
Read 1: path 01 reaches leaf E. Output: ACE. Input exhausted; done
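The bit-by-bit walk traced above can be sketched directly: follow edges until a leaf, emit its symbol, return to the root (the tuple tree layout is a choice made here, not from the slides):

```python
# Leaves are (symbol, None, None); internal nodes are (None, left, right).
leaf = lambda s: (s, None, None)
node = lambda l, r: (None, l, r)
root = node(node(leaf("I"), leaf("E")),             # tree from the slides
            node(leaf("C"), node(leaf("H"), leaf("A"))))

def decode(bits, root):
    """Follow 0 = left, 1 = right; emit a symbol at each leaf, then restart."""
    out, current = [], root
    for bit in bits:
        _, left, right = current
        current = left if bit == "0" else right
        symbol, _, _ = current
        if symbol is not None:        # reached a leaf
            out.append(symbol)
            current = root
    return "".join(out)

print(decode("1111001", root))  # ACE
```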
Huffman Code Properties
Prefix code
No code is a prefix of another code
Example
Huffman(“I”) → 00
Huffman(“X”) → 001 // not a legal prefix code
Can stop as soon as complete code found
No need for end-of-code marker
Nondeterministic
Multiple Huffman codes are possible for the same input
Occurs when two or more trees tie for the minimal weight
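The prefix property can be checked mechanically. One standard trick: sort the codewords; if any code is a prefix of another, the pair ends up adjacent after sorting, so comparing neighbors suffices (is_prefix_free is a name chosen here):

```python
def is_prefix_free(codes):
    """True iff no codeword is a prefix of another."""
    words = sorted(codes.values())
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

huffman = {"E": "01", "I": "00", "C": "10", "A": "111", "H": "110"}
print(is_prefix_free(huffman))                  # True
print(is_prefix_free({"I": "00", "X": "001"}))  # False: 00 is a prefix of 001
```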
Huffman Code Properties
Greedy algorithm
Chooses best local solution at each step
Combines 2 trees with lowest frequency
Still yields a globally optimal solution (minimum expected code length)
Optimal prefix code
Based on statistical frequency
Better compression possible (depends on data)
Using other approaches (e.g., pattern dictionary)
