DEPARTMENT OF ELECTRONICS AND COMMUNICATION
SRI CHANDRASEKHARENDRA SARASWATHI
VISWA MAHAVIDYALAYA
(Deemed to be University established under section 3 of UGC act 1956)
ENATHUR, KANCHIPURAM
A Course Material on
Information Theory and Coding
By
S.CHANDRAMOHAN
Assistant Professor
DEPARTMENT OF ELECTRONICS AND COMMUNICATION
ENGINEERING
SRI CHANDRASEKHARENDRA SARASWATHI
VISWA MAHAVIDYALAYA
(Deemed to be University established under section 3 of UGC act 1956)
ENATHUR, KANCHIPURAM – 631 561
INFORMATION THEORY AND CODING
Pre-requisite: Basic knowledge of Digital Communication
OBJECTIVE: To expose students to information and entropy, compression techniques, and audio and video coding
UNIT I: INFORMATION THEORY
Information – Entropy, Information rate, classification of codes, Kraft McMillan inequality,
Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding -
Joint and conditional entropies, Mutual information - Discrete memoryless channels – BSC,
BEC – Channel capacity, Shannon limit
UNIT II: ERROR CONTROL CODING: BLOCK CODES
Definitions and Principles: Hamming weight, Hamming distance, Minimum
distance decoding - Single parity codes, Hamming codes, Repetition codes - Linear block codes,
Cyclic codes - Syndrome calculation, Encoder and decoder - CRC
UNIT III: ERROR CONTROL CODING: CONVOLUTIONAL CODES
Convolutional codes – code tree, trellis, state diagram - Encoding – Decoding: Sequential search
and Viterbi algorithm – Principle of Turbo coding
UNIT IV: SOURCE CODING: TEXT, AUDIO AND SPEECH
Text: Adaptive Huffman Coding, Arithmetic Coding, LZW algorithm – Audio: Perceptual
coding, Masking techniques, Psychoacoustic model, MPEG Audio layers I, II, III, Dolby AC3 -
Speech: Channel Vocoder, Linear Predictive Coding
UNIT V: SOURCE CODING: IMAGE AND VIDEO
Image and Video Formats – GIF, TIFF, SIF, CIF, QCIF – Image compression: READ, JPEG –
Video Compression: Principles: I, B, P frames, Motion estimation, Motion compensation,
H.261, MPEG standard
Text Books:
1. Ranjan Bose, Information Theory, Coding and Cryptography, Tata McGraw-Hill, 2005.
2. Cover, Thomas, and Joy Thomas. Elements of Information Theory. 2nd ed. New York, NY:
Wiley-Interscience, 2006. ISBN: 9780471241959.
E-books and online learning materials:
1. http://www-public.tem-tsp.eu/~uro/cours-pdf/poly.pdf
2. http://www.cl.cam.ac.uk/teaching/0910/InfoTheory/InfoTheoryLectures.pdf
INFORMATION THEORY: Information – Entropy, Information rate, classification of codes,
Kraft McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding,
Extended Huffman coding - Joint and conditional entropies, Mutual information - Discrete
memoryless channels – BSC, BEC – Channel capacity, Shannon limit
***************************************************************************************************
Information is what a communication system, whether analog or digital, exists to convey.
Information theory is the mathematical study of the coding of information, together with its
quantification, storage, and communication.
Conditions of Occurrence of Events
If we consider an event, there are three conditions of occurrence.
 If the event has not occurred, there is a condition of uncertainty.
 If the event has just occurred, there is a condition of surprise.
 If the event occurred some time back, there is a condition of having some information.
These three conditions arise at different times. The differences between them help us gain
knowledge of the probabilities of occurrence of events.
Definition of Information:
If p_i is the probability of occurrence of the i-th message from a source, the amount of
information (self-information) carried by that message is
I_i = log_b(1/p_i) = −log_b(p_i)
so a less probable, more surprising message carries more information. For example, with b = 2,
an outcome of probability 1/2 carries exactly 1 bit.
ENTROPY:
Entropy: When we weigh how surprising or uncertain the occurrence of an event would be, we
are trying to get an idea of the average information content available from the source of the
event.
Entropy is a measure of the average information content per source symbol:
H = −Σ_i p_i log_b(p_i)
where p_i is the probability of occurrence of symbol number i in a given stream of symbols and
b is the base of the logarithm used. This quantity is also called Shannon's entropy.
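As a quick numerical illustration, the minimal Python sketch below evaluates the two formulas
above; the four-symbol distribution is an assumed example, not taken from the text.

    import math

    def self_information(p, b=2):
        # I = log_b(1/p): less probable symbols carry more information.
        return -math.log(p, b)

    def entropy(probs, b=2):
        # H = -sum(p_i * log_b(p_i)): average information per source symbol.
        return sum(p * self_information(p, b) for p in probs if p > 0)

    # Assumed example distribution over four symbols (must sum to 1).
    probs = [0.5, 0.25, 0.125, 0.125]
    print([self_information(p) for p in probs])  # [1.0, 2.0, 3.0, 3.0] bits
    print(entropy(probs))                        # 1.75 bits/symbol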
Conditional Entropy: The amount of uncertainty remaining about the channel input after the
channel output has been observed is called the conditional entropy. It is denoted by H(X∣Y)
and is given by
H(X∣Y) = −Σ_i Σ_j p(x_i, y_j) log_b p(x_i∣y_j)
Mutual Information:
Consider a channel whose input is X and whose output is Y. Let H(X) denote the entropy of
the input; this is the prior uncertainty about X, assumed before the input is applied.
To describe the uncertainty that remains after the input is applied, consider the conditional
entropy given a particular output, H(X∣Y = y_k). Over the possible outputs y_0, …, y_(K−1),
which occur with probabilities p(y_0), …, p(y_(K−1)) respectively, this quantity is a random
variable; its expected value is the conditional entropy H(X∣Y).
Comparing the uncertainty before and after the output is observed, the difference
H(X) − H(X∣Y) must represent the uncertainty about the channel input that is resolved by
observing the channel output. This is called the mutual information of the channel.
Denoting the mutual information by I(X;Y), we can write it as an equation:
I(X;Y) = H(X) − H(X∣Y)
This is the equational representation of mutual information.
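A short Python sketch of this relation follows; the joint distribution p(x, y) is an assumed
example for a binary channel, not a value from the text.

    import math

    def H(probs):
        # Entropy in bits of a probability vector.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Assumed joint distribution p(x, y): rows index input X, columns output Y.
    p_xy = [[0.4, 0.1],
            [0.1, 0.4]]

    p_x = [sum(row) for row in p_xy]                       # marginal p(x)
    p_y = [sum(row[k] for row in p_xy) for k in range(2)]  # marginal p(y)

    # H(X|Y) = sum_k p(y_k) * H(X | Y = y_k): average per-output uncertainty.
    H_X_given_Y = sum(
        p_y[k] * H([p_xy[i][k] / p_y[k] for i in range(2)]) for k in range(2)
    )

    I_XY = H(p_x) - H_X_given_Y  # I(X;Y) = H(X) - H(X|Y)
    print(H(p_x), H_X_given_Y, I_XY)  # 1.0, ~0.7219, ~0.2781 bits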
Discrete Memoryless Channel (DMC):
A discrete memoryless channel has a finite input alphabet and a finite output alphabet, and is
completely described by its transition probabilities p(y_j∣x_i). It is memoryless because each
output symbol depends only on the input symbol at that instant, not on earlier inputs or
outputs. The binary symmetric channel (BSC) and the binary erasure channel (BEC) are the
standard examples.
Fig. 1: Noiseless channel
Fig. 2: Deterministic channel
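For the two channels named in the syllabus, the capacities have simple closed forms:
C = 1 − H(p) for the BSC with crossover probability p, and C = 1 − ε for the BEC with
erasure probability ε. A minimal Python sketch (the probability values are assumed examples):

    import math

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        # Binary symmetric channel: C = 1 - H(p) bits per channel use.
        return 1.0 - binary_entropy(p)

    def bec_capacity(eps):
        # Binary erasure channel: C = 1 - eps bits per channel use.
        return 1.0 - eps

    print(bsc_capacity(0.1))   # ~0.531
    print(bsc_capacity(0.5))   # 0.0: at error probability 0.5 no information gets through
    print(bec_capacity(0.25))  # 0.75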
SHANNON-FANO CODING:
In Shannon-Fano coding, the source symbols are first listed in order of decreasing probability.
The list is then divided into two groups whose total probabilities are as nearly equal as
possible; every symbol in the first group receives the code bit 0 and every symbol in the second
group the bit 1. Each group is subdivided in the same way, appending one further bit at each
step, until every group contains a single symbol. The result is a prefix code whose average
codeword length lies close to the source entropy.
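A minimal recursive sketch of this procedure in Python; the symbol table is an assumed
example, and practical implementations may break ties in the split differently.

    def shannon_fano(symbols):
        # symbols: list of (symbol, probability) sorted by descending probability.
        # Returns a dict mapping each symbol to its binary codeword.
        if len(symbols) == 1:
            return {symbols[0][0]: ""}
        total = sum(p for _, p in symbols)
        # Choose the split that makes the two groups' probabilities most nearly equal.
        running, best_split, best_diff = 0.0, 1, float("inf")
        for k in range(1, len(symbols)):
            running += symbols[k - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_diff, best_split = diff, k
        codes = {}
        for sym, code in shannon_fano(symbols[:best_split]).items():
            codes[sym] = "0" + code  # upper group gets 0
        for sym, code in shannon_fano(symbols[best_split:]).items():
            codes[sym] = "1" + code  # lower group gets 1
        return codes

    # Assumed example source, already sorted by descending probability.
    table = [("A", 0.4), ("B", 0.2), ("C", 0.2), ("D", 0.1), ("E", 0.1)]
    print(shannon_fano(table))
    # {'A': '0', 'B': '10', 'C': '110', 'D': '1110', 'E': '1111'}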
MCQ Test
1. Self information should be
a) Negative
b) Positive
c) Positive & Negative
d) None of the mentioned Ans: b
2. The unit of average mutual information is
a) Bytes per symbol
b) Bytes
c) Bits per symbol
d) Bits Ans: d
3. In a discrete memoryless source, the current letter produced by the source is
statistically independent of _____
a. Past output
b. Future output
c. Both a and b
d. None of the above. Ans: c
4. When the base of the logarithm is 2, then the unit of measure of information is
a) Bits
b) Bytes
c) Nats
d) None of the mentioned Ans: a
5. The self information of a random variable is
a) 0
b) 1
c) Infinite
d) Cannot be determined. Ans: d
6. Entropy of a random variable is
a) 0
b) 1
c) Infinite
d) Cannot be determined. Ans: d
7. Which is more efficient method?
a) Encoding each symbol of a block
b) Encoding block of symbols
c) Encoding each symbol of a block & Encoding block of symbols
d) None of the mentioned Ans: b
8. The mutual information between a pair of events is
a) Positive
b) Negative
c) Zero
d) All of the mentioned. Ans: d
9. When the base of the logarithm is e, the unit of measure of information is
a) Bits
b) Bytes
c) Nats
d) None of the mentioned. Ans: c
10. When probability of error during transmission is 0.5, it indicates that
a) Channel is very noisy
b) No information is received
c) Channel is very noisy & No information is received
d) None of the mentioned Ans: c
11. Types of compression
a. Lossless
b. Lossy.
c. both a and b
d. None of the above. Ans: c
12. What is the significance of D-frames in video coding?
a. They generate low resolution picture.
b. highly compressed technique.
c. They generate high resolution picture.
d. None of the above. Ans: a
13. MPEG coders are used for
a. compression of audio
b. compression of text
c. compression of audio and video
d. None of the above. Ans: c
14. B-frame is also known as
a. unidirectional.
b. B- De-compression technique
c. B- compression technique
d. bidirectional frame. Ans: D
15. I-frame is
a. It basically searches the frames.
b. It basically predicts the movement of objects.
c. It basically compresses the movement of objects.
d. None of the above. Ans: d
16. Video coding consists of two processes
a. Processing for reducing Temporal Redundancy.
b. Processing for reducing Spatial Redundancy
c. Both a and b
d. None of the above. Ans: c
17. H.261 is
a. compression of audio
b. De-compression of audio
c. Video Compression standard
d. None of the above. Ans: c
Assignment
(1)Explain the terms (i) Self information (ii) Average information (iii) Mutual Information.
(2)Discuss the reason for using logarithmic measure for measuring the amount of information.
(3)Explain the concept of amount of information associated with message.
(4) A binary source emits an independent sequence of 0's and 1's with probabilities p and
(1 − p) respectively. Plot the entropy of the source as a function of p.
(5) Explain the concept of information, average information, information rate and redundancy as
referred to information transmission.
(6) Let X represent the outcome of a single roll of a fair die. What is the entropy of X?
(7)A code is composed of dots and dashes. Assume that the dash is 3 times as long as the dot and
has one-third the probability of occurrence. (i) Calculate the information in dot and that in a
dash; (ii) Calculate the average information in dot-dash code; and (iii) Assume that a dot lasts
for 10 ms and this same time interval is allowed between symbols. Calculate the average rate
of information transmission.
(8) What do you understand by the term extension of a discrete memoryless source? Show that
the entropy of the nth extension of a DMS is n times the entropy of the original source.
(9) A card is drawn from a deck of playing cards. A) You are informed that the card you drew
is a spade. How much information did you receive in bits? B) How much information did you
receive if you are told that the card you drew is an ace? C) How much information did you
receive if you are told that the card you drew is an ace of spades? Is the information content of
the message “ace of spades” the sum of the information contents of the messages ”spade” and
“ace”?
(10) The output of an information source consists of 128 symbols, 16 of which occur with
probability 1/32 and the remaining 112 of which occur with probability 1/224. The source
emits 1000 symbols/sec. Assuming that the symbols are chosen independently, find the
information rate of the source.
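As a quick numerical check on problem (10), a short Python sketch that evaluates the entropy
and the information rate directly:

    import math

    # 16 symbols with probability 1/32 and 112 symbols with probability 1/224.
    probs = [1 / 32] * 16 + [1 / 224] * 112
    assert abs(sum(probs) - 1.0) < 1e-9

    H = -sum(p * math.log2(p) for p in probs)  # entropy in bits/symbol
    rate = 1000 * H                            # source emits 1000 symbols/sec
    print(H, rate)  # ~6.4037 bits/symbol, ~6403.7 bits/sec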