Semester : SEMESTER 7
Subject : Information Theory & Coding
Year : 2021
Term : DECEMBER
Scheme : 2015 Full Time
Course Code : EC 401
A 10000EC401122101 Pages: 2
Reg No.: Name:
APJ ABDUL KALAM TECHNOLOGICAL UNIVERSITY
Seventh Semester B.Tech Degree Regular and Supplementary Examination December 2021 (2015 Scheme)
Course Code: EC401
Course Name: INFORMATION THEORY & CODING
Max. Marks: 100 Duration: 3 Hours
PART A
Answer any two full questions, each carries 15 marks.
1 a) Define the term marginal entropy and give its units. What will be the marginal entropy if a source emits all the M messages with equal probability? (5)
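For reference, the equiprobable case in 1(a) reduces to the standard result below, assuming base-2 logarithms so the unit is bits/message:

```latex
% Marginal entropy of a source emitting M messages with probabilities p_i
H(X) = -\sum_{i=1}^{M} p_i \log_2 p_i \quad \text{bits/message}

% With all M messages equally likely, p_i = 1/M for every i, so
H(X) = -\sum_{i=1}^{M} \frac{1}{M}\log_2\frac{1}{M} = \log_2 M \ \text{bits/message}
```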
b) Let X and Y be two discrete random variables whose joint probability distribution is given by

   P(X, Y) =
       [ .08  .15  .11 ]
       [ .06  .09  .14 ]
       [ .02  .03  .06 ]
       [ .13  .09  .04 ]

   Find the marginal, conditional and joint entropies and verify the relation H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y). (10)
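Not part of the question paper: a minimal sketch of how the entropies in 1(b) could be checked numerically, assuming the reconstructed 4x3 joint probability matrix above (rows index X, columns index Y).

```python
# Sketch for Q1(b): marginal, conditional and joint entropies of the given P(X,Y).
import numpy as np

P_xy = np.array([
    [0.08, 0.15, 0.11],
    [0.06, 0.09, 0.14],
    [0.02, 0.03, 0.06],
    [0.13, 0.09, 0.04],
])  # entries sum to 1.0

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = P_xy.sum(axis=1)                     # marginal P(X)
p_y = P_xy.sum(axis=0)                     # marginal P(Y)
H_x, H_y = entropy(p_x), entropy(p_y)      # marginal entropies
H_xy = entropy(P_xy.ravel())               # joint entropy H(X,Y)

# Conditional entropies computed directly: H(Y|X) = sum_x P(x) H(Y | X = x)
H_y_given_x = sum(px * entropy(row / px) for px, row in zip(p_x, P_xy))
H_x_given_y = sum(py * entropy(col / py) for py, col in zip(p_y, P_xy.T))

# Verify the chain rule H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
assert np.isclose(H_xy, H_x + H_y_given_x)
assert np.isclose(H_xy, H_y + H_x_given_y)
print(f"H(X)={H_x:.4f}  H(Y)={H_y:.4f}  H(X,Y)={H_xy:.4f}  "
      f"H(Y|X)={H_y_given_x:.4f}  H(X|Y)={H_x_given_y:.4f}")
```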
2 a) State and prove Kraft’s inequality (7)
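For reference, the statement to be proved in 2(a), in its standard form for a code alphabet of size D (binary codes correspond to D = 2):

```latex
% Kraft's inequality: an instantaneous (prefix-free) code with codeword
% lengths l_1, l_2, \dots, l_M over an alphabet of size D exists iff
\sum_{i=1}^{M} D^{-l_i} \le 1
```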
b) Two symbols x1 and x2 with probabilities P(x1) = 0.4 and P(x2) = 0.6 are transmitted through the discrete channel given below.

   P(Y/X) =
       [ 0.8  0.2 ]
       [ 0.2  0.8 ]

   Identify the channel and calculate the capacity and the efficiency of the channel. (8)
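A numeric sketch of 2(b), not part of the paper: it assumes the matrix above is a binary symmetric channel with crossover probability 0.2, and takes "efficiency" as I(X;Y)/C, the usual textbook definition.

```python
# Sketch for Q2(b): capacity and efficiency of the binary symmetric channel.
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p*np.log2(p) - (1-p)*np.log2(1-p)

p = 0.2                           # crossover probability of the BSC
C = 1 - h2(p)                     # BSC capacity: C = 1 - H(p) bits/channel use

# Mutual information for the actual input distribution P(x1)=0.4, P(x2)=0.6
Px = np.array([0.4, 0.6])
P_y_given_x = np.array([[0.8, 0.2],
                        [0.2, 0.8]])
P_xy = Px[:, None] * P_y_given_x            # joint distribution P(x, y)
Py = P_xy.sum(axis=0)                        # output marginal P(y)
I_xy = sum(P_xy[i, j] * np.log2(P_xy[i, j] / (Px[i] * Py[j]))
           for i in range(2) for j in range(2))

efficiency = I_xy / C             # assumed definition: I(X;Y) relative to capacity
print(f"C = {C:.4f} bit/use, I(X;Y) = {I_xy:.4f} bit/use, efficiency = {efficiency:.2%}")
```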
3 a) Define mutual information I(X; Y). Find the mutual information if X and Y are independent. (5)
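For reference, the definition involved in 3(a), with the independent case worked out:

```latex
% Mutual information between X and Y
I(X;Y) = \sum_{x}\sum_{y} P(x,y)\,\log_2\frac{P(x,y)}{P(x)\,P(y)}
       = H(X) - H(X\mid Y) = H(Y) - H(Y\mid X)

% If X and Y are independent, P(x,y) = P(x)P(y), so every log term equals
% \log_2 1 = 0 and therefore I(X;Y) = 0.
```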
b) A discrete source emits 7 symbols with probabilities 0.15, 0.24, 0.13, 0.26, 0.12, 0.02, 0.08. Construct binary codes using the Huffman algorithm and the Shannon-Fano algorithm. Compare the efficiencies of these two codes. (10)
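A minimal Huffman construction for the seven probabilities in 3(b), offered only as a sketch (symbol names s1..s7 are placeholders; the Shannon-Fano code and the efficiency comparison follow the same pattern):

```python
# Sketch for Q3(b): binary Huffman code for the 7-symbol source and its efficiency.
import heapq
import numpy as np

probs = [0.15, 0.24, 0.13, 0.26, 0.12, 0.02, 0.08]    # from the question
symbols = [f"s{i+1}" for i in range(len(probs))]       # placeholder labels

# Each heap entry: (probability, tie-breaker, {symbol: partial codeword})
heap = [(p, i, {s: ""}) for i, (p, s) in enumerate(zip(probs, symbols))]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)        # two least probable nodes
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: "0" + w for s, w in c1.items()}       # prepend 0 to one branch
    merged.update({s: "1" + w for s, w in c2.items()}) # prepend 1 to the other
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1
code = heap[0][2]                                      # symbol -> codeword

H = -sum(p * np.log2(p) for p in probs)                        # source entropy
L = sum(p * len(code[s]) for p, s in zip(probs, symbols))      # average code length
print(code)
print(f"H = {H:.4f} bits, L = {L:.4f} bits/symbol, efficiency = {H / L:.2%}")
```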
PART B
Answer any two full questions, each carries 15 marks.
4 a) Find the differential entropy of a Gaussian distributed random variable. (7)
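The closed form that 4(a) asks for, stated here for reference (X Gaussian with variance sigma squared):

```latex
% Differential entropy of X ~ N(\mu, \sigma^2)
h(X) = -\int_{-\infty}^{\infty} f(x)\,\log_2 f(x)\,dx
     = \tfrac{1}{2}\log_2\!\left(2\pi e\,\sigma^{2}\right)\ \text{bits}
```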
b) Derive the capacity of a Gaussian channel with bandwidth B and noise power spectral density N/2. Also, find the capacity when the bandwidth of the channel tends to infinity. (8)
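The closed forms that 4(b) leads to, assuming signal power S and two-sided noise power spectral density N/2, so the in-band noise power is N·B:

```latex
% Shannon-Hartley capacity of the band-limited Gaussian channel
C = B \log_2\!\left(1 + \frac{S}{N B}\right)\ \text{bits/s}

% Infinite-bandwidth limit
\lim_{B\to\infty} C = \frac{S}{N}\log_2 e = \frac{S}{N\ln 2}\ \text{bits/s}
```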