Semester : SEMESTER 2
Subject : Information Theory
Year : 2016
Term : MAY
Branch : TELECOMMUNICATION ENGINEERING
Scheme : 2015 Full Time
Course Code : 01 EC 6518
APJ ABDUL KALAM TECHNOLOGICAL UNIVERSITY
SECOND SEMESTER M.TECH DEGREE EXAMINATION, MAY 2016
Electronics and Communication Engineering
(Telecommunication Engineering)
01EC6518 Information Theory
Max. Marks : 60 Duration: 3 Hours
(Answer ANY TWO questions from each Part)
Part A
1. a) Derive the chain rules for mutual information and relative entropy. (4)
b) What do you mean by relative entropy? Write the formula for the relative entropy between two probability mass functions p(x) and q(x). (3)
c) Prove that the relative entropy between two probability density functions is zero if both are the same. (2)
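For reference, in the discrete case the quantities asked for in Question 1 are usually written as follows (the variables X_1, ..., X_n, Y are generic placeholders, not prescribed notation):

\[
D(p\,\|\,q)=\sum_{x} p(x)\log\frac{p(x)}{q(x)},\qquad
I(X_1,\ldots,X_n;Y)=\sum_{i=1}^{n} I(X_i;Y\mid X_{i-1},\ldots,X_1),
\]
\[
D\bigl(p(x,y)\,\|\,q(x,y)\bigr)=D\bigl(p(x)\,\|\,q(x)\bigr)+D\bigl(p(y\mid x)\,\|\,q(y\mid x)\bigr).
\]

In particular, when p = q every term log(p(x)/q(x)) vanishes, which is the claim in part c).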
2. a) State and prove Jensen's inequality. (5)
b) Prove that: i) H(X, Y | Z) ≥ H(X | Z)
ii) I(X, Y; Z) ≥ I(X; Z)
iii) Under what conditions does equality hold in i) and ii)? (4)
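For reference, Jensen's inequality in its usual form: if f is a convex function and X is a random variable, then

\[
\mathbb{E}\,f(X) \;\ge\; f(\mathbb{E}\,X),
\]

with equality when X is constant or f is linear on the support of X. One standard route to the inequalities in 2 b) is the decompositions H(X, Y | Z) = H(X | Z) + H(Y | X, Z) and I(X, Y; Z) = I(X; Z) + I(Y; Z | X), together with the non-negativity of the added terms.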
3. a) State and prove Kraft's inequality. (3)
b) Briefly explain the following codes, with an example: i) non-singular ii) uniquely decodable iii) instantaneous. (2)
c) Consider the random variable X with the following distribution:
x    | x1  | x2  | x3  | x4  | x5  | x6  | x7
p(x) | .49 | .26 | .12 | .04 | .04 | .03 | .02
i) Find a binary Huffman code for X.
ii) Find the expected code length for X.
iii) Find a ternary Huffman code for X. (4)
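For 3 c), a minimal Python sketch of the binary Huffman construction (the symbol labels x1 to x7 are assumed purely for illustration); it returns the codeword lengths, the expected code length, and the Kraft sum from 3 a):

import heapq

# Assumed symbol labels x1..x7 for the distribution given in 3 c).
probs = {"x1": 0.49, "x2": 0.26, "x3": 0.12, "x4": 0.04,
         "x5": 0.04, "x6": 0.03, "x7": 0.02}

def huffman_lengths(p):
    """Binary Huffman codeword lengths via repeated merging of the two least-probable nodes."""
    # Heap entries: (probability, tie-breaker, symbols in this subtree).
    heap = [(prob, i, [sym]) for i, (sym, prob) in enumerate(p.items())]
    heapq.heapify(heap)
    lengths = dict.fromkeys(p, 0)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for sym in syms1 + syms2:          # each merge adds one bit to every symbol below it
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, tie, syms1 + syms2))
        tie += 1
    return lengths

L = huffman_lengths(probs)
expected_length = sum(probs[s] * L[s] for s in probs)   # E[l(X)]
kraft_sum = sum(2.0 ** -L[s] for s in probs)            # should not exceed 1
print(L, expected_length, kraft_sum)

For this distribution the lengths work out to 1, 2, 3, 5, 5, 5, 5 bits, the expected code length is 2.02 bits, and the Kraft sum is exactly 1. The ternary case in iii) follows the same idea with three-way merges; no dummy symbol is needed here since the symbol count 7 satisfies 7 ≡ 1 (mod 2).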