Semester : SEMESTER 7
Subject : Information Theory & Coding
Year : 2020
Term : SEPTEMBER
Scheme : 2015 Full Time
Course Code : EC 401
000006 Pages: 2
Reg No.: Name:
APJ ABDUL KALAM TECHNOLOGICAL UNIVERSITY
B.Tech S7 (S) Examination Sept 2020
Course Code: EC401
Course Name: INFORMATION THEORY & CODING
Max. Marks: 100 Duration: 3 Hours
PART A
Answer any two full questions, each carries 15 marks. Marks
1 a) Explain the necessary and sufficient conditions for a code to be instantaneous. Give examples. (3)
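A minimal Python sketch related to Q1(a): a code is instantaneous (prefix-free) exactly when no codeword is a prefix of another. The two example code sets below are assumptions chosen for illustration, not taken from the paper.

```python
# Check the prefix-free (instantaneous) condition for a set of codewords.
def is_instantaneous(codewords):
    """Return True when no codeword is a prefix of a different codeword."""
    return not any(a != b and b.startswith(a) for a in codewords for b in codewords)

print(is_instantaneous(["0", "10", "110", "111"]))   # True: prefix-free code
print(is_instantaneous(["0", "01", "011", "0111"]))  # False: "0" is a prefix of "01"
```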
b) A zero-memory source has a source alphabet S = {s1, s2, s3} with P = {0.5, 0.3, 0.2}. Find the entropy of the source. Find the entropy of its second extension and verify. (5)
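A minimal Python sketch of the computation asked in Q1(b), assuming base-2 logarithms and the symbol names s1, s2, s3; it verifies that the second-extension entropy is twice the source entropy.

```python
# Entropy of a zero-memory source and of its second extension.
from math import log2
from itertools import product

P = {"s1": 0.5, "s2": 0.3, "s3": 0.2}

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2 p."""
    return -sum(p * log2(p) for p in probs if p > 0)

H1 = entropy(P.values())

# Second extension: pairs of symbols; for a zero-memory source the pair
# probability is the product of the individual symbol probabilities.
P2 = {a + b: pa * pb for (a, pa), (b, pb) in product(P.items(), repeat=2)}
H2 = entropy(P2.values())

print(f"H(S)   = {H1:.4f} bits/symbol")   # about 1.4855
print(f"H(S^2) = {H2:.4f} bits/pair")     # about 2.9710, i.e. 2 * H(S)
```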
c) Explain the properties of mutual information. (7)
2 a) Prove that the entropy of a discrete memoryless source S is upper bounded by the average code word length L for any distortionless source encoding scheme. (5)
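An outline of the standard argument behind Q2(a), assuming a uniquely decodable code with word lengths l_i (so the Kraft inequality holds) and symbol probabilities p_i:

```latex
% Kraft inequality + Jensen's inequality give H(S) <= average length L.
\begin{align*}
H(S) - \bar{L}
  &= \sum_i p_i \log_2 \frac{1}{p_i} - \sum_i p_i l_i
   = \sum_i p_i \log_2 \frac{2^{-l_i}}{p_i} \\
  &\le \log_2 \Big( \sum_i 2^{-l_i} \Big) \le \log_2 1 = 0
  \quad\Rightarrow\quad H(S) \le \bar{L}.
\end{align*}
```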
b) A binary source emits two symbols x1 and x2. Symbol x2 is twice as long as x1 and half as probable. The duration of x1 is 0.3 seconds. Calculate the information rate of the source. (4)
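A Python sketch of one reading of Q2(b): "half as probable" gives P(x1) = 2/3, P(x2) = 1/3, and the rate is taken as entropy divided by average symbol duration.

```python
# Information rate of a two-symbol source with unequal symbol durations.
from math import log2

p1, p2 = 2 / 3, 1 / 3          # p2 = p1 / 2 and p1 + p2 = 1
t1, t2 = 0.3, 0.6              # x2 lasts twice as long as x1 (seconds)

H = -(p1 * log2(p1) + p2 * log2(p2))   # entropy, about 0.918 bits/symbol
T = p1 * t1 + p2 * t2                  # average symbol duration, 0.4 s
R = H / T                              # information rate, about 2.3 bits/s

print(f"H = {H:.3f} bits/symbol, T = {T:.2f} s, R = {R:.2f} bits/s")
```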
c) Consider a source with 8 symbols, a to h, with respective probabilities 0.2, 0.2, 0.18, 0.15, 0.12, 0.08, 0.05 and 0.02. Construct a minimum redundancy code and determine the code efficiency. (6)
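A Python sketch for Q2(c): a heap-based Huffman construction for the eight given probabilities. The particular 0/1 assignment below is only one valid minimum redundancy code (Huffman codes are not unique), but the average length and efficiency it reports are.

```python
# Huffman (minimum redundancy) code and its coding efficiency.
import heapq
from math import log2

probs = {"a": 0.20, "b": 0.20, "c": 0.18, "d": 0.15,
         "e": 0.12, "f": 0.08, "g": 0.05, "h": 0.02}

# Build the Huffman tree with a min-heap of (probability, tiebreak, node).
heap = [(p, i, sym) for i, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p1, _, left = heapq.heappop(heap)
    p2, _, right = heapq.heappop(heap)
    heapq.heappush(heap, (p1 + p2, counter, (left, right)))
    counter += 1

def assign(node, prefix, codes):
    """Walk the tree, giving '0' to the left branch and '1' to the right."""
    if isinstance(node, str):          # leaf: a source symbol
        codes[node] = prefix or "0"
    else:
        assign(node[0], prefix + "0", codes)
        assign(node[1], prefix + "1", codes)
    return codes

codes = assign(heap[0][2], "", {})
L = sum(probs[s] * len(codes[s]) for s in probs)      # average code length
H = -sum(p * log2(p) for p in probs.values())         # source entropy
print(codes)
print(f"L = {L:.2f} bits/symbol, H = {H:.3f} bits, efficiency = {H / L:.1%}")
```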
3 a) Consider a message ensemble S = {s1, s2, s3, s4, s5, s6, s7} with probabilities P = {0.45, 0.15, 0.12, 0.08, 0.08, 0.08, 0.04}. Construct a binary code and determine its efficiency using the Shannon-Fano coding procedure. (5)
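A Python sketch of the Shannon-Fano procedure for Q3(a), assuming the usual rule of splitting the probability-sorted list where the cumulative probability comes closest to half of the group total.

```python
# Shannon-Fano code for the seven-symbol ensemble and its efficiency.
from math import log2

probs = {"s1": 0.45, "s2": 0.15, "s3": 0.12, "s4": 0.08,
         "s5": 0.08, "s6": 0.08, "s7": 0.04}

def shannon_fano(items, prefix="", codes=None):
    """Recursively split the (already sorted) list where the running
    probability is closest to half of the group total."""
    if codes is None:
        codes = {}
    if len(items) == 1:
        codes[items[0][0]] = prefix or "0"
        return codes
    total = sum(p for _, p in items)
    running, split, best = 0.0, 1, float("inf")
    for i in range(1, len(items)):
        running += items[i - 1][1]
        if abs(total / 2 - running) < best:
            best, split = abs(total / 2 - running), i
    shannon_fano(items[:split], prefix + "0", codes)
    shannon_fano(items[split:], prefix + "1", codes)
    return codes

items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
codes = shannon_fano(items)
L = sum(probs[s] * len(codes[s]) for s in probs)
H = -sum(p * log2(p) for p in probs.values())
print(codes)
print(f"L = {L:.2f} bits/symbol, H = {H:.3f} bits, efficiency = {H / L:.1%}")
```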
b) ೫ % (10)
Given a binary symmetric channel with P(Y/X)= ۲ 1 and
hh
P(x, )= ೫; P(x, ) = ¥. Calculate the mutual information and channel capacity.
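A Python sketch of the quantities asked in Q3(b). The crossover probability and input probability used below are placeholder assumptions, not values from the paper; the formulas I(X;Y) = H(Y) - H(p) and C = 1 - H(p) are the standard BSC results.

```python
# Mutual information and capacity of a binary symmetric channel.
from math import log2

def h(p):
    """Binary entropy function H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

def bsc_mutual_information(p, q):
    """I(X;Y) for a BSC with crossover probability p and P(x1) = q.
    I(X;Y) = H(Y) - H(Y|X), and H(Y|X) = H(p) for a BSC."""
    p_y1 = q * (1 - p) + (1 - q) * p        # P(y1)
    return h(p_y1) - h(p)

p, q = 0.25, 1 / 3                          # placeholder values
I = bsc_mutual_information(p, q)
C = 1 - h(p)                                # capacity, attained at q = 0.5
print(f"I(X;Y) = {I:.4f} bits/symbol, C = {C:.4f} bits/symbol")
```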
PART B
Answer any two full questions, each carries 15 marks.
4 a) Explain the significance of Shannon-Hartley’s theorem. (5)
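A small Python sketch accompanying Q4(a): the Shannon-Hartley theorem gives the AWGN channel capacity C = B log2(1 + S/N). The bandwidth and SNR values below are assumed for illustration only.

```python
# Shannon-Hartley capacity for an AWGN channel.
from math import log2

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * log2(1 + snr_linear)

B = 3000.0            # placeholder bandwidth: 3 kHz
snr_db = 30.0         # placeholder SNR: 30 dB
snr = 10 ** (snr_db / 10)
print(f"C = {shannon_hartley_capacity(B, snr):.0f} bits/s")   # about 29.9 kbit/s
```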
b) Define standard array. How is it used in syndrome decoding? Explain with an example. (10)
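A Python sketch illustrating the idea behind Q4(b): syndrome decoding with a coset-leader table (the essential content of the standard array), using the (7,4) Hamming code as an assumed example; the parity-check matrix and test codeword below are not from the paper.

```python
# Syndrome decoding of a (7,4) Hamming code via a coset-leader lookup table.
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (one common systematic form).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def syndrome(word):
    return tuple((H @ word) % 2)

# Coset leaders: for a single-error-correcting code these are the all-zero
# word and the seven single-bit error patterns; a received word has the same
# syndrome as the error pattern (coset leader) of its coset.
leaders = {syndrome(np.zeros(7, dtype=int)): np.zeros(7, dtype=int)}
for i in range(7):
    e = np.zeros(7, dtype=int)
    e[i] = 1
    leaders[syndrome(e)] = e

def decode(received):
    """Correct the received word by adding (XOR-ing) its coset leader."""
    return (received + leaders[syndrome(received)]) % 2

codeword = np.array([1, 0, 1, 1, 0, 1, 0])   # a valid codeword of this code
received = codeword.copy()
received[4] ^= 1                              # inject a single-bit error
print(decode(received))                       # recovers the original codeword
```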