Semester : SEMESTER 7
Subject : Machine Learning
Year : 2020
Term : SEPTEMBER
Branch : COMPUTER SCIENCE AND ENGINEERING
Scheme : 2015 Full Time
Course Code : CS 467
Reg No.:
Max. Marks: 100
00000CS467121903                                        Pages: 3
Name:
APJ ABDUL KALAM TECHNOLOGICAL UNIVERSITY
Seventh semester B.Tech examinations (S), September 2020
Course Code: CS467
Course Name: MACHINE LEARNING
PART A
Answer all questions, each carries 4 marks.
1. Define VC dimension. How is the VC dimension related to the number of
training examples required for learning?
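The relationship asked for in Q1 can be stated as a sample-complexity bound. One standard PAC bound (as commonly given in Mitchell's *Machine Learning* text; stated here from memory, so treat the constants as indicative) says that any consistent learner needs a number of examples m satisfying

```latex
m \;\ge\; \frac{1}{\epsilon}\left(4\log_2\frac{2}{\delta} \;+\; 8\,VC(H)\,\log_2\frac{13}{\epsilon}\right)
```

where \(\epsilon\) is the accuracy parameter, \(\delta\) the confidence parameter, and \(VC(H)\) the VC dimension of the hypothesis space \(H\): the required training-set size grows linearly with the VC dimension.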
2. Compare classification with regression, with an example.
3. Distinguish between overfitting and underfitting. How can they affect
model generalization?
4. Explain the general maximum likelihood estimation (MLE) method for
estimating the parameters of a probability distribution.
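As a concrete instance of the MLE method asked about in Q4, the closed-form estimates for a univariate Gaussian can be sketched as follows (a minimal illustration, not part of the question paper; the sample values are hypothetical):

```python
def gaussian_mle(samples):
    """MLE for a univariate Gaussian: the mean and variance that
    maximize the likelihood of the observed samples."""
    n = len(samples)
    mu = sum(samples) / n                          # MLE of the mean
    var = sum((x - mu) ** 2 for x in samples) / n  # MLE of the variance (divides by n, not n-1)
    return mu, var

mu, var = gaussian_mle([2.0, 4.0, 6.0, 8.0])
# mu = 5.0, var = 5.0
```

Note that the MLE variance divides by n rather than n-1, so it is a biased (but maximum-likelihood) estimate.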
5. Compare cross-validation with bootstrapping techniques.
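The core difference behind Q5 can be sketched in a few lines: cross-validation partitions the data into disjoint folds, while bootstrapping resamples with replacement (a minimal illustration using only the standard library; the sizes are hypothetical):

```python
import random

def kfold_indices(n, k):
    """Cross-validation: partition indices 0..n-1 into k disjoint folds,
    so every example is held out for testing exactly once."""
    idx = list(range(n))
    return [idx[i::k] for i in range(k)]

def bootstrap_indices(n, rng):
    """Bootstrapping: draw n indices with replacement; some examples
    repeat, and the left-out ("out-of-bag") examples can serve as a test set."""
    return [rng.randrange(n) for _ in range(n)]

folds = kfold_indices(10, 5)                       # 5 disjoint folds of 2
sample = bootstrap_indices(10, random.Random(0))   # 10 draws with replacement
out_of_bag = set(range(10)) - set(sample)
```

On average roughly 36.8% of the examples are out-of-bag in each bootstrap sample, which is why bootstrapping suits small data sets where holding out a fixed fold is wasteful.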
6. Calculate the output y of a three-input neuron with bias. The input feature
vector is (x1, x2, x3) = (0.8, 0.6, 0.4) and the weight values are [w1, w2, w3, b]
= [0.2, 0.1, -0.3, 0.35]. Use the binary sigmoid function as the activation function.
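The computation Q6 asks for can be sketched directly from the numbers in the question:

```python
import math

def sigmoid(net):
    """Binary sigmoid activation: 1 / (1 + e^(-net))."""
    return 1.0 / (1.0 + math.exp(-net))

def neuron_output(x, w, b):
    """Weighted sum of the inputs plus bias, passed through the sigmoid."""
    net = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(net)

# Values from the question:
# net = 0.2*0.8 + 0.1*0.6 + (-0.3)*0.4 + 0.35 = 0.45
y = neuron_output([0.8, 0.6, 0.4], [0.2, 0.1, -0.3], 0.35)
# y = sigmoid(0.45) ≈ 0.6106
```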
7. Describe the significance of kernel functions in SVM. List any two kernel
functions.
8. Explain the basic elements of a Hidden Markov Model (HMM). List any
two applications of HMMs.
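The three basic elements Q8 refers to (transition matrix A, emission matrix B, initial distribution pi) can be illustrated with the forward algorithm, which computes the probability of an observation sequence. The two-state model below is a hypothetical example, not from the question paper:

```python
def forward(A, B, pi, obs):
    """Forward algorithm over the three basic HMM elements:
    A  -- transition probabilities, A[i][j] = P(next state j | state i)
    B  -- emission probabilities,  B[i][o] = P(observation o | state i)
    pi -- initial state distribution
    Returns P(obs | model)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

# Hypothetical two-state model: states 0/1, observations 0/1
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.1, 0.9], [0.6, 0.4]]
pi = [0.5, 0.5]
p = forward(A, B, pi, [0, 1])   # probability of observing 0 then 1
```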
9. Explain any two model combination schemes to improve the accuracy of a
classifier.
10. Compare K-means clustering with hierarchical clustering techniques.
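For Q10, the iterative assign-then-update loop that distinguishes K-means from the merge-based hierarchical approach can be sketched on 1-D data (a minimal illustration; the data points and initial centres are hypothetical):

```python
def kmeans_1d(points, centers, iters=10):
    """Minimal K-means on 1-D data: assign each point to its nearest
    centre, then move each centre to the mean of its assigned points.
    (Agglomerative hierarchical clustering would instead start with
    singleton clusters and repeatedly merge the two closest ones.)"""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

centers = kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], [0.0, 10.0])
# centres converge to the two group means, [1.0, 9.0]
```

Unlike hierarchical clustering, K-means needs the number of clusters fixed in advance and produces a flat partition rather than a dendrogram.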
PART B
Answer any two full questions, each carries 9 marks.
11. a) Distinguish between supervised learning and reinforcement learning.
Illustrate with an example.
b) Discuss any four examples of machine learning applications.
12. a) Define Probably Approximately Correct (PAC) learning.
b) Explain the procedure for the computation of the principal components of
the data.
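The procedure asked for in Q12(b) — centre the data, form the covariance matrix, take the eigenvector of the largest eigenvalue — can be sketched for the 2-D case, where the eigen-decomposition has a closed form (a minimal illustration; the example data set is hypothetical):

```python
import math

def first_principal_component(data):
    """PCA on 2-D data: centre the data, build the 2x2 covariance
    matrix, and return the unit eigenvector of its largest eigenvalue
    (the direction of maximum variance)."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    centered = [(x - mx, y - my) for x, y in data]
    # Covariance matrix [[a, b], [b, c]]
    a = sum(x * x for x, _ in centered) / n
    b = sum(x * y for x, y in centered) / n
    c = sum(y * y for _, y in centered) / n
    # Largest eigenvalue of a symmetric 2x2 matrix, in closed form
    lam = (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    # Corresponding eigenvector (lam - c, b), normalized to unit length
    vx, vy = lam - c, b
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

v = first_principal_component([(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (3.0, 3.0)])
# the data lie on the line y = x, so v ≈ (0.7071, 0.7071)
```

In higher dimensions the same steps apply, but the eigenvectors are found numerically (e.g. by an eigensolver or SVD) rather than in closed form.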
13. a) Compare feature extraction and feature selection techniques. Explain how
dimensionality can be reduced using the subset selection procedure.
b) Explain the methods used to learn multiple classes for a 14-class
classification problem.
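For Q13(b), two common schemes reduce a K-class problem to binary ones: one-vs-rest trains one classifier per class, while one-vs-one trains one per pair of classes. The classifier counts for K = 14 follow directly:

```python
K = 14
one_vs_rest = K                  # one binary classifier per class
one_vs_one = K * (K - 1) // 2    # one binary classifier per pair of classes
# one_vs_rest = 14, one_vs_one = 91
```

One-vs-one needs far more classifiers, but each is trained on only the data of two classes, so the individual problems are smaller and often easier to separate.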
Duration: 3 Hours

Marks: each Part A question (1-10) carries (4). Part B: 11.a (5), 11.b (4); 12.a (3), 12.b (6); 13.a (5), 13.b (4).

Page 1 of 3