Code: DS3010  Category: PC  Credits: 3035
Prerequisite: Familiarity with Algorithms, Probability, Linear Algebra, Programming
Course Content

Introduction to the course, revision of linear algebra and probability (3 hours)

Regression: linear regression, ridge regression (3 hours)
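As a preview of this unit, ridge regression has the closed-form solution w = (XᵀX + λI)⁻¹Xᵀy, which reduces to ordinary least squares at λ = 0. A minimal NumPy sketch on synthetic data (all names and values are illustrative):

```python
import numpy as np

# Synthetic regression data: y = X w + noise (illustrative values only)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

lam = 0.1  # regularisation strength lambda
# Ridge: solve (X^T X + lambda I) w = X^T y
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
# Ordinary least squares: the lambda = 0 special case
w_ols = np.linalg.solve(X.T @ X, X.T @ y)
```

The regulariser shrinks the coefficient norm, trading a little bias for lower variance.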
Classification: (9 hours)
 Linear discriminant analysis, logistic regression, perceptrons,
 support vector machines, Bayes classifier, decision trees.
 Nonparametric methods: k-nearest neighbours, Parzen window.
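The nonparametric part of this unit can be sketched in a few lines: a k-nearest-neighbour classifier with Euclidean distance and majority vote. The data and function name are illustrative, not from a library:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    d = np.linalg.norm(X_train - x, axis=1)  # distances to all training points
    nearest = np.argsort(d)[:k]              # indices of the k closest points
    return np.bincount(y_train[nearest]).argmax()  # majority label

# Toy training set: two points of class 0, three of class 1
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1], [1.2, 0.8]])
y_train = np.array([0, 0, 1, 1, 1])
pred = knn_predict(X_train, y_train, np.array([1.0, 0.9]))
```

Here the query point's three nearest neighbours all carry class 1, so the vote is unanimous.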

Principal component analysis, canonical correlation analysis (3 hours)
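PCA, the first half of this unit, is compactly expressed via the SVD of the centred data matrix; a hedged sketch on random data (values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
Xc = X - X.mean(axis=0)                    # centre each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt                             # principal directions (rows)
explained_var = S ** 2 / (len(X) - 1)       # variance along each direction
Z = Xc @ components[:2].T                   # projection onto first two components
```

The singular values are returned in descending order, so the explained variances are too.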

Evaluation and model selection: ROC curves, evaluation measures, cross-validation, significance tests (3 hours)
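The cross-validation topic reduces to index bookkeeping: partition the data into K folds, and in turn hold each fold out for testing. A minimal sketch (a library routine such as scikit-learn's `KFold` would normally be used):

```python
import numpy as np

def kfold_indices(n, k):
    """Yield (train, test) index arrays for K-fold cross-validation."""
    folds = np.array_split(np.arange(n), k)
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

splits = list(kfold_indices(10, 5))  # 5 folds over 10 examples
```

Each example appears in exactly one test fold, so the K test-set scores can be averaged into one estimate.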

Ensemble methods: boosting, bagging, random forests (3 hours)
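Bagging, from this unit, can be previewed in a few lines: fit the same base learner on bootstrap resamples and average the predictions. Here the base learner is ordinary least squares and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(50, 2))
y = X @ np.array([2.0, -1.0]) + 0.3 * rng.normal(size=50)

def fit_ols(X, y):
    # Least-squares base learner (illustrative choice)
    return np.linalg.lstsq(X, y, rcond=None)[0]

preds = []
for _ in range(25):                        # 25 bootstrap rounds
    idx = rng.integers(0, len(X), len(X))  # sample with replacement
    w = fit_ols(X[idx], y[idx])
    preds.append(X @ w)                    # predict on the full data
bagged = np.mean(preds, axis=0)            # averaged (bagged) prediction
```

Averaging over resamples reduces the variance of an unstable base learner; random forests add feature subsampling on top of this idea.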
Clustering: (9 hours)
 k-means, hierarchical, density-based clustering
 Gaussian mixture models
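The k-means part of this unit is Lloyd's algorithm: alternate between assigning points to the nearest centre and recomputing each centre as its cluster mean. A sketch on two toy blobs, with the initial centres supplied explicitly for determinism (all values illustrative):

```python
import numpy as np

def kmeans(X, centres, n_iter=20):
    for _ in range(n_iter):
        # Assignment step: index of the nearest centre for each point
        labels = np.argmin(((X[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        # Update step: each centre becomes the mean of its assigned points
        centres = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centres))])
    return labels, centres

rng = np.random.default_rng(2)
# Two well-separated blobs around (0, 0) and (5, 5)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
labels, centres = kmeans(X, X[[0, 20]].copy())  # one initial centre per blob
```

With one initial centre in each blob the algorithm converges immediately; in general the result depends on initialisation, which motivates schemes like k-means++.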

Sequential learning: hidden Markov models (6 hours)
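A core algorithm in this unit is the forward algorithm, which computes the likelihood of an observation sequence under an HMM by dynamic programming instead of summing over all state paths. A sketch with toy transition (A), emission (B), and initial (pi) parameters:

```python
import numpy as np

A = np.array([[0.7, 0.3], [0.4, 0.6]])   # state transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission probabilities B[state, symbol]
pi = np.array([0.5, 0.5])                # initial state distribution

obs = [0, 1, 0]                          # observed symbol sequence
alpha = pi * B[:, obs[0]]                # initialisation: alpha_1(i)
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]        # recursion: sum over previous states
likelihood = alpha.sum()                 # P(obs) = sum_i alpha_T(i)
```

The recursion costs O(T·S²) for T observations and S states, versus O(Sᵀ) for brute-force path enumeration.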
Neural networks: feedforward NN (3 hours)
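The feedforward unit can be previewed with a single forward pass of a two-layer network with tanh hidden units; the weights below are random and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)  # layer 1: input 4 -> hidden 3
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)  # layer 2: hidden 3 -> output 1

x = rng.normal(size=(5, 4))      # batch of 5 input vectors
h = np.tanh(x @ W1 + b1)         # hidden activations (tanh nonlinearity)
y_hat = h @ W2 + b2              # network output
```

Training fits W1, b1, W2, b2 by gradient descent on a loss, with gradients obtained by backpropagation.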
Learning Outcomes
 State definitions, theorems/results, algorithms related to key concepts
 Apply standard techniques to solve known problems
 Given a task, derive a learning model by defining an appropriate loss function, regulariser, and optimisation problem, and stating the best possible solution
 Analyse and compare models and algorithms with respect to their complexity, performance and applicability
 Develop models/algorithms with small modifications of existing standard techniques for a modification of known task
Text Books
 Richard Duda, Peter Hart, David Stork. Pattern Classification, 2nd ed. John Wiley & Sons, 2001. ISBN 9788126511167.
 Christopher Bishop. Pattern Recognition and Machine Learning. Springer, 2006. ISBN 0387310738.
 Trevor Hastie, Robert Tibshirani, Jerome Friedman. The Elements of Statistical Learning. Springer, 2001. ISBN 0387952845.
References
 Tom Mitchell. Machine Learning. McGraw-Hill, 1997. ISBN 0070428077.
 Shai Shalev-Shwartz and Shai Ben-David. Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014. ISBN 9781107057135.