Syllabus: What is learning? The multilayer perceptron as a concrete example of learning. Learning viewed as multidimensional function approximation, driven either by labeled training examples and prior experience (supervised learning) or by structure discovered in unlabeled data (unsupervised learning).
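As a concrete illustration of the multilayer perceptron viewed as a multidimensional function approximator, a minimal NumPy sketch follows; the layer sizes, the tanh activation, and the random weights are illustrative assumptions, not prescribed by the course.

    import numpy as np

    def mlp_forward(x, W1, b1, W2, b2):
        """Two-layer perceptron: a smooth map from R^d to R^k."""
        h = np.tanh(W1 @ x + b1)   # hidden layer: nonlinear feature map
        return W2 @ h + b2         # output layer: linear read-out

    rng = np.random.default_rng(0)
    d, m, k = 3, 8, 2              # input, hidden, output dimensions (illustrative)
    W1, b1 = rng.normal(size=(m, d)), np.zeros(m)
    W2, b2 = rng.normal(size=(k, m)), np.zeros(k)
    print(mlp_forward(rng.normal(size=d), W1, b1, W2, b2))  # one value of the approximated function

Learning then amounts to adjusting the weights so that this function matches observed data.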
Supervised learning: Variable types and terminology; two simple approaches to prediction, (1) least squares and (2) nearest neighbors. Structured regression models. Model selection and the bias-variance trade-off. Linear methods for regression: least-squares linear regression, the Gauss-Markov theorem, multiple regression, subset selection. Linear methods for classification: linear discriminant analysis, logistic regression. Bayesian decision theory.
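The two simple approaches to prediction can be contrasted in a few lines of NumPy; the synthetic linear data, the absence of an intercept term, and the choice k = 3 are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(50, 2))                               # 50 training inputs in R^2
    y = X @ np.array([1.0, -2.0]) + 0.1 * rng.normal(size=50)  # noisy linear targets
    x0 = np.array([0.5, 0.5])                                  # query point

    # (1) Least squares: fit beta minimizing ||y - X beta||^2, predict x0' beta.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    pred_ls = x0 @ beta

    # (2) k-nearest neighbors: average the targets of the k closest training points.
    k = 3
    idx = np.argsort(np.linalg.norm(X - x0, axis=1))[:k]
    pred_knn = y[idx].mean()

    print(pred_ls, pred_knn)

Least squares makes a strong global linearity assumption (low variance, possibly high bias); nearest neighbors makes almost no structural assumption (low bias, high variance), which is the trade-off the syllabus refers to.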
Overview of neural networks, feedforward and feedback algorithms, an application to handwritten character recognition, specificity vs. generalizability.
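Assuming "feedback" here refers to error backpropagation, a single gradient-descent step on a squared loss for a one-hidden-layer network might look as follows; the learning rate, the loss, and the single training pair are assumptions made for the sketch.

    import numpy as np

    rng = np.random.default_rng(2)
    d, m = 4, 6
    W1, b1 = rng.normal(size=(m, d)), np.zeros(m)
    w2, b2 = rng.normal(size=m), 0.0
    x, t = rng.normal(size=d), 1.0       # one training pair (illustrative)
    lr = 0.1

    # Forward pass.
    a = W1 @ x + b1
    h = np.tanh(a)
    y = w2 @ h + b2
    loss = 0.5 * (y - t) ** 2

    # Backward pass: propagate the error derivative through each layer.
    dy = y - t                           # dL/dy
    dw2, db2 = dy * h, dy
    dh = dy * w2
    da = dh * (1.0 - h ** 2)             # tanh'(a) = 1 - tanh(a)^2
    dW1, db1 = np.outer(da, x), da

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    w2 -= lr * dw2; b2 -= lr * db2
    print(loss)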
Basic ideas of support vector machines, the Vapnik-Chervonenkis dimension.
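A minimal sketch of the maximum-margin idea, assuming scikit-learn is available (the course does not prescribe a library, and the toy two-cluster data are invented for illustration):

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(3)
    X = np.vstack([rng.normal(loc=-2, size=(20, 2)),    # class -1 cluster
                   rng.normal(loc=+2, size=(20, 2))])   # class +1 cluster
    y = np.array([-1] * 20 + [1] * 20)

    clf = SVC(kernel="linear", C=1e3)    # large C approximates a hard margin
    clf.fit(X, y)
    print(clf.support_vectors_)          # points that determine the separating hyperplane
    print(2 / np.linalg.norm(clf.coef_)) # margin width 2/||w||

Only the support vectors enter the solution, which is why the classifier's capacity can be controlled through the margin rather than the raw input dimension.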
Unsupervised learning: Cluster analysis, principal component analysis, independent component analysis.
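Principal component analysis reduces to an eigendecomposition of the sample covariance matrix; a minimal NumPy sketch on synthetic data follows (the data and the choice of two retained components are assumptions):

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(size=(100, 3)) @ np.diag([3.0, 1.0, 0.1])  # anisotropic cloud in R^3

    Xc = X - X.mean(axis=0)                  # center the data
    C = Xc.T @ Xc / (len(X) - 1)             # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)     # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]        # sort components by variance explained
    components = eigvecs[:, order[:2]]       # top two principal directions
    Z = Xc @ components                      # project onto the principal subspace
    print(eigvals[order], Z.shape)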

Suggested Texts:
1. T. Hastie, R. Tibshirani and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd ed., Springer, New York, 2009.
2. R. O. Duda, P. E. Hart and D. G. Stork, Pattern Classification, 2nd ed., John Wiley & Sons, New York, 2001.
3. C. J. C. Burges, A tutorial on support vector machines for pattern recognition, Data Mining and Knowledge Discovery, vol. 2, pp. 121-167, 1998.

Prerequisites: Linear algebra, basic probability and statistics. The course will be covered in 16 weeks, with three hours of lecture per week. Assessment will place 70% weight on theory and 30% on programming assignments, preferably in MATLAB or Python.

https://www.isibang.ac.in/~adean/infsys/database/MMath/SLT.html