FALL 2008 Class Information:
Lectures: Mondays and Wednesdays 10:30-12:00, 36-155
Recitations: Fridays, E15-235, 11:00-12:00
Textbook: Pattern Classification by Duda, Hart, and Stork, with other readings
Staff | Announcements | Assignments | Syllabus | Policies
Instructor:
Prof. Rosalind W. Picard
Office: E15-448
Office hours: Wednesdays 4-5 pm and by appointment.
Phone: 617-253-0611
picard ( @ media dot mit dot edu)
Teaching Assistants:
Hyungil Ahn
Office: E15-450
Office hours: Thursday 4-5 pm and by appointment.
Phone: 617-253-8628
hiahn ( @ media dot mit dot edu)
Dawei Shen
Office: E15-491
Office hours: Tuesdays 2:30-3:30pm and by appointment.
Phone: 617-253-0112
dawei ( @ media dot mit dot edu)
Support Staff:
Mr. Daniel Bender
Office: E15-443f
Phone: 617-253-0369
danielb ( @ media dot mit dot edu)
Staff | Announcements | Assignments | Syllabus | Policies
W 9-3 First day of class. (Visitors welcome, although class size is limited to 20.)
The first recitation will be held Friday Sep 5, providing an overview for those who would like help with MATLAB, probability, or the first problem set. This class has traditionally made heavy use of MATLAB. Here is a MATLAB Tutorial you may find helpful. You might also enjoy this short, insightful (optional) piece on nuances in the use of probability by Media Lab graduate Tom Minka, Ph.D.
There is a useful website for the text, which includes not only PowerPoint slides for the chapters but also errata for each edition. Please correct the errors in your copy now; it will save you time later when you are reading and trying to understand the material.
The IAPR Pattern Recognition Education Resources web site was initiated by the International Association for Pattern Recognition (http://www.iapr.org/). The goal was a web site that can support students, researchers, and staff; of course, advances in pattern recognition and its subfields mean that developing the site will be a never-ending process. Its most important resources, for students, researchers, and educators, include lists with URLs to:
- Tutorials and surveys
- Explanatory text
- Online demos
- Datasets
- Book lists
- Free code
- Course notes
- Lecture slides
- Course reading lists
- Coursework/homework
- A list of course web pages at many universities
PROJECTS Here is the 2008 Class Projects Page.
OLD EXAMPLES OF PROJECTS Here are the 2002 Class Projects Page, the 2004 Class Projects Page, and the 2006 Class Projects Page.
Staff | Announcements | Assignments | Syllabus | Policies
W 9-3 Lecture 1 Introduction, Reading: DHS Chap 1, A.1-A.2
Problem Set 1 (due M 9-15)
Dataset for Problem 4 (2x200 array, 200 points in 2-dim space, each column denotes a sample point (x,y))
Solutions
Monty Hall Demo
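To check the Monty Hall result numerically, here is a minimal MATLAB simulation (an illustrative sketch, separate from the demo above; the trial count and variable names are arbitrary):

% Monte Carlo simulation of the Monty Hall problem:
% compare staying with the first pick vs. switching doors.
nTrials = 100000;
stayWins = 0; switchWins = 0;
for t = 1:nTrials
    car  = ceil(3*rand);                       % door hiding the car
    pick = ceil(3*rand);                       % contestant's initial pick
    % Host opens a door that is neither the pick nor the car
    choices = setdiff(1:3, [pick car]);
    host = choices(ceil(numel(choices)*rand));
    % Switching means taking the remaining unopened door
    switchPick = setdiff(1:3, [pick host]);
    stayWins   = stayWins   + (pick == car);
    switchWins = switchWins + (switchPick == car);
end
fprintf('P(win | stay)   ~ %.3f\n', stayWins/nTrials);    % about 1/3
fprintf('P(win | switch) ~ %.3f\n', switchWins/nTrials);  % about 2/3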
F 9-5 Recitation 1 MATLAB Introduction
M 9-8 Lecture 2 Reading: DHS Chap A.2-A.5
W 9-10 Lecture 3 Reading: DHS Chap 2.1-2.4 (can skip 2.3.1, 2.3.2)
F 9-12 Recitation 2
M 9-15 Lecture 4 Reading: DHS Chap 2.5-2.7
Problem Set 2 (due W 9-24)
Dataset for Problem 6
Solutions
W 9-17 Lecture 5 Reading: DHS Chap 2.8.3, 2.9, 2.11
Reading: Independence Diagrams, Thomas P. Minka, 1998.
F 9-19 Recitation 3
M 9-22 Student Holiday, No Classes
W 9-24 Lecture 6 Inference on Bayes Nets and Dynamic Bayes Nets, guest lecture by Dr. Rana el Kaliouby
Reading: Introduction to Graphical Models and Bayesian Networks, Kevin Murphy, 1998.
(DBN Lecture PDF slides)
Problem Set 3 (due M 10-6)
Four Datasets for Problem 1 (dataset1.txt dataset2.txt dataset3.txt dataset4.txt)
MATLAB code for Problem 2 (conditional_dep.m)
MATLAB code for Problem 2 (graph_draw.m)
MATLAB code for Problem 4 (generate_sample_dbn.m learning_dbn.m inference_dbn.m)
Solutions
Kevin Murphy's Bayes Net Toolbox for Matlab
Tutorial Pages for this Toolbox:
1. General Usage
2. DBN Usage
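As a quick taste of the toolbox, here is a minimal sketch modeled on the sprinkler example from the BNT documentation (it assumes BNT is installed and on your MATLAB path):

% Classic sprinkler network: Cloudy -> Sprinkler, Cloudy -> Rain,
% Sprinkler -> WetGrass, Rain -> WetGrass. All nodes are binary.
N = 4;
C = 1; S = 2; R = 3; W = 4;
dag = zeros(N, N);
dag(C, [S R]) = 1; dag(S, W) = 1; dag(R, W) = 1;
node_sizes = 2*ones(1, N);
bnet = mk_bnet(dag, node_sizes);
% Conditional probability tables (values from the BNT tutorial)
bnet.CPD{C} = tabular_CPD(bnet, C, [0.5 0.5]);
bnet.CPD{S} = tabular_CPD(bnet, S, [0.5 0.9 0.5 0.1]);
bnet.CPD{R} = tabular_CPD(bnet, R, [0.8 0.2 0.2 0.8]);
bnet.CPD{W} = tabular_CPD(bnet, W, [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);
% Exact inference with the junction-tree engine
engine = jtree_inf_engine(bnet);
evidence = cell(1, N);
evidence{W} = 2;                         % observe: grass is wet
[engine, loglik] = enter_evidence(engine, evidence);
marg = marginal_nodes(engine, S);
fprintf('P(Sprinkler=on | WetGrass=on) = %.3f\n', marg.T(2));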
F 9-26 Recitation 4
M 9-29 Lecture 7 DHS 3.1-3.5.1
W 10-1 Lecture 8 DHS 3.7.1-3.7.3, 3.8
Additional optional reading: M. Turk and A. Pentland (1991). "Eigenfaces for recognition". Journal of Cognitive Neuroscience 3 (1): 71-86.
Additional optional reading: Peter N. Belhumeur, João P. Hespanha, and David J. Kriegman (1997). "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection". IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, July 1997.
F 10-3 Recitation 5
M 10-6 Lecture 9 HMMs, reading: A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition, L. R. Rabiner, Proceedings of the IEEE, Vol. 77, No. 2, Feb. 1989. Optional extra reading: DHS 3.10 (beware: DHS uses non-standard notation in this section). (See the forward-algorithm sketch below.)
Problem Set 4 (due W 10-15)
Dataset for Problem 3
A tutorial document that helps you understand some of the mathematics behind the HMM algorithms
Solution to Problem 4-1
Solution to Problem 4-2
Solution to Problem 4-3
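If you are coding the algorithms yourself, here is a sketch of Rabiner's scaled forward algorithm (the discrete-observation interface and variable names are this sketch's assumptions; save it as hmm_forward.m):

% Forward algorithm with scaling, in Rabiner's notation:
% A   - NxN transition matrix, A(i,j) = P(q_{t+1}=j | q_t=i)
% B   - NxM emission matrix,   B(j,k) = P(o_t=k | q_t=j)
% pi0 - 1xN initial state distribution
% obs - 1xT observation sequence (integers in 1..M)
function loglik = hmm_forward(A, B, pi0, obs)
    T = numel(obs); N = size(A, 1);
    alpha = zeros(T, N); c = zeros(T, 1);
    alpha(1,:) = pi0 .* B(:, obs(1))';          % initialization
    c(1) = sum(alpha(1,:));
    alpha(1,:) = alpha(1,:) / c(1);             % scale to avoid underflow
    for t = 2:T                                 % induction
        alpha(t,:) = (alpha(t-1,:) * A) .* B(:, obs(t))';
        c(t) = sum(alpha(t,:));
        alpha(t,:) = alpha(t,:) / c(t);
    end
    loglik = sum(log(c));                       % log P(O | model)
end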
W 10-8 Lecture 10 HMMs, same reading.
F 10-10 Recitation 6
M 10-13 No Classes, Columbus Day Holiday
W 10-15 Lecture 11 DHS 2.10 and 3.9, Missing Data and Expectation Maximization (see the EM sketch below)
Problem Set 5a (due W 11-5)
Dataset for Problem 2
Solutions (Revision due W 11-12)
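To see the EM mechanics concretely, here is a sketch for a two-component 1-D Gaussian mixture (the toy data, initial guesses, and fixed iteration count are assumptions of this sketch, not part of the problem set):

% EM for a two-component 1-D Gaussian mixture.
x = [randn(1,100) - 2, 1.5*randn(1,100) + 3];   % toy data (assumption)
mu = [-1 1]; sig2 = [1 1]; w = [0.5 0.5];       % initial guesses
for iter = 1:100
    % E-step: posterior responsibility of component 1 for each point
    p1 = w(1) * exp(-(x - mu(1)).^2 / (2*sig2(1))) / sqrt(2*pi*sig2(1));
    p2 = w(2) * exp(-(x - mu(2)).^2 / (2*sig2(2))) / sqrt(2*pi*sig2(2));
    r = p1 ./ (p1 + p2);
    % M-step: re-estimate weights, means, and variances
    w    = [mean(r), mean(1-r)];
    mu   = [sum(r.*x)/sum(r), sum((1-r).*x)/sum(1-r)];
    sig2 = [sum(r.*(x-mu(1)).^2)/sum(r), sum((1-r).*(x-mu(2)).^2)/sum(1-r)];
end
fprintf('weights: %.2f %.2f  means: %.2f %.2f\n', w, mu);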
F 10-17 Recitation 7
M 10-20 Lecture 12 DHS 10.2-10.4.3 Mixture densities, K-means clustering, Quiz review
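For quiz review, it may help to see how little code K-means actually needs; a minimal MATLAB sketch follows (the function name, random initialization, and fixed iteration count are choices of this sketch):

% Basic K-means: X is d-by-N (columns are points), K is the cluster count.
function [mu, labels] = kmeans_sketch(X, K)
    N = size(X, 2);
    p = randperm(N);
    mu = X(:, p(1:K));                          % random initial centers
    for iter = 1:100
        % Assignment step: squared distance from every point to each center
        D = zeros(K, N);
        for k = 1:K
            D(k,:) = sum((X - repmat(mu(:,k), 1, N)).^2, 1);
        end
        [dmin, labels] = min(D, [], 1);
        % Update step: move each center to the mean of its assigned points
        for k = 1:K
            if any(labels == k)
                mu(:,k) = mean(X(:, labels == k), 2);
            end
        end
    end
end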
W 10-22 MIDTERM QUIZ, Covers PS1-4 and L1-L10
F 10-24 Recitation 8
M 10-27 Lecture 13 DHS 10.4.3, 10.4.4, 10.6-10.10 Clustering
W 10-29 CLASS CANCELLED. Media Lab Sponsor Events
F 10-31 Recitation 9
M 11-3 Lecture 14, Provided Data for Projects Ready
Presentations on the available project data. Optional readings (short and very interesting articles to discuss with your friends, with Election Day coming up):
"Election Selection: Are we using the worst voting procedure?" Science News, Nov 2 2002.
"Range voting: Best way to select a leader?"
W 11-5 Lecture 15, Project Plans Due if you're using your own data. DHS 4.1-4.6.1: Nonparametric density estimation, Parzen windows and k-NN estimation, the k-NN classifier, metrics; DHS 5.1-5.5.1 and 5.8.1: Linear Discriminants (see the Parzen-window sketch below).
Problem Set 5b (due F 11-14)
Dataset
Solutions (Revision due W 11-19)
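Here is the promised Parzen-window sketch: a 1-D Gaussian-kernel density estimate (the function name and interface are assumptions of this sketch):

% Parzen-window density estimate with a Gaussian kernel.
% x  - 1xN training samples;  xs - evaluation points;  h - window width.
function p = parzen1d(x, xs, h)
    N = numel(x);
    p = zeros(size(xs));
    for i = 1:numel(xs)
        u = (xs(i) - x) / h;                    % scaled distances to samples
        p(i) = sum(exp(-0.5*u.^2)) / (N * h * sqrt(2*pi));
    end
end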
F 11-7 Recitation 10
M 11-10 No Classes; Veterans Day Holiday, MAS students and faculty attend Crit Day
W 11-12 Lecture 16, All Project Plans Due DHS 6.1-6.6 and 6.8, Multilayer Neural Networks
No more problem sets: focus on your project now
F 11-14 Recitation 11
M 11-17 Lecture 17 Feature Selection, webpage shown in class: http://ro.utia.cz/fs/fs_guideline.html
DHS reading: Entropy/Mutual information A.7; Decision Trees 8.1-8.4
W 11-19 Lecture 18 DHS 5.11 SVMs; 9.1-9.2.1 No Free Lunch; 9.2.3-9.2.5 MDL and Occam's razor; 9.3 Bias and Variance; 9.5 Bagging, Boosting, Active Learning; 9.6 Estimating and Comparing Classifiers; 9.7 Classifier Combination
F 11-21 Recitation 12
M 11-24 Lecture 19 Project Progress Presentations/ Critique Day/ Attendance counts toward grade today
W 11-26 Lecture 20 Project Progress Presentations/ Critique Day/ Attendance counts toward grade today
F 11-28 No Recitation, Thanksgiving Vacation
M 12-1 Lecture 21 Reinforcement Learning, guest lecturer: Hyungil Ahn (see the Q-learning sketch after the readings)
Readings:
Reinforcement Learning: An Introduction by Sutton and Barto
Reinforcement Learning by Dayan
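To connect the readings to code, here is a tabular Q-learning sketch on a toy five-state chain (the environment, reward, and parameter values are all illustrative assumptions, not taken from the readings):

% Tabular Q-learning on a 5-state chain. Moving right from state 5
% earns reward 1 and restarts the episode at state 1.
nS = 5; nA = 2;                        % actions: 1 = left, 2 = right
Q = zeros(nS, nA);
alpha = 0.1; gamma = 0.9; epsilon = 0.1;
s = 1;
for step = 1:20000
    if rand < epsilon
        a = ceil(nA*rand);             % explore: random action
    else
        [qmax, a] = max(Q(s,:));       % exploit: greedy action
    end
    if a == 2
        s2 = min(s+1, nS);
    else
        s2 = max(s-1, 1);
    end
    r = 0;
    if s == nS && a == 2               % reaching the right end pays off
        r = 1; s2 = 1;                 % ...and restarts the episode
    end
    % Q-learning update: bootstrap from the best next-state value
    Q(s,a) = Q(s,a) + alpha*(r + gamma*max(Q(s2,:)) - Q(s,a));
    s = s2;
end
disp(Q)                                % "right" should dominate every state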
W 12-3 Lecture 22 Kalman Filtering, guest lecturer: Dawei Shen
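As a preview of the lecture, here is a one-dimensional Kalman filter sketch that tracks a constant signal in noise (the model and the noise variances are illustrative assumptions):

% 1-D Kalman filter: estimate a constant value from noisy measurements.
% Model: x_t = x_{t-1} + w,  z_t = x_t + v.
T = 50;
xtrue = 1.0;                       % hidden constant state
z = xtrue + 0.5*randn(1, T);       % noisy measurements
Qn = 1e-4; Rn = 0.25;              % process / measurement noise variances
xhat = 0; P = 1;                   % initial estimate and its variance
for t = 1:T
    % Predict (identity dynamics): variance grows by process noise
    P = P + Qn;
    % Update with measurement z(t)
    K = P / (P + Rn);              % Kalman gain
    xhat = xhat + K*(z(t) - xhat);
    P = (1 - K)*P;
end
fprintf('final estimate %.3f (true %.1f)\n', xhat, xtrue);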
F 12-5 Recitation 13 Project help session
M 12-8 Project Presentations: Attendance counts toward grade today.
W 12-10 Project Presentations, Last Day of Class: Attendance counts toward grade today.
Staff | Announcements | Assignments | Syllabus | Policies
Fall 2008 Syllabus: (subject to adjustment)
Intro to pattern recognition, feature detection, classification
Review of probability theory, conditional probability and Bayes rule
Random vectors, expectation, correlation, covariance
Review of linear algebra, linear transformations
Decision theory, ROC curves, Likelihood ratio test
Linear and quadratic discriminants, Fisher discriminant
Sufficient statistics, coping with missing or noisy features
Template-based recognition, feature extraction
Eigenvector and Fisher Linear Discriminant analysis
Training methods, Maximum likelihood and Bayesian parameter estimation
Linear discriminant/Perceptron learning, optimization by gradient descent
Support Vector Machines
K-Nearest-Neighbor classification
Non-parametric classification, density estimation, Parzen estimation
Unsupervised learning, clustering, vector quantization, K-means
Mixture modeling, Expectation-Maximization
Hidden Markov models, Viterbi algorithm, Baum-Welch algorithm
Bayesian networks
Decision trees, Multi-layer Perceptrons
Optional end-of-term topics:
Combination of multiple classifiers ("committee machines")
Reinforcement Learning and Affective Decision-Making
Genetic algorithms
Kalman filtering
Case studies in pattern recognition
Staff | Announcements | Assignments | Syllabus | Policies
30% Homework/Mini-projects, due every 1-2 weeks until 3 weeks before the end of the term. These will involve both programming (MATLAB) and non-programming assignments.
New homework submission and grading policy:
On the day the homework is due, bring your best effort and submit it in class. Please photocopy your homework before submission. At the end of class, you will receive the homework solutions. You then have a second opportunity to refine your work after studying the solutions. Submit your revised homework by recitation time on Friday at the latest; if you wish to submit it earlier, you can bring it directly to the TAs' offices. Your final homework grade will be the average of the two scores. Directly copying answers from the solution sheet is unacceptable. Again, learning and understanding the course material is far more important than getting a revised score.
30% Project, with approximate due dates given in the schedule above.
15% Your presence and interaction in lectures (especially your presence during the two days of project critiques and the two days of final project presentations, which is 10%), in recitation, and with the staff outside the classroom.
The midterm will be closed-book, but we will allow a cheat sheet.