FALL 2006 Class Information:
Lectures: Mondays and Wednesdays 2:30 - 4:00, 32-124
Recitations: Fridays 2:30-3:30, E15-235 (subject to change)
Textbook: Pattern Classification by Duda, Hart, and Stork, supplemented with other readings
Staff:
Instructor:
Prof. Rosalind W. Picard
Office: E15-448
Office hours: Wednesdays 4-5 pm and by appointment.
Phone: 617-253-0611
picard ( @ media dot mit dot edu)
Teaching Assistants:
Andrea L. Thomaz, Ph.D. (Head TA)
Office: E15-485
Office hours: Tuesdays, 9-10am and by appointment
Phone: 617-452-5612 (office)
alockerd ( @ media dot mit dot edu)
Bo Morgan
Office: outside of E15-352
Office hours: Wednesdays 9-10 am and by appointment
Phone: 617-803-3629 (mobile)
bo ( @ mit dot edu)
Support Staff:
Ms. Lynne Lenker
Office: E15-443f
Phone: 617-253-0369
llenker ( @ media dot mit dot edu)
(works part-time; usually in the office midday, Mon-Thurs.)
Announcements:
W 9-6 First day of class. (Visitors welcome.) The first recitation will be held Friday Sep 8 for those who would like help with MATLAB, probability, or the first problem set. This class has traditionally made heavy use of MATLAB. Here is a MATLAB Tutorial you may find helpful. You might also enjoy this short insightful (optional) piece on nuances in the use of probability by Tom Minka.
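If MATLAB is new to you, a short warm-up script like the one below may help before the first recitation. It is only an illustrative sketch (the biased-coin example and all the numbers are made up, not from the tutorial or the problem set): it shows the vectorized style the class tends to rely on and checks a simulated conditional probability against the Bayes-rule answer.

% Minimal MATLAB warm-up (illustrative only): estimate P(coin = biased | heads)
% by simulation and compare it with the Bayes-rule answer.
N = 100000;                          % number of simulated trials
pBiased = 0.5;                       % prior probability of picking the biased coin
pHeadsB = 0.9;                       % P(heads | biased coin)
pHeadsF = 0.5;                       % P(heads | fair coin)

pickBiased = rand(N,1) < pBiased;                  % which coin was picked on each trial
pHeads     = pHeadsF + (pHeadsB - pHeadsF) .* pickBiased;
heads      = rand(N,1) < pHeads;                   % outcome of each flip

empirical = sum(pickBiased & heads) / sum(heads);  % simulated P(biased | heads)
bayes     = pHeadsB*pBiased / (pHeadsB*pBiased + pHeadsF*(1-pBiased));
fprintf('simulated %.3f   Bayes rule %.3f\n', empirical, bayes);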
M 9-11 There is a useful website for the text, which includes not only ppt slides for the chapters but also errata for each edition. Please correct the errors in your copy now; it will save you time later when you are reading and trying to understand the material.
M 9-18 There will be two extra office hours on Monday 9-18: Bo 9-10am; Andrea 10-11am.
W 9-20 A few typos were found in Problem Set 2; please download it again.
W 10-25 Andrea and Bo will hold a quiz review/Q&A session on Monday evening 10/30/06 5pm-7pm in room E15-235.
W 10-25 Prof. Picard will hold extra office hours before the quiz next week, on Wednesday 12:15-1:15.
M 11-6 One-day extension on Problem Set 5; it is now due at 2pm tomorrow. Hand it in at Andrea's office by then.
PROJECTS Here is the 2006 Class Projects Page, where you can find descriptions of the datasets available this year.
OLD EXAMPLES OF PROJECTS Here is the 2002 Class Projects Page. Here is the 2004 Class Projects Page.
Assignments:
W 9-6 Lecture 1 Reading: DHS Chap 1, A.1-A.2
Problem Set 1 (due M 9-18)
Data set
Solutions
Monty Hall Demo
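A note on the Monty Hall demo: if you would rather convince yourself numerically, a simulation along the following lines works. This is an illustrative sketch only, not the code behind the demo.

% Monty Hall simulation (illustrative sketch): estimate the win rate
% when staying with the first choice vs. switching doors.
N = 100000;
wins_stay = 0; wins_switch = 0;
for t = 1:N
    prize  = randi(3);               % door hiding the car
    choice = randi(3);               % contestant's initial pick
    % Host opens a door that is neither the prize nor the choice.
    remaining = setdiff(1:3, [prize choice]);
    opened = remaining(randi(numel(remaining)));
    % Switching means taking the one unopened, unchosen door.
    switched = setdiff(1:3, [choice opened]);
    wins_stay   = wins_stay   + (choice   == prize);
    wins_switch = wins_switch + (switched == prize);
end
fprintf('stay: %.3f   switch: %.3f\n', wins_stay/N, wins_switch/N);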
F 9-8 Recitation 1 Matlab Introduction
M 9-11 Lecture 2 Reading: DHS Chap A.2-A.5
W 9-13 Lecture 3 Reading: DHS Chap 2.1-2.4 (can skip 2.3.1, 2.3.2)
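For the Chapter 2 material, the two-class Gaussian case is easy to experiment with in MATLAB. The sketch below is a toy example (the means, variance, and priors are made up): it draws labeled data, applies the Bayes decision rule through the class posteriors, and estimates the error rate.

% Bayes decision rule for two 1-D Gaussian classes (illustrative sketch).
mu1 = -1; mu2 = 1; sigma = 1;        % class-conditional densities N(mu_i, sigma^2)
P1 = 0.6; P2 = 0.4;                  % priors (made up)
N = 50000;

labels = (rand(N,1) < P2) + 1;                          % class label 1 or 2, drawn from the priors
x = sigma*randn(N,1) + mu1*(labels==1) + mu2*(labels==2);

% Posteriors via Bayes rule (unnormalized is enough for the decision).
g1 = exp(-(x-mu1).^2/(2*sigma^2)) * P1;
g2 = exp(-(x-mu2).^2/(2*sigma^2)) * P2;
decision = (g2 > g1) + 1;

fprintf('empirical error rate: %.3f\n', mean(decision ~= labels));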
F 9-15 Recitation 2
M 9-18 Lecture 4 Reading: DHS Chap 2.5-2.7
Problem Set 2 (due M 10-2)
Data set
Solutions
W 9-20 Lecture 5 Reading: DHS Chap 2.8.3, 2.9, 3.1-3.2
F 9-22 Recitation 3
M 9-25 Student Holiday, No Classes
W 9-27 Lecture 6 Reading DHS 3.1-3.5.1
F 9-29 Recitation 4
M 10-2 Lecture 7: Introduction to Reinforcement Learning (and Interactive RL) by Andrea Thomaz
Reading: Chapter 21 of AI: A Modern Approach
Problem Set 3 (due M 10-16)
Data set,
RL Activity
Solutions
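As a companion to the RL lecture and activity above, here is a minimal epsilon-greedy sketch on a two-armed bandit. The payoff probabilities are invented for illustration; this is not part of the class activity.

% Epsilon-greedy action selection on a 2-armed Bernoulli bandit (toy example).
pReward = [0.3 0.7];       % true payoff probability of each arm (made up)
Q = [0 0];                 % estimated value of each arm
n = [0 0];                 % number of pulls of each arm
epsilon = 0.1;
for t = 1:5000
    if rand < epsilon
        a = randi(2);                    % explore
    else
        [~, a] = max(Q);                 % exploit the current estimate
    end
    r = rand < pReward(a);               % Bernoulli reward
    n(a) = n(a) + 1;
    Q(a) = Q(a) + (r - Q(a)) / n(a);     % incremental sample-average update
end
fprintf('estimates: %.2f %.2f   pulls: %d %d\n', Q(1), Q(2), n(1), n(2));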
W 10-4 Lecture 8 DHS 3.7.1-3.7.3, 3.8
Additional optional reading: M. Turk and A. Pentland (1991). "Eigenfaces for recognition". Journal of Cognitive Neuroscience 3 (1): 71-86.
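The computation at the heart of the eigenfaces paper is PCA on vectorized images. The sketch below uses a random matrix in place of real face images (all sizes are made up) to show the small-matrix trick the paper uses when there are far fewer images than pixels.

% PCA "eigenfaces" sketch on stand-in data (random matrix in place of real
% face images; with real data each row of X would be one vectorized image).
N = 40; D = 1024;                       % 40 images, 32x32 pixels each (made up)
X = rand(N, D);
mu = mean(X, 1);
Xc = X - repmat(mu, N, 1);              % center the data

% Eigenvectors of the small N x N matrix, then map back to pixel space
% (the Turk & Pentland trick for D >> N).
[V, S] = eig(Xc * Xc');
[~, order] = sort(diag(S), 'descend');
U = Xc' * V(:, order);                  % D x N matrix of (unnormalized) eigenfaces
U = U ./ repmat(sqrt(sum(U.^2, 1)), D, 1);   % normalize each eigenface

coeffs = Xc * U(:, 1:10);               % project onto the top 10 eigenfaces
recon  = coeffs * U(:, 1:10)' + repmat(mu, N, 1);   % low-dimensional reconstruction
fprintf('mean reconstruction error: %.4f\n', mean(mean((X - recon).^2)));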
F 10-6 Recitation 5
M 10-9 Columbus Day Holiday, No Classes
W 10-11 Lecture 9 Intro to Bayes Nets, DHS 2.11
Additional optional reading: Peter N. Belhumeur, João P. Hespanha, and David J. Kriegman (1997). "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection". IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, July 1997.
F 10-13 Recitation 6
M 10-16 Lecture 10: Inference on Bayes Nets and Dynamic Bayes Nets, Dr. Rana el Kaliouby
Reading: Introduction to Graphical Models and Bayesian Networks, Kevin Murphy, 1998.
Problem Set 4 (due M 10-25)
Solutions
Problem 1 Data:
Class0_TrainA
Class0_TrainB
Class0_Test
Class1_TrainA
Class1_TrainB
Class1_Test
Problem 2 Data: BNData.txt
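To accompany the Bayes net lectures, here is a brute-force inference-by-enumeration sketch on a tiny sprinkler-style network. The structure and CPT numbers are chosen only for illustration and are not taken from the reading or from Problem Set 4.

% Inference by enumeration on a small Bayes net with structure
% C -> S, C -> R, {S,R} -> W (cloudy, sprinkler, rain, wet grass).
pC = 0.5;
pS_given_C  = [0.5 0.1];             % P(S=1 | C=0), P(S=1 | C=1)
pR_given_C  = [0.2 0.8];             % P(R=1 | C=0), P(R=1 | C=1)
pW_given_SR = [0.0 0.9; 0.9 0.99];   % rows: S=0/1, cols: R=0/1, entry = P(W=1 | S,R)

% Build the full joint P(C,S,R,W) by enumeration; index 1 = false, 2 = true.
joint = zeros(2,2,2,2);
for c = 0:1, for s = 0:1, for r = 0:1, for w = 0:1
    p = (c*pC + (1-c)*(1-pC)) ...
      * (s*pS_given_C(c+1) + (1-s)*(1 - pS_given_C(c+1))) ...
      * (r*pR_given_C(c+1) + (1-r)*(1 - pR_given_C(c+1))) ...
      * (w*pW_given_SR(s+1,r+1) + (1-w)*(1 - pW_given_SR(s+1,r+1)));
    joint(c+1,s+1,r+1,w+1) = p;
end, end, end, end

% P(S=1 | W=1): sum the joint over the hidden variables and normalize.
num = sum(sum(joint(:,2,:,2)));        % S=1 and W=1, summed over C and R
den = sum(sum(sum(joint(:,:,:,2))));   % W=1, summed over everything else
fprintf('P(S=1 | W=1) = %.3f\n', num/den);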
W 10-18 CLASS CANCELLED. Media Lab Sponsor Events (MAS students please attend the sponsor events, watch for examples of pattern recognition, and think about more ways to put this material to good use.)
F 10-20 Recitation 7
M 10-23 Lecture 11 HMMs, reading: "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," L.R. Rabiner, Proceedings of the IEEE, Vol. 77, No. 2, Feb. 1989.
W 10-25 Lecture 12 HMMs, same reading.
Problem Set 5 (due M 11-6)
Solutions
Data: hmmdata2.tar.gz
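Alongside the Rabiner tutorial, the forward algorithm for a discrete-observation HMM fits in a few lines of MATLAB. The two-state model and observation sequence below are made up (this is not the hmmdata2 data), and the scaling you would need for long sequences is omitted.

% Forward algorithm for a discrete-observation HMM (toy 2-state model).
A   = [0.7 0.3; 0.4 0.6];    % A(i,j) = P(state j at t+1 | state i at t)
B   = [0.9 0.1; 0.2 0.8];    % B(i,k) = P(observation k | state i)
pi0 = [0.5; 0.5];            % initial state distribution
obs = [1 1 2 1 2 2];         % observation sequence (symbols 1 or 2, made up)

T = numel(obs);
alpha = zeros(2, T);
alpha(:,1) = pi0 .* B(:, obs(1));                        % initialization
for t = 2:T
    alpha(:,t) = (A' * alpha(:,t-1)) .* B(:, obs(t));    % induction step
end
fprintf('P(observations | model) = %.6f\n', sum(alpha(:,T)));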
F 10-27 Recitation 8
M 10-30 Lecture 13 DHS 2.10 and 3.9
W 11-1 MIDTERM QUIZ
F 11-3 Recitation 9
M 11-6 Lecture 14 DHS 10.2-10.4.2; data provided for projects is now available.
Problem Set 6 (due M 11-20)
Solutions
problem6data.tar
Optional Readings (short and very interesting articles to discuss with your friends, given that Election Day is coming):
"Election Selection: Are we using the worst voting procedure?" Science News, Nov 2, 2002.
"Range voting: Best way to select a leader?"
W 11-8 Lecture 15 DHS 4.1-4.6, DHS 5.1-5.5.1 and 5.8.1 DUE TODAY: Your project plan (one page) if you are using your own data.
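The nonparametric part of this reading leads up to nearest-neighbor methods, and a k-NN classifier can be written directly from the definition. The sketch below uses synthetic 2-D Gaussian data and an arbitrary k, purely for illustration.

% k-nearest-neighbor classification on synthetic 2-D data (illustrative only).
k = 5;
Ntrain = 200; Ntest = 200;
Xtrain = [randn(Ntrain/2,2); randn(Ntrain/2,2) + 2];     % two Gaussian classes
ytrain = [ones(Ntrain/2,1); 2*ones(Ntrain/2,1)];
Xtest  = [randn(Ntest/2,2);  randn(Ntest/2,2) + 2];
ytest  = [ones(Ntest/2,1);   2*ones(Ntest/2,1)];

ypred = zeros(Ntest,1);
for i = 1:Ntest
    d = sum((Xtrain - repmat(Xtest(i,:), Ntrain, 1)).^2, 2);   % squared distances
    [~, idx] = sort(d);
    ypred(i) = mode(ytrain(idx(1:k)));    % majority vote among the k nearest
end
fprintf('k-NN test accuracy: %.2f\n', mean(ypred == ytest));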
F 11-10 NO RECITATION Veterans Day Holiday
M 11-13 Lecture 16: DHS 6.1-6.6 and 6.8
W 11-15 Lecture 17: Kalman Filtering by Nicholas Mavridis; DUE TODAY: Final project plans.
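A one-dimensional Kalman filter is short enough to write out in full. The sketch below tracks a scalar random-walk state from noisy measurements; the model and noise variances are made up and are not the lecture's example.

% 1-D Kalman filter on a toy random-walk model (all parameters made up).
T = 100;
q = 0.01;  r = 0.5;                  % process and measurement noise variances
x = zeros(1,T);
for t = 2:T                          % simulate the true random-walk state
    x(t) = x(t-1) + sqrt(q)*randn;
end
z = x + sqrt(r)*randn(1,T);          % noisy measurements

xhat = zeros(1,T);  P = 1;           % initial estimate and its variance
for t = 2:T
    xpred = xhat(t-1);               % predict
    Ppred = P + q;
    K = Ppred / (Ppred + r);         % Kalman gain
    xhat(t) = xpred + K*(z(t) - xpred);   % update with measurement z(t)
    P = (1 - K)*Ppred;
end
fprintf('mean squared tracking error: %.4f\n', mean((xhat - x).^2));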
F 11-17 Recitation 10
M 11-20 Lecture 18 Multilinear Analysis by Alex Vasilescu
W 11-22 Lecture 19: Introduction to Genetic Algorithms, Bo Morgan (DROP DAY)
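To go with the genetic algorithms lecture, here is a bare-bones GA (bitstring representation, tournament selection, one-point crossover, bit-flip mutation) maximizing a toy "one-max" fitness. The task and all parameters are invented for illustration, not material from the lecture.

% Minimal genetic algorithm maximizing a toy fitness (count of 1-bits).
nPop = 30; nBits = 20; nGen = 50;
pMut = 0.02;
pop = rand(nPop, nBits) > 0.5;                    % random initial population
for g = 1:nGen
    fitness = sum(pop, 2);                        % "one-max" fitness
    newPop = false(nPop, nBits);
    for i = 1:nPop
        % Tournament selection of two parents.
        c = randi(nPop, 1, 2);  [~, w] = max(fitness(c));  p1 = pop(c(w), :);
        c = randi(nPop, 1, 2);  [~, w] = max(fitness(c));  p2 = pop(c(w), :);
        % One-point crossover and bit-flip mutation.
        cut = randi(nBits - 1);
        child = [p1(1:cut) p2(cut+1:end)];
        child = xor(child, rand(1, nBits) < pMut);
        newPop(i, :) = child;
    end
    pop = newPop;
end
fprintf('best fitness after %d generations: %d / %d\n', nGen, max(sum(pop,2)), nBits);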
F 11-24 NO RECITATION Thanksgiving Vacation
M 11-27 Lecture 20 Clustering: DHS 10.4.3, 10.4.4, 10.6-10.10
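For the clustering reading, k-means is simple enough to write from scratch rather than calling a toolbox. The sketch below runs it on synthetic 2-D data with k = 3; the data and the number of clusters are chosen arbitrarily for illustration.

% k-means clustering from scratch on synthetic 2-D data (illustrative only).
k = 3;
X = [randn(100,2); randn(100,2)+4; randn(100,2)+[4*ones(100,1) -4*ones(100,1)]];
N = size(X,1);
centers = X(randperm(N, k), :);              % initialize with k random points
for iter = 1:50
    % Assignment step: nearest center for each point.
    d = zeros(N, k);
    for j = 1:k
        d(:,j) = sum((X - repmat(centers(j,:), N, 1)).^2, 2);
    end
    [~, assign] = min(d, [], 2);
    % Update step: recompute each center as the mean of its points.
    for j = 1:k
        if any(assign == j)
            centers(j,:) = mean(X(assign == j, :), 1);
        end
    end
end
disp('final cluster centers:'); disp(centers);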
W 11-29 Lecture 21 Feature Selection
Webpage shown in class: http://ro.utia.cz/fs/fs_guideline.html
DHS reading: Entropy/Mutual Information A.7; Decision Trees 8.1-8.4
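Entropy and mutual information (DHS A.7) are the quantities behind both the feature-selection guideline page and decision-tree splitting. The sketch below computes them for a small made-up joint distribution over a binary feature and a binary class label.

% Entropy and mutual information for a made-up joint distribution P(feature, class).
Pxy = [0.30 0.10;        % rows: feature = 0/1
       0.15 0.45];       % cols: class   = 0/1
Px = sum(Pxy, 2);        % marginal over the feature
Py = sum(Pxy, 1);        % marginal over the class

H = @(p) -sum(p(p > 0) .* log2(p(p > 0)));    % entropy in bits
Hx  = H(Px);
Hy  = H(Py);
Hxy = H(Pxy(:));
MI  = Hx + Hy - Hxy;     % I(X;Y) = H(X) + H(Y) - H(X,Y)
fprintf('H(X)=%.3f  H(Y)=%.3f  I(X;Y)=%.3f bits\n', Hx, Hy, MI);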
F 12-1 Recitation 11
M 12-4 Lecture 22
W 12-6 Lecture 23
F 12-8 Recitation 12 for help with projects if necessary
M 12-11 Project Presentations: Everyone required to attend class today from 2:30 until 5pm
W 12-13 Project Presentations: Everyone required to attend class today from 2:30 until 5pm (Last day of class.)
Fall 2006 Syllabus: (subject to adjustment)
Intro to pattern recognition, feature detection, classification
Review of probability theory, conditional probability and Bayes rule
Random vectors, expectation, correlation, covariance
Review of linear algebra, linear transformations
Decision theory, ROC curves, Likelihood ratio test
Linear and quadratic discriminants, Fisher discriminant
Sufficient statistics, coping with missing or noisy features
Template-based recognition, feature extraction
Eigenvector and Multilinear analysis
Training methods, Maximum likelihood and Bayesian parameter estimation
Linear discriminant/Perceptron learning, optimization by gradient descent
Support Vector Machines
K-Nearest-Neighbor classification
Non-parametric classification, density estimation, Parzen estimation
Unsupervised learning, clustering, vector quantization, K-means
Mixture modeling, Expectation-Maximization
Hidden Markov models, Viterbi algorithm, Baum-Welch algorithm
Linear dynamical systems, Kalman filtering
Bayesian networks
Decision trees, Multi-layer Perceptrons
Reinforcement Learning with human interaction
Genetic algorithms
Combination of multiple classifiers ("committee machines")
Policies:
30% Project with approximate due dates:
10% Your presence and interaction in lectures (especially your presence during the last two days of project presentations), in recitation, and with the staff outside the classroom.
The midterm will be closed-book, but we will allow a cheat sheet.