SPRING 2002 Class Information:
Wednesdays 10:00-12:00
Room E15-335
Textbook: Affective Computing by Rosalind Picard (MIT Press),
and other readings
Staff | Announcements | Assignments | Syllabus | Policies
Instructor:
Prof. Rosalind W. Picard
Office: E15-020g
Office hours: Mondays 11 am-12 noon, except April 1 and 22 (traveling)
and holidays; also by appointment.
Phone: 253-0611
picard@media.mit.edu
Support Staff:
Ms. Vickey McKiernen
E15-020a
253-0369
vm@media.mit.edu
Staff | Announcements | Assignments | Syllabus | Policies
Announcements:
2/6/02 First day of class. (Visitors welcome.)
2/13/02 Projects: I will say much more about possible projects in the coming classes. If you're interested in seeing some past projects, last year's projects are online.
2/19/02 Upcoming Performance: Teresa Marrin Nakra, PhD, Media Lab 2000, will be presenting a concert involving some work she did on the Conductor's Jacket, a device for sensing a conductor's expressive gestures and mapping them to musical output. The concert is Saturday, February 23, 8pm at the Harvard Adams House Pool Theater on Bow Street. Concert Information; student tickets $3.
2/26/02 On Saturday, March 2, 2:30-3:00, at the Museum of Science, there will be a "Forum" discussion led by experts on xenotransplantation, where we will be fitting the audience with galvactivators and filming their brightness levels. Feel free to come and join us. I think an MIT ID still gets you into the museum for free. The Forum will be meeting in the main part of the museum, out in the open, in the "Current Science and Technology Center."
3/06/02 Note: I will be out of town March 11 and unable to hold office hours that Monday. Office hours will instead be Tuesday, March 12, expanded to 9:30-11:30, for you to come by and discuss project ideas if you wish.
Staff | Announcements | Assignments | Syllabus | Policies
Projects (presentation and web page) due in class 5/15/02
Dear Class,
Remember, your project and its presentation count for 40% of your grade. Also, you will have (TBD, but probably less than) 15 minutes to present: roughly 10 minutes of presentation, with about 5 minutes for interruptions and questions from the class and me. Thus, here are some guidelines to help you in the final push.
Please focus your presentation on these issues:
Due in class Wed May 8:
READ: "SuperToys Last All Summer Long," the story that inspired
Kubrick to make the movie AI, and "R.U.R.," the play that coined the
word "robot." Optional: Affective Computing Chap 4, especially
pp. 124-137.
ASSIGNMENT: Please come to class prepared to discuss the role of affective computing
in these futuristic scenarios. I will also be interested in your views on how likely
certain aspects of these stories are to actually happen.
Due in class Wed May 1:
READ: "Emotion Recognition in Human-Computer Interaction" by Cowie
et al. Most of the in-class discussion will be focused on the sections on
speech, so you may want to give those a closer reading.
ASSIGNMENT: In addition to the readings, we are assigning a short lab and a
write-up based on it.
Part I: In the first part of this assignment, you will be asked to complete a perceptual experiment in which you will rate the affective content of a set of speech utterances.
System Requirements: The Java code for this application is a few months old.
It runs with version 1.3.1 of the Java JDK and version 1.2.2 of the Java Media
Framework (JMF). It is known not to work with version 1.3.4 of the JDK! If
your machine has more recent versions installed, you may find that you can't
hear the sound clips. You may opt to uninstall the newer versions and replace
them with the ones that work, or simply skip all that hassle and drop by E15-001
to run it on Mirth.
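If you are not sure which versions your machine will actually use, a quick check like the sketch below may save some head-scratching. It is only an illustrative snippet (the VersionCheck file and class name are made up here); it assumes JMF's javax.media.Manager class is on your classpath, and it simply prints the versions Java reports:

    // VersionCheck.java -- prints the JDK and JMF versions the runtime sees.
    public class VersionCheck {
        public static void main(String[] args) {
            // Standard Java system property; should report 1.3.1 for the setup that works.
            System.out.println("JDK version: " + System.getProperty("java.version"));
            // javax.media.Manager is part of the Java Media Framework; if this line
            // does not compile, JMF is not installed (or not on the classpath).
            System.out.println("JMF version: " + javax.media.Manager.getVersion());
        }
    }

Compile and run it with "javac VersionCheck.java" and "java VersionCheck"; if the JDK line reports something newer than 1.3.1, that mismatch is the most likely reason the sound clips are silent.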
Part II: Complete the following questions based on your experiences using
the above interface, and what you have learned in this course about models of
emotion.
Due in class Wed Apr 24:
READ: (1) Chaps 3, 16, 17 from The Illusion of Life by F. Thomas and O. Johnston (handed out).
BRING TO CLASS: video assignment per Prof. Bruce Blumberg's request below. Here is a note from Bruce:
Dear Students,
Please read chapters 3, 16 & 17 from The Illusion of Life. It may seem
like a lot, and perhaps it is, but there are also lots of pictures, as
well as a great deal of wisdom. Please note that The Illusion of Life
was written by two old white guys in the 1960s, so you may notice
comments that seem sexist by today's standards. Thank goodness
attitudes have changed (hopefully). So try to ignore the irritating
attitudes in places and focus on the remaining wisdom.
After doing the reading, I want you to find a 30-second to 1-minute
clip from an animation in which a character is presented as being in a
strong emotional state (e.g., very fearful, very happy...). In a 1-page
write-up, carefully analyze exactly what it is about the quality of
movement, posture, staging, sound, etc. that conveys the emotional
state. Try to be as specific as you can. For example, don't just tell
me "I know the character is sad because he is walking sadly." Instead,
analyze exactly what it is about the walk that conveys sadness.
Please bring the clip to class and be prepared to present the clip and your analysis.
Thanks
-bruce
Due 10 a.m. Tues Apr 16:
READ: Zuckerman et al 81, Ekman and O'Sullivan 91, Reeve 93.
WRITE:
1) How is your project progressing? Please jot a few lines telling me
what you've
accomplished so far.
2) Suppose you are called upon to help design the affect-related part
of a lie-detection system for use in airports. When you check in as a
passenger, it is already routine at the check-in counter to ask each
passenger questions such as "Have your bags been out of your immediate
control since you packed them? Has any stranger given you a package
or present to carry?" It is also routine to walk through a metal
detector, and the metal-detection systems in Boston also gather video of you as you
walk through. Describe how you might augment one of these
interactions (or some other part of the experience, if you prefer) to
help catch people who might have malicious life-threatening intent.
What additional questions would you ask if your system involves
questioning? What sensors would you use? What behavior is your
system aimed at detecting? Be very specific, and give a couple of
examples. What do you think would be the strengths of your system?
What do you expect would be its weaknesses? What concerns does your system
raise?
3) A future computer tutor or mentor for kids could potentially be more effective if it
could see if the child is interested or bored. Suppose you have a camera
and the ability to custom-develop computer vision software. Your programmer
is highly talented but
has limited time and wants to know which features are most important to
implement first. Suggest a set of four or so features that you would place
priority on for detecting interest. Justify your answer.
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline. Many thanks!
Due 10 a.m. Tues Apr 9:
READ:
Klein et al (note this paper was accepted to Interacting with
Computers and should appear any day), Card et al, Goleman Chap 11,
Williams (ok to skim the detail about studies of type A behavior),
McCraty et al.
WRITE:
1) Briefly describe a computer interaction you or a friend has had, which gave rise
to one or more emotions, and name the emotion(s).
Suppose the machine could have engaged the user
in dialogue around the time of that interaction. Write a short dialogue that you
think would have been beneficial (for you or for your friend -- you don't have
to specify.) Does your dialogue address emotion specifically, even if subtle?
How well do you think the dialogue you constructed would have worked in reality?
Describe a condition where you would expect it to fail.
2) "Computers appear less judgmental" is certainly a believable reason why
people report more negative health information to computers than to
trained psychiatrists. With your "affective
computing" hat on, identify two other factors that you think would be important
to pay attention to in designing a medical-information gathering system.
3) Heart disease is the number-one killer, at 1.6 million/year (and all cancers combined
kill ~0.5 million/year). Describe a way (it doesn't have to be implementable
yet with current technology) that affective computers might potentially
help reduce this number.
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline. Many thanks!
Due 10 a.m. Tues Apr 2:
READ: Kismet, Fernald, Smith & Scott, Frijda
WRITE: Suppose you were building a robot that had to learn from people who
had no special training or experience teaching robots or computers.
What emotional capabilities would this robot need? Consider: (i) What
emotions would it need to recognize? (ii) What emotions would it need
to express? (iii) What internal regulatory emotional functions might
it need? (iv) How do you think these emotional factors would
influence the learning process in the robot?
SPECIAL NOTE: Please mail your answer to BOTH cynthiab@media.mit.edu and picard@media.mit.edu,
subject: mas630-homework. Prof. Cynthia Breazeal is guest-lecturing and will want to see your responses before class. Thanks!
Due 10 a.m. Tues Mar 19:
READ:
(1) Johnson-Laird Chap 20: pp. 369-384; (2) _Affective
Computing_, Chapter 2 pp. 60-75 and Chap 7; (3)
"What does it mean for a machine to 'have' emotions?", chapter in
R. Trappl, P. Petta, S. Payr, eds., Emotions in Humans and
Artifacts, Cambridge, MA, MIT Press, 2002 (in press); (4)
(optional) Breese & Ball paper handed out in class (nice update to
methods in my book).
WRITE:
1) "Emotions are always caused by cognitive evaluations, whether
conscious or unconscious." The Johnson-Laird paper argues for this.
Can you supply one or two examples, based on readings to date, that appear to
go against this? What is your opinion as to the truth of this
statement?
2) Construct a situation (imaginary system/application we might build)
where one of the models in chapter 7 would be appropriate to use. For
example, one of them is illustrated as being useful for a
poker-playing agent. Why is the model you picked good for this
situation? How well do you think the model would work in the
situation? Does the model have the "right" emotions? Does it operate
at the right level(s) (bodily influences, cognitive reasoning, and so
forth) needed for the situation/problem you describe? What would it
need that it doesn't have yet? (You don't have to build it; this is a
thought experiment.)
3) Suppose that you are hired as an expert on
affective computing, to test if a machine really "has"
emotions. Describe two tests you would perform with the machine. You
do not need to limit your interaction to text-only as in the Turing
test, but you can assume that if you wish. State any assumptions you
are making. What are the strengths of your two tests? What are some
weaknesses of your tests? Feel free to try out your test with Alicebot.
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline. HEY! You really do save me time and effort if you use the subject line
I request, and I really appreciate it when you show this respect -- thank you to
all who do this!
Due by the start of class WED Mar 13:
READ: Picard AC book, Chapters 3 and 8; "Toward Computers that Recognize and Respond to Emotion..." (IBM Systems J.); and TR 543, Kapoor et al., "Towards a Learning Companion."
Due before MIDNIGHT Wed Mar 13:
WRITE for your project proposal (1-2 pages is plenty):
1) A short description of what you hope to build/create/investigate and
a sentence or two about why it is relevant to affective computing (if not obvious.)
2) Special materials/equipment/abilities you will need, and whether you
have them yet or not. If not, what help do you still need?
3) What do you hope to be able to conclude from your project?
4) If "3" is not very precise or very strong, would you like to meet w/me
to discuss ways to refine your plan? In my experience, students too often
bite off too much, when a smaller focused project would actually be more
intersting than a big "ambitious" sounding one that you can't actually do.
Pls take a moment here and evaluate realistically some subsets of your
grand and ambitious ideas that might be most doable.
5) Is this project related to another project (or thesis) you are doing?
If so, that is fine; I just want to know. Please state how this project differs from the related one.
EMAIL:
Please send your response TO: picard@media.mit.edu,
SUBJECT: mas630-homework OR you can slip a hardcopy under my door, E15-020G.
Please let me know if you need another day or so on the proposal -- I can
grant extensions of up to a few days if that helps, but I want to at least
know what the hold-up is.
Due 10 a.m. Tues Mar 5:
READ: (1) Forgas and Moylan 87 on Going to the
Movies, (2) Isen et al 87 on Affect and Creativity,
(3) Isen et al 91 on Affect and Clinical Problem Solving,
(4) Clore 92 on Feelings and Judgment,
and (5) Bouhuys et al 95 on Perceiving Faces.
WRITE responses to these questions (ASCII please):
1) Jack, the please-his-boss pollster, has been given ten questions
with which he must canvass people's opinions. The questions relate to
the overall satisfaction that people perceive with his party's
political figures and their impact both locally and nationwide. He's
not so dishonest that he would modify the questions, but he doesn't
think it's wrong to conduct the poll in a way that makes his party's
political candidates look as good as possible. (You might disagree
about this.) He plans to poll 1000 people nationally by phone and 1000
locally, in person, by some "random" process. Describe three ways
Jack can try to bias the opinions he collects by manipulating
affect-related factors.
2.) "Being emotional influences thinking -- it makes your thinking
less rational." To some extent, we all know this is true. List two
examples from this weeks' readings that support this statement. Then
list two examples from this week's readings that contradict it. Justify
your examples.
3) You've read about "feelings of knowing," "feelings of ease,"
"feelings of uncertainty," "feelings of familiarity" and other
internal signals that are perceived as feelings, but seem to have
primarily cognitive roles. Pick one of these internal signals and
argue why it might be important to implement in a machine. Give an
example of its possible improvement over existing machine capability.
Pick another one and argue against its implementation, also with an
example of why it might be undesirable.
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline. HEY! You really do save me time and effort if you use the subject line
I request, and I really appreciate it when you show this respect -- thank you to
all who do this!
Due 10 a.m. Tues Feb 26:
READ (recommended to read in the order given here): Affective Computing Chap 2 pp. 47-60 (just through the section on expressing emotions), Hama and Tsuda 90, Clynes et al 90, AC book Chap 6, Picard et al 01; optional, on the Conductor's Jacket: Marrin articles TR 470 and TR 475.
WRITE:
1) Name a strong point of the Hama and Tsuda study. Name a weak point
of the study.
2) It is interesting to use measurement techniques to test for emotional
interactions. Do you think love is blocked by lying but not by anger (in
the sense Clynes describes)? Support or critique the evidence for this
interaction.
3) The Gnu York Times prints "MIT researchers have demonstrated that
a computer can recognize eight of your emotions with greater than 80% accuracy."
Critique this statement; don't hesitate to be critical.
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline.
Due 10 a.m. Tues Feb 19:
READ: Affective Computing Chap 5, Dawson90, Schlosberg54, LeDoux94, and http://www.media.mit.edu/galvactivator.
WRITE:
1) Briefly describe an experience where the galvactivator brightness
changed significantly while you or somebody you know was wearing it and
had a clear change in emotional state.
2) Briefly describe an experience where the galvactivator brightness
changed significantly while you or somebody you know was wearing it and
did not have any obvious change in emotional state.
3) Schlosberg's "Attention-Rejection" (A-R) axis can be seen as a
third axis for the arousal/activity - valence space. Another commonly used
third axis is the "dominance" axis -- a raging forest fire dominates you,
whereas an ant is dominated by you. Construct a scenario involving a computer
where the A-R axis might be useful. Construct a scenario where the "dominance"
axis might be useful. (For a concrete picture of such a three-axis representation, see the small sketch after this list.)
4) Construct a scenario (however fanciful) where it might be useful
for an ordinary office computer to have the computer-equivalent of one
of the mechanisms LeDoux describes. Make clear which mechanism you are
embedding in the computer. Comment on how valuable (or not) you think this
feature might be.
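As an aside, for those who like to think in code: here is a minimal, purely illustrative sketch (the class and field names are hypothetical, not from the readings) of how a system might represent an emotional state as a point in a three-axis space, where the third coordinate can be read as attention-rejection or as dominance depending on the application:

    // EmotionState.java -- a hypothetical three-axis representation of affect.
    // Each coordinate is normalized to the range [-1, 1].
    public class EmotionState {
        public final double valence;   // unpleasant (-1) ... pleasant (+1)
        public final double arousal;   // calm (-1) ... highly activated (+1)
        public final double thirdAxis; // e.g., rejection (-1) ... attention (+1),
                                       // or dominated (-1) ... dominant (+1)

        public EmotionState(double valence, double arousal, double thirdAxis) {
            this.valence = valence;
            this.arousal = arousal;
            this.thirdAxis = thirdAxis;
        }

        // One simple way a system could decide whether a user's state has
        // shifted noticeably: Euclidean distance between two states.
        public double distanceTo(EmotionState other) {
            double dv = valence - other.valence;
            double da = arousal - other.arousal;
            double dt = thirdAxis - other.thirdAxis;
            return Math.sqrt(dv * dv + da * da + dt * dt);
        }

        public static void main(String[] args) {
            // Example: a calm, content state versus a fearful, dominated one.
            EmotionState calmContent = new EmotionState(0.6, -0.4, 0.2);
            EmotionState fearful = new EmotionState(-0.7, 0.8, -0.8);
            System.out.println("Distance: " + calmContent.distanceTo(fearful));
        }
    }

Whether the third slot holds A-R or dominance changes which behaviors the system should attend to -- which is exactly the design choice the two scenarios above ask you to explore.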
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline.
Due 10 a.m. Tues Feb 12:
READ: Affective Computing Introduction and Chapter 1, Bechara et al.
WRITE:
1. Construct a specific example of a human-human interaction that involves
affect, and how it might have an "equivalent" human-computer interaction.
Describe the interaction and the affective information in both the human-human
and human-computer cases.
2. Construct another example interaction involving affect, but this
time try to think of a human-human interaction that may not carry over
to the human-computer "equivalent". In what way does "the media equation"
hold (or not) for your example?
3. Name (briefly) three things you learned or liked from this
week's readings, and one thing you didn't like (perhaps some part
that was confusing or raised your skepticism?)
4. If time is short, what one question/issue would you most like to
discuss based on these readings?
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline. Note, I have a preference for plain ASCII responses
(vs. large Word docs, attachments, etc.). Ideally, you should send
me no more than a page of ASCII.
Note: I set the "due date" for each week's assignment to be
24 hours before the following class. This gives me time to look over your
input before we gather again in class. I find that this makes a big difference
in the quality of our time learning together.
Staff | Announcements | Assignments | Syllabus | Policies
Grading:
20% Classroom participation
40% Ten short assignments
40% Project and presentation (proposal Mar 13, presentation May 15)
Policy: All students are expected to attend all classes and
all project presentations. Please contact Prof. Picard in advance if you
will have to miss one of our classes. Unexcused
absences will affect your grade.
Staff | Announcements | Assignments | Syllabus | Policies
SPRING 2002 Syllabus:
Neuroscience findings related to emotion
Emotion and perception, decision-making, and creativity
Emotion and learning
Physiology of emotion
Affect recognition by machines (incl. wearable systems)
Measuring frustration/stress for usability feedback
Responding to user emotion to reduce user frustration
Inducing emotion
Robots/agents that "have" emotion
Emotion and behavior
Expression of emotion by machines/agents/synthetic characters
Philosophical, social, ethical implications of affective computing