SPRING 2004 Class Information:
Wednesdays 10:00-12:00
Room E15-335
Textbook: Affective Computing by Rosalind Picard (MIT Press),
and other readings
Instructor:
Prof. Rosalind W. Picard
Office: E15-020g
Office hours through the end of term: Apr 7, 4-5; Apr 15, 11-12; Apr 22, 12-1; Apr 28, 4-5; May 4, 12-1. Also by appointment.
Office hours before project proposals are due: Wed Mar 3, 4:00-5:00; Thurs Mar 4, 12:00-1:00; Tues Mar 9, 4:30-5:30. Also by appointment.
Phone: 253-0611
picard (at) media (dot) mit (dot) edu
Support Staff:
Ms. Lynne Lenker
E15-020a
253-0369
llenker (at) media (dot) mit (dot) edu
Announcements:
2/4/04 First day of class. (Visitors welcome.) We will be joined by Prof. Seymour Papert for a conversation about affect and learning.
2/4/04 Projects: I will say much more about possible projects in the coming classes. If you're interested in seeing some past projects, the last two years' are here: 2001 Projects and 2002 Projects.
2/18/04 You are all invited to a seminar I will give on "Computers that Sense and Respond Intelligently to Human Affect" at the CSAIL Human-Computer Interaction Seminar, 1:30-2:30, Friday April 16, in 32-346 (the new Stata building).
2/25/04 I am pleased to announce a special guest presentation by Dr. Tim Bickmore on Wed Mar 17. Tim is a pioneer in developing "relational agents" and will describe how affect, empathy, and other relational aspects of interaction can be used to craft interactions that users rated higher in liking, respect, "caring," and several other attributes than a control, even when evaluated after a month of interaction.
3/02/04 I'm pleased to announce upcoming guest presentations by Hugo Liu, Ashish Kapoor, and Prof. Bruce Blumberg (dates TBA).
3/03/04 Two upcoming Harvard colloquia that might be of interest: March 17, Richard Davidson (Univ. of Wisconsin), "The Pervasive Import of Affect: Lessons from Affective Neuroscience"; and April 14, B.J. Casey (Cornell University), "Development of Cognitive and Neural Processes Underlying Attention Conflict and Control." These Harvard Psychology Colloquia are Wednesdays at 4pm at William James Hall on the Harvard campus.
4/14/04 I'm pleased to announce that Win Burleson, doctoral candidate in my group, will be joining us for a discussion of affective pedagogical agents next week.
4/28/04 I'm pleased to announce that Prof. Chris Csikszentmihalyi is planning to join us for a discussion of the philosophical, social, and ethical issues of affective computing next week.
Assignments:
Due in class Wed May 5:
READ (or at least skim) these two readings before class Wed May 5:
Picard, AC Chapter 4: Potential Concerns
"Computers that recognise and respond to user emotion: theoretical and practical implications," R. W. Picard and J. Klein, Interacting with Computers 14, (2002) 141-169.
And please skim these short news/online articles:
The "Little Emotional Controller" Story (Consider: could this scenario really have been prevented with their "emotion technology"?)
Desktop computers to counsel users to make better decisions
Seeing through the face of deception
The Naked Face
Although there is no written assignment for these papers, please be
thinking about the concerns raised in these readings. Please feel free to express
any additional concerns you might have.
Due 10 a.m. Tues Apr 27:
WRITE & EMAIL to picard the following:
1. How is your project going? Please jot down a few lines on what you have accomplished so far, and a timeline of how you expect to proceed for the final two weeks.
READ and please be prepared to DISCUSS in class Wed Apr 28 (handouts):
D. Goleman, Emotional Intelligence, Chapter 11.
R. McCraty et al., The Effects of Emotions on Short-Term Power Spectrum Analysis of Heart Rate Variability (see the sketch below for what such an analysis involves)
R. M. Frankel, Emotion and the Physician-Patient Relationship
R. Robinson and R. West, A Comparison of Computer and Questionnaire Methods of History-Taking...
Selected recent news articles
Although there is no written assignment for these papers, please be
thinking about how affective computing might be used to address the
problems these papers raise.
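Since the McCraty et al. paper assumes some familiarity with HRV spectra, here is a minimal, hypothetical Python sketch of what a short-term power spectrum analysis of heart rate variability involves. It is not from the paper, and the interbeat-interval data are invented: intervals are resampled to an evenly spaced series, a power spectral density is estimated, and power is summed over the standard low-frequency (LF) and high-frequency (HF) bands.

```
# Hypothetical sketch of short-term HRV power spectrum analysis.
# The interbeat intervals (IBIs) below are invented for illustration.
import numpy as np
from scipy.signal import welch

ibi = np.array([0.82, 0.85, 0.90, 0.87, 0.84, 0.88, 0.91, 0.86,
                0.83, 0.89, 0.92, 0.85, 0.84, 0.88, 0.90, 0.87] * 8)  # seconds

t = np.cumsum(ibi)                        # time of each beat
fs = 4.0                                  # resample at 4 Hz (common for HRV)
t_even = np.arange(t[0], t[-1], 1.0 / fs)
ibi_even = np.interp(t_even, t, ibi)      # evenly spaced IBI series

f, psd = welch(ibi_even - ibi_even.mean(), fs=fs, nperseg=128)

df = f[1] - f[0]                          # integrate PSD over standard bands
lf = psd[(f >= 0.04) & (f < 0.15)].sum() * df   # low-frequency power
hf = psd[(f >= 0.15) & (f < 0.40)].sum() * df   # high-frequency power
print(f"LF power: {lf:.2e}   HF power: {hf:.2e}   LF/HF: {lf / hf:.2f}")
```

McCraty et al. relate shifts in the balance of LF and HF power to different emotional states.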
Due 10 a.m. Tues Apr 20:
READ:
"Achieving Affective Impact: Visual Emotive Communication in Lifelike
Pedagogical Agents", James C. Lester, Stuart G. Towns, and Patrick
J. Fitzgerald, Int. Journal of AI in Education (1999), 10, 278-291.
alternate link sans page nos
"Affective Pedagogical Agents and User Persuasion", Chioma
Okonkwa, Proceedings from the 9th International Conference on HCI, Aug
5-10, 2001, New Orleans.
"The integration of affective responses into AutoTutor"
,Person, N. K., Klettke, B., Link, K., Kreuz, R. J., & the
Tutoring Research Group (1999). Proceedings of the International
Workshop on Affect in Interactions, pp.167-178. Siena, Italy.
"Towards a Learning Companion that Recognizes Affect", Ashish
Kapoor, Selene Mota and Rosalind W. Picard, Proceedings from Emotional
and Intelligent II: The Tangled Knot of Social Cognition, AAAI Fall
Symposium, November 2001
WRITE & EMAIL to picard the following:
1. List what you think is the biggest strength and biggest weakness of the emotive capabilities for each of Cosmo, AutoTutor, and the Okonkwo agent.
2. Suppose a pedagogical agent were trained to sense and approximately mimic the smiles and frowns of a learner, in an effort to appear empathetic. Provided this was done subtly, and not as an irritating perfect imitation, how well do you think it would work in terms of building rapport with the learner? Can you illustrate a specific situation where it might be helpful, and another where it might not? (A minimal sketch of such a mirroring mechanism appears after these questions.)
3. What do you think is the biggest strength and biggest weakness of the recognition system proposed in the MIT paper? Can you think of scenarios where it is likely to succeed? Fail?
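To make question 2 concrete, here is a minimal, hypothetical Python sketch of "subtle mirroring" (not any published system): the agent follows the learner's smile/frown intensity with a delay and a damping factor, so the imitation is approximate rather than exact.

```
# Hypothetical sketch of "subtle mirroring" -- not any published system.
# Learner facial valence is coded as smile = +1.0 ... frown = -1.0.
from collections import deque

class MirroringAgent:
    def __init__(self, damping=0.4, delay_frames=15):
        self.damping = damping                    # mirror only 40% of intensity
        self.buffer = deque(maxlen=delay_frames)  # ~0.5 s delay at 30 fps
        self.expression = 0.0                     # agent's current valence

    def observe(self, learner_valence):
        self.buffer.append(learner_valence)

    def update(self):
        if len(self.buffer) == self.buffer.maxlen:
            target = self.damping * self.buffer[0]               # delayed, damped
            self.expression += 0.1 * (target - self.expression)  # smoothed
        return self.expression

agent = MirroringAgent()
for v in [0.0] * 10 + [0.8] * 50:                 # learner starts to smile
    agent.observe(v)
    mirrored = agent.update()
print(round(mirrored, 3))                         # settles near 0.4 * 0.8 = 0.32
```

Considering where this simple rule breaks down (e.g., mirroring a frown of frustration might read as mockery) is one way into the question.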
Due 10 a.m. Tues Apr 13:
READ:
Picard, AC Chapters 2 and 7
"From Human Emotions to Robot Emotions", by Jean-Marc Fellous (2004), AAAI Spring Symposium, Stanford, CA.
WRITE & EMAIL to picard the following:
1. Ask somebody who doesn't know about affective computing what
they think it means for a computer to "have emotions". What
properties of emotion or what emotional abilities do they describe? Do
they think a computer can have emotions like we do? Write about what they said.
Feel free to report any other interesting features of your conversation with them.
2. What capabilities would a computer have to have before you would say
it "has emotions"? If you don't think you would ever say this, then
explain why. Feel free to disagree with what I've written.
3. Construct a situation (imaginary system or application someone
might build) where one of the models in chapter 7 would be appropriate
to use. For example, one of them is illustrated as being useful for a
poker-playing agent. Think of how one of the other models might be
used in a situation, and critique how well you think the model would perform.
Give an example where it would succeed, and where it would fail.
Due 10 a.m. Tues Apr 6:
READ:
"D Learning:
What Dog Learning tells us about Building Characters that can
Learn", Bruce M. Blumberg, Chapter from Exploring Artificial
Intelligence in the new Millennium, ed. G. Lakemeyer & B. Nobel,
Morgan Kaufmann Publishers, 2002. (Roz recommends you look at the whole page, not just the paper linked to on this page).
"The
Misbehavior of Organisms", by Keller Breland and Marian Breland
(1961), American Psychologist, 16, 681-684.
WRITE & EMAIL to both bruce@media.mit.edu and picard@media.mit.edu the following:
1. What is the conventional wisdom on the role of motivation in learning? Why is the Breland article so perplexing given the conventional wisdom? Can you come up with a satisfying explanation for "misbehavior" within the context of motivation-driven learning? If so, what is it? If not, why not?
2. The terms "reward" and "reinforcement" are used extensively in animal and machine learning. Are they well understood concepts in animal learning? Do you think there is a scalar reward signal in the brain? Is a more complex view of reward warranted in machine learning? Why or why not?
3. What modifications would be required to the learning algorithm described in the D-Learning paper so as to model misbehavior?
4. Someone once suggested that animal learning is "all about affect". Do you agree or disagree?
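For question 2, it may help to see how literally "scalar" the reward signal is in standard machine learning. Below is a generic Q-learning sketch (this is not Blumberg's D-Learning algorithm, and the toy environment is invented): the learner's entire motivational state is collapsed into the single number r.

```
# Generic Q-learning sketch -- NOT the D-Learning algorithm from the paper.
# Everything the learner "cares about" is the single scalar r.
import random

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
states, actions = list(range(5)), list(range(2))
Q = {(s, a): 0.0 for s in states for a in actions}

def step(s, a):
    # Invented toy environment: action 1 in state 4 earns "food" (+1).
    r = 1.0 if (s == 4 and a == 1) else 0.0
    return random.choice(states), r               # next state, scalar reward

s = 0
for _ in range(10000):
    if random.random() < EPSILON:                 # occasionally explore
        a = random.choice(actions)
    else:                                         # otherwise exploit
        a = max(actions, key=lambda a_: Q[(s, a_)])
    s2, r = step(s, a)
    best_next = max(Q[(s2, a_)] for a_ in actions)
    Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
    s = s2

print(max(Q, key=Q.get), "has the highest learned value")
```

Nothing in this update rule can express the Brelands' raccoons "washing" tokens instead of depositing them, which is one way into question 3.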
Due 10 a.m. Tues Mar 30:
READ:
"Facial Expression and Emotion," P. Ekman, American Psychologist, Vol. 48, No. 4, 384-392.
"The Face of Interest," Motivation and Emotion, J. Reeve, Vol. 17, No. 4, 1993, 353-371?.
"Recognizing Action Units for Facial Expression Analysis"
Tian, Y.L, Kanade, T., & Cohn, J.F. (2001), IEEE Transactions on Pattern Analysis and
Machine Intelligence, Vol. 23, No. 2, pp.97-116, 2001.
WRITE & EMAIL to both ash@media.mit.edu and picard@media.mit.edu the following:
1. Prepare a list of facial actions that you think might be indicative of the
affective state of interest. Prepare a similar list for the state of
boredom. Comment on your lists.
2. "Facial Actions/Expressions always correspond to an underlying
affective state." Justify or refute this statement taking into account
this week's as well as previous week's readings. If you have trouble with this,
describe what problems arise.
3. Describe at least three things that make automated facial affect recognition non-trivial. (One way in is to try writing the mapping yourself; see the sketch after these questions.)
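For question 3, one way to appreciate the difficulty is to attempt the mapping yourself. The sketch below is a hypothetical rule-based guess at interest vs. boredom from detected FACS action units. Note that the Tian et al. system detects action units only; this affect-inference step is our own illustrative addition, and its brittleness is the point.

```
# Hypothetical rule-based mapping from detected FACS action units (AUs)
# to a guess about interest vs. boredom. Tian et al. detect AUs only;
# this inference step is our own illustrative (and brittle) addition.
INTEREST_AUS = {1, 2, 5}   # AU1/AU2 brow raise, AU5 upper lid raise
BOREDOM_AUS = {43}         # AU43 eye closure

def guess_affect(active_aus):
    interest = len(active_aus & INTEREST_AUS)
    boredom = len(active_aus & BOREDOM_AUS)
    if interest > boredom:
        return "interest?"
    if boredom > interest:
        return "boredom?"
    return "unknown"

print(guess_affect({1, 2}))   # "interest?" -- but AU1+AU2 also occur in surprise
print(guess_affect({43}))     # "boredom?"  -- but eyes also close in concentration
```

That the same action units appear in several different affective states is already one of the difficulties question 3 asks for.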
Note your assignment this week has TWO PARTS:
PART ONE: Due 10 a.m. Tues Mar 16:
READ:
"This Computer Responds to User Frustration: Theory, Design, and Results," J. Klein, Y. Moon, R. W. Picard, Interacting with Computers 14 (2002) 119-140;
"Establishing and Maintaining Long-term Human-Computer Relationships," T. Bickmore and R. W. Picard, to appear in Transactions on Computer-Human Interaction; and
"Towards Caring Machines," T. Bickmore and R. W. Picard, to appear in Proceedings of the Conference on Computer-Human Interaction, Vienna, 2004.
WRITE & EMAIL to both bickmore@media.mit.edu and picard@media.mit.edu the following:
1. Describe a situation in which having a social relationship
with someone is advantageous to achieving one of your goals
(by relationship, I mean anything from a casual acquaintance
to an intimate relationship).
2. How would you define this kind of relationship in an unambiguous
manner? Try to define it in words as precisely as possible.
3. How might a machine take similar advantage of such a situation? (It
is okay to imagine technological capability that does not yet exist.)
Describe in a paragraph or two how a machine might go about
establishing this kind of social relationship.
PART TWO: due in CLASS Wed Mar 17:
Please submit your project proposal
(1-2 pages is plenty). This should
contain the following:
1) A short description of what you hope to build/create/investigate
and a sentence or two about why it is relevant to affective computing
(if not obvious).
2) Special materials/equipment/abilities you will need, and whether
you have them yet or not. If not, what help do you still need?
3) What do you hope to be able to conclude from your project?
4) Is this project related to another project (or thesis) you are
doing? If so, that is fine; I just want to know. Please state how the
project differs from the related project.
5) If you are collaborating, how do you expect the workload to be distributed?
Note your assignment this week has TWO PARTS:
PART ONE: Due 10 a.m. Tues Mar 9:
READ:
Picard AC book: Chapter 3
"Toward Computers that Recognize and Respond to Emotion," R. W. Picard
"A Model of Textual Affect Sensing using Real-World Knowledge," Hugo Liu, Henry Lieberman, and Ted Selker
WRITE & EMAIL the following:
1. Pick an example of text, about half a page, from either an
expressive email (please check with the sender for permission before sharing) or
another expressive source. Label the text (by hand) according to the
categories in Liu's paper. (A sketch of one way to record such labels appears after these questions.)
2. What was most difficult about this task?
3. How appropriate were the six "basic" categories? If they weren't a perfect fit, how
would you modify them to make them fit better?
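For question 1, here is a minimal sketch of one way to record your hand labels, assuming the six "basic" categories in Liu's paper (Ekman's happy, sad, angry, fearful, disgusted, surprised) plus a "neutral" catch-all; the sentences and labels are invented.

```
# One way to record hand labels, sentence by sentence, using the six
# "basic" categories from Liu's paper plus "neutral" for sentences that
# fit none. The example text and labels here are invented.
from collections import Counter

CATEGORIES = {"happy", "sad", "angry", "fearful",
              "disgusted", "surprised", "neutral"}

labels = [
    ("I can't believe they cancelled the show.", "surprised"),
    ("Honestly, it was the best news all week!", "happy"),
    ("Now I have to redo the whole schedule.", "angry"),  # or "sad"?
]

assert all(lab in CATEGORIES for _, lab in labels)
print(Counter(lab for _, lab in labels))
```

Sentences that seem to want two labels at once, or none, are exactly the trouble questions 2 and 3 ask about.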
PART TWO: Due IN CLASS Wed Mar 10:
Please prepare a draft of your project proposal
(1-2 pages is plenty). The final proposal (due in class Mar 17) should
contain the following:
1) A short description of what you hope to build/create/investigate
and a sentence or two about why it is relevant to affective computing
(if not obvious).
2) Special materials/equipment/abilities you will need, and whether
you have them yet or not. If not, what help do you still need?
3) What do you hope to be able to conclude from your project?
4) Is this project related to another project (or thesis) you are
doing? If so, that is fine; I just want to know. Please state how the
project differs from the related project.
For Wed Mar 10, it is okay if you have not decided things to this
level of detail. What I would like is your idea(s) of what you're
thinking of doing, and some thoughts on how these ideas trade off
with respect to 2)-4). I am also available to discuss ideas with you (see
office hours above, or by appointment), and I am happy to try to
hook you up with resources, or to help you team up with others who have
overlapping interests.
Due 10 a.m. Tues Mar 2:
READ:
Forgas and Moylan 87 on Going to the Movies
Isen et al 87 on Affect and Creativity
Isen et al 91 on Affect and Clinical Problem Solving
Clore 92 on Feelings and Judgment
Halberstadt, et al. 1995
WRITE responses to these questions (ASCII please):
1) Jack, the please-his-boss pollster, has been given ten questions
with which he must canvass people's opinions. The questions concern
people's overall satisfaction with his party's politicians and their
impact both locally and nationwide. He's not so dishonest that he
would modify the questions, but he doesn't think it's wrong to conduct
the poll in a way that makes his party's candidates look as good as
possible. (You might disagree about this.) He plans to poll 1000
people nationally by phone and 1000 locally, in person, by some
"random" process. Describe three ways Jack could try to bias the
opinions he collects by manipulating affect-related factors.
Be clear about how you think these manipulations would affect people's opinions.
2.) "Being emotional influences thinking -- it makes your thinking
less rational." To some extent, we all know this is true. List two
examples from this weeks' readings that support this statement. Then
list two examples from this week's readings that contradict it. Justify
your examples.
3) You've read about "feelings of knowing," "feelings of ease,"
"feelings of uncertainty," "feelings of familiarity," and other
internal signals that are perceived as feelings but seem to have
primarily cognitive roles. Pick one of these internal signals and
argue why it might be important to implement in a machine. Give an
example of how it might improve on existing machine capability.
Pick another one and argue against its implementation, also with an
example of why it might be undesirable.
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline. By the way -- *thank you all* for sending printable emails (no attachments) last time.
Due 10 a.m. Tues Feb 24:
READ:
Barsalou et al.
Strack et al.
WRITE:
1) When we read these articles it is important to read critically
and not simply accept everything the authors claim. The Barsalou et
al. paper, because it is a summary, leaves out a lot of detail. (For
example, consider how the Strack et al paper could be reduced to a
one-line claim in a summary paper.) Pick one of the claims that
Barsalou et al. associate with a reference, and generate a list of
two or three considerations that could potentially nullify the claim,
e.g., some specific ways in which the cited authors might not have
properly controlled the experiment, which could allow for a different
interpretation of the results. Then go and obtain the original paper
and read it critically to see if the authors were careful about the
items that you thought were important. Which reference did you pick?
Did the authors properly support the summarized claim? Bring a spare copy
of this paper to class (one we can make copies from if somebody
else wants a copy) and be prepared to share your critique and what
details you learned about one of these supporting pieces of work.
2) Buried in the Barsalou et al. paper are several findings that have
implications for interactive interface design. Write a scenario where
one of these is applied to a future interaction with an affective
computer. For example, consider that you are building a (future)
interactive robot that helps a child learn how to solve puzzles or
other challenging problems. Suppose the robot is able to sense and
respond to various affective and bodily cues in the child. What are
some specific capabilities you might wish to build into the system,
based on findings about "social embodiment"?
3) (Optional) Please feel free to jot an implication that these
readings have for your own work, or one that you'd like to discuss as
a class.
EMAIL:
Please send your response with SUBJECT: mas630-homework TO: picard@media.mit.edu,
by the deadline.
Due 10 a.m. Tues Feb 17:
READ:
Affective Computing Chapter 5,
Dawson90,
Schlosberg54,
LeDoux94,
and http://www.media.mit.edu/galvactivator.
WRITE:
1) Briefly describe an experience where the galvactivator brightness
changed significantly while you or somebody you know was wearing it and
had a clear change in emotional state.
2) Briefly describe an experience where the galvactivator brightness
changed significantly while you or somebody you know was wearing it and
did not have any obvious change in emotional state.
3) Schlosberg's "Attention-Rejection" (A-R) axis can be seen as a
third axis for the arousal/activity-valence space. Another commonly used
third axis is the "dominance" axis: a raging forest fire dominates you,
whereas an ant is dominated by you. Construct a scenario involving a computer
where the A-R axis might be useful. Construct a scenario where the "dominance"
axis might be useful. (A minimal sketch of such a three-axis representation appears after these questions.)
4) Construct a scenario (however fanciful) where it might be useful
for an ordinary office computer to have the computer-equivalent of one
of the mechanisms LeDoux describes. Make clear which mechanism you are
embedding in the computer. Comment on how valuable (or not) you think this
feature might be.
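For question 3, it may help to see how little machinery a third axis adds. Below is a minimal, hypothetical Python sketch of an emotional state as a point in a three-axis space; the axis names and numbers are illustrative only.

```
# Hypothetical sketch: an emotional state as a point in a 3-D space.
# Axis names and numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class EmotionPoint:
    valence: float   # -1 (negative) .. +1 (positive)
    arousal: float   # -1 (calm)     .. +1 (excited)
    third: float     # attention-rejection OR dominance, per application

# Using dominance as the third axis (negative = it dominates you):
fire = EmotionPoint(valence=-0.9, arousal=0.9, third=-0.8)  # raging forest fire
ant = EmotionPoint(valence=-0.1, arousal=0.2, third=0.9)    # ant you dominate

def distance(a, b):
    """Euclidean distance; a computer could label a sensed state with
    the nearest known emotion point."""
    return ((a.valence - b.valence) ** 2 + (a.arousal - b.arousal) ** 2
            + (a.third - b.third) ** 2) ** 0.5

print(round(distance(fire, ant), 2))
```

Whether "third" should mean attention-rejection or dominance for a given application is the design choice the question asks you to motivate.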
EMAIL:
Please send your response with SUBJECT: mas630-homework TO: picard@media.mit.edu,
by the deadline.
Due 10 a.m. Tues Feb 10:
READ:
Affective Computing, Introduction and Chapter 1
Bechara et al.
WRITE:
1. Construct a specific example of a human-human interaction that involves
affect, and how it might have an "equivalent" human-computer interaction.
Describe the interaction and the affective information in both the human-human
and human-computer cases.
2. Construct another example interaction involving affect, but this
time try to think of a human-human interaction that may not carry over
to the human-computer "equivalent". In what way does "the media equation"
hold (or not) for your example?
3. Critique this statement: "Emotions are just special kinds of thoughts."
4. If time is short, what one question/issue would you most like to
discuss based on these readings?
EMAIL:
Please send your response TO: picard@media.mit.edu, SUBJECT: mas630-homework
by the deadline above (24 hours before class, so I have time to see your responses and prepare for class).
Grading:
20% Classroom participation
40% Ten short assignments
40% Project and presentation (proposal draft due March 10, final proposal due by March 17, presentations May 12)
Policy: All students are expected to attend all classes and
all project presentations. Please contact Prof. Picard in advance if you
will have to miss class; unexcused absences will affect your grade.
The final project presentations are especially important for everyone to
attend; please do not plan to leave for the summer until after the last
day of class, May 12.
SPRING 2004 Syllabus:
Neuroscience findings related to emotion
Emotion and perception, decision-making, and creativity
Emotion and learning
Physiology of emotion
Affect recognition by machines (incl. wearable systems)
Measuring frustration/stress for usability feedback
Responding to user emotion to reduce user frustration
Inducing emotion
Robots/agents that "have" emotion
Emotion and behavior
Expression of emotion by machines/agents/synthetic characters
Philosophical, social, ethical implications of affective computing