
Please read these before the next class meeting. There is no written assignment this week, in order to give you more time to focus on your class project.

READ:
Riskind, "Stoop to Conquer" (1984). (emailed)
Niedenthal, P. M., et al., "Embodying Emotion."
Ahn, H.I., Teeters, A., Wang, A., Breazeal, C., and Picard, R.W., "Stoop to Conquer: Posture and affect interact to influence computer users' persistence," The 2nd International Conference on Affective Computing and Intelligent Interaction, September 12-14, 2007, Lisbon, Portugal.

Due 10am, Sunday, April 27, 2008

WATCH: Please choose to watch either one of these films:
a. 2001: A Space Odyssey (148 mins.) (Roz has a VHS copy and Jackie has a DVD copy; let us know if you'd like to borrow one.) Instead of watching the whole movie, you may read the chapter "Does HAL Cry Digital Tears? Emotion and Computers" from the book HAL's Legacy: http://www-mitpress.mit.edu/e-books/Hal/chap13/thirteen1.html
b. Artificial Intelligence (145 mins.) If you wish, you can stop this one after 45-60 minutes. (Contact Jackie to borrow it.)

READ:
[Journal] "Computers that recognise and respond to user emotion: theoretical and practical implications," R. W. Picard and J. Klein, Interacting with Computers, 14 (2002), 141-169.
[Short news articles]
a. The "Little Emotional Controller" story
b. Desktop computers to counsel users to make better decisions
[Paper] "Lie Detection: Recovery of the Periorbital Signal through Tandem Tracking and Noise Suppression in Thermal Facial Video," P. Tsiamyrtzis, J. Dowdall, D. Shastri, I. Pavlidis, M. G. Frank, and P. Ekman, SPIE Conference (2005).
[Book Chapters]
Picard, Affective Computing, Chapter 4: Potential Concerns, pp. 113-137.
Dumit, Joseph, Picturing Personhood: Brain Scans and Biomedical Identity (Chapter 4: Ways of seeing brains as expert images), Princeton University Press, 2004, pp. 109-133.
Hacking, Ian, The Social Construction of What? (Chapter 4: Madness: Biological or constructed?), Harvard University Press: Cambridge, MA, 1999, pp. 100-124.

WRITE: Plan to bring to class on Monday a paragraph describing your class project progress.

1. The imaginary stories in 2001: A Space Odyssey, AI, and the two short articles are not entirely positive about future emotional technologies. Choose one scenario (from the movies or the two short articles) and consider the negative outcome that is most likely to happen in the near future. Are you concerned about this happening? Describe.
2. In Affective Computing (Ch. 4) and [Picard and Klein, 2002], affective computers play many different roles in engaging people in everyday life, and may serve one individual or a community. Please illustrate one scenario where affective computers are important for a group of people, not just for an individual.
3. In the Lie Detection paper and in Dumit's book (Ch. 4), colorful images (e.g., thermal images, CT scans) of people's faces and brains can suggest that a person is lying or may have a mental disorder. How do these claims (produced by experts and technologies) influence society? Who benefits from these technologies? Who gets hurt?
4. In his book (Ch. 4), the philosopher Ian Hacking discusses the concept of interactive kinds: classifications that interact with, and can change, the people being classified. Are emotions (e.g., bored, irritated, arrogant, annoyed) interactive kinds? When we design emotional technologies, how can we deal with the problem of labels that become inaccurate as contexts change?

Due 10am, Friday, April 11, 2008

READ: (all readings were emailed to you)
Frankel
Slack
Robinson
Bickmore

WRITE:
1. The results of Slack's survey of patients using computer-based medical interviewing are very interesting. Considering that the study was done in 1968, do you think that the overwhelmingly positive results were due to the novelty of interacting with a computer? Or do you think that similar results would be achieved today?
More importantly, how do you think user evaluations would change after repeated or long-term interactions?
2. There have been a number of studies since the Robinson and West paper showing superior performance of computer-based interviews over physician interviews in eliciting sensitive information from patients, including drug use, sexual behavior, and violence. Robinson and West offer some excellent criticisms of their own work, but still raise the hypothesis that patients may report more to computers than to physicians because of less embarrassment. Do you think their work provides reasonable evidence of this? Which weaknesses of their study do you think are most critical? Now take the opposite point of view and assume that it is true that patients feel less embarrassed with, and less evaluated by, computers. If a computer system were designed to behave more like a human (to use text-to-speech, an anthropomorphic animation, affective facial expressions, etc.), do you think the differences in patient reporting between computer and physician would decrease?
3. The paper by Bickmore, Gruber, and Picard showed increases in the bond dimension of the Working Alliance Inventory and a greater desire to continue working with the relational agent than with the non-relational agent. Unfortunately, the differences in physical activity levels between the groups only approached significance (p=0.06) for the agent vs. non-agent comparison (the relational agent was no different from the non-relational agent with respect to exercise measures during the study). Do you think this dismisses the relational agent as a useful tool for behavior change? Or do you think the limitations could be overcome? Can you think of any techniques to boost the motivational value of the relational agent?
4. Which strategy do you think is best: (1) design highly efficient computer systems for healthcare, so that doctors will have more time and energy to be empathetic, or (2) design empathetic computer systems for healthcare that can augment the empathy delivered by physicians?

There are two parts to this week's assignment:

Due 10am, Monday, April 7
Please hand Roz a paragraph in class describing what you have accomplished on your project so far.

Due 5pm, Friday, April 4, 2008
Please send your response TO: mas630-staff@media.mit.edu with Subject: mas630-homework.

READ:
Forgas and Moylan, After the Movies
Lerner, Small, and Loewenstein, Heart Strings and Purse Strings
Isen et al. (1987) on Affect and Creativity
Isen et al. (1991) on Affect and Clinical Decision Making
Larson and Picard, The Aesthetics of Reading
Picard and Daily, Alternatives to Asking What Users Feel

WRITE:
1. Jill, the please-her-boss pollster, has been given ten questions on which to collect people's opinions. The questions relate to the overall satisfaction people perceive with her party's politicians and their impact both locally and nationwide. She is not allowed to modify the questions, but she is willing to modify how the poll is conducted in subtle ways to make her party's political candidates look as good as possible. She plans to poll 1000 people nationally by phone and 1000 locally, in person, by some "random" process. Describe three ways Jill might bias the opinions she collects by manipulating affect-influencing factors. Be clear how you think each of Jill's three manipulations would affect respondents' opinions.
2. The work with Larson was inspired by twisting around Isen's idea in order to find a measure that could be influenced by very subtle affective feelings. Pay careful attention to all the factors that might have influenced each participant's feelings when reading this paper, and see if you can find some that were not fully controlled.
3. Have the readings this week changed the way you will (critically) read future psychological studies? Describe a way some other work you've seen or read might have had a different outcome if the authors had carefully controlled for emotion-related variables up front.
Due 10:00am, Friday, March 28, 2008

READ:
Mauss, I. B., Cook, C. L., & Gross, J. J. (2007). Automatic emotion regulation during an anger provocation. Journal of Experimental Social Psychology, 43, 698-711.
Tamir, M., Mitchell, C., & Gross, J. J. (in press). Hedonic and instrumental motives in anger regulation. Psychological Science.
Butler, E. A., Wilhelm, F. H., & Gross, J. J. (2006). Respiratory sinus arrhythmia, emotion, and emotion regulation during social interaction. Psychophysiology, 43, 612-622.
McRae, K., Ochsner, K. N., Mauss, I. B., Gabrieli, J. D. E., & Gross, J. J. (in press). Gender differences in emotion regulation: An fMRI study of cognitive reappraisal. Group Processes and Intergroup Relations.
Allen, J. J. B., Harmon-Jones, E., & Cavender, J. H. (2001). Manipulation of frontal EEG asymmetry through biofeedback alters self-reported emotional responses and facial EMG. Psychophysiology, 38, 685-693.

Supplemental:
Lieberman, M. D., Eisenberger, N. I., Crockett, M. J., Tom, S. M., Pfeifer, J. H., & Way, B. M. (2007). Putting feelings into words: Affect labeling disrupts amygdala activity in response to affective stimuli. Psychological Science, 18, 421-428.

Note: Don't stress too much about the fMRI data-analysis sections in McRae et al. and Lieberman et al.

WRITE:
1. If we can regulate our emotions automatically, we can avoid the effortful (and perhaps costly) process of intentional emotion regulation. This is the contention of Mauss et al., and it is shared by many researchers in the emotion regulation field. Do you think computers could help guide us to regulate our emotions automatically? Do you think they should? Consider the efficacy of this approach as well as its ethical implications (i.e., do we want computers to purposefully manipulate our emotions without our knowledge, even if this might be helpful?).
2. Tamir et al. argue that anger, while unpleasant, might be purposefully sought to achieve certain goals. Can you think of another unpleasant emotion you might willingly summon? What techniques would you use to conjure this emotion? Tamir et al. used music and emotional recall. Can you think of some other emotion regulation tricks that could be suited for this purpose?
3. Many of these papers discuss individual differences in emotion regulation. Do you think technology could tailor itself to these individual differences in order to respond more adaptively? How? Please give one example.
4. Check out http://www.wilddivine.com/products/stress-relief-game/# and click on “view demo.” The demo is thick with syrupy new-agey vocab, but please try to evaluate the product objectively. We will discuss the limitations of this (and other) biofeedback programs in class. For now, however, please think about how games like this could be useful. Consider what you’ve learned in the readings to answer this question. Replace their rhetoric with your own informed insight into emotion regulation (that is, don't just say "it can help you unfold your full potential and glimpse the field of infinite possibilities!").
Due 10:00am, Monday, March 17, 2008

READ:
Russell, J. A. (1994). Is there universal recognition of emotion from facial expression? A review of the cross-cultural studies. Psychological Bulletin, 115(1), 102-141.
Russell, J. A. (2003). Core affect and the psychological construction of emotion. Psychological Review, 110(1), 145-172.
Russell, J. A., & Fernández-Dols, J. M. (1997). The Psychology of Facial Expression, Chapter 1. Cambridge University Press.
Carroll, J. M., & Russell, J. A. (1996). Do facial expressions signal specific emotions? Judging emotion from the face in context. Journal of Personality and Social Psychology, 70(2), 205-218.
Carroll, J. M., & Russell, J. A. (1997). Facial expressions in Hollywood's portrayal of emotion. Journal of Personality and Social Psychology, 72(1), 164-176.

WRITE: There is no advance writing assignment related to the readings this week. However, your project proposals are due in class (see below for what is needed in them). Please also think about these two more-complex-than-they-sound questions Jim Russell has asked us to contemplate while reading this week: "Do faces express emotions?" and "How can we understand emotions?"

Project Proposals: Please submit a page or two describing & explaining:
1. What are you proposing to build/test/investigate?
Due 10:00 a.m., Friday, March 7, 2008

READ:
Baron-Cohen et al., "The 'Reading the Mind in the Eyes' Test Revised Version: A Study with Normal Adults, and Adults with Asperger Syndrome or High-Functioning Autism" (and take the test at the link below).
Wagner et al., "From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification"
Pantic and Rothkrantz, "Toward an Affect-Sensitive Multimodal Human-Computer Interaction"

Supplemental:

WRITE:
2. Take the eyes test. Did you find the test difficult? If you did well, what features were you paying attention to that you felt helped you make a decision about the emotional state from the eyes? If you did poorly, explain, upon seeing the right answer, what feature(s) of the eyes you missed that you think could have improved your performance.
3. Most machine learning technologies/algorithms depend heavily on the features you choose to extract from a given set of raw data. Think of an affect recognition system using any combination of sensors you want (vision, SC, EMG, EKG, ...), and describe what states you would want it to recognize and what features you expect to be important for recognizing those states.
4. The number of sensors used in a given application can be very small or very large, depending on what you are looking for. Rana el Kaliouby used just one sensing modality, while Wagner et al. used several. Think of and describe a situation where a large number of sensing modalities might be needed, and a situation where only one is needed. Feel free to check out some of the projects from the Affective Computing group for ideas (it is OK to report something somebody else has done - just put it in your own words, and feel free to raise questions and critique).

Due 10 a.m., Friday, February 29, 2008

READ:
Bickmore, T., & Schulman, D. (2007). Practical approaches to comforting users with relational agents. In CHI '07 Extended Abstracts on Human Factors in Computing Systems (pp. 2291-2296). San Jose, CA, USA: ACM. doi: 10.1145/1240866.1240996.
Klein, J., Moon, Y., & Picard, R. W. (1999). This computer responds to user frustration. In CHI '99 Extended Abstracts on Human Factors in Computing Systems (pp. 242-243). Pittsburgh, Pennsylvania: ACM. doi: 10.1145/632716.632866.
Nass, C., Jonsson, I., Harris, H., Reaves, B., Endo, J., Brave, S., et al. (2005). Improving automotive safety by pairing driver emotion and car voice emotion. In CHI '05 Extended Abstracts on Human Factors in Computing Systems (pp. 1973-1976). Portland, OR, USA: ACM.
Picard, R. W., & Liu, K. K. (2007). Relative subjective count and assessment of interruptive technologies applied to mobile monitoring of stress. International Journal of Human-Computer Studies, 65(4), 361-375. doi: 10.1016/j.ijhcs.2006.11.019.

Supplemental Readings:
Brave, S., Nass, C., & Hutchinson, K. (2005). Computers that care: Investigating the effects of orientation of emotion exhibited by an embodied computer agent. International Journal of Human-Computer Studies, 62(2), 161-178. doi: 10.1016/j.ijhcs.2004.11.002.

WRITE:
1. Consider this criticism: "empathetic technology" cannot succeed because technology cannot feel what people feel. Can you give an
example where one person cannot feel what another person feels, and yet their empathy succeeds? What do you think are the limits of empathetic technology, given what you know about technology on the near horizon? Will it be able to help more than it has been shown to do in these readings, where it has been limited largely to scripted responses from machines?
2. The work in Klein et al. used empathy only once, and it appeared to help reduce frustration. Do you think technology's use of empathy (without being able to actually fix the user's problem) could succeed over long-term use? Do you think it would be necessary to build something else into the technology for this approach to succeed repeatedly, over time? If you think it would fail over continued use, be clear why. Support your argument, considering this week's readings as well as any other sources you'd like.
3. These two approaches are sometimes contrasted when advising a parent how to help a child who is frustrated: (a) identify the child's problem and offer a fix for it, vs. (b) empathize with the child, so he or she can get past the bad feelings and find his or her own fix. (1) Comment on how you think this might work when delivered by technology to a child who is using it for educational instruction (e.g., a computerized learning tutor). (2) Change "parent" to "technology provider" and "child" to "customer," and consider the customer to be an adult user of some new technology. Does your answer to the previous question change in this case? In both parts of this question, be clear whether you recommend favoring mostly approach (a), approach (b), or a mix of both.
4. How important do you think it is to allow for "repair" when technology shows empathy? What do you think technology should do if a person responds adversely to its attempt at empathy?
5. Cheery drivers responded best to an energetic voice, and upset ones to a subdued voice. Listen for instances this week where people change their voice to deal effectively with the emotions of another person, and share an example of this with the class. (If you can't find an effective example, you can give an example where the interaction was ineffective and tell us why you think it failed.) Please do not disclose identifying information.
Due 10 a.m., Friday, February 22, 2008

READ:
Davis, M. H. (2006). Empathy. In J. E. Stets and J. H. Turner (Eds.), Handbook of the Sociology of Emotions (Handbooks of Sociology and Social Research). (emailed to you)
Marci, C. D., Ham, J., Moran, E., and Orr, S. P. (2007). Physiologic correlates of perceived therapist empathy and social-emotional process during psychotherapy. The Journal of Nervous and Mental Disease, 195(2), 103-111.
Marci, C. D., Moran, E. K., and Orr, S. P. (2004). Physiologic evidence for the interpersonal role of laughter during psychotherapy. The Journal of Nervous and Mental Disease, 192(10), 689-695.
Bargh, J., and Chartrand, T. L. (1999). The unbearable automaticity of being. American Psychologist, 54(7), 462-479.

WRITE:
2. Take the Empathy Quotient. What are your impressions of the test in light of the different approaches to understanding empathy? How would you design a test that does not rely on self-report to see whether someone's ability to empathize has changed as a result of some intervention? For example, how might you incorporate physiological measures or other affective technologies into this test?
3. If you were able to control an unconscious tendency to mimic the emotions of others, how might that affect your interactions?

Due 10 a.m., Friday, February 15, 2008

WRITE:
2. Argue for or against this statement: "Emotions are just special kinds of thoughts."
3. Pick a least favorite and a most favorite application from Chapter 3 and critique both of them (pros and cons) based on your own personal and unique research perspective. I wrote these over ten years ago, and while some things have not changed much, I am interested in what you think is most interesting, most likely to succeed or fail, and why.
