
Emotions & Artificial Intelligence PDF

Document Details

Uploaded by BrightestBoston2440

Tags

emotions, artificial intelligence, affective computing, facial expressions

Summary

This presentation explores the intersection of emotions and artificial intelligence, covering topics such as facial expression analysis, microexpressions, and the use of affective computing in machines. The work discussed includes research on the detection of emotions through facial expressions and tone of voice.

Full Transcript


Emotions & Artificial Intelligence

I. Expression of emotions
II. Reading emotions
   A. Reading the Mind in the Eyes Test (RMET)
   B. Real vs. fake smiles
   C. Emotions conveyed by tone of voice
   D. Identifying microexpressions
III. Affective computing

Expression of Emotions

- 90% or more of emotional communication occurs through nonverbal channels, e.g., facial expression, gesture, and voice.
- Some researchers, such as Raymond Dolan, have pointed out that emotions, unlike thoughts, are embodied.
- Ekman has identified seven basic emotions: surprise, fear, disgust, anger, happiness, sadness, and contempt.
  - Each is expressed by a unique set of muscle contractions in the face.
  - These seven emotions are universal, as shown by Ekman's study of cross-cultural similarities in emotional expression.
- Children who are born blind, or both blind and deaf, manifest emotions in the same basic way as sighted children, so expressions do not have to be learned through observation.
- However, there are cross-cultural differences in display rules, that is, when and where it is considered appropriate to display particular emotional expressions (e.g., when one uses certain gestures).
  - The eyebrow flash: the Japanese, who are more reserved in their social expressions, use it mainly when greeting young children, whereas Samoans greet nearly everyone that way.
- Can you guess an Olympic athlete's nationality? Generally not – unless they are smiling!
  - The Australian smile is more friendly.
  - The American smile is more dominant.
  - The British smile is more polite (think Prince Charles).
- Differences have also been found between the emotional expressions of Japanese Americans and Japanese nationals; the two groups were indistinguishable when displaying neutral facial expressions (Abigail Marsh).

Reading Emotions

- Our brains are, in general, amazing emotion detectors.
- After a quick glance at someone's photo, we have a pretty good sense of the person's extraversion and agreeableness.
- Watching 90 seconds of people walking and talking allows us to accurately predict how others evaluate them.

Reading the Mind in the Eyes Test (RMET)

- http://socialintelligence.labinthewild.org/mite/
- Women tend to outperform men on this test.

Which Smiles Are Real?

- Hint: look at the eyes.

Identifying Microexpressions

- Ekman was a student of Silvan Tomkins, a Princeton professor whose ability to read people's emotions was legendary.
- Ekman identified 46 distinct muscular movements (action units) in the face, catalogued in the Facial Action Coding System (FACS).
- In addition, emotion is conveyed by tone of voice.
  - Physicians' malpractice study by Nalini Ambady: research participants were able to predict, very accurately, which doctors would later get sued based on the pitch, intonation, and rhythm of their speech.

Emotions & Artificial Intelligence: Affective Computing

- Affective computing: computing that relates to, arises from, or deliberately influences emotions.
- Goals:
  - To create computers and robots with the ability to recognize emotions in people.
    - Example: a computer could tell when you are tired or frustrated from your facial expression and recommend a rest break.
  - To imbue machines with the ability to express emotions.
- Cynthia Breazeal's Kismet project was a first step in this direction.
  - Designed to model the interaction between an infant and its caregiver.
  - A cute robotic head capable of sensing others and making a wide range of facial expressions.
  - Processed visual and auditory information to detect affective information, which then guided its own emotional state, e.g., expressing frustration, anger, surprise, or sadness.
  - If it does not receive enough stimulation, it expresses sadness; if it gets too much stimulation, it makes a fearful face; if it receives a moderate amount of stimulation, it expresses joy (see the sketch below).
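A minimal sketch of the stimulation-driven behavior just described, assuming a normalized stimulation level between 0 and 1. The threshold values and the function name choose_expression are illustrative assumptions, not Kismet's actual control architecture.

```python
# Illustrative sketch of Kismet-style stimulation-driven affect.
# The thresholds below are assumed values; Kismet's real architecture
# (drives, emotions, expressive motor acts) is far richer.

LOW_THRESHOLD = 0.3    # below this: under-stimulated
HIGH_THRESHOLD = 0.7   # above this: over-stimulated

def choose_expression(stimulation_level: float) -> str:
    """Map a normalized stimulation level (0.0-1.0) to a facial expression,
    following the three cases described in the notes above."""
    if stimulation_level < LOW_THRESHOLD:
        return "sadness"   # too little stimulation
    if stimulation_level > HIGH_THRESHOLD:
        return "fear"      # too much stimulation
    return "joy"           # a moderate amount of stimulation

if __name__ == "__main__":
    for level in (0.1, 0.5, 0.9):
        print(f"stimulation={level:.1f} -> {choose_expression(level)}")
```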
- The FaceSense program at MIT analyzes facial expressions, head gestures, etc.
  - Combined with top-down predictions of what the expected affect should be in a given situation, it can be used to evaluate customer satisfaction.
  - It has also been used as a training tool for autism spectrum disorders.
- A number of computer programs already exist that are capable of recognizing human emotion from spoken auditory information alone (see the sketch after this list).
  - They use neural networks to process features in speech, such as energy, speaking rate, and fundamental frequency.
  - They can recognize emotions with about 79% accuracy – equivalent to human-level performance.
- As mentioned earlier, programs like these have been used to develop virtual therapists and chatbot therapists.
  - Part of the difficulty of creating such simulations is that people are extremely sensitive to subtle deviations – early roboticists had a tough time creating robots that didn't creep people out!
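As a rough illustration of how such programs work, the sketch below extracts the feature families named above (energy, an onset-count proxy for speaking rate, and fundamental frequency) and trains a small neural network on them. It assumes the librosa and scikit-learn libraries; the file names, feature set, and network size are placeholders, and nothing here is tuned to reach the reported 79% accuracy.

```python
# Sketch: emotion recognition from speech using energy, speaking rate,
# and fundamental frequency features, plus a small neural network.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def extract_features(path: str) -> np.ndarray:
    y, sr = librosa.load(path, sr=16000)
    duration = len(y) / sr

    # Energy: mean and variability of the RMS envelope.
    rms = librosa.feature.rms(y=y)[0]

    # Speaking rate (rough proxy): acoustic onsets per second.
    onsets = librosa.onset.onset_detect(y=y, sr=sr)
    rate = len(onsets) / duration if duration > 0 else 0.0

    # Fundamental frequency: mean and spread of the YIN pitch track.
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)
    f0 = f0[np.isfinite(f0)]

    return np.array([
        rms.mean(), rms.std(),
        rate,
        f0.mean() if f0.size else 0.0,
        f0.std() if f0.size else 0.0,
    ])

# Placeholder training data: paths to labeled audio clips.
train_paths = ["angry_01.wav", "happy_01.wav", "sad_01.wav"]
train_labels = ["anger", "happiness", "sadness"]

X = np.stack([extract_features(p) for p in train_paths])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, train_labels)

print(clf.predict([extract_features("new_clip.wav")]))  # placeholder clip
```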
- FEELIX GROWING (Feel, Interact, eXpress: A Global Approach to Development with Interdisciplinary Grounding)
  - A consortium of universities and robotics companies across Europe developing robots that learn, interact, and respond to humans as children might do.
  - The robots are programmed to learn to adapt to the actions and moods of their human caregivers, and to become attached to a human agent.
  - They are able to express anger, fear, sadness, distress, happiness, excitement, and pride.
  - The research is also exploring nonverbal cues and emotions associated with physical postures, gestures, and body movements.
  - The aim is for robots to provide caregiving and companionship, e.g., in a hospital setting.
- The robot "Nao" will begin replacing tellers at a number of bank branches in Japan.
  - It has cameras in its face to analyze customers' facial expressions and microphones to judge their mood by tone of voice.
  - It can greet customers in 19 different languages and ask which service they need.
- Construction of these types of robots requires a large team of different experts, including psychologists, linguists, and computer programmers.
- Other examples of affective robots: the Paro therapeutic robot seal and Cynthia Breazeal's Leonardo robot.

Video References

Videos excerpted from:
- Lie To Me: Superb Body Analysis by Tim Roth (https://www.youtube.com/watch?v=tf8Iy_XfAIA)
- Cute Baby Seal Robot - PARO Theraputic Robot #DigInfo (https://www.youtube.com/watch?v=oJq5PQZHU-I)
- The Rise of Personal Robots | Cynthia Breazeal | TED Talks (https://www.youtube.com/watch?v=eAnHjuTQF3M)
