Lecture 3 - Perceiving Faces and Bodies
Summary
This document outlines the neural basis of face and body processing. It covers facial recognition, the roles of different brain areas (OFA, FFA, STS, EBA, FBA), early perception of facial features, models of face perception, and the integration of faces and bodies.
Full Transcript
Lecture 3 - Perceiving Faces and Bodies

Processing of facial and bodily information is crucial
- to recognise friends and colleagues
- to understand others' emotions and intentions

Outline
- Perceiving faces (OFA, FFA, STS)
- Perceiving bodies
- Integrating faces & bodies

Perceiving Faces
Faces convey lots of information.

Neural Basis of Face Processing
- A vast neural network, encompassing cortical and subcortical areas
- But some areas towards the back of the head are particularly important

The core network of face processing
- Occipital face area (OFA)
- Fusiform face area (FFA)
- Superior temporal sulcus (STS)

** Discovery of FFA - Nancy Kanwisher **
Schalk et al. (2017)

Occipital vs Fusiform Face Area
- FFA - specialised face processing (identity)
- OFA - early perception of facial features

fMRI and repetition suppression
- Neural activity decreases when identical stimuli are repeated
- But what counts as "identical" might differ between OFA and FFA (see the illustrative sketch below)
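The logic of a repetition-suppression (fMRI adaptation) design can be made concrete with a toy simulation. The sketch below is purely illustrative: the response values, the region labels used as variables, and the "same image" vs. "same identity" rules are assumptions for demonstration, not data from the studies discussed here. The point is only that a region's response to the second stimulus of a pair drops when the pair counts as a repetition by that region's own criterion, so an OFA-like region adapts only to the exact same image, whereas an FFA-like region adapts to the same identity even across different images.

```python
# Purely illustrative toy model of fMRI repetition suppression (adaptation).
# Region names, response values, and the "same image" vs. "same identity"
# rules below are assumptions for demonstration, not data from the lecture.

def second_response(region, first, second, base=1.0, suppression=0.4):
    """Response of a region to the second stimulus of a pair.

    Stimuli are (identity, image) tuples. The response is reduced only if
    the pair counts as a repetition by that region's own criterion.
    """
    if region == "OFA":          # assumed: adapts only to the exact same image
        repeated = first == second
    elif region == "FFA":        # assumed: adapts to the same identity, even across images
        repeated = first[0] == second[0]
    else:
        raise ValueError(f"unknown region: {region}")
    return base - suppression if repeated else base


pairs = {
    "same image":               (("Anna", "photo_1"), ("Anna", "photo_1")),
    "same identity, new image": (("Anna", "photo_1"), ("Anna", "photo_2")),
    "different identity":       (("Anna", "photo_1"), ("Ben", "photo_3")),
}

for label, (s1, s2) in pairs.items():
    print(f"{label:26s} OFA: {second_response('OFA', s1, s2):.1f}   "
          f"FFA: {second_response('FFA', s1, s2):.1f}")

# Expected pattern: OFA is suppressed only for the exact image repeat,
# whereas FFA is suppressed whenever the identity repeats.
```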
Models of Face Perception
- The study of face processing has a long tradition in psychology
- Early models of face processing were cognitive "boxes and arrows" models
- More recently, neurally inspired models have been developed, mainly thanks to brain imaging
- However, evidence from other sources does not always point in the same direction

Bruce & Young (1986) - a cognitive model of face perception
- Structural encoding: detect individual features (e.g. eyes, nose) in a certain configuration (eyes over mouth)
- Invariant features (e.g. identity) and variant features (e.g. emotional expression) are processed by separate modules
- Mutual exchange between face-processing modules and the cognitive system

Haxby et al. (2000) - a neural model of face perception
- Extension of Bruce & Young (1986)
- Variant vs. invariant aspects
- Core vs. extended system

Evidence for Models of Face Perception
The functional and neural dissociation between facial identity and expression is backed by a large amount of evidence from neurological patients, brain imaging, and intracranial recordings of non-human primates.
- Face brain areas in early infants
- Neurological evidence for the dissociation of face identity and expression
- Specific EEG responses to faces (N170)
- Evidence from transcranial magnetic stimulation (TMS)

We may be born with face-selective brain regions
- fMRI shows face-selective regions in adults and in 4- to 6-month-old infants (Deen et al., 2017)
- fMRI shows that 2- to 9-month-old infants have face-, scene-, and body-selective responses in the locations of the adult FFA, PPA, and EBA, respectively (Kosakowski et al., 2022)
- Face brain areas are also activated in congenitally blind people touching faces, suggesting that "visual experience is not necessary for the development of face-selectivity in the lateral fusiform gyrus" (Ratan Murty et al., 2020)

Identity vs expression - neurological evidence

** PROSOPAGNOSIA (face blindness) **
"A neurological condition that affects an individual's ability to reliably recognise familiar faces – acquaintances, friends, colleagues, well known people, and even close family members" (https://www.faceblind.org.uk/information)
- (typically) intact emotion recognition (Tranel et al., 1988; Duchaine et al., 2003)
- cause: ventral occipito-temporal lesion

(Acquired) prosopagnosia
- impaired identity recognition
- (often) intact emotion recognition (Tranel et al., 1988; Duchaine et al., 2003)
- cause: ventral occipito-temporal lesion

Capgras' syndrome
- belief that close others have been replaced by doubles / impostors
- intact identity recognition
- anomalous emotional response (GSR) to faces
- cause: unknown (possibly prefrontal lesions), but often comorbid with schizophrenia

ERPs
- Event-related potentials (ERPs) are obtained by averaging the EEG over multiple trials (e.g. aligned to stimulus onset), as sketched below
- Note: negativity is sometimes plotted upwards (for historical reasons)
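Because the averaging step is easy to misread from a verbal description, here is a minimal sketch of the computation using synthetic data. The sampling rate, epoch window, baseline correction, and the injected "N170-like" deflection are all illustrative assumptions, not parameters from the lecture: the continuous recording is cut into epochs time-locked to each stimulus onset, and averaging the epochs cancels activity that is not locked to the stimulus while the stimulus-locked response remains.

```python
import numpy as np

# Minimal sketch of how an ERP is computed from continuous EEG.
# The data are synthetic and all parameters (sampling rate, epoch window,
# number of trials) are illustrative assumptions.

fs = 250                      # sampling rate in Hz
n_samples = fs * 120          # 2 minutes of continuous "EEG" (one channel)
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 10.0, n_samples)          # background activity (noise)

# Stimulus onsets (in samples), e.g. one face picture every ~1.5 s
onsets = np.arange(fs, n_samples - fs, int(1.5 * fs))

# Add a small stimulus-locked deflection (a stand-in for a component like the N170)
latency = int(0.170 * fs)     # ~170 ms after onset
for onset in onsets:
    eeg[onset + latency : onset + latency + 10] -= 5.0

# Epoch: cut out a window from -100 ms to +500 ms around each onset
pre, post = int(0.1 * fs), int(0.5 * fs)
epochs = np.stack([eeg[o - pre : o + post] for o in onsets])

# Baseline-correct each epoch using the pre-stimulus interval, then average
epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)
erp = epochs.mean(axis=0)     # noise averages out, the time-locked response remains

times_ms = (np.arange(-pre, post) / fs) * 1000
print("most negative deflection at ~%.0f ms" % times_ms[erp.argmin()])
```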
The N170
- Brain wave with negative polarity at ~170 ms
- Larger for faces than for other objects (houses, cars)
- Over (right) occipito-temporal areas
- Originates in OFA and/or FFA
- Thought to reflect the structural encoding of faces (perceptual aspects of face processing)
- Face inversion effect: the N170 is LARGER for upside-down faces

The N170 in prosopagnosia
- N170 effect absent in acquired prosopagnosia and after lesions of right OFA + FFA
- N170 inversion effect: absent in acquired prosopagnosia, anomalous in developmental prosopagnosia

TMS Evidence
- Short but strong magnetic pulses delivered over the scalp
- Causes an electric discharge in the underlying brain area, which interferes with neural communication and temporarily inhibits/impairs activity in the targeted area ("virtual lesion")
- Advantages: provides direct evidence about the role of a brain area in a cognitive function (while fMRI only shows correlational evidence)
- Disadvantages: mostly limited to the cortex and cannot reach deep brain structures; can be unpleasant; activity might spread (focality questioned)

TMS over right OFA and pSTS (localised with fMRI)
- Task: indicate whether the 2nd face has the same identity/emotion
- Inhibition of OFA disrupts expression matching only in an early time window (60-100 ms)
- Inhibition of pSTS disrupts expression matching over a longer period (60-140 ms)
- Inhibition of pSTS impairs expression but not identity matching
- OFA may feed into emotion recognition by processing low-level features (early facial processing), while later processing occurs in the STS (where the expression is identified)
- The FFA was not targeted here, because it cannot be reached with TMS

A word of caution
- The separation between recognition of face identity and expression is not so clear-cut
- Most prosopagnosics are also (although less severely) impaired in emotion recognition
- Maybe identity recognition is just more demanding?
- The facial identity and expression routes might separate at a later stage, i.e. after a common representational system coding for both; the separation of identity and expression is relative rather than absolute
- Importance of subcortical structures like the amygdala for processing emotional facial expressions

Emotion x Sex interaction in face processing
- When asked to imagine an angry face, people tend to think of a man; for a happy face, a woman
- Faster and more accurate categorisation of anger in males and happiness in females, and of maleness in angry faces and femaleness in happy faces
- These effects partly extend to voice processing (Korb et al., 2023, Emotion)

Facial Expression Recognition - other important players
- Amygdala
- Somatosensory cortices

Subcortical processing of emotional expressions
- Amygdala responses to fearful faces with a 74-ms onset (Méndez-Bértolo et al., 2016)
- Intact facial responses to emotional faces (and bodies) in patients with unilateral destruction of visual cortex (Tamietto et al., 2009)

Somatosensory cortices and emotion recognition
- 108 subjects with focal brain lesions; 3 different tasks assessed the recognition and naming of emotions from facial expressions
- Found that recognising emotions from visually presented facial expressions requires right somatosensory-related cortices
- Consistent with the idea that we recognise another individual's emotional state by internally generating somatosensory representations that simulate how the other individual would feel when displaying a certain facial expression

Role of somatosensory cortices in facial emotion perception - the evidence
- Facial mimicry
- Facial feedback hypothesis
- Blocking/interfering with facial mimicry affects emotion recognition
- TMS: inhibition of somatosensory cortex reduces facial mimicry and affects emotion recognition

Mimicry
- People have the tendency to imitate perceived emotional facial expressions
- Even when they are not aware of them

Facial feedback hypothesis
- Facial expressions not only communicate our emotions, but can also influence them
- The activation of facial muscles sends (feedback) signals to the brain
- This feedback can influence how we perceive others' emotions

Blocking/interfering with facial mimicry
- slows down the recognition of facial expressions (Baumeister et al., 2016; Stel & van Knippenberg, 2008)
- interferes with the recognition of happiness (Oberman, Winkielman, & Ramachandran, 2007)
- impairs the distinction between true and false smiles (Maringer, Krumhuber, Fischer, & Niedenthal, 2011; Rychlowska et al., 2014)
- delays the perception of a change between happy and sad facial expressions (Niedenthal, Brauer, Halberstadt, & Innes-Ker, 2001)
- decreases responses to angry faces in the amygdala (Hennenlotter et al., 2009)

Summary face perception
- Core areas crucial for face perception are at the back of the head:
  Occipital face area (OFA) - facial features
  Fusiform face area (FFA) - invariant features (identity)
  Superior temporal sulcus (STS) - variant features (emotion, gaze)
- We are either born with these in place, or they develop early in infancy
- Lesion or temporary inhibition of these areas results in specific impairments of emotion or identity recognition
- A larger network is also involved:
  Subcortical areas (amygdala) - emotional responses, also when unaware of the face
  (Right) somatosensory cortex - contributes to emotion recognition
  Parietal (attention), anterior temporal (memory), cingulate (emotion), and prefrontal (emotion and decision making) areas
- The separation of variant (emotion) and invariant (identity) face features might not be so clear-cut
- Some unknowns remain about what aspects of faces are processed when and where in the brain

Perceiving Bodies
Bodies also convey loads of nonverbal information, which we use every day, e.g. to
- recognise people's identity
- recognise people's emotions, mood, and intentions
- but also sex, age, attractiveness, health, etc.
We are all fascinated by the human body shape.

Extrastriate body area (EBA)
- Responses were high to human body parts and whole human bodies, whether presented as photographs, line drawings, stick figures, or silhouettes
- Low response to whole faces
- Responses were significantly lower to object parts and whole articulated objects, whether presented as photographs or line drawings, as well as to scrambled control versions of stick figures and silhouettes
- EBA activations (from several fMRI studies)

The N190
- ERP component that peaks at the same location as the N170 but 20 ms later
- Is greater for bodies than for faces or other objects
- Generalises to abstract depictions of the body (stick figures, human silhouettes)
- Is increased for inverted bodies
- Originates from the EBA (Ishizu et al., 2010; Meeren et al., 2013)

Intracranial recordings of EBA
- An implanted electrode over the EBA shows a greater response to images of bodies than to faces, objects, or animals
- Body-selective responses start at 190 ms and peak at 260 ms post-stimulus onset (similar to the N190)

Inhibition of EBA with TMS

Fusiform body area (FBA)
- Responds selectively to whole bodies and body parts, and to schematic depictions of the body
- Creates a more holistic body representation (than the EBA)

Activation of pSTS for human point-light displays (PLDs)
- Posterior STS: biologically plausible > scrambled moving human PLDs
- Area MT (visual cortex): non-biological > static movement
- TMS over pSTS disrupts the perception of biological motion

Summary body perception
- EBA responds to body parts and whole bodies
- FBA is more selective for holistic body representations
- STS processes body movement
- Similarly, the FFA is for face identity recognition, and the STS for emotional expression and other changeable aspects of faces

2 systems for emotional body language (EBL)
- An automated reflex-like circuit that predominantly resides in subcortical structures: rapid automatic perception of EBL and preparation of adaptive reflexes (fear behaviour, autonomic responses)
- A cortically controlled circuit in the service of recognition and deliberation: perceives EBL in detail, computes the behavioural consequences of an emotion, and decides on a course of action in response to the stimulus; accurate but slow

Summary face and body perception
- Core network for face processing: OFA (features), FFA (invariable aspects: identity), STS (changeable aspects: emotion / gaze)
- Core network for body processing: EBA (body parts), FBA (holistic), STS (body movements, partly emotion)

Fast integration of emotion in faces and bodies
- Facial emotion perception is influenced by body emotion
- An incongruent face-body combination causes lower accuracy and slower RTs in a facial emotion recognition task, and an enhanced occipital P1 component

Face and body integration
Hierarchical integration:
1) faces and bodies are initially processed separately in posterior brain regions
2) gradually, faces and bodies are integrated in more anterior brain regions

#PS495