Lab members: research contributions and interests


Postdoctoral fellows / Graduate students / Research personnel

Postdoctoral fellow


Simon Rigoulot, PhD

Postdoctoral Fellow

I studied Cognitive Science and Neuroscience in France and completed my PhD in December 2008 at the University of Lille 2. My thesis, “Electrophysiological and behavioral impact of emotional information in peripheral vision”, explored emotional processing in peripheral vision, i.e., at far eccentric points of the visual field. I then held a teaching and research associate position at the University of Lille 1. I am interested in the cortical and autonomic concomitants of the cognitive processes underlying the processing of emotional information. During my thesis, I explored the processing of faces with dynamic emotional expressions in central and peripheral vision. I also developed a project using audiovisual stimulation, pairing emotional prosody with these dynamic faces, to investigate how emotion modulates attentional resources. Few studies in this area have used multimodal stimulation, even though it is more ecological and more typical of human communication than unimodal stimulation.

List of publications

Rossignol, M., Philippot, P., Bissot, C., Rigoulot, S., & Campanella, S. (In review). Electrophysiological correlates of enhanced perceptual process and attentional capture by emotional faces in social anxiety. Brain Research.

Rigoulot, S., & Pell, M.D. (In revision). Seeing emotion with your ears: Emotional prosody implicitly guides visual attention to faces. PLOS ONE.

Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and plays a critical role in how humans respond to related visual cues in the environment, such as facial expressions.

Rigoulot, S., & Pell, M.D. (Submitted). Emotion in the voice influences the way we scan emotional faces. Journal of Nonverbal Behavior.

During interpersonal communication, the emotional expression of a face is often accompanied by speech conveying vocal emotional cues (or emotional ‘prosody’). Recent data show that emotional prosody systematically influences how listeners gaze at related versus unrelated facial expressions (Paulmann, Titone and Pell, 2011; Rigoulot and Pell, 2011), although it is unclear whether emotional prosody influences how we scan a single face with an emotional expression. Here, we analyzed the eye fixations of 21 participants who were presented with faces expressing fear, sadness, disgust, or happiness while listening to an emotionally-inflected pseudo-utterance (e.g., Someone migged the pazing) spoken in a congruent or incongruent tone. Participants judged whether the emotional meaning of the voice and the face was the same (yes/no decision). Four regions of interest known to play a role in emotional face recognition (brows, eyes, mouth, and nose) were analyzed for the location of first fixations, and for the frequency and mean duration of total looks to faces that were congruent or incongruent with the voice. In general, fixations to the eyes were longer than to other face regions for all emotional expressions, and first fixations were usually directed to the eye region, with the exception of disgusted faces (which generated significantly more looks to the mouth than the other emotions). Fixations to fearful and sad faces tended to be shorter on average and more frequent to the upper face/eye region. Of key interest here, eye movements to specific face regions were modulated by the emotional relationship of the voice and face: when the stimuli were congruent, participants looked more frequently at regions that are thought to aid recognition of particular emotional expressions (regions of the upper face for fear and sadness, regions of the lower face for disgust and happiness).
Our data confirm by the analysis of eye movements that different regions of the face are more salient for recognizing certain facial expressions, and show that emotional prosody influences how we extract information from visual cues when making social judgments about human faces.

Rigoulot, S., D’Hondt, F., Defoort-Dhellemmes, S., Despretz, P., Honoré, J., & Sequeira, H. (In preparation). Emotional implicit categorization in peripheral vision: an ERP study.

Pell, M.D., Radlinska, B., Rigoulot, S., Pike, B., & Klepousniotou, E. (In preparation). Voices of emotion in the brain.

Rigoulot, S., D’Hondt, F., Defoort-Dhellemmes, S., Despretz, P., Honoré, J., Sequeira, H. (In press). Perceiving fearful faces in peripheral vision: behavioral and ERP evidence. Neuropsychologia.

Many studies provided evidence that the emotional content of visual stimulations modulates behavioral performance and neuronal activity. Surprisingly, these studies were carried out using stimulations presented in the center of the visual field while the majority of visual events firstly appear in the peripheral visual field. In this study, we assessed the impact of the emotional facial expression of fear when projected in near and far periphery. Sixteen participants were asked to categorize fearful and neutral faces projected at four peripheral visual locations (15° and 30° of eccentricity in right and left sides of the visual field) while reaction times and event-related potentials (ERPs) were recorded. ERPs were analyzed by means of spatio-temporal principal component and baseline-to-peak methods. Behavioral data confirmed the decrease of performance with eccentricity and showed that fearful faces induced shorter reaction times than neutral ones. Electrophysiological data revealed that the spatial position and the emotional content of faces modulated ERPs components. In particular, the amplitude of N170 was enhanced by fearful facial expression. These findings shed light on how visual eccentricity modulates the processing of emotional faces and suggest that, despite impoverished visual conditions, the preferential neural coding of fearful expression of faces still persists in far peripheral vision. The emotional content of faces could therefore contribute to their foveal or attentional capture, like in social interactions.

rigoulot_et_al._2011.pdf

D’Hondt, F., Collignon, O., Dubarry, A.S., Robert, M., Rigoulot, S., Honoré, J., Lepore, F., Lassonde, M., Sequeira, H. (2010). Early brain-body impact of emotional arousal. Frontiers in Human Neuroscience, Vol.4, Article 33.

Current research in affective neuroscience suggests that the emotional content of visual stimuli activates brain–body responses that could be critical to general health and physical disease. The aim of this study was to develop an integrated neurophysiological approach linking central and peripheral markers of nervous activity during the presentation of natural scenes in order to determine the temporal stages of brain processing related to the bodily impact of emotions. More specifically, whole head magnetoencephalogram (MEG) data and skin conductance response (SCR), a reliable autonomic marker of central activation, were recorded in healthy volunteers during the presentation of emotional (unpleasant and pleasant) and neutral pictures selected from the International Affective Picture System (IAPS). Analyses of event-related magnetic fields (ERFs) revealed greater activity at 180 ms in an occipitotemporal component for emotional pictures than for neutral counterparts. More importantly, these early effects of emotional arousal on cerebral activity were significantly correlated with later increases in SCR magnitude. For the first time, a neuromagnetic cortical component linked to a well-documented marker of bodily arousal expression of emotion, namely, the SCR, was identified and located. This finding sheds light on the time course of the brain–body interaction with emotional arousal and provides new insights into the neural bases of complex and reciprocal mind–body links.

dhondt_et_al._2010.pdf

Rigoulot, S., Delplanque, S., Despretz, P., Defoort-Dhellemmes, S., Honoré, J., Sequeira, H. (2008). Peripherally presented emotional scenes: A spatiotemporal analysis of early ERP responses. Brain Topography, 20(4), 216-23.

Recent findings from event-related potentials (ERPs) studies provided strong evidence that centrally presented emotional pictures could be used to assess affective processing. Moreover, several studies showed that emotionally charged stimuli may automatically attract attention even if these are not consciously identified. Indeed, such perceptive conditions can be compared to those typical of the peripheral vision, particularly known to have low spatial resolution capacities. The aim of the present study was to characterize at behavioral and neural levels the impact of emotional visual scenes presented in peripheral vision. Eighteen participants were asked to categorize neutral and unpleasant pictures presented at central (0°) and peripheral eccentricities (-30 and +30°) while event-related potentials (ERPs) were recorded from 63 electrodes. ERPs were analysed by means of spatio-temporal principal component analyses (PCA) in order to evaluate influences of the emotional content on ERP components for each spatial position (central vs. peripheral). Main results highlight that affective modulation of early ERP components exists for both centrally and peripherally presented pictures. These findings suggest that, for far peripheral eccentricities as for central vision, the brain engages specific resources to process emotional information.

rigoulotetal.braintopography2008.pdf

Delplanque, S., Silvert, L., Hot, P., Rigoulot, S., Sequeira, H. (2006). Arousal and valence effects on event-related P3a and P3b during emotional categorization. International Journal of Psychophysiology, 60(3), 315-322.

Due to the adaptive value of emotional situations, categorizing along the valence dimension may be supported by critical brain functions. The present study examined emotion–cognition relationships by focusing on the influence of an emotional categorization task on the cognitive processing induced by an oddball-like paradigm. Event-related potentials (ERPs) were recorded from subjects explicitly asked to categorize along the valence dimension (unpleasant, neutral or pleasant) deviant target pictures embedded in a train of standard stimuli. Late positivities evoked in response to the target pictures were decomposed into a P3a and a P3b and topographical differences were observed according to the valence content of the stimuli. P3a showed enhanced amplitudes at posterior sites in response to unpleasant pictures as compared to both neutral and pleasant pictures. This effect is interpreted as a negativity bias related to attentional processing. The P3b component was sensitive to the arousal value of the stimulation, with higher amplitudes at several posterior sites for both types of emotional pictures. Moreover, unpleasant pictures evoked smaller amplitudes than pleasant ones at fronto-central sites. Thus, the context updating process may be differentially modulated by the affective arousal and valence of the stimulus. The present study supports the assumption that, during an emotional categorization, the emotional content of the stimulus may modulate the reorientation of attention and the subsequent updating process in a specific way.

delplanqueetal.2006.pdf

Graduate students


Karyn Fish

PhD Student

Funding: FQRSC (2008-2011), NSERC (2007-08)

I completed a Bachelor of Science degree in Psychology at McGill University. I have a broad interest in social cognition and language processing. My initial research will investigate how prosody influences the way that listeners understand speaker attitudes such as sincerity.


Pan Liu

PhD Candidate

I am from mainland China. I studied Psychology and earned my Bachelor of Science degree at Northeast Normal University in China; I then studied Social Cognitive Neuroscience at Beijing Normal University, with a focus on emotion and working memory, and earned my Master of Science degree there. My initial research interest is the relationship between working memory and emotional prosody, e.g., the role of working memory in the processing of emotional prosody, which builds on my previous work.


Rachel Schwartz

PhD Student

I received my bachelor's degree in Linguistics from the University of California, Los Angeles, and studied Cognitive Science and Industrial Design as a visiting scholar at the University of California, Berkeley. My research assistantships to date have examined listener perception, visual communication practices, and symbolic gesture. I am interested in exploring the role of prosody in developing shared meaning between conversational partners. My initial research will examine how conversational rhythms affect interpersonal understanding, and the possible effects of prosody on the perception of humor.

List of publications

Kreiman, J., Gerratt, B.R., & Schwartz, R. (2010). Temporal and spectral characteristics of period-doubled phonation. Manuscript in preparation.

Research Personnel


Melissa Stern

Research Assistant/Lab Manager

I am a Master's student in School/Applied Psychology at McGill University. I received my undergraduate degree in Psychology from McGill University in 2010. I am currently working on my Master's thesis, which examines the relationship between emotion regulation and adolescents' engagement in risky behaviors. As lab manager and research assistant in the Pell Lab, I assist in carrying out behavioral studies involving the identification of emotions, perform data analyses, and make sure the lab runs smoothly.
