Prosodic influence in face emotion perception: evidence from functional near-infrared spectroscopy

Abstract
Emotion is communicated via the integration of concurrently presented information from multiple channels, such as voice, face, gesture and touch. This study investigated the neural and perceptual correlates of emotion perception as influenced by facial and vocal information by measuring changes in oxygenated hemoglobin (HbO) with functional near-infrared spectroscopy (fNIRS) and acquiring psychometrics. HbO activity was recorded from 103 channels while participants ([Formula: see text], [Formula: see text]) were presented with vocalizations produced in a happy, angry or neutral prosody. Voices were presented alone or paired with an emotional face and compared with a face-only condition. Behavioral results indicated that when voices were paired with faces, responses were biased toward the emotion of the voice. Responses in the bimodal conditions also showed greater variance and longer reaction times than in the face-only condition. While both the happy and angry prosody conditions exhibited right-lateralized increases in HbO relative to the neutral condition, these activations were segregated by emotion into posterior-anterior subdivisions. Specific emotional prosodies may therefore differentially influence emotion perception, with happy voices eliciting posterior activity in receptive emotion areas and angry voices eliciting activity in anterior expressive emotion areas.
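The HbO changes reported above are not measured directly: fNIRS records optical-density changes at two (or more) near-infrared wavelengths, which are converted to HbO and HbR concentration changes via the modified Beer-Lambert law. The sketch below illustrates that conversion only; the wavelengths, extinction coefficients, source-detector distance, and pathlength factors are illustrative textbook-style assumptions, not values from this study, and the study's actual processing pipeline is not described in the abstract.

```python
import numpy as np

# Illustrative extinction coefficients [1/(mM*cm)] for [HbO, HbR]
# at two assumed wavelengths (values are placeholders, not the paper's).
ext = np.array([
    [0.69, 2.53],   # ~760 nm: HbR absorbs more strongly
    [2.53, 1.80],   # ~850 nm: HbO absorbs more strongly
])
d = 3.0                      # source-detector separation in cm (assumed)
dpf = np.array([6.0, 6.0])   # differential pathlength factors (assumed)

def mbll(delta_od):
    """Convert optical-density changes at the two wavelengths into
    concentration changes [mM] of (HbO, HbR) via the modified
    Beer-Lambert law: dOD = ext @ dC * (d * DPF)."""
    # Effective pathlength-weighted extinction matrix, shape (2, 2).
    A = ext * (d * dpf)[:, None]
    # Solve the 2x2 linear system for the two chromophore changes.
    return np.linalg.solve(A, delta_od)

d_hbo, d_hbr = mbll(np.array([0.01, 0.02]))
```

With more than two wavelengths the same system is typically solved in a least-squares sense rather than exactly; per-channel HbO time courses like those analyzed in the study come from applying this conversion sample by sample.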

PMID: 32873844 [PubMed – in process]

Sci Rep. 2020 Sep 01;10(1):14345

Authors: Becker KM, Rojas DC
