2020-2021 Undergraduate Awardee: Jiani Li

“Neural Synchrony as a Signature for Similar Emotional Experience”

Jiani Li, University of California Los Angeles, Department of Psychology


Prevalent, automatic and powerful, emotional experience forms an integral part of human life. Despite numerous studies pointing to the impact of emotion in shaping one’s interpretation of situations and guiding action, emotional experience has not been studied extensively due to its idiosyncratic nature. However, advances in neuroimaging techniques and statistical analysis methods have enabled more rigorous investigation of subjective experience; one such approach is neural synchrony. Previous research revealed that people who interpreted narratives in similar ways showed enhanced neural synchrony in regions implicated in mentalizing and emotion processing, including the bilateral temporoparietal junctions (TPJ), bilateral inferior parietal lobules (IPL), and ventromedial prefrontal cortex (vmPFC). Given that similar emotional experience is naturally associated with similar perspective, we sought to examine whether neural synchrony in the aforementioned regions underlies shared emotional experience. One hundred and four participants watched political videos while being scanned with Functional Near-Infrared Spectroscopy (fNIRS) and rated their emotional experience afterwards. Using Intersubject Correlation (ISC) analysis, we found that subjects who reported more similar emotional experience exhibited more synchronized neural fluctuations in vmPFC and bilateral TPJ/IPL. These results suggest that neural synchrony in these regions could be a neural signature for subjective emotional experience.


“It should be possible to present, for every emotion, a corresponding description of how the world looks to the subject.” – Nico Frijda, The Emotions, p. 196

When an emotionally intense play such as Shakespeare’s Hamlet is shown in front of an audience, some may shed gallons of tears while others might be bored by the sheer length of the play. Idiosyncratic, effortless and subjective, emotional experience represents one crucial way in which each of us psychologically engages with the world. Indeed, as Nico Frijda observed, emotional experience is one’s perception of the world.

Due to the private nature of emotional experience, research in affective science has largely focused on the measurable channels of emotional expression – behavioral, physiological and neural – as opposed to studying emotion as a stream-of-consciousness-like experience. With technological advances in neuroimaging and statistics, however, there is greater potential to scientifically investigate subjective experience, by analyzing the real-time progression of neural activity during the experience. Then, by comparing how synchronized two individuals’ neural fluctuations are, one might be able to infer the similarity of their psychological states, such as emotional experience. This approach, termed neural synchrony, has been widely used in cognitive neuroscience studies to examine the neural mechanisms for shared perspective. These studies consistently found an association between shared interpretations of narratives and synchronized neural activity in the bilateral temporoparietal junctions (TPJ) and bilateral inferior parietal lobules (IPL). These regions are nodes of the Default Mode Network (DMN), a network of areas responsible for processes such as theory of mind and self-referential processing. Given that shared perspective and shared emotional experience often go hand in hand, and shared perspective has been empirically associated with neural synchrony in certain regions, a natural question to ask is whether shared emotional experience could also be related to neural synchrony.

Yet, little research has been done to answer this question. A recent study by Chang et al. (2020) showed that activity patterns in the ventromedial prefrontal cortex (vmPFC), yet another node of DMN, tracked affective transitions in a movie. However, the researchers did not directly measure subjects’ emotional experience. Moreover, since independent raters overall agreed on how emotional each portion of the movie was, emotional experience induced by the movie presumably stayed rather constant across subjects, meaning that it would be hard to assess whether individual differences in emotional experience would correspond to neural synchrony.

In the present study, we sought to examine whether shared emotional experience is underpinned by synchronized neural activity across subjects. Given that previous literature identified TPJ and IPL as possible neural mechanisms for shared perspective, which often co-occurs with shared emotional experience, we hypothesized that greater similarity in emotional experience would be associated with more correlated neural activity in TPJ and IPL. In addition, since vmPFC has been shown to track affective changes in stimuli and may thus be implicated in emotional experience, we also hypothesized a correlation between individual differences in vmPFC activity and those in emotional experience.


We recruited 146 participants from the University of California, Los Angeles and neighboring communities. To be selected, participants had to be over 18 years old, right-handed, and to have lived in the U.S. since childhood. Forty-two participants were excluded from data analysis due to incomplete data. Among the remaining 104 participants, the mean age was 20.09 years (SD = 2.03) and 51% were female. Most of them identified as Caucasian, Asian and/or Hispanic (43% White, 35% Asian, 23% Hispanic).

We created four YouTube-style videos in which English speakers discuss their stances on gun control. In two videos, the speakers (one male, one female) express support for stricter gun control, while in the other two videos two separate speakers argue against gun control. Scripts were written by the researchers to control for the length and rigor of the arguments.

Neuroimaging Data Acquisition
We used Functional Near-Infrared Spectroscopy (fNIRS) to measure neural activity. Participants were scanned in a NIRScout fNIRS rig with a layout of 108 channels (32 light sources and 32 detectors). Raw light intensity data was collected at a sampling rate of 1.95 Hz at wavelengths of 760 and 850 nm.

Self-Report Measurement
Participants rated their emotional experience on three items with 11-point Likert scales. The questions asked about: 1) how much the participant liked the speaker; 2) the extent to which they were bothered by the arguments; 3) their overall experience while watching the video. Participants completed the first two questions immediately after each video, and the third after the scanning phase concluded.

Prior to the scanning session, participants reported their stances on gun control and demographic information. Participants who fulfilled all inclusion criteria were contacted for the scanning session, in which they watched the four political videos in randomized order while being scanned, and rated their emotional experience.

Data Analysis

fNIRS Data Preprocessing
Time courses were trimmed to correspond to stimulus onset and offset, bandpass filtered to 0.01-0.5 Hz to account for signal drift, and corrected for motion artifacts using a PCA algorithm which identifies signal spikes. They were then converted into HbO concentration relative to baseline using the Modified Beer Lambert Law, and z-scored.
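The filtering and normalization steps described above can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: it assumes SciPy's Butterworth filter, the 1.95 Hz sampling rate reported in the Methods, and that trimming, motion correction, and the Beer-Lambert conversion have already been applied; the function name `preprocess` is hypothetical.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1.95  # fNIRS sampling rate in Hz, as reported in the Methods


def preprocess(timecourse, low=0.01, high=0.5):
    """Bandpass filter one channel's time course and z-score it.

    `timecourse` is a 1-D array of HbO concentration values; motion
    correction and the Modified Beer-Lambert conversion are assumed
    to have already been performed upstream.
    """
    nyq = FS / 2.0
    # 0.01-0.5 Hz band, as in the preprocessing description above
    b, a = butter(2, [low / nyq, high / nyq], btype="band")
    filtered = filtfilt(b, a, timecourse)  # zero-phase filtering
    return (filtered - filtered.mean()) / filtered.std()
```

Note that 0.5 Hz sits comfortably below the Nyquist frequency (about 0.98 Hz at this sampling rate), so the band is realizable.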

Inter-Subject Correlation (ISC) Analysis
For fNIRS data, we first averaged each subject’s time courses across the channels corresponding to each region of interest (ROI). Since IPL could not be clearly separated from TPJ in fNIRS, we grouped them together, yielding three ROIs in total (i.e., vmPFC, right TPJ/IPL, left TPJ/IPL). We operationalized neural dissimilarity as the Euclidean distance between each pair of subjects’ time courses, using the formula:

D = √( ∑t (c1(t) − c2(t))² )

where c1(t) and c2(t) are two subjects’ time courses averaged within an ROI. This resulted in a Representational Dissimilarity Matrix (RDM) of inter-subject Euclidean distances for each ROI*video combination, and 12 neural RDMs in total. We then computed the difference between each pair of subjects’ ratings on each emotion question, producing three behavioral RDMs. Afterwards, we computed pairwise Pearson correlations between each neural RDM and each behavioral RDM, and FDR-corrected the resulting p-values for multiple comparisons.
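The RDM construction above can be sketched in a few lines. This is a simplified illustration rather than the authors' code: the function names are hypothetical, and each RDM is represented by its upper-triangle vector, which is the part that enters the correlation.

```python
import numpy as np
from itertools import combinations
from scipy.stats import pearsonr


def euclidean_rdm(timecourses):
    """Upper-triangle vector of pairwise Euclidean distances,
    D = sqrt(sum_t (c1(t) - c2(t))^2), between subjects'
    ROI-averaged time courses of shape (n_subjects, n_timepoints)."""
    pairs = combinations(range(len(timecourses)), 2)
    return np.array([np.sqrt(np.sum((timecourses[i] - timecourses[j]) ** 2))
                     for i, j in pairs])


def rating_rdm(ratings):
    """Upper-triangle vector of absolute pairwise differences
    in a single emotion rating across subjects."""
    pairs = combinations(range(len(ratings)), 2)
    return np.array([abs(ratings[i] - ratings[j]) for i, j in pairs])


# Correlating one neural RDM with one behavioral RDM:
# r, p = pearsonr(euclidean_rdm(roi_timecourses), rating_rdm(like_ratings))
```

The 36 p-values (3 ROIs x 4 videos x 3 questions) could then be FDR-corrected together, for example with `statsmodels.stats.multitest.multipletests(pvals, method="fdr_bh")`.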


ISC analysis revealed that correlations in all 36 ROI*video*emotion question combinations were significant. There were significant positive correlations between inter-subject differences in vmPFC activity and those in ratings of the Like, Bother and Experience questions (r = .07 – .11, p < .001), between differences in rTPJ/IPL fluctuations and those in all three emotion questions (r = .06 – .11, p < .001), and between differences in lTPJ/IPL and those in all three emotion questions (r = .06 – .12, p < .001). In other words, for two subjects whose emotional experience was more similar relative to others, their neural fluctuations in vmPFC, rTPJ/IPL and lTPJ/IPL were also more similar, and the converse was true for two individuals whose emotional experiences were more distinct. See Figure 1 for locations with significant results.


The present study examined whether similar emotional experience is subserved by synchronized neural activity in vmPFC, right TPJ/IPL and left TPJ/IPL. Results indicated that the extent to which dyads shared emotional experience was positively correlated with the degree of synchrony between their neural fluctuations in the aforementioned ROIs, supporting our hypothesis. Importantly, significant correlations were observed for neural responses during both anti-gun-control and pro-gun-control videos. Given that our participants included conservatives, liberals and moderates, who should have had distinct emotional experiences when watching videos of the two different stances, the consistency of results across video types suggests that the association between shared emotional experience and neural synchrony holds even when the range of emotional experiences is diverse, thereby offering stronger support for our hypothesis.

Our results were consistent with existing studies that found a correlation between shared perspective and neural synchrony in TPJ and IPL, and between affective values of stimuli and neural synchrony in vmPFC. However, we extended the prior literature by combining the three ROIs for the first time. Since all three regions are part of the DMN, one might speculate that the DMN underlies subjective experience in general, whether that be how a narrative unfolds from one’s perspective, as previous research showed, or emotional experience, as the current study revealed. Future research could extend our analysis to other nodes of the DMN.

Our study represents one of the first attempts to examine neural correlates of emotional experience, something that is difficult to study due to its idiosyncratic nature. By revealing neural synchrony in vmPFC and bilateral TPJ/IPL as a possible neural signature for shared emotional experience, the present study opens doors for inferring emotional experience from neural data. For instance, if we know the emotional experiences and average neural time courses of several “reference groups,” can we infer an individual’s emotional experience by simply correlating their neural data with that of the “reference groups” whose emotional experiences are known? Such a question may exceed the scope of our study, yet our study represents a preliminary effort towards deciphering subjective psychological processes like emotional experience.
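The “reference group” idea raised above could, in principle, work like a nearest-neighbor classifier over time courses. The sketch below is purely speculative, as the paragraph itself notes that such inference exceeds the scope of the study; the function name and labels are hypothetical.

```python
import numpy as np


def infer_experience(subject_tc, reference_tcs):
    """Speculative sketch: assign a new subject the label of the
    reference group whose averaged neural time course correlates
    most strongly with the subject's own time course.

    `reference_tcs` maps a label (e.g. an emotional-experience
    category) to an averaged time course of the same length.
    """
    return max(reference_tcs,
               key=lambda lbl: np.corrcoef(subject_tc,
                                           reference_tcs[lbl])[0, 1])
```

In practice such an approach would need validation well beyond pairwise correlation, e.g. held-out subjects and chance-level baselines.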

A further strength of our study lies in the neuroimaging technique used. Unlike Functional Magnetic Resonance Imaging (fMRI), which poses substantial physical constraints on participants, fNIRS is lightweight and allows for more naturalistic movement. This is particularly important when the construct examined is emotional experience, which is prone to environmental influence. Therefore, coupling fNIRS with other experimental controls enabled us to assess neural correlates of relatively genuine emotional experience in a controlled setting, striking a balance between ecological validity and internal validity. However, a limitation of fNIRS is that it can only measure neural activity in cortical areas. Since some well-established “emotion regions” (e.g., the amygdala) are subcortical, future research could utilize other neuroimaging techniques to investigate neural synchrony in subcortical structures and shared emotional experience, thereby complementing the current study and enriching the limited literature on neural mechanisms for emotional experience.

Impact Statement

In the current study, we provided a way to scientifically investigate emotional experience, a phenomenological entity that has traditionally been hard to study. We showed that neural synchrony in nodes of the DMN could serve as an indirect measure of similarity in emotional experience, bringing up the possibility of inferring emotional experience from similarity in neural responses. Furthermore, we demonstrated that fNIRS is a viable neuroimaging modality that may be particularly suitable for affective neuroscience research.