
A neural code for emotion

Our daily experience plays out against the backdrop of a dynamic stream of mental states, characterized by spontaneous changes in mood or emotion. For instance, as you take an important exam and perform above expectations, your emotions may fluctuate from anxiety to frustration, followed by surprise and finally contentment. These changes in mood are a defining feature of the human experience, but how do they arise from the underlying stream of neural activity? In their new PLOS Biology study, researchers from Duke University used functional MRI (fMRI) to capture brain signals of emotion, discovering that patterns of brain activity during rest can decode an array of emotional experiences.

Earlier studies have shown that brain activation patterns measured with fMRI can accurately decode a range of stimulus-driven experiences, including perception, mental imagery and performance of various cognitive processes (such as making decisions or remembering). However, it has been unclear whether distinct brain states similarly map onto unique emotional experiences, like feeling anger or surprise – much fuzzier concepts than such concrete experiences as seeing a tree or hearing a dog bark. Last year, coauthors of the new PLOS Biology paper, Philip Kragel and Kevin LaBar, showed that brain activation patterns accurately predicted emotions elicited by movies and music. However, like prior stimulus-driven studies, their experiment could not separate the effects of emotion from the external emotion-inducing sounds and images. Their new study sought to isolate emotion from its exogenous triggers by examining brain activation patterns of spontaneous emotions, in the absence of external driving input.

Imaging the emotive brain

To evaluate whether brain activation indices of emotion correspond with individual mood and personality traits, the researchers performed resting fMRI on 499 young adults. Using algorithms derived from their earlier study of emotional responses to movies and music, they computed how much evidence whole-brain activity patterns provided for each of seven emotions (neutral, contentment, amusement, fear, anger, surprise, sadness). Brain activity patterns corresponding to neutral, surprise and amusement occurred most often, whereas contentment was represented least often. Over time in the scanner, brain states of fear became less frequent, in line with the common experience of “scanner anxiety” at the start of an MRI session. Time series analysis of each emotion for each subject corroborated this trend, showing more negative emotions (fear and sadness) represented early, and more neutral or positive emotions (neutral and surprise) arising later.
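
To make the classification step more concrete, here is a minimal Python sketch of the general idea (not the authors' actual pipeline): score each resting-state fMRI volume against a pattern model for each of the seven emotions, label the volume with the highest-scoring emotion, and tally how often each brain state appears. The pattern weights, data, and dimensions below are random stand-ins chosen purely for illustration.

```python
import numpy as np

# Minimal sketch of the decoding idea, NOT the published pipeline: random
# numbers stand in for the trained emotion pattern models and the resting scan.
rng = np.random.default_rng(0)

emotions = ["neutral", "contentment", "amusement", "fear",
            "anger", "surprise", "sadness"]
n_voxels, n_volumes = 5000, 200                       # toy dimensions only

pattern_weights = rng.normal(size=(len(emotions), n_voxels))  # stand-in models
resting_scan = rng.normal(size=(n_volumes, n_voxels))         # stand-in data

# Evidence for each emotion at each time point = match between that volume's
# activity pattern and the emotion's pattern model (here a simple dot product).
evidence = resting_scan @ pattern_weights.T        # shape: (volumes, emotions)
labels = evidence.argmax(axis=1)                   # winning emotion per volume

# How often does each emotional brain state appear over the scan?
counts = np.bincount(labels, minlength=len(emotions))
for emotion, count in zip(emotions, counts):
    print(f"{emotion:12s} {count / n_volumes:.1%}")

# A drop in the rate of "fear" labels from the first to the second half of the
# scan would parallel the "scanner anxiety" trend described above.
half = n_volumes // 2
fear = emotions.index("fear")
print("fear early vs. late:",
      (labels[:half] == fear).mean(), (labels[half:] == fear).mean())
```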

Mapping brain signals onto mood

Perhaps more importantly, they found that the prevalence of certain brain states mapped well onto self-reported mood and personality ratings (Figure 1). For instance, individuals who later reported having felt depressed or anxious during the scan showed higher frequencies of brain activity patterns corresponding to sadness or fear, respectively. Furthermore, those with higher anxious personality ratings showed more frequent fear brain states and fewer anger brain states, those with higher anger ratings showed frequent anger brain states, and those with high depression ratings often showed fear and sadness brain states.

Figure 1. Brain-based classifications correspond with mood (A) and personality traits (B). Kragel et al., 2016.

Critically, however, these mood and personality ratings are at best a proxy for the participants’ true emotional states during the MRI scan. To test the accuracy of the brain decoding algorithms on real-time measures of emotion, another group of 21 young adults underwent fMRI while self-reporting their current feeling at periodic intervals. The brain activity models predicted the self-reported emotional states better than chance, and the frequency of the reported emotions correlated with the frequency of emotions predicted by the brain classification (Figure 2). Thus, while the first study could only indirectly associate brain states with general differences in individual emotion and personality, this second study confirmed sensitivity to real-time fluctuations in emotion.
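
As a rough illustration of that last comparison (again, an assumed sketch rather than the paper's published analysis), one could correlate, across participants, how often each emotion was self-reported with how often the brain classifier predicted it. The frequency tables below are simulated stand-ins:

```python
import numpy as np

# Assumed sketch of the frequency comparison, not the published analysis:
# rows are subjects, columns are the seven emotions, values are frequencies.
rng = np.random.default_rng(1)
n_subjects, n_emotions = 21, 7

reported = rng.dirichlet(np.ones(n_emotions), size=n_subjects)    # self-reports
predicted = 0.7 * reported + 0.3 * rng.dirichlet(np.ones(n_emotions),
                                                 size=n_subjects)  # decoder output

# For each emotion, ask whether the classifier tracks individual differences
# in how often participants reported feeling it.
for j in range(n_emotions):
    r = np.corrcoef(reported[:, j], predicted[:, j])[0, 1]
    print(f"emotion {j}: r = {r:+.2f}")
```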

Figure 2. The frequency of brain-based classifications correlates with self-reported emotions. Kragel et al., 2016.

A clinical window onto the affective brain?

In contrast to past “brain decoding” studies, which used brain activity patterns to predict externally-driven sensory or cognitive experiences, Kragel and colleagues showed for the first time that brain signals can accurately identify spontaneously arising emotional states. These brain patterns not only map onto mood and personality traits, but also track rapid, real-time emotional fluctuations. This study helps push the limits of the rapidly growing cognitive applications of neuroimaging, demonstrating its power to characterize the dynamic neural events underlying the landscape of human mood and affect. However, just how well fMRI could distinguish more nuanced differences in emotion remains to be seen. As Dr. LaBar notes, “We don’t yet know the limit to the ability of pattern classification tools to discriminate at finer levels of distinction in affect representations (e.g., fear vs. anxiety vs. apprehension) or emotion blends (e.g., do fear and sadness combine to elicit feelings of despair?), but these issues will be interesting to test in future work. Of course, there are limits in the spatial resolution of the dependent measures themselves (the BOLD signal in fMRI). Despite these challenges, we believe that more fine-grained distinctions are likely, as long as the emotions under investigation can be reliably induced across subjects in the MRI scanner.”

Despite these open questions, this study holds important clinical implications; for example, with refinement, fMRI could potentially be useful for diagnosing personality or mood disorders, or for tracking therapeutic outcomes. Such applications could prove particularly valuable in identifying emotional experiences in individuals with impaired awareness or compromised ability to communicate. According to Dr. LaBar,

“Some of the most interesting applications would be in situations where people are unaware that a specific emotion is being elicited yet their behavior indicates an affective bias. If we can show elevated activation in our brain-based emotion models under these circumstances, we can begin to unpack unconscious emotional influences in a way that was previously unimaginable.”

References

Kamitani Y, Tong F. (2005). Decoding the visual and subjective contents of the human brain. Nat Neurosci. 8(5):679-85. doi: 10.1038/nn1444

Kragel PA, LaBar KS. (2015). Multivariate neural biomarkers of emotional states are categorically distinct. Soc Cogn Affect Neurosci. 10(11):1437-48. doi: 10.1093/scan/nsv032

Kragel PA, Knodt AR, Hariri AR, LaBar KS. (2016). Decoding Spontaneous Emotional States in the Human Brain. PLOS Biol. 14(9): e2000106. doi: 10.1371/journal.pbio.2000106

Poldrack RA, Halchenko Y, Hanson SJ. (2009). Decoding the Large-Scale Structure of Brain Function by Classifying Mental States Across Individuals. Psychol Sci. 20(11):1364-72. doi: 10.1111/j.1467-9280.2009.02460.x

Reddy L, Tsuchiya N, Serre T. (2010). Reading the mind’s eye: decoding category information during mental imagery. Neuroimage. 50(2):818-25. doi: 10.1016/j.neuroimage.2009.11.084


Any views expressed are those of the author, and do not necessarily reflect those of PLOS.

Emilie Reas received her PhD in Neuroscience from UC San Diego, where she used fMRI to study memory. As a postdoc at UCSD, she currently studies how the brain changes with aging and disease. In addition to her tweets for @PLOSNeuro she is @etreas.
