We all have similarly constructed brains, and our lives are governed by the same physical principles. Yet our perceptions of the people and the world around us can still differ greatly. So how exactly do we tune into other people to understand things from their point of view?
According to the findings of the SOCIAL BRAIN project, which was funded by the European Research Council, synchronisation plays a key role in our common understanding of our social environment. When we tell others about past events that have affected us, for example, their brains attune to ours. Similar patterns of neural activation form as listeners mentally recreate these experiences, says principal investigator Lauri Nummenmaa of the University of Turku, Finland, who led the project.
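The speaker–listener alignment described here is commonly quantified in neuroimaging as the correlation between two people's neural activity time series. The following is a minimal, purely illustrative sketch using synthetic data; the signal model and noise levels are invented, and this is not the project's actual analysis pipeline.

```python
import numpy as np

# Hypothetical illustration: measure speaker-listener alignment as the
# Pearson correlation between two neural activity time series
# (synthetic data; not the project's actual method or measurements).

rng = np.random.default_rng(0)

# Simulate a shared, stimulus-driven signal plus individual noise.
shared = rng.standard_normal(200)
speaker = shared + 0.5 * rng.standard_normal(200)
listener = shared + 0.5 * rng.standard_normal(200)

def intersubject_correlation(a, b):
    """Pearson correlation between two equal-length time series."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

r = intersubject_correlation(speaker, listener)
print(f"speaker-listener correlation: {r:.2f}")
```

When the two simulated brains share a strong common signal, the correlation is high; it drops toward zero as the shared component shrinks.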
This alignment is central to our ability to understand each other, Nummenmaa emphasises. ‘We mimic the brain state of others,’ he says, adding that individual emotions produce very distinctive neural fingerprints.
SOCIAL BRAIN developed new tools and techniques to study how our brains process and relay social information. It also came up with leads for potential applications, including diagnostic support for conditions such as autism, where this synchronisation process may be impaired.
Another potential use could be automatic recognition of human emotions from brain activity, Nummenmaa adds, explaining that a machine-learning algorithm designed by the project accomplishes this task with a high level of accuracy. This capability could be useful for movie-rating purposes, among other things.
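The idea that emotions leave "distinctive neural fingerprints" that a classifier can pick up can be sketched with a toy example. Everything here is invented for illustration: the fingerprints are random vectors, and a simple nearest-centroid rule stands in for the project's (undescribed) machine-learning algorithm.

```python
import numpy as np

# Hypothetical sketch: if each emotion produces a distinctive activation
# "fingerprint", a classifier can recover the emotion from brain activity.
# Toy nearest-centroid classifier on synthetic activation vectors; not the
# project's actual algorithm or data.

rng = np.random.default_rng(1)
n_features = 50  # e.g. activation levels in 50 brain regions (invented)

# One synthetic "fingerprint" per emotion.
emotions = ["anger", "fear", "joy"]
fingerprints = {e: rng.standard_normal(n_features) for e in emotions}

def classify(pattern, fingerprints):
    """Assign the emotion whose fingerprint is closest to the pattern."""
    return min(fingerprints,
               key=lambda e: np.linalg.norm(pattern - fingerprints[e]))

# A noisy observation of the "fear" fingerprint is classified correctly.
observed = fingerprints["fear"] + 0.3 * rng.standard_normal(n_features)
print(classify(observed, fingerprints))  # fear
```

The same logic underlies real pattern-classification studies: as long as the fingerprints are distinct relative to the noise, the label can be recovered from activity alone.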
Spine-chilling experiences ...
SOCIAL BRAIN also studied the physical sensations that we associate with emotions, concluding that these too combine in distinctive ways – and that they are universal.
As of April 2020, respondents from more than 100 countries have used the project’s mapping tool to show where they feel physical manifestations of emotions, such as anger or shame, says Nummenmaa. Their descriptions are remarkably similar, indicating that these patterns are part of our shared representation of the world. Based on this collective input, the project generated body maps representing a wide variety of emotions.
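A group-level body map of the kind described above can be thought of as an average over many individual reports. The sketch below is purely illustrative: the grid size, respondent count, and "hot" chest region are invented, and this is not the project's actual mapping tool.

```python
import numpy as np

# Hypothetical sketch of the aggregation behind group body maps: each
# respondent marks where on a body outline they feel an emotion (a binary
# grid here), and averaging across respondents yields an intensity map.

rng = np.random.default_rng(2)
n_respondents, height, width = 100, 20, 10

# Simulate respondents who tend to mark the chest area (rows 5-9).
reports = np.zeros((n_respondents, height, width))
for report in reports:
    rows = rng.integers(5, 10, size=15)
    cols = rng.integers(0, width, size=15)
    report[rows, cols] = 1

# Fraction of respondents marking each body location.
body_map = reports.mean(axis=0)

# Agreement is concentrated in the simulated chest region.
chest_mean = body_map[5:10].mean()
elsewhere_mean = np.concatenate([body_map[:5], body_map[10:]]).mean()
print(chest_mean > elsewhere_mean)  # True
```

Because individual reports are noisy but overlap in the same region, the averaged map highlights the locations where respondents agree, which is what makes the cross-country similarity described above measurable.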
... in a natural setting
According to Nummenmaa, neuroimaging to study how our brain works has traditionally focused on simplified stimuli. Photos, for instance, are typically used to learn more about face recognition, although things are much more complex in the real world. ‘We see faces in a constant ebb and flow as people move about or engage with each other,’ he says. ‘We live in a dynamic, interactive environment, so that’s what the brain is tuned for. Many of the phenomena we study in the lab are different in real life.’
As the unwieldy scanners on which his research relies are not suitable for use outside the lab, the project set out to create more realistic conditions inside the brain scanners, Nummenmaa explains. ‘We had to push the boundaries of brain imaging and analytical techniques to be able to do so.’
This newfound capability enabled the team to capture and analyse the complex phenomena at play. For example, in one study, volunteers in the brain scanner viewed horror films through a pair of goggles, which created an immersive experience. This experiment enabled the researchers to study the brain’s response to fear more accurately.
The project, which ended in December 2018, was backed by a European Research Council grant designed to support potential research leaders at the beginning of their career. ‘It gave us time and freedom to focus,’ says Nummenmaa, who was in the process of setting up his own laboratory at the time.
Seven years on, the lab’s work on the brain basis of emotions continues – notably with researchers applying the project’s ‘neurocinematics’ approach to the study of addictions and eating disorders.