Cracking the code of the unspoken language

Monday, 27 October, 2014
A Marie Curie research fellow, Oya Aran, has paved the way for the development of computer techniques that could automatically reveal meaning from body language and other visual cues, predict people's mood and, she says, help improve “collective decision-making”. The scientist studied 100 people interacting in small groups, using computer vision, audio processing and “machine learning” to detect dominance and emergent leadership.

She examined hand gestures, head gestures and body posture through a number of different emerging research fields. These included human-computer interaction, machine learning, speech and language processing and computer vision – as well as other disciplines, like social psychology. The conclusion of the research was that dominance can indeed be revealed automatically from non-verbal behavioural cues.

“Non-verbal communication is far richer than most people imagine and it is mainly used unconsciously,” explains Aran, adding that the research makes use of the common knowledge that “people often say more in gestures - through smiles, shrugs or winks - than they do when they actually speak.”

The research came out of the European Union (EU)-funded Marie Curie project called NOVICOM, run by Aran, who is currently working at the Idiap Research Institute in Martigny, Switzerland.

Aran used webcams and audio devices to record group discussions between approximately 100 volunteers divided into groups of three or four people. She then integrated methods from computer vision, audio processing and machine learning – the ability of computers to recognise patterns – to create automatic analyses of the quantity and quality of voices and gestures. Based on these analyses, she focused on different features, such as the amount and diversity of a person's body motion, or how loudly a person talked during a meeting or while the groups carried out teamwork tasks.
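The idea of combining such features can be sketched in a few lines. The snippet below is a minimal illustration, not the project's actual pipeline: it assumes each participant has already been reduced to a handful of normalised non-verbal features (hypothetical names: speaking time, mean loudness, body motion), and ranks group members by a weighted sum as a stand-in for a learned dominance model. The weights are illustrative assumptions, not values from the research.

```python
# Illustrative sketch: ranking group members by simple non-verbal features.
# Feature names and weights are hypothetical, chosen only for demonstration.

def dominance_ranking(features, weights=None):
    """Rank participants by a weighted sum of non-verbal features.

    features: dict mapping person -> dict of feature_name -> value
              (values assumed normalised to [0, 1]).
    Returns a list of participants, most dominant first.
    """
    if weights is None:
        # Hypothetical weights: talkativeness and loudness tend to be
        # acoustic cues of dominance; body motion adds the visual cue.
        weights = {"speaking_time": 0.5, "mean_loudness": 0.3, "body_motion": 0.2}
    scores = {
        person: sum(weights[k] * feats.get(k, 0.0) for k in weights)
        for person, feats in features.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Toy group of three participants with made-up feature values.
group = {
    "A": {"speaking_time": 0.9, "mean_loudness": 0.7, "body_motion": 0.6},
    "B": {"speaking_time": 0.3, "mean_loudness": 0.4, "body_motion": 0.2},
    "C": {"speaking_time": 0.5, "mean_loudness": 0.6, "body_motion": 0.9},
}
ranking = dominance_ranking(group)
print(ranking)  # most talkative, loud and animated participant first
```

In the actual research, a classifier trained on annotated meetings would replace the fixed weights, but the principle – mapping measured audio-visual cues to a dominance estimate – is the same.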

Aran was thus able to measure perceived levels of dominance in groups, how much interest individuals showed during their discussions, and how close they were to agreeing. The results showed that the emergent leader is perceived by his or her peers as an active and dominant person, and that visual information enhances acoustic information. All these research results helped break new ground in accurately measuring non-verbal communication.

Aran, who has master's and PhD degrees in computer engineering from Istanbul’s Bogazici University in Turkey, says that the eventual applications of this research may include computer tools to improve collective decision-making and to enhance communication. The outcome of the analysis “can be used to resolve conflicts, create attention and set a clearer focus on the discussion,” affirms the researcher.

After her Marie Curie fellowship, Aran obtained the Swiss National Science Foundation’s Ambizione fellowship, which she now holds at the Idiap Research Institute. She praises the Marie Curie fellowship for focusing not only on research, but also on training activities, such as project management, teaching and networking – all skills that helped her become an independent researcher.

NOVICOM’s project coordinator, Idiap Institute senior researcher Daniel Gatica-Perez, says “Aran’s research offered scientific proof of how dominance and emergent leadership in small groups can be revealed automatically by specialist software tracking of non-verbal behavioural cues.” Her research helped measure non-verbal communication more accurately and could eventually transform group dynamics and decision-making.

Automatic analysis of group conversations via visual cues in non-verbal communication
Project Acronym: NOVICOM