EUROPA: Research Information Centre

Last Update: 2019/03/08   Source: Research Headlines
 
  View this page online at: http://ec.europa.eu/research/infocentre/article_en.cfm?artid=41256
 
     

AI trained on human gestures could help solve problems

How do groups of people solve problems? What behaviours and gestures precede a successful solution? And can artificial intelligence learn to interpret - and assist - collaborative problem-solving? An EU-funded project has sought answers to these intriguing questions, opening up a new direction in gesture studies.

Image: © AndSus #221390147, 2019, source: stock.adobe.com

Updated on 8 March 2019

The EU-funded GETUI project generated a classification system for human-human and human-computer interaction gestures, such as pointing, prodding or fiddling with an object. This taxonomy will enable the training of gesture-recognition systems and the creation of patterns for artificial intelligence-based machine learning – for example, to support decision-making in business environments or to improve student training and assessment in education, such as in the OECD's PISA programme.
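
The article does not reproduce the taxonomy itself, but the sketch below gives a rough idea of how such a gesture classification could be encoded when labelling interaction data for a recognition model. It is a minimal Python illustration: the category names follow the examples mentioned here (pointing, prodding, dragging, fiddling, affective gestures), and the class and field names are hypothetical rather than the project's own definitions.

    # Illustrative sketch only: a minimal encoding of a gesture taxonomy of the
    # kind described above, for labelling interaction data before training a
    # gesture-recognition model. Category and field names are hypothetical and
    # follow the examples in the article, not the GETUI project's published work.
    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional


    class GestureCategory(Enum):
        POINTING = auto()      # deictic gestures aimed at a tangible object
        MANIPULATIVE = auto()  # dragging, moving or prodding an object
        AFFECTIVE = auto()     # e.g. head-scratching, signalling uncertainty
        FIDDLING = auto()      # handling an object without a clear goal


    @dataclass
    class GestureObservation:
        participant_id: str
        category: GestureCategory
        target: Optional[str]  # the object or interface element involved, if any
        start_s: float         # onset time within the session, in seconds
        duration_s: float


    # Example annotation of a single observed gesture:
    obs = GestureObservation("P1", GestureCategory.POINTING, "tile_3", 12.4, 0.8)
    print(obs.category.name, "at", obs.start_s, "s")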

Researcher Dimitra Anastasiou modelled and classified the range of non-verbal gestures that individuals use when interacting with each other and with computer interfaces.

'In a high-achieving group that is solving a problem fast, gestures are limited, well-oriented and mainly involve pointing to tangible objects: this accelerates the whole problem-solving and decision-making process,' she says. 'On the other hand, a group that is not sure about their actions will use a lot of affective gestures, such as head-scratching, and a significant number of manipulative gestures, like dragging or moving a tangible object. These decelerate the process and are linked with low-achieving groups.'
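
A rough, hypothetical illustration of how such gesture patterns could be turned into simple group-level features is sketched below. The labels, proportions and threshold values are invented purely to show the idea and are not taken from the project's analysis.

    # Illustrative sketch only: turning per-session gesture counts into a toy
    # indicator of the pattern described above (pointing dominates in fast,
    # high-achieving groups; affective and manipulative gestures dominate in
    # slower ones). Labels and thresholds are invented for illustration.
    from collections import Counter
    from typing import Iterable


    def gesture_profile(gesture_labels: Iterable[str]) -> Counter:
        """Count how often each gesture category occurs in one session."""
        return Counter(gesture_labels)


    def looks_high_achieving(profile: Counter) -> bool:
        """Toy heuristic: pointing dominates and affective gestures are rare."""
        total = sum(profile.values()) or 1
        return (profile["pointing"] / total > 0.5
                and profile["affective"] / total < 0.2)


    session = ["pointing", "pointing", "pointing", "manipulative", "pointing"]
    print(looks_high_achieving(gesture_profile(session)))  # True for this session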

More effective decision-making

Anastasiou's work has filled gaps in scientific and practical knowledge about evaluating human behaviour when groups of people have to work together to solve a problem involving the use of physical objects or 'tangible user interfaces' (TUIs) such as digital touchscreens and tabletops.

TUIs play a central role as both physical representations and controls, and gesture recognition is used across many education, telecommunications, entertainment and healthcare applications.

Working at the Cognitive Environment Lab of the Luxembourg Institute of Science and Technology, Anastasiou intends to build on the results of the project with follow-up research focused on enabling computer feedback based on users' performed gestures and conducting broader analysis of multi-user interactions.

'Human-human and human-computer gestures are separated in scientific literature, but in my opinion, psychologists, social scientists and engineers have to work together to analyse human behaviour and then implement all kinds of gestures in gesture-recognition applications,' Anastasiou says. 'GETUI has helped raise awareness that human behaviour is crucial in assessing problem-solving skills and competences. This can benefit both educators and learners, leading to more effective learning and decision-making.'

GETUI received funding from the EU's Marie Skłodowska-Curie fellowship programme. Over the project's three-year duration, nine peer-reviewed conference and workshop articles were published.

 

Project details

  • Project acronym: GETUI
  • Participants: Luxembourg (Coordinator)
  • Project N°: 654477
  • Total costs: € 172 800
  • EU contribution: € 172 800
  • Duration: November 2015 to December 2018

 
 
Read Also
Project website: http://www.list.lu/en/project/getui
Project details: https://cordis.europa.eu/project/rcn/194970/factsheet/en