RTD info - Magazine on European Research N° 51 - December 2006
INFORMATION SOCIETY
Man-machine: new means of communication

IT is looking for new ways of communicating with human beings that draw on their full polysensory potential. The computer is becoming sensitive to ever more subtle tactile commands, interprets the words we speak and is capable of responding to them. It is starting to decipher our eye movements. IT specialists are exploring ideas that verge on science fiction, where the machine manages to interpret “mental” commands or extends its ramifications to the clothes we wear in order to look after our well-being. This type of technology, known as interface technology, must first of all meet the needs of our most vulnerable citizens – the elderly and the disabled. But, beyond these priority groups, these innovations could radically transform all means of communication between man and machine.

Experimentation within the framework of the “Maia” project (Mental Augmentation through Determination of Intended Action). Connected by some thirty or so EEG electrodes, the operator gets the computer to recognise a movement of his right or left hand simply as a result of mental concentration. The project is aimed specifically at applications such as the steering of wheelchairs in an enclosed environment and the control of robotic arms for carrying out remote handling tasks.

How can someone suffering from motor neuron problems manage without access to the traditional keyboard/mouse combination, or how can a visually impaired person manage without the seemingly inevitable screen? The world of IT remains difficult for hundreds of millions of individuals throughout the world, whether they are disabled or affected by a cultural divide due to age or social conditions. There is now a broad consensus that the digital divide constitutes a major democratic challenge for the information society. At the technological level, one of the questions raised is therefore that of diversification and innovation in the interfaces for communicating with the computer.

Sight and hearing, speech and touch
In IT, the interface technologies between the user and the machine constitute a specific software area, known as the API level (Application Programming Interface), which is kept strictly independent of the computer's internal architecture. A long-established example: text handling is separate from the keyboard used to type in the characters. Between the two sits the API, which receives the information transmitted by this or that keyboard and converts it into “universal” instructions that the text-handling software can use, whatever the “linguistic script” involved.
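To make this separation concrete, here is a minimal sketch (in Python, with invented layouts and command names, not any real API) of how an interface layer can turn device-specific keystrokes into universal instructions:

```python
# Minimal sketch of the abstraction described above: device-specific input
# (a keyboard layout, here) is translated into "universal" commands that
# the text-handling layer consumes. All names are illustrative.

# Two hypothetical keyboard drivers report raw scancodes for the same key.
AZERTY_SCANCODES = {16: "a", 17: "z"}   # French layout
QWERTY_SCANCODES = {16: "q", 17: "w"}   # English layout

def translate(scancode: int, layout: dict) -> dict:
    """The 'API level': convert a device event into a universal command."""
    return {"command": "insert_char", "char": layout[scancode]}

class TextBuffer:
    """Text handling, ignorant of which device produced the command."""
    def __init__(self):
        self.text = ""
    def apply(self, cmd: dict):
        if cmd["command"] == "insert_char":
            self.text += cmd["char"]

buffer = TextBuffer()
buffer.apply(translate(16, AZERTY_SCANCODES))  # same keypress...
buffer.apply(translate(16, QWERTY_SCANCODES))  # ...different layouts
print(buffer.text)  # -> "aq"
```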

At the present time, the two dominant interface controls are the screen – based on the sense of sight – and the mouse, that multi-functional touch control. Born in the 1960s and enormously popular since the 1980s, this extraordinary invention introduced a fundamentally new tool compared with the century-old use of typewriter keys. The mouse remains fundamental to everything graphical on screen. What would we do without it?

An innovation that has been studied for a long time (IBM was already working on it in the 1970s) is the use of voice and hearing. Speech recognition and synthesis are becoming commonplace in numerous applications. A future step will be their adaptation to text handling, whether for dictation or listening: algorithms capable of extracting a string of phonemes from a text and passing it to a vocal synthesiser, to reproduce it audibly, will soon be part of standard PC equipment.
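As a rough illustration of that phoneme-extraction step, here is a toy sketch; the mini-lexicon and phoneme symbols are invented stand-ins for the large pronunciation dictionaries and statistical models real systems use:

```python
# Illustrative sketch only: a toy grapheme-to-phoneme step of the kind a
# text-to-speech pipeline performs before handing phonemes to a synthesiser.
# This dictionary is a hypothetical stand-in for a real lexicon.

LEXICON = {
    "read": ["R", "IY", "D"],
    "this": ["DH", "IH", "S"],
    "aloud": ["AH", "L", "AW", "D"],
}

def text_to_phonemes(text: str) -> list[str]:
    """Extract a string of phonemes from a text (dictionary lookup)."""
    phonemes = []
    for word in text.lower().split():
        phonemes.extend(LEXICON.get(word, ["?"]))  # "?" = unknown word
    return phonemes

print(text_to_phonemes("Read this aloud"))
# -> ['R', 'IY', 'D', 'DH', 'IH', 'S', 'AH', 'L', 'AW', 'D']
# The phoneme string would then drive a vocal synthesiser.
```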

Another foreseeable development is associated with the sense of touch – what specialists call the haptic (1) interface. Partners in the “Grab” project (cf. panel) are developing robotic simulation tools for tracing the contours of a three-dimensional image with the fingertips. Such an application extends beyond the simple adaptation of standard IT tools for the visually impaired, by proposing the capture of digital information via this entirely new channel of tactile perception. Will architects use it to produce “hand-made” digital models? Will tele-surgeons be able to “touch” the organs of a patient they are operating on several thousand miles away using a robot? These technologies are opening up application possibilities in many areas, extending far beyond the world of the disabled.
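To give an idea of how such fingertip tracing can work, here is a minimal sketch of penalty-based force rendering, a common haptics technique; the stiffness value and the spherical test object are illustrative assumptions, not details of the “Grab” system:

```python
# Penalty-based haptic rendering: when the tracked fingertip penetrates a
# virtual surface, the device pushes back with a spring force proportional
# to the penetration depth. Values are illustrative.

import math

STIFFNESS = 800.0  # N/m, spring constant of the virtual surface

def render_force(finger_pos, sphere_center, sphere_radius):
    """Force fed back to the haptic arm for a virtual sphere."""
    dx = [p - c for p, c in zip(finger_pos, sphere_center)]
    dist = math.sqrt(sum(d * d for d in dx))
    penetration = sphere_radius - dist
    if penetration <= 0:
        return (0.0, 0.0, 0.0)         # finger is outside: no contact
    normal = [d / dist for d in dx]    # push out along the surface normal
    return tuple(STIFFNESS * penetration * n for n in normal)

# Fingertip 2 mm inside a 5 cm sphere: the arm resists with ~1.6 N outward.
print(render_force((0.048, 0.0, 0.0), (0.0, 0.0, 0.0), 0.05))
```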

For paralysed people, it is eye movements that issue commands to the computer. Here, an infrared beam accentuates the contrast between the cornea, the iris and the pupil. Software detects the position of the pupil in order to determine where the operator is looking. Interfaces adapted to visual control, developed by the European “Cogain” project, allow up to 25 words a minute to be written, somewhat less than can be achieved using a keyboard. © Cogain/Päivi Majaranta
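The pupil-detection step the caption describes can be sketched in a few lines. The following outline uses the OpenCV library, with an assumed threshold value; it is an illustration of the general approach, not the “Cogain” implementation:

```python
# In an infrared image the pupil is the darkest region, so a simple
# threshold-and-largest-contour pass approximates its centre. Real gaze
# trackers add corneal-reflection geometry and calibration on top of this.

import cv2  # OpenCV (assumed available)
import numpy as np

def pupil_center(ir_frame: np.ndarray) -> tuple[int, int] | None:
    """Return the (x, y) pixel centre of the pupil, or None if not found."""
    gray = cv2.cvtColor(ir_frame, cv2.COLOR_BGR2GRAY)
    # The pupil absorbs infrared light: keep only very dark pixels.
    _, dark = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)  # largest dark blob
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```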


From look to thought
But innovation prospects go much further – not without causing a certain amount of fear regarding the power of intrusion they afford the machine – when the interface is capable of interpreting someone’s gaze or even their mental activity. The European “Cogain” project (cf. panel) is dedicated to the tens of millions of people in the world suffering from paralysis, and aims at communication between man and machine controlled by eye movements. Päivi Majaranta, from Tampere University (FI), its coordinator, believes that this technology “can also find outlets in other applications that render computers attentive to the user, for example, in the field of games or video-conferencing”. Big Brother is watching you?

For people who are paralysed, the ultimate solution would, however, be to manage without any movement at all. Are we in the world of science fiction when we imagine controlling a computer using only our thoughts? The marriage between cybernetics and the neurosciences allows us to envisage this scenario. Certain researchers are experimenting with so-called intrusive techniques, where electronic chips implanted in the human body are connected directly to the neurons. John Donoghue’s team (Brown University, USA) recently published details in Nature of a 26-year-old tetraplegic, Matthew Nagle, who has a “neuromotor prosthesis” inserted in the area of the brain controlling voluntary movements. This prosthesis logs the activity of the neurons and relays it to computers, which translate the impulses in order to control a cursor.
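The decoding principle can be illustrated with a toy linear decoder; the weights and firing rates below are invented for the example, not taken from the published study:

```python
# Each electrode reports a firing rate, and a linear decoder (weights
# normally learned during a calibration phase, here simply hard-coded)
# maps the rates to a 2-D cursor velocity. All numbers are illustrative.

import numpy as np

# Hypothetical decoder weights: rows = (vx, vy), columns = 4 electrodes.
W = np.array([[ 0.8, -0.2,  0.1,  0.0],
              [ 0.1,  0.7, -0.3,  0.2]])

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Map spike rates (Hz) from the implant to cursor velocity (px/s)."""
    return W @ firing_rates

cursor = np.array([100.0, 100.0])
rates = np.array([12.0, 5.0, 3.0, 8.0])   # one 100 ms reading per electrode
cursor += decode_velocity(rates) * 0.1     # integrate over the time step
print(cursor)  # the cursor nudges in the direction the neurons "point"
```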

Other researchers are working on gentler methods, which do not call for surgical intervention. A helmet fitted with electrodes envelops the operator’s cranium in order to capture the electrical waves produced by the upper part of the brain, the cerebral cortex. Neuron information, provided for example by an electroencephalogram and relayed to software, is identified and classified in such a way that it corresponds to a simple, pre-defined command (such as moving the cursor to the right or to the left, as in this case).
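By way of illustration, a classic and much simplified version of this classification step compares the power of the motor-related mu rhythm over the two hemispheres; the sampling rate and decision rule are assumptions of this sketch, not the method of any particular project:

```python
# Extract the power of an EEG channel in the mu band (8-12 Hz) over left-
# and right-hemisphere electrodes, and map the imbalance to a command.
# A real BCI uses many channels, careful filtering and a trained classifier.

import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def band_power(signal: np.ndarray, low: float, high: float) -> float:
    """Power of the signal within [low, high] Hz, via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    return spectrum[(freqs >= low) & (freqs <= high)].sum()

def classify(left_channel: np.ndarray, right_channel: np.ndarray) -> str:
    """Imagined right-hand movement desynchronises the left motor cortex,
    lowering its mu power relative to the right hemisphere (and vice versa)."""
    mu_left = band_power(left_channel, 8, 12)
    mu_right = band_power(right_channel, 8, 12)
    return "move cursor right" if mu_left < mu_right else "move cursor left"
```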

The challenge of cerebral plasticity
The computer must therefore associate a command with a mental pattern. The first challenge is of a technological nature: the machine has to be taught to recognise and identify the signals produced by the brain. The other difficulty is human. Although, overall, the neurosciences associate active areas of the brain with certain mental processes, a person is incapable of identically reproducing the activation of the same neurons just by thinking the same thing. This plasticity of cerebral mechanisms means that the EEG of the same person imagining a movement of his right hand will be different every time. For José del R. Millán, coordinator of the European “Maia” project, “brain/computer interfaces certainly have the potential to offer a new means of communication with computers. However, they cannot currently be used outside of the laboratory, and have to be overseen by experts.” According to him, the main channels to be explored are the interaction and mutual adaptation of the two main protagonists in the system – man and machine. The machine has to learn to adapt, at all times, to the changing cerebral patterns of the operator.
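One simple way to picture such adaptation is a classifier whose stored “prototypes” drift towards each newly confirmed sample, so that a slightly different brain signal is still recognised tomorrow. The sketch below is an illustration of that idea, not the “Maia” algorithm:

```python
# Instead of freezing the classifier, each confirmed trial nudges the
# stored prototype of the mental pattern towards the newest observation,
# tracking the operator's drifting brain signals. Rate is illustrative.

import numpy as np

LEARNING_RATE = 0.1

class AdaptiveDecoder:
    def __init__(self, prototypes: dict[str, np.ndarray]):
        self.prototypes = prototypes  # one feature vector per command

    def classify(self, features: np.ndarray) -> str:
        """Pick the command whose prototype is nearest to the new sample."""
        return min(self.prototypes,
                   key=lambda c: np.linalg.norm(features - self.prototypes[c]))

    def adapt(self, command: str, features: np.ndarray):
        """After a trial the operator confirms, drift the prototype towards
        the sample so slightly changed EEG patterns remain recognisable."""
        p = self.prototypes[command]
        self.prototypes[command] = p + LEARNING_RATE * (features - p)
```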

Assisted tele-autonomy
But the interface cannot do everything. The autonomy of the elderly and the disabled is also at stake in other types of research, notably the telecommunications aspect of certain projects. Galileo, the future European counterpart to GPS, will offer a satellite positioning system capable of indicating a position to the nearest metre. Thanks to this accuracy and a detailed geographical information system, everyone will be able to ascertain their exact position and calculate the best way to get to the bus stop or the nearest pedestrian crossing. For the visually impaired, it will simply be a matter of creating a voice interface providing a complete “audio map” of the area they are in.
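The underlying query is easy to sketch: given a metre-accurate position, find the nearest point of interest by great-circle distance. The coordinates and place names below are made up for the example:

```python
# Nearest point of interest via the haversine great-circle distance,
# the kind of lookup an "audio map" service would perform.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6_371_000  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

BUS_STOPS = {"Rue Haute": (50.8370, 4.3470), "Grand Place": (50.8467, 4.3525)}

def nearest_stop(lat, lon):
    return min(BUS_STOPS, key=lambda s: haversine_m(lat, lon, *BUS_STOPS[s]))

# A voice interface would then read the result aloud to the user.
print(nearest_stop(50.8400, 4.3500))  # -> "Rue Haute"
```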

Another great challenge over the coming decades is home support for the elderly. Projects like “Healthservice24”, “Chronic” or “MyHeart” (cf. panel) propose vital-signs analysers that can be used in the home and which relay the logged information directly to the medical services via UMTS or GPRS (transmission protocols for mobile telephones). This information is processed immediately in healthcare centres and leads to action when necessary. Practitioners can then contact the patient, thanks to a video-conferencing system, via the Internet or digital television. It is therefore a complete infrastructure that is being studied: monitoring of patients in their homes, remote consultation, and storage of a person’s medical data allowing the doctor to access complete information on their patient.
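The monitoring loop itself can be pictured with a simple sketch: readings outside a personal baseline trigger a message to the healthcare centre. The thresholds and the transmission stub are illustrative assumptions, not any project’s specification:

```python
# A home sensor streams heart-rate readings; values far from a personal
# baseline trigger an alert to the healthcare centre.

import statistics

BASELINE_WINDOW = 60          # readings used to establish the baseline
ALERT_SIGMA = 3.0             # how far from baseline counts as an anomaly

def send_alert(message: str):
    """Stand-in for the UMTS/GPRS uplink to the healthcare centre."""
    print("ALERT ->", message)

def monitor(readings: list[float]):
    baseline = readings[:BASELINE_WINDOW]
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    for bpm in readings[BASELINE_WINDOW:]:
        if abs(bpm - mean) > ALERT_SIGMA * sd:
            send_alert(f"heart rate {bpm} bpm outside patient baseline")
```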

(1) Taken from the Greek haptein (to touch), the term haptic refers to the science of touch, by analogy with acoustics or optics.


TO FIND OUT MORE

• Grab
• Maia
• Cogain
• MyHeart
• Chronic

       
Images that can be touched

A project bringing together six teams from four countries (Spain, United Kingdom, Ireland, Italy), “Grab” (Computer GRaphics Access for Blind people through a haptic virtual environment) aims to offer the blind access to the graphic world of computers. The user slides their two index fingers into receptacles supported on articulated arms mounted on actuators, which simulate the resistance a three-dimensional image would offer if it were traced with the fingertips.

The equipment consists of this pair of articulated arms – the haptic (1) interface – a voice recognition and synthesis unit allowing for communication between man and machine, and the tactile geometry modelling software that drives the interface. This software is generic, which is to say it can be adapted to different types of haptic interface and can use any three-dimensional image coded with the standard protocols for exchanging information between CAD (computer-aided design) systems. Blind people have been able to try out this innovation by taking part in a treasure hunt in a building littered with booby-traps and by discovering the layout of a town on the basis of a reconstituted map.
Controlling a computer with a look

Study of eye movements. © Cogain/Päivi Majaranta
Sarah is confined to her chair. She is unable to move her legs or arms. Despite her disability, she works at her computer, designing web pages. She stares at the screen, a movement monitor pointed at her eyes detecting their slightest movement. Cameras and infrared beams constantly identify which part of the screen Sarah is looking at. Her eyes cause the pointer to move around the screen, which allows her to activate software menus and buttons with an accuracy of half a centimetre. All she has to do is blink or stare at the same point for a few seconds in order to click.
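This “click” logic, often called dwell selection, can be sketched in a few lines; the timing and tolerance values below are assumptions, not “Cogain” parameters:

```python
# If the gaze stays within a small radius of the same screen point long
# enough, the system emits a click.

import math

DWELL_SECONDS = 2.0   # how long the gaze must hold still
RADIUS_PX = 20.0      # how much jitter still counts as "the same point"

class DwellClicker:
    def __init__(self):
        self.anchor = None      # where the current dwell started
        self.elapsed = 0.0

    def update(self, gaze_x, gaze_y, dt) -> bool:
        """Feed one gaze sample; return True when a click should fire."""
        if self.anchor and math.dist(self.anchor, (gaze_x, gaze_y)) < RADIUS_PX:
            self.elapsed += dt
            if self.elapsed >= DWELL_SECONDS:
                self.anchor = None          # reset so we click only once
                return True
        else:
            self.anchor = (gaze_x, gaze_y)  # gaze moved: restart the timer
            self.elapsed = 0.0
        return False
```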

For typing text, “Cogain” is also developing interfaces specially adapted to visual control, which display sequences of letters separated by coloured areas. The user continuously moves the cursor on the screen, passing from one area to another and selecting the associated letter. The clever part lies in the spatial organisation of these coloured areas, whose juxtaposition has been studied to allow for the predictive completion of words. An area for the letter “Y” is, for example, alongside another reserved for the letter pair OU, which is itself alongside that for R – all of which allows the word “Your” to be written in a single eye movement. An experienced user can write up to 25 words a minute, just under half of what an average user sitting at a keyboard can write.
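The predictive side can be pictured with a toy prefix model; the vocabulary below is a made-up stand-in for the frequency-ranked lexicon a real system would use:

```python
# After each letter selected by gaze, the interface narrows a word list by
# the typed prefix and offers likely next letters, so frequent words need
# fewer eye movements.

WORDS = ["your", "you", "young", "yard", "rain", "round"]

def next_letters(prefix: str) -> set[str]:
    """Letters that can follow the prefix in some known word."""
    return {w[len(prefix)] for w in WORDS
            if w.startswith(prefix) and len(w) > len(prefix)}

def completions(prefix: str) -> list[str]:
    return [w for w in WORDS if w.startswith(prefix)]

print(next_letters("yo"))    # -> {'u'}: only one area needs to be offered
print(completions("you"))    # -> ['your', 'you', 'young']
```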

MyHeart: preventing cardio-vascular illnesses

Coronary disease, arrhythmia, heart attacks… Around 20% of Europeans suffer from cardio-vascular illnesses and 45% die from them. For “MyHeart”, a consortium of some forty European partners led by Philips, these figures could be reduced by better prevention and personal monitoring. The plan? To design a garment fitted with multiple bio-detectors and electronics that analyse the cardio-vascular condition of the person wearing it, provide them with information and also communicate, via the mobile telephone networks, with specialist medical centres. The garment is thus able to identify lack of physical exercise, excess weight, sleeping disorders or excess stress. If it detects an anomaly such as an ischaemic incident (an interruption of the blood supply), atrial fibrillation (anarchic contraction of the atria) or any other sign of a possible cardio-vascular incident, it contacts the appropriate healthcare centre directly. Although the manufacture of an intelligent garment fitted with biophysical sensors represents a significant technological achievement, one may nevertheless wonder whether most patients will agree to being permanently monitored by an electronic arsenal.