Authors: Roberto Viola, Director-General, DG Communications Networks, Content and Technology, and Robert-Jan Smits, Director-General, DG Research and Innovation
We think of the Internet as connecting people to content or to each other via web pages or web chats. But this year, nearly 5 billion things will be connected to the Internet; by 2020, that number will reach 25 billion, with the volume of annual global Internet traffic exceeding the equivalent of about 500 billion DVDs (2 zettabytes). Only powerful supercomputers, able to perform massive and rapid computations, can cope with this ever-increasing amount of data.
High performance computing in our lives
What is a supercomputer? What is high performance computing, or HPC? It is not simply a very powerful or large PC (although today's PCs are more powerful than the giant machines of the past). A supercomputer is composed of thousands of processors working in parallel. It responds to our growing need to process vast quantities of data in real time, with quality and accuracy. Our answer to some of the unresolved challenges of our society lies in the ability to make sense of huge amounts of information coming from many sources. HPC allows us to design and simulate the effects of new drugs, provide faster diagnosis and better treatments, control epidemics, and support decision-making in many areas, such as electricity and water distribution or urban planning. The applications in technology and engineering are countless, from more energy-efficient buildings to better and safer cars and planes.
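The core idea of parallel processing described above can be illustrated at toy scale: a large task is split into chunks, each chunk is handled by a separate processor, and the partial results are combined. This minimal Python sketch (an illustration only, not how a real supercomputer is programmed) sums a large list across four worker processes:

```python
from multiprocessing import Pool

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Split the work into 4 chunks, one per worker process.
    chunks = [data[i::4] for i in range(4)]
    # Each process sums its chunk independently, in parallel;
    # the partial results are then combined into the final answer.
    with Pool(processes=4) as pool:
        partial_sums = pool.map(sum, chunks)
    print(sum(partial_sums) == sum(data))  # True
```

A real HPC system applies the same divide-and-combine principle, but across thousands of processors connected by high-speed interconnects, typically using frameworks such as MPI rather than a single machine's process pool.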
What does this mean in practice?
A more competitive industry
Supercomputing is an essential part of the digitisation of industry. We could never have designed the world-beating Airbus A380 without HPC. Thanks to HPC-based simulation, the car industry has reduced the time needed to develop new vehicle platforms from 60 months to 24, while greatly improving safety and comfort. To explore the universe, we need supercomputers. But the advantages of supercomputing are not confined to large research centres. Thanks to the cloud and to networking, every research centre, and even an innovative start-up, can now access supercomputing as a service.
Direct benefits to our health
Supercomputers can detect the genetic changes responsible for the onset and mutation of tumours in a simple, quick and precise way, within a few hours. For newborns affected by genetic disorders - the main cause of infant deaths in our modern societies - time is of the essence, as they do not show all of the classic disease symptoms that make diagnosis possible. In one case, one day of supercomputer time was all that was required to analyse 120 billion nucleotide sequences, narrowing down the cause of a baby's illness to two genetic variants. Thanks to this, effective treatment was possible, and the baby is alive and well five years later.
Severe weather cost 150,000 lives and caused €270 billion in economic damage in Europe between 1970 and 2012. The more powerful the supercomputer, the more precisely and the further in advance climate scientists can predict the size and paths of storms and floods, and help activate early warning systems. For example, HPC helped climate scientists predict the size and path of the St. Jude's Day storm in October 2013 four days before it formed, making it possible to take preventive measures and reduce the damage.
Enabling further scientific advances
HPC has revolutionised the way science is performed. Supercomputing is needed for processing sophisticated computational models able to simulate the cellular structure and functionalities of the brain. This should enable us to better understand how our brain works and how we can cope with diseases such as those linked to ageing.
These scientific advances are in turn pushing the frontiers of ICT research: the new data-driven science requires more data capacity and computing power, as well as an "open" environment where researchers can easily access and use this wealth of data and computing resources. This is the new reality of "Open Science", and it is why the Commission is working to establish a "Science Cloud" as part of our Digital Single Market initiative.
More reliable decision-making
The world faces a growing number of challenges, at the local level as well as on a planetary scale. The convergence of HPC, Big Data and Cloud technologies will enable new applications and services in an increasingly complex landscape where decision-making has to be fast and precise to avoid catastrophes. Supercomputers are on the front line for developing essential public policies, from homeland security to climate action. This is why HPC has become a national priority in the USA, Japan and China.
High performance computing in Europe
Last year, the European Commission set up a Public-Private Partnership with the European Technology Platform on HPC to develop HPC technologies and applications. The EU committed €700 million from the Horizon 2020 programme, with the private sector providing matching funds. The first investments have so far resulted in the establishment of eight Centres of Excellence in HPC applications. We are also investing in quantum technologies, which might soon become the new frontier for supercomputers. The first results are very encouraging.
However, the EU has only one supercomputer in the world's top 10. A recent study suggests that the public and private investment needed for Europe to achieve a leadership role in HPC amounts to around €5 billion over the next seven years.
The study we are publishing shows that Europe's HPC capabilities have improved, but much more is needed. The exponential growth of data, networking and computing can become a driver of positive change in our society, of advances in our scientific capacity, and of productivity gains across the economy. For that, we need to plan our investments in a coordinated way on a continental scale. No Member State alone can close the gap between Europe and the front-runners. No Member State alone can aggregate the demand of the scientific community.
Bringing big demand together with big capacity could provide a unique opportunity to create a European data ecosystem that can benefit our scientific community, our industry and society at large.