Today I gave this speech to the European Parliament's Committee on Culture and Education. I explained what the EU is doing in response to the threats posed by disinformation and fake news, especially in the context of the forthcoming European elections.

Tackling online disinformation is a major challenge for Europe and one of the Commission’s highest priorities, particularly ahead of the European elections, which will be decisive for the future of Europe.

The internet has not only increased the volume and variety of news available to us but has also profoundly changed the ways we access and engage with news. Younger users, in particular, now turn to online media as their main source of information.

New technologies, notably social media, can be used to disseminate disinformation at a scale and speed that are unprecedented. They can create personalised information spheres and become powerful echo chambers for disinformation campaigns.

The disinformation circulating on the major platforms can polarise public debate, create tensions in society, and ultimately undermine our electoral systems.

In December, the Commission and the High Representative adopted an Action Plan for a coordinated EU response to the challenge of disinformation. The Action Plan builds on a Communication on tackling online disinformation, which the Commission adopted last April.

 

What are we doing in view of the upcoming elections?

At the moment, our efforts are mainly dedicated to an intensive targeted monitoring of the implementation of the Code of Practice on disinformation, to ensure that policies with particular pertinence to electoral processes are in place ahead of the European elections.

With this Code of Practice, industry has agreed, for the first time worldwide, on a set of self-regulatory standards to fight disinformation on a voluntary basis. The Code is the result of broad and intense discussions with the platforms and a concrete outcome of the Commission’s Communication of April 2018.

Online platforms have committed to acting swiftly in view of the elections to effectively protect users from disinformation. In particular, Google, Facebook and Twitter are changing their services in order to:

  1. Improve scrutiny of ad placements and disrupt the advertising revenues of accounts and websites that spread disinformation;
  2. Make political and issue-based advertising more transparent;
  3. Address the issue of fake accounts and online bots.

The latest reports by the platforms, published in mid-March, present encouraging signs of progress, in particular as regards the transparency and public disclosure of online political advertising, which has been ensured in all Member States with the deployment of political ads repositories in a very short time frame.

Furthermore, although the picture varies significantly by platform, the reports highlight progress in other areas, including measures to improve the scrutiny of ad placements and actions against fake or malicious automated accounts.

However, we still need to see improved measures to ensure compliance with the Code, including indicators for tracking progress. While the latest reports suggest that monitoring to date is yielding positive results, the Commission, in cooperation with national audio-visual regulators, will continue to monitor compliance at national level in the run-up to the elections.

Other commitments of the Code of Practice are related to:

  • Empowering consumers to report disinformation and access different news sources, while giving prominence to authoritative content;
  • Empowering the research community to monitor the spread and impact of online disinformation.

Access to platform data by academic researchers is key to enabling the necessary analysis of the disinformation phenomenon, as well as a robust assessment of the effectiveness of the actions taken by the platforms.

In view of the upcoming elections but also with a longer-term perspective, the High Representative, in cooperation with the Member States, is strengthening the Strategic Communication Task Forces and Union Delegations through additional staff and new tools, which are necessary to detect, analyse and expose disinformation activities.

Furthermore, the Commission has set up a European election cooperation network, which has already discussed progress in implementing the Commission's elections package, and very recently, in collaboration with the High Representative and the Member States, activated the Rapid Alert System (RAS).

The RAS allows the sharing of relevant instances of disinformation and the detection of malicious cross-border activities and thereby enables common situational awareness, coordinated attribution and response. The outcome of its work will be shared with the European election cooperation network, in particular to exchange information on threats relevant to elections and support the possible application of sanctions.

In particular, the RAS is looking at disinformation threatening democratic processes and at cases of foreign influence operations. It will allow better and more effective sharing of information and best practices between Member States and European authorities, fostering real-time responses.

Furthermore, to test preparedness against cybersecurity incidents affecting the European elections, the Commission, together with the EU Cybersecurity Agency (ENISA), the European Parliament and the Member States, is organising an EU-level table-top exercise on 5 April 2019. This cyber-resilience exercise is aimed at improving coordination between Member States and enhancing cooperation between cybersecurity authorities and election authorities at national level.

As regards support for the creation of a European network of independent fact-checkers in view of the European elections, the Commission is providing, through the Horizon 2020 support action SOMA (Social Observatory for Disinformation and Social Media Analysis), a platform for independent fact-checkers and researchers that features access to public data and collaborative tools. On 20 March 2019, SOMA held a workshop that attracted more than 80 participants, including 29 fact-checking organisations from 18 European countries.

This platform includes integrated access to several Commission and European Parliament services. From the platform, fact-checkers can query the Eurostat database, the European open data portal and European Parliament (EP) databases (EP news hub, EP Multimedia, EP Documents). Moreover, a direct and dedicated communication module with the Commission's communication services has been built in, and a similar module connecting to the communication services of the EP is in the implementation phase.

We expect this platform to provide the technical means to increase the capacity to detect and analyse disinformation campaigns all over Europe.

The action of the Parliament and the Commission has increased attention on the important role of fact-checking in modern media ecosystems. This has led to the creation of new fact-checking initiatives in Europe, such as that of the International Fact-Checking Network, announced two weeks ago: it has created a European branch of independent fact-checkers and launched a website that will focus on fact-checking in the context of the European elections.

Another key element of our approach is the promotion of media literacy. It is important for Europeans of all ages to acquire media literacy skills to enable them to take part in society and contribute to economic and social progress in the digital era.

Media literacy also enables citizens to evaluate the credibility of information they encounter online and to access alternative points of view. In the long run, media literacy initiatives are our best instrument to counter the malicious effects of online disinformation. As you may know, a European Media Literacy Week took place two weeks ago (18-22 March), with a series of events and initiatives at both EU and national level. The focus of the week was disinformation and how to counter it in view of the 2019 European elections.

During the discussions at the conference, there was overwhelming agreement that media literacy is an essential capability in the rapidly evolving 21st-century media landscape. It will ensure that our citizens can make smart media choices and informed decisions based on pluralistic, quality information.

The revised Audiovisual Media Services Directive (AVMSD), adopted in November 2018, is a game-changer in two respects.

  • Firstly, it recognises the importance of media literacy and introduces a legal obligation for Member States to promote, and report on, media literacy initiatives.
  • Secondly, the revised AVMSD also includes an obligation on video-sharing platform providers, under Member States’ jurisdictions, to take appropriate measures in order to protect the public from incitement to violence or hatred. In doing so, it provides for a regulatory backstop in respect of certain cases of disinformation campaigns aiming at propagating hate and violence.

 

What’s next?

We intend to continue reinforcing and expanding our actions. In particular:

  • We will continue monitoring the Code of Practice before and after the European elections. This will lead to a comprehensive review of the Code's implementation by the end of 2019.
  • To further increase the capacity to detect and analyse disinformation campaigns, the Commission will deploy, under the Connecting Europe Facility work programme 2019, a new digital service infrastructure for the establishment of a European Platform on Disinformation. This infrastructure should scale up collaboration at EU level to ensure full coverage of the Union's territory, and facilitate the build-up and interconnection of relevant national organisations, including national multidisciplinary teams of independent fact-checkers and researchers, which should be supported by Member States as requested in the Action Plan.
  • Furthermore, we will continue investing, through Horizon 2020 and the new Framework Programme, in the use of new technologies such as AI to create tools that can assist human decisions on the veracity of online content. This is very much in line with our human-centric approach to the Next Generation Internet and with ensuring the protection of freedom of expression.

 

Conclusion

I will conclude by underlining that there is no silver bullet against disinformation: no single solution can address all the challenges it raises.

We believe that inclusive solutions should guide our action against disinformation, and this requires the active involvement of all stakeholders: national and EU institutions, as well as civil society, online platforms, advertisers, journalists, media groups and fact-checkers.

With the European elections of May 2019 just ahead, the challenges are enormous, but we are working hard to guarantee a transparent, open debate that will enable citizens to choose a strong Europe in which our democratic values are and will remain protected.