• First of all, I would like to thank EU DisinfoLab for organising this important event with such an impressive line-up of speakers.
  • This is a testament to your work and to the important role you play in addressing disinformation.
  • I can only confirm that disinformation and foreign interference are constantly climbing up the list of political priorities of European politicians and institutions.
  • I remember when we started this journey several years ago. I remember the doubts and hesitation about getting involved in this issue at all, because it was seen as too complicated, too sensitive, too confrontational.
  • I remember the avalanche of criticism that fell on me every time I mentioned those states that use disinformation as a foreign policy tool.
  • Ladies and gentlemen, the whole disinformation community can be proud of finding a way to bring this issue to the top of the political agenda, and of providing us with the research and information so necessary to be able to talk about it and to try to design an appropriate policy response.
  • However, there is no room for complacency. We still have a long way ahead of us to design effective solutions to the threat of disinformation and online manipulation, but I am optimistic that by the end of this mandate of this Commission, by 2023, we will have an unprecedented and unique European response to this, which will also fully respect our fundamental rights.
  • Today, I want to talk to you about the next steps we are preparing and rolling out, and I want to share with you the key dilemmas we face when designing a policy response to everything you have discussed today.
  • Let me start with the general notion that in my view, we still don’t know enough about how all the information disruptions work. And we need a common understanding of the issue.
  • Disinformation, as the excellent research of many of the participants yesterday and today shows, is rarely about big-bang events. It is a dynamic process, constantly evolving. It is also a patient process, slowly injecting poison into our democratic information space and abusing our democracy where it hurts us most, by attacking our sacred freedom of speech.
  • So, it is clear that our policy responses have to try to take into account what we don’t know yet.
  • Let me start, though, by stating the obvious.
  • The digital revolution has a lot of positive elements. It brought new opportunities for our democracies, and during the COVID-19 pandemic, digital technologies allowed us to stay in touch with each other and to keep working.
  • Technology is neither good nor bad. It is a tool that can be used for either purpose.
  • Disinformation can be used to divide the public in political debates, to manipulate participation in elections, and to attack the legitimacy of our institutions.
  • The coronavirus pandemic is a stark example: information manipulation and interference spread rapidly across the online space and social media platforms and affected our ability to respond to the global health crisis in an effective manner.
  • This brings me to dilemma number one, namely the debate about true vs. false content.
  • For me this is actually a false question, and I know that as a community you have largely moved away from it. Quite rightly so. This is not and should not be a debate about true vs. false, right vs. wrong information. Freedom of expression should always be protected, even if this means the freedom to say stupid things.
  • The debate today is rather about the rights and obligations of those shaping and participating in the debate online.
  • Above all, if we want to ensure that people are free to choose, we need to make sure that the information they see online is not fuelled by the obscure functioning of platforms’ algorithmic systems and an army of undetected bots.
  • Currently, it is primarily up to the platforms to enforce their terms and conditions. There are no rules on what these conditions should be and no effective democratic oversight. Past events and revelations, as well as today’s discussion, illustrate how inappropriate such a gap is.
  • The second dilemma is whether to regulate or not.
  • My answer is that we have to do both. The time of self-regulatory tools alone is over. At the same time, I also don’t believe we will be able to regulate everything; nor should we.
  • This is why we proposed the European Democracy Action Plan (EDAP). It offers a comprehensive approach to the challenges facing democracy, protecting the meaningful participation of citizens, addressing disinformation, and setting out targeted actions to preserve open democratic debate. It combines regulatory and non-regulatory actions.
  • Only if we strengthen the EU’s preparedness at all levels will we be able to considerably raise the costs for actors like Russia, who aim to destabilise our democracies and our societies and who threaten our security.
  • We are committed to working with international partners and organisations as well as with stakeholders like you here today.
  • Public authorities need to recognise the role of the community and support it. For our part, besides the policy responses such as effective data disclosure for research, I would mention
    • that the Citizens, Equality, Rights and Values programme could contribute to these activities,
    • that we will increase funding for media literacy actions under the cross-sectoral strand of the Creative Europe Programme, and also
    • that we expect EDMO to start interconnecting research hubs that bring together and leverage academic researchers, fact-checkers, media practitioners and other stakeholders.

Code of Practice on Disinformation

  • Let me now turn to an important part of the puzzle of our efforts on disinformation brought by the EDAP - the work on the Code of Practice on Disinformation.
  • The Code was a first-of-its-kind, bringing together industry to commit to voluntary standards to counter disinformation.
  • The Code was a good first step, but it was not enough to change the landscape in a sufficient way, with inconsistent implementation across platforms and Member States, gaps in coverage, and limitations inherent in the Code’s self-regulatory nature.
  • This is why I want to overhaul it. In May, we published our vision of how it should be reinforced.   
  • A stronger Code should promote cooperation with fact-checkers and researchers across EU countries in all official languages, and empower users to better control what they see online.
  • Disinformation is also a business, a profitable one. Around 200 million EUR in ad revenue is reportedly flowing to disinformation sites each year, and we want the new Code to help reduce financial incentives.
  • The Commission has called for participation of new signatories to strengthen the Code’s impact. I am pleased that a large number of other companies and non-governmental organisations – among them for instance Avaaz, DoubleVerify and Twitch – have now joined the work on a new Code. We continue to encourage all organisations that can contribute to a more trustworthy and resilient online space to join the initiative.
  • We also want to offer the opportunity for stakeholders to be involved in the process and provide their views.  
  • But what I want to stress is that if the Code is to be the predictable way for companies to mitigate the risks posed by disinformation, it must be strong and respect our guidance from May.
  • I continue to believe that industry and civil society are well placed to find answers to such remaining pertinent and pressing questions. However, if the revised Code does not provide the necessary leap forward, I am certain that Member States will take matters into their own hands, risking a fracturing of the digital single market.
  • We need strong progress and we need it fast.
  • What is crucial to realise here is that the Code of Practice should evolve towards a co-regulatory instrument as foreseen in the Digital Services Act. Very large platforms will benefit from participating in the strengthened Code as it will help them prepare for new requirements applicable to them under the DSA.

DSA and disinformation

  • This brings me to what you just discussed - the DSA itself.
  • The protection of fundamental rights is at the core of it.
  • We clarify, through accountable legal norms, what precise responsibilities online intermediaries – and platforms in particular – have, both in tackling illegal content and in protecting freedom of expression online.
  • We introduce robust safeguards to make sure that legitimate content, such as the editorial content of media services, stays online and remains available to citizens. We will make sure that those safeguards are applied.
  • And we will require more from the very large online platforms, which have become ‘public spaces’, on the organisation and design of their systems, in full respect of freedom of expression. In particular, we want to give citizens more power to understand and interact with information and to contest the decisions of the platforms.
  • I am pleased to see that the co-legislators are advancing well with their work on our proposal, and the great commitment to deliver a strong DSA.
  • Moving forward, we need to be careful and avoid changes that could have unintended consequences and that could undermine our common effort to protect citizens from disinformation.
  • In particular, as you have discussed here, I mean the proposal to prohibit online platforms from moderating editorial content.
  • I will come back to the media in a minute, but before that let me mention that I will also propose one more piece of legislation – on political advertising online. Today, digital advertising for political purposes is an unchecked race of dirty and opaque methods. Trying to influence elections or public debate must be subject to tighter rules and meaningful transparency, also when it comes to targeting methods. We have to hit the ‘slow down’ button, because our democracy is too precious.
  • The third dilemma is the role of media.
  • Addressing disinformation is much more than just fact-checking. It is about the information ecosystem as a whole.
  • It is also in this context that strong and diverse media mean a strong democracy. No doubt about it.
  • We have to recognise that digitalisation has changed the landscape of the media and there is no way back.
  • Today, everyone can set up a YouTube channel, a podcast or a website. Some influencers are more influential than traditional media. We see more opinions than facts online today.
  • I am talking about this because today, when we talk about the media, we have to capture this complexity. The discussions on the DSA are just a recent example of this.
  • And independent media are crucial in the fight against disinformation, information manipulation and interference.
  • This is why we need to do more to protect them.
  • One important step is the recently adopted recommendations to Member States on the protection of journalists in the EU. They send a clear signal that the safety of journalists and media professionals is of utmost importance for us.
  • The same is true for our external policies and actions. We will cooperate closely with international organisations such as the Council of Europe, the UN and the OSCE to protect and support journalists and independent media. 
  • In addition to all this, the Commission is working on a new tool to ensure that media companies can operate across the internal market without political interference. This is the Media Freedom Act.
  • The fourth and final dilemma is how to address foreign disinformation.
  • We must not remain defenceless. There are actors, including state actors, that are weaponising information; for them, disinformation is a tool to try to influence what is happening in Europe.
  • This is why we are working on an updated toolbox to impose costs on the perpetrators of foreign disinformation and information manipulation.
  • We clearly need more coordinated approaches among EU Member States in this area.
  • And among our many partners, the US has valuable experience and capabilities that contribute to our shared understanding of the threat and enable us to discuss the best ways to respond, in full respect of our fundamental rights and freedoms.
  • This is true whether we cooperate with the US bilaterally, within international networks such as the G7 Rapid Response Mechanism, or now within the EU-US Trade and Technology Council.

Conclusion

  • We have talked about an array of measures here. There is one thing we must be very cautious about: allowing any single entity to seize control over the online space. There must not be any centralised authority equipped to push through its own information agenda. I experienced this under Communism. That is why I want all these initiatives to be transparent and to include safeguards and multiple checkpoints, so that our democracies stay protected.
  • To sum up, as the threat is complex and multifaceted, we need a whole-of-society approach in which all stakeholders, including civil society and private industry, are closely involved and advance the response to the threat using the instruments at their disposal.
  • We support and appreciate the work that organisations like yours are doing to inform us, expose information manipulation and interference, raise awareness about the threat, and gather and support civil society. That is why we are committed to working with you even more closely.
  • Thank you.