A safer online environment

Today, online platforms can be misused to spread illegal content such as hate speech, terrorist content or child sexual abuse material, to sell dangerous goods and counterfeit products, or to offer illegal services, exposing citizens to harm.

61% of surveyed EU citizens say they have come across illegal content online, and 65% say they do not think the Internet is safe for use (2018 Eurobarometer survey)


What the new Digital Services Act changes:

  • Easy and clear ways to report illegal content, goods or services on online platforms
  • Due diligence obligations for platforms and stronger obligations for very large platforms, where the most serious harms occur
  • Authorities will be better equipped to protect citizens by supervising platforms and enforcing rules together across the Union

Better protected consumers

Today, the fundamental rights of European citizens are not adequately protected online. Platforms can, for example, decide to delete users’ content without informing the user or offering any possibility of redress. This has serious implications for users’ freedom of speech.

92% of respondents believe transparency from service providers is important to protect users’ freedom of expression (Open public consultation on the Digital Services Act)

What the new Digital Services Act changes:

  • Users are informed about, and can contest, the removal of their content by platforms
  • Users will have access to dispute resolution mechanisms in their own country
  • Transparent terms and conditions for platforms
  • Greater safety and better knowledge of the real sellers of the products users buy
  • Stronger obligations for very large online platforms to assess and mitigate the risks that the overall design of their services poses to users’ rights, where restrictions of rights and the viral spread of illegal or harmful content have the greatest impact
  • Access to platform data for vetted researchers to understand risks to society and fundamental rights

Empowered citizens and users

Today, platforms optimise the presentation of information to capture attention and drive revenue, but their users are often unaware of how these systems sort content or how platforms profile them. The manipulation of recommender systems and the abuse of advertising systems can fuel dangerous disinformation and the propagation of illegal content.

70% of respondents believe disinformation is spread by manipulating algorithmic processes on online platforms (Open public consultation on the Digital Services Act)

What the new Digital Services Act changes:

  • Transparency of the rules for content moderation
  • Meaningful information about advertising and targeted ads: who sponsored the ad, how and why it targets a user
  • Clear information on why content is recommended to users
  • Users' right to opt out of content recommendations based on profiling
  • Platforms' participation in codes of practice as a measure to mitigate their risks
  • Better access to data for authorities and researchers to understand online virality and its impact, with a view to lowering societal risks

Quality digital services at lower prices

The systemic role of a few online platforms affects the lives of billions of users and millions of companies in Europe. Some companies have a major impact on digital markets, control access to them, and are firmly entrenched in them. They can impose unfair take-it-or-leave-it conditions on both their business users and consumers.

60% of respondents say that consumers don’t have sufficient choices and alternatives regarding online platforms (Open public consultation on a New Competition Tool)

What the new Digital Markets Act changes:

  • A ban on unfair practices, opening up the possibility for business users to offer consumers more choice of innovative services
  • Better interoperability with services that are alternatives to those of gatekeepers
  • Easier ways for consumers to switch platforms if they so wish
  • Better services and lower prices for consumers