Notification Detail

Draft Federal Act on measures to protect users on communication platforms (Communication Platforms Act)

Notification Number: 2020/544/A (Austria)
Date received: 01/09/2020
End of Standstill: 02/12/2020

Issue of comments by: Commission, Sweden

Message 002

Communication from the Commission - TRIS/(2020) 03207
Directive (EU) 2015/1535
Translation of message 001
Notification: 2020/0544/A

Does not open the delays.

(MSG: 202003207.EN)

1. Structured Information Line
MSG 002 IND 2020 0544 A EN 02-09-2020 A NOTIF

2. Member State
Austria

3. Department Responsible
Bundesministerium für Digitalisierung und Wirtschaftsstandort
Abteilung III/8
A-1010 Wien, Stubenring 1
Telephone: +43-1/71100 805210
Fax: +43-1/71100

3. Originating Department
Abteilung V/3
Ballhausplatz 1, 1010 Wien
Telephone: +43 1 531 15-20 23 88
Fax: +43 1 531 15-20 42 85

4. Notification Number
2020/0544/A - SERV60

5. Title
Draft Federal Act on measures to protect users on communication platforms (Communication Platforms Act)

6. Products Concerned
Communication platform providers

Communication platforms within the meaning of this Act are information society services whose main purpose or essential function is to enable the exchange of messages or presentations with intellectual content in words, writing, sound or images between users or with a larger group of people by means of mass distribution. Communication platform providers are not affected by the obligations of this Federal Act if the number of registered users does not exceed 100 000 and if the turnover generated by operating the communication platform does not exceed EUR 500 000. In addition, platform providers for the mediation of goods and services, non-profit online encyclopaedias and media companies that provide communication platforms in direct connection with their offers of journalistic content are explicitly excluded from the obligations of this Act.

7. Notification Under Another Act

8. Main Content
The Federal Act on measures to protect users on communication platforms (Communication Platforms Act) imposes organisational obligations on large communication platforms for the effective and transparent handling of certain illegal content. Illegal content within the meaning of this Act is content which constitutes one of the criminal offences specified in § 2(6) and which is not justified.

The draft provides the following obligations for providers:

An effective and transparent procedure for reporting illegal content must be maintained on the platforms. This includes ensuring that:

  • users can report content easily at any time;

  • content is quickly checked and, if necessary, blocked or deleted (obviously illegal, i.e. criminal, content within 24 hours; other illegal, i.e. criminal, content within seven days);

  • affected users are informed of the platform's decision on deletion or blocking; and

  • deleted or blocked content, as well as the data required to identify the author, is saved for ten weeks for the purpose of evidence, including for purposes of criminal prosecution (see § 3(1)-(3)).

In addition, a review procedure must be provided, whereby the user making a report as well as the user whose content has been blocked or deleted can initiate a review of the decision concerning the blocking or deletion (or the absence thereof) by the platform (see § 3(4)).

Communication platform providers must describe their handling of reports concerning illegal content in an annual report, or a quarterly report in the case of communication platforms with more than one million registered users (see § 4).

Providers must appoint a responsible representative in order to ensure accountability and reachability (including a postal address) (see § 5).

The supervisory authority will impose a fine on a provider, depending on the severity of the violation of the Act, if one of the obligations created by this Act has been violated in a systematic manner (see § 10). However, the draft provides that an improvement order must first be issued before a procedure for imposing a fine is initiated (see § 9). A fine can be imposed on a responsible representative if they do not ensure that they can be reached or do not take the expected care to ensure that the obligation to set up a reporting and review process and the obligation to create reports are fulfilled.

9. Brief Statement of Grounds
The main impetus for this draft Act is the worrying development that the Internet and social media, in addition to the advantages these new technologies and communication channels provide, have also given rise to a new form of violence: hate on the Internet is increasing in the form of insults, humiliation, false information and even threats of violence and death. The attacks are predominantly based on racist, xenophobic, misogynistic and homophobic motives. A comprehensive strategy and a set of measures are required that range from prevention to sanctions. This strategy is based on the two pillars of platform responsibility and victim protection, with the present draft Act relating to ensuring platform responsibility.

The existing obligation to immediately delete or block access to illegal content is often not satisfactorily met by communication platform providers once they become aware of such content. In addition, content reported by users is generally only checked by the platforms against their own community guidelines and not against national criminal offences. Those affected are therefore often forced to take legal action in order to obtain a deletion. In the light of this, it is important to make communication platforms much more responsible than hitherto. As this is a cross-border challenge, effective regulation at the European level is the best solution. In the Resolution of the Council of Ministers of 9 July 2020, the Federal Government therefore welcomed the submission of a Digital Services Act announced by the European Commission for the end of the year. Since this ongoing consultation process and in particular the corresponding legislative procedure at the European level will take quite some time to complete, it is necessary - on the basis of the experience of the German and French legislative initiatives - to take legal measures as soon as possible to achieve more transparency, responsibility and accountability of the platforms.

The urgency of the issue requires the implementation of immediate national measures. Until the regulatory deficit has been remedied at the European level, an act on measures to protect users on communication platforms is to be created to effectively combat hate on the Internet, in order to remedy the situation by means of a legal obligation for platforms to set up a complaint management system for handling illegal content. In addition, an obligation is provided to appoint a responsible representative, in order to ensure accountability (including a postal address). In order to increase the information base regarding the activities of the platforms in this sensitive area and to be able to evaluate the measures, the draft also contains the obligation to submit a regular report on the handling of illegal content. Less restrictive measures would be less effective with regard to the current risk situation, the desired level of protection against criminal content on communication platforms and the implementation of supervision of compliance with the requirements. The measures are based on the example of other Member States (Germany, France), for which the European Commission did not raise any formal objections that would have led to the extension of the standstill period.

10. Reference Documents - Basic Texts
No basic text(s) available

11. Invocation of the Emergency Procedure

12. Grounds for the Emergency

13. Confidentiality

14. Fiscal measures

15. Impact assessment

16. TBT and SPS aspects
TBT aspect

No - the draft is neither a technical regulation nor a conformity assessment procedure.

SPS aspect

No - the draft is neither a sanitary nor a phytosanitary measure.

European Commission

Contact point Directive (EU) 2015/1535
Fax: +32 229 98043

Stakeholders Contributions

Due to the end of standstill, no further contributions for this notification are being accepted.

  European Digital Rights on 09-11-2020

European Digital Rights (EDRi) is an association representing 44 human rights organisations from across Europe that defend rights and freedoms in the digital environment. This submission has been developed with the contributions of our members, Access Now and Article 19.


On 3 September 2020, the Austrian government released a legislative package to tackle online hate speech. Besides a comprehensive justice reform, the package also contains a bill that creates new obligations for online platforms to remove potentially illegal user-generated content (the so-called Kommunikationsplattformen-Gesetz, or KoPlG for short). On 1 September 2020, Austria notified the draft law to the European Commission in accordance with Directive (EU) 2015/1535.

EDRi strongly advises the European Commission to postpone the Austrian draft KoPlG for the following reasons:

  • The draft legislation would seriously hinder the fundamental right to freedom of expression and opinion by creating chilling effects, and would limit the right to conduct business for SMEs;

  • It de facto places the responsibility for enforcing the law in the hands of the platforms within scope, although they have neither the necessary knowledge nor the ability to do so;

  • Its scope is disproportionate and may affect community-led, not-for-profit and small service providers in an unequal and disproportionate manner in contrast to Big Tech companies;

  • There is no evidence substantiating the claim that the proposed rules would be an effective and proportionate remedy to deal with the problem of online illegal hate speech in Austria;

  • The penalties foreseen in the draft legislation are disproportionate and will certainly lead the platforms to stay on the safe side and thus potentially overblock legitimate content in order to escape the threat of disproportionate fines;

  • The Commission should prevent the introduction of national measures that would compromise the adoption of the future Digital Services Act package by the European Parliament and the Council in the same field and thus prevent a harmonised legislative landscape across the EU.

    Please consult our full contribution attached.

  EuroISPA (European Internet Service Providers Association) on 15-10-2020

To whom it may concern,

Please find attached EuroISPA's points of critique on the Austrian Federal Act on measures to protect users on communication platforms. 

Kind regards, 

Mauro Sanna

Policy Executive
EuroISPA - European Internet Services Providers Association
Rue de la Loi 38 - 1000 Brussels
T : +32 (0) 289 665 83

M: +32 (0)491 258 232

Follow us on Twitter @euroispa


EuroISPA is the world's largest association of Internet Services Providers, representing over 2500 ISPs across Europe.

EU Transparency Register ID Number: 5443781311

  Wikimedia on 06-10-2020

Dear European Commission,

Wikimedia is a global network of associations, project communities and a foundation that has dedicated itself to making knowledge free and accessible globally. Our most popular projects are Wikipedia, Wikidata (a free database for structured data that can be edited and read by machines and humans) and Wikimedia Commons (a multimedia archive). All three projects are open, free, self-governing and the largest in their respective category, successfully competing with commercial, for-profit projects. They belong to the world and can thus be considered as the online equivalent of "public spaces" or the "commons".

We have been and continue to be engaged, together with our Austrian chapter Wikimedia Österreich, in the national debates. As this is a legislative project that will have consequences way beyond the Austrian borders, we take the liberty to also engage in the EU-level consultation.

We would highly recommend that the Austrian legislator postpone its plans until after the European Commission has presented its own legislative package, known as the Digital Services Act. The reasons for this are manifold:

1. We are doing our utmost to keep up with all laws and legislation across Europe and, indeed, the world. But it is simply impossible. If we want to make space for alternative platforms to compete with gatekeepers, we need clear rules across Europe. Otherwise we help entrench very large gatekeeper platforms, which can simply pay to carry the legal risk.

2. As opposed to the carve-outs of the Copyright in the Digital Single Market Directive, the proposed Austrian national law, as it currently stands, does not exclude all not-for-profit Wikimedia projects, which by and large follow the rules. Both Wikimedia Commons and Wikidata remain within scope, although they are not meant to be targeted by the legislator. Again, this creates at least partial contradictions between EU-level and national principles.

3. Our projects work and compete with the gatekeepers not on the grounds of money, but because we have thriving communities. The community moderation model of Wikimedia projects seems incompatible with this legislation. It is unclear how the Wikimedia Foundation would comply with § 5 "Verantwortlicher Beauftragter" without changing the structure and governance of the projects, the very same self-governance that helps us be an alternative to the gatekeepers.
We hope the European Commission and the Austrian legislator can find a common way forward on this. Ideally, all these issues will at least in principle be laid out in the Digital Services Act before Member States make their own necessary national improvements.

I remain available for any feedback or comments.


Dimitar Dimitrov

EU Policy Director