Act improving law enforcement on social networks [Netzwerkdurchsetzungsgesetz – NetzDG]
Communication from the Commission - TRIS/(2017) 00838
Directive (EU) 2015/1535
Translation of the message 001
Does not open the standstill period.
1. Structured Information Line
MSG 002 IND 2017 0127 D EN 27-03-2017 D NOTIF
2. Member State
Germany
3. Department Responsible
Bundesministerium für Wirtschaft und Energie, Referat E B 2, 11019 Berlin,
Tel.: 0049-30-2014-6353, Fax: 0049-30-2014-5379, E-Mail: email@example.com
3. Originating Department
Bundesministerium der Justiz und für Verbraucherschutz, Referat V B 2, 10117 Berlin
Tel.: 0049-30-18580-9522, Fax: 0049-30-18580-9525, E-Mail: firstname.lastname@example.org
4. Notification Number
2017/0127/D - SERV60
5. Title
Act improving law enforcement on social networks [Netzwerkdurchsetzungsgesetz – NetzDG]
6. Products Concerned
7. Notification Under Another Act
8. Main Content
The draft proposes the introduction of statutory compliance rules for social networks in order to encourage them to process complaints about hate speech and other criminal content more quickly and comprehensively.
By laying down a legal definition of a social network, the draft ensures that the duty to report applies only to the operators of large, influential social networks, rather than to all service providers covered by the Telemedia Act [Telemediengesetz – TMG]. The draft does not cover media platforms that publish their own journalistic and editorial content. The definition of a social network includes both the exchange of content between users in a closed or ‘gated’ community and the public distribution of content. An exemption below a minimum size threshold is provided for relatively small companies (start-ups). It is also clarified that the illegal content concerned is only that fulfilling the objective elements of the criminal provisions used to combat hate speech or other criminal content, as set out in § 1(3) of the draft act.
Social networks shall be legally bound to file quarterly reports on how they dealt with complaints regarding potentially criminal content. The reports shall contain statistics on the volume of complaints and information on the networks' decision-making process. They shall also provide information on the complaints team responsible for processing the complaints. The reports must be readily retrievable both in the electronic Bundesanzeiger [Federal Gazette] and on the social network's homepage.
The draft sets out legal standards for effective complaint management to ensure that social networks delete manifestly criminal content fulfilling the objective elements of one of the criminal provisions stated in § 1(3), as a rule within 24 hours of receiving the user's complaint. The draft makes it compulsory to have effective, transparent procedures for the prompt deletion of illegal content, including user-friendly mechanisms for submitting complaints. This compliance obligation builds on the provision on service providers' liability in § 10 TMG, under which service providers are bound to immediately remove illegal content they store for a user, or to block access to it, once they become aware of it. The compliance obligations laid down in this draft presuppose that requirement and specify it further.
Pursuant to the draft, the following constitute regulatory offences punishable with a fine of up to EUR 5 million: deliberate or negligent non-compliance with the reporting obligation, violation of the obligation to have effective complaint management, or violation of the obligation to appoint a person on German soil authorised to accept service and an authorised recipient on German soil for requests for information from law enforcement authorities. According to § 17(4) of the Act on Regulatory Offences [OWiG], the fine shall exceed the financial benefit obtained from the regulatory offence.
Under § 130 OWiG, which is also applicable, the owner of the company running the social network can also be prosecuted if proper supervision could have prevented or significantly reduced the likelihood of contravention of the obligation to have effective complaint management, the reporting obligation, or the obligation to appoint a person authorised to accept service and an authorised recipient on German soil.
Pursuant to § 30 OWiG, fines can also be imposed on legal persons and associations of persons. For such cases, the maximum fine according to this draft is increased to EUR 50 million (§ 30(2), sentence 3 OWiG).
The draft designates the Federal Office of Justice [Bundesamt für Justiz] as the competent administrative authority under § 36 OWiG. In connection with prosecutions for the regulatory offences stated in this draft, the Office shall also be responsible for checking whether the content is criminal within the meaning of § 1(3).
9. Brief Statement of Grounds
The language used in discussions on the internet, and particularly on social networks, is currently undergoing dramatic changes. Internet debates are often aggressive and hurtful, and hateful speech is not uncommon. Hate speech and racist slurs may be used to vilify people because of their opinion, skin colour, ethnicity, religion, gender or sexuality. Hate speech and other criminal content that cannot be effectively combatted and tracked pose a great danger to the peaceful co-existence enjoyed by citizens of a free, open and democratic society.
Following the events during the US election campaign, Germany too has made fighting fake news on social networks a priority.
To do so requires improvements in law enforcement on social networks in order to promptly remove objectively criminal content, such as hate speech, abuse, defamation or content that could lead to a breach of the peace by misleading authorities into thinking a crime has been committed.
The spread of hate speech and other criminal content, especially on social networks, prompted the Federal Ministry of Justice and Consumer Protection in 2015 to set up a task force together with operators of the networks and civil society representatives. The companies represented in the task force agreed to improve their processes for dealing with messages notifying them of hate speech and other criminal content on their sites. The companies committed themselves to set up user-friendly mechanisms for reporting discriminatory posts. They also agreed to review and delete the majority of reported posts that turn out to be illegal within 24 hours, using teams of linguists and legal experts. German law shall form the basis for this review.
The voluntary measures taken by the companies brought initial improvements, yet these are still inadequate. Too much criminal content is still left on the sites. A check carried out by jugendschutz.net on the deletion practices of social networks in January/February 2017 showed that complaints from normal users about criminal content were still not processed promptly and satisfactorily. While YouTube now deletes criminal content in 90 % of cases, this only occurs in 39 % of cases on Facebook, and just 1 % on Twitter.
Social networks are also not sufficiently transparent. The information they publish on the removal and blocking of illegal content on their platforms is inadequate. Complaints received are not broken down by type, and the companies give no information on what percentage of complaints led to content being deleted or blocked.
Social network operators must live up to their responsibility for public debate. Since the current mechanisms and the voluntary commitments made by social networks are inadequate, and given the significant problems in enforcing the current law, statutory compliance rules, backed by fines, are needed to enable prompt, effective action against hate speech and other criminal content on the internet.
10. Reference Documents - Basic Texts
No basic text(s) available
11. Invocation of the Emergency Procedure
12. Grounds for the Emergency
14. Fiscal measures
15. Impact assessment
The draft will entail total annual compliance costs of at least EUR 28 million for social networks. The draft will mean annual compliance costs of at least EUR 4 million for the Federal Government, as well as one-off costs of at least EUR 350 000. Federal states will incur estimated total compliance costs of at least EUR 200 000 a year.
16. TBT and SPS aspects
TBT aspect: No - the draft has no significant impact on international trade.
SPS aspect: No - the draft has no significant impact on international trade.
Contact point Directive (EU) 2015/1535
Fax: +32 229 98043
The TRIS website makes it easy for you or your organisation to share your opinion on any given notification.
You may submit your contribution in any of the official languages of the EU. Please note, however, that translations of contributions will not be provided on the site. Also note that contributions are accepted only until 23:59:59 CET on the date the standstill period ends.
Pursuant to the notification procedure set out in Directive (EU) 2015/1535, ARTICLE 19 provides the attached detailed examination of Germany’s Draft Bill on the Improvement of Enforcement of Rights in Social Networks (the Draft Bill) for its compliance with international and European human rights law and standards, in particular on the right to freedom of expression. In relation to European Union law, our analysis takes into consideration the EU Charter on Fundamental Rights and the E-Commerce Directive (Directive 2000/31/EC).
ARTICLE 19 is an independent human rights organisation that works around the world to protect and promote the right to freedom of expression and the right to freedom of information. We have extensive experience in analysing laws pertaining to the right to freedom of expression, including in the fields of “hate speech” and intermediary liability.
Our analysis shows that the Draft Bill raises serious concerns under international and European human rights law. We also believe that it would create barriers to intra-EU trade and prevent digital companies from trading freely in the EU and beyond.
Our concerns are summarised as follows:
- The Draft Bill would establish a new regime of intermediary liability that would incentivise, through severe administrative penalties, the removal and blocking of online content, without a determination of the legality of that content by a court, and without sufficient safeguards for freedom of expression, including for Social Network users whose content is wrongly removed. In particular, it is difficult to see how the proposed duty to delete copies of "Violating Content" can be reconciled with the limited protection from liability provided in Article 14 of the E-Commerce Directive, as there will not necessarily be notice of offending copies; the duty thus creates an obligation to monitor content, contrary to Article 15 of the E-Commerce Directive.
- There is significant ambiguity in the Draft Bill regarding the definitions of key terms, including those that set out which entities the Draft Bill would apply to and those that limit or exclude liability, as well as the threshold at which blocking and removal processes will be considered so inadequate as to attract administrative liability. We highlight that this legal uncertainty would present a barrier to trade in the single market.
- The content limitations that Social Networks would be required to enforce extend far beyond "hate speech" that may permissibly be limited under international human rights law; they include criminal prohibitions on "defamation of religions" (blasphemy) and criminal defamation (including of the President of the Federation) that are contrary to international and European human rights law, and would bring the business enterprises enlisted in enforcing them into conflict with their responsibility to respect international and European human rights law. Therefore, not all expression proscribed under the specified provisions of the German Criminal Code should be considered "illegal activity or information" within the meaning of Article 14 of the E-Commerce Directive, since the Draft Bill requires the removal of content that the EU Charter of Fundamental Rights should be interpreted to protect as freedom of expression.
- The Draft Bill provides only limited oversight of the liability regime by the Administrative Court, which does nothing to address potential risks of over-blocking, and provides little protection or due process to Social Networks that, in good faith, refrain from blocking or removing content in the interests of respecting freedom of expression.
- The Draft Bill's proposals to amend the Telemedia Act, which would significantly widen the bases on which law enforcement authorities may request user data from intermediaries without a court order, are concerning.
Overall, ARTICLE 19 finds the Draft Bill to be contrary to international human rights law and standards, and particularly dangerous to the protection and promotion of freedom of expression in Germany and internationally.
We respectfully recommend that the European Commission require Germany to withdraw the Draft Bill, with consideration given to retaining, in alternative legislation, reporting requirements that increase transparency around online content moderation. It should also be recommended that the German Criminal Code be substantially revised to bring it into line with international and European freedom of expression standards, and that the Telemedia Act be reformed to ensure that any request from law enforcement authorities to intermediaries for user data is made on the basis of a court order.
ARTICLE 19 stands ready to provide further assistance to the Commission and the government of Germany in this process.
Attached: full ARTICLE 19 analysis.
German Proposal Threatens Censorship on Wide Array of Online Services
Anticipating federal elections in September, Germany’s Minister of Justice has proposed a new law aimed at limiting the spread of hate speech and “fake news” on social media sites. But the proposal, called the “Social Network Enforcement Bill” or “NetzDG,” goes far beyond a mere encouragement for social media platforms to respond quickly to hoaxes and disinformation campaigns and would create massive incentives for companies to censor a broad range of speech.
The NetzDG is scoped very broadly: it would apply not only to social networking sites but to any other service that enables users to “exchange or share any kind of content with other users or make such content accessible to other users.” That would mean that email providers such as Gmail and ProtonMail, web hosting companies such as Greenhost and 1&1, remote storage services such as Dropbox, and any other interactive website could fall within the bill’s reach.
Under the proposal, providers would be required to promptly remove “illegal” speech from their services or face fines of up to 50 million euros. NetzDG would require providers to respond to complaints about “Violating Content,” defined as material that violates one of 24 provisions of the German Criminal Code. These provisions cover a wide range of topics and reveal prohibitions against speech in German law that may come as a surprise to the international community, including prohibitions against defamation of the President (Sec. 90), the state, and its symbols (Sec. 90a); defamation of religions (Sec. 166); distribution of pornographic performances (Sec. 184d); and dissemination of depictions of violence (Sec. 131).
NetzDG would put online service providers in the position of a judge, requiring that they accept notifications from users about allegedly “Violating Content” and render a decision about whether that content violates the German Criminal Code. Providers would be required to remove “obvious” violations of the Code within 24 hours and resolve all other notifications within 7 days. Providers are also instructed to “delete or block any copies” of the “Violating Content,” which would require providers not only to remove content at a specified URL but to filter all content on their service.
The approach of this bill is fundamentally inconsistent with maintaining opportunities for freedom of expression and access to information online. Requiring providers to interpret the vagaries of 24 provisions of the German Criminal Code is a massive burden. Determining whether a post violates a given law is a complex question that requires deep legal expertise and analysis of relevant context, something private companies are not equipped to do, particularly at mass scale. Adding similar requirements to apply the law of every country in which these companies operate (or risk potentially bankrupting fines) would be unsustainable.
The likely response from hosts of user-generated content would be to err on the side of caution and take down any flagged content that broaches controversial subjects such as religion, foreign policy, and opinions about world leaders. And individuals – inside and outside of Germany – would likely have minimal access to a meaningful remedy if a provider censors their lawful speech under NetzDG.
The proposal is also completely out of sync with international standards for promoting free expression online. It has long been recognized that limiting liability for intermediaries is a key component to support a robust online speech environment. As then-Special Rapporteur for Freedom of Expression, Frank La Rue, noted in his 2011 report:
Holding intermediaries liable for the content disseminated or created by their users severely undermines the enjoyment of the right to freedom of opinion and expression, because it leads to self-protective and over-broad private censorship, often without transparency and the due process of the law.
The Council of Europe has likewise cautioned against the consequences of shifting the burden to intermediaries to determine what speech is illegal, in conjunction with the report it commissioned in 2016 on comparative approaches to blocking, filtering, and takedown of content: “[T]he decision on what constitutes illegal content is often delegated to private entities, which in order to avoid being held liable for transmission of illegal content may exercise excessive control over information accessible on the Internet.”
Shielding intermediaries from liability for third-party content is the first of the Manila Principles on Intermediary Liability, a set of principles supported by more than 100 civil society organizations worldwide. The Manila Principles further caution that “Intermediaries must not be required to restrict content unless an order has been issued by an independent and impartial judicial authority that has determined that the material at issue is unlawful.” It is a mistake to force private companies to be judge, jury, and executioner for controversial speech.
CDT recommends that the German legislature reject this proposed measure. It clearly impinges on fundamental rights to free expression and due process. The challenges posed to our democracies by “fake news,” hate speech, and incitement to violence are matters of deep concern. But laws that undermine individuals’ due process rights and co-opt private companies into the censorship apparatus for the state are not the way to defend democratic societies. Governments must work with industry and civil society to address these problems without undermining fundamental rights and the rule of law.
Proposed German Legislation Threatens Free Expression Around the World
The Global Network Initiative is deeply concerned by the “Draft Law to Improve Law Enforcement in Social Networks” (Netzwerkdurchsetzungsgesetz) approved by the German cabinet on April 5.
GNI is mindful of the complex challenges that governments and companies face when dealing with controversial content online. We recognize the legitimate interest that the German Government has in protecting the public. However, we are concerned that the rush to legislate, and to pressure companies under threat of fines to determine what is or is not illegal content, poses inadvertent but grave consequences for the right to freedom of expression in Germany, across the European Union, and worldwide.
We urge the German Parliament to embrace its leadership role on human rights and the digital economy by rejecting the proposed legislation.
“The practical effect of this bill would be to outsource decisions on the balance between the fundamental right of freedom of expression and other legally protected rights to private companies,” said GNI Board Chair Mark Stephens, CBE.
“Companies facing the threat of multi-million euro fines will be compelled to broadly censor the internet, restricting the use of their services for any content that could be considered controversial,” he said.
This legislation has been described as a measure to combat hate speech and disinformation online, but its potential impact would be broader censorship of the internet. The effect of this bill would be to pressure private companies to take down any content that might run afoul of some 24 current provisions of the German Criminal Code – including offenses as varied as "defamation of the state and its symbols," "anti-constitutional defamation of constitutional organs," "defamation of religions, religious and ideological associations," and "depictions of violence."
Although aimed at social networks, a wide array of online platforms and services, from email providers to web hosting and remote storage providers, could be affected, given the broad definition of social networks in the bill.
GNI recognizes that governments may restrict freedom of expression in circumstances that are compatible with constitutional law, and provided that their laws and policies accord with the conditions in Article 19 of the International Covenant on Civil and Political Rights and Article 10 of the European Convention on Human Rights. The proposed legislation does not meet these required tests of necessity and proportionality, and it poses a threat to open and democratic discourse.
“The internet has enabled huge advances in free expression and economic growth in large part because private intermediaries, including social media platforms and internet service providers, are not required to monitor and control what people can say or share or do online,” said GNI Executive Director, Judith Lichtenberg.
“We encourage the German Government to embrace these norms and use other means of managing illegal content,” she said.
GNI recently released a policy brief, Extremist Content in the ICT Sector, with recommendations directed at both governments and companies on how to address extremist content online without harming freedom of expression and privacy. These recommendations have the support of our multi-stakeholder membership, and may also provide guidance for governments as they consider how best to protect freedom of expression when dealing with other types of controversial content.
For comment or further information on this release, please contact Kath Cummins, GNI Director of Communications and Outreach: email@example.com, or call +1 202 590 0837.
About the GNI
Founded in 2008, the Global Network Initiative is an international multi-stakeholder group of companies, civil society organizations (including human rights and press freedom groups), investors and academics who have created a collaborative approach to protecting and advancing freedom of expression and privacy in the ICT sector. GNI has built a framework of principles and implementation guidelines, based on international human rights standards, against which GNI member companies are independently assessed. Our membership collectively advocates with governments to protect and advance users' freedom of expression and privacy rights.
For more information on GNI’s members, the GNI Principles, and the GNI Independent Company Assessment process, visit our website: https://globalnetworkinitiative.org/