Notification Detail

Act improving law enforcement on social networks [Netzwerkdurchsetzungsgesetz – NetzDG]

Notification Number: 2017/0127/D (Germany)
Date received: 27/03/2017
End of Standstill: 28/06/2017

Issue of comments by: Italy, Sweden


Message 002

Communication from the Commission - TRIS/(2017) 00838
Directive (EU) 2015/1535
Translation of message 001
Notification: 2017/0127/D

Does not open the delays (no new standstill period starts).

(MSG: 201700838.ES)

1. Structured Information Line
MSG 002 IND 2017 0127 D ES 27-03-2017 D NOTIF


2. Member State
D


3. Department Responsible
Bundesministerium für Wirtschaft und Energie, Referat E B 2, 11019 Berlin,
Tel.: 0049-30-2014-6353, Fax: 0049-30-2014-5379, E-Mail: infonorm@bmwi.bund.de


3. Originating Department
Bundesministerium der Justiz und für Verbraucherschutz, Referat V B 2, 10117 Berlin
Tel.: 0049-30-18580-9522, Fax: 0049-30-18580-9525, E-Mail: poststelle@bmjv.bund.de


4. Notification Number
2017/0127/D - SERV60


5. Title
Act improving law enforcement on social networks


6. Products Concerned
-


7. Notification Under Another Act
-


8. Main Content
The draft provides for the introduction of statutory compliance rules for social networks, designed to prompt social networks to handle complaints about hate crime and other criminal content more quickly and more comprehensively.

In addition, a statutory definition of social networks will ensure that the reporting obligation applies only to operators of large social networks with influence on public opinion, and not to all service providers within the meaning of the Telemedia Act (TMG). The draft does not cover media platforms with their own journalistic and editorial content. The definition of social networks covers both the exchange of content with other users within a closed network community ("gated community") and the dissemination of content to the public. A minimum threshold is provided for smaller companies (start-ups). It is also made clear that the draft only addresses unlawful content that fulfils the objective elements of the criminal provisions serving to combat hate crime or other criminal content, in accordance with Article 1(3) of the draft law.

Social networks are under a legal obligation to report quarterly on their handling of complaints about content relevant under criminal law. The report must contain statistical data on the volume of complaints and on the networks' decision-making practice, and must provide information on the team responsible for handling complaints. The report is to be published in the electronic Federal Gazette and on the social network's own website in an easily accessible form.

The draft lays down statutory rules for effective complaints management which will ensure that, as a general rule, social networks remove content that is manifestly relevant under criminal law and fulfils the objective elements of one of the criminal provisions listed in Article 1(3) within 24 hours of receipt of the user's complaint. Effective and transparent procedures for the immediate removal of unlawful content are required, including user-friendly mechanisms for submitting complaints. The starting point for this compliance duty is the liability regime for service providers under Article 10 of the TMG: they are obliged to remove unlawful content stored for a user, or to block access to it, without delay once they become aware of such content. The compliance duties laid down in this draft presuppose and give concrete form to this obligation of service providers.
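Purely as an illustration of the deadline rule described above (24 hours for manifestly unlawful content; the seven-day period for other reported content is the figure cited in the stakeholder contributions further below), the following sketch encodes the rule. The function and variable names are hypothetical and are not part of the draft law or of any real API.

```python
from datetime import datetime, timedelta

# Hypothetical illustration only -- not part of the draft law or any real API.
# 24 hours for manifestly unlawful content (as described above); seven days
# for other reported content (figure cited in the stakeholder contributions).
MANIFESTLY_UNLAWFUL = timedelta(hours=24)
OTHER_REPORTED = timedelta(days=7)

def removal_deadline(complaint_received: datetime, manifestly_unlawful: bool) -> datetime:
    """Latest time by which the reported content would have to be removed or blocked."""
    window = MANIFESTLY_UNLAWFUL if manifestly_unlawful else OTHER_REPORTED
    return complaint_received + window

# Example: a complaint received on 1 July 2017 at 09:00 about manifestly
# unlawful content would have to be acted on by 2 July 2017 at 09:00.
print(removal_deadline(datetime(2017, 7, 1, 9, 0), manifestly_unlawful=True))
```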

Intentional or negligent failure to comply with the reporting obligation, failure to provide effective complaints management, and failure to designate a domestic authorised recipient for service of documents and a domestic recipient for requests for information from criminal prosecution authorities, as provided for in the draft, constitute administrative offences punishable by a fine of up to EUR 5 million. In accordance with Article 17(4) of the Act on Administrative Offences (OWiG), the fine is to exceed the economic benefit derived from the administrative offence.
Under Article 130 of the OWiG, which also applies, the owner of the undertaking operating the social network may likewise be prosecuted where the failure to provide effective complaints management, the failure to comply with the reporting obligation or the failure to designate a domestic authorised recipient for service and a domestic recipient for requests for information could have been prevented, or made considerably more difficult, by adequate supervision.
Under Article 30 of the OWiG, a fine may also be imposed on legal persons or associations of persons. In that case, the maximum fine under this draft is EUR 50 million (Article 30(2), third sentence, of the OWiG).
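As an illustration of how the fine provisions described above interact (a cap of EUR 5 million for the administrative offence, raised to EUR 50 million for legal persons under Article 30(2), third sentence, OWiG, with Article 17(4) OWiG requiring the fine to exceed the economic benefit), the following sketch is purely hypothetical: the function name is invented, and the sketch ignores the case where the benefit exceeds the cap.

```python
# Hypothetical illustration only; the caps are those stated in the notification.
INDIVIDUAL_CAP_EUR = 5_000_000      # maximum fine for the administrative offence
LEGAL_PERSON_CAP_EUR = 50_000_000   # Article 30(2), third sentence, OWiG

def fine_bounds(economic_benefit_eur: int, legal_person: bool) -> tuple[int, int]:
    """Return the (lower, upper) bounds implied by the rules sketched above:
    the fine should exceed the economic benefit (Article 17(4) OWiG) and is
    capped at EUR 5 million, or EUR 50 million for legal persons.
    The sketch ignores the edge case where the benefit exceeds the cap."""
    cap = LEGAL_PERSON_CAP_EUR if legal_person else INDIVIDUAL_CAP_EUR
    return (economic_benefit_eur, cap)

# Example: a company that gained EUR 2 million from the infringement would face
# a fine above EUR 2 million, up to the EUR 50 million cap for legal persons.
print(fine_bounds(2_000_000, legal_person=True))
```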

The draft designates the Federal Office of Justice as the competent administrative authority within the meaning of Article 36 of the OWiG; in prosecuting the administrative offences defined in this draft, it will also have to examine whether criminal content within the meaning of Article 1(3) exists.


9. Brief Statement of Grounds
A massive change in social discourse is currently taking place online and, in particular, on social networks. The culture of debate online is often aggressive and hurtful, and not infrequently filled with hate. Hate crime and racist incitement can defame anyone on the basis of their opinion, skin colour or origin, religion, gender or sexuality. Hate crime and other criminal content that cannot be effectively combated or prosecuted pose a great danger to peaceful coexistence in a free, open and democratic society.

In the wake of the experience of the United States election campaign, combating unlawful fake news on social networks has also become a high priority.

Improving the enforcement of the law on social networks is therefore necessary in order to remove objectively criminal content without delay, such as incitement to hatred, insult, defamation or disturbance of the public peace by feigning the commission of offences.

The growing spread of hate crime and other criminal content, above all on social networks, had already prompted the Federal Ministry of Justice and Consumer Protection in 2015 to set up a task force with the operators of social networks and representatives of civil society. The companies represented in the task force agreed to improve their handling of reports of hate crime and other criminal content. They committed to putting in place user-friendly mechanisms for reporting problematic posts and to reviewing the majority of reported posts within 24 hours, with a team qualified in language and law, and removing them if unlawful. German law forms the basis of this review.

The companies' voluntary commitments produced initial improvements. However, these are not sufficient. The amount of criminal content removed remains too low. Monitoring of the social networks' deletion practice carried out by jugendschutz.net in January and February 2017 showed that complaints from ordinary users about criminal content are still not handled promptly and adequately. YouTube now removes criminal content in 90 per cent of cases; by contrast, Facebook removed criminal content in only 39 per cent of cases and Twitter in only 1 per cent.

In addition, the transparency of the social networks is insufficient. The information published by the social networks on the removal and blocking of unlawful content on their platforms is not sufficiently meaningful. Complaints received are not broken down into categories of cases, and the companies provide no figures on the proportion of complaints that lead to removal or blocking.

The providers of social networks bear a responsibility for the culture of public debate which they must live up to. Given that the existing instruments and the voluntary commitments accepted by the social networks have not had a sufficient effect, and that considerable problems exist in enforcing the current law, the introduction of compliance rules for social networks backed by fines is required in order to take effective and prompt action against hate crime and other criminal content online.


10. Reference Documents - Basic Texts
There are no basic texts


11. Invocation of the Emergency Procedure
No


12. Grounds for the Emergency
-


13. Confidentiality
No


14. Fiscal measures
No


15. Impact assessment
The draft entails total compliance costs for the social networks of at least EUR 28 million per year. It entails compliance costs for the Federal Government of at least EUR 4 million per year and one-off costs of at least EUR 350,000. The total compliance costs for the Länder are estimated at no less than EUR 200,000.


16. TBT and SPS aspects
TBT Agreement

NO. The draft has no significant effect on international trade.

SPS Agreement

NO. The draft has no significant effect on international trade.

**********
Comisión Europea

Punto de contacto Directiva (UE) 2015/1535
Fax: +32 229 98043
email: grow-dir83-189-central@ec.europa.eu

Stakeholders Contributions

The TRIS website makes it easy for you or your organization to share your views on any given notification.
Due to the end of the standstill period, we are currently not accepting any further contributions for this notification via the website.


de
  Deutscher Anwaltverein - DAV on 28-06-2017

 

Dear Sir or Madam,

Please find below the statement of the Deutscher Anwaltverein (German Bar Association) on the draft Network Enforcement Act (NetzDG).

In the present context, we would particularly like to draw attention to point 3.3 of the statement, which sets out in detail how the NetzDG infringes the E-Commerce Directive (ECD).

According to its fifth and sixth recitals, that Directive serves to remove the legal obstacles to the proper functioning of the internal market which hamper the development of information society services and thus make the exercise of the freedom of establishment and the free movement of services less attractive. In view of the fundamental freedoms, these obstacles are to be removed by coordinating certain national legal provisions and by clarifying legal concepts at Union level, to the extent necessary for the proper functioning of the internal market.

The NetzDG's infringement of this Directive, however, undermines those objectives. In particular, the breach of the country-of-origin principle of Article 3 ECD, explained under point 3.3.2 of our statement, illustrates the incompatibility of the NetzDG in its current version with the fundamental freedoms. We therefore refer once again to the appeal to the Commission set out on page 27 of the opinion: to deliver a detailed opinion in the notification procedure requesting the Federal Government to explain which service providers in other EU Member States would specifically be covered by the NetzDG, and when the Federal Government intends to provide, or make up for, the corresponding notification of the countries of origin concerned pursuant to Article 3(4)(b) and (5) ECD.

We would also like to highlight point 3.4, which discusses the incompatibility of the NetzDG with the General Data Protection Regulation (GDPR). According to the sixth and seventh recitals of the GDPR, rapid technological developments and globalisation require a "strong, more coherent and clearly enforceable data protection framework in the Union, since it is of great importance to create the trust that the digital economy urgently needs in order to keep growing in the internal market". A national law that conflicts with the rules of this framework therefore creates obstacles that impede the growth of the internal market. A reaction from the Commission would also be desirable in this respect.

Finally, we draw attention to point 3.2 of the statement, which sets out in detail how the NetzDG violates the freedoms of communication under Article 5 of the German Basic Law (GG). The observations made there should also be read in the light of Article 11 of the Charter of Fundamental Rights of the European Union, so that, in our view, the Commission should point out the impending incompatibility of the NetzDG with the fundamental right to freedom of expression and information.

Kind regards

Dorothee Wildt

Deputy Head of Brussels Office, Legal Adviser for EU Affairs


en
  European Digital Rights (EDRi) on 27-06-2017

 

European Digital Rights (EDRi) is an association of civil and human rights organisations from across Europe. We defend rights and freedoms in the digital environment.

Information technology has a revolutionary impact on our society. It has boosted freedom of expression and democracy but has also led to new approaches to surveillance and is increasingly used to impose restrictions on fundamental rights.

On 27 March, the Federal Republic of Germany sent to the European Commission a draft of a law that would change the way social networks deal with online content that has been accused of being a breach of their terms of service and/or being illegal.

This law contravenes Article 14 of the E-Commerce Directive (2000/31/EC), which provides a liability exception for online intermediaries when they act expeditiously to remove illegal content, according to a notice-and-takedown procedure. The German draft law instead provides disproportionate fines for social networks that do not delete "clearly violating content" within 24 hours or "violating content" within a week. There is no indication of how a decision is to be made on what "clearly violating content" or "violating content" might be. It is also far from clear what characteristics would be used to definitively class a service as a social network. As a result, it is easy to see how the fear of high fines will lead platforms to delete and block any content that appears to carry a risk of being punished under this new law. This, of course, would seriously hinder the fundamental right to freedom of expression and opinion. Indeed, the entirely predictable impact of the law, if enacted, would be a breach of key European Court of Human Rights case law in this area:

"Freedom of expression constitutes one of the essential foundations of such a society, one of the basic conditions for its progress and for the development of every man. Subject to paragraph 2 of Article 10 (art. 10-2), it is applicable not only to "information" or "ideas" that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb the State or any sector of the population. Such are the demands of that pluralism, tolerance and broad-mindedness without which there is no "democratic society"."
Cf. Handyside v. the United Kingdom, European Court of Human Rights, application no. 5493/72, paragraph 49

This law would be a problem not only from a human rights perspective, but also from a market perspective. Rules like this would create even more uncertainty for all European social networks, which would face new, different laws in every Member State, moving away from the idea of a European Digital Single Market. Regulating the internet as if it consisted only of Facebook or Google will create an internet that consists only of Facebook and Google.

EDRi recommends abandoning the draft law. It would set a dangerous precedent from both a European and a global perspective. EDRi is not alone in this call. The UN Special Rapporteur on Freedom of Opinion and Expression, David Kaye, other civil society organisations, industry, academia and even the legal services of the Bundestag have raised serious concerns about this bill.

en
  ARTICLE 19 on 27-04-2017

Pursuant to the notification procedure set out in Directive (EU) 2015/1535, ARTICLE 19 provides the attached detailed examination of Germany’s Draft Bill on the Improvement of Enforcement of Rights in Social Networks (the Draft Bill) for its compliance with international and European human rights law and standards, in particular on the right to freedom of expression. In relation to European Union law, our analysis takes into consideration the EU Charter on Fundamental Rights and the E-Commerce Directive (Directive 2000/31/EC).

ARTICLE 19 is an independent human rights organisation that works around the world to protect and promote the right to freedom of expression and the right to freedom of information. We have extensive experience in analysing laws pertaining to the right to freedom of expression, including in the fields of “hate speech” and intermediary liability.

Our analysis shows that the Draft Bill raises serious concerns under international and European human rights law. We also believe that it would create barriers to intra-EU trade and prevent digital companies from trading freely in the EU and beyond.

Our concerns are summarised as follows:

  • The Draft Bill would establish a new regime of intermediary liability that would incentivise, through severe administrative penalties, the removal and blocking of online content, without a determination of the legality of that content by a court, and without sufficient safeguards for freedom of expression, including for Social Network users whose content is wrongly removed. In particular, it is difficult to see how the proposed duty to delete copies of "Violating Content" can be reconciled with the limited protection from liability provided in Article 14 of the E-Commerce Directive, as there will not necessarily be notice of offending copies, thus creating an obligation to monitor content contrary to Article 15 of the E-Commerce Directive.
  • There is significant ambiguity in the Draft Bill regarding the definitions of key terms, including those that determine which entities the Draft Bill would apply to and those that limit or exclude liability, as well as the threshold at which blocking and removal processes will be considered so inadequate as to attract administrative liability. We highlight that this legal uncertainty would present a barrier to trade in the single market.
  • The content limitations that Social Networks would be required to enforce extend far beyond "hate speech" which may permissibly be limited under international human rights law, including criminal prohibitions on "defamation of religions" (blasphemy) and criminal defamation (including against the President of the Federation) that are contrary to international and European human rights law and would bring the business enterprises enlisted in enforcing them into conflict with their responsibilities to respect international and European human rights law. Therefore, not all expression proscribed under the specified provisions of the German Criminal Code should be considered "illegal activity or information" within the meaning of Article 14 of the E-Commerce Directive, since the Draft Bill requires the removal of content that the EU Charter of Fundamental Rights should be interpreted as protecting as freedom of expression.
  • The Draft Bill provides only limited oversight of the liability regime by the Administrative Court, which does nothing to address potential risks of over-blocking, and provides little protection or due process to Social Networks that, in good faith, refrain from blocking or removing content in the interests of respecting freedom of expression.
  • Proposals in the Draft Bill to amend the Telemedia Act to significantly widen the bases on which law enforcement authorities may request user data from intermediaries without a court order are concerning.

Overall, ARTICLE 19 finds the Draft Bill to be contrary to international human rights law and standards, and particularly dangerous to the protection and promotion of freedom of expression in Germany and internationally.

We respectfully recommend to the European Commission that Germany be required to withdraw the Draft Bill, with consideration given to retaining, in alternative legislation, reporting requirements to increase transparency around online content moderation. It should also be recommended that the German Criminal Code be substantially revised to bring it in line with international and European freedom of expression standards, and that the Telemedia Act be reformed to ensure that any request from law enforcement authorities to intermediaries for user data is made on the basis of a court order.

ARTICLE 19 stands ready to provide further assistance to the Commission and the government of Germany in this process.

Attached: full ARTICLE 19 analysis. 


en
  Center for Democracy & Technology (CDT) on 21-04-2017

German Proposal Threatens Censorship on Wide Array of Online Services

Link: https://cdt.org/blog/german-proposal-threatens-censorship-on-wide-array-of-online-services/

Anticipating federal elections in September, Germany’s Minister of Justice has proposed a new law aimed at limiting the spread of hate speech and “fake news” on social media sites. But the proposal, called the “Social Network Enforcement Bill” or “NetzDG,” goes far beyond a mere encouragement for social media platforms to respond quickly to hoaxes and disinformation campaigns and would create massive incentives for companies to censor a broad range of speech.

The NetzDG is scoped very broadly: it would apply not only to social networking sites but to any other service that enables users to "exchange or share any kind of content with other users or make such content accessible to other users." That would mean that email providers such as Gmail and ProtonMail, web hosting companies such as Greenhost and 1&1, remote storage services such as Dropbox, and any other interactive website could fall within the bill's reach.

Under the proposal, providers would be required to promptly remove “illegal” speech from their services or face fines of up to 50 million euros. NetzDG would require providers to respond to complaints about “Violating Content,” defined as material that violates one of 24 provisions of the German Criminal Code. These provisions cover a wide range of topics and reveal prohibitions against speech in German law that may come as a surprise to the international community, including prohibitions against defamation of the President (Sec. 90), the state, and its symbols (Sec. 90a); defamation of religions (Sec. 166); distribution of pornographic performances (Sec. 184d); and dissemination of depictions of violence (Sec. 131).

NetzDG would put online service providers in the position of a judge, requiring that they accept notifications from users about allegedly “Violating Content” and render a decision about whether that content violates the German Criminal Code. Providers would be required to remove “obvious” violations of the Code within 24 hours and resolve all other notifications within 7 days. Providers are also instructed to “delete or block any copies” of the “Violating Content,” which would require providers not only to remove content at a specified URL but to filter all content on their service.

The approach of this bill is fundamentally inconsistent with maintaining opportunities for freedom of expression and access to information online. Requiring providers to interpret the vagaries of 24 provisions of the German Criminal Code is a massive burden. Determining whether a post violates a given law is a complex question that requires deep legal expertise and analysis of relevant context, something private companies are not equipped to do, particularly at mass scale. Adding similar requirements to apply the law of every country in which these companies operate (or risk potentially bankrupting fines) would be unsustainable.

The likely response from hosts of user-generated content would be to err on the side of caution and take down any flagged content that broaches controversial subjects such as religion, foreign policy, and opinions about world leaders. And individuals – inside and outside of Germany – would likely have minimal access to a meaningful remedy if a provider censors their lawful speech under NetzDG.

The proposal is also completely out of sync with international standards for promoting free expression online. It has long been recognized that limiting liability for intermediaries is a key component to support a robust online speech environment. As then-Special Rapporteur for Freedom of Expression, Frank La Rue, noted in his 2011 report:

Holding intermediaries liable for the content disseminated or created by their users severely undermines the enjoyment of the right to freedom of opinion and expression, because it leads to self-protective and over-broad private censorship, often without transparency and the due process of the law.

The Council of Europe has likewise cautioned against the consequences of shifting the burden to intermediaries to determine what speech is illegal, in conjunction with the report it commissioned in 2016 on comparative approaches to blocking, filtering, and takedown of content: “[T]he decision on what constitutes illegal content is often delegated to private entities, which in order to avoid being held liable for transmission of illegal content may exercise excessive control over information accessible on the Internet.”

Shielding intermediaries from liability for third-party content is the first of the Manila Principles on Intermediary Liability, a set of principles supported by more than 100 civil society organizations worldwide. The Manila Principles further caution that “Intermediaries must not be required to restrict content unless an order has been issued by an independent and impartial judicial authority that has determined that the material at issue is unlawful.” It is a mistake to force private companies to be judge, jury, and executioner for controversial speech.

CDT recommends that the German legislature reject this proposed measure. It clearly impinges on fundamental rights to free expression and due process. The challenges posed to our democracies by “fake news,” hate speech, and incitement to violence are matters of deep concern. But laws that undermine individuals’ due process rights and co-opt private companies into the censorship apparatus for the state are not the way to defend democratic societies. Governments must work with industry and civil society to address these problems without undermining fundamental rights and the rule of law.


en
  Global Network Initiative on 21-04-2017

Proposed German Legislation Threatens Free Expression Around the World

 

The Global Network Initiative is deeply concerned by the “Draft Law to Improve Law Enforcement in Social Networks” (Netzwerkdurchsetzungsgesetz) approved by the German cabinet on April 5.

 

GNI is mindful of the complex challenges that governments and companies face when dealing with controversial content online. We recognize the legitimate interest that the German Government has in protecting the public. However, we are concerned that the rush to legislate, and to pressure companies under threat of fines to determine what is or is not illegal content, poses inadvertent but grave consequences for the right to freedom of expression in Germany, across the European Union, and worldwide. 

 

We urge the German Parliament to embrace its leadership role on human rights and the digital economy by rejecting the proposed legislation.

 

“The practical effect of this bill would be to outsource decisions on the balance between the fundamental right of freedom of expression and other legally protected rights to private companies,” said GNI Board Chair Mark Stephens, CBE.

 

“Companies facing the threat of multi-million euro fines will be compelled to broadly censor the internet, restricting the use of their services for any content that could be considered controversial,” he said.

 

This legislation has been described as a measure to combat hate speech and disinformation online, but its potential impact would be broader censorship of the internet. The impact of this bill would be to pressure private companies to take down any content that might run afoul of some 24 current provisions of the German Criminal Code – including offences as varied as "defamation of the state and its symbols," "anti-constitutional defamation of constitutional organs," "defamation of religions, religious and ideological associations," and "depictions of violence."

 

Although aimed at social networks, a wide array of online platforms and services, from email providers to web hosting and remote storage providers, could be affected, given the broad definition of social networks in the bill.

 

GNI recognizes that governments may restrict freedom of expression in circumstances that are compatible with constitutional law, and provided that their laws and policies accord with the conditions in Article 19 of the International Covenant on Civil and Political Rights and Article 10 of the European Convention on Human Rights. The proposed legislation does not meet these required tests of necessity and proportionality, and it poses a threat to open and democratic discourse.

 

“The internet has enabled huge advances in free expression and economic growth in large part because private intermediaries, including social media platforms and internet service providers, are not required to monitor and control what people can say or share or do online,” said GNI Executive Director, Judith Lichtenberg.

 

“We encourage the German Government to embrace these norms and use other means of managing illegal content,” she said.

 

GNI recently released a policy brief, Extremist Content in the ICT Sector, with recommendations directed at both governments and companies on how to address extremist content online without harming freedom of expression and privacy. These recommendations have the support of our multi-stakeholder membership, and may also provide guidance for governments as they consider how best to protect freedom of expression when dealing with other types of controversial content.

 

For comment or further information on this release, please contact Kath Cummins, GNI Director of Communications and Outreach: kcummins@globalnetworkinitiative.org, or call +1 202 590 0837.

 

About the GNI

 

Founded in 2008, the Global Network Initiative is an international multi-stakeholder group of companies, civil society organizations (including human rights and press freedom groups), investors and academics, who have created a collaborative approach to protect and advance freedom of expression and privacy in the ICT sector. GNI has built a framework of principles and implementation guidelines, based on international human rights standards, against which GNI member companies are independently assessed. Our membership collectively advocates with governments to protect and advance users' freedom of expression and privacy rights.

 

For more information on GNI’s members, the GNI Principles, and the GNI Independent Company Assessment process, visit our website: https://globalnetworkinitiative.org/

 

 


de
  FSM e.V. on 21-04-2017

de
  Bitkom e.V. on 20-04-2017

en
  BIU - Bundesverband Interaktive Unterhaltungssoftware on 13-04-2017

en
  Rainer Fislage on 08-04-2017

de
  eco Verband der Internetwirtschaft e.V. on 05-04-2017

de
  SRIW on 03-04-2017