Notification details

Act to Improve Law Enforcement in Social Networks (Netzwerkdurchsetzungsgesetz, NetzDG)

Notification number: 2017/127/D (Germany)
Date received: 27/03/2017
End of standstill period: 28/06/2017

Comments submitted by: Italy, Sweden


Message 002

Message from the Commission - TRIS/(2017) 00838
Directive (EU) 2015/1535
Translation of message 001
Notification: 2017/0127/D

Does not open the delays.

(MSG: 201700838.SV)

1. Structured Information Line
MSG 002 IND 2017 0127 D SV 27-03-2017 D NOTIF


2. Member State
D


3. Department Responsible
Bundesministerium für Wirtschaft und Energie, Referat E B 2, 11019 Berlin,
Tel.: 0049-30-2014-6353, Fax: 0049-30-2014-5379, E-Mail: infonorm@bmwi.bund.de


3. Originating Department
Bundesministerium der Justiz und für Verbraucherschutz, Referat V B 2, 10117 Berlin
Tel.: 0049-30-18580-9522, Fax: 0049-30-18580-9525, E-Mail: poststelle@bmjv.bund.de


4. Notification Number
2017/0127/D - SERV60


5. Title
Act to Improve Law Enforcement in Social Networks (NetzDG)


6. Products Concerned
-


7. Notification Under Another Act
-


8. Main Content
The draft introduces statutory compliance rules for social networks in order to prompt them to process complaints about hate crime and other criminal content more quickly and more comprehensively.

A legal definition of a social network is intended to ensure that the reporting obligation applies only to operators of large, opinion-forming social networks and not to all service providers within the meaning of the Telemedia Act (TMG). Media platforms whose own content consists of journalistic and editorial contributions are not covered by the draft. The definition of a social network covers both the exchange of content with other users within a closed network (gated community) and the dissemination of content to the public. A minimum threshold is to be set for smaller companies (start-ups). It is also made clear that only unlawful content is covered which fulfils the objective elements of the criminal provisions aimed at combating hate crime or other criminal content listed in Section 1(3) of the draft law.

Social networks are placed under a statutory obligation to report quarterly on how complaints about criminally relevant content are handled. The report must include statistics on the volume of complaints and on the networks' decision-making practice, as well as information on the teams responsible for processing the complaints. The report will be published in an easily accessible form in the electronic Federal Gazette and on the social network's own website.

The draft lays down legal standards for effective complaint management which ensure that social networks, as a rule, delete manifestly criminal content fulfilling the objective elements of one of the criminal provisions listed in Section 1(3) within 24 hours of receiving the user's complaint. What is required are effective and transparent procedures for the immediate removal of unlawful content, including user-friendly mechanisms for submitting complaints. The starting point for this compliance obligation is the liability regime for service providers under Section 10 of the Telemedia Act: service providers must immediately delete unlawful content that they store for a user, or block access to it, as soon as they become aware of it. The compliance obligations laid down in this draft presuppose such an obligation on the part of service providers and define it more precisely.

Under the draft, anyone who, intentionally or negligently, fails to comply with the reporting obligation, breaches the obligation to maintain an effective complaint-management system, or breaches the obligation to appoint a domestic agent for service and a domestic person authorised to receive requests for information from law enforcement authorities commits an administrative offence punishable by a fine of up to EUR 5 million. Pursuant to Section 17(4) of the Administrative Offences Act (OWiG), the fine must exceed the economic benefit derived from the offence.
In addition, under Section 130 OWiG, which also applies, proceedings may be brought against the owner of the company operating the social network if the breach of the obligation to maintain effective complaint management, of the reporting obligation or of the duty to appoint a domestic agent for service and a domestic authorised recipient could have been prevented, or made considerably more difficult, through proper supervision.
In accordance with Section 30 OWiG, fines may also be imposed on legal persons and associations of such persons. In that case, the maximum fine under this draft increases to EUR 50 million (third sentence of Section 30(2) OWiG).

As the competent authority within the meaning of Section 36 OWiG, the draft designates the Federal Office of Justice, which, when prosecuting the administrative offences defined in this draft, must also assess whether criminal content within the meaning of Section 1(3) exists.


9. Brief Statement of Grounds
Public debate is currently undergoing a far-reaching shift towards the internet and, in particular, towards social networks. The culture of debate online is often aggressive, hurtful and not infrequently hateful. Hate crime and racially motivated incitement can defame any person on account of that person's opinions, skin colour or origin, religion, gender or sexuality. Hate crime and other criminal content that cannot be effectively combated and prosecuted pose a serious threat to peaceful coexistence in a free, open and democratic society.

Following the experience of the election campaign in the United States, combating punishable false reports ("fake news") in social networks has also become a high priority in the Federal Republic of Germany.

It is therefore necessary to improve law enforcement in social networks so that objectively criminal content such as incitement to hatred, insult, defamation or disturbance of the public peace by feigning the commission of an offence is removed without delay.

The growing spread of hate crime and other criminal content, above all in social networks, prompted the Federal Ministry of Justice and Consumer Protection to set up a task force with the network operators and representatives of civil society as early as 2015. The companies represented in the task force agreed to improve the handling of reports of hate crime and other criminal content on their sites. They undertook to introduce user-friendly mechanisms for reporting problematic posts, to review the majority of reported posts within 24 hours using linguistically and legally qualified teams, and to remove them if they are unlawful. The standard of review is German law.

The companies' voluntary commitments have led to some improvements, but they are not yet sufficient: too little criminal content is still being deleted. Monitoring of the deletion practices of social networks carried out by jugendschutz.net in January/February 2017 shows that complaints from ordinary users about criminal content are still not processed promptly or to a sufficient extent. While YouTube now removes criminal content in 90 per cent of cases, Facebook removes it in only 39 per cent of cases and Twitter in only 1 per cent.

Nor are the social networks sufficiently transparent. The information that social networks publish about the removal and blocking of unlawful content on their platforms contains too little that is relevant: the complaints received are not broken down into categories, and the companies do not disclose what percentage of complaints leads to content being removed or blocked.

Providers of social networks bear a responsibility for the culture of public debate, and they must live up to it. Given that the existing instruments and the promised voluntary commitments are not sufficiently effective, and that there are serious problems in enforcing the existing law, compliance rules backed by fines must be introduced for social networks so that effective and immediate action can be taken against hate crime and other criminal content online.


10. Reference Documents - Basic Texts
No basic texts available


11. Invocation of the Emergency Procedure
No


12. Grounds for the Emergency
-


13. Confidentiality
No


14. Fiscal measures
No


15. Impact assessment
The draft will give rise to compliance costs for social networks totalling at least EUR 28 million per year. For the Federation, it will give rise to compliance costs of at least EUR 4 million per year and one-off costs of at least EUR 350,000. Compliance costs for the Länder are estimated at a minimum of EUR 200,000 per year.


16. TBT and SPS Aspects
TBT Agreement

No - the draft has no significant impact on international trade.

SPS Agreement

No - the draft has no significant impact on international trade.

**********
European Commission

Contact point Directive (EU) 2015/1535
Fax: +32 229 98043
email: grow-dir83-189-central@ec.europa.eu

Contributions from interested parties





  Deutscher Anwaltverein - DAV on 28-06-2017

Dear Sir or Madam,

Please find below the statement of the Deutscher Anwaltverein (German Bar Association) on the draft Network Enforcement Act (NetzDG).

In the present context we would like to draw particular attention to point 3.3 of the statement, which sets out in detail how the NetzDG infringes the E-Commerce Directive (ECD).

According to its fifth and sixth recitals, that Directive serves to remove the legal obstacles to the proper functioning of the internal market which hamper the development of information society services and thus make the exercise of the freedom of establishment and the freedom to provide services less attractive. In view of the fundamental freedoms, those obstacles are to be removed by coordinating certain national rules and by clarifying legal concepts at Union level, to the extent necessary for the proper functioning of the internal market.

The NetzDG's infringement of this Directive runs counter to those objectives. In particular, the breach of the country-of-origin principle of Article 3 ECD, explained under point 3.3.2 of our statement, illustrates the incompatibility of the NetzDG, in its current version, with the fundamental freedoms. We therefore refer once again to the appeal to the Commission set out on page 27 of the opinion: to deliver a detailed opinion in the notification procedure asking the Federal Government to explain which service providers in other EU Member States would specifically be covered by the NetzDG, and when the Federal Government intends to provide, or make up for, the corresponding information to the countries of origin concerned under Article 3(4)(b) and (5) ECD.

We would also like to highlight point 3.4, which discusses the incompatibility of the NetzDG with the General Data Protection Regulation (GDPR). According to the sixth and seventh recitals of the GDPR, rapid technological developments and globalisation require a "solid, more coherent and clearly enforceable legal framework for data protection in the Union, since it is of great importance to create the trust that the digital economy urgently needs in order to continue growing in the internal market". A national law that conflicts with the rules of that framework therefore creates obstacles that prevent the internal market from growing. A reaction by the Commission would also be desirable in this respect.

Finally, we would draw attention to point 3.2 of the statement, which sets out in detail how the NetzDG infringes the freedoms of communication under Article 5 of the German Basic Law (GG). Those observations must also be viewed in the light of Article 11 of the Charter of Fundamental Rights of the European Union, so that, in our view, the Commission should point out the impending incompatibility of the NetzDG with the fundamental right to freedom of expression and information.

Kind regards,

Dorothee Wildt

Deputy Head of Brussels Office, Legal Adviser for EU Affairs


  European Digital Rights (EDRi) on 27-06-2017

European Digital Rights (EDRi) is an association of civil and human rights organisations from across Europe. We defend rights and freedoms in the digital environment.

Information technology has a revolutionary impact on our society. It has boosted freedom of expression and democracy but has also led to new approaches to surveillance and is increasingly used to impose restrictions on fundamental rights.

On 27 March, the Federal Republic of Germany sent to the European Commission a draft of a law that would change the way social networks deal with online content that has been accused of being a breach of their terms of service and/or being illegal.

This law contravenes Article 14 of the E-Commerce Directive (2000/31/EC), which provides a liability exception for online intermediaries when they act expeditiously to remove illegal content under a notice-and-take-down procedure. The German draft law instead provides disproportionate fines for social networks that do not delete “clearly violating content” within 24 hours or “violating content” within a week. There is no indication of how a decision is to be made on what “clearly violating content” or “violating content” might be. It is also far from clear what characteristics would be used to definitively class a service as a social network. As a result, it is easy to see how the fear of high fines will lead platforms to delete and block any content that appears to carry a risk of being punished under this new law. This, of course, would seriously hinder the fundamental right to freedom of expression and opinion. Indeed, the entirely predictable impact of the law, if enacted, would be a breach of key European Court of Human Rights case law in this area:

"Freedom of expression constitutes one of the essential foundations of such a society, one of the basic conditions for its progress and for the development of every man. Subject to paragraph 2 of Article 10 (art. 10-2), it is applicable not only to "information" or "ideas" that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb the State or any sector of the population. Such are the demands of that pluralism, tolerance and broad-mindedness without which there is no "democratic society"."
Cf. Handyside v. the United Kingdom, European Court of Human Rights, application no. 5493/72, paragraph 49.

This law would not only be a problem from a human rights perspective, but also from a market perspective. Rules like this would create even more uncertainty for European social networks, which would face new, different laws in every Member State, moving away from the idea of a European Digital Single Market. Regulating the internet as if it consisted only of Facebook or Google will create an internet that consists only of Facebook and Google.

EDRi recommends abandoning the draft law. It would set a dangerous precedent from both a European and a global perspective. EDRi is not alone in this call. The UN Special Rapporteur on Freedom of Opinion and Expression, David Kaye, other civil society organisations, industry, academia and even the legal services of the Bundestag have raised serious concerns about this bill.

  ARTICLE 19 on 27-04-2017

Pursuant to the notification procedure set out in Directive (EU) 2015/1535, ARTICLE 19 provides the attached detailed examination of Germany’s Draft Bill on the Improvement of Enforcement of Rights in Social Networks (the Draft Bill) for its compliance with international and European human rights law and standards, in particular on the right to freedom of expression. In relation to European Union law, our analysis takes into consideration the EU Charter on Fundamental Rights and the E-Commerce Directive (Directive 2000/31/EC).

ARTICLE 19 is an independent human rights organisation that works around the world to protect and promote the right to freedom of expression and the right to freedom of information. We have extensive experience in analysing laws pertaining to the right to freedom of expression, including in the fields of “hate speech” and intermediary liability.

Our analysis shows that the Draft Bill raises serious concerns under international and European human rights law. We also believe that it would create barriers to intra-EU trade and prevent digital companies from trading freely in the EU and beyond.

Our concerns are summarised as follows:

  • The Draft Bill would establish a new regime of intermediary liability that would incentivise, through severe administrative penalties, the removal and blocking of online content without a determination of the legality of that content by a court, and without sufficient safeguards for freedom of expression, including for Social Network users whose content is wrongly removed. In particular, it is difficult to see how the proposed duty to delete copies of “Violating Content” can be reconciled with the limited protection from liability provided in Article 14 of the E-Commerce Directive, as there will not necessarily be notice of the offending copies, thus creating an obligation to monitor content contrary to Article 15 of the E-Commerce Directive.
  • There is significant ambiguity in the Draft Bill regarding: the definitions of key terms, including those that set out the scope of which entities the Draft Bill would apply to and those that limit or exclude liability, and the threshold at which blocking and removal processes will be considered so inadequate as to attract administrative liability. We highlight that this legal uncertainty would present a barrier to trade in single market.
  • The content limitations that Social Networks would be required to enforce extend far beyond “hate speech” that may permissibly be limited under international human rights law; they include criminal prohibitions on “defamation of religions” (blasphemy) and criminal defamation (including of the President of the Federation) that are contrary to international and European human rights law and would bring the business enterprises enlisted in enforcing them into conflict with their responsibility to respect international and European human rights law. Not all expression proscribed under the specified provisions of the German Criminal Code should therefore be considered “illegal activity or information” within the meaning of Article 14 of the E-Commerce Directive, since the Draft Bill requires the removal of content that the EU Charter of Fundamental Rights should be interpreted as protecting as freedom of expression.
  • The Draft Bill provides only limited oversight of the liability regime by the Administrative Court, which does nothing to address potential risks of over-blocking, and provides little protection or due process to Social Networks that, in good faith, refrain from blocking or removing content in the interests of respecting freedom of expression.
  • Proposals in the Draft Bill to amend the Telemedia Act to significantly widen the bases on which law enforcement authorities may request user data from intermediaries without a court order are concerning.

Overall, ARTICLE 19 finds the Draft Bill to be contrary to international human rights law and standards, and particularly dangerous to the protection and promotion of freedom of expression in Germany and internationally.

We respectfully recommend to the European Commission that Germany be required to withdraw the Draft Bill, with consideration given to retaining, in alternative legislation, reporting requirements to increase transparency around online content moderation. It should also be recommended that the German Criminal Code be substantially revised to bring it into line with international and European freedom of expression standards, and that the Telemedia Act be reformed to ensure that any request from law enforcement authorities to intermediaries for user data is made on the basis of a court order.

ARTICLE 19 stands ready to provide further assistance to the Commission and the government of Germany in this process.

Attached: full ARTICLE 19 analysis. 


  Center for Democracy & Technology (CDT) on 21-04-2017

German Proposal Threatens Censorship on Wide Array of Online Services

Link: https://cdt.org/blog/german-proposal-threatens-censorship-on-wide-array-of-online-services/

Anticipating federal elections in September, Germany’s Minister of Justice has proposed a new law aimed at limiting the spread of hate speech and “fake news” on social media sites. But the proposal, called the “Social Network Enforcement Bill” or “NetzDG,” goes far beyond a mere encouragement for social media platforms to respond quickly to hoaxes and disinformation campaigns and would create massive incentives for companies to censor a broad range of speech.

The NetzDG is scoped very broadly: it would apply not only to social networking sites but to any other service that enables users to “exchange or share any kind of content with other users or make such content accessible to other users.” That would mean that email providers such as Gmail and ProtonMail, web hosting companies such as Greenhost and 1&1, remote storage services such as Dropbox, and any other interactive website could fall within the bill’s reach.

Under the proposal, providers would be required to promptly remove “illegal” speech from their services or face fines of up to 50 million euros. NetzDG would require providers to respond to complaints about “Violating Content,” defined as material that violates one of 24 provisions of the German Criminal Code. These provisions cover a wide range of topics and reveal prohibitions against speech in German law that may come as a surprise to the international community, including prohibitions against defamation of the President (Sec. 90), the state, and its symbols (Sec. 90a); defamation of religions (Sec. 166); distribution of pornographic performances (Sec. 184d); and dissemination of depictions of violence (Sec. 131).

NetzDG would put online service providers in the position of a judge, requiring that they accept notifications from users about allegedly “Violating Content” and render a decision about whether that content violates the German Criminal Code. Providers would be required to remove “obvious” violations of the Code within 24 hours and resolve all other notifications within 7 days. Providers are also instructed to “delete or block any copies” of the “Violating Content,” which would require providers not only to remove content at a specified URL but to filter all content on their service.

The approach of this bill is fundamentally inconsistent with maintaining opportunities for freedom of expression and access to information online. Requiring providers to interpret the vagaries of 24 provisions of the German Criminal Code is a massive burden. Determining whether a post violates a given law is a complex question that requires deep legal expertise and analysis of relevant context, something private companies are not equipped to do, particularly at mass scale. Adding similar requirements to apply the law of every country in which these companies operate (or risk potentially bankrupting fines) would be unsustainable.

The likely response from hosts of user-generated content would be to err on the side of caution and take down any flagged content that broaches controversial subjects such as religion, foreign policy, and opinions about world leaders. And individuals – inside and outside of Germany – would likely have minimal access to a meaningful remedy if a provider censors their lawful speech under NetzDG.

The proposal is also completely out of sync with international standards for promoting free expression online. It has long been recognized that limiting liability for intermediaries is a key component to support a robust online speech environment. As then-Special Rapporteur for Freedom of Expression, Frank La Rue, noted in his 2011 report:

Holding intermediaries liable for the content disseminated or created by their users severely undermines the enjoyment of the right to freedom of opinion and expression, because it leads to self-protective and over-broad private censorship, often without transparency and the due process of the law.

The Council of Europe has likewise cautioned against the consequences of shifting the burden to intermediaries to determine what speech is illegal, in conjunction with the report it commissioned in 2016 on comparative approaches to blocking, filtering, and takedown of content: “[T]he decision on what constitutes illegal content is often delegated to private entities, which in order to avoid being held liable for transmission of illegal content may exercise excessive control over information accessible on the Internet.”

Shielding intermediaries from liability for third-party content is the first of the Manila Principles on Intermediary Liability, a set of principles supported by more than 100 civil society organizations worldwide. The Manila Principles further caution that “Intermediaries must not be required to restrict content unless an order has been issued by an independent and impartial judicial authority that has determined that the material at issue is unlawful.” It is a mistake to force private companies to be judge, jury, and executioner for controversial speech.

CDT recommends that the German legislature reject this proposed measure. It clearly impinges on fundamental rights to free expression and due process. The challenges posed to our democracies by “fake news,” hate speech, and incitement to violence are matters of deep concern. But laws that undermine individuals’ due process rights and co-opt private companies into the censorship apparatus for the state are not the way to defend democratic societies. Governments must work with industry and civil society to address these problems without undermining fundamental rights and the rule of law.


  Global Network Initiative on 21-04-2017

Proposed German Legislation Threatens Free Expression Around the World

 

The Global Network Initiative is deeply concerned by the “Draft Law to Improve Law Enforcement in Social Networks” (Netzwerkdurchsetzungsgesetz) approved by the German cabinet on April 5.

 

GNI is mindful of the complex challenges that governments and companies face when dealing with controversial content online. We recognize the legitimate interest that the German Government has in protecting the public. However, we are concerned that the rush to legislate, and to pressure companies under threat of fines to determine what is or is not illegal content, poses inadvertent but grave consequences for the right to freedom of expression in Germany, across the European Union, and worldwide. 

 

We urge the German Parliament to embrace its leadership role on human rights and the digital economy by rejecting the proposed legislation.

 

“The practical effect of this bill would be to outsource decisions on the balance between the fundamental right of freedom of expression and other legally protected rights to private companies,” said GNI Board Chair Mark Stephens, CBE.

 

“Companies facing the threat of multi-million euro fines will be compelled to broadly censor the internet, restricting the use of their services for any content that could be considered controversial,” he said.

 

This legislation has been described as a measure to combat hate speech and disinformation online, but its potential impact would be broader censorship of the internet. The impact of this bill would be to pressure private companies to take down any content that might run afoul of some 24 current provisions of the German Criminal Code – including offenses as varied as “defamation of the state and its symbols,” “anti-constitutional defamation of constitutional organs,” “defamation of religions, religious and ideological associations,” and “depictions of violence.”

 

Although aimed at social networks, a wide array of online platforms and services, from email providers to web hosting and remote storage providers, could be affected, given the broad definition of social networks in the bill.

 

GNI recognizes that governments may restrict freedom of expression in circumstances that are compatible with constitutional law, and provided that their laws and policies accord with the conditions in Article 19 of the International Covenant on Civil and Political Rights and Article 10 of the European Convention on Human Rights. The proposed legislation does not meet these required tests of necessity and proportionality, and it poses a threat to open and democratic discourse.

 

“The internet has enabled huge advances in free expression and economic growth in large part because private intermediaries, including social media platforms and internet service providers, are not required to monitor and control what people can say or share or do online,” said GNI Executive Director, Judith Lichtenberg.

 

“We encourage the German Government to embrace these norms and use other means of managing illegal content,” she said.

 

GNI recently released a policy brief, Extremist Content in the ICT Sector, with recommendations directed at both governments and companies on how to address extremist content online without harming freedom of expression and privacy. These recommendations have the support of our multi-stakeholder membership, and may also provide guidance for governments as they consider how best to protect freedom of expression when dealing with other types of controversial content.

 

For comment or further information on this release, please contact Kath Cummins, GNI Director of Communications and Outreach: kcummins@globalnetworkinitiative.org, or call +1 202 590 0837.

 

About the GNI

 

Founded in 2008, the Global Network Initiative is an international multi-stakeholder group of companies, civil society organizations (including human rights and press freedom groups), investors and academics, who have created a collaborative approach to protect and advance freedom of expression and privacy in the ICT sector. GNI has built a framework of principles and implementation guidelines based on international human rights standards, against which GNI member companies are independently assessed. Our membership collectively advocates with governments to protect and advance users' freedom of expression and privacy rights.

 

For more information on GNI’s members, the GNI Principles, and the GNI Independent Company Assessment process, visit our website: https://globalnetworkinitiative.org/

 

 


  FSM e.V. on 21-04-2017

  Bitkom e.V. on 20-04-2017

  BIU - Bundesverband Interaktive Unterhaltungssoftware on 13-04-2017

  Rainer Fislage on 08-04-2017

  eco Verband der Internetwirtschaft e.V. on 05-04-2017

  SRIW on 03-04-2017