Can Germany's ‘Lex Facebook’ Be Saved? A Business and Human Rights Analysis

Germany's NetzDG forces companies to police hate speech or face astronomical fines of up to 50 million euros for persistent violations. Is this a good way to solve the problem?

Facebook CEO Mark Zuckerberg on stage at Facebook's F8 conference. Photo by pestoverde via Flickr (CC BY 2.0)

In an attempt to curb hate speech and racial discrimination online, German lawmakers built one of the most controversial laws regulating online platforms in the EU in 2017.

Commonly known as “NetzDG”, Germany’s Netzwerkdurchsetzungsgesetz (Network Enforcement Law) requires large social media companies to proactively enforce German speech laws on their platforms. This has been met with a storm of criticism both at home and abroad.

The law imposes fines on social networks if they fail to remove “manifestly unlawful” content within 24 hours of receiving a complaint about such content, and gives companies up to seven days to consider the removal of more ambiguous material. Since Germany’s criminal code already defines hate speech, the law does not create new measures or definitions beyond requiring large social networks to answer legal summons related to the law through a German address.

Instead, it forces companies to police hate speech or face astronomical fines of up to 50 million euros for persistent violations. While this increases pressure on the companies to respond, it also forces them to decide what is — and is not — hate speech. The short time frame in which the law expects companies to remove hate speech could easily lead them to err on the side of automated censorship, in an effort to avoid steep fines.

NetzDG is rooted in Europe’s current political moment, much more so than in long-standing tensions over the power and rights of social media companies when it comes to hosting online speech.

During Germany’s 2017 election season, fears of the right-wing Alternative for Germany (AfD) winning seats in parliament led to a cacophony of voices pressuring legislators to prevent hate speech on social media, with Facebook being a primary target. This largely drowned out voices that argued for a measured approach that would require greater transparency and accountability from social media companies, in an effort to protect user rights and interests.

While all social media companies must comply with the law, it is understood among policy experts that this “lex Facebook” was always meant to bring one specific company to heel.

What we now have in Germany is a law that demands respect and disproportionate attention from Facebook without actually forcing the company to reform its practices in the public interest.

Every time legislators feel pressure to systematically mitigate harmful speech online, they seem to transfer more and more of the actual responsibility of regulating content to private companies themselves. The end result is that Facebook is required to decide more — not less — about what types of content should stay online.

And while the law's implementation technically remains within German borders, its effects can be felt worldwide. Can the law be repaired sufficiently to prevent further harm to human rights online? Or should it be scrapped completely?

Better governance? Hard to say.

In practice, the NetzDG creates a slightly different user experience for German Facebookers who encounter speech that they think is illegal. When users seek to flag potentially illegal content, they are led to a separate complaint page through which they can ask Facebook to take down content under the new NetzDG law. The page is completely separate from the existing mechanisms for reporting harmful content that Facebook offers users in the rest of the world. It also seems to have a separate escalation mechanism behind it.

Screenshot of NetzDG complaint page by author.

Most of what the NetzDG law does is refer to existing German legal norms on illegal content. Importantly, it also attaches tight time limits and fines to fulfilling requests for the company to remove illegal content.

Yet we still know very little about how Facebook actually determines what content abides by German law and what does not.

Will content flagged on Facebook be evaluated by a human being or an algorithm? If it is seen by a human being, does that person live in San Francisco, the Philippines or Dublin? What languages does that person speak? Does the person know the cultural context of each post flagged for review? Does that person have any legal training? Who makes the ultimate decision in the content review process?

We don’t know, because Facebook does not make this information public. The process could vary dramatically, depending on the answers to these questions.

Transparency, but not enough

One brighter element of the NetzDG is a requirement that companies publish periodic, detailed transparency reports about the results of the law’s implementation. However, the categories of data that the law prescribes for these reports do not offer a meaningful understanding of how Facebook makes decisions. The required transparency extends only to the implementation of the law itself and does not cover all content removal decisions made by Facebook. Decisions made within the scope of the company's Terms of Service are completely separate — the law requires no transparency about them.

Moreover, the NetzDG requires a type of transparency from Facebook that the German government itself does not provide for its own institutions. It is impossible to get a national overview of all content removal or data requests made by German public authorities, let alone an agency-by-agency breakdown of these records. German companies such as the Posteo email platform suggest that a large proportion of the government requests they receive for user information are illegal under German law. Could this mean that a similar proportion of content removal requests are also illegal? Publicly-accessible data on such requests could help to answer this question.

And even where the law does mandate disclosure, it says almost nothing about the form that disclosure should take.

Anti-AfD demonstration in Cologne, Germany, April 2017. Photo by Elke Wetzig via Wikimedia (CC BY-SA 4.0)

Finally, the law misses the opportunity to define a common standard for the disclosure of data that would allow it to be meaningfully searched and cross-referenced between private companies. This means that every company will provide the information in a different format and with different metrics, making meaningful comparisons — for example, between Facebook and Twitter — essentially impossible. Private companies could and should have developed such common disclosure standards jointly long ago, but since they show little interest in making their data meaningfully comparable, civil society organisations are left to step in and define appropriate standards.
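
To make this concrete, here is a minimal sketch of what a shared, machine-readable record format for takedown decisions could look like. Everything in it is hypothetical: the field names are not drawn from the law or from any company's actual reports, and simply illustrate the kind of common schema the law fails to require.

```python
import json
from dataclasses import dataclass, asdict

# A hypothetical record for a single NetzDG takedown decision. None of
# these field names come from the law or from any real report; they
# illustrate the kind of shared schema a common standard could define.
@dataclass
class TakedownRecord:
    platform: str        # e.g. "Facebook" or "Twitter"
    legal_basis: str     # criminal code provision cited, e.g. "StGB 130"
    complaint_date: str  # ISO 8601 date the complaint was received
    decision_date: str   # ISO 8601 date the decision was made
    removed: bool        # whether the content was taken down
    reviewer_type: str   # "automated", "in-house" or "outsourced"

record = TakedownRecord(
    platform="Facebook",
    legal_basis="StGB 130",
    complaint_date="2018-03-01",
    decision_date="2018-03-02",
    removed=True,
    reviewer_type="in-house",
)

# If every platform published the same fields, reports would become
# machine-comparable rather than incompatible tables in PDF files.
print(json.dumps(asdict(record)))
```

With a format like this, comparing removal volumes, response times or reviewer practices across Facebook, Twitter and smaller platforms would be a matter of a few lines of code rather than manual transcription.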

These facts together make it impossible for independent third parties to cross-reference public sector transparency reports with private sector ones. It’s not rocket science to have both sectors provide transparency in a manner that makes their statements independently verifiable, and thus far more accountable than statements made by any one actor alone. But to the detriment of the public interest, the law does not go this far.

Ignoring the failures of self-regulation

Perhaps most absurdly, while much of the public debate on the law was filled with reports of how Facebook's existing self-regulatory regime wasn’t working, very few components of the NetzDG actually improve it.

This means it is still perfectly possible for Facebook to take down images of breast-feeding mothers or non-violent political speech that it deems offensive, while leaving incitement to violence on the platform completely untouched.

Higher minimum standards for self-regulation would go a long way towards answering the many challenging questions about content regulation on Facebook, a process about which we still know almost nothing.

For example, requiring a minimum level of training in German law for the individuals making decisions about content would help ensure that those decisions actually respect German law.

A downward spiral of deregulation

The author of the NetzDG law at the German Ministry of Justice noted: “with power and control comes responsibility.”

But the unique shape and scope of social media companies' influence has left legislators in a regulatory paradox. As more and more of them focus on “doing something about Facebook”, they find themselves in a downward spiral of deregulation. Perhaps vexed by the unique characteristics of social media companies, legislators are dumping decision-making power and responsibility onto the companies in a manner that is neither helpful nor effective in resolving the problem. In doing so, they are privatising decisions that should be made by independent authorities. Each regulatory invention by legislators wanting to do the right thing just leaves them trapped in the same paradigm: transferring ever more responsibility to Facebook, Google and the many other (mostly smaller) players on the field.

Public sector governance failing equally

Moreover, there are other areas in which the public sector could play a meaningful role in mitigating hate speech on social media. If judicial decisions were easily accessible, affordable and swift, users facing harassment or threats of violence online could actually turn to law enforcement for help and protection. Rather than bringing in Facebook as a privatised police force, the law could instead shift the burden of decision-making to public actors. Of course, this would also require them to acknowledge that improvements are necessary to ensure swift, effective and impartial public decision-making, within the existing constitutional bounds of German law and existing international standards.

This is not impossible.

The Netherlands' police service employs a special centralised unit that is set up to process all takedown orders and data requests sent to online service providers, in an effort to ensure that they are legally sound. Such a model, combined with additional judicial oversight, could be a start for German legislators to show they are serious about fixing the problem themselves, rather than expecting Facebook to do it for them. By offloading responsibility to Facebook, the public sector doesn’t just fail to solve the problem — it becomes actively complicit in making it worse.

Can we fix a broken law?

So what does this mean in practice? Can the law be fixed, and if so, how? Broadly speaking, legislators could improve the NetzDG by reducing the amount of regulatory power it gives to private companies, and by ensuring instead that the public sector takes greater responsibility and civil society plays a greater role.

They could abandon the 24-hour window for content removal and replace this with a more limited solution in which Facebook would hand over all cases – whether manifestly illegal or not – to a German self-regulatory body, which would then have more time to consider them.

They could invest in the German legal system beyond requiring a local address for Facebook, making the judicial system more accessible and ensuring that existing government decisions are more transparent and accountable.

An amended NetzDG also could set and enforce an open standard for its already-required transparency reports. A successful open standard would allow citizens to cross-reference data between public and private entities, and use these mechanisms to hold companies and the government to account for their responsibilities.
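
As a rough illustration of what that cross-referencing could enable, the sketch below compares invented company and government figures keyed to the same hypothetical "legal_basis" field used above; none of the numbers or structures reflect real reports.

```python
# Invented records for illustration only, using the hypothetical
# shared schema sketched earlier.
company_report = [
    {"legal_basis": "StGB 130", "removed": True},
    {"legal_basis": "StGB 130", "removed": True},
    {"legal_basis": "StGB 86a", "removed": False},
]
government_report = [
    {"legal_basis": "StGB 130", "removed": True},
]

def removals_by_basis(records):
    """Count confirmed takedowns per criminal code provision."""
    counts = {}
    for r in records:
        if r["removed"]:
            counts[r["legal_basis"]] = counts.get(r["legal_basis"], 0) + 1
    return counts

company = removals_by_basis(company_report)
government = removals_by_basis(government_report)

# Surface provisions where the two sides report different volumes,
# exactly the kind of discrepancy independent auditors could pursue.
for basis in sorted(set(company) | set(government)):
    print(basis, "company:", company.get(basis, 0),
          "government:", government.get(basis, 0))
```

Without a shared standard, even a comparison this simple means hand-copying numbers out of incompatible reports.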

And at a higher level, the law could set a process in motion to begin an open public debate, incorporating all relevant stakeholders, that would determine minimum standards of self-regulation for online platforms in line with human rights and international standards.

Towards a less awful NetzDG

These steps might turn a heavily criticised law into a slightly less problematic legal text. A revised NetzDG that conforms with international human rights standards, and that takes the opinions of international legal experts into account while being drafted, could help reduce some of the damage the law has done internationally. The German government needs to ask itself why a carbon copy of the NetzDG is currently being considered by the Russian Duma, and what this means for its obligations to uphold and protect human rights.

In order to develop a less awful NetzDG, legislators need to first acknowledge and then break out of the (de-)regulatory spiral they are currently trapped in. Business as usual is not an effective way to regulate large private intermediaries. Focusing on basic human rights norms and standards is.

Originally published in Global Voices.
