European plans to regulate internet will have major impacts on civic space at home and abroad

A Romanian man passes in front of the window of an Orange store displaying a huge 5G advertisement banner in Bucharest, Romania. EFE/EPA/ROBERT GHEMENT

In Hungary, Prime Minister Viktor Orban’s government wants people to believe that it is a crime to provide humanitarian assistance to asylum-seekers, refugees, and migrants. In Poland, Prime Minister Mateusz Morawiecki wants people to believe that LGBTI rights are “an ideology”, and a forbidden one at that. These are the latest worrying echoes of Orwell’s “Ministry of Truth”, the dystopian depiction of a government distorting the truth in the service of self-serving propaganda.

Meanwhile, the European Union has proposed a draft law, dubbed by some a “new constitution” for the internet: the Digital Services Act. This ambitious new law seeks to bring some much-needed regulation to the digital world.

On one hand, it presents a unique opportunity to tackle the opaque algorithms and recommender systems of online platforms, which operate in ways that can disproportionately harm vulnerable and at-risk groups, as well as those who work to protect them.

On the other hand, the current draft risks giving repressive governments the opportunity to digitalise their “Ministries of Truth” by empowering them to suppress speech, including by silencing and undermining the crucial work of civil society. The General Data Protection Regulation (GDPR) demonstrated the EU’s normative power and its international reach: the bloc’s data privacy legislation has been copied by many other countries the world over. Getting this law right or wrong will have implications not only for Europe but for the entire global online civic space.

A vibrant and critical civil society is a prerequisite for a strong and resilient democracy. The pandemic has fast-forwarded digitalisation, meaning that across the globe hundreds of thousands of people are now organising online to fight racism and to protect the planet. However, there has also been a backlash against these demands for societal change and against the power of online organising. Civil society actors across Europe have witnessed online smear campaigns and the stigmatisation of their organisations and staff, as well as personal attacks on those working on the frontlines to protect the rights of others. Already in 2019, the EU Fundamental Rights Agency report on civic space in the EU found that three of the four most common threats and attacks on civil society actors took place online.


Research from the Center for Democracy & Technology has shown that disinformation campaigns are frequently and deliberately designed as a tool to promote racist and misogynistic content. In Europe, online disinformation campaigns on migration have been swirling for years. Shockingly, some of these campaigns are state-supported and push the dangerous lie that humanitarian assistance for vulnerable people is a criminal activity. The link between shrinking civic space and attacks on minority rights is undeniable: the NGOs and volunteers assisting vulnerable groups have also become targets.

Targeting minorities or human rights defenders is nothing new, but the scale of these attacks in Europe is novel, as is the brutal efficacy with which they can be executed on online platforms by profiling users’ personal data and exploiting their prejudices. But the EU cannot “go low” in its response. Adopting the role of arbiter of truth, or asking private companies to do so, is a slippery Orwellian slope.

The Digital Services Act does offer solutions. It proposes mandatory transparency for social media companies’ algorithms and recommender systems, which could allow such algorithms to be subject to audits. If done properly, this could help better enforce the bloc’s data protection rules and, with that, reduce the amplification of campaigns that target vulnerable groups and the civil society organisations that work to protect them.

Beyond the factors driving amplification of content, the Digital Services Act also seeks to bring more transparency and accountability to the management of user-generated content. It proposes welcome ideas for transparency for all users about why their content was removed and avenues to remedy and appeal for erroneous removals.

EU lawmakers will, however, need to reflect more broadly on how to incorporate rule of law safeguards around the legality of speech to avoid creating a digital Ministry of Truth. Globally, we see the unfortunate trend of governments waking up to the power of online spaces and attempting to quash them by adopting policies and laws that criminalise online dissent and expression, such as in Thailand. The EU should be wary of this reality and ensure that its newly proposed law does not add to the arsenal of governments in the EU, nor inspire those outside it, to silence vital work that journalists and human rights defenders do to protect democracies. 

Human rights law can help guide these considerations, especially as it calls to avoid delegating decisions on the legality of speech either to government agencies or to companies. Judicial authorities alone should remain the arbiters of lawful expression. 

The DSA misses this mark in ways that could have serious implications for civic space in the region and beyond. For example, the draft regulation mandates the use of so-called “trusted flaggers” to give notice to a social media company about illegal content online. A trusted flagger could be a law enforcement authority or any other government agency, and their notifications would be tantamount to an order to remove the content or face significant legal risk.

The law as drafted would also create the role of Digital Services Coordinator, another type of state authority, which would also have the power to order the removal of “illegal content”. We can imagine how this might play out in states where civic space and the rule of law are already under pressure. How would a government agency in Poland treat the online content of an LGBTI activist? Or how would the online speech of those standing up for refugee rights be handled in Hungary?

Perhaps most controversially, the draft law puts the responsibility on social media companies to decide whether a piece of content is illegal or not. This opens yet another avenue for states to circumvent the usual safeguards by pressuring companies to remove content. The pressure to remove ill-defined illegal content also creates an incentive for companies to overcompensate and err on the side of removal to keep governments happy. It also betrays the legislators’ goal of limiting corporate power over public discourse by formally assigning companies a role in deciding the legality of our speech.

Governments effectively create Orwellian Ministries of Truth when they perpetuate disinformation or, worse, adopt laws that criminalise and curtail the legitimate human rights work of civil society actors. This stifling of freedom of association and expression has serious negative consequences for civic space and democracy. Online platforms can also play this role when they use models that amplify disinformation, or when they grant or deny NGOs the ability to participate in online expression and organisation.

The stakes are high, and the decisions the EU makes on the direction of this new law will have a global impact. Whilst the draft law introduces many welcome ideas on transparency and redress, its greatest weakness is its failure to ensure that decisions on the legality of speech remain the sole purview of independent and impartial judicial authorities. Mandating private companies and state authorities to usurp this role could have a devastating impact on an already strained civic space. A robust participatory process and discussion in advance of the adoption of the law will be vital to ensure that courts are empowered and mandated by ministries of “Justice” rather than “Truth”.