The European Union’s Digital Services Act: A New Era for the Internet?

Written by Onur Bakiner, Faculty Fellow and Associate Professor of Political Science
June 14, 2022

On April 23, 2022, the trifecta of European Union (EU) institutions, namely the European Commission, the Council of the European Union and the European Parliament, reached an agreement on the text of the Digital Services Act (DSA), proposed by the Commission in December 2020 to replace the E-Commerce Directive of 2000. The DSA’s goal is to get online platforms to regulate illegal content, hate speech, disinformation, and false advertising more effectively, or, put simply, to make “what is illegal offline illegal online.” Asha Allen, Advocacy Director for Europe, Online Expression & Civic Space at the Centre for Democracy & Technology, summarizes its objectives as “increased algorithmic and content moderation transparency, extensive due diligence obligations, and risk assessment and mitigation measures.”

The DSA marks what one commentator calls a shift away from the 20-year paradigm of online self-regulation. More precisely, it asks companies to become much more proactive about understanding, explaining and mitigating the risks generated by their business decisions, and it threatens penalties if they fail to do so. Daphne Keller, a preeminent expert in intermediary liability, argues that Europe’s model of knowledge-based platform liability for illegal content has been in place for decades, but that the Act adds new processes and regulatory rules. Combined with the General Data Protection Regulation (GDPR), which came into effect in 2018, the Digital Markets Act, a bill in the final stages of the legislative process that seeks to bolster competition in a digital market dominated by a few powerful companies, and the proposed Artificial Intelligence Act, the DSA promises to overhaul the regulation of almost everything online and digital.

Content of the DSA

The definitive version of the proposed act has yet to be released. The proposal stipulates that it will apply to all intermediary digital service providers, such as online marketplaces, social networks, app stores and content-sharing platforms, but it disaggregates risk according to the service provided, the size of the platform and the nature of the content. Intermediary services are divided into “mere conduit”, “caching” and “hosting” services. The first two categories are exempt from liability under certain conditions, while hosting service providers, and especially online platforms, are exempt under a more limited set of circumstances. Very large online platforms (VLOPs), i.e., platforms with over 45 million monthly active users in the EU (roughly 10 percent of the Union’s population), receive greater scrutiny.

Journalists report that the nature of the illegal content also shapes the severity of the regulatory response: while platforms should take down certain forms of cyber-crime like revenge pornography “immediately” (it should be noted that stronger measures to eliminate image-based sexual abuse were dropped from the DSA text at the last stage of negotiations), other illegal content should be removed swiftly. The DSA bans three practices: (1) targeted advertising based on “sensitive categories” such as religion, sexual orientation, political affiliation, ethnicity, trade union membership, and health condition; (2) targeted advertising aimed at minors; and (3) “dark patterns”, i.e., confusing or deceptive user interfaces that trick people into unintentionally buying or signing up for products and services.

The DSA operationalizes the terms “accountability” and “transparency”, two staples of the discourse on AI ethics. Accordingly, the European Commission and the member states should be allowed to access a platform’s algorithms. Independent researchers will also have access to large platforms’ data in order to assess risks, a provision that opens the door to greater scrutiny by vetted academics and civil society actors. What is more, individual users should have the option of understanding how algorithms profile them and why they see the advertisements they do, as well as the choice not to be profiled at all. Companies are also asked to disclose how much illegal content they remove. In addition to these accountability and transparency requirements, the Act prescribes an appeals mechanism: users of online platforms are granted the right to appeal takedown decisions and to seek compensation for any loss or damages resulting from platforms’ non-compliance.

Part of the risk assessment process concerns what happens inside the company. All online platforms that fall under the jurisdiction of the DSA must publish annual reports detailing their content-moderation activities. VLOPs also have to assess risks to democracy, human rights, and the well-being of minors – an important addition that requires companies to go beyond assessing what is illegal to assessing what may be legal but still harmful. Failure to comply with the DSA may result in fines as high as 6 percent of a company’s global annual turnover (for a firm with, say, €100 billion in annual turnover, that ceiling would be €6 billion), and repeat offenders may face periodic penalties.

[Image: EU digital regulation sign with European map. Image credit: Pixabay]

Enforcement of the Act

The enforcement of the DSA is premised on the EU’s peculiar mixture of supranational governance and deference to national governments. The European Board for Digital Services will undertake the enforcement work for matters concerning large companies, but every member state will also set up a Digital Services Coordinator. The idea is to incentivize out-of-court settlement by positioning these member-state coordinators and certified independent agencies between platforms and potential complainants. In addition, online platforms are expected to establish a single point of contact in the EU, and platforms based outside the EU must appoint a legal representative there.

The DSA also stipulates crisis protocols, widely cited as having been included in the context of the Russian invasion of Ukraine. These protocols can be activated by the European Commission to impose stricter-than-usual regulation on large platforms, potentially forcing them to take down misleading content during acts of war or terrorism, pandemics, and natural disasters.

Criticisms and Implications

As with other EU legislation, how its key terms are operationalized will be crucial to the DSA’s success. A law that seeks to eliminate hate speech, disinformation, illegal and otherwise harmful content, terrorism propaganda, and counterfeit products is likely to draw criticism for both overreach and underreach: some member states’ definitions of what is illegal are unnecessarily restrictive, while limiting content removal to strictly illegal material will leave a great deal of harmful online content intact.

There are other concerns. The language on targeted advertising has been criticized for failing to cover behavioral advertising that uses sensitive data. Another area of controversy is a “trade secrets” exception that platforms may invoke to shield their data from scrutiny by independent researchers. Finally, the crisis mechanism, akin to a state of emergency that can be declared unilaterally by the European Commission, is likely to generate further dissent.

The DSA will enter into force once the European Parliament and the Council formally adopt the agreed text, but the grace period for platforms to comply with its stipulations extends to January 1, 2024 at the earliest. The news of the agreement on the DSA was greeted either with reference to Elon Musk’s Twitter bid, which took place at almost the same time, or with commentary on the legislative impasse around the regulation of online platforms in the United States. For Big Tech, the adoption of the European model elsewhere risks loss of revenue, as further restrictions on targeted ads spread to other markets.

Calls to make social media data available to researchers have been gaining momentum. As the global regulatory environment has shifted, US-based Big Tech companies have gone on the defensive: they reportedly lobbied European lawmakers to water down the DSA’s provisions, especially those limiting targeted advertising and granting data access to researchers. That lobbying successfully warded off a total ban on tracking-based advertising.

Even though much about the DSA is still up in the air, any intermediary service provider with operations in the EU should prepare for a changing online landscape in 2024. What is more, the EU’s self-positioning as the first mover in the regulation of online and digital content appears to have prompted U.S. lawmakers, who have begun to introduce similar legislative proposals: the Platform Accountability and Transparency Act, the Digital Services Oversight and Safety Act and the Digital Platform Commission Act were introduced in December 2021, February 2022 and May 2022, respectively. Whether and when these proposals will be deliberated or voted on remains an open question, but the possibility of a more coherent regulatory landscape on both sides of the Atlantic can no longer be entirely dismissed.

