EU Regulation Update: The Digital Services Act (DSA) Now in Force


Written by Mindy Nunez Duffourc, Johanna Gunawan and Caroline Cauffman.

The Digital Services Act (DSA), which became fully applicable on 17 February 2024, governs (online) intermediary services, such as social network platforms and online marketplaces. The DSA seeks to harmonise the rules for a ‘safe, predictable and trusted online environment’ (Art. 1) for all consumers in the EU, regardless of where the intermediary service providers are established (Art. 2).

Information Inroads for Consumers in the Digital Marketplace?

The DSA advances consumer protection in the digital marketplace by requiring more transparency about the existence and enforcement of rules governing online content. This includes clear explanations of content restrictions (Art. 14) and publication of content moderation activities, including details about automated content moderation practices and decisions (Art. 15). For services that host user-generated online content, the DSA requires a reporting mechanism to help identify potentially illegal content (Art. 16). Conversely, if a hosting service takes an adverse action against a recipient for providing illegal content, the service must provide a detailed explanation of the reasons for this action and any means of redress (Art. 17).

The DSA also provides some protection of consumer autonomy by prohibiting deceptive design (Art. 25), advertising (Art. 26), and recommender system (Art. 27) practices that seek to influence consumers’ choices, with particular protections for minors (Art. 28). Relatedly, the DSA seeks to protect consumers from entering into online contracts for poor-quality, dangerous, or illegal products or services by requiring online platforms that facilitate the conclusion of such contracts to ensure that consumers have sufficient information about the products and services and their traders (Art. 31). If a provider of an online platform discovers that a consumer purchased an illegal product or service, it must notify the consumer and provide information about the trader and any means of redress that the consumer may have (Art. 32).

A Bold Stance against “Dark Patterns”

Recital 67 of the DSA articulates a robust definition of “dark patterns”, referring to design practices that impair users’ autonomy and steer them towards unintended outcomes or decisions. How the DSA affects dark patterns enforcement and regulation remains to be seen, but the prohibition in Article 25, as elaborated by Recital 67, is an exciting development against such practices.

The guidance outlined by Recital 67 also includes explicit references to types of dark patterns, with examples. This list is non-exhaustive, and platforms still lack decisive design regulations prohibiting many deceptive behaviors. However, the clear stance against dark pattern designs may support enforcement decisions by courts or national consumer protection agencies, or otherwise lend credence to dark patterns cases. As EU member states increasingly combat dark patterns (see cases led by the French data protection authority, CNIL, as well as the Dutch Authority for Consumers and Markets, against platforms like Google, Meta, and Epic Games), a central definition and a clear intention to regulate against dark patterns facilitate national efforts.

For researchers in human-computer interaction and legal studies, this presents a unique longitudinal opportunity to examine compliance with the DSA and any reduction in dark pattern practices in the coming years.

Accountability Dead Ends for Hosting Services in the Digital Marketplace?

It is disappointing that the DSA retained the E-Commerce Directive’s ‘hosting exemption’, which exempts hosting service providers, in certain circumstances, from liability for damages caused by information they store at the request of a recipient of the service. A hosting service provider can take advantage of this exemption as long as it (1) was not aware of illegal activity or information, or of circumstances that would have made such illegal activity or information ‘apparent’, and (2) acted to remove the illegal content upon becoming aware of it. Notably, hosting service providers are under no obligation to actively discover illegal content stored on their services.

Fortunately, the hosting exemption is subject to a few exceptions. Firstly, it does not apply when the recipient of the service posting the illegal content is acting under the hosting provider’s authority or control. Secondly, online platforms that allow consumers to conclude online contracts with businesses cannot escape liability under consumer protection law if they mislead an average consumer into believing that the platform itself or someone acting under its authority or control is actually providing the products or services to the consumer.

In conclusion, although the DSA makes some progress toward protecting EU consumers from dangerous and deceptive online content and should be applauded for its attention to dark patterns, it still leaves consumers bearing significant risks associated with digital services, particularly when third parties publish harmful information on hosting services.

Tags:
M-EPLI
