
The Digital Services Act: the definitive guide

make internet great again

The Digital Services Act (“DSA”) has finally been published in the Official Journal of the European Union and will soon be applicable (see below). This new regulation substantially modifies and innovates the European digital framework in force for the last 20 years, especially with respect to the liability regime for information society service providers. Together with another important legislative innovation, the DMA (the Digital Markets Act, a regulation focusing on competition in digital markets), the DSA establishes a comprehensive set of rules that will apply across the whole EU and sets a potential global standard for online platform governance. This contribution aims to analyze the main innovations of the new regime as well as its elements of continuity with the old one.

The background of the reform

In the late ’90s, when the Internet was still unregulated, the European institutions decided to regulate the emerging digital economy via a specific act, the so-called E-commerce Directive (Directive 2000/31/EC, hereinafter: “Dir. 2000/31”). The purpose of this legislation was to create a harmonized framework for the Internet in the EU, however without excessive intrusion, and therefore a light-touch regulatory approach was preferred. Unlike telecom markets, where drastic rules were needed to liberalize former State-owned monopolies and open them to competition, the Internet still appeared to be an infant market in need of support rather than restrictive regulation. At the end of the ’90s online giants did not yet exist, and most of the intrusive data-based applications that are today creating concerns for society were not even foreseeable. Therefore, Dir. 2000/31 mostly aimed at harmonizing national legislations and overcoming administrative and regulatory barriers, while addressing the few business issues debated at that time, in particular the liability of Internet operators (both access and hosting providers) vis-à-vis illegal content: it was the time of peer-to-peer applications jeopardizing the consolidated business model of the traditional music industry, the latter believing that its world would never change once the Internet piracy issue was properly addressed.

This 2000 regulatory framework has proven to be successful and fit for purpose, so much so that digital services are no longer the niche, sophisticated functions they were at the beginning of the Internet age; on the contrary, the Internet has now become the core infrastructure of the modern economy and society. However, things have changed over the last 20 years: technology and digital markets have developed, new business models have emerged, and consolidated value chains and markets have been challenged and replaced. This development has also created conflicts with other legislation, including intellectual property, content and media, and sales and transportation, while revealing the need to properly protect traditional rights in the online sector. The digital transformation and the greater use of digital services have given rise to new risks and challenges for citizens, businesses and society as a whole.

Thus, also because of mounting pressure from civil society and politics, the European institutions decided, at the beginning of 2019, that it was time to update and clarify the regulatory environment of the Internet in Europe, in particular, but not only, for online platforms, social media, search engines and marketplaces.

The DSA, together with the aforementioned DMA as well as other acts concerning the data economy (especially the Data Act), is the main outcome of this modernization process. Facing the increasing complexity of the Internet economy and the dangers caused by market concentration, the DSA enacts a new set of rules which will force online platforms, in particular those managed by global champions such as Meta, Alphabet and Amazon, to be more accountable in proportion to their size, for instance by doing more to tackle the spread of illegal content and other societal risks on their services in the EU.

The DSA also constitutes a change of approach in addressing the issues stemming from the digital economy. The light-touch regime of 2000 incentivized tech companies to essentially regulate themselves, setting their own policies and frequently limiting their responsibility to the adoption of transparency reports about their efforts to address various harms. While confirming some incentives for self-regulation (see the codes of conduct under arts. 45 ff.), the DSA scales back this model by firmly holding online platforms accountable for the societal harms deriving from the use of their services, while forcing them to be more transparent about the way they work, for instance with respect to their algorithmic systems and internal procedures.

Main innovations

The general layout of Dir. 2000/31 is maintained, but new rules on transparency, disclosure obligations and accountability are introduced, largely by codifying the case law that has emerged over the years.

Scope, legal form and ambit of application 

The stated purpose of the DSA (art. 1) is to contribute “to the proper functioning of the internal market for intermediary services by setting out harmonised rules for a safe, predictable and trusted online environment, that facilitates innovation, where fundamental rights enshrined in the Charter, including the principle of consumer protection, are effectively protected”. By doing so, the DSA recognizes the potential impact of online activities upon individual rights, something that was not completely clear in the infancy of the Internet, when the new digital services seemed unable to produce material harm to society.

An important legal innovation is represented by the form chosen for the new legislation: a European regulation instead of a directive. As a consequence of this choice, the direct applicability of the DSA is expected to help overcome the divergences between the national transpositions of Dir. 2000/31, so as to end the fragmented implementation of the 2000 light-touch regime, which frequently required the interpretative intervention of the European Court of Justice to establish legal coherence throughout the European Union.

The new regulatory framework applies to “information society services” as defined in Directive (EU) 2015/1535, that is to say services provided at a distance by electronic means, normally for remuneration, and at the individual request of users. The definition covers a very large category of services, from simple websites to online intermediaries such as online platforms, cloud providers or internet access providers. Remarkably, the DSA has de facto extraterritorial effects, because it applies (art. 2) to any service offered to users in the EU, irrespective of the place of establishment of the provider (much like the GDPR). In addition, non-European providers without an establishment within the EU must designate, in writing, a legal representative in the Union (art. 13).

The liability regime of intermediary service providers

The traditional liability exemption regime set forth by Dir. 2000/31 is fundamentally maintained for providers carrying out mere conduit, caching and hosting activities (although it now needs to be coordinated with the new liability regime introduced by the Copyright Directive with respect to online content-sharing platforms). This exemption, mainly focusing on hosting providers (art. 6), provides that an information society service provider is not responsible for the information stored on behalf of a client, provided that: (i) it has no actual knowledge of illegal activity or illegal content and, with respect to claims for damages, has no knowledge of facts or circumstances from which the illegal activity or illegal content is evident; and (ii) upon obtaining such knowledge or awareness, it acts quickly to remove or disable access to the illegal content.

However, taking into account the case law developed over the last 20 years, the DSA updates the process by which digital service providers shall handle notices and information about the existence of illegal content on their servers, with the purpose of rapidly removing it. These new notification and content-removal rules (art. 16) are tailored to the kind and size of the provider concerned, following the scalar discipline described below.

Remarkably, the DSA introduces a sort of “Good Samaritan” clause (art. 7) clarifying that intermediaries are authorized to carry out voluntary own-initiative investigations in good faith, or other activities aimed at identifying and removing illegal content, without losing the benefit of the liability exemptions for that reason alone.

Finally, the absence of a general obligation to monitor user activities on the platform is maintained (art. 8), although some exceptions are introduced.

The scalar discipline

Perhaps the most significant novelty of the DSA is the introduction of a “scalar” discipline with four categories of providers and a progressive increase in obligations, proportional to the influence exerted, with responsibilities tailored to the intermediary by reason of its belonging to one or another of the specified categories. The categories are: intermediary services, hosting providers, online platforms (e.g. social media) and very large online platforms. In addition, it is worth noting that various burdensome rules of the DSA do not apply to intermediary services that qualify as micro or small enterprises under European law (see the Annex to Recommendation 2003/361/EC).

This scalar system has a twofold objective: on the one hand, it recognizes that the largest online platforms pose the greatest potential risks to society – such risks include negative effects on fundamental rights, political debate and elections, hate and violence, and public health, and must therefore be addressed with appropriate tools; on the other hand, it tries not to over-regulate smaller or emerging businesses. With regard to the global Internet giants, commonly referred to by the acronym GAFAM (to mention only the US ones), the DSA places the maximum burden of accountability on platforms with over 45 million users in the EU, for instance by requiring them to formally assess how their services, including via algorithmic systems, may increase these risks to society, and to take measurable steps to prevent them.
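To make the mechanics of this scalar logic concrete, the following minimal sketch (in Python) classifies a hypothetical provider into one of the four tiers. The category names and the 45-million-user threshold come from the DSA as described above; the function and field names are invented for illustration, and the real VLOP designation is, of course, a formal decision of the Commission rather than an automatic computation.

    # Toy classifier for the DSA's four-tier "scalar" discipline (illustrative only).
    from dataclasses import dataclass

    VLOP_THRESHOLD = 45_000_000  # average monthly active users in the EU (art. 33)

    @dataclass
    class Provider:
        is_hosting: bool              # stores information on behalf of users
        is_platform: bool             # disseminates stored information to the public
        eu_monthly_active_users: int  # hypothetical field for the example

    def dsa_tier(p: Provider) -> str:
        """Return the highest DSA tier the provider falls into (obligations are cumulative)."""
        if p.is_platform and p.eu_monthly_active_users >= VLOP_THRESHOLD:
            return "very large online platform (VLOP)"
        if p.is_platform:
            return "online platform"
        if p.is_hosting:
            return "hosting provider"
        return "intermediary service"

    print(dsa_tier(Provider(True, True, 60_000_000)))  # -> very large online platform (VLOP)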

General transparency and reporting requirements 

The DSA imposes on intermediary providers transparency and reporting obligations depending on their nature and size, as per the scalar system described above (although with specific exemptions provided for micro or small enterprises).

In general, intermediary providers shall provide terms and conditions set out in clear, plain, intelligible, user-friendly and unambiguous language, to be made publicly available in an easily accessible and machine-readable format (art. 14 §1). Such terms and conditions shall also include information about the content moderation policy, including algorithmic decision-making and human review, as well as the internal complaint-handling system. Remarkably, when an intermediary service is normally used by minors, the related terms and conditions shall be drafted in a way that minors can understand (art. 14 §3).

Intermediary providers are also required to publish annual reports on their content moderation activities (art. 15), including the number of orders and notices received from Member States' authorities or from vetted professional entities (so-called “trusted flaggers”, art. 22) to take down illegal content, as well as the volume of complaints from users and how these were handled. Very large online platforms are subject to stricter and more detailed reporting requirements.

Content moderation

Content moderation is an area where platforms have traditionally tried to escape mandatory obligations by proposing codes of conduct and self-regulation, also contemplating the use of innovative technology (AI). However, this model resulted in increasing frustration amongst users faced with an impenetrable, opaque and unpredictable mechanism in which human feedback was quite rare. The DSA overcomes such practices by requiring platforms to clearly describe, in their terms of service and in their transparency reporting, how their content moderation system works, including the use of automated systems and what their accuracy and possible error rate could be.

Intermediary service providers shall set up fair and non-discriminatory internal complaint-handling procedures to deal with users' complaints (art. 20), so as to provide users with detailed explanations in case of blocked accounts or removed or demoted content. In case of disagreement, users must be entitled to out-of-court dispute settlement (art. 21), in addition to appeals before the ordinary courts.

Transparency of online advertising, and limitations on recommender systems

The DSA provides new rules on the transparency of online advertising (art. 26), with the purpose of enabling users to recognize advertisements easily and to make purchasing decisions on the basis of appropriate and clear information. Online platforms must also give users clear information about the targeting practices concerning them and about how to change ad-targeting parameters.

Where recommender systems are used (art. 27), online platforms must clearly explain, in their terms of service, how their algorithmic recommender systems work. In addition, they must offer users at least one option for an alternative recommender system (or “feed”) not based on profiling.
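What such a non-profiled option might look like in practice can be illustrated with a minimal sketch: a feed-ranking function that falls back to a simple reverse-chronological ordering when the user opts out of profiling. The chronological fallback and all names below are hypothetical design choices for the example, not requirements of the DSA.

    # Illustrative sketch of an art. 27-style choice between a profiled feed
    # and a non-profiled (here: reverse-chronological) feed. All names are invented.
    from typing import Dict, List, Optional

    def rank_feed(items: List[Dict], profile: Optional[Dict[str, float]],
                  use_profiling: bool) -> List[Dict]:
        if not use_profiling or profile is None:
            # Non-profiled option: order items by recency only.
            return sorted(items, key=lambda i: i["posted_at"], reverse=True)
        # Profiled option: score items against the user's inferred interests (toy model).
        return sorted(items, key=lambda i: sum(profile.get(t, 0.0) for t in i["tags"]),
                      reverse=True)

    items = [{"id": 1, "posted_at": 100, "tags": ["sports"]},
             {"id": 2, "posted_at": 200, "tags": ["politics"]}]
    # With profiling off, the newest item comes first regardless of inferred interests:
    print([i["id"] for i in rank_feed(items, {"sports": 1.0}, use_profiling=False)])  # [2, 1]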

The DSA establishes a total ban on targeted advertising directed at minors (art. 28).

Dark patterns, that is to say design practices that may deceive or manipulate users, are restricted (art. 25).

Specific regime for very large online platforms

The DSA sets more stringent rules for online platforms and online search engines which have at least 45 million monthly active users on average and which are formally designated by the European Commission as very large online platforms (“VLOPs”) or very large online search engines (art. 33). These rules create an ad hoc regulatory regime which constitutes the peak of the scalar discipline of the DSA. The concept of “active user” is based on commercial practices, although some criteria are indicated in recital 77. The Commission, however, reserves the option to enact an ad hoc implementing act (art. 24 §2) in order to clarify its interpretation.

This more stringent set of rules provides that very large online platforms (and search engines) shall, inter alia:

  • carry out a risk assessment concerning systemic risks stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services (art. 34);
  • put in place reasonable, proportionate and effective mitigation measures with respect to the specific systemic risks relating to their services (art. 35);
  • be required by the Commission to take appropriate measures to respond to specific situations of crisis (art. 36);
  • be subject to specific independent audits (art. 37). In practice, these platforms will be forced to share their internal data with independent auditors and with EU and Member State authorities, in order to verify their self-assessments and risk-mitigation efforts;
  • be subject to more stringent requirements with respect to recommender systems (art. 38), online advertising (art. 39), compliance (art. 41) and reporting (art. 42);
  • be subject to mandatory data access and external scrutiny (art. 40). Access may also be granted to researchers from academia and civil society, who may examine these data and findings in order to help identify systemic risks and hold platforms accountable for their obligations.

The European Commission expects online platforms to start publishing their numbers of active users by February 2023 and on a regular basis thereafter. Based on these data, starting from March or April 2023 the platforms with more than 45 million users in the EU will be designated by the Commission as VLOPs, meaning that they will have to comply with the stricter regulatory regime described above.

The new institutional regime and the fines

Another noteworthy innovation is the creation of new national bodies responsible for supervising the application of the DSA. This body is called the Digital Services Coordinator and will be responsible, in each Member State, for supervising the application of the DSA with respect to the smaller intermediaries that have their main establishment in that Member State; it will have investigative, enforcement (including interim measures) and sanctioning powers. 17 February 2024 is the deadline for the Member States to appoint their Digital Services Coordinators.

The maximum fine that may be imposed for a failure to comply with an obligation laid down by the DSA shall be 6% of the annual worldwide turnover of the provider concerned. In case of the supply of incorrect, incomplete or misleading information, failure to reply to or rectify incorrect, incomplete or misleading information, or failure to submit to an inspection, the maximum fine shall be 1% of the annual income or worldwide turnover (art. 52 §3).

The Commission will be authorized to charge supervisory fees (art. 43) to platforms, based on the estimated annual costs, to help finance its own enforcement tasks (as frequently happens in several Member States with respect to regulated markets). In a delegated act, the Commission will detail the methodology for estimating the costs, how the fees are determined, the applicable thresholds and the details of the payment arrangements. Such fees will be proportionate to each provider, depending on the size of the platform in terms of average monthly users, and should not exceed a cap of 0.05% of the online platform's worldwide annual income.
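As a back-of-the-envelope illustration of these monetary caps, the short sketch below applies the 6%, 1% and 0.05% percentages mentioned above to a hypothetical provider. The EUR 10 billion turnover figure is invented for the example, and for simplicity the 0.05% fee cap is applied to the same base as the fines.

    # Worked example of the DSA's monetary caps for a hypothetical provider
    # with EUR 10 billion of annual worldwide turnover (figure invented).
    turnover = 10_000_000_000  # EUR

    max_fine_breach = 0.06 * turnover        # art. 52: cap for breaching a DSA obligation
    max_fine_bad_info = 0.01 * turnover      # art. 52 §3: cap for incorrect/incomplete info
    max_supervisory_fee = 0.0005 * turnover  # art. 43: cap on the annual supervisory fee

    print(f"Max fine for a DSA breach:   EUR {max_fine_breach:,.0f}")      # EUR 600,000,000
    print(f"Max fine for incorrect info: EUR {max_fine_bad_info:,.0f}")    # EUR 100,000,000
    print(f"Max annual supervisory fee:  EUR {max_supervisory_fee:,.0f}")  # EUR 5,000,000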

Entry into force and application

The DSA will enter into force 20 days after its publication in the Official Journal. It will be directly applicable across the EU and will apply from 17 February 2024, with the exception (art. 92) of the obligations for very large online platforms, including search engines, which will apply as early as four months after their designation. Some minor provisions will apply immediately from 16 November 2022 (see art. 93 §2).

In the meantime, EU countries and the Commission need to create the necessary capacities and human resources to adequately implement the DSA. While each Member State will have to designate and empower its Digital Services Coordinator, the Commission will have to adjust its current organization, especially the offices of DG Connect, to meet the objectives of the Digital Services Act. In addition, the Commission has announced that it will develop a “high-profile European Centre for Algorithmic Transparency” to support its enforcement efforts.
