What platforms need to know about the Digital Services Act (DSA): Part One, The Scope

WRITTEN BY Pex Team
Nov 8, 2022

At Pex, we have extensively covered the adoption of the European Union’s Copyright Directive, especially its Article 17, to help educate platforms, rightsholders, and creators on their rights and responsibilities. We have also carefully watched the development of another new EU regulation, the Digital Services Act (DSA), which creates new obligations for platforms that share user-generated content and clarifies some parts of Article 17.

The DSA regulates online intermediaries – digital platforms and services that connect consumers with goods, services, and content – and was adopted by the European Parliament in October 2022. The DSA was proposed by the European Commission to address issues that have arisen since the adoption of the E-Commerce Directive over 20 years ago. It seeks to protect all EU users from illegal goods, content, or services, and to protect their fundamental rights.

To help bring clarity and transparency to this new regulation, we’ve put together a DSA starter kit that’s free to download, and a three-part blog series to highlight key areas. In this first installment, we review “The Scope” of the DSA, which is quite broad. Here’s a preview of what you’ll learn in the starter kit:

The Scope

The scope of the DSA is very broad – it covers “any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law.” This covers not only copyright-protected content but also counterfeits, hate speech, child abuse material, revenge porn, and other types of illegal content, without prejudice to sector-specific regulation such as the Terrorist Content Online Regulation or the EU Copyright Directive.

The DSA applies only to platforms that offer services in the EU, described as platforms that:

  • are established in the Union; or
  • have a significant number of users in one or more Member States (significant in relation to the Member State’s population); or
  • target their activities towards one or more Member States.

Factors that determine whether activities are targeted towards a Member State (see the illustrative sketch after this list):

  • the use of a language or a currency generally used in that Member State
  • the possibility of ordering products or services
  • the use of a national top-level domain
  • the availability of an application in the relevant national application store
  • the provision of local advertising or advertising in the language used in that Member State
  • the provision of customer service in the language generally used in that Member State
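To make the criteria above more concrete, here is a minimal, purely illustrative Python sketch of how a platform might encode them in an internal self-assessment. This is not legal advice and not an official test: the class and field names, the targeting signals, and especially the numeric stand-in for a “significant number of users” are assumptions made for this example, since the DSA itself sets no such threshold.

  # Hypothetical self-assessment sketch -- not legal advice, not an official DSA test.
  # All names and the significance_ratio threshold are assumptions for illustration.
  from dataclasses import dataclass, field

  @dataclass
  class PlatformProfile:
      established_in_eu: bool                                           # establishment in a Member State
      users_per_member_state: dict = field(default_factory=dict)        # e.g. {"PL": 150_000}
      population_per_member_state: dict = field(default_factory=dict)   # e.g. {"PL": 37_000_000}
      targeting_signals: list = field(default_factory=list)             # e.g. ["local_currency", "national_tld"]

  def may_fall_within_dsa_scope(profile: PlatformProfile, significance_ratio: float = 0.01) -> bool:
      """Rough internal check against the three scope criteria described above.

      The DSA sets no numeric threshold for a "significant number of users";
      significance_ratio is a placeholder chosen only for this illustration.
      """
      # 1. Established in the Union
      if profile.established_in_eu:
          return True
      # 2. Significant number of users relative to a Member State's population
      for state, users in profile.users_per_member_state.items():
          population = profile.population_per_member_state.get(state, 0)
          if population and users / population >= significance_ratio:
              return True
      # 3. Activities targeted towards one or more Member States
      #    (language, currency, national TLD, local app store listing, local ads or support)
      return bool(profile.targeting_signals)

  # Example: no EU establishment, modest Polish user base, but clear targeting signals
  profile = PlatformProfile(
      established_in_eu=False,
      users_per_member_state={"PL": 150_000},
      population_per_member_state={"PL": 37_000_000},
      targeting_signals=["polish_language_app", "local_advertising"],
  )
  print(may_fall_within_dsa_scope(profile))  # True, because targeting signals are present

The point of the sketch is how the criteria relate: establishment in the Union alone is sufficient, and either significant usage in a Member State or targeting signals can independently bring a platform into scope. In practice, this determination calls for legal review rather than a script.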

Illegal content

Illegality of content may be determined by either national or EU law. Illegal content is also defined very broadly: it is information that either:

  • Is illegal in itself (hate speech, terrorist content, unlawful discriminatory content)
  • Relates to activities that are illegal (the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorized use of copyright protected material or activities involving infringements of consumer protection law)

Download the starter kit to dive deeper. Stay tuned for the next two installments, covering the obligations for platforms, how the DSA will be enforced, and how platforms can begin to comply. 
