What platforms need to know about the Digital Services Act (DSA): Part Two, The Obligations

WRITTEN BY Pex Team
Nov 16, 2022

This is the second installment of a three-part blog series on the DSA. Miss the first blog? Catch up on “The Scope.”

 

The Digital Services Act (DSA) becomes fully applicable in February 2024, but compliance deadlines may come sooner for intermediaries designated by the EU Commission as very large platforms or search engines. There’s no shortage of obligations imposed by the DSA, so platforms have a great deal to comply with in roughly a year or less. 

While platforms likely to be designated as “very large” must adhere to additional requirements, they may have an easier time complying, as many already have measures in place to manage illegal content. Even with that head start, there are still several obligations that very large platforms will need to meet, and possibly on a shorter timeline: very large platforms are required to comply just four months after being designated by the Commission. Medium-sized platforms, which generally don’t have these measures in place, may have the longest path to compliance. Download our free DSA starter kit to learn all the requirements for platforms and how to jumpstart compliance. Here’s a preview of what you’ll learn in the kit: 

DSA obligations

The DSA imposes different sets of obligations on platforms depending on their type and size. The obligations apply cumulatively: each higher tier of platforms must meet all the obligations of the tier below it, plus new obligations of its own.

Intermediaries affected by the DSA

Information courtesy of the European Commission

Obligations applicable to all intermediaries 

Compliance with orders against illegal content and provision of information

All platforms must act upon orders to remove illegal content issued by national judicial or administrative authorities and must inform the issuing authority whether and when the order was carried out. They must likewise act upon orders to provide information about specific service users and inform the authority that the order has been executed.

Single point of contact (Article 11) and legal representative (Article 13)

The DSA requires platforms to designate a single point of contact for communication with Member State authorities, the EU Commission, and the Board. Platforms not established in the EU must appoint a legal representative in one of the Member States where they offer services. The representative can be held liable for non-compliance with the regulation, without prejudice to the liability of the platform itself.

Content moderation information in terms and conditions

Platforms must describe, in their terms and conditions, the content moderation practices they employ. They must explain in clear and unambiguous language how content moderation is carried out, including which measures and tools are used, such as algorithmic decision-making and human review.

Obligations for platforms (excluding small and micro enterprises)

In addition to the obligations that apply to all intermediaries, hosting services and online platforms have further duties, described below.

Notice and action mechanism

Platforms have a duty to put in place a mechanism for flagging illegal content, which must be available to any entity or individual. Notices must be sufficiently substantiated, meaning the flagger should explain why the content is illegal. Decisions about content (takedown, demotion, demonetization, etc.) must be accompanied by a statement of reasons. A minimal sketch of what such a notice and statement of reasons might look like follows below.
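To make this concrete, here is a minimal sketch, in TypeScript, of how a platform might model a notice and the accompanying statement of reasons. All field and type names are illustrative assumptions, not terminology prescribed by the DSA; the notice fields loosely mirror the kinds of elements Article 16 expects a notice to contain (the content’s location, an explanation of why it is illegal, the flagger’s contact details, and a good-faith statement).

```typescript
// Hypothetical shape of an illegal-content notice and the resulting
// statement of reasons. Field names are illustrative, not prescribed by the DSA.

interface IllegalContentNotice {
  contentUrl: string;            // exact electronic location of the flagged content
  explanation: string;           // why the flagger considers the content illegal
  flaggerName?: string;          // contact details (optional for some offense types)
  flaggerEmail?: string;
  goodFaithStatement: boolean;   // flagger confirms the notice is accurate and complete
  submittedAt: Date;
}

type ModerationAction = "takedown" | "demotion" | "demonetization" | "no-action";

interface StatementOfReasons {
  noticeId: string;                 // links the decision back to the originating notice
  action: ModerationAction;
  legalOrContractualGround: string; // the law or terms-of-service clause relied on
  factsAndCircumstances: string;    // why the content met (or did not meet) that ground
  automatedMeansUsed: boolean;      // whether automated detection or decision-making was involved
  redressOptions: string[];         // e.g. internal complaint, out-of-court settlement, court
}

// Example notice a user might submit through the flagging mechanism.
const exampleNotice: IllegalContentNotice = {
  contentUrl: "https://example-platform.test/posts/12345",
  explanation: "This upload contains my copyrighted recording without a license.",
  flaggerName: "Jane Doe",
  flaggerEmail: "jane@example.test",
  goodFaithStatement: true,
  submittedAt: new Date(),
};
```

Keeping each decision linked to its originating notice (the noticeId above) also makes it easier to produce the transparency reports and handle the complaints discussed later in this post.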

Trusted flaggers

Trusted flaggers are entities whose notices are processed with priority. They have expertise in detecting illegal content and have been awarded trusted flagger status by a Digital Services Coordinator. Trusted flaggers are accountable to Digital Services Coordinators, and their status may be suspended in case of misuse. Trusted flaggers will often be rightsholders or organizations of rightsholders.

Transparency

Platforms are required to publish ‘clear, easily comprehensible and detailed reports’ on the content moderation they carried out during the reporting period, at least once per year. The reports must provide specific information on a series of elements listed under Article 15.

Internal complaint-handling mechanism

Platforms have the duty to establish an internal complaint-handling mechanism (equivalent to the complaint and redress mechanism in the Copyright Directive), which must be free of charge for complainants and available not only to users, but also to individuals or entities who submitted a notice. Complaints can be lodged for six months following the decision, but there is no specific deadline for the platform to resolve a complaint.

Obligations for very large platforms

Download the starter kit for the full list of obligations for platforms, including additional obligations for very large platforms, and the exception criteria for small and micro enterprises. 
