August 6, 2024 | Max Atallah


Overview of the Digital Services Act

The Regulation on a Single Market for Digital Services, or Digital Services Act (DSA), is part of the European Union’s (EU) legislative package that lays down ground rules for the operating environment of the digital age. The regulation aims to create a safer online environment for EU consumers and businesses. At the heart of the DSA are requirements to ensure that illegal content is addressed, to increase due diligence obligations on online platforms, and to impose heightened obligations on very large operators, while facilitating the expansion of smaller operators in the EU.

The regulation entered into force on 16 November 2022 and began to apply gradually to different services; since 17 February 2024 it has applied to all services covered by the Regulation.

Content of the Regulation

The regulation defines clear responsibilities for online platforms and social media and increases transparency through improved reporting and oversight. All providers of online intermediary services in the EU internal market must comply with the rules set out in the DSA, regardless of whether they are established inside or outside the EU. Intermediary service providers include, for example, application stores, content-sharing platforms and e-commerce marketplaces.

For example, e-commerce marketplaces are obliged to know the sellers who offer products or services on the online platform. In addition, advertising may no longer be targeted at minors, and targeting based on sensitive data is not allowed.

However, the regulation takes into account the proportionality of the obligations to the nature of the services, and they are tailored to the number of users so that, for example, micro and small enterprises have obligations commensurate with their capabilities and size. Compliance with the regulation is monitored by the Commission together with the national authorities, which supervise the compliance of platforms established in their territory. However, the Commission is primarily responsible for monitoring and enforcing the additional obligations applicable to very large online platforms and search engines, such as measures to mitigate systemic risks.

Very large online platforms and search engines

Very large online platforms and search engines are subject to stricter requirements. Services with more than 45 million monthly active users in the EU are considered very large online platforms or search engines. Under the regulation, such operators must, for example, offer users a recommender system that is not based on user profiling. The DSA also requires these actors to analyze annually the systemic risks their services pose and how to mitigate them. This aims to reduce risks such as:

  • dissemination of illegal content
  • negative impact on fundamental rights
  • manipulation of services affecting democratic processes and public security

Practical Implications of the Regulation

The regulation imposes various due diligence obligations on all providers of intermediary services. These companies must, among other things, update their terms and conditions as regards the information provided by recipients of the service. The terms and conditions must include details of, for example, the policies, procedures, and measures used for content moderation, as well as information on the internal complaint-handling system. Providers of intermediary services must also publish clear and easily understandable reports annually on all content moderation activities carried out during the year.

As a practical example of the regulation’s impact, cookie banners must be designed so that rejecting cookies is as easy as accepting them, and design choices such as color cannot be used to influence user decisions.

The regulation most significantly affects providers of very large online platforms and search engines. In addition to the obligations imposed on all intermediary service providers, these entities must comply with several additional requirements, such as the aforementioned risk assessment obligation. The practical implications of the regulation are already evident, as the Commission has sent preliminary findings of non-compliance to X and initiated formal proceedings against platforms such as TikTok.

Finally

The DSA gives people more control over what they see online, as the cookie banner example above illustrates. By laying down rules on online intermediaries and platforms, the DSA aims to improve user safety, promote fair competition and ensure respect for fundamental rights in the digital environment.

Our Associate Trainee Adeliina Sulkanen took part in writing this article.

Nordic Law – Pioneer in Web3 and Fintech law