The Digital Services Act: creating a safe digital space within Europe

I. Introduction

The Digital Services Act (DSA) emerged after years of rapid growth in online environments. It builds on the rules of the European Union (EU) e-Commerce Directive adopted in 2000,[1] and became law on 19 October 2022[2] when it was published in the Official Journal of the European Union.

The principal purpose of the DSA is to “keep users safe from illegal goods, content or services, and to protect their fundamental rights online.”[3] In summary, the DSA applies to providers of intermediary information society services, in an effort to create a consistent legal framework to govern the ever-evolving growth of the digital space. The focus is on empowering users with relevant information about their online activities, while at the same time ensuring that online businesses can rely on a harmonised and transparent set of rules to manage compliance.

More broadly, the DSA is the result of a two-pronged project to upgrade rules governing digital services in the EU, together with the Digital Markets Act (DMA). While the DSA focuses on consumers and protecting their rights online, the DMA focuses on businesses, with the aim of establishing a level playing field for competition in the digital space.[4]

II. Scope

Who is regulated?

The DSA sets out a series of obligations for intermediary service providers (ISPs), that is, providers of any service normally provided for remuneration, at a distance, by electronic means and at the individual request of the recipient (Recital 5 and Art.3). As such, it covers most online services, including internet access providers, email providers, online retailers, video streaming services, social media platforms and search engines.

The DSA can be seen as a 4-tiered system, with cumulative obligations at each new tier which become more onerous as the size and complexity of the businesses captured increases. Services such as Amazon, Netflix, Facebook and Google, whose users in the EU comprise almost half of the EU population, will likely fall within the scope of this regulation and be regulated within tiers 3 and 4, which bear the burden of its most onerous obligations (as further set out below).

Small and medium-sized enterprises (SMEs) are also subject to the DSA; however, they are less likely to be captured by the more onerous obligations of tier 3 for online platforms and, due to their size, do not qualify for tier 4 (see further below). There are also specific exemptions for smaller businesses from liability and from certain obligations.


Tier 1: Common obligations applicable to all ISPs 

Tier 1 captures all ISPs, including those that merely transmit information provided by the service recipient over a communication network, or provide access to such a network. As such, they act as ‘mere conduit, caching or hosting services’ (Art.3(g)). For example, cloud infrastructure services and virtual private networks commonly used in the workplace fall within this definition (Recital 28).

Overall, the aim of these procedural obligations is to provide enhanced transparency for consumers and to empower them by granting a higher level of control over their online experience. For example, all ISPs must have a clear point of contact to communicate “directly and rapidly” with recipients of the services (Art.12). The communication mechanism must be user-friendly and cannot be solely reliant on automated tools. ISPs must also provide certain basic information in their terms and conditions, and clearly state any restrictions they may enforce on the user, including any content moderation policies (Art.14).

Tier 2: Obligations on ISPs which provide hosting services

Hosting services consist of “the storage of information provided by, and at the request of, a recipient of the service” (Art.3(g)(iii)). This includes cloud service providers and web hosting services. In addition to the Tier 1 obligations, hosting services are subject to additional obligations, notably mandating the provision of a “notice and action mechanism” (Art.16 and Recital 50).

The “notice and action mechanism” should allow anyone to flag the presence of illegal content (the “notice” element) in a user-friendly way. Pursuant to the notice, the provider can then decide whether it agrees with the user’s assessment as to illegality, and remove or disable access to that content (the “action” element). As the DSA went through the EU legislative process, a point of contention was how quickly providers should take action after allegedly illegal content is flagged: in its final version, the DSA simply says that it should be done “in a timely, diligent, non-arbitrary and objective manner” (Art.16(6)).

Tier 3: Obligations on ISPs which provide online platform services

This level focuses on providers of hosting services that also qualify as online platforms by storing and disseminating information to the public (Art.3(i)), i.e., by making that information accessible to a potentially unlimited number of persons (Recital 14).

Tier 3 obligations do not apply (for the most part) to small enterprises (Art.19), but cover larger online platforms including online marketplaces, app stores and social media platforms that store and use recipients’ personal information for business purposes, such as recommending purchases to a particular demographic.

The obligations for online platforms build on previous tiers to mandate more stringent transparency and reporting responsibilities (Art.24), including in relation to advertisements run on the platform (Art.26). There is also a specific obligation to put in place appropriate measures to safeguard the privacy, safety and security of minors, including not profiling minors for advertising purposes. This last obligation again proved contentious as the DSA was being drafted: legislators pondered whether platforms would, paradoxically, be encouraged to collect data on minors in order to be in a position not to serve them personalised advertising. The final text of the DSA makes clear that platforms do not have an obligation to collect further data for this purpose, and that the obligation only applies when the platform is “aware with reasonable certainty” that the recipient of the service is a minor (Art.28).

Another obligation subject to much debate is the ban on so-called “dark patterns”, i.e., certain choices in the visual structure of online content which “materially distort or impair […] the ability of [users] to make autonomous and informed choices or decisions” (Recital 67). Examples would be default settings that are very difficult to change, or making the procedure for cancelling a service significantly more cumbersome than signing up to it (Recital 67). Article 25 of the DSA aims to combat this, mandating that platforms avoid designing interfaces that “deceive or manipulate” the recipients of the service.

Additionally, there are specific obligations only applicable to online marketplaces (Articles 30 to 32), which again do not apply to small businesses (Art.29). These obligations focus on traceability of traders within platforms (for example, third-party sellers on Amazon), and measures designed to combat the spread of counterfeit goods on these platforms (Art.32).

Tier 4: Very Large Online Platforms (VLOPs) and very large search engines (VLOSEs)

VLOPs and VLOSEs are, respectively, online platforms and online search engines which have at least 45 million average monthly active recipients in the EU (roughly 10% of the total EU population) (Art.33). Tech giants such as Google (the search engine), Facebook and Amazon will likely fall within this tier.
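For readers who prefer a schematic view, the cumulative tier structure described above can be sketched as a simple classification, here in Python purely by way of illustration; the attribute names are hypothetical, and the sketch deliberately glosses over the detailed legal tests in Art.3 and the SME carve-outs:

```python
# Illustrative sketch only (not legal advice): a simplified mapping of a
# provider's characteristics to its highest DSA tier. Attribute names are
# hypothetical; the actual analysis turns on the definitions in Art.3 and
# the designation mechanism in Art.33.

def dsa_tier(is_hosting: bool, is_online_platform: bool,
             avg_monthly_eu_recipients: int) -> int:
    """Return the highest DSA tier a provider falls into.

    Obligations are cumulative, so a provider in a higher tier also
    carries the duties of every tier below it.
    """
    VLOP_THRESHOLD = 45_000_000  # Art.33: roughly 10% of the EU population
    if is_online_platform and avg_monthly_eu_recipients >= VLOP_THRESHOLD:
        return 4  # VLOP / VLOSE
    if is_online_platform:
        return 3  # online platform
    if is_hosting:
        return 2  # hosting service
    return 1  # any other intermediary service (e.g. mere conduit, caching)

# A social network with 50 million average monthly EU users would fall in
# tier 4, while the same platform with 1 million users would fall in tier 3.
```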

Given their sheer size and power, VLOPs and VLOSEs are deemed to have a “systemic” impact on safety online, the shaping of public opinion and discourse, as well as online trade (Recital 79): for this reason, additional and more onerous obligations apply to them.

Transparency and reporting obligations focus on identifying and monitoring any systemic risks that these large platforms might pose, including in their use of algorithms (Art.34), as well as mitigating those risks by putting adequate measures in place (Art.35). The measures taken will be monitored through independent audits, to be conducted yearly at the expense of the platform (Art.37).

VLOPs and VLOSEs are also subject to additional transparency requirements in relation to advertising (Art.39), as well as data-sharing obligations with designated authorities to monitor the compliance measures taken (Art.40).

III. Enforcement and jurisdictional reach

Alongside the DMA, the DSA marks a paradigm shift, compared to previous EU legislation, towards ex-ante enforcement aimed at preventing infringements of its obligations, rather than ex-post enforcement, where an investigation is conducted after the allegedly infringing conduct has already happened. The DSA aims to achieve this purpose by actively monitoring and scrutinising the measures that providers (in particular VLOPs and VLOSEs) put in place to avoid breaching its obligations.

A point debated during the DSA’s gestation was the role that the European Commission (Commission) would play in enforcing the new legislation. In earlier drafts, designated enforcement authorities in EU member states (MSs), called Digital Services Coordinators (DSCs), were set to take a primary enforcement role, in a similar way to how national authorities are tasked with enforcing the General Data Protection Regulation. This remains for the most part true in the final DSA text, save that the Commission will now have exclusive powers to supervise and enforce the DSA against the largest platforms, VLOPs and VLOSEs (Art.56), whilst the DSCs will have primary responsibility in relation to ISPs in the remaining tiers. The role of the DSCs is therefore reduced in relation to the tech giants who are the primary focus of this legislation, although the reservation of responsibility to the Commission in this regard may assist in ensuring equality of arms, as well as a pan-European perspective to enforcement in relation to the most significant providers of regulated services.

Despite this allocation of competences, the Commission and the DSCs shall nonetheless cooperate closely and exchange information when appropriate (Art.57). Coordination between DSCs is also achieved through the creation of an advisory group known as the “European Board for Digital Services”, composed of the DSCs of each MS, which will provide guidance to DSCs and to the Commission to ensure that the DSA is applied uniformly across MSs, including in relation to enforcement (Art.61).

In terms of scope, the DSA will cover all ISPs operating in the EU, regardless of whether they have an establishment or residence in the EU. This will ensure that digital companies with global operations, a feature common amongst tech giants, will be captured by the new legislation regardless of their place of residence.

IV. Approach in the UK

One issue that remains open is how the DSA will interact with and influence other nascent regimes aimed at creating safer digital spaces.

In the UK, the relevant piece of legislation is the draft Online Safety Bill (OSB).[5] The OSB has been in the works for a few years: after being reviewed by successive governments, in October this year the government announced that the Bill will return to Parliament for further scrutiny “imminently”.[6]

Though the OSB and the DSA are similar in many respects, in particular in setting reporting obligations and mandating the implementation of notice and action mechanisms, they diverge in scope when it comes to the type of online content they target for regulation.

The OSB takes a broader approach than the DSA, covering not only illegal, but also lawful but “harmful” content. It also places special emphasis on regulating content relating to terrorism and child exploitation and abuse (Chapter 5).

The OSB’s stated aim is to “make the UK the safest place in the world to be online while defending free expression”.[7] Unsurprisingly, given the inherent tension between ensuring digital safety and protecting freedom of expression, successive governments have placed their emphasis on either one of the two prongs: most recently, the UK government indicated its intention to tweak the draft text in favour of stronger protections for freedom of expression.[8]

V. Outlook

Will the DSA deliver?

Whilst the legislative process unfolded swiftly compared to previous legislation adopted in Europe, the majority of the DSA’s obligations are set to take effect only from 16 February 2024. The hiatus is potentially helpful, as it will allow businesses to adapt to the demands of the new legislation, and the Commission to put in place the necessary structures to monitor compliance and take enforcement action when necessary. Notably, for VLOPs and VLOSEs the new rules might take effect earlier than 2024, specifically four months after they have been designated as such by the Commission (Art.92).

The ethos of ex-ante enforcement, and the strengthened role of the Commission in regulating the largest platforms crystallised in the final text, are certainly welcome developments in the European legislative landscape. But ultimately, at least two factors will determine the effectiveness of the DSA.

The first is the degree of businesses’ cooperation in implementing the new measures. Some providers of online services, such as Google, expressed their support for the DSA’s stated goals, but are also unsurprisingly concerned about the impact that the new legislation will have on their business activities.[9] As the new rules take effect, it will become clear whether these concerns will result in challenges to the application of some of the new rules in the European courts, which could hinder the DSA’s deployment and in turn divert resources from ex-ante regulation.

The second, more prosaic, factor is whether the Commission can dedicate enough resources to ensure comprehensive monitoring and effective enforcement under the new rules. This second factor was clearly in the mind of the legislators: the DSA makes laudable efforts to ensure that enough resources are dedicated, such as making VLOPs and VLOSEs pay for some aspects of the monitoring process (Art.43) and creating tools to enlist the help of reputable third parties – such as trusted flaggers (Art.22) and vetted researchers (Art.40) – in the monitoring effort. A further – albeit indirect – enforcement tool is Art.54, which enshrines users’ rights to seek compensation from ISPs in respect of any damage or loss suffered due to a breach of the DSA’s obligations. However, given the scale and reach of the legislation, the cooperation of online businesses will still play a vital role in ensuring that the law works well for everyone, and that resources can be valuably deployed on monitoring and compliance rather than on enforcement.

*Luke Streatfeild is a Partner in London. 

With special thanks to Shantal Kishinchand Situmal for her assistance with authoring this article.


[1] Directive 2000/31/EC:
[6] (requires subscription).
[7] Department for Digital, Culture, Media & Sport, 19 April 2022, Policy Paper: Online Safety Bill factsheet
[8] (requires subscription).
