The Digital Services Act: What are the key provisions, and does it strike the right balance?

Introduction

The draft Digital Services Act (DSA) is one of two legislative measures proposed by the European Commission (Commission) aimed at promoting a borderless single market for digital services and increasing responsibility and accountability for digital platforms; the second is the Digital Markets Act (DMA).[1]

The DSA introduces a new set of rules and obligations for providers of so-called “intermediary services,” including online platforms, which are intended to build on the framework introduced by the E-Commerce Directive over two decades ago,[2] but without superseding sector-specific measures such as those targeting illegal hate speech or terrorist content. In order to avoid potential divergences in its implementation across EU Member States, the DSA will take the form of a Regulation.[3]

In this article, we will explain to whom the DSA will apply, then set out the key obligations before examining its enforcement mechanisms.

Who will the DSA regulate?

The thorny issue for the Commission when trying to regulate online service providers is how to design a regime that is not only effective but also proportionate, given that it will apply both to global behemoths such as Google and Facebook and to small businesses trading on marketplace platforms.

To address this issue, the Commission has designed a tiered system of regulation,[4] with the most onerous obligations applying only to the largest companies, which have the capacity to absorb the cost of additional monitoring and regulation. The obligations in each tier are cumulative, meaning that “very large online platforms” (VLOPs) bear the greatest regulatory burden, proportionate to their “societal impact and means.”[5] The first tier, covering the broadest set of providers, will apply to all “intermediary services,” which essentially encompasses all online services that transmit, store or host information online. This includes internet service providers, messaging applications, search engines, and social networks.[6]

The second and third tiers apply to hosting services, which store information provided by, and at the request of, a recipient of the service. Third-tier obligations only apply where a hosting service is an “online platform,” defined as a hosting service that, in addition to storing information, also disseminates that information to the public, such as a social network or online marketplace. However, hosting services for which dissemination of information to the public is a “minor and purely ancillary feature”[7] of another service (such as the comments section of an online newspaper) are not considered online platforms.

Finally, the fourth tier will only apply to VLOPs, defined as platforms with more than 45 million users in the EU (equivalent to approximately 10% of the EU population).[8] VLOPs are subject to the most onerous rules because they are deemed capable of causing serious societal and economic harm. According to the Commission, “[v]ery large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union.”[9]
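As a rough, purely illustrative check of the arithmetic behind that threshold, the short sketch below compares the 45 million figure with an assumed EU population of about 447 million (an approximation for 2021 that is not taken from the DSA itself):

```python
# Illustrative only: the EU population figure is an assumption, not a value from the DSA.
EU_POPULATION = 447_000_000   # approximate EU-27 population (2021), assumed for this example
VLOP_THRESHOLD = 45_000_000   # user threshold at which a platform is treated as a VLOP

share = VLOP_THRESHOLD / EU_POPULATION
print(f"VLOP threshold as a share of the EU population: {share:.1%}")
# prints roughly 10.1%, consistent with the "approximately 10%" figure referred to above
```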

With an eye to the global reach of online services, the DSA applies to all providers of intermediary services including those that do not have an establishment in the EU but offer services there. Such entities are required to designate a legal representative in one of the Member States where the provider offers its services.[10]

Obligations for all online intermediaries, hosting services and online platforms – Tiers 1 and 2

Liability and absence of a general monitoring obligation

One of the key features of the DSA is its continuity with two core principles of the E-Commerce Directive. The first is that online intermediaries should not be liable for users’ content, provided that they merely transmit or store the information in question without creating, modifying or selecting it themselves.[11] The second is that there is no general monitoring or active fact-finding obligation with respect to the information transmitted or stored by the intermediary.[12]

However, in contrast with the E-Commerce Directive, the DSA makes clear that service providers that conduct voluntary own-initiative investigations and legal compliance initiatives will not lose the exemption from liability merely because they carried out those monitoring activities. Under the E-Commerce Directive, such voluntary monitoring risks the loss of the exemption, a feature that in fact disincentivizes voluntary monitoring.[13] Once illegal content has been identified, however, online intermediaries will be obliged to remove it.[14] Similarly, Member States will have the power to order the takedown of illegal content or to require the provision of information relating to the activities of specific individual users.[15]

Due diligence obligations

Chapter III of the DSA sets out the general due diligence obligations that apply to all intermediary services. The focus is on transparency: all intermediaries will be obliged to set out clearly in their terms and conditions any restrictions they impose on the use of their services and any policies and tools used for content moderation.[16] They will also be subject to basic yearly reporting on any content moderation they engage in, whether voluntarily or in compliance with orders to take down content issued by the relevant Member State authorities.[17] Hosting services (tier 2), including online platforms, will also be subject to enhanced obligations regarding content moderation. In particular, they will be required to put in place a mechanism allowing users to flag illegal content, and to take down that content if it is found to be illegal.[18] They will also have to provide a statement of reasons to those affected explaining why the content was removed.[19]

Obligations for online platforms only: complaint handling and transparency on traders and advertising – Tier 3

In addition to the above obligations, online platforms will also need to give users access to an internal complaints-handling system, which will enable them to lodge complaints against decisions to suspend or terminate their accounts.[20] They will also have to allow users to seek redress through certified out-of-court dispute resolution mechanisms.[21] Interestingly, if a user successfully challenges an online intermediary’s decision through an out-of-court dispute resolution mechanism, the platform will have to pay the user’s costs, but the user will not be liable for the platform’s costs if the challenge fails.[22] Online platforms must also prioritize notices flagging illegal content or violations of the platform’s terms of service submitted by “trusted flaggers,”[23] a status awarded by the Digital Services Coordinator (see below) of the Member State in which the applicant for that status is established.

Online platforms will also be subject to enhanced transparency obligations. Specifically, they will have to verify the identity of third-party traders that sell goods or services on their platforms,[24] a measure aimed at tackling the sale of counterfeit or illegal goods by traders who would otherwise simply create a new account on a platform under a different name once banned from it. Platforms will also have to give users basic information about any advertisement displayed to them: they must clearly indicate that the information in question is an advertisement, disclose on whose behalf it is displayed, and explain what parameters were used to determine the recipient of the advertisement.[25]

Obligations that only apply to VLOPs – Tier 4

VLOPs are considered by the Commission to have a “systemic impact” on EU citizens and, as such, they are subject to additional obligations beyond those that apply to other online intermediaries, hosting services, and smaller online platforms. Most of these obligations are “procedural”: they include the requirement to appoint a compliance officer,[26] enhanced reporting obligations,[27] and a detailed annual assessment of the risks their services create in relation to: (a) the dissemination of illegal content; (b) negative effects on the exercise of the fundamental rights to private and family life, freedom of expression and information, and the rights of the child; and (c) intentional manipulation of their service with a negative impact on, amongst other things, the protection of public health or civic discourse.[28] VLOPs must also put measures in place to mitigate the risks identified.[29]

Transparency is also a key concern. VLOPs will be subject to independent audits regarding their compliance with the Chapter III obligations (see above) and with commitments made when adhering to voluntary codes of conduct.[30] The DSA’s transparency requirements for VLOPs also extend to providing information on the algorithms used to display information to users, as well as giving users options to modify or influence the parameters used to display it.[31] For online advertisements, there is an additional obligation to keep a publicly available repository of the online advertisements they have displayed, including details of which user groups were targeted by each advertisement.[32] Vetted researchers will also be able to access data produced by VLOPs in order to conduct research on the systemic risks identified in Article 26 (see above).[33]

Illegal vs “harmful”: what about online disinformation?

Crucially, all of the obligations set out above that concern content moderation and the takedown of content relate only to “illegal” rather than “harmful” content. “Illegal” content is defined by the DSA simply as content that is illegal under European Union law or the law of a Member State.[34] Obvious examples of illegal content are hate speech, terrorist propaganda, and the misuse of copyrighted material.[35]

Harmful, but not necessarily illegal, content, such as the spread of misinformation on social media, falls squarely within the categories of “systemic risks” that VLOPs need to mitigate (see above). However, what counts as “harmful” content is left undefined by the DSA, which also gives no detail as to how VLOPs should address such content once they identify it.

Instead, the DSA leaves the task of dealing with harmful content to voluntary codes of conduct. The bodies tasked with the DSA’s enforcement (see below) will only indirectly address harmful content by encouraging platforms to sign up to these voluntary codes, facilitating and coordinating their development, and monitoring compliance.[36] The Commission is also increasing its efforts to fight disinformation elsewhere, having announced an overhaul of the existing Code of Practice on Disinformation.[37] In contrast, in the UK, the recently published Online Safety Bill requires social media and dating apps to take down “harmful” content, even if lawful; a move welcomed by children’s safety campaigners but criticized by civil liberties organizations for undermining free speech.[38] 

The Commission’s decision not to propose the direct regulation of harmful content is not surprising. Attempting to define what constitutes harmful content, and how to deal with it, would risk making it politically even more challenging to get the DSA through what is already a difficult legislative process. On top of that, the Commission has to balance its desire to encourage effective content moderation with the need to preserve freedom of expression, a preoccupation that is evident from the number of obligations in the DSA which provide redress mechanisms to users on the receiving end of content takedowns and account suspensions.

Enforcement

In contrast to the DMA, the primary responsibility for enforcement of the DSA lies with Member States rather than with the Commission. Member States will have to designate an independent authority that will act as their Digital Services Coordinator (DSC). Jurisdiction over an online intermediary will depend on the intermediary’s main place of establishment or, in the case of intermediaries not established in any Member State, on the location of the designated EU legal representative.[39]

As intermediaries, and VLOPs in particular, often operate on at least a pan-European basis and, not infrequently, globally, the DSA also establishes an advisory group made up of the DSCs from each Member State, referred to as the “European Board for Digital Services” (the Board). The Board will be charged with assisting in coordinating enforcement activity (including joint investigations) where necessary, and will supervise the activity of the DSCs.[40] Where VLOPs are involved, the Commission may also be called upon by the relevant DSC to take the necessary investigatory and enforcement measures.[41]

Penalties can be imposed by Member States or by the Commission and can be up to 6% of annual turnover for serious or repeated violations, or up to 1% for providing false or misleading information.[42] More proactive measures are also available, such as requiring the infringing intermediary to adopt an action plan to terminate the infringement, including measures enabling the DSC to monitor compliance.[43]
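Purely by way of illustration, the minimal sketch below applies those percentage caps to a hypothetical turnover figure; the turnover value is invented for the example, and the percentages are maximums rather than fixed amounts:

```python
def max_fines(annual_turnover_eur: float) -> dict:
    """Maximum fines available under the DSA's penalty caps described above."""
    return {
        "serious_or_repeated_violations": 0.06 * annual_turnover_eur,   # up to 6% of annual turnover
        "false_or_misleading_information": 0.01 * annual_turnover_eur,  # up to 1% of annual turnover
    }

# Hypothetical platform with EUR 10 billion in annual turnover (illustrative figure only)
print(max_fines(10_000_000_000))
# -> up to EUR 600 million and EUR 100 million respectively
```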

Reconciling the seemingly irreconcilable

The measures proposed in the DSA will soon face the scrutiny of the Member States (in the Council of the EU) and the EU Parliament. The Council will issue a report on 27 May, followed by a similar report by Parliament on 28 May.[44] News reports indicate that some Member States are skeptical about how cross-border enforcement of the DSA will work in practice, in particular for takedown orders and orders for information.[45] One obvious hurdle is that what constitutes “illegal” varies across Member States, which could pose problems for consistent enforcement.

A second contested element of the DSA is the decision to tie a Member State’s jurisdiction to take action under the DSA to the place of establishment of the intermediary rather than the place where the users of that intermediary are located. This looks set to be challenged by France (amongst others), which considers that the users’ location should determine jurisdiction. The Commission’s proposed approach may have the unintended effect of placing a disproportionate burden on countries such as Luxembourg or Ireland, where online intermediaries are frequently domiciled for tax reasons. This could, at least in part, be mitigated by the Commission’s ability to “step in” when enforcing rules against VLOPs.

Conclusion

Whether or not the final text of the DSA will reflect the right balance on these key issues depends on your view of how best to address the social and financial risks posed by digital platforms whilst protecting freedom of expression, a key part of Europe’s democratic make-up. It can only be hoped that the practical compromises which may be required to introduce at least some form of regulation do not come at the expense of achieving the overarching goal of better and more consistent regulation of online businesses across Europe.

Lesley Hannah is a Partner and Kio Gwilliam and Antonio Delussu are Associates in the London office.

Footnotes

[1] We have written about the DMA proposal here: https://www.hausfeld.com/what-we-think/publications/the-digital-markets-act-radical-reform-or-conservative-compromise/
[2] The E-commerce Directive 2000/31/EC.
[3] While this will avoid the need for national implementing legislation, some ancillary legislation will likely still be required – in particular in relation to the creation of national Digital Services Coordinators – see further below.
[4] https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en
[5] DSA, Recital 54.
[6] DSA, Article 2(f).
[7] DSA, Recital 13.
[8] See DSA, Recital 54. The DSA recommends that the threshold is kept up to date as the EU population changes.
[9] DSA, Recital 54.
[10] DSA, Article 11.
[11] DSA, Articles 3-5.
[12] DSA, Article 7.
[13] DSA, Article 6.
[14] DSA, Article 4(1)(e) and 5(1)(b).
[15] DSA, Articles 8 and 9.
[16] DSA, Article 12.
[17] DSA, Article 13.
[18] DSA, Article 14.
[19] DSA, Article 15.
[20] DSA, Article 17.
[21] DSA, Article 18. The relevant dispute resolution bodies will be certified by the Digital Services Coordinators, the authorities appointed in each Member State to ensure compliance with the DSA – see also below.
[22] DSA, Article 18(3).
[23] DSA, Article 19.
[24] DSA, Article 22.
[25] DSA, Article 24.
[26] DSA, Article 32.
[27] DSA, Article 33.
[28] DSA, Article 26.
[29] DSA, Article 27.
[30] DSA, Article 28.
[31] DSA, Article 29.
[32] DSA, Article 30.
[33] DSA, Article 31.
[34] DSA, Article 2(g).
[35] For further examples see DSA, Recital 12.
[36] DSA, Articles 34 and 35.
[37] The update will be part of the broader “European Democracy Action Plan” designed to make European democracies more resilient. See https://ec.europa.eu/commission/presscorner/detail/en/ip_20_2250. See also DSA, Recital 69.
[38] See https://www.theguardian.com/media/2021/may/12/uk-to-require-social-media-to-protect-democratically-important-content
[39] DSA, Article 40.
[40] DSA, Article 47.
[41] DSA, Article 46.
[42] DSA, Article 42.
[43] DSA, Article 41.
[44] MLex Article: https://www.mlex.com/GlobalAntitrust/DetailView.aspx?cid=1290313&siteid=190&rdir=1
[45] Under Articles 8 and 9 of the DSA.