DSA priorities: The Commission attempts to defuse the disinformation atom bomb before EU elections

Journalist and Nobel Peace Prize winner Maria Ressa has compared disinformation to an “atomic bomb in our information ecosystem”.[1] In anticipation of the threats that disinformation and generative AI may pose to the upcoming European elections, the European Commission has taken significant action to mitigate systemic risks online that may affect the integrity of those elections. The Digital Services Act (DSA), which is explicitly directed at mitigating the risk of disinformation in elections, is now in force, and the Commission has supplemented it with guidelines and whistleblowing tools, and has even commenced enforcement activity.

The Digital Services Act

The DSA became fully applicable on 17 February 2024. Its primary purpose is to regulate the provision of certain online services to EU users, irrespective of where the service provider itself is located, and to prevent illegal and harmful activities online, including the spread of disinformation.

The DSA imposes different obligations on different providers in accordance with their role, size and impact in the online ecosystem.[2] Platforms that reach more than 45 million average monthly active users in the EU (10% of the EU’s population) may be designated as Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs). Such a designation brings with it the most stringent obligations, given the potential role these large platforms can play in the spread of disinformation and harmful online activities.

This year will see a large number of local and parliamentary elections across Europe, including, most notably, the elections to the European Parliament taking place from 6 to 9 June 2024. It is against this backdrop that the Commission has taken further action in line with the DSA’s stated aims.

Recent EC guidance

On 26 March 2024, the Commission published guidance recommending proactive mitigation measures and best practices to be undertaken by VLOPs and VLOSEs before, during, and after electoral events.[3]

Recommendations include:

  • Promotion of official information on electoral processes, implementation of media literacy initiatives, and adaptation of recommender systems to empower users and to reduce the monetisation and virality of content that threatens the integrity of electoral processes;[4]
  • Adoption of specific mitigation measures linked to generative AI, including clear labelling of AI-generated content, measures to ensure the reliability of such content, and adequate enforcement as required;[5]
  • Cooperation with authorities and other organisations to foster an efficient exchange of information before, during and after the election;[6]
  • Adoption of specific measures during an electoral period to reduce the impact of incidents that could have a significant effect on the election outcome or turnout;[7] and
  • A post-election assessment of the effectiveness of the measures taken,[8] together with encouragement of third-party scrutiny of, and research into, the same.[9]

Any VLOP or VLOSE that opts not to follow the Commission’s guidance will bear the burden of proving that its alternative measures were sufficient to achieve the same aims. Should the Commission receive information “casting doubt on the suitability of such measures”, it can request further information or open formal proceedings under the DSA.[10] Formal proceedings could investigate:

  • a failure to put in place mechanisms by which users can notify the platform of potentially illegal content;[11]
  • whether the design or operation of the service materially impairs the ability of users to make free and informed decisions;[12] or
  • a failure to identify systemic risks stemming from the design, functioning and/or use of its service.[13]

Should a VLOP or VLOSE be found to be in breach of the DSA, the Commission may impose fines of up to 6% of its total worldwide annual turnover and order it to take measures to address the breach.[14]

Whistleblowing tools

As a further statement of intent, the Commission has also launched new whistleblowing tools for the DSA.[15] These tools enable individuals to provide, without fear of reprisal, information that will allow the Commission to identify and uncover harmful practices by VLOPs and VLOSEs.

Immediate impact

Some platforms are already taking steps to comply. For example, Google has announced that it will seek to educate its users on how to identify and avoid misinformation through a series of 50-second adverts to be rolled out across five European countries ahead of the elections. In an interview with TIME published on 25 April 2024, Beth Goldberg, head of research at Google’s internal Jigsaw unit, explained that Google’s efforts to educate voters on how they may be misled are intended to tackle a practice known as ‘decontextualization’, in which purveyors of disinformation divorce a particular fact or statement from its original and proper context.

However, the Commission has shown that it will act where it suspects a failure to comply with its requirements. On 30 April 2024, the Commission opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached its obligations under the DSA. The investigation concerns, among other things, suspicions that Meta does not comply with its DSA obligations in respect of (i) addressing disinformation campaigns, (ii) the visibility of political content, and (iii) the non-availability of an effective third-party real-time civic discourse and election-monitoring tool ahead of the upcoming elections.

Commission President Ursula von der Leyen said: “This Commission has created means to protect European citizens from targeted disinformation and manipulation by third countries. If we suspect a violation of the rules, we act. This is true at all times, but especially in times of democratic elections. Big digital platforms must live up to their obligations to put enough resources into this and today’s decision shows that we are serious about compliance”.[16]

Politicians are similarly supportive. Speaking at the Politico Live Tech & AI Summit in Brussels on 16 April 2024, Jean-Noël Barrot, France’s minister for European Affairs, said that Facebook, YouTube, X and other social media platforms should do “whatever it takes” to tackle disinformation before the European elections, to ensure that Europeans’ trust in democracy is not undermined.[17]

Comment

By taking a range of proactive measures, the Commission is showing that it is serious about protecting democracy. Furthermore, the Commission’s stated willingness to work with VLOPs/VLOSEs and third parties in continually developing these guidelines, and ensuring the measures being taken are effective, demonstrates a real commitment to addressing the problem at its source.

However, in an election year, we will see before long whether the disinformation bomb is still ticking, or whether the steps taken to disarm it have been successful.

Footnotes

[1] The Guardian – Disinfo black ops: exposing the companies and states spreading false information
[2] European Commission – The Digital Services Act, Ensuring a safe and accountable online environment
[3] European Commission – Guidelines for providers of Very Large Online Platforms and Very Large Online Search Engines on the mitigation of systemic risks for electoral processes pursuant to the Digital Services Act (Regulation (EU) 2022/2065)
[4] Commission Guidelines, section 3.2
[5] Commission Guidelines, section 3.3
[6] Commission Guidelines, section 3.4
[7] Commission Guidelines, section 3.5
[8] Commission Guidelines, section 3.6
[9] Commission Guidelines, section 4
[10] Ibid
[11] Digital Services Act, Article 16(1)
[12] Digital Services Act, Article 25(1)
[13] Digital Services Act, Article 34(1)
[14] Digital Services Act, Articles 73 and 74
[15] MLex – Digital platforms whistleblower tools rolled out by EU Commission
[16] European Commission press release, 30 April 2024 – Commission opens formal proceedings against Facebook and Instagram under the Digital Services Act
[17] MLex – Facebook, YouTube, X, social media platforms should do ‘whatever it takes’ to fight misinformation, Barrot says