Guidelines for Social Media Platforms and Search Engines to Safeguard Elections

Published on 13 March 2024

The Digital Services Act (DSA) guidelines on the integrity of electoral processes aim to provide best practices and mitigation measures for addressing systemic risks on very large online platforms and search engines (VLOPs and VLOSEs) that could affect the integrity of democratic elections. The draft guidelines, published by the European Commission for consultation, include examples of potential mitigation measures for election-related risks and generative AI content, as well as guidance specific to the European Parliament elections.

These guidelines represent a positive step forward, offering specific measures and best practices for VLOPs and VLOSEs to address systemic risks to the integrity of electoral processes. We welcome the focus on understanding local contexts to tailor mitigation strategies effectively, the emphasis on engaging with local stakeholders, the use of resources such as EDMO hubs, and the establishment of an incident response mechanism involving senior leadership and stakeholders ahead of elections.

The guidelines also establish comprehensive recommendations on how VLOPs and VLOSEs should implement media literacy campaigns. Based on its work and experience, GLOBSEC suggests the following points for consideration and improvement:

  • Acknowledgment of Challenges Journalists Face: Recognise the hurdles in EU countries where media outlets and independent journalists face smear campaigns. Because of these attacks, the target audiences of media literacy campaigns may distrust the very journalists and outlets involved in delivering them.

  • Tailoring Campaigns: Customise media literacy and inoculation campaigns to specific countries and target audiences. For instance, in the summer of 2023, Meta launched the "Focus on Facts" campaign featuring cartoons in English. In information environments saturated with disinformation from over 300 outlets and various domestic actors, including AI-generated content, English-language cartoons or posters may have limited impact, especially among vulnerable groups identified by GLOBSEC Trends public opinion polls, such as middle-aged and elderly people who do not speak English.

  • Consistency and Reporting: Maintain consistency in media literacy campaigns and report on their impact. Initiatives launched only weeks or months before elections are unlikely to counteract years of information operations spread by political representatives and other domestic actors across the EU. It is also essential to regularly inform relevant stakeholders about the impact, reach, and effectiveness of these campaigns during both their planning and execution phases. Effective communication campaigns require continuous monitoring and adjustment, particularly if messages fail to resonate with the intended audience or if significant developments or new disinformation narratives emerge. A post-election review of media literacy campaigns should be conducted within three months of the elections to evaluate their impact.

  • Financial Support: Provide financial assistance to local civil society organisations so they can effectively carry out activities associated with the DSA and the Code of Practice on Disinformation. Many organisations within EDMO hubs may lack adequate resources and capabilities if their projects do not specifically allocate funding for such initiatives. Moreover, the requirement of 50% co-financing from partners, including civil society organisations, in new EDMO projects presents challenges. This financial framework, combined with the expectation that CSOs engage in activities such as analysing transparency reports and conducting media literacy campaigns without compensation, may not be sustainable in the long term.

  • Prohibition of Deepfake Political Advertising: VLOPs and VLOSEs should ban political advertising that contains deepfake disinformation. A case from Slovakia illustrates the issue: a political party circulated a deepfake video of President Zuzana Čaputová as a paid political advertisement, even airing it during the pre-election silence period. Political entities must adhere to platform community rules regardless of whether their content is promoted through paid advertising. Platforms should also be aware that false information and disinformation are spread by prominent political figures in some EU countries; political content should therefore be subject to the same rules and moderation processes as any other content.

  • Prohibition of Paid Content During Electoral Silence: VLOPs should prohibit paid content on their platforms during electoral moratoria. 

  • Transparency in Ad Repositories: Ad repositories should also include information on paid or sponsored content published by the social media platforms themselves, including content promoting their media literacy campaigns.

  • Access to Data: Expedite the processes for granting researchers meaningful access to data under Article 40 of the DSA ahead of the EP elections.

  • Removal of Inconsistencies in AI Content Moderation: Addressing inconsistencies in the labelling and moderation of AI-generated content is crucial for establishing uniform policies and improving content moderation. Understanding why platforms take varying approaches and decisions on removing certain AI-generated content is essential to this effort. During the Slovak parliamentary elections, platforms took divergent approaches to removing or labelling AI-generated videos questioning the integrity of the elections. Of 48 posts featuring such videos, Meta removed 15 and labelled 14, while 19 remained online without any form of fact-checker label.

  • Ensuring Political Neutrality of Digital Services Coordinators (DSCs): Ensure that DSCs and the institutions responsible for organising elections in EU countries act in an apolitical manner.


Research Fellow, Centre for Democracy & Resilience