
Central and Eastern Europe’s Blueprint for Social Media Regulation: Recommendations for the New EU Commission

Published on 02.10.2024
Fortifying Digital Democracy: Central and Eastern Europe’s Blueprint for Social Media Regulation

Since 2018, the EU has led efforts to regulate social media through frameworks such as the Code of Practice on Disinformation and the Digital Services Act (DSA), aimed at combating the misuse of social media platforms, which deepens polarisation, undermines democratic processes, and spreads state-sponsored propaganda and disinformation.

The adoption of the 2018 Code of Practice on Disinformation[1] marked a pioneering effort in the regulation of social media platforms, representing one of the first self-regulatory approaches in the field. Voluntary self-regulation, however, was deemed insufficient, particularly after the challenges posed by the infodemic during the COVID-19 pandemic,[2] and led to a strengthened version adopted in 2022 through a more inclusive process that included non-platform signatories.[3]

The DSA[4] and the Digital Markets Act (DMA), partially entering into force in August 2023,[5] alongside the adoption of the EU AI Act, lay the groundwork for global standards on transparency and platform oversight, anchored in the principles of human rights. However, the monetisation of disinformation through paid advertising, with its consequent multi-billion annual revenues, continues to trump democratic principles and platforms' adherence to their own terms of service.[6]

In an effort to assess the key achievements and gaps of the last EU election cycle and provide recommendations for the next EU Commission, GLOBSEC conducted a Central and Eastern Europe-wide exercise that resulted in the report “Pivotal moment for Europe: Central European proposals for the next EU leadership”. It was based on a consultation of dozens of experts from across and beyond the region in the first quarter of 2024, with chapters covering a wide range of EU-relevant policies. For the purpose of this paper, two workshops were organised with 13 experts, fact-checkers and policy advisers, focusing on social media regulation and countering disinformation. The focal point of the discussions was to gather feedback on the processes and effectiveness of the Code of Practice on Disinformation and the DSA, and to explore solutions to fill existing gaps. The consulted experts identified major shortcomings in both self-regulatory and regulatory mechanisms,[7] including content moderation, identity verification processes, advertising and AI-generated content.

According to the experts, while the regular transparency reports submitted under the revised Code of Practice on Disinformation, as well as those mandated by the DSA, furnished some degree of insight into the policies and structures of social media platforms, the reports continue to suffer from deficiencies in quality and informational value.[8] Spanning hundreds of pages, these reports often lack contextualisation and contain partial or extraneous information without furnishing specific and coherent details regarding the impact and efficacy of the policies they describe.

Analyses conducted by the European Fact-Checking Standards Network (EFCSN) and civil society organisations, many of which are non-platform signatories to the Code, concluded that platforms' policies and practices have exhibited minimal change since the Strengthened Code's inception. A systematic review conducted by the EFCSN concerning the implementation of the Code underscored platforms' persistent shortcomings in fulfilling their commitments, indicating that “platforms and search engines are still far from fulfilling their promises and do not have effective risk mitigation measures against disinformation in place, as DSA requires.”[9]

Consulted CEE experts have compared the transparency reports to "PR exercises" for platforms, rather than robust instruments capable of demonstrating the outcomes of policy measures and actions. Moreover, the data provided by the Very Large Online Platforms (VLOPs) lack standardisation and preclude meaningful comparisons across platforms due to discrepancies in data collection methodologies and computations, thereby impeding comprehensive analysis and evaluation.

“At the moment, VLOPs can report anything without anybody having the ability to catch them or provide scrutiny,” said a media expert from Czechia.

Furthermore, with the DSA’s provisions having been slowly enacted for over a year, one of the key gaps identified remains data access for researchers, the operationalisation of which is crucial to researchers’ ability to keep the platforms in check.

Overall, CEE experts concluded that platforms have not taken any major, impactful steps and that, beyond the above-mentioned gaps, the need for strengthened, structured and ongoing cooperation between social media platforms and civil society organisations persists as well.

These issues remain problematic due to continuous foreign information manipulation and interference (FIMI) by a rising number of countries, including Russia and China. Such operations have become multi-faceted, running simultaneously on a number of social media platforms and utilising both inauthentic behaviour, in the form of bots and trolls, and authentic promotion by real people, including paid influencers. For instance, the Doppelgänger operation, first observed in 2022, demonstrates how VLOPs have been unable to prevent inauthentic behaviour on their platforms, failing to enforce standards for political ads while continuing to generate revenue from the spread of disinformation.[10] A recent report by AI Forensics on Doppelgänger found that 60% of ads on Meta did not adhere to the company’s own guidelines on political advertising, and that pro-Russian messages reached over 38 million users in France and Germany in the months before the European Parliament elections.[11] The monetisation of disinformation was also observed in connection with the war in Ukraine.[12] This lack of enforcement remains an issue also due to an increasing number of domestic actors, including political representatives, spreading disinformation,[13] the monetisation of which occurred, for example, prior to the Slovak presidential elections in spring 2024.[14]

The EU's continued policy priorities should focus on increasing the transparency of VLOPs, empowering users, and enhancing the transparency of algorithms. By improving the transparency reports and gaining a better understanding of the measures taken by the platforms, the EU can hold platforms accountable and ensure compliance with regulations. Empowering users with better tools to understand and control their data and content exposure is crucial for fostering a safer online environment. Additionally, transparency of algorithms will help demystify the decision-making processes that influence content visibility and dissemination, reducing the potential for biased or harmful outcomes, with the provision of access to reliable data for researchers being an important element of platform oversight.

Recommendations: 

  • Expedite finalisation of processes and structures outlined in the DSA, both at national and EU levels

Despite the DSA coming into effect in August 2023, delays in legislative processes and the establishment of critical bodies, such as national Digital Services Coordinators (DSCs),[15] have slowed down progress. In April 2024, the EU Commission launched infringement procedures against Estonia, Poland and Slovakia, and in July it initiated procedures against Belgium, Spain, Croatia, Luxembourg, the Netherlands, and Sweden because they had not yet designated their Digital Services Coordinators, had failed to grant these authorities the powers essential to perform their duties, or both. As of 15 September 2024, Poland and Belgium had still not appointed national coordinators. Such delays are hindering the effective implementation of the DSA and the enforcement of social media regulation in EU member states.[16]

Furthermore, the potential politicisation of DSCs necessitates careful consideration amidst growing concerns of democratic erosion in various EU member states. While the responsibility for establishing DSCs rests with member states, the new EU Commission should mandate that these institutions possess political independence, alongside adequate competencies and resources. Inclusive decision-making processes involving diverse stakeholders can foster a more representative approach, mitigating the risk of undue political influence on DSCs. Additionally, transparent communication of these processes to the public is imperative.

The implementation of the DSA may also encounter difficulties due to insufficient financial resources and personnel within DSCs, impeding their capacity to meet the newly mandated obligations. Both the EU and member states should allocate sufficient financial resources not only for their own institutions dealing with the implementation of the DSA and the Code of Practice, but also for researchers and civil society organisations.

  • Ensure comprehensive mapping of capacities and requirements for regulation enforcement

The regulation of social media platforms, its effective enforcement, and the protection of democratic principles necessitate a collective endeavour encompassing a range of stakeholders.[17] The efficacy of such efforts may, however, be compromised by a lack of sufficient capacities for delivery and oversight. The EU's initiation of regulation without a thorough understanding of existing capacities, such as the composition of trust and safety teams within platforms or the distribution of moderators across countries, underscores the need for a comprehensive analysis of resources and capacities at the EU, member state, and VLOP levels.

  • Establish a competence centre for enforcement and implementation

The effectiveness of the regulation hinges on the enforcement of the measures in place. The EU Commission should demonstrate the commitment and capacity to enforce regulations and to hold non-compliant VLOPs accountable. While certain cases may necessitate litigation, a slow response in enforcing rules against non-compliant VLOPs sets a negative precedent and undermines the legislation’s power. A competence centre focused on the enforcement and implementation of the Code and the DSA would have the potential to close some of the gaps currently identified by research and to mitigate the temporary lack of resources in some member states.

With the new EU Commission still being confirmed, the research community anticipates a clear division of competences for the portfolio covering social media regulation, countering disinformation and FIMI, as well as clarity on the key priorities in this respect. The European Parliament’s special INGE Committee, whose mandate ran out before the 2024 EU elections, was a good institutional practice that should be repeated for the next cycle.

  • Secure funding for researchers, fact-checkers and civil society organisations

The provision of funding for researchers and civil society organisations involved in protecting the digital space from digital threats is insufficient. As early as May 2023, several counter-disinformation NGOs led by EU DisinfoLab called on the EU Commission to create a dedicated EU budget line for CSOs working in the field of countering disinformation, to ensure their sustainability and viability.[18] While the EU institutions rely on data analysis, inputs to consultations, and feedback on transparency reports from civil society organisations, such activities are mostly unremunerated. Both the Code of Practice and the DSA expect participation from civil society organisations, which is welcome and necessary. However, researchers often end up overwhelmed by the amount of work required to monitor enforcement, as well as by the bureaucracy and costs involved in gaining access to the necessary data through platforms’ or commercial monitoring solutions.[19] While some funding opportunities have been developed, such as the European Digital Media Observatory and its hubs, the 50% co-financing frameworks make it difficult for not-for-profit entities to sustain their activities. According to the EFCSN, European fact-checkers do not believe VLOPs offer a “fair financial contribution for fact-checkers to combat disinformation on its service”, as they committed to do in the Code of Practice.[20]

The expectation that CSOs and fact-checkers will contribute to the DSA, the Code, research, and other activities without appropriate financial support is not sustainable in the long run. The allocation of such funds in the new EU budget is a must in the next cycle.

  • Enhance strategic communication on the DSA, DSCs, and their functions

Addressing criticism of social media regulation and dispelling misinformation regarding its objectives requires robust strategic communication efforts. Given the limited awareness of EU legislation among citizens, particularly in CEE countries, comprehensive explanations of the DSA and its implications are necessary. Strategic communication efforts must be tailored to the different audiences and the political and cultural contexts of EU member states, emphasising the objectives of social media regulation and dispelling misconceptions regarding censorship.[21] Moreover, building media literacy and fostering public understanding of EU processes are crucial for garnering support for regulatory measures. These efforts should include capacity building and communication with civil servants within EU member states, many of whom have limited knowledge of the legislation. The understanding that the regulation of social media is not about censorship, but about building trust, online safety, and protecting democratic principles and processes, should be universal.

The EU and its representatives should conduct better strategic communication, both in terms of engagement with EU citizens and in terms of coordinating and effectively countering malign narratives and building a resilient EU society. One-size-fits-all campaigns, such as EU NextGen or You Are EU, have limited impact in CEE countries. Media literacy initiatives of VLOPs, for example, could be used to complement the EU’s strategic communication in this area.

  • Develop tailored media literacy campaigns in collaboration with local stakeholders

Effective media literacy initiatives must be tailored to specific countries and target demographics, utilising appropriate communication channels and narratives. Collaborating with local partners, including civil society organisations and influencers, ensures culturally relevant content with the potential to resonate with the intended audience. Continuous monitoring of campaign effectiveness and sharing data on impact with stakeholders are essential for refining strategies and maximising outreach. In an ideal scenario, media campaigns should include pre- and post-campaign surveys among the target audience to measure effectiveness via observed changes in attitudes.

In the summer of 2023, Meta launched the "Facts in Focus"[22] campaign featuring cartoons in English in three countries: Bulgaria, Lithuania, and Slovakia. However, in environments saturated with disinformation spread by various domestic actors, including AI-generated content, English-language cartoons or posters had limited impact, especially among vulnerable societal groups such as the middle-aged and elderly who do not speak English.[23]

Instead, media literacy efforts by social media platforms should be consistent and long-term. Initiatives launched a few weeks or months before elections are unlikely to counteract years of disinformation spread by political representatives and other domestic or foreign actors across the EU.

  • Require improved access to data

Access to comprehensive data is vital for researchers, fact-checkers, and civil society organisations engaged in social media regulation. Access to data on most platforms under Article 40 of the DSA is currently non-existent, as it requires researchers to be vetted by national DSCs. While the Code of Practice (Commitment 14) also requires increased cooperation with researchers and data sharing, there are still large discrepancies among platforms in what data is provided and accessible for analysis.[24]

While, in the context of Article 40 of the DSA, researchers will be able to request a broad range of previously undisclosed data for their research, the research community needs to operationalise the so-called scraping provision under Article 40, as it is exactly this type of data that fosters exploratory research.[25] If researchers, fact-checkers, and civil society organisations are to be actively involved in the Code and the DSA, access to data and appropriate tools are the key preconditions. Meta’s announcement that it would shut down CrowdTangle, a monitoring tool used by tens of thousands of journalists, researchers and election observers, in August 2024 without a proper replacement was widely criticised by the research community.[26]

  • Promote the DISARM framework to increase the effectiveness of cooperation and counter-measures

Widespread utilisation of the common DISARM (Disinformation Analysis and Risk Management) framework by all stakeholders will streamline data sharing, analysis, and the coordination of effective measures. This must go hand in hand with strengthening information-sharing facilities and structures.

  • Increase requirements for the Journalism Trust Initiative certification

The goal of the Journalism Trust Initiative (JTI) certification, run by Reporters Without Borders, is to standardise the practices of legitimate media newsrooms by rewarding ethical journalism.[27] At the moment, however, JTI certification is rather a checklist exercise of possessing certain documents, without the quality of the content published by media outlets being taken into consideration. Such an administration-focused approach to certification can easily be misused by outlets spreading problematic content and disinformation: a disinformation outlet could, for example, have ethical standards on paper without properly enforcing them. Due to the limitations of this checklist approach, even outlets like RT (Russia Today) could technically become JTI-certified.

The concerning part is that Reporters Without Borders is negotiating with search engines and the European Commission to rank JTI-certified outlets higher in search engine results. According to consulted experts, it has also been suggested that JTI certification should be one of the criteria for media to receive funding from the EU and other donors. The limitations of JTI certification can therefore pose a threat to EU democracy by distorting the current media environment and media's collaborations with VLOPs.

The awarding of such certification should thus go beyond a checklist of necessary documentation and include real enforcement of editorial independence and journalistic standards. Local researchers and independent DSCs should play a role and provide insights into the independence of media outlets.

  • Invest in research on AI-driven technologies

With the proliferation of AI-generated disinformation, investing in the research and development of AI-driven technologies is imperative for bolstering resilience against digital threats. Allocating funding for research initiatives focused on AI-for-good technologies ensures the EU remains at the forefront of technological innovation and digital resilience. In addition, AI companies like OpenAI could in the future be included in the DSA and the Code of Practice on Disinformation framework, since they have become platforms people are beginning to use to work with information.[28] On the other hand, researchers from AI Forensics found that the moderation safeguards deployed by AI-driven virtual assistants such as Copilot, ChatGPT, and Gemini are widely divergent and inconsistent across languages, with moderation rates dropping from 90% for English to 30% or less for Romanian, Swedish, or German. Furthermore, AI Forensics also showed that AI-driven chatbots can spread false information in connection with elections: in their research, Microsoft Copilot's answers to simple election-related questions contained factual errors 30% of the time.[29]

  • Address challenges posed by platforms not covered by DSA

Despite not falling under the purview of the DSA, platforms such as Telegram[30] have emerged as significant sources of foreign influence and disinformation within the EU. The growing popularity of social media platforms originating from, or influenced by, companies in authoritarian countries, such as TikTok, VKontakte, or services like Telegram, presents several security challenges for democratic countries. Already in January 2021, Telegram was the most downloaded non-gaming application worldwide. While these platforms have become important channels of (pro-Kremlin) propaganda and disinformation, the misuse of users’ personal data by authoritarian regimes presents another security challenge. Advocating for the participation of such platforms in codes of conduct and fostering cooperation with relevant stakeholders can mitigate their negative impact on digital discourse.

 

Consulted experts:

Researchers from the Council for Media Services, Slovakia

Assistant of Markéta Gregorová, Czech Member of the European Parliament

Justin E. Lane, CulturePulse, Slovakia

Martin Luhan, Rekonstrukce Statu, Czechia

Meta’s fact-checkers covering the CEE region

Maia Mazurkiewicz, Alliance for Europe, Poland

Pawel Terpilowski, Demagog Poland, Poland

Peter Jančárik, Seznam, Czechia

Richard Kuchta, Reset/Reporting Democracy International, Germany

Dávid Púchovský, Assistant to the MP and former social media coordinator at the Ministry of Interior, Slovakia

Jakub Goda, expert on disinformation and former Social Media Administrator at the Office of the President of the Slovak Republic

Giedrius Sakalauskas, Director of Res Publica – Civic Resilience Center, Lithuania

Ieva Ivanauskaitė, Innovation and Partnerships Team Lead, Delfi, Lithuania


[2] GLOBSEC (2022) Fighting disinformation while safeguarding human rights: Key recommendations. Available at: https://www.globsec.org/what-we-do/publications/fighting-disinformation-while-safeguarding-human-rights-key-recommendations

[3] European Commission (2022) The 2022 code of practice on disinformation. Available at: https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation

[4] DSA is applicable for the services that have been already designated as Very Large Online Platforms (VLOPs).

[6] Paul, K. (2023) Reversal of content policies at Alphabet, Meta and X threaten democracy, warn experts, The Guardian. Available at: https://www.theguardian.com/media/2023/dec/07/2024-elections-social-media-content-safety-policies-moderation

[7] Elghawaby, A. (2018) Social media’s self-regulation isn’t enough, Centre for International Governance Innovation. Available at: https://www.cigionline.org/articles/social-medias-self-regulation-isnt-enough/

[8] Goujard, C. (2023) Critics hit out at social media platforms’ disinformation reports, Politico. Available at: https://www.politico.eu/article/critics-social-media-platforms-disinformation-report-european-union-meta-youtube-twitter-tiktok/

[9] European Fact-Checking Standards Network (2024) Fact-checking and related risk-mitigation measures for disinformation in the very large online platforms and search engines. Available at: https://efcsn.com/app/uploads/2024/01/FINAL_Fact_checking_and_related_Risk_Mitigation_Measures_for_Disinformation.pdf

[10] Goujard, C. (2024) Big, bold and unchecked: Russian influence operation thrives on Facebook, Politico. Available at: https://www.politico.eu/article/russia-influence-hackers-social-media-facebok-operation-thriving/

[11] Bouchaud, P. et al. (2024) No embargo in sight: Meta lets pro-Russian propaganda ads flood the EU, AI Forensics. Available at: https://aiforensics.org/work/meta-political-ads

[12] Visser, F. et al. (2023) Cashing in on conflict: TikTok profits from pro-Kremlin disinformation ads. Available at: https://www.isdglobal.org/digital_dispatches/cashing-in-on-conflict-tiktok-profits-from-pro-kremlin-disinformation-ads/

[13] Amnesty International (2022) Myanmar: Facebook’s systems promoted violence against Rohingya; Meta owes reparations – New report. Available at: https://www.amnesty.org/en/latest/news/2022/09/myanmar-facebooks-systems-promoted-violence-against-rohingya-meta-owes-reparations-new-report/

[14] Suchý, M. & Daňko, L. (2024) Antikampaň proti Korčokovi tesne pred moratóriom zasiahla milión používateľov. Platia ju ľudia blízki Pellegrinimu [The anti-campaign against Korčok reached a million users just before the moratorium. It is paid for by people close to Pellegrini], Zastavme korupciu. Available at: https://zastavmekorupciu.sk/kauzy/antikampan-proti-korcokovi-tesne-pred-moratoriom-zasiahla-milion-pouzivatelov-platia-ju-ludia-blizki-pellegrinimu/

[15] Cunningham, F. & Sasdelli, P. (2024) What countries have appointed their Digital Services Coordinators under the DSA? Available at: https://www.twobirds.com/en/insights/2024/global/which-countries-have-appointed-their-digital-services-coordinators-under-the-dsa

[16] Tar, J. (2024) EU Commission urges six member states to appoint authorities for DSA enforcement, Euractiv. Available at: https://www.euractiv.com/section/data-privacy/news/eu-commission-urges-six-member-states-to-appoint-authorities-for-dsa-enforcement/; Kroet, C. (2024) Six EU countries pressed to appoint platform watchdogs, Euronews. Available at: https://www.euronews.com/next/2024/04/24/six-eu-countries-pressed-to-appoint-platform-watchdogs

[17] Gori, P. (2024) Countering disinformation: A whole-of-society approach beyond traditional frameworks. Available at: https://edmo.eu/blog/countering-disinformation-a-whole-of-society-approach-beyond-traditional-frameworks/

[18] EU Disinfo Lab (2023) Stand by us - counter-disinformation community urges the European Commission to walk the talk and back their efforts to protect European democracies against disinformation. Available at: https://www.disinfo.eu/advocacy/stand-by-us-counter-disinformation-community-urges-the-european-commission-to-walk-the-talk-and-back-their-efforts-to-protect-european-democracies-against-disinformation/

[19] Jaursch, J. (2024) The Digital Services Act is in effect - now what? Stiftung Neue Verantwortung. Available at: https://www.stiftung-nv.de/en/publication/digital-services-act-now-what

[20] European Fact-Checking Standards Network (2024) Fact-checking and related risk-mitigation measures for disinformation in the very large online platforms and search engines. Available at: https://efcsn.com/app/uploads/2024/01/FINAL_Fact_checking_and_related_Risk_Mitigation_Measures_for_Disinformation.pdf

[21] Holan, A. (2024) Let’s say it plainly: Fact-checking is not censorship, Poynter. Available at: https://www.poynter.org/commentary/2024/fact-checking-is-not-censorship/

[22] Meta (2023) Facts in focus: Combatting fake news through the artists’ lens. Available at: https://about.fb.com/news/facts-in-focus-combating-fake-news-through-the-artists-lens/

[23] GLOBSEC (2021) GLOBSEC Vulnerability Index 2021. Available at: https://www.globsec.org/sites/default/files/2021-11/Vulnerability-Index_Comparative-report.pdf

[24] Digital Democracy Monitor (n.d.) Data access. Available at: https://digitalmonitor.democracy-reporting.org/data-access/

[25] VLOPs and other providers use technical restrictions, such as the blocking of IP addresses, to limit researchers’ access to data and to block unlimited scraping of data from their platforms. Article 40(12) of the DSA lifts these technical restrictions or enables researchers to obtain special “scraping exceptions” from them. “While API access based on Article 40(12) should be further developed, the possibility to scrape data provides alternative access to public data regardless of what APIs or other tools are offered by providers. Scraping puts pressure on VLOPs to create properly functioning APIs, and facilitates that researchers explore the relevant risks before they formulate their requests under Article 40(4).” Husovec, M. (2023) How to Facilitate Data Access under the Digital Services Act. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4452940

[27] Journalism Trust Initiative (2024) The Journalism Trust Initiative. Available at: https://www.journalismtrustinitiative.org/

[28] Similarweb. chat.openai.com Website Analysis for August 2024. Available at: https://www.similarweb.com/website/chat.openai.com/#overview

[29] Romano, S., Stanusch, N., Schüler, M., et al. (2024) Chatbots: (S)elected Moderation: Measuring the Moderation of Election-Related Content Across Chatbots, Languages and Electoral Contexts, AI Forensics. Available at: https://aiforensics.org/work/chatbots-moderation

[30] Telegram was not designated as a VLOP that has to comply with the DSA because the number of monthly active users it reported is lower than 45 million, the threshold required for such designation. Telegram, along with all other platforms, must publish updated user numbers at least once every six months.

Authors

Senior Research Fellow, Centre for Democracy & Resilience