Publication

Fighting Disinformation While Safeguarding Human Rights: Key Recommendations

16.02.2021

The Alliance for Healthy Infosphere has come together to contribute to the UN OHCHR Special Rapporteur's report on disinformation and human rights, which will be presented at the 47th session of the Human Rights Council in June 2021.

Disinformation campaigns are not only utilised to spread false narratives and undermine democratic processes; they are also part of a self-sustaining financial ecosystem. Because of their devastating impact on the social fabric, it is essential to regulate, through efficient legislation, the digital space that enables the uncontrolled dissemination of disinformation. However, half-baked digital laws can also backfire and serve to oppress journalists, civil society, political opposition and minorities in non-democratic regimes. The adoption of such laws was particularly prevalent in 2020, as states restricted the flow of certain information in an effort to mitigate the COVID-19 infodemic, or under the guise of doing so.

The Alliance for Healthy Infosphere advocates for a healthier information environment for all, while preserving democratic principles and human rights, foremost among them the right to free speech. Below are our joint inputs:

  1. Media literacy and critical thinking: Education systems need to be reformed to include digital skills and online responsibility, starting at an early age. Finland serves as a successful example of how resilience to disinformation can be fostered through an agile, modern education system. Lifelong learning and other dedicated programmes should also be established for older generations and disadvantaged communities.
  2. Digital platform regulation: Digital platforms need to operate within a clearer legal framework. As the voluntary self-regulation measures applied by social platforms have not worked, efficient regulation or co-regulation regimes, such as those currently being developed by the EU, will need to be implemented. Other promising models could also help address the situation, such as treating digital platforms as information fiduciaries.
  3. Application of community standards: Rules of conduct need to be enforced systematically, regardless of a country's size. Social media platforms need to invest in local fact-checking and content-moderation capacities. Furthermore, functional, independent appeal processes need to be set up.
  4. Demonetize disinformation: Tech giants need to do more to demonetize disinformation. One successful example is appealing to advertisers to stop placing ads on disinformation sources.
  5. Transparency: Greater algorithmic transparency and independent audits are needed to assess whether the measures taken by social media platforms are actually working.
  6. Invest in independent journalism and develop the strategic communication capacities of public institutions.
  7. More data: More detailed country-level information on coordinated malign activity by bots, trolls and fake accounts, and on how much malign content social media platforms have taken down, is required to grasp the full scope of the problem. Similarly, more detailed information on advertisements on social media platforms is needed to identify who is being targeted and why, as well as the sources of funding.
  8. What is illegal offline is illegal online: Existing legislation on illegal content is often not systematically enforced. Capacities to prosecute such digital offences should be further developed and funded, for example through taxation of digital platforms.

Learn more in our position paper below.