EC Library Guide on disinformation and misinformation: Selected publications
Selected publications from international organisations
- The 101 of disinformation detection
Miller, C. and Colliver, C., Institute for Strategic Dialogue, 2020.
Not every organisation can or should become a disinformation detective. But disinformation can threaten the activities, objectives and individuals associated with civil society groups and their work. Disinformation tactics and the responses in place to try to mitigate them online are changing rapidly. Organisations witnessing or targeted by disinformation therefore require a baseline understanding of the threats posed by disinformation and how to spot them while conducting their work. This toolkit sets out simple steps to do so.
The toolkit lays out an approach that organisations can undertake to begin to track online disinformation on subjects that they care about. The process is intended to have a very low barrier to entry, with each stage achievable using either off-the-shelf or free-to-use social listening tools. For a deeper explanation of the methods, teams and skills required to build a disinformation detection system, see ISD’s accompanying roadmap for the disinformation research sector: ‘Developing a Civil Society Response to Online Manipulation’.
- Collective human action against deepfakes
Casale, P., Osin, V., Raguckaja, G., et al., Freedom from Fear, 2018 (15), United Nations, 2020.
For Immanuel Kant, our senses are the gateway through which we perceive information from our environment and generate our knowledge. Yet in the age of advanced technology, our senses are easily subject to manipulation. In this context, a fundamental question arises: can we, humans with manipulated senses, continue to rely on our own decision-making?
There has been unprecedented progress in the quality of techniques for human image synthesis based on Artificial Intelligence (AI), which can manipulate our sense of sight. Deepfakes constitute the most famous example. In just a few years, many alarming examples of fake content have involved politicians, governments, technology leaders and media celebrities. What does this mean for our future, the future of our societies and the future of our countries? What will this manipulation entail when we exercise our rights as citizens and voters? Perhaps, instead of jumping into the complexity of these questions, it is worth focusing on how our collective efforts can help prevent technology from manipulating our senses. This consideration served as a guiding principle for the solution developed by the Open|DSE team in response to the UNICRI challenge at the Hackathon for Peace, Justice and Security (The Hague, June 2019). Before proceeding with the description of the solution, let’s have a closer look at the AI technology behind the creation of this fake content.
- The conspiracy theory handbook
Lewandowsky, S. and Cook, J., Center for Climate Change Communication, 2020.
Conspiracy theories attempt to explain events as the secretive plots of powerful people. While conspiracy theories are not typically supported by evidence, this doesn’t stop them from blossoming. Conspiracy theories damage society in a number of ways. To help minimise these harmful effects, this book explains why conspiracy theories are so popular, how to identify the traits of conspiratorial thinking, and which response strategies are effective.
- Disentangling untruths online: Creators, spreaders and how to stop them
Lesher, M., Pawelec, H. and Desai, A., in OECD Going Digital Toolkit Notes, nº 23, OECD, 2022.
Stopping the creators and spreaders of untruths online is essential to reducing political polarisation, building public trust in democratic institutions, improving public health, and more generally improving the wellbeing of people and society. This Going Digital Toolkit note discusses the importance of access to accurate information online and presents a novel typology of the different types of untruths that circulate on the Internet. It considers how untruths are spread online as well as the consequences, and it surveys the evidence base of false and misleading information online. The note concludes by identifying approaches to fighting untruths online and mitigating their negative effects.
- Facts not fakes: Tackling disinformation, strengthening information integrity
OECD Publishing, 2024.
Rising disinformation has far-reaching consequences in many policy areas ranging from public health to national security. It can cast doubt on factual evidence, jeopardise the implementation of public policies and undermine people's trust in the integrity of democratic institutions. This report explores how to respond to these challenges and reinforce democracy. It presents an analytical framework to guide countries in the design of policies, looking at three complementary dimensions: implementing policies to enhance the transparency, accountability, and plurality of information sources; fostering societal resilience to disinformation; and upgrading governance measures and public institutions to uphold the integrity of the information space.
- Good practice principles for public communication responses to mis- and disinformation
OECD, OECD Public Governance Policy Papers, (30), 2023.
This document presents the principles of good practice for public communication responses to mis- and disinformation. The Principles aim to help governments counter mis- and disinformation via the public communication function and other policy responses through strengthening domestic and international media and information ecosystems and reinforcing democracy. This document identifies nine common principles underpinning good practices for how governments can engage with partners across citizens, civil society and the private sector, based on evidence and interventions observed around the world during the COVID-19 pandemic and beyond.
- How to spot spin and inappropriate use of statistics
Bolton, P., UK Parliament, House of Commons Library, 2023.
Statistics can be misused, ‘spun’ or used inappropriately in many different ways. This is not always done consciously or intentionally, and the resulting facts or analysis are not necessarily wrong. They may, however, present a partial or overly simplistic picture. This briefing sets out some common ways in which statistics are used inappropriately or spun and gives some tips to help spot this.
- Media freedoms and civic space in the digital age for transparency, accountability and citizen participation
OECD, in The protection and promotion of civic space: Strengthening alignment with international standards and guidance, 2022.
This chapter provides an overview of the status of press freedom and civic space in a digitalised world, including relevant legal frameworks. It discusses harassment and attacks targeting journalists and makes suggestions on building the necessary enabling environment for reliable, fact-based journalism. It considers the protection of online civic space for citizens and related challenges such as hate speech and mis- and disinformation. It concludes with an analysis of the importance of personal data protection for civic space and safeguarding civic freedoms in the context of increased use of artificial intelligence (AI).
- Our common agenda policy brief 8: Information integrity on digital platforms
United Nations, 2023.
The present brief outlines potential principles for a code of conduct that will help to guide Member States, the digital platforms and other stakeholders in their efforts to make the digital space more inclusive and safe for all while vigorously defending the right to freedom of opinion and expression, and the right to access information. The Code of Conduct for Information Integrity on Digital Platforms is being developed in the context of preparations for the Summit of the Future.
- Last Updated: Oct 7, 2024 4:44 PM
- URL: https://ec-europa-eu.libguides.com/disinformation