Pre-bunking and Public Inoculation for Information Integrity

An approach to building public resistance to information pollution is ‘pre-bunking’, also called ‘inoculation’. It seeks to pre-emptively build psychological immunity to online misinformation by presenting audiences with content that demonstrates the types of narratives, techniques or sources that may be used to influence them in the course of an election.

ACTIVITY

DESCRIPTION

Pre-bunking attempts to counter information pollution messages before they are deployed. Pre-bunking messages are intended to increase audiences’ ability to make reliable judgements about whether content is information pollution. The messages may take different forms, such as interactive games, videos and posts.

This approach is expected to avoid some of the constraints associated with other established information-integrity activities. In particular, it avoids the persistent challenge of correcting information pollution only after audiences have been exposed to it, and it is envisaged to be easier to deploy than broad civic education programmes.

There are three main types of pre-bunking content:

  1. Fact-based: correcting a specific false claim or narrative.
    In the context of elections, experts may be able to anticipate the most likely narratives that will be used to undermine the integrity of the process. Examples include false claims about the accuracy of the voter register, the correctness of the election results, tampering by an out-group or the EMB’s impartiality.
  2. Logic-based: explaining tactics used to manipulate.
    Logic-based pre-bunking is seen as particularly effective at strengthening people’s ability to resist information pollution. Given that the techniques are likely to be the same for electoral and non-electoral content, previous country experience can guide approaches. Applying this approach may also have an impact on the broader political process, beyond elections.
  3. Source-based: pointing out bad sources of information.
    It should be noted that pointing out certain types of outlets can be easily politicized. Source-based pre-bunking may be particularly complicated in the electoral context, since candidates themselves are often the source of information pollution. Depending on who is implementing this work and what the sources are, there may be protection and security concerns to consider.

What to pre-bunk – The first step is to understand what concerns should be addressed: specifically, what voters are susceptible to, what the risks to the process are and how information pollution is deployed.

How to design the pre-bunk – The structure of a pre-bunk is key to its psychological effect. Messages grounded in lived experience and truth are more impactful. Users should be informed early on of the nature of the message and the goals of the pre-bunk.

How to get the pre-bunking to users – Content must be shareable and simple, increasing its chances of going viral. Paid advertising can also help ensure that content reaches its audience. Distribution planning should, of course, prioritize the ‘locations’ where target groups are. Public information campaigns, similar to the one UNESCO implemented in Kenya, are another option.

1. Who is best placed to implement the activity?

Key considerations are the trust in, and reach of, those delivering the messages. Trust, as a measure of how likely someone is to believe and act on a message, usually combines confidence in both the message and the messenger. Trusted information providers are individuals and organizations with a high degree of credibility. In health, these may include national and international health authorities or locally well-known medical media figures. For elections, they are EMBs and other official organizations that are generally trusted to provide credible information about polling places, election dates, etc. However, while these bodies may have an online presence through websites and social media accounts, they are often not active in making information visible in the spaces where people seek and share rumours and misinformation. Depending on the circumstances and the organization, civil society actors may be well placed to share credible content widely. Fact-checking organizations might also play a major role in this process.

Because disinformation producers often attack the credibility of trusted information providers to amplify their own messaging, trust in a particular institution or organization may already have been diminished.

2. How to ensure context specificity and sensitivity?

Underpinning the decision to undertake this work, and its design, should be a clear understanding of the unique country context. A clear understanding of the logic-based and fact-based risks is important to the design of the programme. Conflict risk assessments may help identify the critical activities within the election cycle. Similarly, a review of the tactics and narratives that have been used to spread information pollution in the past can inform the activity.

Strong user-testing can help check how content and messages will be received by the audience and whether they are likely to have the intended impact. How this is structured will depend on the circumstances but could involve approaches such as focus groups, A/B testing, etc.

3. How to involve youth?

Youth are likely to be a key audience here, so their inclusion in any testing is essential. Youth-dominated CSOs may also contribute to the production of such content.

4. How to ensure gender sensitivity/inclusive programming?

Attention should be paid to narratives that target women and vulnerable groups, with consideration given to challenging these attacks through specific pre-bunking campaigns. As with youth, the involvement of women and vulnerable groups in any review and testing is essential.

5. How to communicate about these activities?

Fundamental to the success of this approach is that the right messages reach the right people. This may mean heavy targeting or the widest possible distribution, depending on the context and capabilities, and may involve online channels as well as offline means such as radio or television. Leveraging any existing coordination structures may assist here, and platforms may be engaged to help distribute content. Second, the message needs to be ‘sticky’. Developed by Chip Heath and Dan Heath, the six S.U.C.C.E.S. factors—Simple, Unexpected, Concrete, Credible, Emotional, Stories—are useful when considering ways to make narratives stickier.

6. How to coordinate with other actors/which other stakeholders to involve?

Engagement with the EMB may be key to designing an appropriate campaign. More broadly, the larger the institutional network that is built, the further the messaging will reach. Media and digital platforms may also be valuable partners in this work—especially when it comes to design and distribution.

7. How to ensure sustainability?

Working through civil society actors or local media organizations on the design and production of the activity can help strengthen the capacity of these partners to undertake similar activities in the future.

COST CENTRES

While it is difficult to truly advise upon costings without a clear vision of a particular project, some factors to consider include:
Experts – A clear strategy is required to define the overall approach, identify the key issues to protect against and situate this activity within electoral concerns. Experts may be needed to support the design of the activity, and since the quality of content is vital to the likelihood of success, having experts able to support the design of messaging is important.
Timeframe – The topics addressed may not be solely electoral concerns. In fact, to increase efficacy for an election, these activities likely need to begin well before the electoral process, and this duration may have costing implications.
Technical – The production values of the content are important, making access to an agency with the appropriate capacities essential. Ensuring the agency also has thematic knowledge is likely to make the work more straightforward.
Technology – Depending on the approach, development and hosting capabilities may be required—for example if the media of choice is a video game.
Research – User research is key to honing messages, so funds should be assigned to recruit participants and conduct sessions. Impact research is also important to understand how campaigns are, or are not, influencing behaviour and how to improve the next campaign. The design of the approach will dictate the cost; however, this is not an area in which to trim costs.
Distribution – In order to expand viewership of content, the ability to publicize it—both online and on traditional media—is important.

LIMITATIONS AND CHALLENGES

  • This approach enjoys a growing empirical evidence base indicating that it can have a tangible impact. At the same time, while it can potentially improve the resilience of users, it is a matter of degree, and some information pollution will continue to affect some users. Accordingly, it should be combined with other interventions as part of a holistic approach.
  • Changing user behaviour is challenging and therefore requires time and money, creating a budgetary hurdle.
  • While engagement with platforms may be valuable, organizations should be careful about lending their credibility to social media companies.
  • While pre-bunks generally do not contain the mis/disinformation they aim to address and are therefore less delicate than debunks, sometimes the best answer may be to do nothing, as the relative risk of countering a particular message may be greater than that of not engaging at all. Risk evaluation matrices can help assess the risk associated with a given type of response based on the threat level of a particular piece of content.

RESOURCES

The Vaccine Misinformation Field Guide

The field guide is a resource created by UNICEF in collaboration with The Public Good Projects, First Draft and the Yale Institute of Global Health. It was developed to facilitate strategic and well-coordinated national action plans, informed by social listening, to rapidly counter vaccine misinformation and build demand for vaccination. While specific to COVID-19 and vaccine-related disinformation, the guide can help practitioners in other areas of misinformation by providing an operational framework for misinformation management.

Beautiful Trouble Toolbox

Beautiful Trouble is a global network of organizers, artists, trainers and writers who equip social movements with strategic tools and training to help grassroots movements be more creative, effective and irresistible. Its resources include a book, a strategy card deck, an online toolbox and the Action Lab, a creative campaign incubator, offering content on how nonviolent change happens.

The Narrative Power Analysis Worksheet 

The Worksheet is a tool to assess narratives and their potential impact, alongside potential narrative shifts or pre-bunks.

EXAMPLES

Sander van der Linden’s work on countering narratives includes three games developed in collaboration with DROG and Gusmanson Design, each with a different theme:
1. Bad News
2. Harmony Square
3. Go Viral

IMPLEMENTATION PROCESS

COUNTRY DEPLOYMENTS

ADDITIONAL INFORMATION


Information Integrity E-learning

Coming soon