Information Landscape Assessment

Understanding the information landscape in a country is foundational to the design and implementation of an information integrity strategy. This activity describes the conduct of an in-country information landscape assessment to map the information ecosystem relating to elections and disinformation/hate speech. The ultimate goal is to understand the dynamics and risks in the country, which will support the design of an appropriate strategy that will, in turn, support the conduct of peaceful and credible elections.

The assessment may include various data-collection methods, including monitoring, investigation and consultations with a broad range of stakeholders. Several approaches may be required, depending on the specific scope and needs.

ACTIVITY

DESCRIPTION

This tool serves as a starting point for many of the information integrity workstream-related programmatic activities included in this knowledge hub and aims to provide the data required to inform planning and implementation decisions. In turn, the design of a specific assessment is shaped by the requirements it is intended to serve and by needs that should be defined in advance.

There are a number of complex underlying concerns that assessors may seek to understand, though these will vary by country, and the ability to answer them will often be limited. They include the relationship between the information landscape and measures such as the degree of polarization, electoral participation rates, the level of freedom of expression and trust in institutions (especially the Election Management Body, EMB), and ultimately the risk to credible and peaceful elections.

This information landscape assessment involves the collection and analysis of data on disinformation and hate speech narratives to produce evidence-based reports and assessments. While the initial areas of review will be refined depending on the country’s specific needs, the following may constitute overarching inputs:

  • Narratives – Determine what the common hate speech and disinformation narratives are, in both organic and coordinated content.
  • Impact – Identify the specific themes and impact of information pollution in the country/context, and how and why these change over the course of the investigation.
  • Threat Actors – Determine who is orchestrating coordinated political/electoral hate speech and disinformation, and how impactful these activities are.
  • Context – Establish how narratives, trends and actors differ across geographic regions within a given country/context.
  • Monitoring – Conduct ongoing monitoring of online and offline hate speech, disinformation and incitement within agreed parameters. Particular attention may be required to the targeting of vulnerable communities of concern, including women.

The assessment may be conducted using a variety of data collection methods, including some or all of the following: interviews and consultations, monitoring of online and offline media using social network analysis methods, and content discovery and social listening tools.
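To illustrate what the social network analysis component of such monitoring can look like in practice, the sketch below builds a simple share network from exported social media data and ranks accounts by how heavily they are amplified. It is a minimal illustration in Python using the networkx library; the file name and column names (source_account, shared_from) are hypothetical placeholders for whatever export format the chosen social listening tool provides.

    # Minimal sketch: build a directed share/repost network and rank accounts
    # by in-degree centrality to flag heavily amplified accounts for review.
    import csv
    import networkx as nx

    graph = nx.DiGraph()

    # Hypothetical export: one row per share, naming the sharing account and
    # the account whose content was shared.
    with open("exported_shares.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            graph.add_edge(row["source_account"], row["shared_from"])

    # Accounts with high in-degree centrality are amplified most often and are
    # candidates for closer qualitative review (e.g. as potential narrative hubs).
    centrality = nx.in_degree_centrality(graph)
    for account, score in sorted(centrality.items(), key=lambda x: x[1], reverse=True)[:20]:
        print(f"{account}\t{score:.3f}")

Quantitative signals of this kind only narrow the field; attributing coordinated behaviour to specific threat actors still requires qualitative verification.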

The following deliverables may constitute part of the assessment assignment:

  • Inception report and workplan
  • Final landscape assessment report
  • Periodic monitoring reports or ad-hoc investigative reports
  • Datasets

1. Who is best placed to implement this activity?

Experts and/or organizations with experience in conducting similar exercises in other country contexts, and with access to bespoke software, may be preferred candidates for conducting an information landscape assessment. Nonetheless, an understanding of context-specific sensitivities, and of how information is sought and shared in the particular context, adds value.

2. How to ensure context specificity and sensitivity?

The information landscape assessment is, by design, a tool to identify context-specific concerns that inform future activities. Consultations with the broadest possible set of national actors involved in both the electoral process and the information landscape allow context-specific considerations to be uncovered.

3. How to ensure inclusive programming, including youth and gender sensitivity?

Part of the assessment should be to determine the various relevant components of society and how they are vulnerable to, or party to, information pollution. Research indicates that age is a significant factor in the sharing or acceptance of information pollution, making it key to disaggregate data by age (a minimal sketch follows below). The assessment should prioritize the concerns of marginalized groups in the specific context, including but not limited to women.
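As an illustration of such disaggregation, the sketch below computes the share of respondents who report having shared false content, broken down by age band and gender. It assumes a Python environment with pandas; the file and column names (assessment_responses.csv, age, gender, shared_false_content) are hypothetical placeholders for the assessment's own data collection instrument.

    # Minimal sketch: disaggregate assessment data by age band and gender.
    import pandas as pd

    df = pd.read_csv("assessment_responses.csv")

    # Bucket respondents into age bands before aggregating.
    df["age_group"] = pd.cut(
        df["age"],
        bins=[17, 24, 34, 54, 120],
        labels=["18-24", "25-34", "35-54", "55+"],
    )

    # Share of respondents in each age/gender group who report having shared
    # content later identified as false.
    summary = (
        df.groupby(["age_group", "gender"], observed=True)["shared_false_content"]
        .mean()
        .unstack()
    )
    print(summary)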

4. What could the activity consist of?

  • A risk assessment can be conducted to inform the deployment of various types of activities and mitigation strategies.
  • An assessment can investigate how information is sought and shared in the particular country context; trends in misinformation, disinformation and hate speech; and the government/civil society/private sector response and where gaps exist. More specifically, this can include:
    • A comprehensive analysis of information consumption patterns, Internet penetration and use, news sources and platforms used, languages used, demographic considerations of target groups, sources of messages (anonymous or well-known figures), etc.
    • A review of the existing legislation and regulations relevant to the information landscape.
    • An analysis of cases of misinformation and hate speech observed during previous elections or major political events, identifying the actors involved, their motivations, how narratives travel, which narratives are most prominent, the tactics and techniques used, the geographical spread and the consequences observed.
    • An assessment of the main influencers in the information ecosystem, including those with potentially positive roles.
    • An assessment of past government/media/private sector practices when responding to information pollution and where response gaps exist.
  • A monitoring and evaluation plan could be part of the assessment but may also be a stand-alone activity.
  • Recommendations for an inclusive, context-specific and effective roll-out of the assessment are meant to inform the national institutions concerned and to ensure that the activity is integrated coherently into their procedures.

5. How to coordinate with other actors/which other stakeholders to involve?

The assessment will be partly informed by stakeholder interviews to ensure that it clearly captures the roles and responsibilities of the various stakeholders in the information landscape. The finalization of the assessment could be marked by a workshop to discuss and validate its findings and recommendations. This would allow all relevant stakeholders to share their views on how the activity fits within the existing landscape of activities and programmes, while facilitating a discussion around effective coordination and synergy building.

COST CENTRES

The costing will depend upon the approach being taken, including the detail of monitoring, the level of qualitative and quantitative analysis required and the longitudinal nature of the activity. These are factors that determine the level of technical expertise and technology required for the assessment.

A broad assessment that leans on qualitative information will require an expert or a small team of experts. Typically this would require support for a broad range of interactions with key stakeholders, whether in person or remote. More data-driven assessments may require a third-party company; in such cases, analytical and qualitative techniques remain important. A number of such firms exist and often provide similar services to governments, international organizations and the platforms themselves. The costs for such services can be considerable.

LIMITATIONS AND CHALLENGES

  • Ethical considerations, including data protection and privacy protection mechanisms, should form part of the methodology.
  • Assessments often capture the situation at a point in time; their accuracy will likely degrade as the political, social and technological landscape changes. This implies that these assessments should be accompanied by mechanisms for periodic revision.
  • The assessment should strive to define the concepts it covers (e.g. disinformation, misinformation, hate speech, incitement).
  • Analysing information in vernacular languages may be challenging where the social listening, monitoring and content discovery platforms in use do not cater to all relevant languages in a particular context (see the sketch after this list).
  • If the assessment is meant to inform the development of digital tools, parameters should be explored in an inclusive manner, incorporating human-centered design.
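
As a way of gauging the language-coverage limitation before committing to a particular tool, the sketch below estimates how much of a collected text sample falls outside the languages a tool can analyse. It is a minimal illustration in Python; the langdetect package and the set of supported languages shown are assumptions for the example, not recommendations.

    # Minimal sketch: estimate the share of collected posts in languages that a
    # given social listening tool cannot process.
    from collections import Counter

    from langdetect import detect
    from langdetect.lang_detect_exception import LangDetectException

    SUPPORTED = {"en", "fr"}  # hypothetical: languages the chosen tool can analyse

    def language_coverage(texts):
        """Count detected languages and return the share outside SUPPORTED."""
        counts = Counter()
        for text in texts:
            try:
                counts[detect(text)] += 1
            except LangDetectException:
                counts["unknown"] += 1
        uncovered = sum(n for lang, n in counts.items() if lang not in SUPPORTED)
        return counts, uncovered / max(sum(counts.values()), 1)

    sample = ["Election day is on Sunday.", "Las elecciones son el domingo."]
    counts, gap = language_coverage(sample)
    print(counts, f"{gap:.0%} of posts fall outside supported languages")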

RESOURCES

iVerify, UNDP

In order for iVerify to be implemented in a particular country with the support of UNDP, an assessment mission will be deployed to:
1. analyse the legal and information landscape
2. map existing initiatives
3. propose partnerships and an operational structure

Digital Democracy Risk Assessment, Democracy Reporting International

The Risk Assessment is a tool for civil society organisations and other researchers to assess a country's vulnerabilities to online manipulation around elections. It does not help users establish the existence of adversarial manipulation campaigns or networks, or their impact; instead, it helps users map the vulnerabilities of a country's election ahead of the event. All resources related to the tool, including the background paper, user guide, guiding questions and guidance on accessing data and social media tools, are available from Democracy Reporting International.

Election Watch for the Digital Age, Freedom House

Election Watch is a research initiative investigating the interplay between digital platforms and election integrity. It tracks a number of countries where elections are foreseen in the near future, ranking them according to a selection of key election-related indicators.

EXAMPLES

Preliminary Assessment of the Information Landscape in Cambodia, UNDP Cambodia

This research brief aimed to identify the influence of information sources in spreading disinformation in Cambodia and how citizens discern between credible and non-credible sources in the context of COVID-19.

Assessing the Human Rights Impact of Meta’s platform in the Philippines, Article One

To determine the degree to which Meta’s platforms may or may not have contributed to adverse human rights impacts and to mitigate the risk of further adverse impacts, Meta partnered with Article One from February to July 2020 to conduct a country-level human rights impact assessment (HRIA) of its platforms in the Philippines.

Firmer ground for advancing women’s participation in Libya: Social media monitoring report, Democracy Reporting International

In this report, DRI assessed online gender-based violence against women politicians in Libya through social media monitoring. The report was published as part of the Foreign, Commonwealth and Development Office (FCDO)-funded project "Firmer Ground for Advancing Women's Participation in Libya", which ran from October 2021 until May 2022.
