Platform Engagement

As the channels through which online information largely spreads, social media platforms are integral to efforts towards a healthy information ecosystem. They may be partners in these activities, targets for advocacy, or both. However, for many electoral practitioners, they are complex entities to engage. A better understanding of the opportunities that engagement can provide may help election practitioners approach platforms successfully, as well as manage expectations.

ACTIVITY

DESCRIPTION

Platforms exercise significant control over content moderation, the features they deploy, the online advertising system and how they share information, and they have the capacity to support election-related outreach, among other capabilities. There are various means by which platforms can support the work of election practitioners, and vice versa. However, engaging with platforms can be difficult; efforts are complicated by their size, competing priorities and varying entry points.

Below are some types of activity that engagement may seek to influence or collaborate on.

Adherence to human rights
At a high level, practitioners should urge the firms to adhere to human rights, following approaches such as those outlined in the United Nations Guiding Principles on Business and Human Rights. However, determining how these apply in a given country is complicated and should be elaborated in a contextual and inclusive way—in part as a component of determining the content moderation rules.

A relevant activity that may be pursued with the platforms is a human rights assessment, ideally conducted in an inclusive and independent manner.

Content moderation
Most platforms conduct content moderation by limiting material that violates their various policies, either by removing it or reducing its visibility. There are two broad areas of concern here: what policies are in place relating to elections and election-related violence, and to what extent the platform adequately enforces those policies.

Some types of content are fairly easy to adjudicate—specifically those related to electoral facts, such as the electoral calendar and eligibility requirements, which are largely settled on the basis of the information published by the electoral authority. However, as content shades into political speech, the questions become thornier.

Policies: Election practitioners may seek to influence the substance of the policies to varying degrees. At one end of the spectrum, platforms have worked with international experts in a country context to better understand what types of content may lead to election-related violence, in order to expedite the identification and removal of such material. While these activities may be conducted bilaterally, more inclusive and coordinated advocacy and advice may lead to better outcomes. At the other end, detailed legislation may attempt to impose binding rules upon platforms, despite the many difficulties of doing so.

Facilitating an inclusive multi-stakeholder coalition to engage with platforms can help address the challenges of overlapping mandates on digital content regulation and of varying influence and reach, guard against any form of censorship, and allow national counterparts to speak with one voice towards the platforms.

Platforms implement content moderation through both manual and automated ‘flagging’, drawing on a combination of individual reports, trusted flaggers, the platforms’ dedicated reviewers and algorithms, with reliance on the latter mounting as content volumes increase.
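The mixed manual and automated flagging pipeline described above can be sketched as follows. All thresholds, flag sources and queue names here are illustrative assumptions for exposition, not any platform’s actual moderation system.

```python
# Illustrative sketch of a mixed manual/automated flagging pipeline.
# Thresholds, sources and queue names are hypothetical assumptions,
# not any platform's actual system.

from dataclasses import dataclass


@dataclass
class Flag:
    source: str        # "algorithm", "trusted_flagger" or "user_report"
    confidence: float  # model confidence; only meaningful for "algorithm"


def route(flags: list[Flag], auto_threshold: float = 0.95,
          report_threshold: int = 5) -> str:
    """Decide which review queue a piece of flagged content enters."""
    # High-confidence algorithmic detections may be actioned automatically.
    if any(f.source == "algorithm" and f.confidence >= auto_threshold
           for f in flags):
        return "auto_action"
    # Trusted-flagger reports go to a priority human review queue.
    if any(f.source == "trusted_flagger" for f in flags):
        return "priority_review"
    # Ordinary user reports accumulate before triggering standard review.
    if sum(f.source == "user_report" for f in flags) >= report_threshold:
        return "standard_review"
    return "no_action"
```

The ordering of the checks reflects the trust hierarchy discussed in the text: automated systems handle volume, trusted flaggers shortcut the queue, and individual reports carry the least individual weight.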

The election community may urge platforms to assign adequate resources to content moderation tasks. Specific concerns may include ensuring that political advertising is accurately reviewed and labeled, and that moderation efforts are staffed with people who have the appropriate language and dialect capabilities. It may also urge the platforms to measure how accurately their algorithms function across the various languages and country contexts.

Around the conduct of an election, multi-stakeholder partnerships may be established with researchers, civil society or electoral bodies, alongside platforms, to collaboratively respond to emergent issues. Ultimately, these relationships may rely in part on a high degree of trust, whereby the platforms will action content removals based on advice—distinct from the work of a fact-checking organization. A more interventionist approach has also been seen, in which the EMB instructs platforms to remove content.

Investigative coordination
With control of the platform and its data comes an enhanced ability to identify suspicious activity and to attribute the source of information pollution. Some platforms are willing to work with appropriate practitioners on such efforts.

Features
Platforms have a variety of tools that they can deploy during elections to combat information pollution. A number of new approaches were developed in response to the COVID-19 pandemic; however, platforms have been reluctant to provide such capabilities in an electoral context. Tools may support transparency, signpost people to reliable election-related content, downrank contested content or make it harder for people to share content. Electoral practitioners may advocate for the deployment of existing tools or the creation of new ones, as they find appropriate.

Some specific examples are listed below; however, how and if they apply to different platforms will vary:

Labelling – Labels may be applied to electoral content with a link to reliable information or to mark posts made by State-controlled media.
Outreach buttons – ‘I voted’ or ‘I registered’ are among the commonly available buttons.
Limiting – During the COVID-19 pandemic, messaging platforms made it harder to distribute content by reducing the mass share functions. Twitter has invoked prompts when a user seeks to share articles they have not read.
Algorithmic downranking – Using algorithms to reduce the spread of concerning content.
Elevating information – Platforms may also use different tools to promote information from reliable sources or create specific pages with such content.
Political advertising and ad libraries – Platforms that accept political adverts may enable or disable this option in a specific country and provide various types of transparency for the advertisements that are carried, such as a library or public labeling for who funded the adverts.
Account security tools – Options for supporting account security, verification and expedited assistance for candidates or government institutions.

Data Sharing
Platforms are notoriously guarded about sharing data, much to the frustration of researchers and civil society. However, better access to information can be invaluable for tasks such as fact-checking, media monitoring and research. Appropriate organizations may pursue advocacy for access to the relevant APIs and platform tools.
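As one concrete illustration of the kind of API access at stake, the sketch below constructs a query against a political ad library. The endpoint and parameter names follow Meta’s publicly documented Ad Library API, but API versions and fields change over time; treat the specifics as assumptions to be verified against current documentation, and note that a valid access token requires separate registration.

```python
# Sketch: building a political ad-library query for transparency research.
# Endpoint and parameters follow Meta's publicly documented Ad Library API,
# but versions and field names change; verify against current documentation.

from urllib.parse import urlencode

AD_LIBRARY_URL = "https://graph.facebook.com/v18.0/ads_archive"


def build_ad_library_query(search_terms: str, country: str,
                           access_token: str) -> str:
    """Construct a request URL for political ads reaching one country."""
    params = {
        "search_terms": search_terms,
        "ad_reached_countries": country,  # ISO 3166-1 alpha-2 code
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "fields": "page_name,funding_entity,ad_delivery_start_time",
        "access_token": access_token,
    }
    return f"{AD_LIBRARY_URL}?{urlencode(params)}"
```

A media-monitoring group might call `build_ad_library_query("election", "MX", token)` and page through the JSON results to check which funding entities are behind election-related advertising.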

1.

Who is best placed to implement the activity?   

Various parties have an interest in the electoral environment and would benefit from collaboration or engagement with platforms. For some entities, such as EMBs, the path is relatively clear, though it is not always apparent to these bodies how platforms can support them or how much support they can hope for. For CSOs, things can be somewhat more difficult, though there are still opportunities, for example in the outreach and fact-checking realms. International organizations may have institutional relationships that can be leveraged in support of national actors or of their own activities.

Ultimately, election practitioners are best served in their pursuits by engaging with platforms in a coordinated fashion. Some models have been designed to guide such approaches.

2.

How to ensure context specificity and sensitivity? 

Part of the drive for election practitioners to pursue this activity is to press platforms to conduct themselves in a manner appropriate to the country context. Platforms will often have commissioned their own risk assessment of the country; however, these are typically not shared, and their conclusions may not align with those of the electoral experts.

Where the election features particularly polarized discourse, fears of foreign interference, or concerns about electoral violence (whether in rejection of the results or targeting out-groups), it is particularly appropriate to press the platforms to acknowledge and respond to the risks.

3.

How to involve youth?  

Youth audiences are increasingly moving to newer platforms. Given the importance of engaging with youth and supporting their information environment, it is vital to engage the relevant platforms—even where the same level of institutional relationship does not exist or those firms’ election programming is less developed.

4.

How to ensure gender sensitivity/inclusive programming?  

Various platforms are developing programmes targeted at supporting the safety of women on their platforms. Online abuse of women in politics is an increasingly pressing concern, and engaging with platforms to increase their attention and support to countering such content may be important. There may also be opportunities to enlist their support in training political candidates and parties in how to secure accounts and protect themselves from online harassment.

5.

How to communicate about these activities?  

The use of public statements can help support advocacy efforts and build a consensus with the platforms. At the same time, some efforts may be more palatable without public exposure, at least in the early stages of relationship building and the understanding of opportunities.

6.

How to coordinate with other actors/which other stakeholders to involve? 

As noted above, election practitioners are best served by engaging with platforms in a coordinated fashion, and models have been designed to guide such approaches. International organizations working in the area may be able to use their institutional connections to support national actors’ activities.

7.

How to ensure sustainability? 

To make the relationship sustainable, efforts should be made to strengthen its institutional basis, for example through bilateral MoUs or through the establishment of coalitions on the practitioner side.

COST CENTRES

LIMITATIONS AND CHALLENGES

  • The online social media space is fragmented, with new platforms continually emerging—each with distinct technical features and audience compositions. This complicates engagement, multiplying the amount of work required.
  • Each platform will have differing levels of sophistication and technical capabilities, varying the options available.
  • Some platforms will be reluctant to engage in content moderation. However, even with the most optimistic level of platform engagement and resource allocation, it is unlikely that they can be relied upon to fully police their platforms, which leads to the need for further activities and actors.
  • Furthermore, preventing the over-moderation of political content and stifling of freedom of expression is equally important.
  • Content moderation has challenges of its own, including the lack of transparency around trusted flagger programmes, the lack of a systematic approach to and prioritization of countries, difficulties in contextualizing content moderation rules, and discrimination inherent in moderation algorithms.
  • While engagement with platforms may be valuable, organizations should be careful about lending their credibility to the social media companies.

RESOURCES

EU Initiative Political Advertising – Improving Transparency 

As stated in President von der Leyen’s Political guidelines, the Commission will present a package of measures to ensure greater transparency in political advertising, particularly as political campaigning moves online. This initiative addresses economic actors, EU political parties and others, and aims to:

  • Support the functioning of the single market for advertising services.
  • Ensure the source and purpose of advertising is known.
  • Combat disinformation and interference in democracy in the EU.

The Santa Clara Principles 

The Santa Clara Principles have been developed by a broad coalition of human rights and electoral organizations, advocates and academic experts to guide companies in their compliance with their responsibilities to respect human rights and enhance their accountability, and to assist human rights advocates in their work. They also include toolkits/notes for regulators, companies and platforms.

Social Media Councils

EXAMPLES

Mexico

In a unique coordinated effort, the National Electoral Institute (INE) secured formal cooperation agreements with Facebook, Twitter and Google. This facilitated the first-ever live streaming of the Mexican presidential debates and INE election announcements, watched by millions of voters around the country. INE also worked with Facebook to implement interactive ‘buttons’ that allowed users to access the official election authority website and spread get-out-the-vote messages, as well as engage users in debate topic selection.

Indonesia

Bawaslu initiated a declaration to ‘Reject and Counter Vote Buying, Insults, Incitements, and Divisive Conflict in the 2018 Pilkada and 2019 General Elections,’ that secured signatures from 102 civil society groups along with relevant platforms including Google, Facebook and Twitter.

Libya

Ahead of the launch of the UN-facilitated Libyan Political Dialogue Forum (LPDF) in 2020, the UN Support Mission in Libya established a partnership with Facebook known as the ‘Trusted Partner’ service. This allowed UNSMIL to address hate speech, incitement to violence, and mis- and disinformation. As a result of UNSMIL reports, Facebook removed dozens of harmful social media posts that attacked activists, youth and peace promoters.

Kenya

Various stakeholders have been involved on aspects of online expression and content moderation, ranging from civil society to government to social media platforms. The mapping is available in this ARTICLE 19 Report. Following the observations presented in the report, ARTICLE 19 and UNESCO called for a coalition on Freedom of Expression and Content Moderation to facilitate coordination, as part of the Social Media 4 Peace Initiative supported by the European Union.

IMPLEMENTATION PROCESS

COUNTRY DEPLOYMENTS

ADDITIONAL INFORMATION


Information Integrity E-learning

Coming soon