Codes of Practice for Platforms

Platforms hold a great deal of power over the information environment. However, how this power should be directed is often a matter of dispute, depending upon the perspective of the various actors involved in the election process and the true extent to which platforms can influence user outcomes. The introduction of a code of practice for platforms around an election is an innovation that has been implemented in a number of countries, as well as being applied to the broader information landscape. The approach is closely related to that of more traditional codes of conduct, whereby a voluntary agreement is established between the key players, such as political actors, potentially with monitoring and censure provisions. In practice, the activities may overlap.



Platforms invariably have an impact upon the online space, primarily through their decisions on whether to accept political advertisements and through their content moderation policies and enforcement. There are also concerns from various quarters about the way that platforms promote content or provide transparency.

One means of tackling these concerns is to establish a code of practice for the platforms. Such a code may be convened by the relevant State authorities or, where this is not possible, civil society may assume the role. Modalities for monitoring compliance and censuring breaches may also be established.


The broader legal framework may influence what provisions are required. For example, there may be campaigning rules that the EMB may see fit to translate to the online environment or a broader set of legislation to regulate the information environment.

It is probably worthwhile to prioritize the provisions that are most important in the specific information context and that can be realistically achieved. Platforms have cited a number of factors that influence their decisions, and their limited resources may constrain what can be agreed. Furthermore, not all things that might be desirable can be technically achieved in the timeframe of an election.

Relatedly, it is reasonable to consider what types of provisions can be realistically utilized by the information integrity community in the country. For example, requiring platforms to make their recommender algorithms auditable may be attractive; however, if there is no feasible capacity to review them, attention may be better directed elsewhere. It may be challenging to obtain the willingness of all platforms to enter into codes of practice, so identifying a convener with suitable weight will be important. Alternatively, in some cases, entities in countries will effectively issue a code of practice and then invite platforms to adhere to it.


There are a number of issues that a code of practice may try to influence. Provisions might, for example, attempt to address the following.

  • Endeavour to adhere to the principle of freedom of expression.
  • Establish internal timeframes and commitments with regard to the implementation of the code of practice and adherence to content decisions.
  • Broadly, ensure that the appropriate resources are in place to cater to the requirements of the election.
  • Set up a task-force aimed at evolving and adapting the code.
  • Establish a risk assessment methodology and a rapid response system.
  • At the request of the relevant authorities, provide proportionate and appropriate information and data, including ad-hoc specific reports within the regular monitoring.
  • Consider some form of monitoring or reporting provisions for platform compliance/progress with the code of practice.
  • Support post-election activities, such as reporting on participation or reviews.
  • Develop coordination mechanisms with the EMB to support expedited response to content of concern.
  • Develop mechanisms with other platforms to support coordination of content moderation decisions/actions.
  • Institute transparency mechanisms where platforms offer political advertising.
  • Provide a library of election advertising activity.
  • Restrict election advertising from outside of the country.
  • Apply labelling to election advertising.
  • Conduct registration and verification of political advertisers.
  • Develop and enforce content moderation policies regarding the prevention of election-related violence, election integrity, online violence and user privacy, in consultation with the relevant authorities and civil society.
  • Remove content that incites violence or hate speech within the shortest possible period of time once identified.
  • Prevent political adverts that incite violence or hate speech from being accepted.
  • Take measures to counter inaccurate information on key points, such as registration, voting and results requirements/processes.
  • Raise public awareness on the code of practice.
  • Work with the authorities to increase awareness of the elections and relevant rules.


Who is best placed to implement the activity?

The most appropriate actor to convene such an activity will depend on a number of factors. Key among them will be who has the greatest ability to convene the platforms and is best placed to motivate compliance. The most likely actor will be the EMB or another State authority. Even in such a case, however, the body may require support in negotiating the terms, especially if it could benefit from assistance in determining which technical and policy options to call for. In some cases, though, the State authorities may not be well placed, or may not feel suited to such a task. In such cases there is an opportunity for civil society to intervene.

International organizations may also have a role in supporting this process, lending their expertise to the endeavour, as has happened in some countries, including the Netherlands. Furthermore, their institutional weight may help secure the platforms' acceptance.


How to ensure context specificity and sensitivity?

As with many activities, an assessment of the existing information landscape should provide a basis for action. Furthermore, the conflict and human rights dynamics in the country should influence whether this is an appropriate approach and what the provisions should be. The broader political advertising rules, as well as any broader legislation targeting platforms, will also guide which provisions are sought.


How to involve youth?

Specific activities and provisions targeted at youth influencers, creators and political actors may be valuable.


How to ensure gender sensitivity/inclusive programming?

Online abuse of women in politics is a relevant concern here. Incorporating specific rules around this may be appropriate, as well as committing the platforms to proactive actions and resource allocation, for example in the support of female political candidates or other communities who are likely to be subjected to online harassment.


How to communicate about these activities?

The integrity of the information ecosystem is a concern to various stakeholders, so the knowledge that actions are being taken can help to increase confidence in the election process. The greater the attention drawn to the agreement, the more likely it is that the entities will adhere to it. A communications plan is vital to establishing clear expectations and supporting appropriate coverage. An agreement with the platforms on how their participation and adherence will be publicized will be required.


How to coordinate with other actors/which other stakeholders to involve?

The impact of a code of practice will in large part depend upon having the appropriate groups agree to participate. The code may impose constraints on political parties' ability to advertise online. It may also seek to support the work of various parties. The design of the code may therefore include consultations on what the various stakeholders desire from it.


How to ensure sustainability?

Typically, sustainability is not a concern; however, codes could be reused over time, with improvements based on innovations and newly identified challenges. Nevertheless, in some cases, especially if there are broader information integrity issues, there may be a decision to establish a task force among the signatories, which would help to maintain the exercise and build learning.



Experts – Experts may be needed to identify appropriate texts and support the identification of topics that are relevant concerns. Facilitators may be required to support negotiation of the terms of the code.
Consultations – Depending upon the design of the activity, support to consultations may be required. Here, consideration should be given to expert inputs from organizations who are involved in the country’s information landscape and, of course, the platforms and political parties. Beyond the identification and recruitment of appropriate facilitators, the organization of such activities must be provided for.
Monitoring – The area where additional and complex resources may be required is where there is a need to support the monitoring and enforcement of the code. This may include the need to modify and deploy software to organize work, provide hardware, recruit staff to review the content and analyse outcomes, and conduct trainings and monitoring. These tasks may also be outsourced, with some professional—but often costly—firms on the market. Consideration should be given to requiring platforms to contribute.


  • The first challenge will be to secure broad agreement on the terms of the code and to adopt it. There is also the possibility that some parties will be willing to adhere to only some aspects of the code.
  • Clear terms are vital to supporting a well-understood agreement, reducing non-compliance and limiting the scope for disputes over the implementation of the agreement.
  • For some provisions, it will be difficult to prove that parties have failed to comply. Furthermore, where a platform has not delivered in the way that the code envisioned, the types of recourse available to the convener will be limited, outside of public criticism.



EC 2022 Strengthened Code of Practice on Disinformation

The European Commission’s 2022 Strengthened Code of Practice on Disinformation builds on the pioneering 2018 Code while setting more ambitious commitments and measures aimed at countering online disinformation. The new Code brings together a more diverse range of stakeholders than ever, empowering them to contribute to wide-ranging improvements by signing up to precise commitments relevant to their field. Such commitments include demonetizing the dissemination of disinformation; guaranteeing transparency of political advertising; enhancing cooperation with fact-checkers; and facilitating researchers' access to data. Supporting platforms and industry to meet their commitments under the Code of Practice on Disinformation feeds into the European Commission’s commitment to a more transparent, safe and trustworthy online environment.

Code of Practice for Providers of Online Social Media Platforms, United Kingdom Government – Department for Digital, Culture, Media & Sport

This Code provides guidance for social media platforms, in advance of the new regulatory framework envisaged in the Online Harms White Paper. It sets out actions that the Government believes social media platforms should take to prevent bullying, insulting, intimidating and humiliating behaviours on their sites. This code of practice does not affect how illegal or unlawful content or conduct is dealt with.

India Voluntary Code of Ethics

The Internet and Mobile Association of India signed the “Voluntary Code of Ethics” in preparation for the 2019 General Elections to the Lok Sabha. The objective of this voluntary Code was to identify the measures that ‘Participants’ could put in place to increase confidence in the electoral process. This would help safeguard the products and/or services of the ‘Participants’ against misuse to vitiate the free and fair character of the 2019 General Elections in India.




