Today, the Commission welcomes the publication of the strengthened Code of Practice on Disinformation. The 34 signatories, including platforms, tech companies and civil society, followed the 2021 Commission Guidance and took into account the lessons learnt from the COVID-19 crisis and Russia's war of aggression in Ukraine. The reinforced Code builds on the first Code of Practice of 2018, which has been widely acknowledged as a pioneering framework globally. The new Code sets out extensive and precise commitments by platforms and industry to fight disinformation and marks another important step towards a more transparent, safe and trustworthy online environment.
Věra Jourová, Vice-President for Values and Transparency, said: “This new anti-disinformation Code comes at a time when Russia is weaponising disinformation as part of its military aggression against Ukraine, but also when we see attacks on democracy more broadly. We now have very significant commitments to reduce the impact of disinformation online and much more robust tools to measure how these are implemented across the EU, in all countries and in all its languages. Users will also have better tools to flag disinformation and understand what they are seeing. The new Code will also reduce financial incentives for disseminating disinformation and allow researchers to access platforms’ data more easily.”
Thierry Breton, Commissioner for Internal Market, said: “Disinformation is a form of invasion of our digital space, with tangible impact on our daily lives. Online platforms need to act much more strongly, especially on the issue of funding. Spreading disinformation should not bring a single euro to anyone. To be credible, the new Code of Practice will be backed up by the DSA, including its heavy, dissuasive sanctions. Very large platforms that repeatedly break the Code and do not carry out risk mitigation measures properly risk fines of up to 6% of their global turnover.”
Together with the recently agreed Digital Services Act and the upcoming legislation on transparency and targeting of political advertising, the strengthened Code of Practice is an essential part of the Commission’s toolbox for fighting the spread of disinformation in the EU.
The 34 signatories include major online platforms, notably Meta, Google, Twitter, TikTok, and Microsoft, as well as a variety of other players such as smaller or specialised platforms, the online ad industry, ad-tech companies, fact-checkers, civil society, and organisations that offer specific expertise and solutions to fight disinformation.
The strengthened Code aims to address the shortcomings of the previous Code, with stronger and more granular commitments and measures, which build on the operational lessons learnt in the past years.
Concretely, the new Code contains commitments to:
- Broaden participation: the Code is not just for big platforms, but also involves a variety of diverse players with a role in mitigating the spread of disinformation, and more signatories are welcome to join;
- Cut financial incentives for spreading disinformation by ensuring that purveyors of disinformation do not benefit from advertising revenues;
- Cover new manipulative behaviours such as fake accounts, bots or malicious deep fakes spreading disinformation;
- Empower users with better tools to recognise, understand and flag disinformation;
- Expand fact-checking to all EU countries and all EU languages, while making sure fact-checkers are fairly rewarded for their work;
- Ensure transparent political advertising by allowing users to easily recognise political ads thanks to better labelling and information on sponsors, spend and display period;
- Better support researchers by giving them better access to platforms’ data;
- Evaluate its own impact through a strong monitoring framework and regular reporting from platforms on how they’re implementing their commitments;
- Set up a Transparency Centre and Task Force for an easy and transparent overview of the implementation of the Code, keeping it future-proof and fit for purpose.
Finally, the Code aims to become recognised as a Code of Conduct under the Digital Services Act to mitigate the risks stemming from disinformation for Very Large Online Platforms.
The 2018 Code of Practice on Disinformation brought together industry players to make voluntary commitments to counter disinformation. At the core of the EU strategy against disinformation, the Code has proven to be an effective tool to limit the spread of online disinformation, including during electoral periods, and to respond quickly to crises, such as the coronavirus pandemic and the war in Ukraine.
Following the Commission’s assessment of its first period of implementation, the Commission published detailed Guidance in May 2021 on how the Code should be strengthened, asking signatories to address the shortcomings of the 2018 Code and proposing solutions to make it more effective.
The signatories of the 2018 Code and a broad range of prospective signatories worked together in re-drafting the commitments and measures to ensure that the reinforced version of the Code is fit to address the important new challenges that disinformation poses to our societies.
The drafting of the revision was facilitated by Valdani, Vicari and Associates (VVA), an independent consultancy contracted by the signatories, with Oreste Pollicino, a Constitutional Law professor at Bocconi University, acting as honest broker.
Signatories will have 6 months to implement the commitments and measures to which they have signed up. At the beginning of 2023, they will provide the Commission with their first implementation reports.
Taking into account expert advice and support from the European Regulators Group for Audiovisual Media Services (ERGA) and the European Digital Media Observatory (EDMO), the Commission will regularly assess the progress made in the implementation of the Code, based on the granular qualitative and quantitative reporting expected from signatories.
The established Task Force will monitor, review and adapt the commitments in view of technological, societal, market and legislative developments. The Task Force held its kick-off meeting today and will meet as necessary, and at least every 6 months.