Commission opens formal proceedings against X under the Digital Services Act

The European Commission has opened formal proceedings to assess whether X may have breached the Digital Services Act (DSA) in areas linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers.

The Commission has decided to open formal infringement proceedings against X under the Digital Services Act on the basis of the preliminary investigation conducted so far. That investigation has included an analysis of the risk assessment report submitted by X in September, of X's transparency report published on 3 November, and of X's replies to a formal request for information, which concerned, among other things, the dissemination of illegal content in the context of Hamas' terrorist attacks against Israel.

The proceedings will focus on the following areas:

  • Compliance with the DSA obligations related to countering the dissemination of illegal content in the EU, notably with regard to the risk assessment and mitigation measures adopted by X to counter the dissemination of illegal content in the EU, as well as the functioning of the notice and action mechanism for illegal content mandated by the DSA, including in light of X's content moderation resources.
  • The effectiveness of measures taken to combat information manipulation on the platform, notably the effectiveness of X’s so-called ‘Community Notes’ system in the EU and the effectiveness of related policies mitigating risks to civic discourse and electoral processes.
  • The measures taken by X to increase the transparency of its platform. The investigation concerns suspected shortcomings in giving researchers access to X’s publicly accessible data as mandated by Article 40 of the DSA, as well as shortcomings in X’s ads repository.
  • A suspected deceptive design of the user interface, notably in relation to checkmarks linked to certain subscription products, the so-called 'Blue checks'.

If proven, these failures would constitute infringements of Articles 34(1), 34(2) and 35(1), 16(5) and 16(6), 25(1), 39 and 40(12) of the DSA. The Commission will now carry out an in-depth investigation as a matter of priority. The opening of formal infringement proceedings does not prejudge the outcome of the investigation.

These are the first formal proceedings launched by the Commission to enforce the first EU-wide horizontal framework for the responsibility of online platforms, just three years after its proposal.

Next Steps

After the formal opening of proceedings, the Commission will continue to gather evidence, for example by sending additional requests for information or by conducting interviews or inspections.

The opening of formal proceedings empowers the Commission to take further enforcement steps, such as interim measures and non-compliance decisions. The Commission is also empowered to accept any commitments made by X to remedy the matters subject to the proceedings.

The DSA does not set a legal deadline for bringing formal proceedings to an end. The duration of an in-depth investigation depends on a number of factors, including the complexity of the case, the extent to which the company concerned cooperates with the Commission, and the exercise of the rights of defence.

The opening of formal proceedings also relieves Digital Services Coordinators, and any other competent authorities of EU Member States, of their powers to supervise and enforce the DSA in relation to the suspected infringements of Articles 16(5), 16(6) and 25(1).

Background

X (formerly known as Twitter) was designated as a Very Large Online Platform (VLOP) under the EU's Digital Services Act on 25 April 2023, following its declaration, reported to the Commission on 17 February 2023, that it had 112 million monthly active users in the EU.

As a VLOP, X has had to comply with a series of obligations set out in the DSA since four months after its designation. In particular:

  • Pursuant to Articles 34(1), 34(2) and 35(1), VLOPs are obliged to diligently identify, analyse, and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, or from the use made of their services. When conducting risk assessments, VLOPs shall take into account a number of factors that influence the systemic risks, including recommender systems, advertising systems or the intentional manipulation of the service, including through inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. VLOPs are obliged to put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified.
  • Pursuant to Articles 16(5) and 16(6), online platforms have to notify individuals or entities without undue delay of their content moderation decisions, providing information on the possibilities for redress in respect of those decisions; platforms shall take such decisions in a timely, diligent, non-arbitrary and objective manner.
  • Pursuant to Article 25(1), online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates their users or in a way that otherwise materially distorts or impairs the ability of the users of their service to make free and informed decisions.
  • Pursuant to Article 39, VLOPs have to compile and make publicly available, through a searchable and reliable tool, a repository containing the advertisements presented on their platforms, keeping it available until one year after an advertisement was presented for the last time and ensuring that the information is accurate and complete.
  • Pursuant to Article 40(12), VLOPs have to provide researchers with effective access to platform data.