As part of their transparency obligations under the Digital Services Act (DSA), designated Very Large Online Platforms and Very Large Online Search Engines are due to publish their first transparency reports. Seven platforms (Amazon, LinkedIn, TikTok, Pinterest, Snapchat, Zalando and Bing) have already done so; the rest have until 6 November.
The transparency reports, together with the Commission's database of statements of reasons and additional requirements for designated services, such as the forthcoming data access for researchers, will bring transparency and accountability to content moderation online, to the benefit of citizens, researchers and regulators, and will contribute significantly to public scrutiny.
The transparency reports must include information on content moderation on the platforms' services, detailing the number of notices they receive from users (and, once in place, from trusted flaggers), the number of pieces of content taken down on the platform's own initiative, the number of orders they receive from relevant national judicial or administrative authorities, and the accuracy and error rate of their automated content moderation systems. The reports must also include information on content moderation teams, including their qualifications and linguistic expertise.
Very Large Online Platforms and Search Engines must publish these transparency reports for the first time following their designation on 25 April, and every six months thereafter. Platforms with fewer than 45 million users, as well as other intermediary services, will also have to publish transparency reports, on an annual basis, once the DSA starts applying to them in February 2024. The DSA also gives the Commission the possibility to adopt implementing acts laying down templates for the form, content, and other details of the transparency reports.