TL;DR

• Today, we have issued our first Transparency Report.


• The report covers the period from November 2024 to August 2025.


• It includes overarching statistics, platform fact-sheets, country fact-sheets and a look ahead to 2026.

People and Organisations Seize Chance to Challenge Platforms


For years, if you disagreed with a social media platform, your choices were limited: appeal to the platform or decide whether you had the time and money to go to court.


Last year, that changed. For the first time, people and organisations in the EU could challenge decisions by social media platforms before independent dispute settlement bodies, through a new right created under the Digital Services Act (DSA).


Today, people are making the most of this new right. During the period covered by this report (November 2024 – August 2025), Appeals Centre Europe received nearly 10,000 disputes challenging decisions on social media. Of these, more than 3,300 disputes fell within our scope. Most of these eligible disputes related to Facebook, followed by Instagram, TikTok and YouTube. We believe people’s awareness of the right to dispute a platform’s decision reflects both the willingness of platforms to engage in out-of-court dispute settlement and the prominence of “signposting” to dispute settlement bodies on each platform.


Since launch, we have expanded to cover more areas within our certified scope to respond to user demand. In the months ahead we will continue to make more platforms, content and actions within our certified scope eligible for dispute.

When Platforms Make Mistakes, People Pay the Price

By the end of August, we’d made over 1,500 decisions, more than three-quarters of which overturned the platform’s original decision. Most of these overturns restored a user’s content or account. Although the law allows up to 90 days for decisions in normal cases, our average case handling time has already fallen to a few weeks since our first decisions were issued.


The stories of social media users who submitted disputes show that when platforms make mistakes, people pay the price. From an unjustly suspended account, to a post showing someone examining themselves for cancer which was wrongly removed for violating rules on nudity, to hate speech targeting vulnerable groups of people which was wrongly left up, these errors have a real impact on people’s lives.


Over time, our transparency reports will represent a ‘heatmap’ of platforms’ most common mistakes – whether it’s leaving up harmful content or removing posts unnecessarily. For example, where we received and reviewed the content, we overturned around two-thirds of platforms’ decisions related to Restricted Goods and Services and more than half of platforms’ decisions related to Adult Nudity and Sexual Activity.

Co-operation With Platforms “Mixed” So Far

The Digital Services Act is clear: platforms must engage in good faith with bodies like the Appeals Centre and provide information to people about dispute settlement that is accessible, clear and user-friendly. Where platforms have told people about dispute settlement bodies, shared content with us and implemented our decisions, the benefits are clear. The people who appeal to us get the decision they deserve, while platforms are protected from the kinds of costly mistakes which erode trust with users.


However, despite progress, co-operation with platforms has been mixed. Platforms are still keeping dispute settlement bodies Europe’s best-kept secret, even though the DSA clearly requires them to inform their users about bodies like the Appeals Centre and obliges them to engage constructively. And where users do find their way to us and raise a dispute, some platforms are still not sharing the content we need to review those disputes.


In cases where the platform did not provide us with the original content for an eligible dispute, we began issuing ‘default decisions’ in favour of the user. In addition, where the content was still live but the platform refused to provide it, we reviewed the content – and delivered a decision – using the link provided by the user. We provide assessments of how individual platforms have engaged with us later in the report.


The national regulators involved in the DSA – called Digital Services Coordinators (DSCs) – also have a key role to play. We would like every DSC to have a dedicated page about dispute settlement bodies on their website. They should also ensure their citizen helplines tell people about this new option, and share information through social media, newsletters and other channels.

What’s Next?

As an independent, not-for-profit, mission-driven organisation, the aim of the Appeals Centre is to promote an online environment that protects freedom of expression and human rights for the benefit of users and platforms alike. While we have made substantial progress in our first year towards achieving this, we’ve only seen a glimpse of the true potential of out-of-court dispute settlement in contributing to a more transparent, fair and rights-respecting online world. With co-operation from platforms, support from civil society and researchers, and oversight from regulators, we can spread the word and help people take control of what they see and post online.