
Bluesky's user base grew rapidly over the past year, prompting the platform to significantly expand its content moderation efforts. According to its recently published 2024 moderation report, Bluesky's user count surged by roughly 23 million, climbing from 2.9 million to nearly 26 million active users. That growth brought a corresponding flood of reports: moderators received 17 times more user reports than the year before, with 6.48 million filed in 2024 compared to 358,000 in 2023.
The most common report categories were harassment, trolling, and intolerance, followed by spam and misleading content, a category that covers impersonation and misinformation. Impersonation accounts in particular surged alongside Bluesky's rise in popularity, prompting the platform to adopt a more aggressive moderation approach. According to the report, Bluesky quadrupled its moderation team to roughly 100 members, with recruitment ongoing. Some moderators specialize in specific policy areas, such as child safety.
Beyond those categories, Bluesky logged a significant number of reports concerning illegal and urgent issues as well as unwanted sexual content, and 726,000 reports were categorized as "other." On the legal side, the platform complied with 146 of the 238 requests it received last year from law enforcement, government entities, and legal firms.
Looking ahead, Bluesky plans to revamp how it handles reports and appeals, with a focus on better user communication: notifying users about actions taken on content they reported, and eventually letting them appeal takedown decisions directly in the app. In 2024, moderators removed 66,308 accounts, while automated systems took down another 35,842 spam and bot profiles. As it prepares for further growth in 2025, Bluesky says it will invest in stronger proactive detection systems to complement user reporting, so the expanding network can identify and mitigate harmful content more quickly.