Bluesky’s Content Moderation Sees Significant Boost in 2024


January 20, 2025 by our News Team

Bluesky faces moderation challenges as its user base grows rapidly: the platform has received a sharp increase in reports of harmful content and is rolling out changes to improve the process.

  • Bluesky has seen a significant increase in its user base, indicating its popularity as an alternative social media platform.
  • The moderation team has expanded to over 100 members and is continuously hiring more to address the increasing workload.
  • Bluesky is committed to cooperating with law enforcement and ensuring a safer online environment by responding to requests from authorities.


With the recent changes to X and Meta’s platforms, many users are hunting for alternatives for their social media needs. One platform gaining traction as a viable alternative is Bluesky. Seen as a fresh and different option, it has attracted a significant number of new users in recent weeks. However, this influx of users brings a moderation challenge, as the platform must contend with harmful comments and content that have been appearing more frequently.

According to its moderation report for 2024, Bluesky’s user base grew by roughly 23 million users over the course of the year, ending 2024 at nearly 26 million. Alongside this growth, the moderation team received a staggering 17 times more reports of harmful content than in 2023: 6.48 million reports in 2024, compared to just 358,000 the previous year.

The majority of these reports concerned harassment, abuse, trolling, spam, and misleading content. One persistent issue Bluesky still faces is fake accounts impersonating well-known personalities. The moderation team clearly has its work cut out.

To address the increasing workload, the team has expanded to over 100 members, with ongoing efforts to hire more. Improvements to the moderation process are expected to be implemented in early 2025.

In addition to handling user reports, Bluesky also responded to 146 requests from authorities, out of a total of 238 received last year. This demonstrates the platform’s commitment to cooperating with law enforcement and ensuring a safer online environment.

Lastly, the report highlights Bluesky’s plans to make changes to the way users can report content and how these reports will be managed by the moderation team. The goal is to create a more transparent and efficient process.

It’s clear that Bluesky is facing the challenges that come with rapid growth. As more users flock to the platform, it’s crucial for the team to stay on top of moderation and ensure a positive user experience. With upcoming improvements and a commitment to transparency, Bluesky aims to create a safer and more enjoyable social media environment for its ever-expanding user base.


