Council Claims Meta’s Moderation Prioritizes Politics over Safety


January 31, 2025 by our News Team

Meta's recent changes to content moderation, aimed at promoting user freedom of expression, have drawn criticism from the Meta Safety Advisory Council, which warns that the changes could compromise safety, normalize harmful content, and undo years of progress.

  • Meta has implemented a new community fact-checking system, similar to the one used on X, which could benefit user freedom of expression.
  • The Meta Safety Advisory Council, an independent group associated with Meta, reviews the platform's measures and changes to address safety concerns.
  • The council believes that Meta's changes could potentially normalize harmful and offensive content, undoing years of social progress.


Meta recently changed how content is moderated on its platforms, placing a strong emphasis on a new community fact-checking system similar to the one used on X and moving its policies closer to the positions of the US government. While Meta presents these measures as a win for user freedom of expression, the Meta Safety Advisory Council has responded with criticism and concern.

According to the council, the changes Meta intends to implement could significantly reshape content moderation on its platforms, prioritizing political discussion at the expense of safety. The council also believes the impact may extend beyond Meta’s own platforms and influence internet usage and activity more broadly. The worry is that the company could normalize harmful and offensive content, undoing years of social progress.

It’s worth noting that the Meta Safety Advisory Council is an entity associated with Meta but consists of a group of independent experts and analysts who review the platform’s measures and changes, highlighting potential issues that may arise. Established in 2009, the council has been instrumental in addressing various public safety and platform usage concerns.

These statements come after Mark Zuckerberg announced earlier this year that several measures and changes would be implemented across Meta’s platforms, aimed at providing greater freedom of expression for users. The changes would allow users to engage in more open discussions on a wide range of topics, including those previously considered sensitive.

However, the changes have also drawn criticism for opening the door to abuse; for example, the revised hate speech policies now permit comments targeting sensitive topics that were previously treated as abusive.

The Meta Safety Advisory Council says it has raised these concerns with Meta, emphasizing that the company should remain focused on combating online hate across its platforms rather than exacerbating the problem.

Regarding the changes, the council also shared its views on the new community notes. It believes community notes can help combat misinformation to some extent but points to flaws in their effectiveness. In addition, many posts never receive a community note, a trend particularly noticeable on X, where notes tend to reach only the most viral posts.

In conclusion, while Meta’s recent changes aim to enhance user freedom of expression, concerns have been raised by the Meta Safety Advisory Council regarding the potential impact on content moderation and overall safety. It remains to be seen how Meta will address these concerns and strike a balance between fostering open discussions and maintaining a safe online environment.
