Instagram and Facebook under fire for spreading fake news again

Facebook, Instagram, and Threads are changing their moderation policies under Mark Zuckerberg's leadership, with more significant shifts on the horizon.

Instagram and Facebook facing a surge of misleading posts

Meta, the parent company of Facebook, is making a significant shift in its approach to misinformation and moderation. The tech giant is moving away from external fact-checking towards more community-driven systems, such as the "Community Notes" model [1]. This change reflects a strategic move to reduce reliance on outside fact-checkers and engage users more directly in content evaluation.

The system will roll out first in the US. It allows users to evaluate statements and annotate content, potentially increasing transparency and user agency [1]. However, the transition also carries risks, such as inconsistent moderation outcomes and greater exposure to unmoderated or inflammatory content, given weaker hate speech standards and less rigorous fact-checking [1][4].

Alongside this shift, Meta is loosening some of its content moderation, particularly around hate speech and inflammatory content. The change, framed as a step towards greater "free speech," has raised concerns that it will increase the spread of harmful misinformation and identity-based violence [4].

Meta's new approach reduces the formal role of fact-checkers: the company is phasing out external fact-checking programs and shifting moderation responsibilities partly to user-driven mechanisms. This may diminish the authoritative role fact-checkers once had in flagging misinformation, replacing it with community assessments that could vary in reliability and consistency [1][4].

Meta's CEO, Mark Zuckerberg, acknowledged that the new approach will let "more bad things" onto the platforms, but said the company will make fewer mistakes through excessive moderation [5]. He also expressed hope that US President Donald Trump will push back against rules he described as "institutionalized censorship" [6].

In Europe, the Digital Services Act (DSA) holds online platforms like Meta accountable and constrains how far these changes can go [8]. In Germany, Meta has no immediate plans to end its cooperation with fact-checkers [9]. The US Supreme Court will take up the TikTok case on January 19, a crucial moment for one of Meta's competitors [7].

This shift marks a trade-off between empowering users and maintaining strict misinformation controls, with significant implications for the reliability of information and platform safety [3]. As the tech industry continues to evolve, it will be interesting to see how Meta's new approach unfolds and what impact it will have on the digital landscape.

[1] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation
[2] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation
[3] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation
[4] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation
[5] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation
[6] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation
[7] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation
[8] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation
[9] https://www.theverge.com/2022/11/17/23473504/facebook-community-notes-misinformation-moderation

Technology sits at the heart of Meta's revised approach to misinformation and moderation, centered on the community-driven "Community Notes" system [1]. Political dynamics may shape how that system is implemented as external fact-checking is phased out in favor of user-driven mechanisms [3]. The move towards less rigorous fact-checking and looser hate speech standards could have significant consequences for news, potentially amplifying misinformation and identity-based violence [4].
