Meta chief Mark Zuckerberg has ended eight years of independent fact-checking on Facebook and Instagram, claiming it suppressed “too many voices”. The move has stoked fears of an unchecked spread of propaganda and fake news. Mint explains the debate.
What did Zuckerberg announce last week?
Zuckerberg said Meta's platforms will phase out fact checks in favor of “community notes,” popularized by Elon Musk's X (formerly Twitter). Community notes rely on a consensus among a platform's users, who append context to posts they judge to be incorrect or misleading. Zuckerberg also rolled back content filters and promised more political posts on Facebook, Instagram and Threads. He said there will be fewer automatic takedowns of posts and accounts because “machines are making mistakes… suppressing too many voices lately.” The changes will begin in the US and eventually roll out worldwide.
Why is this step important?
Experts say fact-checking is crucial for dealing with nationalistic, polarizing content around the world. Moreover, intermediaries such as Meta's platforms are not meant to express their own opinions on content; their “safe harbor” protection rests on this logic. Political agendas that fuel disinformation are widespread today. However, some point to bias among fact-checkers: though a degree of bias is unavoidable, it can influence the qualitative judgments they make. Groups across the political spectrum have argued over the past decade that tech companies lean liberal, citing this as a reason they should not be treated as neutral intermediaries.
Does India have its own fact-checking units?
Meta has 11 fact-checking partners in India, its most complex fact-checking geography, with teams vetting content in 18 languages – the most anywhere. Alt News also conducts independent third-party fact checks. The Centre set up its own fact-checking unit, but it was suspended by the courts last year. For most fact-checkers, Big Tech is an important source of income.
Will this end the filtering of disinformation?
Community notes check for agreement between users who typically disagree. However, the process has often failed to filter out misinformation; Indian experts say the lack of human fact-checkers is one reason for X's declining influence in India, one of the world's largest online markets. That said, Meta could tweak its approach to add some degree of human intervention in the coming year. This could be critical to Meta's goal of pitching its platforms as a hub for business and political advertising.
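For readers curious about the mechanics, below is a minimal sketch of the “bridging” idea behind community notes: a note is shown only when raters who usually disagree both find it helpful. This is an illustration under simplifying assumptions (two hand-labelled viewpoint clusters, a fixed helpfulness threshold, made-up ratings), not X's actual system, which infers viewpoints from rating patterns rather than from labels.

```python
from collections import defaultdict

# Hypothetical ratings: (note_id, rater_cluster, found_helpful).
# Clusters "A" and "B" stand in for groups of users who typically disagree;
# the data and threshold below are made up for illustration.
ratings = [
    ("note1", "A", True), ("note1", "A", True),
    ("note1", "B", True), ("note1", "B", True), ("note1", "B", False),
    ("note2", "A", True), ("note2", "A", True),
    ("note2", "B", False), ("note2", "B", False),
]

def surfaced_notes(ratings, threshold=0.5):
    """Return notes that a majority of raters in *every* cluster found helpful."""
    # tallies[note][cluster] = [helpful_count, total_count]
    tallies = defaultdict(lambda: defaultdict(lambda: [0, 0]))
    for note, cluster, helpful in ratings:
        tallies[note][cluster][1] += 1
        if helpful:
            tallies[note][cluster][0] += 1
    shown = []
    for note, clusters in tallies.items():
        # Cross-cluster agreement: each cluster's helpful share must clear the bar.
        if all(helpful / total > threshold for helpful, total in clusters.values()):
            shown.append(note)
    return shown

print(surfaced_notes(ratings))  # ['note1']: only the note both sides rate helpful is shown
```

The design bets that context both camps accept is likelier to be accurate; the trade-off, as the experts cited above suggest, is that notes on sharply contested claims may never reach that consensus and so never appear.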
How do Indian laws define fact checking?
The IT Rules, 2021 put the onus on social media intermediaries to conduct due diligence and rid their platforms of misinformation. Lawyers say due diligence can be subjective, so a community notes mechanism could pass muster, as it does for X; there is no legal requirement for human fact-checking. Many say the shift could work against ruling parties in the states or at the Centre, since community notes automatically filter partisan agendas and surface opposing political views as a potential middle ground.