Troubled by the number of unvaccinated COVID-19 patients showing up at his hospital, the French doctor logged onto Facebook and uploaded a video urging people to get vaccinated. He was soon inundated by dozens, then hundreds, then more than 1,000 hateful messages from an extremist anti-vaccination group known as V_V. The group, active in France and Italy, has harassed doctors and public health officials, vandalized government offices and tried to disrupt vaccine clinics.
Alarmed by the misuse of its platform, Facebook removed several accounts linked to the group last December. But that hasn’t stopped V_V from continuing to use Facebook and other platforms. Like many anti-vaccine groups around the world, it has since expanded its portfolio to include climate change denial and anti-democratic messages.
“Let’s go get them at home; they shouldn’t be allowed to sleep anymore,” reads one message from the group. “Fight with us!” reads another.
The largely unchecked nature of the attacks on the vaccine’s indisputable health benefits highlights the limits of a social media company’s ability to thwart even the most destructive forms of disinformation, especially without a sustained, aggressive effort.
Researchers at Reset, a US-based nonprofit, identified more than 15,000 abusive or misinformation-laden Facebook posts from V_V — activity that peaked in the spring of 2022, months after the platform announced its actions against the organization. In a report on V_V’s activities, Reset researchers concluded that its continued presence on Facebook “raises questions about the effectiveness and consistency of Meta’s self-reported intervention.”
Facebook’s parent company, Meta, said in response that its 2021 actions were never intended to remove all V_V content, but rather to take down accounts found to be participating in coordinated harassment. After The Associated Press notified Facebook of the group’s continued activities on its platform, the company said it had removed another 100 accounts this week.
Meta said it is trying to strike a balance between removing content from groups like V_V that clearly breaks the rules against harassment or dangerous misinformation and not silencing innocent users. That can be particularly difficult when it comes to the contentious issue of vaccines.
“This is a very hostile space and our efforts continue. Since our first removal, we have taken numerous actions against this network’s attempts to come back,” a Meta spokesperson told the AP.
V_V is also active on Twitter, where Reset researchers found hundreds of accounts and thousands of posts from the group. Many of the accounts were created shortly after Facebook took action against the group last winter, Reset found.
In response to Reset’s report, Twitter said it took enforcement action against several accounts linked to V_V, but did not detail those actions.
V_V has proven particularly resilient to attempts to stop it. Named after the movie V for Vendetta, in which a lone, masked man seeks revenge on an authoritarian government, the group uses fake accounts to evade detection, often coordinating its posts and activities on platforms like Telegram that lack Facebook’s more aggressive moderation policies.
That adaptability is one of the reasons it’s hard to stop the group, according to Jack Stubbs, a researcher at Graphika, a data analytics firm that’s been tracking V_V’s activities.
“They understand how the Internet works,” Stubbs says.
Graphika estimated the group’s membership at 20,000 at the end of 2021, with a smaller core of members involved in the online harassment efforts. In addition to Italy and France, Graphika’s team found evidence that V_V is trying to create divisions in Spain, the United Kingdom, Ireland, Brazil and Germany, where a similar anti-government movement known as Querdenker is active.
Groups and movements such as V_V and Querdenker have increasingly alarmed law enforcement and extremism researchers who say there is evidence that far-right groups are using skepticism about COVID-19 and vaccines to extend their reach.
Increasingly, such groups are moving from online harassment to real-world action.
For example, in April, V_V used Telegram to announce plans to pay a $10,000 bounty to vandals who sprayed the group’s symbol (two red Vs in a circle) on public buildings or vaccine clinics. The group then used Telegram to distribute photos of the vandalism.
A month before Facebook took action against V_V, Italian police raided the homes of 17 anti-vaccine activists who had used Telegram to make threats against government, medical and media figures for their alleged support of COVID-19 restrictions.
Social media companies have struggled to respond to a wave of vaccine misinformation since the start of the COVID-19 pandemic. Earlier this week, Facebook and Instagram suspended Children’s Health Defense, an influential anti-vaccine organization led by Robert F. Kennedy Jr.
One reason is the tricky balancing act between moderating harmful content and protecting free speech, according to Joshua Tucker of New York University, who is co-director of NYU’s Center for Social Media and Politics and a senior advisor at Kroll, a technology, government and economic consultancy.
Striking the right balance is especially important as social media has become a major source of news and information around the world. Leave too much bad content up and users could be misinformed. Remove too much and users will start to distrust the platform.
“It’s dangerous for society to move in a direction where no one thinks they can trust information,” Tucker said.