Washington:
Major tech companies, including Facebook owner Meta and Google, said Tuesday they would work together in a new program to combat online child sexual abuse and exploitation. Online harm to children is a growing focus for regulators, and tech companies are keen to demonstrate that they are taking adequate measures to protect children and teenagers.
Under the new program, called Lantern, major tech companies will share signals of activity that violates their child exploitation policies so platforms can take faster action to detect, remove and report problematic content.
Signals can include email addresses, certain hashtags, or keywords used to groom young people for abuse, or to buy and sell material that depicts child abuse and exploitation.
“Until now, there has been no consistent process for companies to work together against predatory actors evading detection across services,” said Sean Litton, executive director of the Tech Coalition, which brings together tech companies on the topic.
“Lantern fills this gap and shines a light on cross-platform attempts at online child sexual exploitation and abuse, making the internet a safer place for children,” Litton added.
Other platforms in the Tech Coalition include Snap, Discord and Mega, a privacy-focused platform from New Zealand.
The Tech Coalition said that during a pilot of the program, Meta deleted more than 10,000 Facebook profiles, pages and Instagram accounts after data was shared by Mega.
Meta reported the affected accounts to the US-based National Center for Missing & Exploited Children and shared findings with other platforms for their own investigation.
“Predators don’t limit their attempts to harm children to individual platforms,” said Antigone Davis, Global Head of Safety at Meta.
“The tech industry must work together to stop predators and protect children on the many apps and websites they use,” she added.
Lantern’s announcement came the same day a former Meta senior engineer told a Senate hearing in Washington that top executives, including Mark Zuckerberg, ignored his warnings that teens were unsafe on the company’s platforms.
Arturo Bejar told lawmakers that in an internal survey of 13-15 year olds on Instagram, 13 percent of respondents had received unwanted sexual advances on the platform in the past seven days.
“Meta knows the harm children experience on their platform and executives know that their measures fail to address this harm,” Bejar said.
(Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)