Washington:
Meta CEO Mark Zuckerberg and the CEOs of TikTok, X, Discord and Snap faced criticism Wednesday from hostile US lawmakers over the dangers children and teens face on social media platforms.
The tech chiefs were summoned before the US Senate Judiciary Committee for a hearing on the effects of social media titled 'Big Tech and the Online Child Sexual Exploitation Crisis'.
Executives are facing an outpouring of political anger for not doing enough to thwart online dangers to children, including sexual predators and teen suicide.
During a round of particularly heated questioning, Zuckerberg stood and apologized to the families of victims who packed the committee room.
“Mr. Zuckerberg, you and the companies before us, I know you don't mean it, but you have blood on your hands. You have a product that is killing people,” Senator Lindsey Graham told the CEOs.
The witnesses facing the senators were Zuckerberg, Linda Yaccarino of X, Shou Zi Chew of TikTok, Evan Spiegel of Snap and Jason Citron of Discord.
“We are working hard to provide parents and teens with support and control to reduce potential harm,” Meta's Zuckerberg told the committee in his opening statement.
“Keeping young people safe online has been a challenge since the internet began and as criminals evolve their tactics, we must evolve our defenses too,” he added.
Zuckerberg also told lawmakers that research found social media was “on balance” not harmful to young people's mental health.
TikTok's Chew said: “As a father of three young children, I know the issues we're discussing today are horrifying and every parent's nightmare.”
“I plan to invest more than $2 billion in trust and safety, with 40,000 safety professionals working on this issue this year alone,” Chew said.
Meta also said that 40,000 of its employees work on safety and security, and that it has invested $20 billion since 2016 to make its platforms safer.
Ahead of the heated session, Meta and X, formerly Twitter, announced new safety measures.
Meta, which owns the world's leading platforms Facebook and Instagram, said it would block direct messages sent by strangers to young teenagers.
By default, teens under 16 can now receive messages, or be added to group chats, only by people they already follow or are connected to.
Meta has also tightened content restrictions for teens on Instagram and Facebook, making it harder for them to view posts that discuss suicide, self-harm or eating disorders.
Lawsuit by multiple states
Senators singled out Meta, pointing to internal company documents showing that Zuckerberg declined to strengthen teams dedicated to identifying online dangers to teens.
“The hypocrisy is staggering,” Senator Richard Blumenthal told the New York Times.
These documents are part of a major lawsuit in which some forty states are jointly suing Meta over alleged harms to children.
Under US law, web platforms are largely shielded from legal liability for content shared on their sites.
While lawmakers want new rules to improve online safety, their efforts have been thwarted by a politically divided Washington and intense lobbying by big tech companies.
One existing proposal is the Kids Online Safety Act, or KOSA, which aims to protect children from algorithms that can cause anxiety or depression.
Another idea would require social media platforms to verify the age of account holders and completely exclude children under 13.
“I don't think you're going to solve the problem. Congress is going to have to help you,” Senator John Neely Kennedy told the executives.