Zuckerberg apologizes to families during heated US Senate hearing on child safety on social media


Sexual predators. Addictive features. Suicide and eating disorders. Unrealistic beauty standards. Intimidation. These are just some of the problems young people face on social media — and children’s advocates and lawmakers say companies aren’t doing enough to protect them.

The CEOs of Meta, TikTok, X, Snap and Discord appeared before the U.S. Senate Judiciary Committee on Wednesday, as lawmakers and parents grow increasingly alarmed about the effects of social media on young people’s lives.

The hearing began with recorded testimony from children and parents who said they or their children had been exploited on social media. Throughout the hours-long event, parents who have lost children to suicide silently held up photos of their deceased loved ones.

“They are responsible for many of the dangers our children face online,” U.S. Senate Majority Whip Dick Durbin, a Democrat who chairs the committee, said in his opening remarks. “Their design choices, their failure to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our children and grandchildren at risk.”

During a heated question-and-answer session with Mark Zuckerberg, Republican Sen. Josh Hawley of Missouri asked the Meta CEO if he had personally compensated any of the victims and their families for what they experienced.

During Wednesday’s Senate hearing, Republican Senator Josh Hawley asked Meta CEO Mark Zuckerberg if he had personally compensated the victims and their families for what they experienced. (José Luis Magana/Associated Press)

“I don’t think so,” Zuckerberg replied.

“There are families of victims here,” Hawley said. “Would you like to apologize to them?”

Parents at the hearing stood up and held up photos of their children. Zuckerberg also stood up, turning away from his microphone and the senators to address them directly.

“I’m sorry for everything you’ve been through. No one should have to go through the things your families have gone through,” he said, adding that Meta continues to invest in industry-wide efforts to protect children.

‘Dangerous products’

But time and time again, children’s advocates and parents have stressed that none of the companies are doing enough.

“Meta’s general approach is ‘trust us, we’ll do the right thing,’ but how can we trust Meta? The way they talk about these issues, it’s like they’re gaslighting the world,” said Arturo Bejar, a former engineering director at the social media giant known for his expertise in combating online harassment, who recently testified before Congress about child safety on Meta’s platforms.

WATCH | What effect does scrolling on social media have on children’s brains?

With most children and teens spending hours a day on a smartphone, CBC’s Christine Birak explains what research reveals about how social media use changes children’s behavior, whether it restructures their brain and what can be done about it.

“Every parent I’ve ever met with a child under 13 is afraid of when their child is old enough to be on social media.”

Hawley continued to pressure Zuckerberg, asking if he would take personal responsibility for the damage his company caused. Zuckerberg stayed on message and reiterated that Meta’s job is to “create cutting-edge tools” and empower parents.

“To make money,” Hawley cut in.

South Carolina Sen. Lindsey Graham, the top Republican on the Judiciary panel, echoed Durbin’s sentiments and said he was willing to work with Democrats to resolve the issue.

“After years of working on this issue with you and others, I have come to the following conclusion: Social media companies as they are currently designed and operating are dangerous products,” Graham said.

He told leaders that their platforms have enriched lives but it is time to tackle the “dark side.”

A federal bill in the works

Starting with Discord’s Jason Citron, the executives touted the existing safety tools on their platforms and the work they’ve done with nonprofits and law enforcement to protect minors.

Snapchat had broken ranks before the hearing and began supporting a federal bill that would create legal liability for apps and social platforms that recommend content that is harmful to minors. Snap Inc. CEO Evan Spiegel on Wednesday reiterated the company’s support and asked the industry to support the bill.

TikTok CEO Shou Zi Chew said TikTok is vigilant about enforcing its policy banning children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly known as Twitter, does not cater to children.

From left: Discord CEO Jason Citron, Snapchat CEO Evan Spiegel, TikTok CEO Shou Zi Chew, X CEO Linda Yaccarino and Zuckerberg watch video of victims shown during Wednesday’s U.S. Senate hearing. (Andrew Caballero-Reynolds/AFP/Getty Images)

“We do not have a line of business dedicated to children,” Yaccarino said, adding that the company would also support the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue technology companies.

Yet children’s health advocates say social media companies have repeatedly failed to protect minors.

“When you’re faced with really important safety and privacy decisions, revenue should not be the first factor these companies consider,” said Zamaan Qureshi, co-president of Design It For Us, a youth-led coalition advocating for safer social media. “These companies have had the opportunity to do this before, and they have failed, so independent regulation needs to step in.”

Republican and Democratic senators came together in a rare show of agreement throughout the hearing, although it is not yet clear whether that will be enough to pass legislation such as the Kids Online Safety Act, proposed in 2022 by Sen. Richard Blumenthal of Connecticut and Sen. Marsha Blackburn of Tennessee.

Meta emails under scrutiny

Meta is being sued by dozens of states that claim it deliberately designed features on Instagram and Facebook that addict children to its platforms, and that it failed to protect them from online predators.

New internal emails between Meta executives, released by Blumenthal’s office, show Nick Clegg, president of global affairs, and others asking Zuckerberg to hire more staff to strengthen “well-being across the company” as concerns grew about the effects on young people’s mental health.

“From a policy perspective, this work has become increasingly urgent in recent months. Politicians in the United States, United Kingdom, European Union and Australia are publicly and privately expressing concerns about the impact of our products on young people’s mental health,” Clegg wrote in an August 2021 email.

An installation against social media companies, featuring Meta’s Zuckerberg and TikTok’s Chew, is displayed in front of the U.S. Capitol building on Wednesday. (Julia Nikhinson/AFP/Getty Images)

Emails released by Blumenthal’s office do not appear to contain a response, if any, from Zuckerberg. In September 2021, the Wall Street Journal published the Facebook Files, a report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate.

Meta has been beefing up its child safety features in recent weeks, announcing earlier this month that it would begin hiding inappropriate content from teen accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.

It also restricted minors’ ability to receive messages from anyone they don’t follow or aren’t connected to on Instagram and Messenger, and added new “nudges” to try to discourage teens from browsing Instagram videos or messages late at night. The nudges encourage kids to close the app, though they don’t force them to do so.

Google’s YouTube was notably absent from the list of companies summoned to the Senate on Wednesday, even though more children use YouTube than any other platform, according to the Pew Research Center. Pew found that 93 percent of American teens use YouTube, followed distantly by TikTok at 63 percent.
