Meta Boss Apologises to Families Over Harms Caused by Social Media to Their Children
Meta CEO Mark Zuckerberg has apologized to families who say their children have been harmed by social media during a fiery hearing in the US Senate.
Mr. Zuckerberg, whose company owns Instagram and Facebook, turned to them and said, “No one should go through” what they had.
He and the bosses of TikTok, Snapchat, X, and Discord were questioned for almost four hours by senators from both parties. Lawmakers wanted to know what they were doing to protect children online.
Legislation currently going through Congress aims to hold social media companies accountable for material posted on their platforms.
Mr. Zuckerberg and TikTok CEO Shou Zi Chew agreed to testify voluntarily, but the heads of Snap, X (formerly Twitter), and the messaging platform Discord initially refused and appeared only after being subpoenaed.
Behind the five tech bosses sat families who said their children had self-harmed or killed themselves as a result of social media content.
They made their feelings known throughout, hissing when the CEOs entered and applauding when lawmakers asked tough questions.
Despite acknowledging the challenges, Zuckerberg cited research claiming that “on balance,” social media does not negatively impact the mental health of young individuals.
Senator Dick Durbin, who chaired the meeting, challenged this perspective, stating, “I don’t think it makes any sense. There isn’t a parent in this room who’s had a child…(who) hasn’t changed right in front of (their) eyes” due to an “emotional experience” on social media.
Ahead of the testimony, Meta and X (formerly Twitter) announced new measures to address the impact on young social media users. Meta, the owner of Facebook and Instagram, said it would now block direct messages sent to young teens by strangers. Under the change, teens under 16 can only be messaged or added to group chats by people they already follow or are connected to.
Furthermore, Meta implemented stricter content restrictions for teens on Instagram and Facebook, making it more challenging for them to access posts discussing suicide, self-harm, or eating disorders. These initiatives aim to enhance the safety of young users within the digital realm.