
Meta’s boss, Mark Zuckerberg, took the witness stand in Los Angeles this week in a landmark trial that could reshape how social media companies are held accountable for the mental health of young users.
The case centres on allegations that platforms owned by Meta Platforms, including Instagram, were deliberately designed to encourage compulsive use among children and teenagers, contributing to anxiety, depression and suicidal thoughts.
Lawyers pressed Zuckerberg over internal documents suggesting the company prioritised teen engagement despite public claims to the contrary.
The lawsuit, brought by a 20-year-old woman identified in court as K.G.M., is the first of more than 1,600 similar claims against major technology firms to be tested before a jury.

Zuckerberg, appearing before jurors for the first time on issues of child safety, denied that Meta seeks to make its products addictive or that it knowingly targets children under the age of 13.
He repeatedly told the court that the plaintiffs’ lawyers were “mischaracterising” internal emails and research documents presented in evidence.
However, the plaintiff’s lead lawyer, Mark Lanier, confronted Zuckerberg with a series of internal communications dating back more than a decade.
One 2018 presentation discussed retaining so-called “tweens”, while a 2015 email from Zuckerberg outlined goals to increase time spent on the platform and to reverse a decline in teen usage.
Lanier argued that these documents contradicted Meta’s public assurances that under-13 users were barred and that engagement metrics were no longer central to the company’s strategy.
Zuckerberg responded that some users lie about their age to access social media and that enforcing age limits is technically difficult.
He said Meta had “always” worked to identify underage users and had improved its tools over time, adding that he regretted not moving faster. “I always wish we could have gotten there sooner,” he told the court.

The trial also highlighted internal research commissioned for Instagram in 2019, which found that some teenagers felt “hooked” on the platform despite negative emotional effects.
Zuckerberg acknowledged the research but noted it was conducted by an external firm and also identified positive experiences among users.
Meta’s lawyers framed the study as evidence of the company’s efforts to understand how its products are used and to mitigate potential harm.
Under questioning from Meta’s legal team, Zuckerberg argued that long-term success depends on user wellbeing rather than maximising screen time. “If you do something that’s not good for people, they may use it more in the short term, but they won’t stick around,” he said.
He pointed to tools introduced since 2018 that allow users to set daily limits, mute notifications and manage time spent on Instagram.
Plaintiffs countered that such tools have had limited impact. Internal data presented in court showed that only a small fraction of teenage users activated daily time limits, raising questions about whether the safeguards were effective or sufficiently promoted.

The case is being closely watched because it could weaken the legal protections that have long shielded technology companies from liability.
In the United States, firms have relied on Section 230 of the Communications Decency Act, which generally protects platforms from being sued over user-generated content.
Plaintiffs are attempting to sidestep that defence by arguing that social media apps themselves are defective products due to their design features, such as infinite scrolling and algorithmic recommendations.
Legal experts say a verdict against Meta and co-defendant YouTube could trigger far-reaching consequences, from large financial damages to changes in how platforms are built and regulated.
TikTok and Snap, which were also named in the lawsuit, settled with the plaintiff before the trial began, though the terms were not disclosed.

Outside the courtroom, parents who have lost children to suicide or other harms linked to online activity described the proceedings as a long-awaited moment of accountability.
Advocacy groups argue that internal warnings about teen mental health were downplayed in favour of growth and engagement.
Meta has strongly denied the allegations. In a statement, the company said the jury must decide whether Instagram was a substantial factor in the plaintiff’s mental health struggles, adding that evidence would show she faced significant challenges before using social media.
Meta also maintains that teenagers account for less than one per cent of its advertising revenue, undermining claims that the company profits heavily from young users.
The trial is expected to run for several weeks and is likely to serve as a bellwether for hundreds of related cases across the United States.
Whatever the outcome, it marks a pivotal moment in the global debate over social media, child safety and corporate responsibility in the digital age.