Key Points:
- Adam Mosseri testified in a landmark trial, denying that Instagram is “addictive” in a clinical sense while acknowledging concerns about problematic use among teens.
- The lawsuit argues Instagram’s design—like infinite scrolling and algorithmic recommendations—intentionally fosters compulsive behavior, causing harm to youth mental health.
- The case could set a precedent for social media liability, potentially reshaping industry regulation and accountability for platform design.
The head of Instagram, Adam Mosseri, took the stand in a closely watched civil trial that could reshape how social media companies are held accountable for youth mental health. The proceedings, unfolding in Los Angeles, center on allegations that Instagram’s design intentionally fosters addictive behavior among teenagers.
The lawsuit was brought by a young woman who claims that prolonged use of the platform during her adolescence led to anxiety, depression, and severe emotional distress. Her legal team argues that Instagram’s core features, including algorithm-driven recommendations and infinite scrolling, were engineered to maximize engagement while disregarding psychological harm. They contend that such design strategies mirror tactics historically associated with industries accused of exploiting vulnerable consumers.
Adam Mosseri, under oath, pushed back against claims that Instagram is “addictive” in a clinical sense. He maintained that while some users may develop unhealthy patterns of use, the platform itself is not comparable to substances or medically recognized addictions. He described excessive scrolling as a behavioral issue rather than a condition intentionally manufactured by the company. His testimony marked a pivotal moment in the trial, which is considered one of the first major courtroom tests of social media liability in the United States.
Debate Over Design, Responsibility, and Youth Safety
The courtroom exchanges revealed internal discussions within Instagram about balancing user growth with safety measures. Lawyers for the plaintiff highlighted past debates among employees regarding certain features, including cosmetic filters and recommendation systems that may amplify appearance-based comparisons. They argued that internal communications showed awareness of potential harm to teenage users.
Adam Mosseri acknowledged that Instagram has long wrestled with how to address “problematic use,” particularly among minors. However, he emphasized that the company has introduced multiple safeguards in recent years, including parental controls, teen account settings, and limits on sensitive content. According to the defense, these steps demonstrate an evolving effort to prioritize well-being rather than exploit user behavior.
The trial has also drawn emotional testimony from families who claim social media platforms intensified mental health struggles in their children. These accounts have underscored the broader societal debate over whether digital platforms bear responsibility for the psychological consequences of prolonged engagement.
Legal experts observing the case note that its outcome could hinge on whether the court accepts the argument that platform design alone constitutes negligence, or whether user choice and parental oversight remain decisive factors.
Broader Industry Implications
The lawsuit extends beyond Instagram. Other major technology platforms face similar allegations in related cases, suggesting that this trial could serve as a bellwether for the industry. Executives from parent company Meta, including CEO Mark Zuckerberg, are expected to testify, potentially elevating the stakes even further.
At the heart of the dispute lies a fundamental question: should social media platforms be legally accountable for how their design influences user behavior? For years, technology companies have largely been shielded from liability over user-generated content under Section 230 of the Communications Decency Act. Plaintiffs in this case, however, argue that the issue is not content alone, but the architecture that promotes compulsive use.
A ruling against Instagram could open the door to sweeping changes in how digital products are designed and regulated, particularly those aimed at young audiences. It may also encourage lawmakers to pursue stricter oversight of algorithmic systems and engagement-driven models.
As the trial continues, it represents more than a single dispute between a user and a platform. It stands as a defining moment in the reckoning between Silicon Valley and growing public concern over social media’s impact on the mental health of the next generation.