Kaley’s case was selected as a “bellwether”—a test case to go to trial first and show how a jury would react to the claims—from more than a thousand lawsuits filed against social-media companies by individuals and school districts in California, which were consolidated into a single proceeding before a California judge, Carolyn B. Kuhl. She allowed the design claim to go to trial, meaning that the jury would decide, based on the evidence, whether the design features were addictive, whether the companies were negligent in designing them, and whether that addiction had caused harm to Kaley. As Kuhl explained it, “the allegedly addictive features of defendants’ platforms (such as endless scroll) cannot be analogized to how a publisher chooses to make a compilation of information, but rather are based on harm allegedly caused by design features that affect how plaintiffs interact with the platforms regardless of the nature of the third-party content viewed.” Thousands of similar federal lawsuits were also consolidated into a proceeding in a district court in California, and the first federal bellwether trial is scheduled for June. Separately, a coalition of dozens of states sued Meta on similar claims, and a trial in federal court, also in California, can be expected in the next year.
Kaley testified that she had been on YouTube since the age of six, had posted more than two hundred videos by age ten, and had created nine additional social-media accounts for the purpose of liking and commenting on her own content: “I spent all my time on it. I would sneak it. I would watch it in class. Every time I set limits for myself, it didn’t work. I just couldn’t get off,” she said. Social media “made” her give up hobbies and prevented her from making friends. She added that it still consumes her as a twenty-year-old woman: “I just can’t be without it.” When Mark Zuckerberg, the C.E.O. of Meta, testified at the trial, Kaley’s lawyer showed the jury a collage of hundreds of selfies that Kaley had posted to Instagram, which she said she had used since she was eleven.
Meta suggested that Kaley’s mental-health struggles were attributable not to social-media addiction but, rather, to her mother’s emotional and physical abuse and neglect, and that Kaley’s social-media use was not the source of her troubles but a way to cope with them. Kaley denied being abused or neglected, though Meta’s attorneys did show some Instagram posts about her mother screaming at her. But the strategy of attempting to pin the blame elsewhere was stymied, because California has a highly lenient standard in cases alleging that a defendant caused injury to a plaintiff: defendants can be liable if their negligence was a “substantial factor” in causing the harm—not necessarily the only cause or even the primary one. So the jury could have decided in Kaley’s favor even if it believed that the platforms’ negligent designs were merely one contributing cause of her injury among many: school pressures, economic pressures, the political landscape, climate change, or, perhaps, bad parenting.
The contest over causation goes to parents’ simultaneous senses of responsibility and helplessness about their children’s fates. If parents have in the past felt they were competing with bad influences on children—questionable friends, shady neighbors, or profanity-laced music among them—the core anxiety in this era is that algorithms have made it so that there is no competition at all, undermining parents’ opportunity to steer their children right. (The day before the verdict in Kaley’s case, a New Mexico jury imposed a civil penalty on Meta of three hundred and seventy-five million dollars, under state consumer-protection laws, for misleading users about platform safety and enabling child sexual exploitation.) This generation of parents was also warned by those opposed to helicopter or tiger parenting not to monitor kids like hawks, and even to try some “free-range” parenting to let them explore and make mistakes. Meanwhile, engineers in Silicon Valley were allegedly designing ingenious ways to make explorations of digital rabbit holes irresistible. In millions of American homes, while parents were making dinner or paying bills, their kids were in another room scrolling social media and talking to chatbots.
In response to the verdict, a Meta spokesperson said that “teen mental health is profoundly complex and cannot be linked to a single app.” Google said in a statement that the case “misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.” (Both companies said that they would appeal.) In the end, though, what made the verdict remarkable was the relative ordinariness of Kaley’s story. Her testimony about her habits, her behavior, and her anxieties was relatable to many people. The jury award was a spur to recognize a life shaped by social-media algorithms, in ways that were perhaps near-impossible to resist, as a serious injury to an entire generation.
