Lawyers may not be able to hold Snap and Meta accountable for the content that users post—so they’re trying something else.

The stories are hauntingly similar: A teenager, their whole life ahead of them, buys a pill from someone on Snapchat. They think it’s OxyContin or Percocet, but it actually contains a lethal amount of fentanyl. They take it; they die. Their bereaved parents are left grasping for an explanation.

A 2021 NBC News investigation found more than a dozen such cases across the country. And now, parents of teens and young adults who died or were injured after purchasing drugs laced with fentanyl are turning to the courts, suing Snap over features that they believe made the deals possible—and allowed them to happen in secrecy.

Under federal law—in particular, a controversial section of the Communications Decency Act of 1996 known as Section 230—platforms typically aren’t responsible for the content people post on them. (Otherwise, social-media platforms would have to answer for all of the speech posted by their millions or billions of users—a level of responsibility they’ve repeatedly argued they should not have.) In this case, however, the plaintiffs are maneuvering around this issue by arguing that drug-related content isn’t the problem. Instead, they say that the very design of Snapchat encourages criminal behavior—that the app was “developed and launched … for the express purpose of encouraging and enabling lewd, illicit, and illegal conduct.” Earlier this month, a California judge ruled that the case can proceed.

Snap naturally takes issue with the lawsuit. In an emailed statement, Rachel Racusen, a spokesperson for the company, said in part, “While we are committed to advancing our efforts to stop dealers from engaging in illegal activity on Snapchat, we believe the plaintiffs’ allegations are both legally and factually flawed and will continue to defend that position in court.”

The plaintiffs allege that several Snapchat features—including disappearing messages, friend suggestions, and Snapstreaks, which encourage people to use the app multiple times a day—are “unsafe” for minors. Perhaps none is as classically Snapchat as disappearing messages, though. With the app’s flagship feature, users can send one another pictures or videos that digitally evaporate after being viewed. Matthew Bergman, the attorney for the plaintiffs and the founder of the Social Media Victims Law Center, told me that these messages are concerning not just because they vanish from a user’s device but because “they disappear on the back end,” referring to how messages are automatically removed from Snap’s servers after a set amount of time.

This isn’t the first lawsuit to take aim at design, a strategy devised to avoid the stalemate over Section 230. Late last year, a group of more than 30 attorneys general sued Meta, accusing the company of creating addictive product features. And in 2021, the Ninth Circuit ruled that Snap could be sued over a filter that showed how fast a user was moving, which the plaintiffs in that earlier case argued encouraged reckless driving. Again, the suit focused on allegedly “negligent” product design rather than content distributed by the app. The company removed the filter shortly thereafter.

Parents have plenty of reasons to be afraid and to question the companies responsible for these apps. Snapchat has certainly had problems beyond this recent case: Teens have reported being bullied on the platform, their intimate photos shared widely without their consent. (Messages may disappear, but screenshots, or even photographs of a screen, are of course possible.) Others have been tricked into sending private photos to scammers.

But any lawsuit that goes after the fundamental features of a platform has to reckon with the fact that there aren’t easy fixes for the overall problem of teen safety online. Nor is the problem cut-and-dried: Last year, when the United States surgeon general issued an advisory warning about social media’s potential harm to developing brains, it acknowledged, too, that platforms may have some benefits for children from marginalized groups, such as those in the LGBTQ and disabled communities. (It also warned that heavy users may nonetheless “face double the risk of poor mental health including experiencing symptoms of depression and anxiety.”) Some teens may be perfectly capable of using certain features safely, while others might require more parental oversight. A site’s functions might simultaneously serve radically different purposes, as in the case of YouTube, whose algorithms have surfaced extremist content but also plenty of anodyne videos.

You could apply the same logic to disappearing messages, a substantial aspect of the fentanyl case. These were novel when Snapchat launched more than a decade ago, presenting a mode of communication that seemed more natural and lower stakes than the permanent-seeming Facebook timeline. Now Snapchat’s functionality is commonplace: Disappearing messages are an option in privacy-minded apps such as Signal, but also in Instagram, Facebook Messenger, WhatsApp, and even Gmail. Although ephemeral messaging comes with risks—including obstructing parents’ ability to keep tabs on their kids—it also comes with the benefit of offering some semblance of privacy.

Even if these platforms can’t guarantee discretion, they promise users a space to talk with one another, sometimes without a permanent record living on a company’s servers. As anyone with an unfortunate LiveJournal lingering in their past could tell you, that’s valuable. “If we’re talking about communications that build in privacy by design, in general, I think we probably want more of those, not less, as a society,” Eric Goldman, a Section 230 expert who teaches law at Santa Clara University, told me.

Teens in particular can benefit from access to such features. “We know that teenagers still have developing brains. We know that teenagers are impulsive,” Evan Greer, the director of Fight for the Future, a nonprofit advocacy group focused on tech policy, told me. “There’s an argument to be made that, particularly for teenagers, disappearing messages can be a really important safety feature.” Ephemeral-messaging platforms may allow some teens to find support and acceptance online that they cannot find in person.

One could always argue that perceived ephemerality encourages teens to take risks online that they shouldn’t, or that they simply shouldn’t be on social media in the first place, or at least that Snapchat is not the right place for the most sensitive topics; Signal, for example, offers better privacy features. But Snapchat is one of the main places where America’s teenagers are communicating. It trails only YouTube and TikTok in popularity: 60 percent of teens ages 13 to 17 say they use it, compared with 59 percent who report using Instagram, per a recent survey by the Pew Research Center. In other words, Snapchat is a regular part of many teens’ lives.

As these lawsuits progress, they’ll test where and when lines can plausibly be drawn when it comes to keeping kids safe online. Meanwhile, there are other steps to consider, such as a proposal that the social psychologist Jonathan Haidt made in this magazine, to remove phones from schools. Whatever ultimately happens to apps such as Snapchat, danah boyd, the author of It’s Complicated: The Social Lives of Networked Teens (who stylizes her name in lowercase), emphasized to me over email that kids’ mental health must be supported in ways that have nothing to do with social media. She highlighted the importance of creating a “robust social fabric” for teens, giving them “a range of trusted adults in their lives.” For those parents who are concerned about the effect that Snapchat—or any other app, for that matter—might have on their teens, Julianna Miner, the author of Raising a Screen-Smart Kid, gave me a bit of advice. “By all means wait before giving them access to it,” she wrote over email. “I’ve spoken to a lot of parents on this subject and many of them regret giving access too soon, while very few regret waiting until their kids were a little older and more responsible.”

In any case, there is a simple, unavoidable reality. When she was a teenager, boyd told me, people worried that coded messages sent over beepers were helping drug dealers. “Technology mirrors and magnifies the good, bad, and ugly,” she said. Social-media companies should do whatever they can to limit illegal content. They could change how we are able to speak with one another, what media we can distribute, which material is moderated and which is not; they might opt to work with law enforcement, or do whatever they can to avoid it. But ultimately, they can’t control what people talk about.
