There was plenty of high drama at the recent US Senate hearing about child sexual exploitation on social media sites. But after the dust settled, the most important question was left unanswered: What can be done about it?
State and federal lawmakers have proposed a variety of new laws aimed at forcing online services like Facebook, Instagram, Snap, and X to shield minors from sexual exploitation. And some of these proposed laws have attracted bipartisan support — a rare feat in these days of political polarization.
But even if these bills become law, some may not survive legal challenges: civil libertarians say they violate user privacy and threaten First Amendment rights.
“The government has to bear the burden of proof,” said Leah Plunkett, a lecturer at Harvard Law School and author of “Sharenthood,” a book about the internet and parenting. “For content-based regulation of protected speech, they have a very, very big lift.”
But Plunkett also said that the unique problems posed by social media may lead the US Supreme Court to look more favorably on special protections for minors.
On the federal level, Democratic US Senator Ed Markey of Massachusetts and GOP US Senator Bill Cassidy of Louisiana want to revise the 1998 Children’s Online Privacy Protection Act, which blocked online companies from collecting personal information from minors under age 13. The new version, called the Children and Teens’ Online Privacy Protection Act, would lift the age limit to 16 and would ban online advertisements targeting children and teens.
Far more controversial is the Kids Online Safety Act, or KOSA, sponsored by Connecticut Democratic US Senator Richard Blumenthal and Tennessee GOP US Senator Marsha Blackburn. KOSA would require social media companies to avoid exposing minors to possibly harmful materials relating to everything from eating disorders to drug addiction, suicide, and sexual abuse.
Several major tech companies, including Microsoft, X, and Snap, support KOSA. But the law faces intense opposition from groups like the American Civil Liberties Union, which argues that KOSA is unconstitutional. “While the bill’s purported goal of addressing child safety online is laudable, its means ... will silence important conversations, limit minors’ access to potentially vital resources, and violate the First Amendment,” said a letter from the ACLU to members of the US Senate.
But KOSA supporters say that the law does not require social media companies to block such content. Instead, the companies must not send such material to minors who aren’t looking for it.
Under KOSA, social media companies would be required to use the most restrictive default settings on all accounts belonging to minors. The services must refrain from using algorithms to decide what to show minors, a common practice that could steer unsuspecting children toward harmful content. However, minors could lawfully seek out such content on their own.
For instance, the social media service Reddit, which permits sexually explicit images, could be held liable if it automatically displayed such images to a minor. But if that same minor deliberately searched for such images, Reddit would not be liable.
KOSA supporter Michael Toscano, executive director of the Institute for Family Studies, a conservative think tank, said that this feature of the law secures the First Amendment rights of minors. “This law respects the autonomy of the individual user,” Toscano said. Still, if KOSA ever becomes law, its fate will probably be decided in a federal court.
Even if KOSA survives, there’s another big problem. How can social media companies know a user’s age?
Toscano says the companies can figure it out. He notes that they make their vast profits by analyzing the online behavior of users, then showing them precisely targeted advertisements. Toscano said the companies could use the same technology to determine whether a user is a child or an adult. For instance, a user who spends a lot of time in forums devoted to toys or cartoon shows is more likely than not a child. Such users could be granted access only to child-friendly portions of the site.
“You have the capacity already,” Toscano said, “and it’s a big part of your business model, to know who’s on your platform.” Indeed, in 2021 Facebook said that it uses an artificial intelligence system to monitor user activity to determine whether the user is a minor.
But such a system won’t be foolproof. Adults who collect toys and love cartoons might be falsely identified as minors, for instance. A truly robust system would require that users prove they’re over 18. A number of companies offer such systems. For instance, Meta’s Instagram and Facebook services work with Yoti, a company that uses video images shot by a user’s smartphone to estimate the person’s age. (Yoti claims its system is 99.9 percent accurate for people aged 13 to 17, and 96.7 percent accurate for people aged 6 to 11.)
For now, the Yoti system is voluntary. But lawmakers in multiple states have proposed laws to require some form of age verification to access social media. Already Arkansas, Utah, and Louisiana have enacted such laws. In addition, some states have enacted similar laws specifically targeting pornographic websites. Pornhub, the nation’s leading operator of such sites, has blocked access to users in several states that have enacted such laws, including Utah, Mississippi, Virginia, North Carolina, and Montana.
These age verification laws are anathema to privacy activists, who fear that online companies will force users to hand over sensitive personal information as the price of access.
“Do you show an ID before going into the library?” said Joe Mullin, senior policy analyst at the Electronic Frontier Foundation, an internet civil liberties group. “We have not seen an age verification system that does not compromise the privacy of adults.” (Indeed, Utah legislators have proposed modifications to that state’s social media law after a civil liberties group filed a lawsuit to overturn it.)
But Plunkett at Harvard said she believes social media companies can develop systems for verifying a user’s age without undermining privacy. “We are seeing available technology grow by leaps and bounds,” Plunkett said. “Let’s solve this problem.”