Tech Trade Group Sues California to Halt Children’s Online Safety Law

A tech industry trade association sued the state of California on Wednesday in an effort to halt a new children’s online safety law, a legal challenge that comes at a moment of intensified public concern over the risks that content on popular platforms like Instagram and TikTok could pose to younger users.

The new law, called the California Age-Appropriate Design Code Act, will require many online services to install sweeping safeguards for minors, including protecting children from potentially harmful content and turning off friend-finder features that could enable adult strangers to contact young people. Gov. Gavin Newsom signed the children’s online safety bill, the first of its kind in the nation, into law in September.

The trade association, called NetChoice, is suing to block the law before it is scheduled to take effect in 2024. The trade group’s members include Amazon; Pinterest; TikTok; Google, which owns YouTube; and Meta, the parent company of Facebook and Instagram.

In a legal complaint filed in the U.S. District Court for the Northern District of California, NetChoice said the legislation would require online services to act as content censors, violating constitutional protections for free speech. The group also argued that the law would harm minors and others by hindering their access to free and open online resources.

The law “presses companies to serve as roving censors of speech on the internet,” the NetChoice complaint said. “Such over-moderation,” it added, “will restrict the availability of information for users of all ages and stifle important resources, particularly for vulnerable youth who rely on the internet for lifesaving information.”

Over the past several years, children’s groups, parents and researchers have raised concerns that algorithms on platforms like TikTok and Instagram have promoted harmful content about eating disorders and self-harm to younger users. In response, legislators and regulators in the United States and Europe have bolstered safeguards for children’s online privacy and security.

The California children’s safety law was a bipartisan effort that passed both houses of the state legislature by unanimous votes. It was based on children’s online safety rules that Britain put into effect last year.

The British rules require online services that are likely to have minors as users to prioritize children’s safety. In practice, that means many popular social media and video game platforms must turn on the highest privacy settings for younger users in Britain. They must also turn off certain features that could prod children into staying online for hours on end, such as autoplay — videos that automatically play one after another.

Last year, as the British rules were poised to take effect, Google, Instagram, Pinterest, TikTok, Snap, YouTube and others introduced new safeguards for younger users worldwide. YouTube, for instance, turned off default video autoplay for minors.

The California rules similarly require online services to turn off features like video autoplay for children.

In the complaint, NetChoice argued that such rules were overly broad, would sweep in a wide range of online services and would chill the ability of platforms to freely select and promote content for users. In particular, the tech trade group argued that systems like autoplay and content recommendation algorithms were widely used, “benign” features.

In response to a question from a reporter about why the group wanted to block the California law when many of its members were already complying with similar British rules, NetChoice said that the state law was unconstitutional under the First Amendment.

“Although the U.K. has a similar law on the books, it has neither a First Amendment nor a long tradition of protecting online speech,” said Chris Marchese, NetChoice’s counsel.