In summary
A first-in-the-nation law that would protect children's privacy online was passed by the California Legislature. Here's what it does and what happens next.
When does a child grow up? It’s an elusive question that developmental psychologists, philosophers, and parents may answer differently.
But legislators cannot work with ambiguity. In the late 1990s, for example, Congress decided that a child was anyone under the age of 13, at least when it came to surfing the Internet.
Last week, California lawmakers said otherwise: children are people under the age of 18. And if Gov. Gavin Newsom signs the law they just passed, kids under 18 in California will get a lot more privacy rights online.
What young people are experiencing in apps and online has become a source of growing concern for parents, fueled by alarming headlines and new research. So a bipartisan group of lawmakers pushed the California Age-Appropriate Design Code Act, also known as AB 2273. The bill passed unanimously in the Legislature last week and could become a model for other states — or provide a roadmap for Congress to consider privacy legislation of its own.
“Social media is something that wasn’t designed for kids,” said Emily “Emi” Kim, an 18-year-old who lives in Porter Ranch, near Los Angeles.
Kim splits her time between serving as legislative director for the opt-out movement, a youth-led organization campaigning for the bill, attending college classes and working at Chipotle.
Here’s what the bill would do
If signed into law, the bill would require California businesses that offer online services or products accessible to children under the age of 18 to provide higher levels of privacy by default, starting in July 2024. Specifically, the bill would:
- Require companies to assess the potential harm of using child data in a new service or feature, and create a risk mitigation plan before the feature rolls out.
- Prohibit companies from using children’s information in a way that the company knows (or has reason to believe) is “materially detrimental” to their well-being, such as pushing photos of skinny supermodels at kids after they search for weight loss information.
- Generally prohibit companies from collecting, selling, disclosing, or keeping personally identifiable information about a child, except as necessary to provide the service that the child is directly using.
- By default, prohibit companies from collecting, selling, or sharing children’s precise location data unless it’s strictly necessary for the feature to work, and then only for a limited time.
- If a product allows parents or other adults to track children online, require it to make clear to children when they are being tracked.
If some of those requirements sound vague, the bill also creates a new working group — made up of experts in child privacy, computing, mental health and more — to make recommendations to lawmakers.
The bill would be enforced by the state Attorney General, who could file civil lawsuits seeking penalties of up to $7,500 per child for willful violations.
Karla Garcia, a parent of an 11-year-old in the Palms area of west Los Angeles, supports the law because she hopes it will rein in the algorithms sucking her son Alessandro Greco into YouTube. “He knows it’s an addiction,” she said of her son’s America’s Got Talent binges that keep him from doing his homework. “Honestly, I have this fight with my kid every night.”
“I want him to have his independence, but that’s stronger than him,” Garcia said.
How the law has worked elsewhere
The idea was borrowed from a UK law that came into force in September 2021. Since the law was passed, technology companies have made changes, including the following:
- YouTube has turned off autoplay — the feature that plays videos continuously — for users under the age of 18.
- Google has made SafeSearch the default for users under the age of 18 and has stopped collecting location data from children.
- TikTok has stopped sending push notifications to teens late at night: 13- to 15-year-olds will not receive push notifications after 9 p.m., and 16- and 17-year-olds will not receive them after 10 p.m. The company has also disabled direct messages for users under the age of 16.
Who counts as a child?
The bill was opposed by lobbying organizations representing tech companies and other businesses, including the California Chamber of Commerce, the Entertainment Software Association and TechNet. TechNet counts Amazon, Google, Meta (formerly known as Facebook) and Uber among its members. The organizations argued that the bill would apply to more sites than necessary.
“This is another example of why we need a federal privacy law that includes universal standards to protect children online, rather than a patchwork of state laws that creates confusion and compliance complications for businesses,” Dylan Hoffman, TechNet’s executive director for California and the Southwest, said in a statement.
One of the key changes the groups pushed for was lowering the bill’s definition of a child from under 18 to under 13, matching federal law; they later endorsed 16, a threshold used in an existing California privacy law, Hoffman said. But the business groups were unsuccessful in this push.
“Any parent, to be honest, any grandparent, any sister, any brother would tell you that a 13-year-old is not an adult,” said Baroness Beeban Kidron, a member of the UK House of Lords who led the effort to pass the UK law and founded the 5Rights Foundation, which sponsored the California bill. “You can’t ask a 13-year-old to make adult decisions,” Kidron said.
What happens next?
First, Newsom will decide whether to sign the bill or veto it. If he signs it, most of the measure’s provisions will not take effect until July 2024.
But companies need to start identifying and mitigating risks to children immediately, said Nichole Rocha, director of US affairs at the 5Rights Foundation. In other words, if the law is signed, companies could start rolling out changes well before 2024.
What if companies don’t want to comply? Would the threat of a possible lawsuit from the California Attorney General be enough to get them to act?
“I’m going to be following this very closely,” said Buffy Wicks, a Democratic Assemblymember from Oakland and one of the bill’s authors. The Legislature could pass another bill if enforcement of the law needs refinement, she said. “We can sit here and make policies all day, but if they’re not implemented, if they’re not enforced, what’s the point?”