A three-bill package designed to protect kids online would change how some of the most vulnerable New Jerseyans interact with social media.
Assemblywoman Andrea Katz (D-Burlington), the legislation’s sponsor, says the technology exists to make it happen. Her Kids Code Act would require content sharing platforms to change the default settings for minors’ accounts.
Katz says companies such as Meta, which owns Instagram, can determine users’ true ages — even if they submit false information when they register.
“Even if the kids say, ‘Hey, I’m older than you think I am,’ through all the data that these companies collect, they know how old these kids are,” Katz told NJ Spotlight News. “So when we have a known minor on a social media platform, their account is gonna be automatically defaulted to the highest security levels, and it’s not gonna be just as easy as flipping one little switch and turning that off. It’s gonna be a lot more complex.”
In 2023, then-U.S. Surgeon General Vivek Murthy issued an advisory, “Social Media and Youth Mental Health,” that identified harmful consequences of social media use for young people. Last year, the New Jersey Commission on the Effects of Social Media on Adolescents recommended that social media platforms restrict access for those younger than 16 and “develop and improve resources to prevent cyberbullying, discrimination and child exploitation.”
Katz’s second bill would require “black-box” warnings about mental health risks on such sites — a step Murthy suggested nationally — plus monitors showing how long young users spend on the apps. Katz says “doom-scrolling,” the habitual consumption of negative news or posts, is particularly harmful.
“We want to make sure that our kids are not having to scroll endlessly all night long — that they’re not being delivered content that is inappropriate for them,” Katz said. “We know that they have the tools necessary to protect our kids, and we need to make sure that they’re doing that.”
A third bill would create a social media research center at a four-year college or university to study social media’s impact on children, particularly adolescents, and recommend ways to protect young users.