Two new bills meant to protect children’s mental health online by changing the way they are served content on social media and by limiting companies’ use of their data will be introduced in the New York state legislature, state and city leaders said Wednesday.
New York Gov. Kathy Hochul and New York Attorney General Letitia James made the announcement at the headquarters of the United Federation of Teachers in Manhattan, joined by UFT President Michael Mulgrew, State Senator Andrew Gounardes, Assemblywoman Nily Rozic and community advocates.
“Our children are in crisis, and it is up to us to save them,” Hochul said, comparing social media algorithms to cigarettes and alcohol. “The data around the negative effects of social media on these young minds is irrefutable, and knowing how dangerous the algorithms are, I will not accept that we are powerless to do anything about it.”
The “Stop Addictive Feeds Exploitation (SAFE) for Kids Act” would limit what New York officials say are the harmful and addictive features of social media for children. The act would allow users under 18 and their parents to opt out of receiving feeds driven by algorithms designed to harness users’ personal data to keep them on the platforms for as long as possible. Those who opt out would receive chronological feeds instead, like in the early days of social media.
The bill would also allow minors and their parents who opt in to algorithmically generated feeds to block access to social media platforms between midnight and 6 a.m., or to limit the total number of hours a minor can spend on a platform each day.
“This is a major issue that we all feel strongly about and that must be addressed,” James said. “Nationwide, children and teens are struggling with significantly high rates of depression, anxiety, suicidal thoughts and other mental health issues, largely because of social media.”
The bill targets platforms like Facebook, Instagram, TikTok, Twitter and YouTube, where feeds are made up of user-generated content along with other material the platform suggests to users based on their personal data. Tech platforms have designed and promoted voluntary tools aimed at helping parents control what content their kids can see, arguing that decisions about what boundaries to set should be left to individual families. But that hasn't stopped critics from calling on platforms to do more, or from threatening further regulation.
“Our children deserve a safer and more secure environment online, free from addictive algorithms and exploitation,” said Gounardes. “Algorithms are the new tobacco. Simple as that.”
The New York legislation comes amid a raft of similar bills across the country that purport to safeguard young users by imposing tough new rules on platforms.
States including Arkansas, Louisiana and Utah have passed bills requiring tech platforms to obtain a parent’s consent before creating accounts for teens. Federal lawmakers have introduced a similar bill that would ban kids under 13 from using social media altogether. And numerous lawsuits against social media platforms have accused the companies of harming users’ mental health. The latest of these suits came on Tuesday, when Utah’s attorney general sued TikTok for allegedly misleading consumers about the app’s safety.
Mulgrew called the New York legislation necessary in part due to a lack of action by the federal government to protect kids.
“The last time, first and only time that the United States government passed a bill to protect children in social media was 1998,” Mulgrew said, referring to the Children’s Online Privacy Protection Act (COPPA), a federal law that prohibits the collection of personal data from Americans under the age of 13 without parental consent. In July, the US Senate Commerce Committee voted to advance a bill that would expand COPPA’s protections to teens for the first time.
New York officials on Wednesday also highlighted risks to children’s privacy online, including the chance their location or other personal data could fall into the hands of human traffickers and others who might prey on youth.
“While other states and countries have enacted laws to limit the personal data that online platforms can collect from minors, no such restrictions currently exist in New York,” a press release from earlier Wednesday stated. “The two pieces of legislation introduced today will add critical protections for children and young adults online.”
The New York Child Data Protection Act would prohibit online sites from collecting, using, sharing or selling the personal data of anyone under 18 for advertising purposes without informed consent, unless doing so is strictly necessary for the purpose of the website. For users under 13, that informed consent would have to come from a parent or guardian.
Both bills would authorize the attorney general to bring an action to enjoin violations or to seek damages or civil penalties of up to $5,000 per violation, and would allow parents or guardians of minors to sue for up to $5,000 per user per incident or for actual damages, whichever is greater.
The US Department of Health and Human Services says that while social media provides some benefits, it also presents “a meaningful risk of harm to youth.” The Surgeon General’s Social Media and Youth Mental Health Advisory released in May said children and adolescents who spend more than three hours a day on social media face double the risk of mental health problems like depression and anxiety, a finding the report called “concerning” given a recent survey that showed teens spend an average of 3.5 hours a day on social media.