Seattle’s public school system on Friday filed a lawsuit against several Big Tech companies, alleging their platforms have a negative impact on students’ mental health and claiming that harm has impeded the district’s ability “to fulfill its educational mission.”
The lawsuit was filed against the parent companies of some of the most popular social media platforms, including Facebook, Instagram, TikTok, Snapchat and YouTube.
The school district, which is the largest in the state of Washington with nearly 50,000 students, alleges in the suit that the companies “have successfully exploited the vulnerable brains of youth” to maximize how much time users spend on their platforms in order to boost profits. The actions taken by the platforms, according to the suit, have “been a substantial factor in causing a youth mental health crisis, which has been marked by higher and higher proportions of youth struggling with anxiety, depression, thoughts of self-harm, and suicidal ideation.”
The school district said students experiencing anxiety, depression, and other mental health issues perform worse in school, are less likely to attend school, and are more likely to engage in substance use and act out. The district said it continues to take additional steps to train teachers and to screen students for mental health symptoms to identify those who may need further support, but it needs a comprehensive, long-term plan and funding amid the growing mental health crisis today’s “youth are experiencing at [the companies’] hands.”
The school district is seeking unspecified monetary damages.
The lawsuit comes more than a year after executives from social media platforms faced tough questions from lawmakers during a series of congressional hearings over how their platforms may direct younger users – and particularly teenage girls – to harmful content, damaging their mental health and body image. While a growing number of families have filed lawsuits against social media companies for their alleged impact on the mental health of their children, it’s unusual to see a school district take such a step.
In a statement sent to CNN on Monday, Antigone Davis, Meta’s global head of safety, said the company continues to pour resources into keeping its young users safe online. She said its platforms offer more than 30 tools to support teens and families, including supervision tools that let parents limit the amount of time their teens spend on Instagram and age verification technology that helps teens have age-appropriate experiences.
“We’ll continue to work closely with experts, policymakers and parents on these important issues,” she said.
The other companies did not immediately respond to requests for comment.
In the past year, a number of prominent social media platforms have introduced more tools and parental control options aimed at better protecting younger users amid mounting scrutiny.
TikTok, which has faced pressure from lawmakers both for its potential impact on younger users and for its ties to China, announced in July that it would introduce new ways to filter out mature or “potentially problematic” videos. The added safeguards assign a “maturity score” to videos detected as potentially containing mature or complex themes. TikTok also rolled out a tool that aims to help people decide how much time they want to spend on the app.
Snapchat, meanwhile, has introduced a parent guide and hub aimed at giving guardians more insight into how their teens use the app. That includes more information about who their kids have been talking to over the last week, without divulging the content of those conversations.