YouTube on Tuesday announced a series of changes to how it deals with content related to eating disorders.
The platform has long removed content that glorifies or promotes eating disorders, and YouTube’s Community Guidelines will now also prohibit content that features behaviors at-risk users could be inspired to imitate, such as purging after eating or extreme calorie counting. Videos that feature such “imitable behaviors” in the context of recovery will be allowed to remain on the platform, but will be restricted to users who are logged in and over the age of 18.
The policy changes, developed in consultation with the National Eating Disorder Association and other nonprofit organizations, aim to ensure “that YouTube creates space for community recovery and resources, while continuing to protect our viewers,” YouTube’s Global Head of Healthcare Garth Graham told CNN in an interview.
“We’re thinking about how to thread the needle in terms of essential conversations and information that people might have,” Graham said, “allowing people to hear stories about recovery and allowing people to hear educational information but also realizing that the display of that information … can serve as a trigger as well.”
The changes come as social media platforms have faced increased scrutiny for their effects on the mental health of users, especially young people. In 2021, lawmakers called out Instagram and YouTube for promoting accounts featuring content depicting extreme weight loss and dieting to young users. And TikTok has faced criticism from an online safety group that claimed the app served eating disorder-related content to teens (although the platform pushed back against the research). The changes also follow several updates by YouTube in recent years to how it handles misinformation about medical issues such as abortion and vaccines.
In addition to removing or age-restricting some videos, YouTube plans to add panels pointing viewers to crisis resources under eating disorder-related content in nine countries, with plans to expand to more areas. And when a creator’s video is removed for violating its eating disorder policy, Graham said YouTube will send them resources about how to create content that’s less likely to harm other viewers.
As with many social media policies, however, the challenge often isn’t introducing a policy but enforcing it, and YouTube may struggle to discern, for example, which videos are genuinely pro-recovery. YouTube said it will be rolling out enforcement of the policy globally in the coming weeks, and plans to use both human and automated moderation to review videos and their context.
“These are complicated, societal public health [issues],” Graham said. “I want never to profess perfection, but to understand that we have to be proactive, we have to be thoughtful … it’s taken a while to get here because we wanted to articulate a process that had different layers and understood the challenges.”