Two years after testifying before the U.S. Senate over child safety concerns, including teens’ exposure to eating disorder content, YouTube today announced additional safeguards for teens around its content recommendations. Specifically, the company said it would limit repeated recommendations of videos on topics that can trigger body image issues, like those comparing physical features or idealizing certain body types, weights, or fitness levels. Separately, it will also limit repeat viewing of videos that display “social aggression” in the form of non-contact fights or intimidation.
YouTube said that some of these videos may be innocuous when viewed on their own but could become problematic when teens watch the same type of content repeatedly. Because YouTube’s recommendations are driven by the content users engage with, repeated exposure can compound on its own, which is why such controls are needed.
The company said it will limit repeated recommendations of these videos in the U.S. to start, with more countries to follow next year — a signal that YouTube is trying to get out ahead of proposed child safety regulations, like the bipartisan Kids Online Safety Act (KOSA) introduced last year following the hearings on teen mental health. The bill recently added Senator Elizabeth Warren (D-MA) as a co-sponsor, after its formal reintroduction in May by Senators Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT).
Alongside the changes to recommendations, YouTube said it will also revamp its “take a break” and “bedtime” reminders first introduced in 2018. Now, these features will become “more visually prominent” and will appear more often to viewers under the age of 18. The features are turned on by default in the account settings, YouTube notes, and will now appear as full-screen takeovers on YouTube Shorts and long-form videos, with the default “take a break” reminder set for every 60 minutes. Competitor TikTok already has similar reminders, in the form of short-form videos that pop up in its For You feed to suggest when users have been scrolling too long.
YouTube says it will also expand its crisis resource panels into full-page experiences when viewers explore topics related to suicide, self-harm, and eating disorders. Here, they’ll see resources like third-party crisis hotlines, along with suggested prompts to steer them toward different topics, like “self-compassion” or “grounding exercises,” YouTube says.
To develop its new standards, YouTube says it’s partnered with the World Health Organization (WHO) and Common Sense Networks, an affiliate of Common Sense Media. The latter will help YouTube produce new educational resources for parents and teens, including “guidance on developing intentional and safe online habits, creating content with empathy and awareness, and best practices for approaching comments, shares, and other online interactions,” the company said. Meanwhile, the WHO and the British Medical Journal will host a roundtable of experts to examine strategies around teen mental health, in terms of providing resources and information online. A report is expected to be published in early 2024.