- United States
- Va.
- Letter
I am writing to urge you to support comprehensive platform accountability measures rather than blanket social media bans for minors. The United Kingdom's Online Safety Act, passed in 2023, offers a proven alternative that addresses the root problem: dangerous content and algorithmic harm.
Ian Russell's daughter Molly died by suicide at age 14 in November 2017. An inquest found she died from an act of self-harm while suffering from depression and the negative effects of online content. In her final six months, Molly interacted with 2,100 pieces of harmful content on Instagram alone, with only 12 days in which she did not engage with such material. Algorithms fed her increasingly disturbing images and videos about suicide and self-harm, including messages like "Fat. Ugly. Worthless. Suicidal." When Russell initially reported this content to Instagram, the platform replied that it did not violate community guidelines.
Despite widespread support for an under-16 social media ban in the UK, including from 74% of British adults according to a December YouGov poll, Russell opposes it. Instead, he advocates the Online Safety Act approach, which requires platforms to conduct robust age verification and prevent harmful content from reaching children, and which gives regulators the power to fine or block non-compliant platforms.
This legislation has already demonstrated effectiveness. When X integrated AI tools that could create deepfake child sexual abuse images, the regulator Ofcom opened an investigation, the government threatened fines or a block, and Elon Musk reversed course within days. Strong enforcement works.
Russell proposes platform-by-platform age classification rather than blanket bans. Safer platforms could be rated as suitable for 13-year-olds, while others might carry 16+ or 18+ ratings. This incentivizes platforms to create safer environments to attract younger users while protecting vulnerable groups, such as LGBTQ+ and neurodiverse children, who rely on online support networks.
I urge you to pursue legislation modeled on the UK's Online Safety Act, holding platforms accountable for the content their algorithms promote to children, rather than implementing bans that children will circumvent while companies escape responsibility.