Demand Federal Child Protection Standards for AI Technology
Congress must pass legislation requiring AI companies to implement mandatory child protection safeguards before their technology reaches the market. Three Tennessee teenagers just filed a lawsuit against xAI after its algorithm powered an app that created realistic, nonconsensual nude images and videos of them as minors. The perpetrator used photos from yearbooks, social media, and direct messages to generate child sexual abuse material that was traded online with images of 18 other victims.
This isn't an isolated incident. xAI's image generation tools have been implicated in producing millions of sexualized images over the past year. While Google and OpenAI embed digital watermarks that identify AI-generated content, xAI has refused to adopt these basic standards. The company is deliberately licensing its technology to app makers outside the U.S. to dodge liability for what it knows is a dangerous tool.
Federal law must require AI companies to embed detection mechanisms in their models, prohibit the generation of sexualized content depicting minors, and hold companies liable when their technology is used to create child sexual abuse material. Voluntary industry standards have failed. These companies won't act until Congress forces them to choose between protecting children and protecting profits.