- United States
- Pa.
- Letter
I am writing to urge you to demand public hearings before the Pentagon or any federal agency deploys artificial intelligence tools for surveillance of Americans. Recent reports reveal that Defense Secretary Pete Hegseth is threatening to designate Anthropic, maker of the Claude AI model, as a supply chain risk because the company refuses to allow its technology to be used for mass surveillance of Americans without safeguards.
The Pentagon insists on using AI tools for "all lawful purposes" without restrictions. Existing law may technically permit collecting vast amounts of information about civilians, including social media posts and concealed carry permits, but privacy advocates warn that current surveillance law was never written with AI in mind. Artificial intelligence could supercharge these authorities in ways that fundamentally transform government surveillance power.
This is not a theoretical concern. Claude is currently the only AI model deployed in the military's classified systems and was used during the Maduro raid in January. The Pentagon is now negotiating similar terms with OpenAI, Google, and xAI, and a senior administration official has expressed confidence that these companies will accept the "all lawful use" standard that Anthropic is resisting. These negotiations are setting the precedent for how AI will be used across defense and intelligence agencies.
The American people deserve a voice in this decision. The deployment of AI for domestic surveillance represents a qualitative leap in government capability that could affect every citizen. Before the Pentagon forces AI companies to remove safeguards or severs ties with those that refuse, Congress must hold public hearings to examine the implications, establish clear limits, and ensure meaningful oversight.
I urge you to call for immediate public hearings on the use of AI for surveillance of Americans and to oppose any Pentagon contracts that lack explicit prohibitions on domestic mass surveillance applications.