The federal government is laying the groundwork for using artificial intelligence to monitor protests and suppress dissent, according to a new report from the House Select Subcommittee on the Weaponization of the Federal Government.
The report claims the executive branch has invested millions of taxpayer dollars in AI tools designed to censor speech, citing examples of censorship abroad as warnings of potential misuse at home. It highlights partnerships between U.S. allies and firms such as Logically.AI that were used to suppress protests against COVID-19 measures, including Canada’s trucker convoy in Ottawa in early 2022.
The report alleges Biden’s executive orders on AI fairness and bias have influenced controversial developments like Google’s Gemini AI chatbot, which faced criticism for avoiding depictions of white figures in its image generation. Internal documents and testimony from Alphabet employees suggest federal pressure played a role in shaping these models.
“The Biden-Harris Administration has regulated new AI models directly and indirectly,” the report states, accusing the government of pushing AI companies to enforce policies under the guise of “equity” and “algorithmic fairness.” It warns that such actions could enable the federal government to suppress dissenting views and infringe on First Amendment rights.
The report also points to federal funding, including grants from the National Science Foundation and the State Department, for AI projects aimed at combating “misinformation.” It argues that such initiatives risk embedding government-preferred biases in AI models, threatening free expression.
In response, the Weaponization Subcommittee is calling for restrictions on federal involvement in AI development. It urges Congress to end funding for AI research related to content moderation, roll back federal regulatory authority over AI, and oppose global efforts to regulate AI and lawful speech. The subcommittee also promoted the Censorship Accountability Act, which would require transparency for federal communications related to content moderation.
Read the report here.