SUMMARY President Trump’s December 11, 2025 executive order attempts to override state AI laws by creating a Justice Department task force to sue states and threatening to withhold federal funding. Legal experts say the order is unconstitutional since only Congress can preempt state laws. The move has united an unusual coalition of MAGA Republicans and progressive Democrats against what they view as tech billionaire influence undermining state consumer protections.
President Trump signed an executive order on December 11, 2025 that’s causing controversy in Washington. The order attempts to stop states from regulating artificial intelligence, claiming that a patchwork of different state laws will hurt America’s ability to compete with China. But legal experts across the political spectrum say the president doesn’t have the constitutional authority to do what this order attempts. The resulting battle could reshape how America governs AI for years to come.

What the Order Actually Does
The executive order creates an “AI Litigation Task Force” within the Justice Department with one job: sue states over their AI laws. The Attorney General has 30 days to establish this task force to challenge state laws on grounds that they unconstitutionally regulate interstate commerce or are otherwise unlawful.
But lawsuits aren’t the only tool. The order also threatens federal funding. States with AI laws the administration considers “onerous” could lose billions in broadband funding and other federal grants. The Commerce Department gets 90 days to identify problematic state laws, and all federal agencies must examine whether they can condition grants on states not enforcing their AI regulations.
The order does include exceptions: it specifies that any future federal framework should not preempt state rules on child safety, data center infrastructure or government AI procurement. But those carve-outs apply only to hypothetical future legislation, not to the order’s immediate effects.
The Constitutional Problem
Here’s the fundamental issue: under the Constitution, preemption of state law comes from federal statutes passed by Congress, not from presidential directives. The president can’t simply declare state laws invalid through executive order, no matter how strongly worded that order might be.
Florida Governor Ron DeSantis put it bluntly: “An executive order doesn’t/can’t preempt state legislative action. Congress could, theoretically, preempt states through legislation.” John Bergmayer of the nonprofit Public Knowledge told NPR that while the administration is trying to bypass Congress with various legal theories, “legally, I don’t think they work very well.”
The Center for American Progress called the order “an unprecedented, unconstitutional, and dangerous assertion of the federal executive branch into the powers of state and local government.” Legal analysis from multiple law firms reaches the same conclusion: courts have consistently held that executive orders cannot negate duly enacted state legislation.
Brad Carson of Americans for Responsible Innovation predicted the order will “hit a brick wall in the courts.” Multiple states, led by Colorado, have already committed to legal challenges.
Colorado’s Law: Ground Zero
The executive order specifically names Colorado’s AI Act as an example of problematic regulation. Colorado’s law, the first comprehensive AI legislation in the United States, takes effect in June 2026. Understanding what Colorado did helps explain what’s at stake.
Colorado’s law focuses on “high-risk” AI systems: those that make, or substantially influence, consequential decisions about people’s education, employment, financial services, housing, healthcare or legal services. The law requires these systems to avoid “algorithmic discrimination,” meaning unlawful differential treatment based on race, gender, disability or other protected characteristics. Companies developing or deploying high-risk AI must take reasonable care to prevent discrimination. They must conduct impact assessments, notify people when AI makes decisions about them and provide ways to appeal adverse decisions.
The Trump administration claims Colorado’s law “may even force AI models to produce false results” to avoid discrimination. Colorado officials dispute this, noting the law simply requires AI systems not discriminate against people based on protected characteristics, much like existing civil rights laws. Colorado Attorney General Phil Weiser announced the state will challenge the executive order in court.
Why Tech Companies Support the Order
The tech industry has lobbied heavily for federal preemption. Companies like OpenAI, Google and venture firm Andreessen Horowitz argue that navigating 50 different state regulatory regimes creates a “compliance nightmare” that stifles innovation. Trump told reporters that companies need “one source of approval” rather than going to California, New York and other states separately. The White House notes that state legislatures have introduced over 1,000 AI bills, creating what they call fragmented regulation that undermines America’s competitive position against China.
For startups in particular, the promise of uniform national standards instead of varying state requirements sounds appealing. The argument resonates with anyone who has built a business across state lines.
The Uncertainty Paradox
Here’s where the order’s logic breaks down. It promises to reduce regulatory uncertainty by creating one national framework. But legal experts and even some in the tech industry say it does the opposite.
Andrew Gamino-Cheong of AI governance company Trustible told TechCrunch that the order will backfire on innovation. “Big Tech and the big AI startups have the funds to hire lawyers to help them figure out what to do, or they can simply hedge their bets. The uncertainty does hurt startups the most, especially those that can’t get billions of funding almost at will.”
The problem is straightforward: state laws remain enforceable unless courts block them or states voluntarily pause enforcement. Companies now face extended legal battles with unclear outcomes. Do they comply with state laws and risk federal disapproval? Do they ignore state laws and risk state enforcement? For startups without armies of lawyers, this creates paralysis rather than clarity.
Gary Kibel of Davis + Gilbert told TechCrunch that while businesses would welcome a single national standard, “an executive order is not necessarily the right vehicle to override laws that states have duly enacted.”
An Unexpected Political Coalition
Perhaps the most surprising aspect is who’s opposing the order. It’s not the usual partisan divide. Instead, you have progressive Democrats and MAGA populists united against what they see as tech billionaire influence.
Steve Bannon, one of Trump’s most loyal supporters, harshly criticized the move: “After two humiliating face plants on must-pass legislation now we attempt an entirely unenforceable EO—tech bros doing upmost to turn POTUS MAGA base away from him while they line their pockets.”
Ron DeSantis has proposed his own AI Bill of Rights focusing on protecting children from deepfakes and unauthorized use of people’s images. Other Republican governors including Arkansas’s Sarah Huckabee Sanders and Utah’s Spencer Cox have expressed similar concerns.
On the Democratic side, Senators Elizabeth Warren, Ed Markey and Brian Schatz oppose federal preemption, arguing it would “prevent states from responding to the urgent risks posed by rapidly deployed AI.”
NBC News characterized this as “a coalition of almost everyone against a few extreme tech billionaires.” A YouGov poll found that surveyed adults opposed congressional efforts to block state AI regulation.
This creates fascinating dynamics for future Republican politics. Senator Ted Cruz, who stood beside Trump at the signing ceremony, represents the tech-friendly wing. DeSantis represents a competing vision emphasizing state authority and consumer protection. As one Republican strategist told The Hill, their positions paint “an interesting picture as to two potential guys who could be vying for the party’s nomination in a few years.”
Child Safety Concerns
Conservative family advocacy groups criticized the order’s approach to protecting children. Michael Toscano of the Family First Technology Initiative told NPR the order represents “a huge lost opportunity by the Trump administration to lead the Republican Party into a broadly consultative process” on protecting children from AI harms.
White House AI czar David Sacks said “kid safety, we’re going to protect” and the order won’t push back on that. But critics note this promise only applies to hypothetical future legislation. States with comprehensive AI laws including child safety provisions could still face legal challenges and funding restrictions now.
What Happens Next
The Trump administration tried multiple times this year to halt state AI regulation through Congress and failed each time. In December, GOP lawmakers couldn’t insert AI preemption into the defense spending bill. In July, the Senate dropped an AI moratorium from budget legislation. These failures explain why the administration turned to an executive order, but they also highlight the lack of legislative support.
Multiple states will challenge the order in court. These cases involve fundamental questions of federalism and separation of powers that will likely reach the Supreme Court. Legal analysis from Littler notes that while agencies may implement some directives, the sweeping attempt to override state laws faces near-certain judicial scrutiny.
State lawmakers from both parties say they’ll continue passing AI legislation regardless, seeing AI regulation as a core state responsibility for protecting constituents.
The Bottom Line
This executive order represents one of the most aggressive federal assertions of power over state regulatory authority in recent memory. Whether you view it as necessary leadership or unconstitutional overreach depends on how you weigh competing values: innovation versus consumer protection, uniformity versus local control.

For now, the order creates more questions than answers. Companies face uncertainty about whether to comply with state laws. States are preparing court battles to defend their authority. Consumers have no clear sense of what protections they’ll have or who will enforce them.
The only certainty is that the fight over who regulates AI is far from over. Watch for the first court challenges in the coming months; how they are resolved will shape American technology policy for years to come.
You may also enjoy:
- California’s Transparency in Frontier AI Act: A New Standard for AI Safety
- America’s AI Action Plan: A Legal Perspective
- Council of Europe Adopts AI Convention
- US Department of Labor AI Guidelines