On 11 March 2026, two deadlines from Trump’s December AI executive order land simultaneously. The Commerce Department must identify state AI laws it considers burdensome, and the FTC must describe when those laws are preempted by federal consumer protection rules. For businesses with US operations, both reports will shape compliance decisions for the rest of 2026.


The Executive Order and What It Does

On 11 December 2025, the Trump administration issued the “Ensuring a National Policy Framework for Artificial Intelligence” executive order. The order frames the patchwork of state AI regulations as a threat to national AI competitiveness and directs several federal agencies to take action within defined timeframes.

The order’s core concern is that states, particularly Colorado, are requiring AI developers to alter outputs to avoid differential impacts on protected groups. The administration’s legal theory is that forcing AI models to modify their outputs in this way compels them to produce results that are less faithful to underlying data, which it frames as a form of compelled deception under existing federal consumer protection law. That theory is novel and untested in court. Businesses should treat it as the administration’s stated position, not settled law.

The order directed the creation, within 30 days of signing, of a Department of Justice AI Litigation Task Force with a mandate to challenge state AI laws in federal court on grounds including the Dormant Commerce Clause. The task force has been operational since around 10 January 2026.


What the Two March 11 Reports Are Expected to Cover

The Commerce Department evaluation must identify existing state AI laws that conflict with federal policy, are “overly burdensome,” or require AI systems to alter “truthful outputs.” Colorado’s Artificial Intelligence Act (SB 24-205) is named in the order as an example. California’s transparency requirements and New York’s proposed algorithmic accountability legislation are also candidates.

The Commerce report feeds into the DOJ’s AI Litigation Task Force for potential legal action against specific state laws. One important carve-out: the order expressly exempts child safety protections from the preemption framework. Since most state chatbot bills focus on protecting minors, the core provisions of those bills are likely shielded.

Colorado’s act takes effect on 30 June 2026, making it the nearest-term state deadline affected by the preemption push.

The FTC policy statement must describe how the FTC Act’s prohibition on deceptive practices applies to AI, and when state laws requiring alterations to AI outputs are preempted. The statement is interpretive rather than a binding regulation, and courts may not accept the underlying premise that bias mitigation constitutes deception.


The BEAD Leverage

One of the order’s more direct mechanisms is financial. The order instructs the Commerce Department to condition approximately USD 21 billion in remaining non-deployment funds from the Broadband Equity, Access and Deployment (BEAD) programme on states avoiding “onerous” AI laws (Baker Botts via JD Supra, March 2026). BEAD provides broadband infrastructure grants, and conditioning those funds gives the federal government leverage over state legislatures currently weighing AI legislation.

The White House has already used informal pressure. In February 2026, Axios reported that the White House Office of Intergovernmental Affairs wrote to Utah lawmakers asking them to kill an AI transparency bill, and the bill was subsequently withdrawn.


The Compliance Paradox

The executive order was framed as reducing the regulatory burden on businesses. In practice, it has increased compliance complexity. Businesses now face a period of genuine uncertainty in which federal agencies are challenging state laws, state attorneys general are defending their authority, and courts have not yet ruled.

A bipartisan coalition of 36 state attorneys general has warned Congress that federal preemption of state AI laws would have “disastrous consequences” (Baker Botts, March 2026 analysis). That opposition is not going away regardless of what the March 11 reports say.

Legal advisors across multiple firms have converged on the same guidance: continue building compliance with the most stringent applicable state requirements while monitoring federal developments. If preemption is ultimately confirmed for a specific law, compliance work built to the higher standard is not wasted. It typically satisfies the federal baseline as well. Building a compliance programme around anticipated federal preemption creates material execution risk.


What Businesses Should Do Before and After March 11

The period between now and 30 June 2026 is the working window for Colorado compliance. Several actions are time-sensitive.

Businesses should complete an AI use inventory: what systems are deployed, in which states, interacting with which employee and customer populations, and subject to which state-specific obligations. That inventory should identify requirements that may be affected if specific state laws are preempted, and also document which requirements have no preemption risk because they are grounded in existing privacy, employment, or civil rights law.
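For teams that want to operationalise that inventory, the record structure above can be sketched in a few lines. This is a minimal illustration, not a prescribed schema: the class, field names, and the `grounded_in_existing_law` flag are all assumptions introduced here to show how preemption exposure could be separated out.

```python
from dataclasses import dataclass

@dataclass
class AIUseRecord:
    """One entry in an AI use inventory. All field names are illustrative."""
    system: str                      # deployed AI system
    states: list[str]                # states where it operates
    populations: list[str]           # e.g. employees, customers
    obligations: list[str]           # state-specific requirements
    grounded_in_existing_law: bool   # privacy/employment/civil-rights basis

def split_by_preemption_risk(inventory: list[AIUseRecord]):
    """Separate obligations with no preemption exposure (grounded in
    existing privacy, employment, or civil rights law) from those that
    may be affected if a specific state AI law is preempted."""
    stable = [r for r in inventory if r.grounded_in_existing_law]
    at_risk = [r for r in inventory if not r.grounded_in_existing_law]
    return stable, at_risk

inventory = [
    AIUseRecord("resume-screening model", ["CO"], ["job applicants"],
                ["impact assessment under SB 24-205"], False),
    AIUseRecord("customer chatbot", ["CA"], ["customers"],
                ["disclosure grounded in existing consumer-privacy law"], True),
]
stable, at_risk = split_by_preemption_risk(inventory)
```

Even a spreadsheet with these five columns achieves the same goal; the point is that each entry records both the obligation and its legal basis, so the post-March 11 review can triage quickly.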

Legal teams should schedule a review within 48 hours of the March 11 reports being published. The actual text of the Commerce evaluation and FTC policy statement will determine how much pre-publication analysis holds and where businesses need to adjust.

Governance programmes should be designed for adaptability, not optimised for a specific regulatory scenario. The organisations that navigate this period best will be those with documented AI controls, clear accountability structures, and the ability to adjust quickly as court decisions clarify the legal landscape. For a practical starting point on structuring that governance, the AI governance framework guide lays out what a working framework actually contains.

Businesses tracking this alongside international developments should also read the EU AI Act guide for non-EU businesses. The US and EU compliance timelines overlap significantly through mid-2026.

For the Ontario equivalent to these federal moves, see the analysis of the IPC-OHRC AI principles.


Stay across AI governance and compliance developments. Subscribe to the Shadow AI Watch newsletter.


Sources