The Trump administration’s new Executive Order seeks to preempt strict state regulations on private AI development to reduce compliance burdens, but it explicitly leaves state and local government authority over internal AI use and procurement intact. While federal agencies move to challenge provisions of state laws governing private-sector AI, public agencies retain full responsibility for managing their own AI tools, making internal governance a critical and immediate priority.

The Executive Order (EO) issued yesterday by the Trump administration has generated significant attention, but it offers limited operational clarity. While the direction itself is not surprising, the immediate implications, particularly for local governments, remain unclear. At its core, the order signals a definitive policy trajectory: the establishment of a minimally burdensome national AI regulatory framework aimed at removing obstacles to U.S. leadership in artificial intelligence, specifically including the potential challenge and preemption of state laws deemed inconsistent with federal policy.
For local governments and public agencies committed to operating responsibly, efficiently, and transparently, this raises legitimate questions. Should AI initiatives be paused? Are local AI policies still relevant? Does the Executive Order change what is permitted or expected of us? These are reasonable concerns. This article aims to provide clarity by examining the Executive Order through the lens of local AI governance, focusing on what has changed, what has not, and how local agencies should think about the period ahead.
The Executive Order, in Brief
The Executive Order is primarily focused on state-level regulation of the private AI market: specifically, state laws that regulate AI developers, models, and commercial AI services. The administration’s central argument is that a fragmented regulatory environment, described as “a patchwork of 50 different regulatory regimes,” creates compliance challenges, increases costs for innovators, and undermines national competitiveness against global rivals like China.
In response, the order initiates a decisive federal effort to challenge and deter certain state AI laws viewed as particularly “onerous” or restrictive toward the private sector. The administration specifically targets state rules that may “require AI models to alter their truthful outputs” or mandate disclosures that could violate the First Amendment or other Constitutional rights. For example, the administration has pointed to state laws that attempt to embed certain ideologies, such as Diversity, Equity, and Inclusion (DEI), into AI models.
The EO mobilizes federal agencies to aggressively dismantle state-level AI regulation:
- DOJ: Establish an AI Litigation Task Force to challenge (sue) state laws deemed unconstitutional, preempted, or otherwise unlawful.
- Commerce: Evaluate "onerous" state AI laws and potentially withhold non-deployment BEAD (Broadband Equity, Access, and Deployment) funding from non-compliant states.
- FTC/FCC: Examine state laws and explore adopting a federal disclosure standard to assert federal primacy and preempt conflicting state requirements.
What Has Not Changed for Local Government
Amid ongoing discussion about what may evolve at the federal and state regulatory levels, it is equally important for local governments to understand what the Executive Order explicitly does not preempt or change.
From an operational standpoint, the core responsibilities of states, cities, counties, and local public agencies in managing AI remain firmly in place. The EO specifies that its legislative recommendation shall not propose preempting otherwise lawful state AI laws relating to:
- Child safety protections (e.g., laws addressing deepfake exploitation).
- AI compute and data center infrastructure (e.g., local permitting reforms).
- State government procurement and use of AI.
This means that local agencies retain full authority over:
- Internal AI Use: AI tools used for operational efficiency, service delivery, data analysis, decision support, or administrative functions are not regulated by this EO.
- AI Procurement and Vendor Management: The EO does not alter how agencies evaluate AI-enabled products, issue RFPs, negotiate contracts, or require disclosures from vendors regarding data use, security, documentation, and oversight.
Just as importantly, existing legal and compliance obligations remain fully intact. The Executive Order does not suspend or override requirements related to records retention, privacy and data protection, cybersecurity, transparency, or public accountability. Of course, the specific obligations of any local agency continue to depend on applicable state and local law. The Executive Order does not displace those requirements, nor does it create new exemptions from them.
What Happens Next: Expected Timelines and Ongoing Uncertainty
One of the most common questions following the Executive Order is what will actually happen next, and when. At this stage, it is important to be clear: many of the next steps outlined in the order remain undefined, and their practical implications are still uncertain. While the order sets expectations around timing, it does not provide detailed information about how reviews will be conducted, what criteria will be applied, or how quickly outcomes will emerge.
- Near Term (Next 30–90 Days): Federal agencies, including the DOJ’s AI Litigation Task Force and the Commerce Department, are directed to begin identifying and reviewing existing and proposed state-level AI laws. However, existing state and local obligations remain fully in effect, as the EO does not immediately invalidate any law. This phase will be largely internal and technical, involving legal and policy analysis.
- Medium Term (Ongoing Litigation): Legal challenges are certain to follow as the Task Force sues to block state laws (e.g., on the grounds that they unconstitutionally regulate interstate commerce). States like Colorado and California, with existing AI laws on algorithmic discrimination or transparency, are likely targets. For local governments, the impact of this stage is largely indirect, affecting the regulatory environment for your private-sector AI vendors.
- Longer Term (Legislative Action): The Executive Order explicitly calls for the development of a national AI legislative framework to preempt state laws that conflict with the policy set forth in the order. This legislative process is often slow and involves significant political and institutional negotiation. There is no clear timeline for when or whether comprehensive federal rules will emerge from Congress.
For local agencies, this means policy uncertainty is likely to persist. The absence of immediate clarity should not be interpreted as a pause in responsibility, but rather as a signal that agencies must continue operating in an evolving environment.
Why Local AI Governance Is Now an Operational Imperative
Taken together, these developments define the reality local governments face today. While the federal government signals an aggressive effort to restrain and roll back state regulation of the private AI market, the responsibility for how AI is actually used inside public institutions has shifted decisively inward.
The Executive Order does not replace state law, does not provide operational guidance for public agencies, and does not resolve the practical questions local agencies already confront. Instead, it leaves greater discretion and therefore greater accountability at the local level. This is where the distinction between regulation and governance becomes decisive. Regulation debates what the market should be allowed to do. Governance determines how AI is approved, managed, documented, and explained inside public institutions.
In a moment of regulatory uncertainty and evolving policy timelines, AI systems are already influencing public decisions, and expectations around transparency, accountability, and fairness remain unchanged. As a result, local AI governance is no longer a forward-looking best practice; it is an immediate operational necessity for responsible public administration today.