California just gave agencies 120 days to embed AI governance into procurement. Vendors must now prove safeguards around bias, civil rights, and content safety, and agencies need a clear inventory of every AI tool they use. AI governance isn't optional anymore. It's becoming a requirement.
On March 30, 2026, Governor Newsom signed Executive Order N-5-26, one of the most significant AI procurement directives any state has issued to date. The order doesn't just encourage responsible AI use. It hardwires governance requirements directly into California's contracting process.
If you work in state or local government, or if you sell technology to government, this changes the game.
The order gives California's Department of General Services (DGS) and Department of Technology (CDT) 120 days to build new vendor certification requirements into state contracting. That deadline lands in late July 2026.
Under these new certifications, any entity seeking to do business with the state will need to attest to and explain their safeguards across three critical areas: bias, civil rights, and content safety.
Beyond vendor certifications, the order also directs agencies to provide employees with access to vetted GenAI tools (with appropriate privacy and cybersecurity safeguards), update the State Digital Strategy to address AI transparency and accountability, develop a pilot application that gives Californians AI-powered access to government services, and expand AI training for government employees.
Here's what the order doesn't say explicitly but makes unavoidable. To vet AI tools, you first need to know what AI tools your agency is using.
That sounds obvious. But in practice, most agencies don't have a complete picture. AI tools are being adopted by individual departments, embedded in existing software platforms, and deployed through vendor relationships that weren't originally scoped for AI. The result is a blind spot, and this executive order just made that blind spot a compliance risk.
Before an agency can certify vendors, classify risk, or enforce safeguards, it needs a system of record for every AI tool in its environment: who's using it, what data it touches, what risk it carries, and whether it meets the new certification standards.
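As a rough illustration of what such a system of record might capture, here is a minimal sketch of an AI inventory record. The field names and example values are hypothetical, not drawn from the executive order:

```python
from dataclasses import dataclass, field

# Illustrative AI inventory record. Field names are hypothetical,
# not taken from Executive Order N-5-26 or DGS/CDT guidance.
@dataclass
class AIToolRecord:
    name: str
    vendor: str
    owning_department: str
    data_categories: list[str] = field(default_factory=list)  # e.g. ["PII"]
    risk_tier: str = "unclassified"   # e.g. "low" / "moderate" / "high"
    certified: bool = False           # vendor attestation on file?

inventory = [
    AIToolRecord(
        name="DocSummarizer",          # hypothetical tool
        vendor="ExampleVendor",
        owning_department="Example Department",
        data_categories=["PII"],
        risk_tier="high",
    ),
]

# Surface high-risk tools that lack a vendor certification on file.
gaps = [t.name for t in inventory if t.risk_tier == "high" and not t.certified]
print(gaps)  # ["DocSummarizer"]
```

Even a simple structure like this turns the "blind spot" into a queryable list: the certification gaps fall out of a one-line filter rather than an email chain.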
Section 3 of the order directs the Government Operations Agency to recommend reforms to contractor responsibility provisions, including the authority to suspend or debar vendors that have been judicially determined to have violated privacy or civil liberties.
This isn't theoretical. It creates a direct line between a vendor's AI governance posture and their ability to win and keep state contracts. For agencies, that means tracking vendor compliance isn't optional. It's a procurement management function now.
One of the more notable provisions is Section 2, which directs the state CISO to review federal supply chain risk designations and, if deemed improper, issue guidance so agencies can continue procuring from those companies.
Read that again. California is explicitly building a mechanism to override federal procurement restrictions it considers inappropriate. This signals that state-level AI governance will not simply mirror federal policy. States are going to make their own calls, and they're going to need their own governance infrastructure to do it.
For agencies still figuring out their AI governance approach, this executive order is a forcing function. The 120-day clock means California will have new certification requirements in place by late summer. Agencies that wait until the rules are finalized to start building their governance infrastructure will be scrambling.
The agencies that move now, the ones that stand up AI inventories, establish risk classification frameworks, and build vendor compliance tracking into their procurement workflows, will be ahead of the curve when the new requirements take effect.
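A risk classification framework doesn't have to be elaborate to be useful. The sketch below shows one hypothetical tiering rule; the criteria and category names are illustrative assumptions, not standards prescribed by the order:

```python
# Hypothetical risk-tiering rule for an agency AI inventory.
# Categories and thresholds are illustrative, not from the order.
SENSITIVE = {"PII", "health", "civil-rights-impacting"}

def classify_risk(data_categories: set[str], public_facing: bool) -> str:
    """Assign a coarse risk tier based on data sensitivity and exposure."""
    if data_categories & SENSITIVE:
        return "high"       # touches sensitive data: strictest review
    if public_facing:
        return "moderate"   # public-facing but no sensitive data
    return "low"            # internal tooling on non-sensitive data

print(classify_risk({"PII"}, public_facing=False))  # high
print(classify_risk(set(), public_facing=True))     # moderate
```

The point is less the specific rule than having one written down: a consistent tiering function is what lets procurement workflows route high-risk tools to deeper vendor review automatically.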
California is the largest state procurement market in the country. When California builds AI governance into its contracting process, it sets a standard that other states watch closely. We've already seen similar momentum in Colorado, Vermont, and other states moving toward formal AI governance frameworks.
The direction is clear. AI governance is becoming a procurement requirement, not an aspiration. The question isn't whether your state will follow. It's when.
Need help setting up a visible, defensible AI policy? Reach out to the Darwin team. We're here to help you navigate the policy side and the governance side without the guesswork.