California’s ‘Trusted AI’ Procurement Order: The Contract Clause Will Be the Regulator

Newsom just turned AI governance into an RFP checkbox: attestations, watermarking rules, and a state-level veto on federal ‘supply chain risk’ labels.

California’s new executive order on ‘Trusted AI’ does not try to out-philosophy Washington. It does something more effective: it tells agencies to harden the purchasing process. Certifications, contract standards, watermarking guidance, and a review process for federal supply-chain designations are all mechanisms that turn ‘responsible AI’ from a press release into something procurement can enforce.

Why this matters: procurement is regulation with a purchase order

California is not waiting for a national framework to decide what’s ‘safe.’ It is building a state procurement machine that can say: show us your safeguards, or don’t invoice us.

Per CalMatters, the order also pushes back on the Trump administration’s posture by instructing California to independently assess federal designations of companies as supply-chain risks. This is a direct response to the Anthropic dispute, but the pattern is bigger than one company: California is asserting that ‘trust’ is a state-level purchasing decision, not a federal vibe.

The mechanism layer (aka the part that actually changes behavior)

1) New AI contracting certifications

The order directs the Department of General Services (DGS) and the California Department of Technology (CDT) to recommend new certifications that can be incorporated into state contracting. The point is simple: vendors attest to safeguards around misuse (including CSAM and non-consensual imagery), civil liberties, and bias governance, and the state gets something it can put in a contract file.

That’s not just compliance theatre. It is a procurement template that can diffuse across departments, then across jurisdictions.

2) A state review of federal ‘supply chain risk’ designations

The CDT State CISO is tasked with reviewing new federal supply-chain risk designations. If the CISO finds a designation improper, DGS and CDT jointly issue guidance so agencies can continue procuring from that company. Translation: California is building an appeal lane for vendors, and a political shield for agencies that want to keep buying.

The background here matters. As Just Security explains, the cited DoD authority is a narrow procurement tool for national security systems, not a general ‘sanctions everyone who talks to them’ power. California is, in effect, telling agencies: we will not automatically adopt an overbroad federal threat label as gospel.

3) Watermarking guidance with statutory hooks

Within 120 days, CDT, working with the Government Operations Agency (GovOps), must publish best-practice guidance for watermarking AI-generated or significantly manipulated images and video, aligned to California Business & Professions Code requirements. This is how a ‘transparency principle’ becomes a default workflow.
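Any watermarking guidance eventually has to specify what a disclosure label looks like in practice. As a purely illustrative sketch, and emphatically not the state's guidance or an existing standard like C2PA, here is a toy provenance manifest: a content hash plus an AI-disclosure flag, integrity-protected with an HMAC under a hypothetical agency key. Every field name and the `example-model` identifier are invented for illustration.

```python
import hashlib
import hmac
import json


def make_disclosure_manifest(media_bytes: bytes, signing_key: bytes) -> dict:
    """Build a toy provenance manifest for an AI-generated asset.

    Illustrative only: real guidance would more plausibly point to a
    provenance standard such as C2PA than to an ad hoc format like this.
    """
    body = {
        "content_sha256": hashlib.sha256(media_bytes).hexdigest(),
        "ai_generated": True,           # the disclosure the guidance would require
        "generator": "example-model",   # hypothetical model identifier
    }
    # Integrity tag so a downstream reviewer can detect tampering with the label
    payload = json.dumps(body, sort_keys=True).encode()
    body["hmac_sha256"] = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return body


def verify_manifest(media_bytes: bytes, manifest: dict, signing_key: bytes) -> bool:
    """Check both the integrity tag and that the hash matches the media."""
    claimed = dict(manifest)
    tag = claimed.pop("hmac_sha256")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(tag, expected)
        and claimed["content_sha256"] == hashlib.sha256(media_bytes).hexdigest()
    )
```

The design point, not the code, is what matters: a label that travels with the file and fails verification when either the media or the disclosure is altered is what turns a transparency principle into something a procurement officer can actually check.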

What this is really doing

California is building a state-level ‘trusted access’ pattern, but for procurement. Instead of gating cyber-capable models, it’s gating who gets state contracts. The order’s brilliance (and danger) is that it creates a compliance surface that can scale without passing a new law every time the tech changes.

The Singularity Soup Take

Every AI governance debate eventually ends in the same place: who signs the contract, who audits the vendor, and who eats liability when the demo turns into a scandal. California just skipped the debate and went straight to the paperwork. Resistance is futile, but at least you’ll have a watermark.

What to Watch

  • Whether the new certifications become strict enough to exclude vendors, or stay broad enough to be an attestation sticker pack.
  • Whether other states copy the procurement language (template diffusion is the real policy vector).
  • Whether federal agencies respond by tightening definitions or leaning harder on procurement vehicles to enforce their preferred ‘risk’ designations.