Economy March 30, 2026

California order demands AI safeguards for companies seeking state contracts

State requires watermarking of AI-generated media, will reassess federal supply-chain risk labels and set vendor certification recommendations within 120 days

By Caleb Monroe

On March 30, California Governor Gavin Newsom issued an executive order obliging companies that pursue state contracts to adopt measures guarding against AI misuse, including producing illegal content, introducing harmful bias, or violating civil rights. The order mandates watermarking of AI-generated images and video per state guidance, directs state review of federal supply-chain risk designations, and asks two state departments to propose new AI-related vendor certifications within 120 days.

Key Points

  • California requires firms seeking state contracts to implement safeguards against AI misuse, including protections against illegal content, harmful bias and civil rights violations - impacts state procurement and AI vendors.
  • The order mandates watermarking, per state guidance, of images and video that may be AI-generated, in order to limit misinformation - affects state agencies, media verification, and technology providers.
  • If the federal government labels a company a supply-chain risk, California will conduct its own assessment and may still allow the company to contract with the state if the state finds no risk - influences defense-related suppliers and technology contractors.

March 30 - Governor Gavin Newsom signed an executive order requiring firms that want to contract with California to put in place safeguards aimed at preventing misuse of artificial intelligence. The measure targets a range of potential harms tied to AI, explicitly citing the creation of illegal content, the embedding of harmful bias and the risk of civil rights violations.

The order also requires agencies to ensure that images or videos that may have been created with AI are watermarked, following state-issued guidance. The provision is intended to reduce the spread of misinformation by clearly identifying media that may have been generated or altered by AI systems.

On the question of federal supply-chain risk designations, the order directs California to perform its own evaluations. If the federal government designates a company as a supply-chain risk, the state will undertake an independent assessment and may still permit the company to remain a contractor if the state’s review does not substantiate the federal finding.

The directive comes after the Pentagon applied a formal supply-chain risk designation to artificial intelligence lab Anthropic, a move that prevents government contractors from using the firm’s technology in work for the U.S. military. The California order underscores that the state will not automatically adopt federal determinations without its own review.

The order also specifies implementation steps. Within 120 days, California's Department of General Services and Department of Technology must deliver recommendations for new AI-related vendor certifications. Those certifications would create a mechanism for vendors to attest that they follow responsible AI governance practices and maintain public-safety protections.

The action also reflects the state’s intent to chart an independent path on AI oversight. The order notes this stance even as some Republican lawmakers have pushed for state deference to federal authorities on legal and regulatory matters related to AI.

In related activity earlier this year, California Attorney General Rob Bonta told reporters in February that his office is building internal expertise through an "AI oversight, accountability and regulation program." That development is referenced in the order as part of the broader state effort to develop in-house capabilities for AI governance.


Context and next steps

The executive order establishes both immediate compliance expectations for prospective state contractors and a short timeline for the state to propose formal certification standards. Agencies are instructed to follow watermarking guidance for possible AI-created media, and two state departments have 120 days to recommend a vendor certification framework that would allow firms to formally attest to responsible AI practices.

Risks

  • Uncertainty over contractor eligibility when federal supply-chain risk designations differ from California’s own assessments - impacts firms supplying state and federal contracts, particularly in defense-adjacent technology.
  • Short 120-day timeline for the Department of General Services and Department of Technology to recommend AI-related vendor certifications may leave an interim period without standardized attestation processes - affects vendors and procurement teams.
  • Requirement to watermark AI-generated media hinges on state guidance and agency implementation; inconsistent application could limit effectiveness against misinformation - affects state communications and media verification efforts.
