GSA's Proposed AI Contract Clause: What Every Contractor Selling AI to the Government Needs to Know

If your company sells AI products or services through a GSA Schedule, or if you use commercial AI tools to perform government contracts, pay attention. On March 6, 2026, GSA published a proposed contract clause that would fundamentally change the rules for AI in federal contracting. Comments are due March 20, 2026. That's tomorrow.

The clause is GSAR 552.239-7001, "Basic Safeguarding of Artificial Intelligence Systems." GSA intends to include it in MAS Solicitation 47QSMD20R0001, Refresh 31, which is expected to drop in March or April 2026. Once that mass modification goes out, MAS Schedule holders will have 60 days to accept the new terms.

This is not a minor tweak to existing requirements. The clause rewrites the rules on data ownership, vendor liability, permissible AI sources, and government audit rights. And it overrides any commercial licensing terms you've already agreed to.

The clause covers AI systems end to end: inputs, outputs, training data, and the commercial vendors behind the models.

The short version

Here is what the clause does at a high level:

  • Requires "American AI Systems" only. AI components from non-U.S. entities are prohibited.
  • Gives the government broad ownership of all data inputs, data outputs, and custom AI work product created under the contract.
  • Bars contractors from using government data to train models, improve products for other customers, or inform business decisions.
  • Makes prime contractors responsible for their commercial AI vendors' compliance with these rules.
  • Requires "Unbiased AI Principles" adherence, with government audit rights and the ability to suspend noncompliant systems.
  • Imposes 72-hour incident reporting to CISA and the contracting officer.
  • Overrides existing commercial terms of service and licensing agreements.
  • Requires data portability and interoperability so the government can switch vendors without losing data.

If you are selling AI under a Schedule contract, or reselling commercial AI tools as part of a government solution, every one of those items affects your business model, your vendor agreements, and your legal exposure.

Who does this apply to?

The clause applies to all solicitations and contracts "for AI capabilities" on GSA Schedules. What counts as "AI capabilities" is not further defined in the clause itself, which is one of the things contractors should flag in comments.

The practical reach is wide. If you are:

  • Selling an AI-powered SaaS tool to a federal agency through your MAS Schedule,
  • Integrating a commercial LLM into a solution delivered under a Schedule task order,
  • Reselling or licensing an AI product from a third-party vendor to the government,

...you are in scope. And so is your vendor, because you are now on the hook for their compliance.

American AI Systems: what the clause requires (and what it leaves unclear)

The clause requires contractors and their service providers to "only use American AI Systems," defined as AI systems "developed and produced in the United States," citing OMB memo M-25-22. It further states that "use of foreign AI systems in the performance of this contract, including any AI components manufactured, developed, or controlled by non-U.S. entities, is prohibited."

The problem is that the clause does not define "manufactured," "developed," or "controlled." It also does not specify which test applies. Crowell & Moring flagged this in their analysis: it is unclear whether GSA will use the Trade Agreements Act substantial transformation standard, something stricter, or something else entirely. Many commercial AI models are built with global teams, open-source components from multiple countries, and infrastructure that spans borders.

If you are not sure your AI stack clears this bar, that uncertainty is exactly what the comment period is for.

Government data rights: the government owns a lot more than you might think

This is where the clause gets teeth. Under the proposed terms, the government would own:

  • All "Government Data," which covers every data input (prompts, queries, source documents, knowledge bases) and every data output (responses, logs, synthetic data, metadata, annotations) generated during contract performance.
  • All "Custom Developments," meaning any modifications, configurations, fine-tuning, or enhancements to an AI system made for the government. IP rights in any government data derivatives are assigned to the government at creation.
  • All feedback provided by government users to the AI system.

Contractors keep ownership of their underlying AI system and base models. But the government gets an irrevocable, royalty-free license to use those systems for any lawful government purpose during the contract.

And there is a hard restriction: the AI system cannot refuse to produce outputs or analyses based on the contractor's or vendor's "discretionary policies." If your commercial AI vendor has content policies that limit what the model will generate, that policy cannot be applied to government users in a way that withholds analysis. The government is paying for the output, and the clause says it gets the output.

You cannot use government data to improve your product. Period.

The clause explicitly prohibits using government data to:

  • Train, fine-tune, or otherwise improve any LLM or other AI model, including third-party models.
  • Develop or improve AI systems for other customers.
  • Inform advertising, marketing, sales, or other business decisions.

This is the OMB M-25-22 prohibition baked directly into the contract. It is not new as a concept, but having it in an enforceable contract clause with defined consequences is a different story than a policy memo.

For commercial AI vendors whose business model involves using interaction data to improve their models, this creates a direct conflict with standard terms of service. That is why the clause also states it overrides existing commercial licensing terms, including terms of sale and service agreements the government had previously agreed to.

Prime contractors will need to renegotiate agreements with commercial AI vendors that currently claim broad rights over interaction data.

You are responsible for your vendors

Here is the part that should get every prime contractor's attention. The clause makes the prime contractor responsible for compliance by "Service Providers," defined as any entity that "directly or indirectly provides, operates, or licenses an AI system," including subcontractors and commercial vendors.

The clause is not a mandatory flowdown, meaning GSA is not requiring you to formally flow the exact clause language into your vendor agreements. But if your vendor violates these requirements and the government comes looking, you are the one on the hook. That means in practice, you will need to get compliance commitments from your AI vendors or risk the consequences yourself.

If you resell or incorporate a commercial AI product such as a large language model from a major platform provider, your current agreement with that vendor almost certainly does not include these commitments. You will need to renegotiate, or stop using that vendor for government work.

"Unbiased AI Principles" and government audit rights

The clause includes a section on "Unbiased AI Principles" tied directly to President Trump's July 2025 executive order directing the government not to procure AI models with "ideological biases or social agendas."

Under the clause, the AI system must:

  • Be truthful when responding to factual queries, prioritize scientific accuracy, and acknowledge uncertainty.
  • Be neutral and nonpartisan. The clause specifically bars intentionally encoding "partisan or ideological judgments such as Diversity, Equity, Inclusion" into system outputs.
  • Implement continuous improvement processes to detect and reduce bias, including regular evaluation against verified factual sources.
  • Implement OMB directives on AI systems, to the maximum extent possible, when the government requests it during contract performance.

The government's enforcement tools here are significant. GSA can conduct automated assessments of your AI system at any time using its own benchmarks, assessing bias, truthfulness, safety, and "unsolicited ideological content." If it finds noncompliance, it can suspend use of the AI system until the "performance issues" are resolved. If the contract is terminated for cause for failure to meet the Unbiased AI Principles, the contractor is liable for "reasonable decommissioning costs." Neither "performance issues" nor "decommissioning costs" is defined in the clause.

The ambiguity here is real. What counts as ideological content? What benchmarks will the government use? Contractors should flag these questions in comments.

Incident reporting: 72 hours to CISA and your contracting officer

The clause borrows heavily from DFARS 252.204-7012 and FedRAMP incident reporting. Upon discovering any confirmed or suspected incident, contractors must:

  • Report to CISA via the incident reporting form and notify the contracting officer and government points of contact within 72 hours of discovery.
  • Submit daily status updates until the incident is resolved.
  • Preserve logs, forensic images, and incident artifacts for at least 90 calendar days.

One carve-out: if FedRAMP incident reporting procedures apply and conflict with the clause requirements, the FedRAMP procedures control. FedRAMP actually requires faster reporting for cloud service providers (1 hour for suspected incidents), so understanding which regime governs your specific situation matters.
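The timing rules above are simple date arithmetic, but they are the kind of thing incident-response playbooks get wrong under pressure. Here is a minimal sketch (function and field names are my own, not from the clause) that turns a discovery timestamp into the two hard deadlines the proposed clause would impose:

```python
# Illustrative sketch only: the clause's 72-hour reporting and 90-day
# retention windows expressed as date arithmetic. Names are assumptions.
from datetime import datetime, timedelta, timezone

REPORT_WINDOW = timedelta(hours=72)    # report to CISA + contracting officer
RETENTION_WINDOW = timedelta(days=90)  # preserve logs, images, artifacts

def reporting_deadlines(discovered_at: datetime) -> dict:
    """Given the discovery time of a confirmed or suspected incident,
    return the key deadlines the proposed clause would impose."""
    return {
        "report_due": discovered_at + REPORT_WINDOW,
        "retain_artifacts_until": discovered_at + RETENTION_WINDOW,
    }

discovered = datetime(2026, 5, 1, 9, 30, tzinfo=timezone.utc)
deadlines = reporting_deadlines(discovered)
print(deadlines["report_due"])  # 2026-05-04 09:30:00+00:00
```

Note that if FedRAMP's 1-hour suspected-incident rule governs your contract, the FedRAMP deadline, not the 72-hour one, is the binding constraint.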

Data portability, interoperability, and change management

The clause includes several provisions designed to prevent vendor lock-in:

  • Contractors must use standard, open data formats and APIs for all outputs. No proprietary formats that require additional licensing.
  • Contractors must provide tools to export all government data in open, machine-readable formats (such as JSON or XML) that allow complete ingestion into another system.
  • When replacing or discontinuing a model used under the contract, contractors must provide the government concurrent access to the successor model for 30 calendar days (major versions) or 15 calendar days (minor versions) before switching.
  • Any change that materially increases output bias or decreases safety guardrails must be disclosed to the government within 7 calendar days of identifying the change.
  • Contractors must provide 30 days' notice before any material change to privacy protections, or before adding or materially changing a service provider used in performance.
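To make the portability requirement concrete: "open, machine-readable format" in practice means something another system can ingest without your tooling. A minimal sketch, assuming a simple interaction-record schema of my own invention (the clause does not prescribe one):

```python
# Hedged sketch of an open-format export. The record schema and
# "format_version" field are illustrative assumptions, not from the clause.
import json

def export_government_data(records: list[dict]) -> str:
    """Serialize interaction records to JSON so a successor vendor's
    system can ingest them without proprietary tooling or licensing."""
    return json.dumps({"format_version": "1.0", "records": records}, indent=2)

sample = [
    {
        "input": "Summarize the attached report",
        "output": "The report covers...",
        "timestamp": "2026-04-01T12:00:00Z",
    }
]
print(export_government_data(sample))
```

The design point is that everything, inputs, outputs, and metadata alike, travels in one self-describing file; any format that needs an additional license to read would fail the clause as written.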


The comment period closes tomorrow

GSA set a short window: comments on GSAR 552.239-7001 are due March 20, 2026. That is an unusually tight timeline for a clause with this level of operational impact. Crowell & Moring called it "7 days for contractor comments," and the analysis from their team is worth reading before you decide whether to submit feedback.

GSA's comment portal is at buy.gsa.gov. Comments should be specific and grounded in how the proposed requirements interact with your actual systems, vendor agreements, and operational processes. GSA is more likely to address substantive technical objections than general opposition.

Key areas where comments could be productive:

  • Clarification of "American AI Systems" and what test applies to determine U.S. origin.
  • How "AI capabilities" is scoped, and whether it captures incidental AI tool use versus dedicated AI product sales.
  • How the clause interacts with existing commercial terms and FAR Part 12 commercial item treatment.
  • Definitions for "decommissioning costs" and "performance issues" in the Unbiased AI Principles section.
  • Whether the 72-hour incident reporting requirement aligns with or conflicts with FedRAMP obligations for specific contract types.
  • The practical feasibility of obtaining compliance commitments from commercial AI vendors who operate under standard, non-negotiable terms of service.

What happens after comments close

GSA expects to issue MAS Refresh 31 in March or April 2026. Once the mass modification goes out to existing MAS Schedule holders, those contractors will have 60 days to accept the new terms. That is not a lot of time to audit your AI stack, renegotiate vendor agreements, and build out the compliance infrastructure the clause requires.

The clause applies to new solicitations and contracts. It also applies to modifications to existing contracts where it is incorporated. If you have current task orders under your Schedule, the timeline for when those get swept in will depend on the specific modification language.

What to do now

If your company is on a GSA Schedule and uses or sells AI tools in the performance of government work, here is a practical short list:

1) Map your AI stack

Identify every AI system, model, or tool used in government contract performance. Include the underlying models your SaaS vendors use, not just your own code. Note the country of origin, ownership structure, and where the model was developed and trained.
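If it helps to structure that inventory, here is a hedged sketch of what a record per AI system might look like, with fields mirroring the checklist above. All names and the flagging rule are illustrative assumptions, not anything GSA prescribes:

```python
# Illustrative AI-stack inventory record. Field names and the "flag
# anything not clearly U.S.-origin" rule are assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    name: str
    vendor: str
    country_of_origin: str  # "US", a foreign country, or "Unknown"
    ownership: str          # e.g. "U.S. parent", "foreign-owned", "Unknown"
    developed_in: str       # where the model was developed and trained
    used_in_gov_work: bool

stack = [
    AISystemRecord("internal-summarizer", "In-house", "US",
                   "U.S. parent", "US", True),
    AISystemRecord("third-party-llm", "ExampleVendor", "Unknown",
                   "Unknown", "Unknown", True),
]

# Conservatively flag anything used in government work whose origin
# is not clearly U.S. as a potential "American AI Systems" risk.
flagged = [r.name for r in stack
           if r.used_in_gov_work and r.country_of_origin != "US"]
print(flagged)  # ['third-party-llm']
```

Treating "Unknown" as a flag, not a pass, matches the conservative posture recommended below until GSA clarifies the origin test.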

2) Pull your vendor agreements

Find your terms of service and licensing agreements with commercial AI vendors. Look specifically at: who owns interaction data, whether they use your data to improve their models, what content policies restrict outputs, and whether those terms can be modified for government use cases.

3) Assess the "American AI Systems" question

Until GSA clarifies the definition, conservatively flag any AI tool with significant non-U.S. development, ownership, or control as a potential compliance risk. That includes open-source models with international contributor bases and commercial tools owned by foreign entities or with material offshore development.

4) Submit comments if the clause creates genuine operational problems

The comment window is short, but submitting specific, technical comments is how contractors influence final clause language. If you cannot comply with a particular provision as written, say so, and explain why. That record matters if the clause ends up in litigation later.

5) Start conversations with your legal and contracts teams now

Once Refresh 31 drops, the 60-day acceptance clock starts. That is not enough time to do the analysis from scratch. Get ahead of it.
