Incident Report · January 30, 2026

If CISA Can Put FOUO in ChatGPT, Your Exception Process Is the Breach

If the nation's top cybersecurity agency can send sensitive-but-unclassified documents to a public chatbot, your exception process is already the breach. Reports indicate the acting CISA director uploaded FOUO contracting files into public ChatGPT — a leadership-level governance failure, not a one-off user mistake.

Incident Summary

Multiple reports say at least four "For Official Use Only" contracting documents were pasted into public ChatGPT between mid-July and early August 2025. The uploads triggered automated security alerts and a DHS damage assessment. CISA says the authorized use was "short-term and limited" and that the last use was in mid-July 2025. The core failure isn't the model — it's the exception path that let sensitive material leave the perimeter.

[Image: government AI governance gap]

This isn't a story about a rogue employee. It's about exception drift: the moment senior users get carve-outs, policy stops being policy. Reports describe a straightforward sequence: the acting head of the Cybersecurity and Infrastructure Security Agency (CISA) requested an exception to use ChatGPT, then uploaded sensitive-but-unclassified contracting documents marked For Official Use Only (FOUO). Security sensors detected the uploads and DHS initiated a review to assess potential exposure.

What Happened (As Reported)

The documents were not classified, but they were explicitly marked as nonpublic government material. They were entered into a public AI service that, unlike DHS-approved internal tools, is not confined to federal networks. Reports place the uploads between mid-July and early August 2025, while CISA says the authorized use was short-term and limited and last used in mid-July 2025.

Timeline (Reported)

  • Mid-July 2025: Temporary exception granted; CISA says the last authorized use was mid-July.
  • Mid-July–early Aug 2025: FOUO contracting documents uploaded (reported timeline).
  • Early Aug 2025: Automated alerts fire; DHS initiates a damage assessment.
  • Jan 2026: Public reporting and official response confirming limited use.

Why FOUO Still Matters

That sequence matters because FOUO is a boundary, not a suggestion. FOUO is not classified, but it is explicitly designated to stay inside government systems. The risk isn't espionage alone — it's loss of control. Once sensitive documents are placed in a public AI system, retention, access, and downstream reuse depend on the provider's policies and account settings, not your agency's.

The Exception Pathway Is the Breach

Once you accept the data classification, the governance failure becomes obvious. This incident underscores a failure pattern we see across enterprises: AI usage is blocked by default, then re-enabled through exceptions for senior users. The result is a governance gap where the highest-risk users get the least consistent controls. If security tools can't enforce policy for leadership, they won't hold for everyone else.

Control Failure Pattern

  • Exceptions bypass default controls.
  • Public AI tools lack containment and retention guarantees.
  • Detection happens after data leaves the perimeter.

What Security Teams Should Do Now

  • Eliminate exception sprawl: If access is needed, route it through a governed internal AI platform.
  • Enforce at runtime: Prevent sensitive document flows to unapproved tools, not just in policy docs.
  • Audit leadership usage: High-privilege users need the tightest controls and most transparent logs.
  • Define FOUO equivalents: Treat "sensitive but unclassified" labels as hard policy boundaries.
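The runtime-enforcement and audit points above can be sketched as a minimal policy check. This is an illustrative sketch, not any product's API: the marking list, the `internal-ai.agency.gov` destination, and the in-memory audit log are all hypothetical stand-ins.

```python
import re
from datetime import datetime, timezone

# Markings treated as hard policy boundaries (illustrative, not exhaustive).
SENSITIVE_MARKINGS = re.compile(
    r"\b(FOUO|FOR OFFICIAL USE ONLY|CUI|SBU)\b", re.IGNORECASE
)

# Only governed internal AI endpoints are approved for marked material
# (hypothetical hostname).
APPROVED_DESTINATIONS = {"internal-ai.agency.gov"}

AUDIT_LOG = []  # in practice: an append-only, tamper-evident store


def check_upload(text: str, destination: str, user: str) -> bool:
    """Return True if the upload may proceed; block marked content
    headed anywhere outside the approved set, and log every decision."""
    marked = bool(SENSITIVE_MARKINGS.search(text))
    allowed = (not marked) or (destination in APPROVED_DESTINATIONS)
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,  # leadership accounts get logged like everyone else
        "destination": destination,
        "sensitive_marking": marked,
        "decision": "allow" if allowed else "block",
    })
    return allowed


# A FOUO-marked document headed to a public tool is blocked before it leaves.
print(check_upload("FOR OFFICIAL USE ONLY\nContract pricing ...",
                   "chat.openai.com", "director"))  # False
```

The key design point is that the check runs at upload time, for every user, rather than living only in a policy document, so an exception shows up as an explicit `allow` entry in the audit trail instead of an invisible carve-out.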

How AARSM Helps

AARSM applies the same guardrails to leadership that everyone else gets: runtime blocks on sensitive data leaving approved tools, plus audit trails for every exception path.


About This Analysis

This analysis is based on public reporting about the reported ChatGPT uploads, CISA's statement that the use was short-term and limited under DHS controls, and coverage of the resulting alerts and internal review.
