Incident Report · January 30, 2026 · 8 min read

If CISA Can Put FOUO in ChatGPT, Your Exception Process Is the Breach

The story is not a rogue user. It is the exception path. Reports say FOUO contracting files were pasted into public ChatGPT under a temporary exemption, triggering alerts and a DHS review.

Photo: hands sorting office paperwork beside a laptop. Kindel Media on Pexels (Pexels License).

Executive summary

Multiple reports say at least four "For Official Use Only" contracting documents were pasted into public ChatGPT between mid-July and early August 2025. The uploads triggered automated security alerts and an internal DHS assessment. CISA says the authorized use was "short-term and limited" and that the last use was in mid-July 2025. The core failure isn't the model — it's the exception path that let sensitive material leave the perimeter.

This isn't a story about a rogue employee. It's about exception drift: the moment senior users get carve-outs, policy stops being policy. Reports describe a straightforward sequence: the acting head of the Cybersecurity and Infrastructure Security Agency (CISA) requested an exception to use ChatGPT, then uploaded sensitive-but-unclassified contracting documents marked For Official Use Only (FOUO). Security sensors detected the activity and DHS initiated an internal review of potential exposure in a public AI system.

How FOUO Contracting Files Reached Public ChatGPT

The documents were not classified, but they were explicitly marked as nonpublic government material. They were entered into a public AI service that, unlike DHS-approved internal tools, is not confined to federal networks. CSO, summarizing Politico's reporting, places the uploads between mid-July and early August 2025. CISA says the authorized use had DHS safeguards, was short-term and limited, and that default policy blocks ChatGPT unless an exception is granted.

Reported Sequence: Exception, Uploads, Review

  • After joining CISA: Gottumukkala requested special permission to use ChatGPT; default policy blocked access without an exception.
  • Mid-July to early August 2025: Politico-derived reporting says at least four FOUO contracting documents were uploaded.
  • Early August 2025: Cybersecurity sensors detected the activity and generated alerts.
  • January 2026: CISA says the approved use had DHS safeguards, was short-term and limited, and last occurred in mid-July 2025.

What the CISA Exception Signals for Organizations

The lesson is not specific to government labels. Any organization that creates exceptions for senior users is creating a trust boundary problem. Staff treat the chat session like a temporary workspace, but once sensitive material is pasted into a public AI system the organization no longer controls retention, downstream sharing, or later discoverability in the same way.

Why FOUO Is a Hard Boundary, Not a Courtesy Label

That sequence matters because FOUO is a boundary, not a suggestion. FOUO is not classified, but it is explicitly designated to stay inside government systems. The risk isn't espionage alone — it's loss of control. Once sensitive documents are placed in a public AI system, retention, access, and downstream reuse depend on the provider's policies and account settings, not your agency's.

Why the ChatGPT Exception Became the Failure Point

Once you accept the data classification, the governance failure becomes obvious. This incident underscores a failure pattern we see across enterprises: AI usage is blocked by default, then re-enabled through exceptions for senior users. The result is a governance gap where the highest-risk users get the least consistent controls. If security tools can't enforce policy for leadership, they won't hold for everyone else.

Control Failure Pattern at the Boundary

  • Exceptions bypass default controls.
  • Public AI tools lack containment and retention guarantees.
  • Detection happens after data leaves the perimeter.

What to Lock Down After a FOUO Exception

  • Eliminate exception sprawl: If access is needed, route it through a governed internal AI platform.
  • Enforce at runtime: Prevent sensitive document flows to unapproved tools, not just in policy docs.
  • Audit leadership usage: High-privilege users need the tightest controls and most transparent logs.
  • Define FOUO equivalents: Treat “sensitive but unclassified” labels as hard policy boundaries.
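The runtime-enforcement and audit points above can be sketched as a simple egress check. This is a minimal illustration, not any agency's or vendor's actual control: the marking list, tool names, and `check_outbound_prompt` function are all hypothetical, and a real deployment would sit in a proxy or DLP layer and use the organization's own label taxonomy.

```python
import re
from dataclasses import dataclass

# Hypothetical marking patterns; a real deployment would map these to the
# organization's own "sensitive but unclassified" label taxonomy.
SENSITIVITY_MARKINGS = [
    r"\bFOUO\b",
    r"\bFOR OFFICIAL USE ONLY\b",
    r"\bCUI\b",
    r"\bCONTROLLED UNCLASSIFIED INFORMATION\b",
]
MARKING_RE = re.compile("|".join(SENSITIVITY_MARKINGS), re.IGNORECASE)


@dataclass
class EgressDecision:
    allowed: bool
    reason: str


def check_outbound_prompt(user: str, tool: str, text: str,
                          approved_tools: set[str]) -> EgressDecision:
    """Block marked material and unapproved tools; log every block for audit."""
    # Exceptions don't bypass this path: leadership traffic goes through
    # the same check as everyone else's.
    if tool not in approved_tools:
        print(f"BLOCKED: {user} -> {tool}: tool not approved")
        return EgressDecision(False, f"{tool} is not an approved AI tool")
    match = MARKING_RE.search(text)
    if match:
        # Audit trail: record who tried to send what, and which marking fired.
        print(f"BLOCKED: {user} -> {tool}: marking {match.group(0)!r}")
        return EgressDecision(False, f"marking {match.group(0)!r} detected")
    return EgressDecision(True, "no markings detected")
```

The point of the sketch is ordering: the check runs before the prompt leaves the perimeter, so detection is preventive rather than the after-the-fact alerting described in this incident.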

How 3LS Would Govern This Exception Path

3LS applies the same guardrails to leadership that everyone else gets: runtime blocks on sensitive data leaving approved tools, audit trails for every exception path, and visibility into where policy is being overridden before the exception becomes the breach.
