COMMENTARY 14 APRIL 2026

After Harcros, How Closed Is Your AI?

Andrew Wordsworth

The American Midwest is known more for its wheat and straight talk than for the finer points of digital philosophy. But a recent order from a Kansas court in Jefferies v. Harcros Chemicals has provided something far more useful than the usual transatlantic hype: a bit of adult supervision. While London and New York are still debating whether an LLM is an “agent” or a “tool,” Magistrate Judge Angel D. Mitchell has focused instead on what happens to the files once they disappear into the machine.

For those of us in corporate intelligence, the ruling is a slightly unpleasant one. It moves the goalposts from whether a tool is “useful” to whether it is “controlled.” In the Harcros litigation, a messy affair involving chemical emissions, the court extended AI-related restrictions to all discovery material, not just the bits marked “Confidential.” The logic is simple: once you feed data into a public model, you have effectively broadcast it. To maintain the integrity of the process, one must prove the environment is closed, the training is restricted, and the deletion is verifiable.

This is where the game turns cynical.

Behind the judicial concern for privacy lies a brewing arms race. Big Law firms are currently positioning their “closed,” “secure,” and “proprietary” AI models as the only legitimate way to handle data. It is a classic “moat” strategy. By convincing the courts that only these bespoke, high-tariff systems are “safe,” they effectively disenfranchise the smaller firms and the independent investigators who rely on more standard, cost-effective tools.

If a court accepts that “security” is defined by the price tag of the software, the plaintiff’s bar – and by extension the independent investigator – finds itself in a bind. It is a strategic use of “compliance” to price the opposition out of the market.

In the UK, where we already navigate the thickets of GDPR and professional indemnity, this Kansas ruling signals a shift in the “burden of proof.” It will no longer be enough to take a vendor at their word when they say they “don’t train on your data.” One expects the next round of disputes to involve “audit logs” and “jurisdictional segregation” – technical jargon for “how much can you afford to spend on your firewall?”
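To make the jargon concrete: the sort of “audit log” the courts may soon demand need not be exotic. The sketch below is a purely illustrative Python example, not any vendor’s actual product; the function names, event types, and fields are all hypothetical. It shows a tamper-evident log in which each entry folds in the hash of its predecessor, so a record that is retroactively altered or quietly removed breaks the chain.

```python
import hashlib
import json
import time

# Illustrative tamper-evident audit log. Each entry carries the hash of the
# previous entry; recomputing the chain reveals any alteration or deletion.
# All names and fields here are hypothetical, for the sake of the example.

GENESIS = "0" * 64  # placeholder hash for the first entry in the chain

def append_entry(log, actor, action, document_id):
    """Append an event (e.g. 'ingest', 'query', 'delete') to the log."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    entry = {
        "timestamp": time.time(),
        "actor": actor,
        "action": action,
        "document_id": document_id,
        "prev_hash": prev_hash,
    }
    # Hash the entry's canonical JSON form, which includes its predecessor's hash.
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log):
    """Recompute every hash; returns False if any entry was altered or removed."""
    prev_hash = GENESIS
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev_hash"] != prev_hash:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, "reviewer@firm.example", "ingest", "DOC-0041")
append_entry(log, "reviewer@firm.example", "delete", "DOC-0041")
print(verify_chain(log))  # True until any entry is tampered with
```

Note that verification requires nothing more than recomputation; no trust in the log’s custodian is needed. The mechanism itself is cheap. What carries the price tag, one suspects, is the certification and the contractual wrapping around it.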

Strip away the talk of neural networks and we are back to a very old-fashioned power play. The big firms want to define “safety” in a way that only they can satisfy. As the Kansas court rightly noted, the question isn’t about the “spirit” of the AI; it is about who has the documents and who gets them next. For the investigator on the ground, the lesson is clear: the machine is only as good as the contract that governs it, and in this new era, “privacy” is becoming a luxury item, used to keep the hoi polloi away from the best tools.

The following Raedas team members have prepared this update: Andrew Wordsworth, Emad Rajeh

Andrew Wordsworth – Partner, Dubai (awordsworth@raedas.com)
Emad Rajeh – Senior Associate, Washington, D.C. (erajeh@raedas.com)