The AI Scaling Problem Nobody’s Solving: Why 140 Applications Delivered Zero Scale | SPONSORED
Jason Schern, Field CTO at Cognite, reveals the uncomfortable truth about industrial AI: companies are building solutions that can’t scale beyond a single site, and it’s killing their ROI.
The 140-to-Zero Problem
During a recent audit, a large upstream oil company in Asia discovered something shocking: they had developed 140 applications and agents across 11 offshore assets. Every one was critical to operations; not a single application could be turned off without disrupting the business.
Here's the kicker: zero of those 140 solutions had been deployed beyond the original asset they were built for.
“Value scaling is key,” Jason explains. “If I build an agent for one site and I can’t lift and shift it to the next site, things that provide incremental value in one place never become measurable value to the business.”
Why AI Projects Stay in Bubbles
The problem isn’t technical capability—it’s data architecture. Companies are building AI solutions on top of fragmented, context-poor data foundations that make scaling impossible.
Jason puts it bluntly: “If you want to be good at AI, you’ve got to be great at data.”
The difference between companies that achieve measurable results and those stuck in limbo is the ability to scale and rapidly validate value across multiple assets. "You don't want some magical use case that drives outsized advantage in one place, in a bubble," Jason says. "You want things you can scale rapidly across them all."
The Context Industrial Data Doesn't Have
Large language models work because context is built into the content: grammar, word choice, paragraph structure. But with industrial data it’s a different story. “If I’ve got streams and streams of time series data, how much context is there in that? Almost none,” Jason notes.
A midstream operator in the U.S. showed what adding that context unlocks: a 15% reduction in energy consumption, achieved not through better sensors but by combining operational data with financial forecasting, energy cost predictions, and equipment relationships. The operator also developed virtual flow meters that eliminated laboratory testing, using analytics on pressure, temperature, and flow rates combined with contextual business data.
“You need to know: Was there a work order? When was it executed? Where is this equipment in the plant schematic? What’s upstream and downstream?” Jason explains. “All of those things provide context that allows you to reason.”
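To make that idea concrete, here is a minimal, hypothetical sketch of what contextualization can look like in practice. It uses plain pandas with made-up tags, equipment names, and work orders rather than Cognite's actual data model or APIs: raw sensor readings are joined to an asset hierarchy and to maintenance work orders, so a pressure spike becomes "during a valve replacement on a specific compressor, downstream of a specific separator" instead of an anonymous number.

```python
# Hypothetical illustration only -- not Cognite's data model or API.
# Shows how raw time-series readings gain meaning once they are joined
# to an asset hierarchy and to maintenance work orders.
import pandas as pd

# Raw sensor readings: almost no context on their own.
readings = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-05-01 08:00", "2024-05-01 09:00", "2024-05-01 10:00"]),
    "tag": ["PT-101", "PT-101", "PT-101"],
    "value_bar": [42.1, 55.7, 41.9],
})

# Asset hierarchy: which equipment a tag belongs to and what sits upstream of it.
assets = pd.DataFrame({
    "tag": ["PT-101"],
    "equipment": ["Compressor A"],
    "upstream_equipment": ["Separator 3"],
})

# Work orders pulled from the maintenance system.
work_orders = pd.DataFrame({
    "equipment": ["Compressor A"],
    "wo_start": pd.to_datetime(["2024-05-01 08:30"]),
    "wo_end": pd.to_datetime(["2024-05-01 09:30"]),
    "description": ["Valve replacement"],
})

# Contextualize: attach equipment and upstream relationships to each reading,
# then flag readings taken while a work order was active on that equipment.
contextualized = (
    readings.merge(assets, on="tag")
            .merge(work_orders, on="equipment", how="left")
)
contextualized["during_work_order"] = (
    (contextualized["timestamp"] >= contextualized["wo_start"])
    & (contextualized["timestamp"] <= contextualized["wo_end"])
)

# The 09:00 pressure spike now reads as "during a valve replacement on
# Compressor A, downstream of Separator 3" -- something a model can reason about.
print(contextualized[["timestamp", "value_bar", "equipment", "upstream_equipment",
                      "during_work_order", "description"]])
```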
The Cultural Barriers That Cost Millions
Two outdated paradigms are blocking progress:
- “Industrial data belongs on-site and should stay on-site.”
- “Copying data is bad.”
Both stem from an era when storage costs were prohibitive. "Storage costs are pennies on the gigabyte now," Jason says. "What kills you is compute, when you try to find relevant data and connect it."
The solution? Copy the data. Optimize it for different use cases. “There’s almost zero cost in making that copy, and in the copying, you’re optimizing things in a way that dramatically lowers compute costs, which in the age of AI is everything.”
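As a toy illustration of that trade-off (again hypothetical and generic pandas, not any particular platform), the sketch below keeps the raw copy of one-second readings and also materializes a second, use-case-optimized copy of hourly aggregates. Dashboards, forecasts, and agents can then query the small copy repeatedly without re-scanning the raw stream.

```python
# Hypothetical illustration: storage is cheap, compute is what hurts.
# Keep the raw copy, and also materialize a use-case-optimized copy
# (hourly aggregates) so repeated analyses don't re-scan the raw data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Raw copy: one reading per second for a day (86,400 rows per sensor).
raw = pd.DataFrame({
    "timestamp": pd.date_range("2024-05-01", periods=86_400, freq="s"),
    "value": rng.normal(loc=50.0, scale=2.0, size=86_400),
})

# Optimized copy for trending and forecasting use cases: hourly mean, min, max.
hourly = (
    raw.set_index("timestamp")["value"]
       .resample("1h")
       .agg(["mean", "min", "max"])
       .reset_index()
)

# 24 rows instead of 86,400: downstream queries now touch a tiny fraction
# of the original data, at the cost of a trivially cheap extra copy.
print(len(raw), "raw rows ->", len(hourly), "rows in the optimized copy")
print(hourly.head())
```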
The Six-Week-to-Six-Hour Compression
Aker BP’s work with root cause analysis (RCA) illustrates AI’s real potential. Traditionally, RCA activities take months, meaning companies can only investigate a fraction of incidents that warrant analysis.
With AI augmentation, Aker BP compressed one aspect of the RCA process from six weeks to less than six hours.
“The company has not been able to do all the RCAs they should be doing,” Jason points out. “With the same people, I can now attack that backlog of RCAs that I would never get to and get the business benefit of those resolutions.”
The Too-Early-Too-Late Paradox
Jason’s advice for organizations feeling paralyzed by AI’s rapid evolution: “You’re going to feel like you’re too early and too late at the same time. And it’s true. You are both.”
Things are evolving so quickly that it feels like catch-up mode (too late), but also like the technology isn’t quite ready (too early).
“Don’t let that ambiguity slow you down,” Jason warns. “Value now is much more important than potential value in the future. The value you capture at this moment has an outsized impact versus equivalent value achieved a year later.”
The companies winning at industrial AI aren’t finding magical use cases. They’re building data architectures that allow small incremental improvements to scale across all assets rapidly.
As Jason puts it: “If you don’t have a plan to handle the data properly, that whole data ops conversation fueling AI is the most relevant conversation to value scaling.”
The question isn’t whether AI will transform industrial operations. It’s whether your data architecture will let you scale that transformation beyond the first site.
Sponsored by Cognite
About the author
This article was written by Greg Orloff, Industry Executive, IIoT World. Greg previously served as the CEO of Tangent Company, inventor of the Watercycle™, the only commercial residential direct potable reuse system in the country.