Industrial AI Needs a Backbone—And That Backbone Is DataOps

At Hannover Messe 2025, John Harrington of HighByte offered a timely reminder: as AI races ahead in the industrial sector, the infrastructure supporting it—specifically, DataOps—needs to keep up.

While AI dominates the headlines, its power is fundamentally limited without access to the right data. Harrington put it bluntly: “Data is the oxygen for AI.” But not just any data—clean, consistent, and contextualized industrial data is what enables AI to deliver real, scalable value in manufacturing environments.

AI and DataOps: Accelerating Each Other

One of the most promising shifts today is the convergence of DataOps and AI. DataOps solutions don’t just enable analytics—they accelerate the development and deployment of AI models by reducing the time and effort needed to transform, contextualize, and deliver usable data across systems.

Rather than struggling to configure fragmented data pipelines, manufacturers that invest in a true DataOps strategy can create agile data environments where AI tools—from predictive maintenance to quality monitoring—have instant access to reliable data.
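The "contextualize and deliver" step described above can be sketched in miniature. The snippet below is an illustrative assumption, not HighByte's implementation: all tag names, the asset model, and the payload shape are hypothetical, but the pattern — merging terse OT readings with a model of the plant before handing them to AI consumers — is the core DataOps idea.

```python
from dataclasses import dataclass

# Hypothetical asset model: maps raw OT tags to plant context.
# Real DataOps platforms maintain this mapping centrally.
ASSET_MODEL = {
    "plc7/temp3": {"site": "Plant A", "line": "Line 2",
                   "machine": "Extruder 12", "unit": "degC"},
}

@dataclass
class ContextualizedReading:
    site: str
    line: str
    machine: str
    unit: str
    value: float
    timestamp: float

def contextualize(tag: str, value: float, timestamp: float) -> ContextualizedReading:
    """Attach plant-model context to a raw, context-free tag reading."""
    ctx = ASSET_MODEL[tag]  # fail loudly on unknown tags
    return ContextualizedReading(value=value, timestamp=timestamp, **ctx)

reading = contextualize("plc7/temp3", 78.4, 1718000000.0)
print(reading.machine, reading.value, reading.unit)
```

A predictive-maintenance model consuming `reading` now knows which machine, line, and site the value belongs to, and in what unit — exactly the context raw SCADA or historian streams typically lack.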

But John Harrington warns that most industrial firms still lack a cohesive data strategy. Legacy mindsets remain application-centric (focused on SCADA, historians, or individual enterprise apps), making it difficult to scale data usage across the business. A modern data strategy treats data as a shared asset, flowing freely and securely across both OT and IT ecosystems.

The Hidden Risks of “Bad” Industrial Data

Data quality is often misunderstood. In industrial settings, bad data isn’t just inaccurate—it’s:

  • Inconsistent (e.g., time series with unpredictable gaps),
  • Uncontextualized (e.g., sensor readings without location or machine relationships),
  • Unusable (e.g., raw data streams that can’t be interpreted downstream).

These issues aren’t just annoying—they break AI systems, which rely on predictable, structured data flows. Imagine a factory operator whose view of the plant suddenly goes dark for five minutes. AI responds the same way: without continuity, it stalls or fails.

Platforms that can identify, flag, and help resolve these data issues in real time are not a nice-to-have—they’re foundational.
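As one concrete illustration of such a check, here is a minimal sketch of gap detection in a sampled signal — the first category of "bad data" above. The threshold, timestamps, and function name are illustrative assumptions, not any specific product's logic.

```python
from datetime import datetime, timedelta

def find_gaps(timestamps: list[datetime],
              max_gap: timedelta) -> list[tuple[datetime, datetime]]:
    """Return (start, end) pairs where the interval between
    consecutive samples exceeds max_gap."""
    gaps = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > max_gap:
            gaps.append((prev, curr))
    return gaps

# Illustrative 10-second feed with a five-minute dropout in the middle.
t0 = datetime(2025, 4, 1, 8, 0)
samples = [t0,
           t0 + timedelta(seconds=10),
           t0 + timedelta(seconds=20),
           t0 + timedelta(minutes=5, seconds=20),  # five-minute dropout
           t0 + timedelta(minutes=5, seconds=30)]

gaps = find_gaps(samples, max_gap=timedelta(seconds=30))
print(gaps)  # the dropout between 08:00:20 and 08:05:20 is flagged
```

In a real deployment, a flagged gap would trigger an alert or mark the affected window as untrustworthy before a downstream model consumes it.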

From Operators to Orchestrators

As automation expands, the role of the human operator is shifting. Just as physical labor evolved into control-room supervision, today’s operators are becoming orchestrators—responsible for monitoring, managing, and reacting to systems that are increasingly autonomous.

That shift requires a new kind of interface. John Harrington points to a future where operators interact with systems using natural language—simply asking machines what’s happening or issuing verbal commands. While we’re not quite at the “Jarvis from Iron Man” stage, we’re moving closer each year.

Behind these interfaces, agent-based architectures will run thousands of micro-AI systems, each assigned to specific tasks, coordinating with one another and surfacing exactly the insight needed, exactly when it’s needed.

Generative + Agentic AI: A Scalable Future

Where traditional AI models operate in isolation, the next phase is about agentic AI—systems of small, purpose-built agents that handle discrete tasks and collaborate across a shared environment. This modular approach means manufacturers can scale faster, break down problems, and automate more intelligently than ever before.

Generative AI further enhances this landscape by adding context and communication, converting complex results into actionable feedback, even if the end-user is not a data expert.

Scale Is Now a Strategy

In the end, what John Harrington makes clear is that data strategy is not merely an IT concern—it’s a business imperative. Organizations that fail to modernize their DataOps foundations will struggle to keep pace with the rapid evolution of industrial AI.

Those that succeed will be the ones who understand that the future isn’t just about machines that learn—it’s about architectures that adapt, scale, and self-heal.

The future is being built in real time. The question is: will your data be ready?

Sponsored by HighByte

About the author

This article was written by Greg Orloff, Industry Executive, IIoT World. Greg previously served as the CEO of Tangent Company, inventor of the Watercycle™, the only commercial residential direct potable reuse system in the country.
