How Agentic AI Changes Factory Data Requirements

Most factory data architectures today serve between 10 and 50 consuming systems. Agentic AI will push that number into the thousands, a 100x increase in edge-based data consumers. The ISA-95 layer-by-layer model was not designed for this volume, and manufacturers need to rethink their data infrastructure before scaling AI agents. IDC reports that 56.6% of industrial organizations are already planning, piloting, or in early stages of AI agent deployment (IDC report US52869825).

The Math Behind the Scaling Problem

During Industry 3.0, between 1 and 5 systems consumed shop floor data: SCADA, MES, and historians. Industry 4.0 and the introduction of cloud computing pushed that to 10 to 50 consuming systems as organizations added data lakes, analytics platforms, ERP integrations, and IoT services.

For a single work cell, a manufacturer might deploy a quality agent, a maintenance agent, a scheduling agent, and a supply chain agent. Each one is goal-oriented and specialized, unlike dashboards that pull pre-defined datasets on a schedule. A maintenance agent analyzing a pump needs pressure data, service history, vendor information, and batch context drawn from several systems: the OPC server, the MES, the CMMS, and possibly the ERP. A quality agent for the same pump needs batch identification, product specifications, regulatory thresholds, and alarm history. A raw PLC tag without any of that context is meaningless to either of them.
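The contextualization step can be pictured as merging a raw tag value with lookups from each business system into one agent-ready payload. The sketch below is purely illustrative: all tag names, system fields, and values are hypothetical stand-ins, and real pipeline reads are stubbed out.

```python
# Hypothetical sketch: combining a raw PLC tag with context from MES,
# CMMS, and ERP into a single payload a maintenance agent can use.
# Every name and value here is illustrative, not a real configuration.

def build_maintenance_context(tag_value: float) -> dict:
    """Merge a raw pressure reading with the context an agent needs."""
    # Raw value from the OPC server: meaningless on its own.
    raw = {
        "tag": "PLC1.Pump07.DischargePressure",
        "value": tag_value,
        "unit": "bar",
    }

    # Context that a real pipeline would pull live from other systems.
    mes_context = {"batch_id": "B-2024-0193", "product": "Buffer Solution A"}
    cmms_context = {"last_service": "2025-11-02", "open_work_orders": 0}
    erp_context = {"vendor": "AcmePumps Inc.", "model": "AP-300"}

    return {**raw, "mes": mes_context, "cmms": cmms_context, "erp": erp_context}

payload = build_maintenance_context(4.7)
```

A quality agent would merge a different slice of context (specifications, alarm history) over the same raw tag, which is why each agent ends up needing its own curated feed.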

Multiply that across every work cell, every line, and every site in an enterprise, and the number of individualized data feeds grows rapidly. The ISA-95 model of moving data layer by layer was not built for this. The alternative is a hub-and-spoke integration model that connects to any system and delivers curated data wherever it is needed.

A New Protocol for a New Type of Consumer

This is where Model Context Protocol (MCP) enters the picture. MCP is an open protocol designed specifically for LLM-based agents, a fundamentally different data consumer than what OPC UA, MQTT, or REST APIs were built to serve. MCP does not replace these existing protocols. OPC UA, SQL, and MQTT continue to do their jobs. MCP aggregates and contextualizes data from these sources and exposes it for agent discovery and use.
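To make the aggregation-and-exposure role concrete, the dependency-free sketch below shows what an MCP-style tool might look like. The descriptor follows MCP's published tool shape (name, description, inputSchema); the JSON-RPC transport and the actual OPC UA, MQTT, and SQL reads are omitted and stubbed, and all identifiers are hypothetical.

```python
# Minimal sketch of an MCP-style tool that exposes aggregated factory
# data for agent discovery. The descriptor mirrors MCP's tool schema;
# the server transport and live protocol reads are intentionally stubbed.

PUMP_STATUS_TOOL = {
    "name": "get_pump_status",
    "description": "Return current pressure plus batch and service context for a pump.",
    "inputSchema": {
        "type": "object",
        "properties": {"pump_id": {"type": "string"}},
        "required": ["pump_id"],
    },
}

def get_pump_status(pump_id: str) -> dict:
    # In a real hub these values would come from OPC UA, MES, and CMMS.
    return {
        "pump_id": pump_id,
        "pressure_bar": 4.7,           # stubbed OPC UA read
        "batch_id": "B-2024-0193",     # stubbed MES lookup
        "last_service": "2025-11-02",  # stubbed CMMS lookup
    }
```

The point of the shape: the agent discovers the tool by its descriptor and receives contextualized data in one call, while the underlying protocols keep doing the raw data movement.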

There is a practical constraint: agents work best with a small, focused set of MCP tools (5 to 10) rather than hundreds. When given too many tool options, agents make poor decisions, leading to hallucinations and errors. This creates a governance requirement. Each agent should be authorized for specific tasks and limited in scope, even as the total number of agents grows exponentially.
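One way to picture that governance layer is a scoped tool registry: the catalog can hold hundreds of tools, but each agent only ever sees its authorized handful. The agent roles and tool names below are illustrative assumptions, not a prescribed design.

```python
# Hedged sketch of per-agent tool scoping: each agent is authorized for
# a small, task-specific subset of the full tool catalog. All role and
# tool names are hypothetical.

TOOL_CATALOG = {
    "get_pump_status", "get_batch_spec", "get_alarm_history",
    "get_work_orders", "get_vendor_info", "get_schedule", "get_inventory",
}

AGENT_SCOPES = {
    "maintenance_agent": {"get_pump_status", "get_work_orders", "get_vendor_info"},
    "quality_agent": {"get_batch_spec", "get_alarm_history"},
}

def tools_for(agent: str) -> set[str]:
    """Return only the tools this agent is authorized to discover."""
    # Unknown agents get nothing; known agents get their scoped subset.
    return TOOL_CATALOG & AGENT_SCOPES.get(agent, set())
```

Keeping each scope under the 5-to-10 tool range preserves agent accuracy while the total agent count scales.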

DataOps as the Foundation for AI Agents

The solution lies in a bidirectional relationship between AI and data operations. DataOps for AI provides curated, contextualized, and governed data through managed pipelines. AI for DataOps applies AI to speed configuration and management of the DataOps solution itself. Each investment in one side accelerates the return on the other.

Three priorities stand out for manufacturers. First, treat data as a first-class citizen with governance independent of consuming applications. Second, adopt a hub-and-spoke architecture that contextualizes data at the edge to reduce latency and lower cloud costs. Third, plan for MCP alongside existing protocols and establish governance before scaling.

Read the full article written by John Harrington, the Chief Product Officer of HighByte.

See HighByte Intelligence Hub at Hannover Messe 2026

HighByte will exhibit as a Gold Sponsor in the AWS booth at Hannover Messe 2026 (April 20 to 24, Hannover, Germany), Hall 15, Stand D76. Stop by for live demonstrations of the HighByte Intelligence Hub and learn how industrial companies are using Industrial DataOps to bridge OT and IT systems and curate data for industrial AI, including MCP services.

Theater presentations include John Harrington on “How Alcon Deployed Industrial AI with HighByte Powered by AWS” (Tuesday, April 21, 10:00 am) and Aron Semle, CTO, on “Enabling Industrial AI with Validated Data” (Wednesday, April 22, 4:00 pm). John Harrington also will present at the Microsoft Theater on Thursday, April 23 at 3:00 pm.

Get a complimentary Hannover Messe ticket courtesy of HighByte or book a meeting with the HighByte leadership team during the event.

Sponsored by HighByte.


Frequently Asked Questions

Why can’t existing factory data architectures support AI agents?

ISA-95 architectures were designed for 1 to 50 data-consuming systems. Agentic AI requires thousands of specialized agents at the edge, each needing unique, contextualized data feeds from multiple source systems simultaneously. The layer-by-layer data movement model was never built for this volume.

What is Model Context Protocol (MCP) and how does it work with OPC UA?

MCP is an open protocol designed for LLM-based AI agents. It does not replace OPC UA, MQTT, or REST APIs. Instead, it aggregates and contextualizes data from these existing protocols and exposes it for agent discovery and consumption.

How many MCP tools should an AI agent have access to?

AI agents perform best with 5 to 10 MCP tools. When given access to hundreds of options, agents make poor decisions about which tool to use, resulting in hallucinations and errors. Organizations need a governance layer that limits each agent’s scope while scaling total agent count.