Eminence at the Edge


Since Internet of Things technology started to gain mainstream traction, multiple platforms, solutions and strategies have been developed. More than 450 ‘platforms’ are currently commercially available. Realistically, though, most of these were designed for a very specific function, built on outdated technology, and developed down a narrow vertical application path. Similarly, gateway players have developed powerful gateway technology that generically aggregates data to the cloud.

Why? Well, historically, technology companies venturing into IoT argued that the best way to rapidly create commercial value was to develop strong vertically integrated applications encompassing an ecosystem of partners. The quickest way to show value was to focus on a vertical and go after it. We have a different view. The true power and differentiator of IoT implementations resides in full IoT stack capability encompassing the edge and the cloud.


In 2002, it was all about the cloud. Amazon Web Services was launched and, when OPC Unified Architecture was released in 2006, enabling secure communication between devices, data sources and applications, adoption of IoT began to rise. Early adopters developed their projects with the cloud in mind. The thinking was simple: billions of sensors would be deployed, and supercomputers could be spun up cheaply in the cloud to process all of this valuable Big Data. How could they go wrong?

During the dot-com era, people ran around with amazing ideas and vertically developed, product-centric applications that they thought would take over the world once mass adoption took place. This was followed by an implosion in which a huge number of concepts, ideas and investments disappeared. A similar trend is developing in the adoption of IoT, and in digitalisation in general.

The dot-com bust was a rationalisation and a reality check for companies and their investors, with fortunes made and lost in the hype. Timing is key in Big Tech. If you are too early, you may end up developing a concept that not only ages quickly but gives competitors plenty to learn from and piggyback on, allowing them to build better technology that is more relevant and value driven. Often a cool idea is exactly that: a cool idea. Without real substance it does not achieve wide commercial adoption. Commercial viability ultimately rests on a product’s ability to produce ‘real value’, whether quantitative or qualitative.

The importance of timing

Timing is everything, and tech is hard to time.

If we look at the cross-section of IoT ‘platforms’, it is clear that almost all of them have been built in the cloud. Five years ago everything was in the cloud, so it is unsurprising that the cloud still dominates IoT discussions.

Anyone who has embarked on an IoT initiative up to this point has probably

  1. built a solution that resides in the cloud;
  2. leveraged the power of the cloud and its ability to centralise and leverage processing power from the supercomputers that exist there;
  3. adopted a top down approach incorporating the cloud as the central power behind the application.

The competitive landscape

Looking at the IoT industry, and at where the ‘competition’ and ‘incumbents’ sit in the current IoT cycle, it is evident that IoT development is in a perfect bubble that I believe is not far from rationalisation. I expect it to be less severe than in 2000, as investors have been more calculated, but there will certainly be a correction in the not so distant future. Driving this belief is that an event of this type is needed for eminence to be created. The companies that can lock into the IoT business value proposition and convert it into investor value will survive and gain eminence. There are a number of great technologies and concepts available, but only those able to truly unlock value will remain.

What is changing in IoT thinking

The key is to address the problems of interconnectivity from the bottom up: acknowledge the power of the cloud and Big Data, but also acknowledge that this power is greatly diminished, or even nullified, if the edge layer is not correctly managed.

The general platform interoperability discussion speaks to cloud interoperability. This is a hugely complex play that causes massive headaches for some of the most influential players as they try to fathom how to seamlessly integrate multiple platforms. APIs are the talk of the day and the current answer to this dilemma, but the approach is simply not sustainable or practical. On a whiteboard it might look great to have several platforms integrated via APIs and then plugged into some ESB via microservices, but I challenge you to build all of that while taking into consideration the small part these players initially did not deem necessary: the edge.

This methodology is hugely reliant on smart sensor technology that has the ability to push data into the cloud. There’s a heavy reliance on networks and, as a result, ‘platforms’ are struggling to grapple with edge technology, all the while hopeful that a ‘network’ will resolve this problem.

For some time now we’ve been saying that the edge is eating the cloud.

We’re not implying that the cloud will lose relevance. What we’re saying is that a true IoT ecosystem will become less and less reliant on the cloud and, in fact, that ecosystem design will rely heavily on edge capabilities.

A natural oversight, but a crucial detail destined to form an integral part of this industry’s ability to commercialise in the near future. The IoT industry is inhibited by an inability to create interconnectivity and interoperability at the edge.

Retrofit, decrease the barrier to entry and sweat the assets

Correctly designed and engineered, edge technology enables edge interoperability and, more importantly, the ability to retrofit into legacy systems. Legacy systems have, to a large extent, been disregarded, with current players relying on the ‘rip and replace’ mentality that has governed and, to a degree, plagued the IT industry since the beginning, befuddling brands that have become household names.

This winner-takes-all mentality is not congruent with the idea of a connected world and certainly does not embrace the concept of true scalability. Having to rip out and replace existing technology and infrastructure on your journey towards digitalisation introduces a huge amount of additional complexity, disruption and cost, all of which makes it a difficult sale to the business and contributes to the slow adoption rate of the 4th Industrial Revolution.

So whilst the ‘big dogs’ are all trying to figure out how to develop and ensure technology lock-in to secure future revenue, they are contributing to the mixed message being sent to the market, diluting the value of IoT technology as a tool for unlocking real business value.

Value is a simple exercise for any business leader – look at expenditure, then ROI. Satisfied? Great. Here’s the next question – is it relevant to my business?

All data is not the answer.

When we enter into discussions with big companies, the issue of legacy investments in technology at the edge comes up without fail. Remember that everyone is selling some type of cloud platform that is going to ‘change the business’, but that cloud engine is reliant on edge data: devices, sensors, machines, protocols, PLCs, SCADA systems, CCTV, access control systems, and so on. Clients start negotiating with each vendor and realise that, much as a single missing piece can ruin a 1000-piece holiday puzzle and make the whole exercise seem futile, the same is true of many algorithms and predictive applications. The true power of these platforms lies in their ability to provide companies with insights, and for this they are 100% dependent on correct, filtered, aggregated, curated, secure, real-time data from the edge; they need all the pieces of the data puzzle to build the Big Data picture.

In every environment, on every piece of the puzzle, there is information that is critical to the task at hand, and other information that is not needed in real time: whether a device needs to be serviced in a week’s time, whether stock will be depleted by the end of the month, and so on. Now consider a sensor with a fixed normal range that records only exceptions rather than all data all the time: as a basic statistic, you can reduce the amount of data passed in real-time monitoring environments by around 60–90%.
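The exception-only idea above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: readings inside a fixed normal band are suppressed at the edge, and only excursions are forwarded upstream.

```python
# Minimal sketch of exception-based ("report by exception") edge filtering.
# All names and the sample data are invented for illustration.

def filter_exceptions(readings, low, high):
    """Return only the readings that fall outside the normal [low, high] band."""
    return [r for r in readings if r < low or r > high]

# Ten temperature samples; most sit inside the 20-25 degree normal range.
readings = [21.0, 21.5, 22.1, 35.7, 21.9, 20.8, 19.2, 36.4, 21.3, 22.0]
alerts = filter_exceptions(readings, low=20.0, high=25.0)
print(alerts)  # [35.7, 19.2, 36.4]
```

Here only three of ten samples would be transmitted, a 70% reduction; real deployments tune the band (or use deadbands around the last reported value) per sensor.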

A normalised edge layer of physical and virtual intelligence, one that can be retrofitted, deployed and connected seamlessly into an ecosystem of existing technologies and things, radically reduces the cost and time of developing multiple edge integrations into disparate cloud applications.
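What such normalisation means in practice can be sketched as follows. The source names, field layouts and conversion factors below are all invented for illustration; the point is that payloads from disparate protocols are mapped into one shared record before anything leaves the edge, so every cloud application integrates against a single schema.

```python
# Hypothetical sketch of edge-layer normalisation: raw payloads from
# different device families are mapped into one common record shape.

def normalise(source, payload):
    """Map a raw device payload into a single shared schema."""
    if source == "modbus_plc":
        # Register-map style payload; register value is tenths of a degree.
        return {"device": payload["unit"], "metric": "temperature",
                "value": payload["reg_40001"] / 10.0, "unit": "C"}
    if source == "mqtt_sensor":
        # Already key/value, but with different field names.
        return {"device": payload["id"], "metric": payload["type"],
                "value": payload["val"], "unit": payload.get("u", "")}
    raise ValueError(f"unknown source: {source}")

print(normalise("modbus_plc", {"unit": "plc-7", "reg_40001": 215}))
# {'device': 'plc-7', 'metric': 'temperature', 'value': 21.5, 'unit': 'C'}
```

A real edge gateway would drive this from protocol adapters rather than hand-written branches, but the design choice is the same: one normalised record type, many raw sources.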

Being able to retrofit onto all deployed devices, whether analog or IP based, has huge benefits. It:

  • reduces disruption to business processes,
  • reduces the cost of implementation,
  • reduces the cost of training,
  • reduces the cost and impact of enterprise-wide change management,
  • reduces vulnerability and cyber risk through less technology disparity at the edge,
  • reduces data moving across the network,
  • reduces network cost and congestion,
  • reduces processing required at the cloud platform level, as the data has already been curated at the edge,
  • reduces the cost of maintaining edge-integrated gateways,
  • presents a smaller attack surface at the edge, as gateways are rationalised, and
  • simplifies real-time subsystem integration.

All of this allows us to better leverage the power of our cloud platform: we can now understand the upstream and downstream effects of an event-triggered occurrence, and effect dynamic, seamless recalibration and interoperability across all edge-connected devices.

We ensure that all the pieces of the puzzle are in the box, and ready to be pieced together to create the big picture.


Normalisation of data at the edge gateway layer forms the foundation for rapid digitalisation and digital transformation. The disruption everyone talks about is vested in an organisation’s ability to continue its business while iteratively and rapidly addressing its core issues through digitalisation. This brings more real-time visibility, allowing dynamic recalibration back into the business ecosystem to achieve optimised levels of production and efficiency: new ways of doing the same thing, better.

If you control the edge, you unlock the cloud: a bottom-up approach.


This article was written by Nico Steyn, CEO of IoT.nxt. Nico is a respected technology entrepreneur in South Africa and, with his team, a leading innovator in the development of new technologies offered by the Internet of Things. Prior to establishing IoT.nxt he was Managing Director of Pinnacle, a leading JSE-listed ICT distribution company, and was instrumental in the turnaround of Datanet, now a subsidiary of Pinnacle.
