The IoT data deluge in industry and manufacturing


The Internet of Things (IoT), at a high level, refers to a network of physical devices, such as embedded sensors, driverless vehicles, smartphones and tablets, wearables, and home appliances, that create and share information without human intervention.

Even though there is currently a strong drive towards IoT and digitisation, the concept has been around for at least a decade, with interconnected devices and applications already prevalent in industry and consumer products.

What has changed recently is the augmented capability of these devices, faster comms networks, the standardisation of communication protocols and more affordable IT, all of which are giving the IoT phenomenon a turbocharge. As such, it is transforming operational processes and product lifecycles across a range of markets and applications. That said, the detailed level of information current IoT devices are capable of capturing should be empowering manufacturers to leverage the benefits of Industry 4.0 and operate truly automated production and assembly lines, but this isn't happening as quickly as you might expect. Whilst some of the barriers may be cultural or finance-related, a much bigger barrier, in many instances, is the mismatch between highly intelligent devices and substandard data handling and storage infrastructure.

An unavoidable consequence of IoT, and of the devices and applications it powers, is the colossal amount of constantly changing data generated as a result. This data needs to be processed in real time if meaningful conclusions are to be drawn and swift decisions made to avoid bottlenecks and keep production lines operational, as the smallest of delays can have major repercussions further down the line. This is particularly important for manufacturers reliant on artificial intelligence (AI) and machine learning (ML). Both disciplines are data intensive, bandwidth hungry and require robust storage management processes that enable parallel processing at scale. Indeed, the value of any IoT-derived data is incredibly short-lived, and unless the associated storage management infrastructure can keep pace with the constantly changing data, an IoT investment can very quickly become an expensive white elephant.

So, what happens to all the IoT data?

In simple terms, the data flow in an industrial IoT (IIoT) network is a three-layer process, as follows:

  1. Data sources:

An IIoT network gathers data from an array of devices and/or embedded sensors. Depending on the availability of appropriate infrastructure, the sensitivity of the data, and the nature of the industry, this information is either processed locally or transported via an edge gateway to a colocation facility or the cloud for processing and handling.

  2. Data storage:

The data captured by these embedded technologies then needs to be appropriately stored for short-term and long-term applications. Some of the data might require immediate processing depending on the application (the operability of an industrial robot, for example), whereas some might need to be securely transported and/or stored for future use.

  3. Data analytics & applications:

This layer analyses the data so useful information can be generated and acted upon to speed up operations and process control. Detailed insights into production lines and product lifecycles enable slicker operations through predictive maintenance and other troubleshooting measures, avoiding the downtime and outages that can impact efficiency and profitability.
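To make the three layers concrete, the short Python sketch below walks simulated vibration readings through source, storage and analytics. It is a minimal illustration under assumed names and thresholds (`read_sensor`, `VIBRATION_LIMIT_MM_S`, the hot/cold split are hypothetical), not a description of any specific vendor's stack.

```python
import time
import random
from collections import deque

VIBRATION_LIMIT_MM_S = 7.1      # hypothetical alarm threshold (mm/s RMS)
HOT_WINDOW = deque(maxlen=100)  # short-term buffer for real-time checks
cold_store = []                 # stand-in for long-term storage / cloud upload

def read_sensor():
    """Layer 1 - data source: simulate a vibration reading from one machine."""
    return {"ts": time.time(), "vibration": random.uniform(0.0, 10.0)}

def store(reading):
    """Layer 2 - storage: keep a hot window locally, queue everything for the cloud."""
    HOT_WINDOW.append(reading)
    cold_store.append(reading)  # in practice: batch-uploaded via an edge gateway

def analyse():
    """Layer 3 - analytics: flag readings that suggest maintenance is due."""
    recent = [r["vibration"] for r in HOT_WINDOW]
    if recent and max(recent) > VIBRATION_LIMIT_MM_S:
        print("ALERT: vibration above limit - schedule maintenance")

for _ in range(10):  # one pass per sensor poll
    store(read_sensor())
    analyse()
```

In a real deployment, the hot window would live on edge hardware next to the machine so that alerts are raised within milliseconds, while the cold store would be batch-uploaded to a colocation facility or the cloud for longer-term analysis.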

Data storage is a small cog in a big IoT wheel

Storage is just one element of the IoT data processing ecosystem, but it is an element that is becoming increasingly integral, as insufficient storage capacity is detrimental to operability. The storage capabilities of any IoT network must ensure data integrity, reliability, and safety. Moreover, they must be agile enough to support a range of environments, technologies, and applications, whilst facilitating seamless interconnectivity between edge gateways, other edge devices and the cloud. Substandard storage is the Achilles' heel of many manufacturers, with outdated comms rooms preventing them from harnessing IoT data to its full potential. Insufficient storage capacity is such an issue that, according to industry research, between 60% and 73% of machine-generated data goes unanalysed.

The IoT data that powers Industry 4.0 needs to be processed as close to the source as possible for operability and safety reasons, and organisations reliant on mission-critical data are quickly realising that conventional colocation facilities cannot always assure the ultra-high speed and ultra-low latency needed. Paradoxically, many on-premises facilities are not fit for purpose as far as IoT data storage management is concerned because they are unable to house the specialist IT needed. And even if they are, there is seldom room for expansion as capacity requirements escalate. High-performance computing (HPC) systems, because of their sheer scale, GPU-based processing power, associated cooling technology and high energy consumption, need specialist facilities that are fireproof, weatherproof, offer seamless connectivity to the cloud and support dynamic power consumption.

Commissioning a bespoke facility robust enough to meet the demands of IoT data is a non-starter for many manufacturers because of the high costs involved: anything between £7 million and £12 million per MW, with lead times in excess of 18 months. What is needed is a viable means of providing centralised, HPC-grade data centre capabilities locally, without the expense of building a bespoke facility. Until now this has not been possible due to financial constraints, complex project management requirements and excessive deployment times. However, the IoT data handling quandary in manufacturing is about to be transformed thanks to a disruptive approach to edge data centre infrastructure being championed by UK firm DataQube Global.

Recognising the need for data handling at source, the company has developed a portfolio of podular data centres for internal and external use that deliver the high-speed, high-performance, low-latency processing needed for IoT data. Installations range from less than 10 watts to more than 100 MW, and individual pods can operate independently as mini data centres or be combined in stacks, depending on the size of a manufacturing facility and the storage capacity needed.

Without a cost-effective and viable means of delivering HPC at the edge, IoT data will remain untapped regardless of the accuracy or sophistication of the associated embedded sensors. Edge data centre infrastructures must adapt to meet this changing data processing landscape.

Author details: David Keegan, Group CEO of DataQube Global