Why manufacturers are being left in the dark when it comes to data
The Fourth Industrial Revolution has been characterised by a demand for perfection: manufacturers are expected to produce the perfect product as efficiently as possible in terms of time and resources. But what happens when they run into unexpected obstacles such as interruptions or malfunctions?
Breakdowns in production lines can cost manufacturers anywhere from $50,000 to $2 million per hour.1 Unsurprisingly, manufacturers often can’t afford the expenses associated with unforeseen downtime — yet according to technology research company Vanson Bourne,2 82% of companies have experienced at least one unplanned outage over the past three years.
While manufacturers are investing heavily in data-led technologies such as the Industrial Internet of Things (IIoT), machine learning and artificial intelligence (AI), they are not always equipped with the analytics skills needed to use the data these technologies gather to its full potential. In fact, a recent Capgemini study3 found that this is the case for almost 60% of organisations.
We can see a clear gap between the potential of technologies like the IIoT and the realisation of this potential. So what’s the problem?
A stab in the dark
‘Dark data’ is a common but little-acknowledged problem experienced by most manufacturers. It occurs when a company is generating information but is unable to use it in a meaningful way.
Often, dark data rears its head when data is created by machines but is not visible and is therefore never used to inform decisions. More frequently, the disconnect occurs where there are no adequate storage facilities to retain data long enough to process it, or where the sheer volume of data outstrips what data science teams can handle. Alternatively, companies are unable to hire enough data scientists to process all the gathered information, so they end up working with only a limited sample. In addition, many predictive maintenance systems send alerts for too many anomalies (false positives) or too few (false negatives), and manufacturers are paying the price.
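To make the false positive/false negative trade-off concrete, the minimal Python sketch below uses a hypothetical set of vibration readings and failure labels (invented for this example, not drawn from any particular system) to count how often a simple fixed-threshold alert fires when no failure follows and how often it stays silent when one does.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical hourly vibration readings for one machine (mm/s RMS).
normal = rng.normal(loc=2.0, scale=0.5, size=500)     # healthy operation
degrading = rng.normal(loc=3.0, scale=0.9, size=50)   # drifting towards failure
readings = np.concatenate([normal, degrading])

# Ground truth: only the last 50 hours precede an actual failure.
failure_ahead = np.concatenate([np.zeros(500, dtype=bool),
                                np.ones(50, dtype=bool)])

# A naive fixed-threshold alert, as used by many rule-based systems.
THRESHOLD = 3.0  # mm/s
alerts = readings > THRESHOLD

false_positives = np.sum(alerts & ~failure_ahead)   # alarms with no failure coming
false_negatives = np.sum(~alerts & failure_ahead)   # missed warnings before failure

print(f"Alerts raised:   {alerts.sum()}")
print(f"False positives: {false_positives}")
print(f"False negatives: {false_negatives}")
```

Tightening the threshold cuts down on nuisance alarms but misses more genuine degradation, and loosening it does the reverse, which is exactly the trade-off described above.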
Getting predictive maintenance right can have a real impact on the bottom line. To give a practical example, more than a third of manufacturers lose 1–2% of their annual sales4 to scrap and rework; for a manufacturer with $100 million in annual sales, that is $1–2 million a year. This loss could be avoided by putting effective systems in place to identify issues before the quality check stage, ultimately saving valuable resources.
The light at the end of the tunnel
The good news is there has been a shift in industry thinking, and some manufacturers are now implementing effective ways of storing and processing data.
The IIoT and AI are among a slew of technologies that detect early signals of future problems and help manufacturers take proactive action to prevent them. Automating the analysis of a growing number of datasets is key to mitigating the risk of dark data and predicting machine health accurately.
Applying a cognitive approach to predictive maintenance is a good way to kick off this process. While a manual approach to predictive maintenance is useful for identifying common issues across all machines, it can only draw on known problems from past experience and assumes that only outliers are anomalies.
By implementing a cognitive, ‘machine-first’ approach to anomaly detection, manufacturers can create a mechanism where the algorithms can adapt to changing conditions and learn the data domain for each individual machine. This knowledge is then transferred across similar machines and validated through feedback from subject matter experts.
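As a rough illustration of the 'machine-first' idea, the Python sketch below fits a separate anomaly detection model (scikit-learn's IsolationForest) per machine, so each model learns that machine's own operating envelope rather than applying one global outlier rule. The sensor features, machine names and contamination rate are assumptions made for the example, not part of any specific product.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=42)

# Hypothetical sensor histories (temperature, vibration, current) per machine.
machines = {
    "press_01": rng.normal(loc=[60.0, 2.0, 10.0], scale=[2.0, 0.3, 0.5], size=(1000, 3)),
    "press_02": rng.normal(loc=[75.0, 3.5, 14.0], scale=[3.0, 0.5, 0.8], size=(1000, 3)),
}

# Learn the normal operating envelope of each machine individually,
# instead of applying a single outlier rule across the whole fleet.
models = {}
for machine_id, history in machines.items():
    model = IsolationForest(contamination=0.01, random_state=0)
    model.fit(history)
    models[machine_id] = model

# Score a new reading against the model for its own machine.
new_reading = np.array([[82.0, 4.8, 15.5]])     # unusually hot, vibrating press_02
flag = models["press_02"].predict(new_reading)  # -1 = anomaly, 1 = normal
print("press_02 anomaly" if flag[0] == -1 else "press_02 normal")
```

In a real deployment, the per-machine models would be retrained as conditions change and their flags validated against feedback from subject matter experts, as described above; knowledge could then be transferred by starting a new machine from a similar machine's model rather than from scratch.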
In layman's terms, a cognitive approach will eventually create a fully automated, cognitively enabled machine learning system that can predict anomalies before they occur. Imagine how many resources could be saved if manufacturers were alerted to potential downtime and could fix the issue before it caused a costly interruption.
The next frontier
Even in this age of digitisation, manufacturers are still left in the dark when it comes to knowing when equipment is due for maintenance, upgrade or replacement.
Investing in data-led technologies and taking a cognitive approach can help build a rock-solid foundation for accurate anomaly detection and enable truly efficient predictive maintenance strategies. Using these technologies to solve the dark data problem offers a competitive advantage to any organisation brave enough to take the plunge.
References
1. Vanson Bourne Ltd 2017, 'After the Fall: Cost, Causes and Consequences of Unplanned Downtime', published on behalf of ServiceMax, <https://lp.servicemax.com/Vanson-Bourne-Whitepaper-Unplanned-Downtime-LP.html>
2. Ibid.
3. Subrahmanyam KVJ 2018, 'Unlocking the Internet of Things: Why scale is the smart route to high value', Capgemini, <https://www.capgemini.com/2018/04/unlocking-the-internet-of-things-why-scale-is-the-smart-route-to-high-value/>
4. Vanson Bourne Ltd, op. cit.