kdb+ for the Industrial Internet of Things 4.0

Kx Insights: IIoT for Predictive Maintenance and Big Data

9 Jan 2018

By James Reyes-Picknell and Przemek Tomczak

 

We are often asked about the impact of the Industrial Internet of Things (IIoT) on equipment maintenance for industrial companies. When it comes to repairs, we don’t anticipate that much will change because of the IIoT, except in identifying when repairs are needed. Making systems safe after a failure, taking equipment apart and replacing components will always require human intervention. In the area of proactive maintenance, however, we see a big impact and huge potential benefits.

IIoT for predictive maintenance enables more extensive monitoring of equipment and processes at a much lower cost than traditional methods, and delivers actionable warnings to prevent or minimize the consequences of an impending failure. Where IIoT for predictive maintenance is deployed in a well-designed program using Reliability Centered Maintenance (RCM), it will reduce surprise outages, lost production, extensive repairs and secondary damage, and increase safety.

IIoT and Big Data – it’s more than just addressable sensors

The underlying premise of the IIoT is that sensors placed on industrial equipment, each with an individual IP address and the ability to communicate wirelessly with a network, allow us to see remotely what is going on. The stream of signals provided by these sensors is the raw data from which valuable insight can be gleaned. The data is analyzed using predictive algorithms, and systems are programmed to send operators notifications such as alarms, warnings or work requests.

A combination of statistical and machine learning algorithms is effective in analyzing IIoT sensor data for predictive maintenance. For example, a model developed at the University of Toronto can predict time to failure fairly accurately by correlating historical failure data with condition monitoring signals. Companies typically combine several of these types of models into a single platform that can handle various streams of condition monitoring signal data when building their own systems.
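To give a rough sense of how such a model works, here is a minimal sketch in Python (chosen purely for illustration). It fits a linear trend to a degradation signal, such as a rising vibration amplitude, and extrapolates to a failure threshold. The function names, sample data and threshold are invented for the example; real models, including the one cited above, are far more sophisticated.

```python
# Minimal remaining-useful-life sketch: fit y = slope*t + intercept to a
# degradation signal, then extrapolate to the failure threshold.

def fit_trend(times, values):
    """Ordinary least-squares slope and intercept for the signal."""
    n = len(times)
    mean_t = sum(times) / n
    mean_v = sum(values) / n
    cov = sum((t - mean_t) * (v - mean_v) for t, v in zip(times, values))
    var = sum((t - mean_t) ** 2 for t in times)
    slope = cov / var
    return slope, mean_v - slope * mean_t

def time_to_failure(times, values, threshold):
    """Hours remaining until the trend crosses the failure threshold."""
    slope, intercept = fit_trend(times, values)
    if slope <= 0:
        return None  # no degradation trend detected
    return (threshold - intercept) / slope - times[-1]

# Example: vibration amplitude rising 0.5 units/hour toward a limit of 10.
hours = [0, 1, 2, 3, 4]
vibration = [2.0, 2.5, 3.0, 3.5, 4.0]
print(time_to_failure(hours, vibration, 10.0))  # -> 12.0 hours remaining
```

In practice the extrapolation would be probabilistic (a confidence interval on time to failure), and multiple condition signals would be combined, but the ingredients, historical trend plus a failure limit, are the same.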

Implementing a Predictive Maintenance and Analytics Solution

We suggest some key steps when implementing a predictive maintenance and analytics solution:

  • Apply a robust framework for Reliability Centered Maintenance, such as UPTIME and RCM-R®
  • Identify what type of sensor to install, and what location/position will best provide meaningful data streams; this is typically determined through the RCM process.
  • Deploy sufficient numbers of sensors to collect needed data.
  • Gather and transmit streams of temporal, or time-series, data from sensors for processing and analysis at a frequency that will give sufficient information to facilitate more accurate troubleshooting.
  • Store the generated data for future analysis and historical investigations.
  • Use algorithms that can handle multiple streams of data in real or near-real time.
  • Use multiple algorithms in combination as a single service.
  • Implement the appropriate technology platform that can ingest, process, store, and analyze the velocity and volume of data, and power the predictive algorithms.
  • Develop, or acquire, expert technical support so the technology and algorithms deliver on their promise.
  • Measure and compare your results as you go so you can continually improve your RCM, using the data and experience gathered to validate and refine predictive maintenance assumptions.
  • Communicate frequently and quickly to management to trigger further field investigations and work request actions.
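Several of the steps above, ingesting multiple streams, processing them in near-real time, and communicating warnings quickly, can be sketched in a few lines. The following Python fragment is purely illustrative: the sensor name, window size and sigma limit are assumptions for the example, not recommendations, and a production system would persist the data for historical analysis rather than keep it only in memory.

```python
# Hedged sketch: keep a rolling window of readings per sensor and raise an
# alert when a new reading drifts outside a simple statistical band.
from collections import defaultdict, deque
from statistics import mean, stdev

WINDOW = 20          # readings retained per sensor
SIGMA_LIMIT = 3.0    # alert when a reading exceeds mean +/- 3 sigma

windows = defaultdict(lambda: deque(maxlen=WINDOW))

def ingest(sensor_id, timestamp, value):
    """Store a reading; return an alert dict if it looks anomalous."""
    history = windows[sensor_id]
    alert = None
    if len(history) >= 5:  # need some history before judging
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(value - mu) > SIGMA_LIMIT * sigma:
            alert = {"sensor": sensor_id, "time": timestamp, "value": value}
    history.append(value)
    return alert

# Example: a bearing-temperature stream with one spike at the end.
for t, v in enumerate([70, 71, 70, 72, 71, 70, 71, 95]):
    a = ingest("bearing_temp", t, v)
    if a:
        print("ALERT:", a)
```

A real deployment would replace the in-memory dictionary with a time-series database and the 3-sigma band with the predictive models discussed earlier, but the shape of the pipeline, ingest, window, evaluate, notify, is the same.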

The need for speed in IIoT predictive analytics

One of the biggest technical hurdles companies face with IIoT predictive maintenance systems is data bottlenecks that slow down their response times. The goal of a solution is to remove any friction or delay from the moment data is generated to the moment the resulting insights trigger action in the field. This will become a greater challenge as more sensors and more frequent measurements are sent wirelessly to networks. Companies that have been relying on manual methods of predictive maintenance will now begin receiving ever-growing volumes of IIoT data. New infrastructure, as well as new technologies, will be required to provide the bandwidth to accommodate the data streams.

One industry that has faced this challenge of exploding volumes of time series data is the financial services sector. Over the past two decades it has largely replaced voice and face-to-face trading with algorithmic trading. Banks and other trading organizations take in large volumes of real-time trade and quote data from stock exchanges to make microsecond decisions about buying and selling securities. They are simultaneously drawing on many terabytes of historical trade data using computer models and machine learning programs to make decisions.

The in-memory database platform kdb+, from Kx, has been part of this transition in trading technology from the beginning. Kdb+ is designed to process escalating velocities and volumes of time-series data faster than traditional technologies. Widely implemented across the capital markets industry, kdb+ is also now being used in high-precision manufacturing and predictive maintenance applications. The use case for kdb+ in financial services is similar to the use case for IIoT: both industries need a technology that can efficiently ingest, process, store and analyze real-time and historical streams of data, whether from sensors or stock exchanges, to make microsecond decisions.

For industrial companies looking to build IIoT predictive maintenance systems, choosing the right technology is essential. A successful implementation of RCM using IIoT sensor data must support growing volumes of sensor data and continual development and refinement of its predictive algorithms.

This combination of process and technology has the potential to disrupt the predictive maintenance world, and for many companies it will mean reduced downtime, fewer emergency repairs, lower repair costs, increased asset availability and increased revenues.

 

James Reyes-Picknell is Principal Consultant at Conscious Asset, a specialist consulting and training firm that helps companies increase productivity, decrease costs and get the most value from their physical assets. James has over 40 years’ experience in contract engineering and consulting. He is based in Ontario, Canada.

Przemek Tomczak is Senior Vice-President, Internet of Things and Utilities at Kx. Previously, Przemek held senior roles at the Independent Electricity System Operator in Ontario, Canada and top-tier consulting firms and systems integrators. Przemek has a CPA, CA and has a background in business, technology and risk management.

 

© 2018 Kx Systems
Kx® and kdb+ are registered trademarks of Kx Systems, Inc., a subsidiary of First Derivatives plc.
