kdb+ for sensor analytics

Kx Insights: Benefits of Utility Predictive Maintenance Analytics

17 Oct 2017

By Przemek Tomczak and Kate Jory

 

Utilities are going through significant modernization, with the adoption of smart grid technologies such as advanced metering, advanced distribution management, outage management, customer engagement and analytics. This modernization is creating a wealth of diverse data about assets, operations, and customers.

At the same time, the job of utilities is becoming more challenging, with pressure to reduce costs, competition from new technologies and energy providers, and the need to integrate renewable energy resources. These challenges, competition and pressures are driving innovation and transformation in the utilities industry.

Navigant has estimated that cumulative utility spending on asset management and condition monitoring systems for the power grid will total $49.2 billion during the ten-year period ending in 2023. These investments in preventative repairs are required to forestall the higher costs associated with letting assets run to failure.

How Utilities Benefit

Recognizing the need for industry-wide standards for asset management programs, the international standard ISO 55000 for Asset Management and the Publicly Available Specification (PAS) 55 were developed to guide the optimal management of physical assets. These are great resources for establishing a data-driven asset management program. They promote generally agreed-upon best practices for a data- and risk-based approach across the industry.

For example, one energy provider was able to predict failures of equipment weeks in advance with over 98% confidence by analyzing sensor data from equipment including temperature, vibration and sound, together with maintenance records and equipment manufacturer specifications. These types of data and analytics programs enable utilities to have better visibility of their entire system’s assets, and to incorporate this information into their predictive maintenance program and operations going forward.

Some examples of the use of data and analytics for improving operations and assets include:

  • Assessing the probability and consequences of asset failures by collating asset health and network model data with information on outages and previous failures.
  • Improving the accuracy of short- to long-term forecasts of demand by developing customer specific load profiles which incorporate disaggregated consumption information, weather, demographic and firmographic data.
  • Identifying risks to safety, such as energized wires, by correlating data from different sources in and around an outage.
  • Identifying anomalies and predicting events, failures or failure modes, to achieve proactive and prioritized maintenance and repair activity.
  • Identifying the root cause of failures or issues on systems and then undertaking targeted repairs.
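Anomaly detection of the kind described above often starts with something as simple as flagging readings that drift far from their recent history. As a minimal illustrative sketch (in Python, not the q/kdb+ a production Kx system would use, and with invented sensor values), a rolling z-score check might look like:

```python
from statistics import mean, stdev

def rolling_anomalies(readings, window=20, threshold=3.0):
    """Flag indices where a reading deviates from the trailing
    window's mean by more than `threshold` standard deviations."""
    flagged = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# A steady temperature signal with one injected spike at index 60
temps = [20.0 + 0.1 * (i % 5) for i in range(100)]
temps[60] = 35.0
print(rolling_anomalies(temps))  # → [60]
```

In practice the window, threshold, and statistic would be tuned per asset class, and the flagged readings would feed the prioritized maintenance queue rather than be acted on directly.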

The realization of these benefits is predicated on deciding what assets are important, what needs to be optimized, what needs to be measured and the analyses to be performed. These all depend on having the appropriate data.

Fortunately, utilities have a lot of data sources — from SCADA, power-line sensors, GIS, outage management systems, and smart meters. When the relevant data is integrated and collated in meaningful formats and time intervals, operators and asset managers can use it to synchronize predictive maintenance planning across many systems in near real-time.

Utilities Adapting to Big Data

As utilities integrate new data streams, increased data volumes may begin to strain existing operational and IT systems not designed for high volumes and frequencies of data. For example, systems set up for smart meter readings arriving once every few minutes to once per hour will need to accommodate synchrophasors and power-line sensors that generate measurements many times per second.
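One common way to bridge that frequency gap is to downsample the high-rate feed into the coarser time buckets the existing system expects (the same idea as q's `xbar` bucketing, though the sketch below is illustrative Python with made-up sample values, not the Kx implementation):

```python
from collections import defaultdict

def bucket_average(samples, bucket_seconds=60):
    """Downsample (timestamp_seconds, value) pairs into per-bucket
    averages, the granularity a meter-oriented system might store."""
    buckets = defaultdict(list)
    for ts, value in samples:
        buckets[ts - ts % bucket_seconds].append(value)
    return {b: sum(v) / len(v) for b, v in sorted(buckets.items())}

# A synchrophasor-style feed: 10 samples per second for 2 minutes
feed = [(t / 10, 100.0 + (t % 7)) for t in range(1200)]
minute_avgs = bucket_average(feed)
print(len(minute_avgs))  # → 2 one-minute buckets from 1,200 raw samples
```

The trade-off is that downsampling discards the sub-second detail that anomaly detection needs, which is why platforms like Kx keep the raw stream and compute aggregates on demand instead.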

The combined streaming, real-time and historical data analytics capabilities of technologies like Kx are helping companies accelerate their use of diagnostic and predictive analytics on extremely large datasets. They do this by handling the increasing velocity and volume of data and providing a consistent, integrated view of operations, while also supporting real-time anomaly detection and decision making.

Kx is augmenting existing data collection, data historian and asset management systems. Kx is also helping companies travel backwards in time to replay and investigate historical events and conditions, as well as enabling machine learning algorithms to make better predictions of events and system conditions.
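The core of such historical replay is the "as-of" lookup: for any moment being investigated, retrieve the last recorded state of each sensor at or before that time. This is what kdb+'s as-of join (`aj`) does natively; a minimal illustrative sketch of the idea in Python, with hypothetical transformer readings, is:

```python
import bisect

def as_of(times, values, event_time):
    """Return the last recorded value at or before event_time —
    the sensor's state 'as of' the moment being replayed.
    `times` must be sorted ascending and parallel to `values`."""
    i = bisect.bisect_right(times, event_time)
    return values[i - 1] if i else None

# Transformer temperature readings every 15 minutes (times in minutes)
times = [0, 15, 30, 45, 60]
temps = [68.0, 71.5, 74.0, 79.5, 88.0]

# What did the transformer look like just before a trip at minute 50?
print(as_of(times, temps, 50))  # → 79.5
```

Replaying an event then amounts to stepping event timestamps through such lookups across every relevant sensor, reconstructing the system state leading up to a failure.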

The utility business model is being challenged and transformed by new sources of data. With the cost of sensors dropping and the availability and adoption of Big Data technologies increasing, critical assets can now largely be monitored remotely, with Big Data predictive analytics guiding maintenance. The benefits for utilities are enhanced reliability and uptime, reduced costs and improved safety.

 

Przemek Tomczak is Senior Vice-President Internet of Things and Utilities at Kx. Previously, Przemek held senior roles at the Independent Electricity System Operator in Ontario, Canada and top-tier consulting firms and systems integrators. Przemek has a CPA and a background in business, technology and risk management.

Kate Jory is a Business Development Executive for Kx and is currently based in Seoul, South Korea. Her academic and business experience is in physics, marketing and sales.
