
Kx Insights: Benefits of Utility Predictive Maintenance Analytics

17 Oct 2017

By Przemek Tomczak and Kate Jory


Utilities are going through significant modernization, with the adoption of smart grid technologies such as advanced metering, advanced distribution management, outage management, customer engagement and analytics. This modernization is creating a wealth of diverse data about assets, operations, and customers.

At the same time, the job of utilities is becoming more challenging, with pressure to reduce costs, competition from new technologies and energy providers, and the need to integrate renewable energy resources. These challenges, competition and pressures are driving innovation and transformation in the utilities industry.

Navigant has estimated that cumulative utility spending on asset management and condition monitoring systems for the power grid will total $49.2 billion during the ten-year period ending in 2023. These investments in preventative repairs are required to forestall the higher costs associated with letting assets run to failure.

How Utilities Benefit

Recognizing the need for industry-wide standards for asset management programs, the international standard ISO 55000 for Asset Management and the Publicly Available Specification (PAS) 55 were developed for the optimal management of physical assets. These are great resources for establishing a data-driven asset management program, and they promote generally agreed-upon best practices for a data- and risk-based approach across the industry.

For example, one energy provider was able to predict failures of equipment weeks in advance with over 98% confidence by analyzing sensor data from equipment including temperature, vibration and sound, together with maintenance records and equipment manufacturer specifications. These types of data and analytics programs enable utilities to have better visibility of their entire system’s assets, and to incorporate this information into their predictive maintenance program and operations going forward.
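The kind of sensor-based failure prediction described above can be approximated, in miniature, by flagging readings that drift far from their recent history. The sketch below is illustrative Python, not kdb+/q, and the window size and threshold are arbitrary choices rather than figures from the article:

```python
import statistics

def zscore_anomalies(readings, window=20, threshold=3.0):
    """Flag readings that deviate sharply from the trailing window.

    `readings` is a list of (timestamp, value) pairs; `window` and
    `threshold` are illustrative parameters, not values from the article.
    """
    anomalies = []
    for i in range(window, len(readings)):
        history = [v for _, v in readings[i - window:i]]
        mean = statistics.fmean(history)
        stdev = statistics.stdev(history)
        ts, value = readings[i]
        if stdev > 0 and abs(value - mean) / stdev > threshold:
            anomalies.append((ts, value))
    return anomalies

# A stable temperature series with one sudden spike injected.
series = [(t, 60.0 + 0.1 * (t % 5)) for t in range(40)]
series[30] = (30, 95.0)  # injected fault
print(zscore_anomalies(series))  # only the spike at t=30 is flagged
```

In practice this trailing-window statistic would be computed over streaming sensor data and combined with maintenance records and manufacturer specifications, as the example in the article describes.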

Some examples of the use of data and analytics for improving operations and assets include:

  • Assessing the probability and consequences of asset failures by collating asset health and network model data with information on outages and previous failures.
  • Improving the accuracy of short- to long-term forecasts of demand by developing customer specific load profiles which incorporate disaggregated consumption information, weather, demographic and firmographic data.
  • Identifying risks to safety, such as energized wires, by correlating data from different sources in and around an outage.
  • Identifying anomalies and predicting events, failures or failure modes, to achieve proactive and prioritized maintenance and repair activity.
  • Identifying the root cause of failures or issues on systems and then undertaking targeted repairs.
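The first bullet, assessing the probability and consequences of asset failures, can be pictured as a simple probability-times-consequence score. All field names and weightings below are hypothetical, chosen only to illustrate the idea of ranking assets for prioritized maintenance:

```python
def failure_risk(assets):
    """Rank assets by a toy probability-times-consequence score.

    Hypothetical schema: each asset carries a health index
    (0 = failed, 100 = new), a past failure count, and the load it
    serves in MW as a proxy for the consequence of an outage.
    """
    def score(a):
        probability = (100 - a["health"]) / 100 * (1 + a["failures"])
        consequence = a["load_mw"]
        return probability * consequence

    return sorted(assets, key=score, reverse=True)

fleet = [
    {"id": "TX-01", "health": 85, "failures": 0, "load_mw": 40},
    {"id": "TX-02", "health": 40, "failures": 2, "load_mw": 25},
    {"id": "TX-03", "health": 70, "failures": 1, "load_mw": 60},
]
print([a["id"] for a in failure_risk(fleet)])  # highest-risk asset first
```

A real program would derive the probability term from the asset health and failure-history data the bullets describe, rather than from a fixed formula.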

The realization of these benefits is predicated on deciding what assets are important, what needs to be optimized, what needs to be measured and the analyses to be performed. These all depend on having the appropriate data.

Fortunately, utilities have a lot of data sources: SCADA, power-line sensors, GIS, outage management systems, and smart meters. When the relevant data is integrated and collated in meaningful formats and time intervals, operators and asset managers can use it to synchronize predictive maintenance planning across many systems in near real-time.

Utilities Adapting to Big Data

As utilities integrate new data streams, increased data volumes may begin to strain existing operational and IT systems not designed for high volumes and frequencies of data. For example, systems set up for smart meter readings once every few minutes to once per hour will need to accommodate synchrophasors and power-line sensors that generate measurements many times per second.
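The jump in volume is easy to quantify. Assuming a meter polled every 15 minutes and a synchrophasor reporting 30 times per second (both figures are assumptions for illustration; the article gives only ranges, though 30 frames per second is a common synchrophasor reporting rate on 60 Hz grids), the per-device arithmetic looks like this:

```python
SECONDS_PER_DAY = 24 * 60 * 60

def daily_readings(interval_s=None, rate_hz=None):
    """Readings per device per day, from a polling interval or a sample rate."""
    if rate_hz is not None:
        return int(SECONDS_PER_DAY * rate_hz)
    return SECONDS_PER_DAY // interval_s

meter = daily_readings(interval_s=15 * 60)  # smart meter, one reading per 15 min
pmu = daily_readings(rate_hz=30)            # synchrophasor, 30 measurements/s
print(meter, pmu, pmu // meter)             # prints: 96 2592000 27000
```

Under these assumptions a single synchrophasor produces tens of thousands of times more readings per day than a smart meter, which is why systems sized for metering data struggle with sensor streams.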

The combined streaming, real-time and historical data analytics capabilities of technologies like Kx are helping companies accelerate their use of diagnostic and predictive analytics for extremely large datasets. They do this by supporting the increasing velocity and volume of data, providing a consistent and integrated view of operations, while also supporting real-time anomaly detection and decision making.

Kx augments existing data collection, data historian and asset management systems. It also helps companies travel back in time to replay and investigate historical events and conditions, and enables machine learning algorithms to make better predictions of events and system conditions.
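The replay capability can be pictured as querying an archive for a time window and stepping through the events in order. The following is an illustrative in-memory Python sketch, not how a kdb+ historian is actually queried:

```python
from datetime import datetime, timedelta

def replay(events, start, end):
    """Yield archived events inside [start, end) in time order.

    Illustrative only: a real historian (kdb+ included) would query
    on-disk, time-partitioned storage; here `events` is just an
    in-memory list of (timestamp, payload) pairs.
    """
    for ts, payload in sorted(e for e in events if start <= e[0] < end):
        yield ts, payload

# Hypothetical archive: one reading every 10 minutes over an hour.
t0 = datetime(2017, 10, 1, 12, 0)
events = [(t0 + timedelta(minutes=m), f"reading-{m}") for m in range(0, 60, 10)]

# Replay the half-hour window around a suspected fault.
fault_window = list(replay(events, t0 + timedelta(minutes=15),
                           t0 + timedelta(minutes=45)))
print([p for _, p in fault_window])
```

Replaying a bounded window like this is what lets engineers investigate the conditions leading up to a failure after the fact.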

The utility business model is being challenged and transformed by new sources of data. With the cost of sensors dropping and the availability and adoption of Big Data technologies increasing, critical asset maintenance can now largely be monitored remotely with Big Data predictive analytics. The benefits for utilities are enhanced reliability and uptime, cost reduction and improved safety.


Przemek Tomczak is Senior Vice-President Internet of Things and Utilities at Kx. Previously, Przemek held senior roles at the Independent Electricity System Operator in Ontario, Canada and top-tier consulting firms and systems integrators. Przemek has a CPA and a background in business, technology and risk management.

Kate Jory is a Business Development Executive for Kx and is currently based in Seoul, South Korea. Her academic and business experience is in physics, marketing and sales.
