IoT Sensor Data Analytics Driving Utility Innovation

5 Feb 2019

By Przemek Tomczak

The utility industry has always been data-intensive in monitoring grids and assets, and using meter data for billing purposes. But today more than ever, utilities are realizing there is untapped value in their operational data. As a result, there is a push for greater digitalization that is transforming utilities’ IT infrastructure and business processes. Propelling this digital overhaul are demands from tech-savvy customers, competitive pressures, and basic shifts in how electricity is produced with distributed energy resources (DER), such as renewables, playing a larger role.

Utility sensor data essential for digitalization

Because utilities provide an essential service for the functioning of our society, they are under tremendous pressure to deliver energy reliably and safely, without interruption, at as low a cost as possible. Managing their complex networks requires a high degree of situational awareness, which necessitates integrating diverse data from all systems, from SCADA, to DER, to wide-area monitoring systems, energy markets, and energy consumers.

The digitalization of utilities is accelerating with the falling cost of sensors and the growing availability of communications and connectivity. Utilities are deploying more sensors than ever on transmission, distribution, and generation assets, as well as leveraging more data from advanced metering infrastructure such as smart meters. To predict or detect faults on high-value, critical assets, measurements are now being sampled at higher frequencies, more than fifty times per second. This data is being used for advanced analytics correlating behaviors across different parts of the distribution and transmission systems.
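As a rough illustration of why higher sampling frequencies matter for fault detection, the sketch below flags readings that deviate sharply from their trailing window. This is a minimal, hypothetical example (the function name, threshold, and synthetic voltage data are all illustrative, not any utility's actual method):

```python
from statistics import mean, stdev

def detect_faults(samples, window=50, threshold=4.0):
    """Flag indices whose value deviates sharply from the trailing
    window; a simple stand-in for fault detection on high-frequency
    (50+ samples per second) sensor streams."""
    faults = []
    for i in range(window, len(samples)):
        trailing = samples[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma > 0 and abs(samples[i] - mu) > threshold * sigma:
            faults.append(i)
    return faults

# A steady voltage reading with one injected excursion.
readings = [230.0 + 0.1 * (i % 5) for i in range(200)]
readings[120] = 260.0  # simulated spike
print(detect_faults(readings))  # -> [120]
```

A transient like the one above is invisible in monthly or even 15-minute averages; only high-frequency sampling exposes it.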

The traditional role of meter data in billing and planning is being revisited, with utilities in Europe beginning to use innovative data analytics products on one-second-interval data collected from behind residential, commercial, and industrial consumer meters.

Taken together, I see the velocity and volume of data that utilities will have to manage and analyze increasing by ten to more than one hundred fold over the next five years.

Challenges to existing systems

Most of the software deployed by utilities was designed when operational, IT, and enterprise systems were kept separate and used by different teams, with very little data sharing across departments and systems. These systems also supported significantly less data, a few hundred to a few thousand measurements per second, as well as planning and billing based on periodic data (often collected monthly) and profiles of energy consumption.

It is recognized today that when more granular data can be collected, analyzed, and stored than legacy systems allow, business and operations analysts can make better-informed decisions. The limited visibility legacy software provides into the data means that assumptions about behaviors and trends must be more conservative. This results in higher levels of operating contingencies than are needed when analysts have more detailed and reliable information at their fingertips.

Existing systems are also challenged in making their data available promptly. Typically, data from smart meters, or other devices and equipment, is batched and then transmitted. Whether the batching interval is five minutes, 15 minutes, or one hour, this delay can be significant when analysts want to derive actionable information from their IoT network. Software that can analyze real-time data and batched data, and integrate both into the same system, can raise the value of this data significantly for analysis and reporting. Responsive visualizations made possible by the latest software can include alerts and triggers to speed analyst response times as well.
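The real-time/batch integration described above can be sketched as a single time-ordered store that accepts both live readings and late-arriving batch files, with batch data correcting earlier live values. This is an illustrative sketch only; the class and method names are invented for the example and do not represent any product's API:

```python
import bisect

class MeterSeries:
    """Single time-ordered store accepting both live streaming
    readings and late-arriving batch files (hypothetical sketch)."""
    def __init__(self):
        self._times = []   # sorted epoch seconds
        self._values = []

    def ingest(self, readings):
        """readings: iterable of (timestamp, value), in any order."""
        for ts, val in readings:
            i = bisect.bisect_left(self._times, ts)
            if i < len(self._times) and self._times[i] == ts:
                self._values[i] = val  # batch record corrects a live one
            else:
                self._times.insert(i, ts)
                self._values.insert(i, val)

    def window(self, start, end):
        """Return (timestamp, value) pairs within [start, end]."""
        lo = bisect.bisect_left(self._times, start)
        hi = bisect.bisect_right(self._times, end)
        return list(zip(self._times[lo:hi], self._values[lo:hi]))

series = MeterSeries()
series.ingest([(100, 5.1), (160, 5.4)])             # live stream
series.ingest([(40, 4.8), (100, 5.0), (130, 5.2)])  # delayed batch file
print(series.window(40, 160))
# -> [(40, 4.8), (100, 5.0), (130, 5.2), (160, 5.4)]
```

The point of the design is that queries and reports see one consistent series regardless of which path each reading arrived by.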

State-of-the-art IoT analytics

The best analytic systems should be able to scale with high-volume, high-velocity data comprising hundreds of thousands, to millions, of measurements and events per second. They should also be able to ingest and process data simultaneously from hundreds of thousands, to millions, of sensors, devices, and locations. But scale is just one consideration: the ability to link and join data from heterogeneous data sources and put it into the right context is critical to delivering actionable insights for real-time control, asset management, and business operations.
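Putting sensor data "into the right context" is often done with a time-series as-of join: each reading is paired with the most recent record from a slower-moving reference source. The sketch below shows the idea with invented data (the function name and the grid-state labels are illustrative assumptions, not a specific product's API):

```python
import bisect

def asof_join(readings, context):
    """For each (ts, value) reading, attach the most recent context
    record at or before ts. Context must be sorted by timestamp.
    Illustrative sketch of a time-series as-of join."""
    ctx_times = [t for t, _ in context]
    joined = []
    for ts, val in readings:
        i = bisect.bisect_right(ctx_times, ts) - 1
        state = context[i][1] if i >= 0 else None
        joined.append((ts, val, state))
    return joined

# Sensor readings joined against slower-moving grid-state snapshots.
readings = [(5, 0.98), (12, 1.02), (20, 0.97)]
context = [(0, "normal"), (10, "peak"), (18, "normal")]
print(asof_join(readings, context))
# -> [(5, 0.98, 'normal'), (12, 1.02, 'peak'), (20, 0.97, 'normal')]
```

The same pattern links high-frequency measurements to market prices, asset records, or weather data without requiring the sources to share timestamps.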

An example of this type of innovative technology is the implementation of Kx for Sensors (KxS) by our customers in distribution management systems, meter data management and planning systems, and energy market systems. Our customers are helping utilities deliver advanced analytics for real-time control of their grids, planning, and customer engagement. This is because KxS is uniquely able to ingest many different sources of time-series data in real time and in batch simultaneously, and to perform complex calculations on streaming and historical data.

This enables richer situational awareness, rapid detection and predictions of faults, extended useful life of assets, and reduced and optimized capital expenditures. This type of platform is particularly useful for those utilities looking to better manage DER – such as solar panels, electricity storage, electric vehicles and other controllable loads – given the intermittency and variability of the outputs. Integrated, continuous utility network IoT data analytics can also be used to vastly improve situational awareness for other types of energy asset management — linking SCADA, micro-phasor and behind-the-meter data, to more accurately predict the state of the power system and take action.

With converged visibility across the power system, more innovation becomes possible, such as modernizing the power grid to integrate DER with traditional energy generation. Technologies like KxS, which provide an integrated view in one place, make it possible for utilities to leverage the full potential of their data. This includes key performance indicators, quality measurements, and analytics on historical data, providing flexible and extensible support for use cases and applications that require converged visibility across IT and OT systems and practices.

Przemek Tomczak is Senior Vice-President of Internet of Things and Utilities at Kx Systems. For over twenty-five years, Kx has been providing the world’s fastest database technology and business intelligence solutions for high-velocity, large data sets. Previously, Przemek held senior roles at the Independent Electricity System Operator in Ontario, Canada, and at top-tier consulting firms and systems integrators. Przemek also holds a CPA designation and has a background in business, technology, and risk management.

 
