Gerry Buggy on Streaming Analytics and In-Memory Computing

4 Sep 2016

By Gerry Buggy

Stream processing of real-time data enables business leaders to discover insights that can’t be gleaned from after-the-fact reports. Companies that can act on these insights in real time can innovate and disrupt established business sectors.

The most significant and profitable insights need immediate action. Hence, streaming analytics is fuelling business innovation, and it is becoming essential that organisations can react in real time to the information they collect and process as part of their business functions.

Kx technology directly addresses the requirements of streaming analytics. Three key technology trends are driving the need for such a solution:

  1. The arrival of new data architectures, as seen within the Big Data arena, now looking for a streaming architecture.
  2. The Internet of Things (IoT): assets, sensors and machines pushing data into real-time analytics and operational management.
  3. Cloud analytics: cloud deployments increasingly need to handle real-time ingestion and distribution.

How does stream processing change things? The most common type of analytic processing is based on a pull model: data is collected in a repository and queries are run against it. Responses reflect the data at query time, so queries have to be rerun as the data updates, and results may take seconds, minutes or even hours to produce. This is a reactive style of management: the insight in the data is only realised when someone looks for it.

The new paradigm is to have “active queries” where the answers update in real-time as the data changes. This is streaming analytics.
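The contrast between the two models can be sketched in a few lines. This is purely illustrative Python, not Kx/q code: the `PullStore` and `ActiveAvg` classes are hypothetical names used here to show a full-rescan query versus an answer that is maintained incrementally as each event arrives.

```python
# Illustrative sketch (not Kx/q code): a pull-model query rescans the
# stored data each time it is asked; an "active query" maintains its
# answer as the data changes, so the result is always current.

class PullStore:
    """Pull model: collect data in a repository, rerun queries on demand."""
    def __init__(self):
        self.rows = []

    def append(self, value):
        self.rows.append(value)

    def query_avg(self):
        # Full rescan every time the question is asked.
        return sum(self.rows) / len(self.rows)


class ActiveAvg:
    """Streaming model: the answer updates with every incoming event."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def on_event(self, value):
        # O(1) incremental update; the current answer is always available.
        self.count += 1
        self.total += value

    @property
    def current(self):
        return self.total / self.count


events = [10.0, 12.0, 11.0, 13.0]
pull, active = PullStore(), ActiveAvg()
for v in events:
    pull.append(v)
    active.on_event(v)  # the streaming answer is updated per event

assert pull.query_avg() == active.current == 11.5
```

Both models give the same answer here; the difference is when the work happens. The pull model pays the query cost at read time and goes stale between reads, while the active query pays a small cost per event and is never stale.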

One of the key drivers, IoT, embodies these concepts. IoT is connecting objects and devices: everything from cell phones, televisions and fitness trackers to manufacturing equipment, automobiles and jet engines. IoT is creating real-time business events that present significant opportunities for companies to transform how they engage with customers, operate their businesses and deliver their services. However, in order to do this, companies are faced with the task of analysing large volumes of data, both in motion and at rest. Their challenge is to identify patterns, assess the past, make real-time predictions and act on these insights.
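One simple instance of "identifying patterns and acting on them" is flagging anomalous sensor readings as they stream in. The following is a minimal, hypothetical Python sketch (again, not Kx's implementation): it compares each new reading against a sliding window of recent history and flags values that deviate sharply from the recent mean.

```python
# Illustrative sketch: flag anomalous IoT sensor readings against a
# sliding window of recent history, acting on each event as it arrives.
from collections import deque
from statistics import mean, pstdev

def detect_anomalies(stream, window=5, threshold=3.0):
    """Yield (reading, is_anomaly) pairs. A reading is anomalous when it
    deviates from the mean of the recent window by more than `threshold`
    standard deviations."""
    recent = deque(maxlen=window)
    for x in stream:
        if len(recent) == recent.maxlen:
            mu, sigma = mean(recent), pstdev(recent)
            anomalous = sigma > 0 and abs(x - mu) > threshold * sigma
        else:
            anomalous = False  # not enough history yet to judge
        yield x, anomalous
        recent.append(x)

# A stable temperature signal with one spike at 35.0:
readings = [20.1, 20.3, 19.9, 20.2, 20.0, 35.0, 20.1]
flags = [is_anom for _, is_anom in detect_anomalies(readings)]
# Only the spike is flagged once the window has filled.
```

The same per-event discipline scales from this toy example to production streaming platforms: each reading is scored the moment it arrives, rather than waiting for a batch report to surface the spike hours later.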

After quietly dominating the real-time decision processing space in finance for years, Kx looked at other industries and realised it already had the technologies they were trying to build and integrate. Projects to capture and process business events were forcing those companies to become software developers and system integrators, distracting them from their main purpose: serving their customers with better products and value.

We believe that Gartner, in their white paper ‘How to Select In-Memory Computing Technologies for Big Data’*, have managed to distill the components required to implement a state-of-the-art platform for deriving real-time insights from business events. Gartner have clearly defined the required components and how they need to interoperate. We believe that integrating several open-source packages to build this architecture may be neither the most performant nor the most cost-effective solution.


Kx offers an alternative approach: one that is complete, supported and proven in financial services, the harshest of real-time environments.

*Gartner, How to Select In-Memory Computing Technologies for Big Data, Massimo Pezzini | Kurt Schlegel | Roxane Edjlali | Nick Heudecker | Keith Guttridge, 15 April 2015

Read the full Gartner report (requires Gartner access): https://www.gartner.com/doc/3030417/select-inmemory-computing-technologies-big

Or you can read Gartner author Massimo Pezzini’s blog post: https://www.linkedin.com/pulse/how-select-in-memory-computing-technologies-big-data-massimo-pezzini
