Comparing and Contrasting Kx and the Hadoop Ecosystem

1 Nov 2016

“Through 2018, 70% of Hadoop deployments will fail to meet cost savings and revenue generation objectives due to skills and integration challenges.”

Gartner, 100 Data and Analytics Predictions Through 2020

By Glenn Wright

The exponential increase in available digitized data, or Big Data, is transforming business and research. Growing appreciation of Big Data's potential to change how companies operate has tracked the rise of the Apache Hadoop ecosystem, a collection of open-source computing frameworks for storing and processing large datasets.

Over the past two decades, companies in the financial services industry working with extremely large datasets have turned to the Kx platform, built around kdb+, a high-performance time-series database with a built-in programming language, q, for analytics. Kx predates the Apache Hadoop ecosystem by more than a decade, and has proven more performant, especially as data volumes increase.
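To give a flavor of the approach, here is a minimal sketch of a q query over a hypothetical trade table (the table and column names `sym`, `time`, and `price` are illustrative, not from the whitepaper). Because q and the database are one system, the aggregation runs where the data lives, with no separate extract-and-process step:

```q
/ average and max price per symbol during regular trading hours,
/ assuming a trade table with sym, time and price columns
select avgPrice:avg price, maxPrice:max price by sym
  from trade where time within 09:30 16:00
```

A comparable computation in the Hadoop ecosystem would typically involve a distributed job coordinated across several separate components, which is part of the architectural contrast the whitepaper examines.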

In my latest whitepaper I discuss some of the differences between the two approaches to tackling large-scale, complex business analytics. I highlight the advantages of both systems and examine some of the key architectural differences between them. While both approaches have merit, the advantages of the Kx approach are fully realized only when the tools must be deployed for complex analytics.
