FROM CONCEPT TO REALITY
It’s not surprising that the IoT, IIoT, Industrie 4.0 and the Factory of the Future are currently much-hyped subjects generating high levels of interest among enterprises. By enabling countless connections between devices across organizations, the IoT provides opportunities to work more efficiently, saving time, improving yields and adding value. For many businesses the IoT is still new, with the largest investments to date being primarily in IIoT. However, the focus is moving from what the IoT could do to what it can do: how it contributes to business goals and how it delivers value. Organizations are moving on from sound bites about the number of connected devices to discussing how the IoT, combined with ML, AI and advanced analytics, can help them extract insights from sensor data, make better and more timely decisions, and improve revenues.
Accompanying these opportunities of smarter connected machines and devices are the challenges of storing, sharing and analyzing the enormous volumes of data now being generated and collected. According to Gartner, the installed base of IoT endpoints will reach 25 billion by 2021. In short, the IoT means more data, at higher volumes, flowing continuously from multiple sources and sensors, requiring the ability to handle greater complexity and increasing levels of automation.
Moving from Possibilities to Implementation
As businesses move away from the hype around IoT and towards the practical implementation of applications that will deliver business results and improvements like operational efficiency, cost reductions and workforce productivity, they face some hurdles. One of the major blocks facing IT teams is how to better ingest, digest and integrate the large amounts of data being produced within the enterprise. Data verification and data management are two further areas companies will need to address as they move along their IoT journey. Most analysts agree that the true value of IoT technology will be realized when intelligence is extracted from the data in a timely manner. IoT analytics requires the ability to:
- Handle higher volumes and continuous flows from multiple sources/sensors
- Store, ingest, integrate and manage time-series data
- Work with multiple analytics techniques
- Manage edge analytics
- Integrate with operational and legacy systems
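The first two requirements above can be sketched with a toy rolling-window aggregator in Python. This is a hypothetical illustration, not Kx or kdb+ code; the sensor names, readings and window size are invented for the example.

```python
from collections import defaultdict, deque
from statistics import mean

# Hypothetical sensor readings: (sensor_id, timestamp_s, value).
# In a real system these would arrive as a continuous stream.
readings = [
    ("pump-1", 0, 20.0), ("pump-1", 1, 20.5), ("pump-2", 0, 98.0),
    ("pump-1", 2, 21.0), ("pump-2", 2, 99.5), ("pump-1", 3, 35.0),
]

WINDOW = 3  # keep only the last 3 readings per sensor

# One bounded window per sensor; deque(maxlen=...) discards the oldest
# reading automatically once the window is full.
window = defaultdict(lambda: deque(maxlen=WINDOW))

def ingest(sensor_id, value):
    """Append a reading and return the sensor's current rolling mean."""
    window[sensor_id].append(value)
    return mean(window[sensor_id])

for sid, ts, val in readings:
    rolling = ingest(sid, val)
    print(f"{sid} t={ts}: value={val} rolling_mean={rolling:.2f}")
```

A production pipeline would add persistence, out-of-order handling and downstream analytics, but the pattern — ingest a continuous multi-sensor flow into per-sensor time-ordered state and query it incrementally — is the same one the bullet list describes.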
Accelerating the IIoT
Industrie 4.0 refers to the current trend of improved automation, machine-to-machine and human-to-machine communication, artificial intelligence, continued technological improvements and digitalisation in manufacturing. This ‘fourth industrial revolution’ has been driven by four disruptors: a rise in data volumes; increased computational power and connectivity; the emergence of advanced analytics and business intelligence capabilities; and improvements in transferring digital instructions to the physical world in areas such as advanced robotics and 3D printing.
IIoT is rapidly changing the ways manufacturing companies operate with many organisations investing in IIoT to drive further automation, enable predictive analytics and provide real-time production line data in order to improve efficiency, productivity and performance levels across the enterprise. Many businesses are expecting to deliver major gains through reduced maintenance costs, energy savings, improved yields and reduced waste. Manufacturing, oil and gas, automotive and the utilities sectors are leading the way in the use of Industrial IoT applications to develop new business and operating models under the umbrella of Industrie 4.0 and the Factory of the Future.
The underlying premise of the IIoT is that sensors, each with an individual IP address and the ability to communicate wirelessly with a network, allow us to see instantly and remotely what is going on when placed on industrial equipment. The stream of signals provided by these sensors is the raw data from which valuable insight can be gleaned. One of the biggest technical hurdles companies face with IIoT systems is data bottlenecks that can slow down their response times. One industry that has already overcome the challenge of exploding volumes of time-series data is financial services. Over the past two decades voice and face-to-face trading have been largely replaced by algorithmic trading. Banks and other trading organizations take in large volumes of real-time trade and quote data from stock exchanges to make microsecond decisions about buying and selling securities, in many cases drawing on terabytes of historical trade data using computer models and machine learning programs.
According to Philip Howard in the Bloor InDetail Paper, “kdb+ has proved itself in what is arguably the most demanding big data market: financial trading and risk management. The technology is well suited to a variety of other environments including preventative maintenance (asset management), smarter cities, the connected car and other such environments where the collection and analysis of time-series based data is important”.
Long-term success in implementing Industrial IoT will be dependent on a company’s ability to handle the snowballing amounts of sensor data being created. A growing number of firms are discovering that traditional relational database systems are proving inadequate in handling these volumes; this inadequacy is reflected in the significantly increased spend on license fees, infrastructure and operating costs. By turning to the high-performance, low latency technology provided by Kx, industrial firms can make use of its combination of in-memory processing of streaming and historical data, efficient columnar design, and highly-integrated query language to supercharge their legacy systems with the fastest analytics engine on the market. We are currently working, amongst others, with:
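The columnar design mentioned above can be illustrated with a toy Python sketch. This is not kdb+'s actual storage engine, and the field names are invented; it only shows why storing each field contiguously makes analytic queries cheap: an aggregate over one field touches only that field's storage instead of walking past every other field of every record.

```python
# Row-oriented layout: one record per event, as a traditional
# relational store models data.
rows = [
    {"ts": 0, "sensor": "a", "temp": 20.0, "rpm": 1500},
    {"ts": 1, "sensor": "a", "temp": 20.5, "rpm": 1490},
    {"ts": 2, "sensor": "a", "temp": 21.0, "rpm": 1510},
]

# Column-oriented layout: one contiguous list per field, as in a
# columnar database.
cols = {
    "ts":     [r["ts"] for r in rows],
    "sensor": [r["sensor"] for r in rows],
    "temp":   [r["temp"] for r in rows],
    "rpm":    [r["rpm"] for r in rows],
}

# Aggregating one field scans only that column; the ts, sensor and
# rpm values are never read. At billions of rows, that difference in
# bytes touched (and cache behavior) is where columnar speed comes from.
avg_temp = sum(cols["temp"]) / len(cols["temp"])
```

In kdb+ the same idea is combined with in-memory processing and the q query language, but the storage principle is what this sketch shows.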
- Aston Martin Red Bull Racing as an innovation partner where our technology’s first deployment is for the analysis of wind tunnel data, a critical element in the development of faster, more competitive F1 cars
- a Fortune 500 engineering solutions company within precision manufacturing, to incorporate Kx technology into the next version of its fault detection product range for improved performance and scalability
Powered by the world’s fastest time-series in-memory and columnar database, Kx is able to handle millions of events and measurements per second, and gigabytes to petabytes of historical data, with nanosecond resolution, more efficiently and cost-effectively than any available alternative, thus accelerating your evolution into IIoT.
The basis for Kx Technology is a unique integrated platform which includes a high-performance historical time-series columnar database called kdb+, an in-memory compute engine, and a real-time streaming processor, all unified with an expressive query and programming language called q. Designed from the start for extreme scale, and running on industry standard servers, the kdb+ database has been proven to solve complex problems faster than any of its competitors.