GPU accelerated deep learning: Real-time inference
While model training is often the key focus in deep learning, the demands of high-velocity data necessitate optimizing inference performance through GPU acceleration.
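
As a rough illustration of the idea (not taken from the article), the sketch below shows how a trained PyTorch model can be moved to the GPU for batched, low-latency inference. The model architecture, batch size, and feature dimensions are placeholders chosen purely for demonstration.

```python
# Minimal sketch: GPU-accelerated inference with PyTorch.
# The network and data shapes here are hypothetical stand-ins for a real trained model.
import torch
import torch.nn as nn

# Small illustrative network standing in for a trained deep learning model.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Use the GPU when available; fall back to CPU otherwise.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

# Simulated high-velocity batch of incoming feature vectors, created directly on the device.
batch = torch.randn(1024, 128, device=device)

# Inference only: disable gradient tracking to cut latency and memory overhead.
with torch.no_grad():
    predictions = model(batch).argmax(dim=1)

print(predictions.shape)  # torch.Size([1024])
```

Batching incoming records before each forward pass, as above, is one common way to keep the GPU saturated when data arrives at high velocity.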