In financial services, transformation is constant. So, reflecting on the lyrics of Tears for Fears’ 1985 global hit “Everybody Wants to Rule the World,” I find myself drawing parallels to the shifting power dynamics in financial technology. My favorite teenage song subtly echoes a truth relevant to today’s financial industry: the quest for progress hinges not just on desire but on access to tools and opportunities too often limited to the elite.
Tears for Fears weren't, of course, singing about the dominance of tier 1 banks and technology-driven hedge funds from the 1990s to the 2010s. Luckily for FinTech, at least, times have changed: much of the tooling once available only to tier 1 organizations is now available to the masses, and once out-of-reach quant skills are now distributed across the industry.
kdb, for one, is no longer exclusive. Cloud marketplaces and a Python-first mentality have significantly lowered kdb's cost of ownership, helping make what was once exclusive available to every financial tier and office: front, middle, and back office, buy-side and sell-side, regulators and insurers alike, bringing kdb's speed- and efficiency-boosting data management and analytics to all. Benefits include:
- Instantaneous data serving
- Analytics at the speed of thought on the largest datasets
- Real-time analytics capabilities
That means all organizations can now trade, manage risk, and analyse portfolios faster and more efficiently, and bring 10 to 100x more data to quantitative research, AI, and data science workflows.
The Trends Dominating Computational Finance
In the last fifteen years, the commodifying effects of cloud and open-source technologies have transformed computational finance. Four key trends have predominated:
- Machine Learning and Generative AI have taken quantitative finance beyond the abstract derivatives pricing of old. All functions, asset classes, and anything with a sizeable data load are within scope, with real-time inference transforming MLOps in financial services.
- The need for speed has (mostly) leveled out. Leaders once spent millions on being first to market, but with some exceptions, the pipes and plumbing of trading networks (the gateways, routers, messaging systems, and API protocols) are now accessible to most. Cloud has helped, comfortably supporting the millisecond-level performance that most at-speed financial analytics operations require.
- Open-source programming, in particular Python with its millions of programmers, brings unsurpassed agility and, with the right muscle, the performance of C++. The Jupyter notebook is supplanting the Excel spreadsheet for many numerical applications.
- Differentiating tools such as KX and kdb, once the preserve of the Tier 1s, are now available to all: easy to access and use via your cloud service provider, with no need to commission hardware, and easy to drive from Python and SQL to serve the biggest and fastest data and analytics use cases, with minimal ramp-up required.
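The Python-performance point above is easy to demonstrate for yourself: vectorized NumPy dispatches a whole-array operation to a compiled C kernel, while a pure-Python loop pays interpreter overhead on every element. A minimal, self-contained sketch (synthetic data, indicative timings only):

```python
import time
import numpy as np

# Simulate one million trade prices (synthetic data for illustration).
rng = np.random.default_rng(0)
prices = rng.uniform(99.0, 101.0, 1_000_000)

# Pure-Python loop: one interpreted bytecode dispatch per element.
t0 = time.perf_counter()
total = 0.0
for p in prices:
    total += p
loop_secs = time.perf_counter() - t0

# Vectorized NumPy: a single call into a compiled C kernel.
t0 = time.perf_counter()
vec_total = prices.sum()
vec_secs = time.perf_counter() - t0

print(f"loop: {loop_secs:.3f}s  vectorized: {vec_secs:.4f}s")
# Same answer (to floating-point rounding), very different cost.
assert abs(total - vec_total) < 1e-6 * vec_total
```

On typical hardware the vectorized call runs orders of magnitude faster than the loop, which is the "right muscle" the bullet refers to.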
Let’s explore.
Quant Workflows: Democratized and Open-Source-Friendly
KX with kdb transforms the following use cases:
- Quantitative Research. Whether investigating portfolios, risks, or policies across any asset class, add increased data volumes, real-time replays, and unstructured and alternative datasets into and alongside Pythonic quant pipelines. The result: faster time to production for research and data science, with immediate impact for customers and stakeholders.
- Continuous Markets Surveillance. Observe patterns, anticipate anomalies in real time, and run "time-travel" investigations to monitor for fraudulent activity such as spoofing and front-running.
- Real-Time Analytics for all. This includes model portfolios, valuations, lending decisions, reinsurance products, and commodity prices. Real-time analytics is no longer the preserve of high-frequency equities or FX. At the same time, the limits of batch processing are blown away by the most efficient in-memory computing anywhere.
- Blending Unstructured and Structured Data. With combined vector and time-series database capabilities, bring reports, research, filings, and social media insights to bear on portfolio valuations, risk insights, and trading strategies.
- Machine Learning as a Service. Why have boundaries between training and inference or between online and offline feature stores? With more memory for inference and faster training over larger data, you'll decrease data duplication, improve compute efficiency, and streamline MLOps lifecycles.
- Trading Analytics. Tier 1s have sold trading analytics services to others for decades, but why not roll some of your own? Market impact, volume curves, liquidity metrics, order book assessment, back-testing, and capturing the intersection between pre and post-trade are all in scope.
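As a flavor of the trading-analytics bullet above, here is a minimal sketch of bucketed VWAP, a building block of volume curves and market-impact studies, written in plain Python over synthetic trades. The q one-liner in the docstring is the idiomatic kdb equivalent, quoted from memory rather than from the original article:

```python
from collections import defaultdict

# Synthetic trades: (minute-of-day, price, size). Illustrative only.
trades = [
    (570, 100.10, 200), (570, 100.12, 100),
    (571, 100.08, 300), (575, 100.20, 400),
    (576, 100.25, 100),
]

def vwap_by_bucket(trades, bucket_minutes=5):
    """Volume-weighted average price per time bucket.

    Roughly what q expresses in one line as:
        select vwap: size wavg price by 5 xbar time.minute from trade
    """
    notional = defaultdict(float)  # sum of price * size per bucket
    volume = defaultdict(int)      # sum of size per bucket
    for minute, price, size in trades:
        bucket = minute - minute % bucket_minutes  # start of 5-minute bar
        notional[bucket] += price * size
        volume[bucket] += size
    return {b: notional[b] / volume[b] for b in sorted(notional)}

print(vwap_by_bucket(trades))
```

The same shape of computation, grouped aggregation over time bars, underlies liquidity metrics and order book assessment; kdb's advantage is running it over billions of rows rather than five.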
Tier 1 Technology for All
The toughest Tier 1 environments have proven kdb against the hardest data management and analytics use cases. How can you enjoy the opportunities that differentiating tooling brings without the Tier 1 price tag or resource requirements?
- Provision from your favorite cloud marketplace. No expensive setup or dedicated hardware. Get up and running in minutes, not months.
- Use Python, SQL, and Low Code to expedite your queries. There’s no need to learn a new language or pay for expensive developers (but know too that q is fun to learn, and q developers are highly productive).
- Hook up big and fast data to your dashboards. Marry your data and analytics with your Power BI, Tableau, and dashboard environments to please stakeholders and analysts.
- Use your existing hardware with kdb. Run one or two instances, not hundreds, and leverage CPU technology rather than relying on commissioned GPUs.
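To illustrate the Python-and-SQL point without assuming any kdb installation, the sketch below uses Python's built-in sqlite3 purely as a stand-in for a SQL-speaking analytics store; the trade table and its columns are invented for illustration, not drawn from the article:

```python
import sqlite3

# In-memory SQL store standing in for an analytics database.
# Table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trade (sym TEXT, price REAL, size INTEGER)")
conn.executemany(
    "INSERT INTO trade VALUES (?, ?, ?)",
    [("AAPL", 189.10, 100), ("AAPL", 189.20, 300), ("MSFT", 410.00, 50)],
)

# The kind of aggregate a dashboard tile might request:
# per-symbol average price and total volume, in plain SQL.
rows = conn.execute(
    """SELECT sym,
              AVG(price) AS avg_price,
              SUM(size)  AS volume
       FROM trade GROUP BY sym ORDER BY sym"""
).fetchall()
print(rows)
```

The same SELECT-style query, pointed at a kdb-backed store instead of sqlite, is what feeds the Power BI and Tableau dashboards mentioned above, with no new language required.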
With KX, then, "Everybody CAN Rule the World," at least in financial services, given a working computer and access to financial data and GitHub.
Read our new ebook, "11 Insights to Help Quants Break Through Data and Analytics Barriers," to accelerate your analytics journey.