How Modern Market Makers Stay Competitive in Volatile, Data-Driven Markets


Author: Daniel Tovey, Senior Content Marketing Manager

Key Takeaways

  1. Modern market making demands adaptive infrastructure, not just speed.
  2. Legacy systems can't handle the volatility, fragmentation, and latency today’s markets impose.
  3. Top firms are embedding real-time data, AI, and dynamic risk models directly into trading workflows.
  4. Success depends on closing the loop between signal and execution continuously, and at scale.
  5. KX powers modern market making with time-aware systems built for streaming, scale, and speed.

Discover why modern market making relies on real-time infrastructure to adapt faster, quote smarter, and compete in today’s volatile trading landscape.

The rules of liquidity provision are being rewritten.

In April, U.S. equities sank 13% on a single policy shock. Treasury yields swung 47 basis points. The VIX rose 50% to 44 points, one of the sharpest spikes on record. U.S. markets processed 545 million equity trades and over $11 trillion in Treasury volume in a single session. Some hedge funds faced margin calls not seen since 2020.

Macro shocks keep coming: tariffs, rate pivots, structural shifts. Across crypto, FX, and equities, the cost of inventory has climbed, while hedge windows have narrowed. Quoting risk is up. Spreads are tougher to price. Flow quality is deteriorating. And as firms deploy ever-faster models, the shelf life of alpha is collapsing.

This is more than episodic volatility; it’s sustained structural pressure. For market makers, that means reexamining how liquidity is priced, how risk is hedged, and how signal detection happens in real time.

The new pressures shaping modern market making

Speed hasn’t always been the defining factor in market making. Strategies built around structured products, manual execution, or slower-moving end-of-day models can perform effectively without streaming analytics or sub-millisecond latency. But for firms leaning into automation, quoting electronically across fragmented venues, or managing tighter hedge windows, infrastructure becomes a performance edge.

Rising volatility and shrinking risk buffers are now pulling even slower strategies into faster decision cycles. Price discovery is happening in real time, and inventory risk is harder to manage with static models or overnight processes. Whether you’re reacting in microseconds or minutes, responsiveness matters more than ever.

So the real differentiator today isn’t speed in isolation. It’s adaptability.

Across every market-making model we support, including high-frequency, OTC, arbitrage, and digital assets, the themes are consistent:

  • Fragmented data across venues and instruments
  • Latency-sensitive decisioning that strains legacy infrastructure
  • AI models that can’t keep up with regime shifts
  • Compliance requirements that demand explainability at speed

Whether you’re quoting EURUSD, ETHBTC, or S&P futures, the bar has risen. Legacy tech stacks, originally designed for overnight analysis and manual overrides, weren’t built for this environment.

And it shows.

Systems built for predictable regimes now struggle under the weight of real-time expectations. If your strategy is end-of-day or intraday, latency may not be a constraint. But if you’re adapting to faster flows, tighter spreads, or event-driven models, those systems start to show their limits.

The firms gaining ground are the ones closing the loop between signal and execution continuously, and at scale.
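What "closing the loop between signal and execution" means in practice can be sketched in a few lines. The example below is a hypothetical illustration, not any firm's actual strategy: each incoming mid price updates a short-term momentum signal, and that signal immediately skews the next two-sided quote. The decay factor, spread, and skew coefficient are arbitrary stand-ins.

```python
def run_quote_loop(mids, base_spread=0.02, k=0.5):
    """Minimal signal-to-execution loop (illustrative only).

    `mids` is any iterable of mid prices. Each tick updates an
    exponentially weighted momentum signal, which skews the very
    next (bid, ask) pair -- no batch step between signal and quote.
    """
    last = None
    signal = 0.0
    for mid in mids:
        if last is not None:
            # decayed momentum: positive means upward price pressure
            signal = 0.9 * signal + 0.1 * (mid - last)
        last = mid
        skew = k * signal  # lean quotes toward the direction of flow
        yield (mid - base_spread / 2 + skew,
               mid + base_spread / 2 + skew)
```

The point of the sketch is the shape of the loop, not the numbers: signal update and quote emission happen in the same pass over the data, which is the property batch pipelines lack.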

What modern market makers are doing differently

Not every desk needs to operate at streaming speed. But for firms moving toward more frequent signal updates, tighter execution windows, or real-time pricing, the infrastructure question becomes unavoidable.

This shift is playing out in several ways:

  • Legacy BI tools are being replaced with streaming analytics that run directly on live data
  • Static model pipelines are evolving into retraining loops that adjust to shifts in flow, spread dynamics, and inventory pressure
  • Fragmented market data is being normalized in real time to support consistent decisioning across instruments, venues, and latency-sensitive workflows
  • Warehouses are being supplemented, or entirely bypassed, with architectures built for real-time joins and low-latency aggregation
  • Simulation environments are running in-session to test strategies and manage risk while trades are still being executed
  • Compliance teams are demanding explainable outputs, with full visibility into how signals are generated and used

Time-sensitive data demands analytics that preserve structure, deliver context, and operate at production speed. The firms setting the pace are building systems that understand how markets move and how every price, signal, and spread fits together.

We have seen several firms already make this transition, applying these principles to live trading workflows across different asset classes.

One crypto-native market maker operating across dozens of venues uses our technology to normalize tick data, detect microstructure shifts, and retrain inference models continuously, all within the trade window. That means smarter quoting and tighter spreads when markets move.

A multi-asset FX desk uses our technology to run real-time risk overlays on top of predictive pricing models. When flows spike, they adjust inventory in milliseconds, and they can simulate stress, test new strategies, and deploy to production without waiting on nightly cycles or batch jobs.

In both cases, the underlying infrastructure had to change, moving from delayed to streaming, from static models to continuously retraining pipelines, and from context-poor to context-rich analytics.

Infrastructure that doesn’t blink when markets do

When volatility hits, systems fail in interesting ways. Latency spikes. Compliance checks lag. Models go stale.

That’s why time-aware infrastructure, designed to ingest, process, and analyze every tick in context, is becoming foundational. It’s what lets you compare what’s happening now against what just happened, and what usually happens, all in real time.
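One simple way to express that three-way comparison (now, just happened, usually happens) is a pair of rolling windows: a short recent window and a long baseline, with the recent mean scored against the baseline's distribution. This is a generic illustration of the idea, not KX's implementation.

```python
from collections import deque

class TimeAwareMonitor:
    """Score 'now' against a short recent window and a long baseline."""

    def __init__(self, recent_n: int = 100, baseline_n: int = 10_000):
        self.recent = deque(maxlen=recent_n)      # what just happened
        self.baseline = deque(maxlen=baseline_n)  # what usually happens

    def update(self, value: float) -> float:
        """Ingest one observation (e.g. a spread) and return a rolling
        z-score: how far the recent mean sits from the baseline mean,
        in units of the baseline's standard deviation."""
        self.recent.append(value)
        self.baseline.append(value)
        n = len(self.baseline)
        mean = sum(self.baseline) / n
        var = sum((x - mean) ** 2 for x in self.baseline) / n
        std = var ** 0.5
        if std == 0:
            return 0.0
        recent_mean = sum(self.recent) / len(self.recent)
        return (recent_mean - mean) / std
```

A quoting engine could widen spreads or cut inventory whenever the score crosses a threshold; the value of the structure is that "anomalous" is always defined relative to that instrument's own recent history.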

KX is purpose-built for this. Our platform processes billions of rows per day with sub-millisecond latency, supports production-grade AI, and powers quoting engines and risk systems at some of the world’s largest trading firms.

Organizations choose us not just because we’re fast, but because we’re robust. As our customer Jad Sarmo, Head of Quant Development at B2C2, observed in a recent webinar:

“We’ve had major market events, crashes, even exchanges disappearing overnight. The infrastructure was boringly smooth, and boring is good when you’re talking about real-time trading systems.” – Jad Sarmo, B2C2

Modern market making requires infrastructure that doesn’t blink when markets do. That means systems built for real-time volume, model retraining, and explainable decisioning. Time-aware by design. Scalable by default. And proven in production across some of the most latency-sensitive desks in the world.

Explore how leading capital markets firms use KX for real-time analytics and AI-driven research. From backtesting and quant research to pre- and post-trade analytics, we support your critical use cases at scale, with full visibility and precision.

