
Securities Trading Industry Laying Groundwork for Machine Learning

9 May 2018 | 4 minutes

By Manus McGuire

As a participant in the global securities trading industry for over two decades, KX has played a large role in the industry’s transformation to electronic and high-frequency trading. At this year’s annual Securities Traders’ Association of New York (STANY) Markets Conference on April 19th, the All Things Data – Big Data, AI and Machine Learning panel showed how KX continues to be a market leader in financial services technology.

The panel brought into focus the machine learning challenges the securities trading sector expects to face next, and identified the current trends hindering that evolution. Participants discussed the need to address legacy back-office platforms, data quality, data normalization, and barriers to implementation when looking toward the future.

Legacy back-office platforms

Both the push for ML and increasing regulatory oversight are moving securities trading operations toward simplifying the wide variety of platforms and back-office solutions they currently use.

KX Managing Director Conor Twomey noted that in many instances, current back-office legacy systems are simply not fit for purpose. He explained that the majority of back-office environments are a noodle soup of order management systems, smart order routers, execution platforms, and other applications, added incrementally in response to new regulations and business needs. As a result, a huge amount of technical development effort and ongoing maintenance resource is tied up supporting these tactical solutions.

Conor described how this situation has resulted in bifurcated datasets within firms. Front-office operations have faster data feeds and more up-to-date software, while back-office departments, like regulatory and compliance groups, have older systems with slower data delivery. Conor noted that KX customers are increasingly looking to consolidate their front, middle and back-office systems using solutions to normalize data across groups and share powerful analytic and reporting tools that draw from the same dataset.
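
In practice, that kind of consolidation starts with agreeing on a canonical record that every desk and department reads from. The sketch below is a minimal, hypothetical illustration in Python, not KX’s implementation; the legacy field names and formats are invented for the example.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # One canonical trade record that front-, middle- and back-office tools all query.
    @dataclass
    class Trade:
        symbol: str
        price: float
        size: int
        ts: datetime   # always stored in UTC
        venue: str

    # Adapters map each legacy feed's format onto the shared schema.
    def from_oms(row: dict) -> Trade:
        # Hypothetical order-management-system format: epoch milliseconds, raw ticker.
        return Trade(
            symbol=row["ticker"].upper(),
            price=float(row["px"]),
            size=int(row["qty"]),
            ts=datetime.fromtimestamp(row["time_ms"] / 1000, tz=timezone.utc),
            venue=row["venue"],
        )

    def from_execution_platform(row: dict) -> Trade:
        # Hypothetical execution-platform format: ISO-8601 timestamps, sizes in lots of 100.
        return Trade(
            symbol=row["sym"],
            price=float(row["price"]),
            size=int(row["lots"]) * 100,
            ts=datetime.fromisoformat(row["timestamp"]),
            venue=row["exchange"],
        )

Once both feeds land in the same shape, regulatory reporting and front-office analytics can run against one dataset rather than two diverging copies.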

Data Quality, Data Normalization

In the current early stage of ML adoption, banks and trading firms face a conundrum: they want to attract and retain top data science talent, but their data requires labor-intensive hygiene work before it can be used in ML applications. This was highlighted as an obstacle to developing and keeping expert in-house ML technologists.

Conor said, “In some cases, with back office projects, the role of a data scientist has turned into more of a ‘data janitor’ where they spend all their time normalizing data feeds when they in fact want to take that normalized data feed and turn it into something exciting.”

During the period of data cleansing and normalization, banks can’t lose sight of developing data scientists with domain expertise. Twomey described the KX model where ML experts from outside consultancies, like KX, work alongside bank employees to develop best data practices and ML strategies.

Reduction in barriers to entry

Reducing the barriers to entry for financial ML/AI applications is a mandate many financial institutions are moving to address. As a first step, they will need to change their practices around collecting, cleaning and storing data. Poor data quality was highlighted by the panel as a real hindrance to faster ML evolution.

For example, fixed income market participants are realizing that some of their biggest data problems stem from capturing market data from multiple sources in different formats, which takes significant resources to normalize and cleanse. Conor noted that correct feature engineering, which increases the predictive power of machine learning algorithms, depends on the quality of the raw data.
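
As a hypothetical illustration of that dependency, even a simple engineered feature such as a rolling average spread is only as good as the quotes behind it. The sketch below uses pandas on an already normalized quote table; the column names are illustrative, not a fixed schema.

    import pandas as pd

    def spread_features(quotes: pd.DataFrame) -> pd.DataFrame:
        # Expects a normalized quote table with columns: ts (UTC datetime), bid, ask.
        q = quotes.copy()

        # Data hygiene first: drop missing or crossed quotes that would poison the feature.
        q = q.dropna(subset=["bid", "ask"])
        q = q[q["ask"] > q["bid"]]
        q = q.sort_values("ts").set_index("ts")

        # Engineered features: mid price, spread in basis points, one-minute rolling spread.
        q["mid"] = (q["bid"] + q["ask"]) / 2
        q["spread_bps"] = (q["ask"] - q["bid"]) / q["mid"] * 1e4
        q["spread_bps_1m"] = q["spread_bps"].rolling("1min").mean()
        return q.reset_index()

If the upstream feeds are not normalized first, the same calculation silently mixes units and timestamps, which is the kind of quality problem the panel warned about.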

Conclusion

Breaking down data silos is a powerful first step for financial institutions looking to improve their data-driven decision making. Solutions like KX Data Refinery present the opportunity to run sophisticated analytics and create reports from centralized, cleansed data in one platform.

Regulatory change in fixed income markets, as well as with other asset types, is driving the industry towards greater transparency. The panelists at the conference all agreed that new transparency requirements can empower investors and energize competition in the markets when they are balanced correctly.

Manus McGuire is a Regulatory Reporting Executive with KX based in New York City.
