
Q&A: OneMarketData's Lovas on the Low Latency Arms Race, TCA and Big Data

From time to time, it's good to take the pulse of the low-latency world to get updated on the status quo, and find out what's coming next. IntelligentTradingTechnology.com got such insight from Louis Lovas, director of solutions at OneMarketData.

Q: Compared to a year ago, where are trading firms focusing in their efforts to align latency reduction with their business?

A: A lot of effort continues to be devoted to latency reduction in the network pipes - the plumbing layer, so to speak.  But there is an increased emphasis on looking further up the stack, at the application layer where algorithms execute.  The demands of algo trading are driving ever more sophistication and complexity, and also revealing areas for improvement such as transaction cost controls.  Firms are finding that improvements in slippage are a direct corollary of reduced latency.

Q: Within a trading firm, given the many different projects calling for funding, what are the justifications for investment in reducing latency?

A: Latency reduction and its implied benefits do not always mean replacing or upgrading the network plumbing.  Slimmer margins and the ever-increasing difficulty of finding alpha mean firms have to look in every corner for opportunity.  Greater opportunity can lie elsewhere, and it comes in many shapes and sizes, from increasingly complex algos to TCA.  The ability to find opportunity in real time can come from a hunt through the past - specifically, historical data.  It can reveal the biggest bang for your buck in alpha model analysis and slippage analysis.

Q: What is your company doing to help trading firms reduce latency in a way that benefits their business?

A: We are a vendor of trading infrastructure and data analysis solutions.  Our tools focus on the application ("algo") layer of the trading stack.  We offer firms high-performance tools to analyse markets both in real time and historically, in order to build more sophisticated algos and discover cost-improvement opportunities.

Q: Do you think the "Low-Latency Arms Race" is over?  Or is it entering a new phase?

A: It will never be over; it will just morph into a new phase every few years.  As in the past, and as will be true in the future, an external catalyst will create a new tipping point.  That could be the invention of a new generation of hardware technology creating an order-of-magnitude performance change, or an external force such as a regulatory change that shifts the focus.  The Market Access Rule banning naked access is one such action.  It has caused brokerage firms to focus on providing the best "microsecond low-latency" access while still conforming to the SEC's Rule 15c3-5, which mandates pre-trade risk checks.  The rule ensures everyone pays a latency tax for checking credit limits and order constraints.  Brokers must enforce it, and they've responded by engaging in a latency war - one narrowly defined, of course, but a war nonetheless.
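The kind of pre-trade checks Rule 15c3-5 mandates - and the "latency tax" they impose - can be illustrated with a minimal sketch.  The specific limits, order fields and function names here are hypothetical, not drawn from any particular broker's system:

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    qty: int
    price: float  # limit price

# Hypothetical per-client limits of the kind brokers must enforce
# before an order reaches the market.
MAX_ORDER_QTY = 10_000        # per-order size constraint
CREDIT_LIMIT = 1_000_000.0    # aggregate notional credit limit

def pre_trade_check(order: Order, open_notional: float) -> bool:
    """Return True only if the order passes every risk check."""
    if order.qty <= 0 or order.qty > MAX_ORDER_QTY:
        return False  # reject malformed or oversized orders
    if open_notional + order.qty * order.price > CREDIT_LIMIT:
        return False  # reject orders that would breach the credit limit
    return True

print(pre_trade_check(Order("XYZ", 500, 10.0), open_notional=0.0))     # True
print(pre_trade_check(Order("XYZ", 50_000, 10.0), open_notional=0.0))  # False
```

Every order pays the cost of these checks on the critical path, which is why brokers compete to make them as fast as possible.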

Q: Where do you expect new business opportunities to come from over the next year?

A: There will be a continued focus on transaction cost analysis as firms increasingly become multi-asset; they will look to architect their own TCA solutions because no single broker will provide a complete view across their tradable markets.  TCA may not appear so obviously latency-related, but slippage is all about latency.  When you introduce the complexity of trading across multiple ECNs and multiple geographies for equities, futures and FX, price slippage is a constant reminder of the need to focus on latency.
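The slippage measure at the heart of TCA can be sketched in a few lines.  This is one common convention - arrival-price slippage in basis points, positive meaning a cost to the trader - and the function name and prices are illustrative assumptions, not part of any vendor's API:

```python
def slippage_bps(arrival_price: float, fill_price: float, side: str) -> float:
    """Arrival-price slippage in basis points; positive = cost to the trader."""
    # For a buy, filling above the arrival price is a cost;
    # for a sell, filling below it is.
    signed = (fill_price - arrival_price) if side == "buy" else (arrival_price - fill_price)
    return signed / arrival_price * 10_000

# A buy filled 2 cents above a $50.00 arrival price costs 4 bps.
print(slippage_bps(50.00, 50.02, "buy"))  # 4.0 (within float rounding)
```

The longer an order is in flight, the further the market can drift from the arrival price - which is why slippage is, as noted above, all about latency.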

Q: What are some of the technology developments that your company is leveraging in its offerings?

A: The key areas we focus on for low latency are:

a) Fast access to market data.  Critical to strategy decision time is the immediacy of pricing data, whether from a single source or many; order books need to be consolidated and possibly currency-converted.  Fast access improves overall decision time for algos such as spread and pairs trading.

b) Analytical function library.  Increasing algo sophistication implies more calculation and analysis - in real time.  That need not be diametrically opposed to low latency; we continually tune our analytics to ensure optimal performance.
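The order book consolidation mentioned in (a) can be sketched simply.  The venue names, quotes and FX rate below are all hypothetical, and a production feed handler would of course do this incrementally rather than re-sorting:

```python
# Consolidate bid levels from two venues into one USD book,
# converting a EUR-quoted feed along the way.
EUR_USD = 1.10  # assumed conversion rate

venue_a_bids = [(101.50, 200), (101.45, 500)]  # (price in USD, size)
venue_b_bids = [(92.30, 300), (92.25, 100)]    # (price in EUR, size)

usd_bids = venue_a_bids + [(p * EUR_USD, s) for p, s in venue_b_bids]
consolidated = sorted(usd_bids, key=lambda level: -level[0])  # best bid first

print(consolidated[0])  # best consolidated bid across both venues
```

Here the EUR venue's top level (92.30 × 1.10 ≈ 101.53) becomes the best consolidated bid, which is exactly the kind of cross-source view a spread or pairs strategy needs immediately.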

Q: Are there technology developments happening that you're tracking for possible future use in your offerings?

A: Hadoop and MapReduce form an area we're looking into and planning an R&D effort around.

Q: What is exciting you about the financial markets these days - and how do you fit in - right now?

A: Big data is gaining a lot of attention in finance.  The OneTick product is well architected for consuming the fire hose of big data in finance.  Firms are no longer content to be single-asset, and they're branching out to other types of data.  OneTick provides the flexibility, capacity and analytical capability to take on big data's challenge.

