News

Cloud-Delivered Market Data for Institutional Users – A Reality Check


Read the article on A-Team Insight Blog

By Mike O’Hara, Special Correspondent

Cloud-delivered market data was once ‘over my dead body’ territory for institutional market data managers, who understandably fretted aloud about performance, security and licence compliance issues. But Covid-19 has forced those same data managers to confront the fact that many of their professional market data users are able to work from home (WFH), in turn driving financial firms to question whether the pandemic could be the catalyst for a rethink of their expensive-to-maintain market data infrastructures, with cloud part of the data delivery solution.

For many financial firms, today’s cloud delivery and hosting capabilities offer a viable solution for supporting trading and investment teams and their support staff, accelerating demand for cloud-based market data delivery infrastructures. The thinking is that cloud may help firms with their broader aim of reducing their on-premises technology and equipment footprint, a trend that was emerging even before the Coronavirus struck.

But embracing cloud delivery introduces new challenges for market data and trading technology professionals. While WFH will doubtless continue in some form, it’s far from clear that all market data delivery can be migrated to the cloud. Essential market data functions will remain on-premises. High-performance trading applications and low-latency market data connectivity, for example, will continue to rely on state-of-the-art colocation and proximity hosting data centres.

For many financial institutions, the challenge will be how to manage these several tiers of market data delivery and consumption. Going forward, practitioners will face a three-way hybrid of on-premises, cloud-based (private/public) and colocated market data services in order to support a range of users: from work-from-home traders and support staff to trading-room-based traders, analysts and quants, to colocated electronic applications like algorithms, smart order routers and FIX engines.

Indeed, A-Team will be discussing the infrastructure, connectivity and market data delivery challenges associated with cloud adoption in a webinar panel session on November 3. The webinar will offer a ‘reality check’ that discusses best practices for embracing cloud, colo and on-prem to support this new mix of user types, with emphasis on capacity, orchestration, licensing, entitlements and system / usage monitoring.

With firms’ appetite for exploring the potential of the cloud piqued, data managers are now looking at whether they can hope to take advantage of some of the more widely recognised benefits of the cloud – flexibility, agility, speed-to-market, scalability, elasticity, interoperability and so on – as they grapple with the future market data delivery landscape.

“Market data infrastructure, in terms of data vendor contracts, servers, and data centre space, typically represents a large, lumpy capital expenditure”, says independent consultant Nick Morrison. “And so having the ability to transition that to something with costs that are more elastic is highly attractive”.

Of course, every firm has its own unique requirements and nuances in this regard. Proprietary trading firms, asset managers, hedge funds, brokers and investment banks are all heavy consumers of market data. But the volume, breadth, depth and speed of the data they need in order to operate is highly diverse. Which means that there is no ‘one size fits all’ when it comes to sourcing and distribution mechanisms (including the cloud).

Market data and the cloud – what’s applicable?

As they consider their options for including cloud in their overall data delivery plans, data managers need to assess whether and how specific data types could be migrated to a hybrid environment: Level 1 (best bid/offer), level 2 (order book with aggregated depth at each price level) or level 3 (full order book)? Historic, end of day, delayed or real-time? Streaming or on-demand? This all has a bearing on the feasibility of cloud as a delivery mechanism.
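The distinction between these data levels can be made concrete in a small sketch (illustrative Python only, not tied to any vendor's actual schema): level 1 is simply the top of the level-2 book, while level 3 would additionally keep every individual order rather than aggregated sizes.

```python
from dataclasses import dataclass

@dataclass
class Level1Quote:
    """Level 1: best bid/offer only."""
    symbol: str
    bid: float
    ask: float

@dataclass
class Level2Book:
    """Level 2: order book with aggregated depth at each price level.
    (Level 3 would instead carry every individual order in the book.)"""
    symbol: str
    bids: dict  # price -> aggregated size at that price level
    asks: dict  # price -> aggregated size at that price level

    def top_of_book(self) -> Level1Quote:
        # Level 1 is just the best levels of the level-2 book.
        return Level1Quote(self.symbol, max(self.bids), min(self.asks))

book = Level2Book("XYZ", bids={99.9: 500, 99.8: 1200}, asks={100.1: 300, 100.2: 900})
top = book.top_of_book()
```

The deeper the level, the larger the volumes involved, which is one reason delivery feasibility differs so much between the tiers.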

Firms also need to consider what mix of public, private or hybrid cloud best fits their needs. What about virtualisation? Or internal use of cloud architecture, such as building a market data infrastructure around microservices and containers?

The marketplace already has identified at least one workable use-case: the use of historical, tick or time-series market data, usually to drive some form of analytics. A growing number of trading venues (such as ICE and CME) and service providers (Refinitiv, BMLL and others) now offer full level 3 tick data on a T+1 basis, delivered via the cloud. Plenty more providers can offer historic level 1 & 2 data.

This kind of capability can be used for critical use-cases, such as back-testing trading models for signal generation and alpha capture, performing transaction cost analysis (TCA), developing and testing smart order routers (SORs), or fine-tuning trading algos to better source liquidity. In all of these cases, cloud-hosted historical tick databases can reduce on-premises footprint and cost, while offering flexible access to vast computing resource on demand, and many are finding this compelling. “When churning through such vast quantities of data, having access to a cloud environment enables you to scale up horizontally to process that data”, says Elliot Banks, Chief Product Officer at BMLL.
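As a toy illustration of the TCA use-case, the sketch below computes an interval VWAP from historical ticks and the slippage of an execution against it in basis points. The tick records and numbers here are invented; in practice a cloud tick-history service would supply the data through its API.

```python
# Hypothetical tick records for one interval, as a T+1 cloud tick-history
# API might return them: (price, size) pairs.
ticks = [(100.0, 200), (100.2, 300), (99.8, 500)]

def vwap(ticks):
    """Volume-weighted average price over the interval."""
    notional = sum(price * size for price, size in ticks)
    volume = sum(size for _, size in ticks)
    return notional / volume

def slippage_bps(exec_price, ticks):
    """Execution price vs the interval VWAP benchmark, in basis points
    (positive means the buy paid more than the benchmark)."""
    benchmark = vwap(ticks)
    return (exec_price - benchmark) / benchmark * 10_000

cost = slippage_bps(100.0, ticks)  # a buy filled at 100.0
```

Back-testing and algo tuning follow the same pattern, just churning through vastly more data, which is where the cloud's horizontal scaling pays off.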

Where things start to get more complicated, though, is with real-time market data, where two of the biggest hurdles from a cloud delivery perspective are speed and complexity.

Deterministic speed

From a trading standpoint, speed is always going to be a significant factor. Nobody, regardless of whether they’re an ultra-low latency high-frequency trading firm or a human trader dealing from a vendor or broker screen, wants to trade on stale prices. The tolerances may be different but the principle applies across the board.

It’s a safe bet that any firm currently receiving market data directly from a trading venue into a trading server (collocated at the venue’s data centre or hosted at a specialized proximity hosting centre operated by the likes of Interxion) relies on deterministic low latency, and is therefore unlikely to consider cloud as an alternative delivery mechanism.

Clearly, HFT firms with trading platforms that require microsecond-level data delivery won’t be replacing their direct exchange feeds and often hardware-accelerated infrastructure with the cloud, as the performance just isn’t there, for now at least. This, of course, could change if and when the trading venues themselves migrate to cloud platforms, creating a new kind of colocation environment, but that’s likely some way off. “But these guys only have a few applications that really need ultra-low latency data”, says Bill Fenick, VP Enterprise at Interxion. “Most of their applications, be they middle office, settlements or risk, they’re perfectly happy to take low-millisecond latency”.

And what about other market participants? Particularly those that currently make use of consolidated feeds from market data vendors, where speed is perhaps a secondary consideration? This is where cloud delivery may have some real potential. But it’s also where the issue of complexity rears its head.

Navigating the complexity

To deal with the myriad of sources, delivery frequencies, formats and vendor connections used to feed real-time market data into their trading, risk, pricing and analytics systems, many financial firms have built up a complex mesh of infrastructure that ensures the right data gets delivered to the right place at the right time. The integration layer required to handle these data inputs may be delivered as part of the data service or may stand alone as a discrete entity. In either case, it’s unrealistic to expect that all of this infrastructure can just be stripped out and replicated in a cloud environment.

To address this challenge, some service providers are starting to offer solutions where the source of the data is decoupled from the distribution mechanism, aiming for the holy grail where either, or both, can be cloud-based.

By building individual cloud-hosted microservices for sourcing market data, processing that data in a variety of ways, and delivering it into end-user applications, such solutions can help firms migrate their market data infrastructure incrementally from legacy to cloud-based platforms. Refinitiv is starting to shift much of its infrastructure onto AWS, and other specialist cloud-centric vendors such as Xignite and BCC Group also enable internal systems to be decoupled from data sources, thus facilitating a shift towards cloud-based infrastructure. “We believe the customer should be able to easily move from source to source and get as many sources as they want. The cloud enables this kind of flexibility”, says Bill Bierds, President & Chief Business Development Officer at BCC Group.
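The decoupling idea reduces to a thin, source-agnostic interface between data sources and the distribution layer. The sketch below is illustrative Python; the class and vendor names are invented, not any vendor's actual API.

```python
from abc import ABC, abstractmethod

class MarketDataSource(ABC):
    """Anything that can supply a quote: a vendor feed, a cloud API, a test stub."""
    @abstractmethod
    def quote(self, symbol: str) -> float: ...

class VendorAFeed(MarketDataSource):
    def quote(self, symbol):
        return 101.5  # stub standing in for a real feed handler

class VendorBFeed(MarketDataSource):
    def quote(self, symbol):
        return 101.5  # same data, different source

class Distributor:
    """Downstream applications depend only on this layer, so the
    underlying source can be swapped without touching consumers."""
    def __init__(self, source: MarketDataSource):
        self.source = source

    def publish(self, symbol: str) -> float:
        return self.source.quote(symbol)

dist = Distributor(VendorAFeed())
price = dist.publish("XYZ")
dist.source = VendorBFeed()  # switch vendors; consumers are unaffected
```

This is the same design choice behind decoupling TREP from a specific feed, described below: the integration layer stays put while sources come and go.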

Firms have long wanted to become more vendor-agnostic by decoupling their data integration capability from the primary data source. One investment bank in London, for example, was able to decouple Refinitiv’s TREP platform from its Elektron data feed and switch to Bloomberg’s B-Pipe for its data, delivered via the TREP framework. From a market data perspective, this has given the bank more negotiating power and less vendor lock-in, opening up greater opportunities to utilise cloud-based market data sources in the future.

Permissioning and entitlements

Perhaps one of the toughest challenges that firms face around real-time market data on the cloud is that of entitlements and usage authorisation. Firms sourcing data from the two main data vendors, Refinitiv and Bloomberg, will generally be tied into their respective DACS and EMRS entitlements systems, often augmented by data inventory and contract management platforms like MDSL’s MDM or TRG Screen’s FITS and InfoMatch.

Entitlements can be a thorny subject when it comes to cloud-based distribution of market data. Firms are wary of falling foul of their licence agreements with their various data vendors, all of whom have different commercial considerations and penalties for non-compliance. This is why accurate tracking and reporting of market data access and usage is crucial.

The cloud can be a double-edged sword in this regard. On the one hand, transitioning from a dedicated infrastructure to the cloud might trigger extra licensing costs for what is effectively an additional data centre, so firms may need to go through a period of paying twice for the same data. Indeed, firms may already be facing this situation as they entitle staff to operate from home while holding enterprise licences covering only their headquarters and regional offices.

On the other hand, cloud-based services such as those offered by Xignite and others can make it easier for firms to manage entitlements across multiple data vendors from a central source via a UI. “Our entitlements microservice is integrated with our real time microservice, to make sure that any distribution and any consumption of data is authenticated and entitled properly, so that only the right users have access to the data,” says Stephane Dubois, CEO of Xignite, whose microservices suite is supporting NICE Actimize’s cloud-based market data delivery infrastructure.
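In outline, a centralised entitlements check of this kind gates every data request against a per-user permission set, so that consumption can be tracked and reported consistently. This is a minimal sketch; the permission labels and stub values are invented for illustration.

```python
# Central entitlements store: user -> set of permissioned services.
ENTITLEMENTS = {
    "alice": {"delayed", "level1"},
    "bob": {"realtime", "level1", "level2"},
}

def is_entitled(user: str, service: str) -> bool:
    """Authorisation check run before any data is served."""
    return service in ENTITLEMENTS.get(user, set())

def get_realtime_quote(user: str, symbol: str) -> float:
    # Every consumption path runs through the same check, which is what
    # makes accurate usage reporting for licence compliance possible.
    if not is_entitled(user, "realtime"):
        raise PermissionError(f"{user} is not entitled to real-time data")
    return 123.45  # stub quote standing in for the real data service

quote = get_realtime_quote("bob", "XYZ")
```

Managing this in one place across multiple vendors, rather than per vendor system, is the convenience the cloud-based entitlement services promise.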

Where next?

With new products, services and technologies emerging all the time, firms can be optimistic about the growing opportunities that the cloud can offer for managing market data. One particularly interesting development worth watching is the rise of Low Code Application Platforms (LCAPs), such as that offered by Genesis, which provides a cloud-based microservices framework that can be used for rapidly developing and delivering applications around real-time market data. One example is on-demand margining. “A prime broker can link to all of its customers and know exactly what their risk positions are based on real-time market data, so within minutes, they can be sending out margin calls”, says Felipe Oliviera, Head of Sales and Marketing at Genesis.

Industry behemoths such as Refinitiv, SIX and FactSet are also embracing the cloud. Refinitiv has now launched delivery of market data via AWS, is making its tick history data available on Google Cloud and has also recently announced a partnership with Microsoft Azure. FactSet has launched a cloud-based ticker plant on Amazon EC2. And SIX is partnering with Xignite for real-time market data delivery via the cloud. Bloomberg is also partnering with AWS to make its B-Pipe data feed available through the cloud. And the main cloud vendors themselves – Amazon, Google and Microsoft – have established dedicated teams to develop these markets.

In conclusion, it’s clear that there are a number of challenges that firms still face when transitioning any part of their market data infrastructure to the cloud. (To register for A-Team’s free webinar on the topic, click here.) And in many cases, particularly where ultra-low latency is required, cloud is not the answer. But equally, by migrating certain elements of their market data infrastructure to the cloud, cost savings can be achieved, efficiencies can be gained and firms can potentially do more with less.

RECENT NEWS

In the metals market, as in the world of international rates, currencies play the crucial role of acting as the medium of exchange in the transactions that take place.

Currencies like the United States dollar, the Euro, or the British Pound are commonly used around the world to quote metal rates. Some companies that offer live and historical precious metal rates have exposed their APIs (Application Programming Interfaces) to allow developers to integrate current and historical metal rates, currency conversion, or other capabilities into their applications.
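Integrating such an API typically amounts to an authenticated HTTP GET returning JSON. The sketch below builds a request URL and parses a sample response; the endpoint, parameter names and payload are invented for illustration and do not reflect any specific provider's actual API.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint and query parameters -- a real provider
# documents its own paths, parameters and authentication scheme.
BASE_URL = "https://api.example-metals.test/v1/GetRealTimeMetalRate"
params = {"Symbol": "XAU", "Currency": "USD", "_token": "YOUR_API_KEY"}
request_url = f"{BASE_URL}?{urlencode(params)}"

# Sample JSON payload shaped the way such services commonly respond.
sample_response = '{"Symbol": "XAU", "Currency": "USD", "Rate": 1820.55}'
rate = json.loads(sample_response)

price_usd_per_oz = rate["Rate"]  # gold rate quoted in the requested currency
```

The same request shape, with different symbols or a date parameter, typically covers historical rates and currency conversion as well.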

There are plenty of APIs for precious metals live and historical rates available online, and if you want to try one, Barchart is going to be one of your first options. But if you take a look at what else is on the market, you’ll find many great alternatives:

Xignite Market Data Cloud Platform

Xignite Market Data as a Service was one of the first market data services built to run in AWS, and Xignite is one of the few vendors that is an AWS Advanced Technology Partner with a Financial Services Competency.

With more than a decade of cloud expertise in building, scaling and operating cloud-based market data technology, it is no surprise that leading financial services and capital markets firms rely on this company to empower their journey to the cloud. Their Metals API Service offers real-time prices and quotes for metals including Gold, Silver, Palladium and Platinum, as well as base metals. In addition to real-time precious metals prices, the service provides daily London Fixing prices as well as historical precious metal prices and metal news.

Xignite Cloud APIs are sourced from leading providers such as FactSet and Morningstar as well as Xignite’s own curated, high-quality data.

Read the article Top 3 Alternatives for Barchart Precious Metals Rates

02/25/2021

Each year, Bobsguide asks the market to vote for fintech companies they believe stand out from the competition – those who have gone the extra mile in terms of development and servicing their clients. Xignite is proud to be listed as the "Best API Management" vendor on the Bobsguide 2020 Rankings.


Read article on Bobsguide

01/26/2021

Web services data provider Xignite captured the AFTAs judges’ attention on the infrastructure front with its release of Xignite Enterprise Microservices in July 2020, a suite of cloud-based microservices for data management, storage and distribution in the cloud, designed to help financial firms migrate from monolithic legacy data architectures to more agile and less expensive cloud services and data sources.

Requires subscription to read the article on WatersTechnology

01/25/2021

Xignite, Inc., a provider of market data distribution and management solutions for financial services and technology companies, today revealed the results of its collaboration with StockCharts, a leading technical analysis and financial charting platform for online retail investors. The collaboration involved a move from an on-premises market data provider to Xignite’s cloud-native technology hosted in Amazon Web Services (AWS). Download the case study containing the full results.

StockCharts requires vast quantities of financial data to power its visualization, charting and tracking tools, which investors use to analyze the markets to help with investment decisions. The company was frustrated by the limits of its on-premises market data center, which was forcing the team to make architectural decisions based on what its data center could handle in terms of speed and storage rather than on what its technology required. Its previous market data provider was just starting to build out some cloud offerings, but these were far from what the business required. StockCharts decided to migrate its infrastructure to the AWS cloud and partner with Xignite to gain access to endlessly scalable market and financial data delivered through innovative cloud APIs.

The collaboration made an immediate impact as StockCharts was able to expand its offerings and customer base by pursuing growth strategies enabled by Xignite’s cloud-based approach, which provides easy access to data and eliminates architectural limits on storage and speed.

The pandemic provided further validation. Seattle-based StockCharts was in one of the first areas hit by COVID-19 and was forced to quickly shut down its office. Pandemic-driven market volatility followed and StockCharts customers wanted to visualize what was happening. The company’s ability to scale quickly and accommodate a high volume of new requests would not have been possible without Xignite.

“The move to the AWS cloud and Xignite has unlocked tremendous new potential for us in a lot of architectural ways, and has given us a lot of data options that we could not even consider before,” said Grayson Roze, Vice President of Operations at StockCharts. “It relieved us of the burden of figuring out how to source things. Instead, we know exactly where we need to go to get the data and can access it instantly. That is a huge, huge benefit for our business.”

“We are proud to have played a role in transforming how StockCharts approaches data,” said Stephane Dubois, CEO and Founder of Xignite. “The events of this year unleashed a massive spike in retail trading and a host of other unexpected forces that reinforced the need for financial services firms to leverage the cloud. Despite the disruption of this year, StockCharts was positioned for success, and we look forward to continuing to deliver our financial and market data solutions to the industry at large.”

Xignite

Xignite has been disrupting the financial and market data industry from its Silicon Valley headquarters since 2006 when it introduced the first commercial REST API. Since then, Xignite has been continually refining its technology to help fintech and financial institutions get the most value from their data. Today, more than 700 clients access over 500 cloud-native APIs and leverage a suite of specialized microservices-delivered modules to build efficient and cost-effective enterprise data management solutions. Visit http://www.xignite.com or follow @xignite on Twitter.

01/12/2021