News

The API movement in the robo-advisory space

There are roughly 1,700 APIs listed in the Financial category of the ProgrammableWeb directory, plus 1,200 in payments. Government has 1,150, e-commerce 2,500, and mobile 2,450. Last September, Xignite launched the #FintechRevolution API ecosystem, a consortium of 21 companies supporting and connecting developers with best-of-breed financial APIs. The focus is on non-traditional workflow, data and analytics providers (e.g. Tradier, Estimize, Autochartist).

Financial businesses, whether startups or incumbents that are innovating, may or may not have an API strategy. The main choices I see are:

  • A white label offering
  • A private API
  • An open API

A white label offering is an unbranded package that a business can buy from a provider, integrate with its existing processes, and offer to its clients under its own logo. Customization is only feasible at the window-dressing level.

An API (Application Programming Interface) is a software interface that enables software programs to interact with one another. It can provide access to an organization’s backend data and functions.

It is easy to grasp the concept in FX trading: an API is the software that enables a trading platform to connect with the market where execution occurs. Investopedia says:

Proprietary APIs are offered by almost every major online forex brokerage.

This brings us to the distinction between private and open APIs; the latter is not yet well defined. Private APIs, or proprietary APIs, are accessible only to customers and in-house developers. As a result, security risks are contained, and whether modifications are truly valuable can also be monitored.

With open APIs, on the other hand, a developer or business doesn’t need to log in to the provider’s site or be a customer. Open APIs are listed on ProgrammableWeb and are similar in spirit to the Internet or the open-source Linux operating system: security issues and software development are not controlled centrally.
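
As a minimal sketch of the difference in access models (the URLs and the bearer-token header below are hypothetical placeholders, not any specific provider’s actual interface):

    import requests

    # Open API: no account or login required; anyone can call the endpoint.
    open_quote = requests.get(
        "https://api.example-open-data.com/quote",   # placeholder URL
        params={"symbol": "AAPL"},
        timeout=10,
    )

    # Private/proprietary API: access is restricted to customers, so every
    # request must carry credentials issued by the provider.
    private_quote = requests.get(
        "https://api.example-broker.com/v1/quote",   # placeholder URL
        params={"symbol": "AAPL"},
        headers={"Authorization": "Bearer <customer-api-key>"},
        timeout=10,
    )

    print(open_quote.status_code, private_quote.status_code)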

More detailed considerations on the two architectures and business strategies can be found in the API Academy’s “Private APIs vs. Open APIs”. Clearly, the market is iterating on its business strategies and testing the waters, moving from private APIs to open APIs, with some referring to hybrid APIs.

Zooming into digital wealth management as a whole, which we defined as broad and intrusive in our early April post “How we define and categorize Fintech”, won’t help us examine the API movement, its variations and its business strategy alternatives. We choose to start with the by-now boring robo-advisory space. There are two broad types of such businesses:

  • Standalone robo-advisor startups
  • Broader businesses with a robo-advisory offering

We stressed their dynamics in a 90-second videographic in early March, as we were seeing serious signs of leapfrogging by the “incumbents”. From the entire universe of robo-advisors, we distinguish two meaningful clusters:

  • Robo-advisory businesses with brokerage and/or custody services
  • Robo-advisory businesses without brokerage and/or custody services

Most of those in the first category have been embracing the API movement around their robo-advisory business, along with a white label offering. A few examples:

White label & API offering (multi-asset):

  • TD Ameritrade
  • Interactive Brokers
  • Saxo Bank
  • Apex Clearing

API offering only:

  • Equities: Robinhood (commission-free brokerage), Tradier (brokerage as a service)
  • Bitcoin: Coinbase

In the second group, those without brokerage and custody services, we see more of a white label than an API approach. Betterment, for example, has taken the white label route for its institutional offering, and so have SigFig and NextCapital (just to name a few in the US). In Italy, AdviseOnly has a white label offering; in the UK, Money on Toast. Who am I missing in Europe?

This contrasts with the “Coming Soon” API from Hedgeable, which will be listed on ProgrammableWeb and was announced about two weeks ago in “Daily API RoundUp: NASA, MapFruition, Hedgeable, ProPublica Campaign Finance”. Hedgeable was the first robo-advisor (albeit not a robo 1.0) to offer retail clients the opportunity to invest in bitcoin. Now it is taking the lead in joining the open API movement.

Wealthfront, which is focusing on direct consumers (not advisors), seems to have chosen an API approach (no white label offering) that is closer to the Robinhood style:

“Wealthfront 3.0 will feature direct integrations with platforms like Venmo, Redfin, Lending Club and Coinbase as well as bank accounts and external brokerage accounts.”

Redfin is an online real estate firm. Venmo is a digital wallet that allows sharing payments with friends (like splitting cab fare). Coinbase is a bitcoin wallet. Apex Clearing is the clearing partner for Wealthfront. All of these integrations are possible via “private API” access granted by those parties.

Glancing at the roughly 1,700 financial APIs on ProgrammableWeb, it is evident that they are mostly data-related or cryptocurrency-related. As there is no easy way to refine the categorization, we can only conclude that most open financial APIs relate to brokerage (e.g. Tradable), banking of course (e.g. Capital One), payments and PFM.

As the broader trend of convergence and integration is underway in several verticals of financial services (digital wealth management and lending, for example), we will be watching the attractors in action.

Where will the puck head in the robo-advisory space?

  • White Label
  • Private API
  • Open API

Which European robo-advisors (with no brokerage and custody business) have a white label offering or some version of an API strategy?

Anybody in Asia? Smartly and 8Now! are just launching locally and aren’t fired up via a white label from the West.

Daily Fintech Advisers provides strategic consulting to organizations with business and investment interests in Fintech. Efi Pylarinou is a Digital Wealth Management thought leader.

Source: Daily Fintech

RECENT NEWS

Read the article on A-Team Insight Blog

By Mike O’Hara, Special Correspondent

Cloud-delivered market data was once ‘over my dead body’ territory for institutional market data managers, who understandably fretted aloud about performance, security and licence compliance issues. But Covid-19 has forced those same data managers to confront the fact that many of their professional market data users are able to work from home (WFH), in turn driving financial firms to question whether the pandemic could be the catalyst for a rethink of their expensive-to-maintain market data infrastructures, with cloud part of the data delivery solution.

For many financial firms, today’s cloud delivery and hosting capabilities offer a viable solution for supporting trading and investment teams and their support staff, accelerating demand for cloud-based market data delivery infrastructures. The thinking is that cloud may help firms with their broader aim of reducing their on-premises technology and equipment footprint, a trend that was emerging even before the Coronavirus struck.

But embracing cloud delivery introduces new challenges for market data and trading technology professionals. While WFH will doubtless continue in some form, it’s far from clear that all market data delivery can be migrated to the cloud. Essential market data functions will remain on-premise. High-performance trading applications and low-latency market data connectivity, for example, will continue to rely on state-of-the-art colocation and proximity hosting data centres.

For many financial institutions, the challenge will be how to manage these several tiers of market data delivery and consumption. Going forward, practitioners will face a three-way hybrid of on-premises, cloud-based (private/public) and collocated market data services in order to support a range of users: from work-from-home traders and support staff to trading-room-based traders, analysts and quants, to collocated electronic applications like algorithms, smart order routers and FIX engines.

Indeed, A-Team will be discussing the infrastructure, connectivity and market data delivery challenges associated with cloud adoption in a webinar panel session on November 3. The webinar will offer a ‘reality check’ that discusses best practices for embracing cloud, colo and on-prem to support this new mix of user types, with emphasis on capacity, orchestration, licensing, entitlements and system / usage monitoring.

With firms’ appetite for exploring the potential of the cloud piqued, data managers are now looking at whether they can hope to take advantage of some of the more widely recognised benefits of the cloud – flexibility, agility, speed-to-market, scalability, elasticity, interoperability and so on – as they grapple with the future market data delivery landscape.

“Market data infrastructure, in terms of data vendor contracts, servers, and data centre space, typically represents a large, lumpy, cap ex expenditure”, says independent consultant Nick Morrison. “And so having the ability to transition that to something with costs that are more elastic, is highly attractive”.

Of course, every firm has its own unique requirements and nuances in this regard. Proprietary trading firms, asset managers, hedge funds, brokers and investment banks are all heavy consumers of market data. But the volume, breadth, depth and speed of the data they need in order to operate is highly diverse. Which means that there is no ‘one size fits all’ when it comes to sourcing and distribution mechanisms (including the cloud).

Market data and the cloud – what’s applicable?

As they consider their options for including cloud in their overall data delivery plans, data managers need to assess whether and how specific data types could be migrated to a hybrid environment: Level 1 (best bid/offer), level 2 (order book with aggregated depth at each price level) or level 3 (full order book)? Historic, end of day, delayed or real-time? Streaming or on-demand? This all has a bearing on the feasibility of cloud as a delivery mechanism.
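
To make those data types concrete, here is a minimal sketch (in Python, with illustrative fields of our own choosing) of how level 1 and level 2 data differ in shape; a level 3 feed would carry every individual order rather than aggregated depth at each price:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Level1Quote:
        # Top of book only: best bid/offer plus the last trade.
        symbol: str
        bid: float
        bid_size: int
        ask: float
        ask_size: int
        last: float

    @dataclass
    class PriceLevel:
        price: float
        aggregated_size: int   # total quantity resting at this price

    @dataclass
    class Level2Book:
        # Order book with aggregated depth at each price level.
        symbol: str
        bids: List[PriceLevel]  # best (highest) bid first
        asks: List[PriceLevel]  # best (lowest) ask first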

Firms also need to consider what mix of public and private cloud, or what hybrid cloud solution, best fits their needs. What about virtualisation? Or internal use of cloud architecture, such as building a market data infrastructure around microservices and containers?

The marketplace already has identified at least one workable use-case: the use of historical, tick or time-series market data, usually to drive some form of analytics. A growing number of trading venues (such as ICE and CME) and service providers (Refinitiv, BMLL and others) now offer full level 3 tick data on a T+1 basis, delivered via the cloud. Plenty more providers can offer historic level 1 & 2 data.

This kind of capability can be used for critical use-cases, such as back-testing trading models for signal generation and alpha capture, performing transaction cost analysis (TCA), developing and testing smart order routers (SORs), or fine-tuning trading algos to better source liquidity. In all of these cases, cloud-hosted historical tick databases can reduce on-premises footprint and cost, while offering flexible access to vast computing resource on demand, and many are finding this compelling. “When churning through such vast quantities of data, having access to a cloud environment enables you to scale up horizontally to process that data”, says Elliot Banks, Chief Product Officer at BMLL.
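
As an illustration of the pattern, the sketch below assumes a T+1 tick extract has already been pulled down from such a service into a local CSV with hypothetical column names; it computes a simple effective-spread metric of the kind used in TCA:

    import pandas as pd

    # Assumed columns: timestamp, symbol, trade_price, bid, ask
    ticks = pd.read_csv("ticks_T+1_extract.csv", parse_dates=["timestamp"])

    # Effective spread of each trade versus the prevailing quote midpoint,
    # expressed in basis points and aggregated per symbol.
    ticks["mid"] = (ticks["bid"] + ticks["ask"]) / 2
    ticks["effective_spread_bps"] = (
        (ticks["trade_price"] - ticks["mid"]).abs() / ticks["mid"] * 1e4
    )

    print(ticks.groupby("symbol")["effective_spread_bps"].describe())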

Where things start to get more complicated, though, is with real-time market data, where two of the biggest hurdles from a cloud delivery perspective are speed and complexity.

Deterministic speed

From a trading standpoint, speed is always going to be a significant factor. Nobody, regardless of whether they’re an ultra-low latency high-frequency trading firm or a human trader dealing from a vendor or broker screen, wants to trade on stale prices. The tolerances may be different but the principle applies across the board.

It’s a safe bet that any firm currently receiving market data directly from a trading venue into a trading server (collocated at the venue’s data centre or hosted at a specialized proximity hosting centre operated by the likes of Interxion) relies on deterministic low latency, and is therefore unlikely to consider cloud as an alternative delivery mechanism.

Clearly, HFT firms with trading platforms that require microsecond-level data delivery won’t be replacing their direct exchange feeds and often hardware-accelerated infrastructure with the cloud, as the performance just isn’t there, for now at least. This, of course, could change if and when the trading venues themselves migrate to cloud platforms, creating a new kind of colocation environment, but that’s likely some way off. “But these guys only have a few applications that really need ultra-low latency data”, says Bill Fenick, VP Enterprise at Interxion. “Most of their applications, be they middle office, settlements or risk, they’re perfectly happy to take low-millisecond latency”.

And what about other market participants? Particularly those that currently make use of consolidated feeds from market data vendors, where speed is perhaps a secondary consideration? This is where cloud delivery may have some real potential. But it’s also where the issue of complexity rears its head.

Navigating the complexity

To deal with the myriad of sources, delivery frequencies, formats and vendor connections used to feed real-time market data into their trading, risk, pricing and analytics systems, many financial firms have built up a complex mesh of infrastructure that ensures the right data gets delivered to the right place at the right time. The integration layer required to handle these data inputs may be delivered as part of the data service or may stand alone as a discrete entity. In either case, it’s unrealistic to expect that all of this infrastructure can just be stripped out and replicated in a cloud environment.

To address this challenge, some service providers are starting to offer solutions where the source of the data is decoupled from the distribution mechanism, aiming for the holy grail where either, or both, can be cloud-based.

By building individual cloud-hosted microservices for sourcing market data, processing that data in a variety of ways, and delivering it into end-user applications, such solutions can help firms migrate their market data infrastructure incrementally from legacy to cloud-based platforms. Refinitiv is starting to shift much of its infrastructure onto AWS, and other specialist cloud-centric vendors such as Xignite and BCC Group also enable internal systems to be decoupled from data sources, thus facilitating a shift towards cloud-based infrastructure. “We believe the customer should be able to easily move from source to source and get as many sources as they want. The cloud enables this kind of flexibility”, says Bill Bierds, President & Chief Business Development Officer at BCC Group.
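
The design idea behind such decoupling can be sketched in a few lines; this is not any vendor’s actual interface, just the general shape of separating sources from the distribution layer:

    from abc import ABC, abstractmethod
    from typing import Callable, Dict

    # Every source (vendor feed, exchange feed, cloud API) implements the same
    # narrow interface, so consumers never depend on a specific provider.
    class MarketDataSource(ABC):
        @abstractmethod
        def subscribe(self, symbol: str, on_quote: Callable[[Dict], None]) -> None:
            ...

    class LegacyVendorFeed(MarketDataSource):
        def subscribe(self, symbol, on_quote):
            # ...vendor-specific wire protocol would live here...
            on_quote({"symbol": symbol, "bid": 100.00, "ask": 100.02, "src": "legacy"})

    class CloudRestSource(MarketDataSource):
        def subscribe(self, symbol, on_quote):
            # ...polling or streaming against a cloud API would live here...
            on_quote({"symbol": symbol, "bid": 100.01, "ask": 100.03, "src": "cloud"})

    # The distribution layer knows only the interface; swapping sources (or
    # running both during a migration) requires no change to consuming apps.
    def distribute(source: MarketDataSource, symbol: str) -> None:
        source.subscribe(symbol, lambda q: print("publish to internal bus:", q))

    distribute(LegacyVendorFeed(), "VOD.L")
    distribute(CloudRestSource(), "VOD.L")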

Firms have long wanted to become more vendor-agnostic by decoupling their data integration capability from the primary data source. One investment bank in London, for example, was able to decouple Refinitiv’s TREP platform from its Elektron data feed and switch to Bloomberg’s B-Pipe for its data, delivered via the TREP framework. From a market data perspective, this has given the bank more negotiating power and less vendor lock-in, opening up greater opportunities to utilise cloud-based market data sources in the future.

Permissioning and entitlements

Perhaps one of the toughest challenges that firms face around real-time market data on the cloud is that of entitlements and usage authorisation. Firms sourcing data from the two main data vendors, Refinitiv and Bloomberg, will generally be tied into their respective DACS and EMRS entitlements systems, often augmented by data inventory and contract management platforms like MDSL’s MDM or TRG Screen’s FITS and InfoMatch.

Entitlements can be a thorny subject when it comes to cloud-based distribution of market data. Firms are wary of falling foul of their licence agreements with their various data vendors, all of whom have different commercial considerations and penalties for non-compliance. This is why accurate tracking and reporting of market data access and usage is crucial.

The cloud can be a double-edged sword in this regard. On the one hand, transitioning from a dedicated infrastructure to the cloud might trigger extra licensing costs for what is effectively an additional data centre, so firms may need to go through a period of paying twice for the same data. Indeed, firms may already be facing this situation as they entitle staff to operate from home while holding enterprise licences covering only their headquarters and regional offices.

On the other hand, cloud-based services such as those offered by Xignite and others can make it easier for firms to manage entitlements across multiple data vendors from a central source via a UI. “Our entitlements microservice is integrated with our real time microservice, to make sure that any distribution and any consumption of data is authenticated and entitled properly, so that only the right users have access to the data,” says Stephane Dubois, CEO of Xignite, whose microservices suite is supporting NICE Actimize’s cloud-based market data delivery infrastructure.
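
Stripped to its essentials, the pattern is a centrally managed entitlement check consulted on every data request before anything is served; the toy sketch below shows only that shape, not how DACS, EMRS or Xignite’s microservice actually work:

    # Central entitlement map: user -> datasets they are licensed to consume.
    ENTITLEMENTS = {
        "trader_jane": {"LSE_L1", "LSE_L2"},
        "quant_bob": {"LSE_L1"},
    }

    def get_quote(user: str, symbol: str, dataset: str) -> dict:
        if dataset not in ENTITLEMENTS.get(user, set()):
            raise PermissionError(f"{user} is not entitled to {dataset}")
        # ...fetch the quote from the cloud or on-premises source here...
        return {"symbol": symbol, "dataset": dataset, "bid": 10.00, "ask": 10.01}

    print(get_quote("trader_jane", "VOD.L", "LSE_L2"))  # allowed
    print(get_quote("quant_bob", "VOD.L", "LSE_L2"))    # raises PermissionError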

Where next?

With new products, services and technologies emerging all the time, firms can be optimistic about the growing opportunities that the cloud can offer for managing market data. One particularly interesting development worth watching is the rise of Low Code Application Platforms (LCAPs), such as that offered by Genesis, which provides a cloud-based microservices framework that can be used for rapidly developing and delivering applications around real-time market data. One example is on-demand margining. “A prime broker can link to all of its customers and know exactly what their risk positions are based on real-time market data, so within minutes, they can be sending out margin calls”, says Felipe Oliviera, Head of Sales and Marketing at Genesis.
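
The on-demand margining idea reduces to revaluing positions and re-checking a margin rule every time prices update; the sketch below uses made-up numbers and a simplified flat 25% maintenance requirement purely for illustration:

    # Each account: cash balance (negative when financed by a margin loan)
    # plus share holdings. Figures are illustrative only.
    accounts = {
        "client_a": {"cash": 50_000.0, "holdings": {"AAPL": 1_000}},
        "client_b": {"cash": -180_000.0, "holdings": {"AAPL": 2_000}},
    }

    def margin_calls(prices: dict, maintenance_rate: float = 0.25) -> list:
        calls = []
        for client, acct in accounts.items():
            market_value = sum(qty * prices[sym] for sym, qty in acct["holdings"].items())
            equity = acct["cash"] + market_value
            requirement = maintenance_rate * market_value
            if equity < requirement:
                calls.append((client, round(requirement - equity, 2)))
        return calls

    # Re-run on every real-time price update; client_b gets called here.
    print(margin_calls({"AAPL": 115.0}))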

Industry behemoths such as Refinitiv, SIX and FactSet are also embracing the cloud. Refinitiv has now launched delivery of market data via AWS, is making its tick history data available on Google Cloud and has also recently announced a partnership with Microsoft Azure. FactSet has launched a cloud-based ticker plant on Amazon EC2. And SIX is partnering with Xignite for real-time market data delivery via the cloud. Bloomberg is also partnering with AWS to make its B-Pipe data feed available through the cloud. And the main cloud vendors themselves – Amazon, Google and Microsoft – have established dedicated teams to develop these markets.

In conclusion, it’s clear that there are a number of challenges that firms still face when transitioning any part of their market data infrastructure to the cloud. (To register for A-Team’s free webinar on the topic, click here.) And in many cases, particularly where ultra-low latency is required, cloud is not the answer. But equally, by migrating certain elements of their market data infrastructure to the cloud, cost savings can be achieved, efficiencies can be gained and firms can potentially do more with less.

10/21/2020

Xignite, Inc., a provider of market data distribution and management solutions for financial services and technology companies, today announced it won the Best Real-Time Market Data Initiative at the Inside Market Data & Inside Reference Data Awards.

A longtime leader in the market data cloud space, Xignite provides financial data through its innovative cloud APIs, which are developer-friendly, reliable and endlessly scalable. Xignite data is normalized and ready-to-use, eliminating common pain points with legacy providers, while maintaining global coverage and institutional quality.

This award recognized Xignite’s work with SoFi, a leading digital personal finance company. In 2019, SoFi launched SoFi Invest, a free consumer investing service, and enlisted Xignite to power the entire platform, from its robo-advisor capabilities, to financial newsfeed, to real-time market alerts and curated stock list. SoFi has identified a number of ways in which these key features are driving member engagement – for example, 10% of users who receive a market alert make a trade within an hour. For more details on this collaboration, download the case study HERE.

“We are honored to be recognized for Best Real-Time Market Data Initiative. Xignite was the first to bring market data to the cloud, and we have continued to innovate and point the way to the future of this unique subset of the industry,” said Stephane Dubois, CEO and Founder of Xignite. “The SoFi collaboration is a great example of how a firm can leverage our diverse range of APIs to build an all-encompassing platform and scale it rapidly. As we look to the future, we will continue to serve our clients through transformative offerings, including our suite of Xignite Enterprise Microservices, which we announced in July.”

The Inside Market Data & Inside Reference Data Awards are held by WatersTechnology and recognize industry excellence within market data, reference data and enterprise data management. The award ceremony took place during the publication’s Innovation Exchange held virtually from September 9 to September 22.

This is the latest honor in what has been a fruitful year for Xignite on the awards circuit. In the spring, the firm was named an SIIA CODiE Awards finalist and included on the WealthTech 100 list.

About Xignite
Xignite has been disrupting the financial and market data industry from its Silicon Valley headquarters since 2006 when it introduced the first commercial REST API. Since then, Xignite has been continually refining its technology to help fintech and financial institutions get the most value from their data. Today, more than 700 clients access over 500 cloud-native APIs and leverage a suite of specialized microservices to build efficient and cost-effective enterprise data management solutions. Visit http://www.xignite.com or follow on Twitter @xignite

About the Inside Market Data and Inside Reference Data Awards
The annual Inside Market Data and Inside Reference Data Awards, now in their 17th year, play a key role in WatersTechnology’s awards program, and are the only awards that feature a mix of call-for-entry categories determined by a panel of judges and categories compiled by WatersTechnology’s journalists and voted on by the brand’s readership. This year’s awards featured 32 categories in total: 21 call-for-entry categories, 10 journalist-compiled categories, and a hall of fame (lifetime achievement) award.

09/23/2020

Over time, the market has come to embrace cloud in more and more aspects of trading technology. Processing large data sets and calculating computationally intensive formulas (or both) are common uses of the cloud. While the market may not be quite ready to move every part of the trading cycle to the cloud, market data delivery via the cloud is becoming more and more mainstream.

Market Data + Cloud Solutions

In fact, somewhat ironically, market data is very fertile “ground” for cloud offerings. Not only are third-party cloud providers continuing to enhance their market data offerings (e.g., Bloomberg,[1] Refinitiv,[2] Xignite[3]), but exchanges are also offering access to data directly via their own cloud services or innovation partners (e.g., CBOE,[4] IEX,[5] Nasdaq[6]). In a post-COVID-19 world, cloud has only become more entrenched in the trading lifecycle across both buy-side and sell-side firms. Even looking back to views from 2019, the growing importance of cloud servicing market data needs is clear.

In fact, almost three-quarters of respondents in our 2019 Market Data Study[7] identified innovation in market data as highly important, with cloud seen as the second most impactful innovation (trailing only slightly behind artificial intelligence). 


Read entire blog post by Shane Swanson, Senior Analyst, Market Structure and Technology at Greenwich Associates.

09/22/2020

Xignite, Inc., a provider of market data distribution and management solutions for financial services and technology companies, announced today that it recently enhanced its Bond Master API. Xignite offers several APIs that provide real-time, delayed, and historical fixed income pricing and reference data for corporate and agency debt bonds. The Bond Master API enhancement expands coverage from the United States to 190+ countries, adds bond types to support more than 2 million active bond issues, and makes the API easier to use with several new endpoints.

Unlike legacy fixed-income data solutions, Xignite’s Bond Master API is cloud-native and offers a robust selection of use-case-based endpoints. Developers can easily integrate these endpoints into their product or app, regardless of type, amount, or frequency of data, without the need for any complex integration logic. Unlike file-based data delivery solutions, the Bond Master API makes on-demand integration into downstream security master or compliance systems frictionless.
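
As a purely illustrative sketch of how a developer might consume one of these endpoints (the ScreenBonds endpoint described below), here is the general shape of such a request; the host name, path, parameter names and token handling are assumptions for illustration and should be taken from Xignite’s API documentation rather than from this snippet:

    import requests

    response = requests.get(
        "https://<xignite-api-host>/BondMaster/ScreenBonds",  # assumed path
        params={
            "CouponRateMin": "2.0",            # assumed parameter names
            "MaturityDateFrom": "2030-01-01",
            "Callable": "false",
            "_token": "<your-api-token>",      # assumed auth mechanism
        },
        timeout=10,
    )
    bonds = response.json()  # normalized JSON, ready for downstream systems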

Additional detail on the enhanced Bond Master endpoints:

  • The List endpoint for bond type, issuer type, and domicile enables clients to slice and dice the bond universe differently based on use-case.
  • The ScreenBonds endpoint enables clients to dynamically and easily screen the bond universe by combining criteria based on the coupon rate, maturity date, callability, and issue convertibility.
  • The ListBondDataPoints and GetBondDataPoints endpoints enable clients to more easily pick and choose the reference data points they need to integrate into their systems.
  • The GetBondDataPoints endpoint enables access to additional reference data points without requiring changes to an existing implementation.

“Because much of the benefit of a reference data service derives from its breadth, depth and quality of coverage, these enhancements give you the added peace of mind that comes from knowing your holdings are validated against a complete universe,” said Vijay Choudhary, Vice President, Product Management, Market Data Solutions at Xignite. “These enhancements eliminate the need to maintain an on-site bond security master, which ultimately saves our clients time and eliminates significant unnecessary expenses.”

Additional bond issuer types now include: Government Agency, Government Controlled Company, State Government, Supranational

Additional new bond types now include: Bankers Acceptance, Capital Securities, Cash Management Bill, Certificate, Certificate of Deposit, Commercial Paper, Covered Bond, Debenture, Depository Receipt, Discount Notes, Loan Note, Loan Stock, Medium Term Notes, Note, Permanent Interest Bearing Shares, Preferential Security, Preferred Security, Reference Bills, Structured Product, Strip Package, Treasury Bill

Additional reference data points are also now available for all bond types:

  • Issue instrument identifiers (CUSIP, ISIN, Symbol, etc.)
  • Bond Issuer details including issuer name, domicile, unique company identifier, issuer status, industry and sector
  • Bond Issue details including maturity, coupon, coupon type, par value, dated date, distribution and amortization details, day count convention, original issue details, liquidation right, callable, convertible, guarantor, redemption, and other issue details

This is just the latest example of Xignite’s ability to innovate. Earlier this year, the firm unveiled its suite of market data management microservices and also received a patent for its market alerts technology.

09/16/2020