News

Is It API or Die for Banks?


By Joy Macknight, The Banker
03.01.16

The banking industry’s use of application programming interfaces (APIs) is rapidly accelerating. According to the ‘Banking APIs: state of the market’ report, jointly published by software developer Axway, API conference APIdays and open-source API store Open Bank Project (OBP) in November, “2015 is seeing a significant cultural shift within banks, and a greater readiness to utilise APIs and related technologies to [help] banks meet current challenges in transforming to a digital landscape”.

Arguably, banks have arrived late to the party – API technology is widespread across a range of other industries. “It is the accepted norm for secure data sharing and embedding functionality in an online environment,” says Maurice Cleaves, CEO of industry trade association Payments UK. Basically, an API allows one software program to ask another to perform a service, accessing specific data in the process.
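In concrete terms, that request is typically a simple HTTP call. Below is a minimal sketch in Python of what such an exchange might look like; the endpoint, token and field names are invented for illustration, not any particular bank’s API.

```python
# Hypothetical example: one program asking a bank's API for an account balance.
import requests

response = requests.get(
    "https://api.examplebank.com/v1/accounts/12345/balance",  # invented endpoint
    headers={"Authorization": "Bearer <access-token>"},       # caller authenticates
    timeout=10,
)
response.raise_for_status()
balance = response.json()  # e.g. {"currency": "GBP", "available": 1042.17}
print(balance["available"])
```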

Banks are beginning, however, to understand the critical role the technology will play in the digital transformation of their business. APIs hold the promise of giving banks more flexibility to improve customer experience, reducing time to market and cutting costs, effectively putting them on a level playing field with financial technology (fintech) firms.

“Banks recognise the value that APIs can bring in terms of efficiency and innovation – many use APIs internally already,” says Mr Cleaves. “They also recognise that many of the new players that they want to work or partner with are built on infrastructures that utilise APIs.”

Benny Boye Johansen, head of Saxo Bank’s API programme, OpenAPI, believes such interfaces are a business imperative for financial institutions. “In the future, the best solutions will involve parties that are able to co-operate through APIs,” he says. “There is going to be a continued evolution of banks having to offer APIs or die, as they will be overtaken by other companies that can interface to create better solutions.”

Overcoming obstacles

Nevertheless, a number of obstacles stand in the way of banks embracing this new paradigm: internal technology and cultural inertia, the lack of a global standard and, above all, data security concerns that must be addressed before banks will feel comfortable opening up access to their technology.

Simon Redfern, the founder of OBP, says: “The biggest challenge is addressing bureaucracy, security and privacy concerns relating to opening banking services to third parties. However, banks are realising they need to open the gates of their walled gardens a little to reveal some innovation gems on the other side.”

Vikram Gupta, vice-president of product strategy and development for IT solutions firm Oracle Financial Services, agrees. Over the past six months he has seen a sea change among the banks he speaks to. He says: “As the tablet and mobile world takes off, banks are looking to expose some services to third parties that can help them go to market faster, because it takes longer to develop them in-house.”

The ‘Banking APIs: state of the market’ report confirms these observations. Despite banks’ current concentration on internal APIs, the research predicts a budding trend towards public APIs in the near future. A majority of those surveyed believed in an open platform future for banks, in which 60% of a bank’s APIs would be made available to partners and third-party providers.

Listening to legislation

Emerging regulations, such as the EU’s Payment Services Directive 2 (PSD2), have helped in opening up the banking sector to API developments. The revised directive, which will come into force from January 13, 2018, includes access to accounts whereby banks are required to offer an API to third parties under the supervision of the European Banking Authority (EBA). The EBA is expected to finalise its regulatory technical standards and release a consultation document in the second quarter of 2016.

“The PSD2 legislation has put open APIs firmly on the banks’ agenda,” says Alex Mifsud, CEO of Ixaris, a cloud-based open payments platform. In June 2015, Ixaris launched the first phase of the EU-funded open payments ecosystem project, which allows banks to expose some services and co-operate with developers in a safe environment, says Mr Mifsud.

Not waiting for the EBA’s lead, the UK’s Open Banking Working Group (OBWG) released its recommendations for the design and delivery of an open banking standard at the beginning of February 2016. The framework provides guidance on how open financial data should be “created, shared and used by its owners and those who access it”. A basic standard will be launched by the end of 2016, with personal customer transaction data added on a read-only basis at the start of 2017 and full business, customer and transactional data included by 2019.

Mr Cleaves, who participated in the OBWG, says the experience was a “positive one”, showing that “banks, fintechs and government can work together positively to help to advance these important issues”.

It is noteworthy that the fintech community got a place at the table, demonstrating a common acceptance that they are important constituents of the financial ecosystem and are here to stay.

Rising to the challenge

It is hardly surprising that smaller or 'challenger' banks are more ready to take a leap of faith with open APIs than many of the larger incumbent banks. The former are actively exploring new business models and do not face the same legacy issues.

For example, Munich-based digital lender Fidor Bank is grounding its whole business strategy on open APIs. In order to be flexible and scalable, the Fidor team built FidorOS, a digital middleware with open APIs. “Open APIs play an active, strategic and crucial role in our infrastructure because in addition to our own retail and business banking products, we also offer ‘no-stack banking’ to non-banks, retailers or challenger banks,” says Matthias Kröner, CEO of Fidor.

Other organisations can connect to Fidor Bank’s technology stack and offer full banking services to their customers without a proprietary banking licence or infrastructure. The bank is hoping to announce its initial partners in the first half of this year, and this will “shake up the market”, according to Mr Kröner, who declined to provide further details at this time.

Copenhagen-based Saxo Bank made a pivot towards openness during a revamp of its trading platform strategy. In September 2015, it launched an additional channel called OpenAPI. The open strategy will facilitate a broader white-label business, which supports its mission as a market facilitator.

“By having an open API we make it possible for any third party to not just accept our trading platform, but also make it a part of their own business and develop ways to present the trading experience in a way that fits their target market better,” says Mr Johansen. Today, Saxo Bank has two institutional clients on OpenAPI, with more in the pipeline.

Logical steps

Silicon Valley Bank (SVB) also began migrating to open APIs from a legacy infrastructure position. Megan Minich, the head of product and channel delivery, believes this was a logical step given its innovative corporate customer base. The bank’s open API platform went live in a private beta at the end of last year and the bank is onboarding more clients every week.

SVB began with a specific card payment product, for which it already had an API solution via a partner but knew it could enhance both the experience and the API. The bank is currently adding new features and plans a broader beta by the end of the first quarter of 2016, as well as launching additional products throughout the year. “We take a similar approach to all our digital delivery – get something out, test it and iterate,” says Ms Minich.

In addition, SVB plans to apply the APIs it builds for clients within the bank. Ms Minich explains: “While it is important to think of APIs as a client-facing tool, we also want to leverage the flexibility, ways of collaborating and gathering data to use internally when building out our front-end user experiences, such as online and mobile.”

In India, Yes Bank’s decision to embrace APIs was prompted by the central bank’s decision to issue new payment bank licences in August 2015. “To sustain and grow our current share of the payment business, we needed to proactively read our customers’ minds to cater [to] their needs,” says Yes Bank's chief information officer Anup Purohit.

Additionally, the private sector bank wanted to improve the stickiness of corporate clients. “Our immediate priority was neither to charge customers for API banking nor [to chase] returns on investments, but to transform the bank into an agile and flexible service provider and bring a ‘wow’ [factor] to customers by making their lives simpler,” he adds.

The first API published on its platform was a funds transfer, which is the most critical and frequently used product, according to Mr Purohit. Initially the bank targeted e-commerce vendors, such as Snapdeal, a large Indian online marketplace. It was able to make Snapdeal’s instant refund API available in three weeks and went live with it in September. The bank is now targeting corporates with numerous distributors, for example, pharmaceutical companies.

Today, every application that comes into the bank must have APIs. Similar to Ms Minich, Mr Purohit is also rolling out internal APIs to create a flexible banking infrastructure.

Opening up the large banks

The smaller banks are not the only ones focused on open APIs. For example, in addition to ongoing initiatives on the retail side, Citi is rolling out APIs in the institutional space, according to Hubert JP Jolly, global head of channel and enterprise services for Citi treasury and trade solutions. Citi’s institutional clients are driving more sales through e-commerce sites and are publishing APIs to advise the bank on how to process transactions, typically collections, from their websites.

Plus, Citi’s innovation lab in Dublin is working on APIs to enable treasury workstation providers to connect directly with the bank. To date, the bank has released APIs around payment initiation, payment status and account balance – the main services clients want to access from a treasury workstation.

Opening up APIs is essential to Spain-based BBVA’s digital strategy, says Shamir Karkal, the co-founder of US-based lender Simple who is now heading up BBVA’s open API unit. “Opening up some of our core assets allows us to support the financial services innovators of tomorrow, puts us at the heart of the positive change that’s under way and offers us a chance to grow rapidly,” he says.

The bank’s open API roadmap is in its early stages, with a small-scale alpha trial currently under way in Spain under the name API_Market. “We are focused on where the greatest innovation is happening in the market, where the greatest demand is and what our customers are doing,” says Mr Karkal. “Given that security of customers’ data is paramount, we’re giving developers access to a sandbox with dummy data and building a system to gradually give them more access.”

Tips for success

While the examples illustrate diverse drivers, approaches and strategies, the common themes are: start small, pick a use case that delivers value for clients and use APIs to digitally transform the internal organisation. “Banks are suited to a progressive modernisation journey,” says Paul Leadbetter, newly appointed group vice-president and chief technologist at Oracle Financial Services. “Choose something that is easy to consume, can be delivered in a reasonable amount of time and brings value back to the bank.”

Hackathons are a low-cost way for banks to leverage APIs in a controlled environment, according to Mr Redfern. “For example, a bank could do a hackathon around PSD2 APIs, which is addressing a pressing regulatory directive,” he suggests. “Another avenue is to deploy API sandboxes that serve test data.” OBP recently launched a sandbox to support the UK's open banking standard initiative and plans to launch a similar sandbox to help banks fine-tune their APIs for PSD2.
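As a sketch of what such a sandbox exercise might involve, the Python snippet below queries a hypothetical sandbox that serves only dummy account data, as a hackathon team might during a PSD2-style exercise. The base URL and response fields are assumptions, not those of OBP or any real bank.

```python
# Hypothetical sandbox call: all data returned is dummy test data.
import requests

SANDBOX = "https://sandbox.examplebank.com/v1"  # assumed sandbox base URL

accounts = requests.get(
    f"{SANDBOX}/accounts",
    headers={"Authorization": "Bearer <sandbox-token>"},  # sandbox credentials
    timeout=10,
).json()

# Iterate over the dummy accounts the sandbox exposes.
for account in accounts["accounts"]:
    print(account["id"], account["balance"])
```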

Chae An, chief technology officer and vice-president for financial services at IBM, stresses the need for good governance from the outset. “This isn’t a one-month project where a bank develops a set of APIs and then calls it quits – it is a long-term journey,” he says. “As such it is important to have good governance around API creation, prioritisation and ensuring that there is re-use.”

He adds: “Plus ensure you have the right skills – whether internal or external – which understand the bank’s infrastructure and emerging tools.” IBM is in talks with some clients regarding an API factory, where it would augment their staff for creating APIs based on business needs.

Stephane Dubois, CEO at Xignite, which served more than 50 billion API requests from its market data cloud platform in July 2015, warns against banks simply taking existing internal systems, formats and processes and exposing them as APIs. “If they do that, they will see zero adoption,” he says. “APIs must be simple to succeed. Banks must put themselves in the mind of the developer who is going to build an app and has just three days to do it.”

Ms Minich agrees. “Learn to think differently and don’t do APIs the same way as everything before. Banks must begin thinking as technology companies,” she says. 

Source: The Banker

RECENT NEWS

Read the article on A-Team Insight Blog

By Mike O’Hara, Special Correspondent

Cloud-delivered market data was once ‘over my dead body’ territory for institutional market data managers, who understandably fretted aloud about performance, security and licence compliance issues. But Covid-19 has forced those same data managers to confront the fact that many of their professional market data users are able to work from home (WFH), in turn driving financial firms to question whether the pandemic could be the catalyst for a rethink of their expensive-to-maintain market data infrastructures, with cloud part of the data delivery solution.

For many financial firms, today’s cloud delivery and hosting capabilities offer a viable solution for supporting trading and investment teams and their support staff, accelerating demand for cloud-based market data delivery infrastructures. The thinking is that cloud may help firms with their broader aim of reducing their on-premises technology and equipment footprint, a trend that was emerging even before the Coronavirus struck.

But embracing cloud delivery introduces new challenges for market data and trading technology professionals. While WFH will doubtless continue in some form, it’s far from clear that all market data delivery can be migrated to the cloud. Essential market data functions will remain on-premises. High-performance trading applications and low-latency market data connectivity, for example, will continue to rely on state-of-the-art colocation and proximity hosting data centres.

For many financial institutions, the challenge will be how to manage these several tiers of market data delivery and consumption. Going forward, practitioners will face a three-way hybrid of on-premises, cloud-based (private/public) and colocated market data services in order to support a range of users: from work-from-home traders and support staff, to trading-room-based traders, analysts and quants, to colocated electronic applications like algorithms, smart order routers and FIX engines.

Indeed, A-Team will be discussing the infrastructure, connectivity and market data delivery challenges associated with cloud adoption in a webinar panel session on November 3. The webinar will offer a ‘reality check’ that discusses best practices for embracing cloud, colo and on-prem to support this new mix of user types, with emphasis on capacity, orchestration, licensing, entitlements and system/usage monitoring.

With firms’ appetite for exploring the potential of the cloud piqued, data managers are now looking at whether they can hope to take advantage of some of the more widely recognised benefits of the cloud – flexibility, agility, speed-to-market, scalability, elasticity, interoperability and so on – as they grapple with the future market data delivery landscape.

“Market data infrastructure, in terms of data vendor contracts, servers and data centre space, typically represents a large, lumpy capex expenditure”, says independent consultant Nick Morrison. “And so having the ability to transition that to something with costs that are more elastic is highly attractive”.

Of course, every firm has its own unique requirements and nuances in this regard. Proprietary trading firms, asset managers, hedge funds, brokers and investment banks are all heavy consumers of market data. But the volume, breadth, depth and speed of the data they need in order to operate is highly diverse. Which means that there is no ‘one size fits all’ when it comes to sourcing and distribution mechanisms (including the cloud).

Market data and the cloud – what’s applicable?

As they consider their options for including cloud in their overall data delivery plans, data managers need to assess whether and how specific data types could be migrated to a hybrid environment: level 1 (best bid/offer), level 2 (order book with aggregated depth at each price level) or level 3 (full order book)? Historic, end-of-day, delayed or real-time? Streaming or on-demand? All of this has a bearing on the feasibility of cloud as a delivery mechanism.
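To make the distinction concrete, the sketch below shows illustrative Python shapes for level 1 and level 2 data; real feed schemas vary by venue and vendor, so these structures are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Level1Quote:
    """Best bid/offer only - the lightest payload."""
    symbol: str
    bid: float
    ask: float

@dataclass
class Level2Book:
    """Aggregated depth: total size at each price level."""
    symbol: str
    bids: list[tuple[float, int]]  # (price, aggregate size), best first
    asks: list[tuple[float, int]]

# Level 3 would carry every individual order, so payload sizes - and the
# feasibility of cloud delivery - differ sharply between the three levels.
```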

Firms also need to consider what mix of public, private or hybrid cloud best fits their needs. What about virtualisation? Or internal use of cloud architecture, such as building a market data infrastructure around microservices and containers?

The marketplace has already identified at least one workable use-case: the use of historical, tick or time-series market data, usually to drive some form of analytics. A growing number of trading venues (such as ICE and CME) and service providers (Refinitiv, BMLL and others) now offer full level 3 tick data on a T+1 basis, delivered via the cloud. Plenty more providers can offer historic level 1 and 2 data.

This kind of capability can be used for critical use-cases such as back-testing trading models for signal generation and alpha capture, performing transaction cost analysis (TCA), developing and testing smart order routers (SORs), or fine-tuning trading algos to better source liquidity. In all of these cases, cloud-hosted historical tick databases can reduce on-premises footprint and cost while offering flexible access to vast computing resources on demand – a combination many firms find compelling. “When churning through such vast quantities of data, having access to a cloud environment enables you to scale up horizontally to process that data”, says Elliot Banks, Chief Product Officer at BMLL.
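A hedged sketch of this back-testing pattern in Python: pull a day of T+1 tick data from cloud object storage and compute a simple statistic. The bucket, file layout and column names are assumptions rather than any specific provider’s offering (reading from s3:// with pandas also requires the s3fs package).

```python
import pandas as pd

# Load one symbol-day of tick data from a hypothetical cloud tick store.
ticks = pd.read_parquet("s3://example-tick-store/AAPL/2020-09-15.parquet")

# Volume-weighted average price over the day's trades - the kind of
# aggregate a back-test or TCA job would compute at scale.
vwap = (ticks["price"] * ticks["size"]).sum() / ticks["size"].sum()
print(f"VWAP: {vwap:.4f}")
```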

Where things start to get more complicated, though, is with real-time market data, where two of the biggest hurdles from a cloud delivery perspective are speed and complexity.

Deterministic speed

From a trading standpoint, speed is always going to be a significant factor. Nobody, regardless of whether they’re an ultra-low latency high-frequency trading firm or a human trader dealing from a vendor or broker screen, wants to trade on stale prices. The tolerances may be different but the principle applies across the board.

It’s a safe bet that any firm currently receiving market data directly from a trading venue into a trading server (colocated at the venue’s data centre or hosted at a specialised proximity hosting centre operated by the likes of Interxion) relies on deterministic low latency, and is therefore unlikely to consider cloud as an alternative delivery mechanism.

Clearly, HFT firms with trading platforms that require microsecond-level data delivery won’t be replacing their direct exchange feeds and often hardware-accelerated infrastructure with the cloud, as the performance just isn’t there, for now at least. This, of course, could change if and when the trading venues themselves migrate to cloud platforms, creating a new kind of colocation environment, but that’s likely some way off. “But these guys only have a few applications that really need ultra-low latency data”, says Bill Fenick, VP Enterprise at Interxion. “Most of their applications, be they middle office, settlements or risk, they’re perfectly happy to take low-millisecond latency”.

And what about other market participants? Particularly those that currently make use of consolidated feeds from market data vendors, where speed is perhaps a secondary consideration? This is where cloud delivery may have some real potential. But it’s also where the issue of complexity rears its head.

Navigating the complexity

To deal with the myriad sources, delivery frequencies, formats and vendor connections used to feed real-time market data into their trading, risk, pricing and analytics systems, many financial firms have built up a complex mesh of infrastructure that ensures the right data gets delivered to the right place at the right time. The integration layer required to handle these data inputs may be delivered as part of the data service or may stand alone as a discrete entity. In either case, it’s unrealistic to expect that all of this infrastructure can just be stripped out and replicated in a cloud environment.

To address this challenge, some service providers are starting to offer solutions where the source of the data is decoupled from the distribution mechanism, aiming for the holy grail where either, or both, can be cloud-based.

By building individual cloud-hosted microservices for sourcing market data, processing that data in a variety of ways, and delivering it into end-user applications, such solutions can help firms migrate their market data infrastructure incrementally from legacy to cloud-based platforms. Refinitiv is starting to shift much of its infrastructure onto AWS, and other specialist cloud-centric vendors such as Xignite and BCC Group also enable internal systems to be decoupled from data sources, thus facilitating a shift towards cloud-based infrastructure. “We believe the customer should be able to easily move from source to source and get as many sources as they want. The cloud enables this kind of flexibility”, says Bill Bierds, President & Chief Business Development Officer at BCC Group.
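The decoupling idea can be sketched simply: internal consumers code against a neutral interface, and vendor-specific sources plug in behind it. The Python below is illustrative only; the class and method names are invented, not any vendor’s actual SDK.

```python
from abc import ABC, abstractmethod

class MarketDataSource(ABC):
    """Neutral interface that hides vendor-specific feeds."""
    @abstractmethod
    def get_quote(self, symbol: str) -> dict:
        """Return the latest quote in a normalised shape, e.g. {"last": ...}."""

class VendorASource(MarketDataSource):
    def get_quote(self, symbol: str) -> dict:
        raise NotImplementedError  # stub: call vendor A's feed, map the fields

class VendorBSource(MarketDataSource):
    def get_quote(self, symbol: str) -> dict:
        raise NotImplementedError  # stub: call vendor B's feed, map the fields

def last_price(source: MarketDataSource, symbol: str) -> float:
    # Downstream code never touches vendor specifics, so the source can be
    # swapped - or moved to the cloud - without changing this function.
    return source.get_quote(symbol)["last"]
```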

Firms have long wanted to become more vendor-agnostic by decoupling their data integration capability from the primary data source. One investment bank in London, for example, was able to decouple Refinitiv’s TREP platform from its Elektron data feed and switch to Bloomberg’s B-Pipe for its data, delivered via the TREP framework. From a market data perspective, this has given the bank more negotiating power and less vendor lock-in, opening up greater opportunities to utilise cloud-based market data sources in the future.

Permissioning and entitlements

Perhaps one of the toughest challenges that firms face around real-time market data on the cloud is that of entitlements and usage authorisation. Firms sourcing data from the two main data vendors, Refinitiv and Bloomberg, will generally be tied into their respective DACS and EMRS entitlements systems, often augmented by data inventory and contract management platforms like MDSL’s MDM or TRG Screen’s FITS and InfoMatch.

Entitlements can be a thorny subject when it comes to cloud-based distribution of market data. Firms are wary of falling foul of their licence agreements with their various data vendors, all of whom have different commercial considerations and penalties for non-compliance. This is why accurate tracking and reporting of market data access and usage is crucial.

The cloud can be a double-edged sword in this regard. On the one hand, transitioning from a dedicated infrastructure to the cloud might trigger extra licensing costs for what is effectively an additional data centre, so firms may need to go through a period of paying twice for the same data. Indeed, firms may already be facing this situation as they entitle staff to operate from home while holding enterprise licences covering only their headquarters and regional offices.

On the other hand, cloud-based services such as those offered by Xignite and others can make it easier for firms to manage entitlements across multiple data vendors from a central source via a UI. “Our entitlements microservice is integrated with our real time microservice, to make sure that any distribution and any consumption of data is authenticated and entitled properly, so that only the right users have access to the data,” says Stephane Dubois, CEO of Xignite, whose microservices suite is supporting NICE Actimize’s cloud-based market data delivery infrastructure.
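The pattern Mr Dubois describes can be reduced to a simple gate: every request for data passes an entitlement check before anything is served. The Python sketch below invents a toy entitlement store purely to illustrate the flow.

```python
# Toy entitlement store: which datasets each user may consume (invented data).
ENTITLEMENTS = {
    "alice": {"NYSE_L1", "NASDAQ_L1"},
    "bob": {"NYSE_L1"},
}

def serve_quote(user: str, dataset: str, fetch):
    """Serve data only if the user is entitled to the requested dataset."""
    if dataset not in ENTITLEMENTS.get(user, set()):
        raise PermissionError(f"{user} is not entitled to {dataset}")
    return fetch(dataset)  # only entitled requests reach the data service
```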

Where next?

With new products, services and technologies emerging all the time, firms can be optimistic about the growing opportunities that the cloud can offer for managing market data. One particularly interesting development worth watching is the rise of Low Code Application Platforms (LCAPs), such as that offered by Genesis, which provides a cloud-based microservices framework that can be used for rapidly developing and delivering applications around real-time market data. One example is on-demand margining. “A prime broker can link to all of its customers and know exactly what their risk positions are based on real-time market data, so within minutes, they can be sending out margin calls”, says Felipe Oliviera, Head of Sales and Marketing at Genesis.
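The on-demand margining example can be illustrated in a few lines of Python: mark each client’s positions to live prices and flag any account whose collateral falls short. All field names and numbers here are invented for illustration.

```python
def margin_calls(clients, live_price, margin_rate=0.25):
    """Return (client, shortfall) pairs for accounts below requirement."""
    calls = []
    for name, account in clients.items():
        # Mark the book to market using real-time prices.
        exposure = sum(qty * live_price(sym)
                       for sym, qty in account["positions"].items())
        required = margin_rate * abs(exposure)
        if account["collateral"] < required:
            calls.append((name, required - account["collateral"]))
    return calls
```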

Industry behemoths such as Refinitiv, SIX and FactSet are also embracing the cloud. Refinitiv has now launched delivery of market data via AWS, is making its tick history data available on Google Cloud and has also recently announced a partnership with Microsoft Azure. FactSet has launched a cloud-based ticker plant on Amazon EC2. And SIX is partnering with Xignite for real-time market data delivery via the cloud. Bloomberg is also partnering with AWS to make its B-Pipe data feed available through the cloud. And the main cloud vendors themselves – Amazon, Google and Microsoft – have established dedicated teams to develop these markets.

In conclusion, it’s clear that firms still face a number of challenges when transitioning any part of their market data infrastructure to the cloud. In many cases, particularly where ultra-low latency is required, cloud is not the answer. But equally, by migrating certain elements of their market data infrastructure to the cloud, firms can cut costs, gain efficiencies and potentially do more with less.

10/21/2020

Xignite, Inc., a provider of market data distribution and management solutions for financial services and technology companies, today announced it won the Best Real-Time Market Data Initiative at the Inside Market Data & Inside Reference Data Awards.

A longtime leader in the market data cloud space, Xignite provides financial data through its innovative cloud APIs, which are developer-friendly, reliable and endlessly scalable. Xignite data is normalized and ready-to-use, eliminating common pain points with legacy providers, while maintaining global coverage and institutional quality.

This award recognized Xignite’s work with SoFi, a leading digital personal finance company. In 2019, SoFi launched SoFi Invest, a free consumer investing service, and enlisted Xignite to power the entire platform, from its robo-advisor capabilities to its financial newsfeed, real-time market alerts and curated stock lists. SoFi has identified a number of ways in which these key features drive member engagement – for example, 10% of users who receive a market alert make a trade within an hour.

“We are honored to be recognized for Best Real-Time Market Data Initiative. Xignite was the first to bring market data to the cloud, and we have continued to innovate and point the way to the future of this unique subset of the industry,” said Stephane Dubois, CEO and Founder of Xignite. “The SoFi collaboration is a great example of how a firm can leverage our diverse range of APIs to build an all-encompassing platform and scale it rapidly. As we look to the future, we will continue to serve our clients through transformative offerings, including our suite of Xignite Enterprise Microservices, which we announced in July.”

The Inside Market Data & Inside Reference Data Awards are held by WatersTechnology and recognize industry excellence within market data, reference data and enterprise data management. The award ceremony took place during the publication’s Innovation Exchange held virtually from September 9 to September 22.

This is the latest honor in what has been a fruitful year for Xignite on the awards circuit. In the spring, the firm was named an SIIA CODiE Awards finalist and included on the WealthTech 100 list.

About Xignite
Xignite has been disrupting the financial and market data industry from its Silicon Valley headquarters since 2006 when it introduced the first commercial REST API. Since then, Xignite has been continually refining its technology to help fintech and financial institutions get the most value from their data. Today, more than 700 clients access over 500 cloud-native APIs and leverage a suite of specialized microservices to build efficient and cost-effective enterprise data management solutions. Visit http://www.xignite.com or follow on Twitter @xignite

About the Inside Market Data and Inside Reference Data Awards
The annual Inside Market Data and Inside Reference Data Awards, now in their 17th year, play a key role in WatersTechnology’s awards program, and are the only awards that feature a mix of call-for-entry categories determined by a panel of judges and categories compiled by WatersTechnology’s journalists and voted on by the brand’s readership. This year’s awards featured 32 categories in total: 21 call-for-entry categories, 10 journalist-compiled categories, and a hall of fame (lifetime achievement) award.

09/23/2020

Over time, the market has come to embrace cloud in more and more aspects of trading technology. Processing large data sets and running computationally intense calculations (or both) are common uses of the cloud. While the market may not be quite ready to move every part of the trading cycle to the cloud, cloud-delivered market data is becoming more and more mainstream.

Market Data + Cloud Solutions

In fact, somewhat ironically, market data is very fertile “ground” for cloud offerings. Not only are third-party cloud providers continuing to enhance their market data offerings (e.g., Bloomberg,[1] Refinitiv,[2] Xignite[3]), but exchanges are also offering access to data directly via their own cloud services or innovation partners (e.g., CBOE,[4] IEX,[5] Nasdaq[6]). In a post-COVID-19 world, cloud has only become more entrenched in the trading lifecycle across both buy-side and sell-side firms. Even looking back to views from 2019, the growing importance of cloud in servicing market data needs is clear.

Almost three-quarters of respondents in our 2019 Market Data Study[7] identified innovation in market data as highly important, with cloud seen as the second most impactful innovation (trailing only slightly behind artificial intelligence).


Read the entire blog post by Shane Swanson, Senior Analyst, Market Structure and Technology at Greenwich Associates.

09/22/2020

Xignite, Inc., a provider of market data distribution and management solutions for financial services and technology companies, announced today that it recently enhanced its Bond Master API. Xignite offers several APIs that provide real-time, delayed and historical fixed-income pricing and reference data for corporate and agency bonds. The Bond Master API enhancement expands coverage from the United States to more than 190 countries, adds bond types to support more than 2 million active bond issues, and makes the API easier to use with several new endpoints.

Unlike legacy fixed-income data solutions, Xignite’s Bond Master API is cloud-native and offers a robust selection of use-case-based endpoints. Developers can easily integrate these endpoints into their product or app, regardless of type, amount, or frequency of data, without the need for any complex integration logic. Unlike file-based data delivery solutions, the Bond Master API makes on-demand integration into downstream security master or compliance systems frictionless.

Additional detail on the enhanced Bond Master endpoints, with an illustrative request sketch after the list:

  • The List endpoint for bond type, issuer type, and domicile enables clients to slice and dice the bond universe differently based on use-case.
  • The ScreenBonds endpoint enables clients to dynamically and easily screen the bond universe by combining criteria based on the coupon rate, maturity date, callability, and issue convertibility.
  • The ListBondDataPoints and GetBondDataPoints endpoints enable clients to more easily pick and choose the reference data points they need to integrate into their systems.
  • The GetBondDataPoints endpoint enables access to additional reference data points without requiring changes to an existing implementation.
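As a hypothetical illustration of how a developer might call the ScreenBonds endpoint, consider the Python snippet below. The base URL, parameter names and authentication header are assumptions; the real signatures live in Xignite’s API documentation.

```python
import requests

# Hypothetical ScreenBonds request: filter the bond universe by coupon,
# maturity and callability. Parameter names are invented for illustration.
params = {
    "CouponRateMin": "3.0",
    "MaturityDateMax": "2030-12-31",
    "Callable": "false",
}
resp = requests.get(
    "https://api.example-xignite.com/BondMaster/ScreenBonds",  # assumed URL
    params=params,
    headers={"Authorization": "Bearer <token>"},               # assumed auth
    timeout=10,
)
resp.raise_for_status()
bonds = resp.json()  # matching bond issues, ready for downstream systems
```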

“Because much of the benefit of a reference data service derives from its breadth, depth and quality of coverage, these enhancements give you the added peace of mind that comes from knowing your holdings are validated against a complete universe,” said Vijay Choudhary, Vice President, Product Management, Market Data Solutions at Xignite. “These enhancements eliminate the need to maintain an on-site bond security master, which ultimately saves our clients time and eliminates significant unnecessary expenses.”

Additional bond issuer types now include: Government Agency, Government Controlled Company, State Government, Supranational

Additional new bond types now include: Bankers Acceptance, Capital Securities, Cash Management Bill, Certificate, Certificate of Deposit, Commercial Paper, Covered Bond, Debenture, Depository Receipt, Discount Notes, Loan Note, Loan Stock, Medium Term Notes, Note, Permanent Interest Bearing Shares, Preferential Security, Preferred Security, Reference Bills, Structured Product, Strip Package, Treasury Bill

Additional reference data points are also now available for all bond types:

  • Issue instrument identifiers (CUSIP, ISIN, Symbol, etc.)
  • Bond Issuer details including issuer name, domicile, unique company identifier, issuer status, industry and sector
  • Bond Issue details including maturity, coupon, coupon type, par value, dated date, distribution and amortization details, day count convention, original issue details, liquidation right, callable, convertible, guarantor, redemption, and other issue details

This is just the latest example of Xignite’s ability to innovate. Earlier this year, the firm unveiled its suite of market data management microservices and also received a patent for its market alerts technology.


09/16/2020