Reference Data Management in Financial Services
Increased global regulatory pressure, coupled with a fragmented regulatory landscape, is making financial institutions realize the value of putting a data governance strategy in place. With effective reference data management (RDM), companies can facilitate a seamless flow of clean, consolidated, and accurate data throughout the enterprise. By doing so, they realize more value: they save millions of dollars every year, dramatically improve the value chain, augment efficiencies, manage risk effectively, improve customer loyalty, and support sound corporate governance.
The Case for Reference Data Management (RDM)
Even as financial institutions, exchanges, and market participants undergo a fundamental transformation, data management is becoming increasingly challenging. In this context, it is extremely important to manage the creation and maintenance of data to ensure its relevance and to mitigate the risks arising from data inconsistency. Data accuracy and reliability are mission-critical and a key enabler for all business operations, including trade execution, risk management, and compliance reporting.
Data management is the development and execution of architectures, policies, practices, and procedures to manage the information lifecycle needs of an enterprise in an effective manner. Effective data management calls for seamless integration between all elements of the overall data management lifecycle: strategy, governance, operations, review, analysis, and actions.
In most financial institutions, data is spread across multiple regions, departments, and systems. Many of these entities have to reference data pertaining to the parent company, but they are unable to do so with ease when there is no central source of data. Instead, each entity maintains its own nomenclature and data sources in silos, with redundant systems designed to extract and process data for individual requirements. Apart from being an inefficient design, this is extremely costly and prone to data inconsistency.
RDM addresses all of these issues. It is a methodology for managing the creation and maintenance of data that can be shared across multiple regions, departments, and systems. It collates data from multiple sources, normalizes it into a standard format, validates it for accuracy, and consolidates it into a single, consistent copy for distribution.
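The collate-normalize-validate-consolidate flow described above can be sketched in a few lines of Python. The field names, aliases, and source-priority rule below are illustrative assumptions, not part of any specific vendor feed.

```python
def normalize(record: dict) -> dict:
    """Map source-specific field names onto a standard schema (illustrative aliases)."""
    aliases = {"Ticker": "symbol", "sym": "symbol", "Curr": "currency", "ccy": "currency"}
    return {aliases.get(k, k).lower(): v for k, v in record.items()}

def validate(record: dict) -> bool:
    """Accept only records that carry the mandatory attributes."""
    return all(record.get(f) for f in ("symbol", "currency"))

def consolidate(sources: list[list[dict]]) -> dict[str, dict]:
    """Merge validated records into one consistent copy, keyed by symbol.
    Later sources override earlier ones, mimicking a source-priority rule."""
    golden: dict[str, dict] = {}
    for source in sources:
        for raw in source:
            rec = normalize(raw)
            if validate(rec):
                golden[rec["symbol"]] = rec
    return golden

# Two hypothetical feeds describing the same instrument in different formats
feed_a = [{"Ticker": "IBM", "Curr": "USD"}]
feed_b = [{"sym": "IBM", "ccy": "USD", "name": "Intl Business Machines"}]
golden = consolidate([feed_a, feed_b])
```

The single `golden` dictionary stands in for the consolidated copy that would be distributed across the enterprise.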
This white paper analyses the need for reference data management in the financial services industry and delves into the challenges that arise in the absence of effective RDM, the critical elements of an RDM implementation, and some of the major benefits an organization can derive by implementing a robust RDM solution.
Why Data Management Cannot Be Ignored Anymore
Fundamental changes in the financial services industry have created a significant impact on data management platforms. Some of the key drivers of change are:
Diverse Instruments: In the quest to offer compelling products to customers, brokers/dealers have created many innovative financial instruments. Currently, there are more than eight million instruments, each requiring a firm to maintain detailed, timely, and accurate information. Derivative issues are only one example of financially engineered securities that did not exist a few years ago. These new financial products and their complex terms have become a challenge for executives managing financial information.
Changes in Market Mechanism: Trade execution mechanisms have been altered by the shifting composition of market participants. For example, there has been a rapid increase in the number of hedge funds and the emergence of mega “buy-side” firms, many of which use program trading and other algorithmic execution models. Decimalization and program trading have led to a reduction in trade size with a corresponding increase in volume. These factors have put a strain on data management platforms, as they are required to deliver high volumes of data with low latency to black-box trading systems.
Regulations and Compliance: Regulation and compliance are also key drivers in the march toward an improved data management platform. The emergence of Basel III, Sarbanes-Oxley, and other key risk and compliance considerations has forced firms to place high priority on the production of accurate and timely data to feed internal risk management systems. As a result, institutions must now meet a more stringent fiduciary responsibility to provide correct data to regulatory agencies. Faulty information can result in dire consequences and catastrophic financial exposure.
Expanding Role of Data Aggregators: The industry’s demand for a wide range of security attributes and pricing information has given rise to an entire sub-industry populated by vendors who specialize in financial data capture and distribution. These vendors are playing an increasingly significant role in managing and providing data. However, managing multiple sources of data creates cost and consistency issues that must be fixed.
Data Classification: Recognize, Categorize, Then Analyze
Data is not a homogeneous entity. It consists of different categories, each with its own set of characteristics, and these categories may have strong dependencies on one another. Failure to recognize these differences is risky: projects that do not address the unique nature of each data category will invariably encounter problems and are likely to fail.
Financial services data can be categorized into the following types:
- Transaction Activity Data: It represents the transactions that operational systems are designed to automate.
- Transaction Audit Data: It is the data that tracks the progress of an individual transaction such as web logs and database logs.
- Enterprise Structure Data: This is the data that represents the structure of an enterprise, particularly for reporting business activity by responsibility. It includes organizational structure and charts of accounts.
- Master Data: Master data represents the parties to an enterprise's transactions and describes their interactions when a transaction occurs.
- Reference Data: Reference data is any kind of data that is used solely to categorize other data found in a database, or solely for relating data in a database to information beyond the boundaries of the enterprise. In financial services, it includes descriptive information about securities, corporations, and individuals.
- Market Data: In financial services, market data refers to real-time or historical information about prices.
- Derived Data: Derived data is calculated from other data by various calculators and models and made available to a wide range of applications.
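As a minimal illustration of reference data in the sense defined above, the sketch below uses a currency reference table (an invented, truncated example) solely to categorize transaction activity records:

```python
# Reference data: a small currency code table (illustrative, not complete)
CURRENCY_REF = {"USD": "United States dollar", "EUR": "euro", "JPY": "Japanese yen"}

# Transaction activity data: hypothetical trade records
trades = [
    {"id": 1, "notional": 1_000_000, "ccy": "USD"},
    {"id": 2, "notional": 250_000, "ccy": "XXX"},  # unknown currency code
]

# The reference table categorizes each trade as valid or exceptional
valid, exceptions = [], []
for trade in trades:
    (valid if trade["ccy"] in CURRENCY_REF else exceptions).append(trade)
```

The trade records themselves never change; the reference table exists only to classify them, which is what distinguishes reference data from the other categories listed above.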
The Many Challenges of RDM in Financial Services
Improving data quality is an ongoing effort, and financial institutions face the challenge of improving their technology infrastructure to address it. Reference data management projects are major technology investments aimed at improving data quality. Data integration and the concept of a single source remain a massive challenge, especially in Asia-Pacific (APAC) banks, where data is still managed in silos.
An increasing volume of data means working with multiple data sources. Client data and the single view of the customer is a critical area, driven by regulations such as Anti-Money Laundering (AML) and Know Your Customer (KYC).
Historically, firms have maintained, built, and managed their own security and client master databases in isolation from other market participants. As these organizations expanded, organically or through acquisition, data silos matching each line of business emerged. Most of these data platforms are similar in style and content within and across firms. Typically, they are maintained through a combination of automated data feeds from external vendors, internal applications, and manual entries and adjustments. It is not uncommon for these platforms to contain aging infrastructure and disparate, highly decentralized data stores.
Some of the common challenges financial institutions face in reference data management are:
- Exponential increase in asset classes, new securities, and volume
- Duplicate data vendor purchases, expensive manual data cleansing, and poor data management, leading to high aggregate costs
- Management of multiple securities masters, repositories, and different sources of asset classes across different geographical markets
- Prevalence of different identifiers, such as the Committee on Uniform Securities Identification Procedures (CUSIP) number, the International Securities Identification Number (ISIN), the Stock Exchange Daily Official List (SEDOL) number, and internal identifiers used by front- and mid-offices
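These identifiers overlap in structure: for a US security, the ISIN embeds the 9-character CUSIP between a "US" country prefix and a check digit computed with the standard Luhn rule (letters map to 10–35). A small sketch of that derivation:

```python
def isin_from_cusip(cusip: str, country: str = "US") -> str:
    """Build an ISIN from a 9-character CUSIP by appending the Luhn check digit."""
    body = country + cusip
    # Expand letters to their numeric values: A -> 10, ..., Z -> 35
    digits = "".join(str(int(c, 36)) for c in body)
    total = 0
    for i, d in enumerate(reversed(digits)):
        n = int(d)
        if i % 2 == 0:                 # double every other digit, rightmost first
            n *= 2
            n = n - 9 if n > 9 else n  # equivalent to summing the two digits
        total += n
    check = (10 - total % 10) % 10
    return body + str(check)

print(isin_from_cusip("037833100"))  # Apple's CUSIP -> US0378331005
```

A cross-reference table between CUSIP, ISIN, SEDOL, and internal identifiers is typically what a securities master maintains; the function above only covers the mechanical CUSIP-to-ISIN case.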
Get a Grip on Reference Data Management
NIIT Technologies deploys new methodologies, proprietary software, and tools from industry-leading software vendors to tackle reference data management challenges. Many third-party product providers focus on specific elements in the chain of reference data management without a holistic view of the complexities surrounding the entire lifecycle of reference data. Our RDM process focuses on these complexities and is divided into four critical stages:
Data Acquisition
- Data is acquired via robust market-facing interfaces such as Bloomberg, Reuters, and JJ Kenney
- Data is continuously updated and monitored, which is critical for successful acquisition

Data Validation and Mapping
- Validation is automated via rule engines, with exception management and manual support where data mapping requires it

Data Enrichment and Transformation
- Reference data is enriched and standardized
- A golden copy of the data is created for instrument pricing

Data Distribution
- Golden data is distributed to external third-party systems
- An audit trail and action tracking are maintained, which is extremely important at this stage
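The validation stage above, with rules running automatically, exceptions routed to manual support, and every action audited, can be sketched as follows; the rule definitions and record fields are illustrative assumptions:

```python
from datetime import datetime, timezone

# Illustrative validation rules: (name, predicate) pairs a rule engine might hold
RULES = [
    ("has_identifier", lambda r: bool(r.get("isin"))),
    ("positive_price", lambda r: r.get("price", 0) > 0),
]

audit_trail = []  # every check is recorded for audit and action tracking

def run_rules(record: dict) -> list[str]:
    """Return the names of failed rules; log the outcome for audit."""
    failures = [name for name, rule in RULES if not rule(record)]
    audit_trail.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "record": record.get("isin", "<unknown>"),
        "failed": failures,
    })
    return failures

clean, exceptions = [], []
for rec in [{"isin": "US0378331005", "price": 189.5}, {"price": -1}]:
    (exceptions if run_rules(rec) else clean).append(rec)
```

Records that fail any rule land in the exception queue for manual data mapping, while the audit trail preserves who-failed-what for later review.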
The Road to Effective RDM: One Step at a Time
Based on the fundamental components of the data lifecycle, we have developed a comprehensive solution for end-to-end reference data management. Our reference data management solution enables firms to manage the entire reference data environment, from vendor data rationalization to enterprise reference data architecture design and integration, and from indexing to automated data cleansing and distribution.
Our reference data management offering includes the following elements:
- Reference Data Rationalization: Creates a cross-reference of each data element and rationalizes reference data spend by identifying duplicate purchases.
- Enterprise Data Architecture Assessment and Package Implementations: Evaluates the current architecture, aligns it with future growth plans, and identifies constraints for the enterprise reference data architecture.
- Index and Normalize Securities Data: Uses a set of industry-standard tools to create a consistent, single enterprise-wide key matrix for all securities.
- Automated Data Cleansing System: Supports rule-based commercial reference data cleansing systems to process reference data.
- Data Validation and Mapping: Automates data mapping and data validation based on a rules engine, preventing automatic overrides.
- Corporate Actions Processing: Helps maintain security reference data by automatically applying corporate actions, with manual support for complex electives.
- New Securities Setup: Enables continuous monitoring of security masters and sets up new securities on demand.
- Enterprise Reference Data Distribution: Enables BOCADE (buy once, clean and distribute everywhere) reference data distribution across the enterprise and builds audit capability for price requests.
- Instrument Pricing: Provides timely and accurate instrument pricing data to bankers and financial advisors.
- Reference Data Efficiency Dashboard: Makes the RDMS black box transparent by monitoring reference data consumption, quality, and cleansing status.
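The BOCADE idea in the list above can be sketched as a single cleansed store that serves every downstream consumer and logs each price request for audit; the class and consumer names are illustrative, not part of any actual product:

```python
class GoldenStore:
    """A single cleansed price store: buy once, clean, and distribute everywhere."""

    def __init__(self) -> None:
        self._prices: dict[str, float] = {}
        self.request_log: list[tuple[str, str]] = []  # (consumer, isin) pairs

    def load(self, isin: str, price: float) -> None:
        """Store the one cleansed copy of a price, bought and validated once."""
        self._prices[isin] = price

    def get_price(self, consumer: str, isin: str) -> float:
        """Distribute to any consumer, auditing who requested which price."""
        self.request_log.append((consumer, isin))
        return self._prices[isin]

store = GoldenStore()
store.load("US0378331005", 189.5)
p1 = store.get_price("risk-engine", "US0378331005")
p2 = store.get_price("trading-desk", "US0378331005")
```

Every consumer sees the same value because there is only one copy, and the request log provides the audit capability for price requests that the distribution element calls for.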
The NIIT Technologies Thought Board
The Holy Grail of Data Singularity
Financial services organizations deal with numerous financial instruments, ranging from stocks and funds to derivatives, to meet the ever-increasing demands of the global securities marketplace. They need to tackle a huge amount of data to trade and keep track of these instruments.
NIIT Technologies’ Reference Data Management solution helps financial services institutions rationalize the process of reference data consumption. It is designed to consolidate, cleanse, govern, and distribute these key business data objects across the enterprise. It includes pre-defined, extensible data models and access methods with powerful applications to centrally manage the quality and lifecycle of business data. The solution is augmented by our implementation know-how in developing and applying the best data management practices with proven industry knowledge. These strengths have led to a large data ecosystem with numerous specialist partners.
We deliver a single, well-defined, accurate, relevant, complete, and consistent view of the data across multiple regions, departments, and systems. Companies that have implemented our solution are successfully achieving the elusive goal of a consolidated version of data across the enterprise.
More Value at the Heart of Partnerships
As a partner to financial services institutions, NIIT Technologies brings new ideas and more value to every engagement. Our strengths include:
- Strong Industry Focus: NIIT Technologies has several thousand person-years of experience in designing, building, and maintaining large-scale applications for day-to-day business, with considerable experience in front-, mid-, and back-office operations. In the overall satisfaction ratings of the Datamonitor Black Book of Outsourcing 2010 survey, NIIT Technologies ranked number one in Data Management Services. Our team has expertise in tools such as Charles River, Calypso, Advent Moxy, Linedata Longview, MacGregor ITG, Eze Castle, Omgeo, Bloomberg, Reuters, and Yodlee account data gathering.
- Technology Depth: Our offerings span business and technology consulting, application development and management services, IT infrastructure services, and business process outsourcing services. Our services are underpinned by a strong value-optimizing framework with a cost-effective delivery model, which can be used in single-, dual-, or multi-shore formats.
- Mature Best-in-class Process Framework: NIIT Technologies is ISO 27001, CMMi Level 5, and PCM Level 5 accredited. Our resources are therefore well-versed in operating in a highly mature, process-oriented, and secure environment, and they bring this expertise to all client engagements.
- Ability to Scale: With a large resource base of over 5,000 analysts and consultants, NIIT Technologies is able to quickly source professionals with the skill sets required for a project. We also possess the capability to ramp up project resources quickly as and when needed.