Enterprise Data
Management
APRIL 2008 • emii.com
Moving Data Downstream
Downstream data integration becomes less
of an upstream swim.
Managing a Data Overload
Growing data problems drive financial services
firms to reevaluate their information strategies.
From The Publishers of:
Report
Table of Contents
5 Managing a Data Overload
By David Lewis
An ever-increasing volume of data, a good portion of
which contains errors, is driving financial services firms
to reevaluate their information strategies. The biggest
challenge, however, may be getting these firms to buy
into the fact that this is an ongoing process that
needs to be managed properly.
8 Moving Data Downstream
By Stephen Mauzy
Downstream data integration - the goal of which is to get
cleansed, reliable data to the right departments - can be
harder than most financial firms think, especially for an
industry built on stand-alone and legacy back-office sys-
tems. But the goal is definitely achievable through invest-
ments of time and money and proper governance.
12 Tomorrow’s EDM Solutions Today
By Edward McGann
Investors demand real-time, accurate and, in some
cases, enhanced data so they can understand their
positions and performance in their investment portfo-
lios. To meet that demand, BNY Mellon Asset
Servicing employs not only its powerful technology
infrastructure but also its product management and
business units to develop strategies and devise solu-
tions to clients’ requests.
14 A Critical Year for XBRL
By David Lewis
The SEC's effort to make financial reporting interac-
tive through its eXtensible Business Reporting
Language program is on the fast track to adoption,
despite opposition from some corporations and
investors.
16 One Step Ahead in OTC Derivatives
By Gregory Morris
In an industry facing calls for improvements,
The Depository Trust & Clearing Corp already
is making them.
www.emii.com
A Publication of Institutional Investor, Inc.
© Copyright 2008. Institutional Investor, Inc. All rights reserved. New York Publishing offices:
225 Park Avenue South, New York, NY 10003 • 212-224-3800 • www.iinews.com
Copyright notice. No part of this publication may be copied, photocopied or duplicated in any form or
by any means without Institutional Investor’s prior written consent. Copying of this publication is in vio-
lation of the Federal Copyright Law (17 USC 101 et seq.). Violators may be subject to criminal penalties
as well as liability for substantial monetary damages, including statutory damages up to $100,000 per
infringement, costs and attorney’s fees.
The information contained herein is accurate to the best of the publisher’s knowledge; however, the pub-
lisher can accept no responsibility for the accuracy or completeness of such information or for loss or
damage caused by any use thereof.
VINCENT YESENOSKY
Senior Operations Manager
(212) 224-3057
DAVID SILVA
Senior Fulfillment Manager
(212) 224-3573
REPRINTS
DEWEY PALMIERI
Reprints & Permission Manager
(212) 224-3675
dpalmieri@iinvestor.net
CORPORATE
GARY MUELLER
Chairman & CEO
CHRISTOPHER BROWN
President
STEVEN KURTZ
Director of Finance & Operations
ROBERT TONCHUK
Director/Central Operations & Fulfillment
Customer Service: PO Box 5016,
Brentwood, TN 37024-5016.
Tel: 1-800-715-9195. Fax: 1-615-377-0525
UK: 44 20 7779 8704
Hong Kong: 852 2842 6910
E-mail: customerservice@iinews.com
Editorial Offices: 225 Park Avenue
South, New York, NY 10003.
Tel: 1-212-224-3279
Email: eblackwell@iinews.com.
EDITORIAL
ERIK KOLB
Editor of Business Publishing
DAVID LEWIS
Contributing Reporter
STEPHEN MAUZY
Contributing Reporter
GREGORY MORRIS
Contributing Reporter
PRODUCTION
AYDAN SAVASER
Art Director
MARIA JODICE
Advertising Production Manager
(212) 224-3267
ADVERTISING/BUSINESS
PUBLISHING
JONATHAN WRIGHT
Publisher
(212) 224-3566
jwright@iinews.com
PAT BERTUCCI
Associate Publisher
(212) 224-3890
LANCE KISLING
Associate Publisher
(212) 224-3026
LESLIE NG
Advertising Coordinator
PUBLISHING
BRISTOL VOSS
Publisher
(212) 224-3628
MIKE FERGUS
Marketing Director
(212) 224-3266
Editor’s Note
Welcome to the 2008 Enterprise Data Management Report, an update
on how financial services firms are addressing the issue of accurate,
transparent data and its consistent integration into various applications
across their businesses.
Enterprise data management is a relatively new business objective
that few firms fully understand. That is why the Enterprise Data
Management Report begins with an overview of the concept and
the issues firms face as they move through the various stages of
implementation. For most, that means starting at the beginning and
dealing with an ever-increasing volume of data, a good portion of
which contains errors. The biggest challenge, however, may be get-
ting these firms to buy into the fact that this is an ongoing process
that needs to be managed properly (see story, page 5).
Next, the Report addresses the issue of downstream data integration,
the goal of which is to get cleansed, reliable
data to the right departments. That can be
harder than most financial firms think,
especially for an industry built on stand-
alone and legacy back-office systems. But
the goal is definitely achievable through
investments of time and money and proper
governance (see story, page 8).
Beyond that, the Report includes an item
on the SEC's effort to make financial
reporting interactive through its eXtensible
Business Reporting Language program
(see story, page 14) and sponsored articles from BNY Mellon Asset
Servicing and The Depository Trust & Clearing Corp.
Enterprise Data Management Report is the latest in a series of special
supplements produced by Institutional Investor News exclusively for our
newsletter subscribers. It is part of our commitment to bringing our read-
ers the freshest news and in-depth analysis on important sectors and
timely topics within the financial markets.
Enjoy,
Erik Kolb
Editor of Business Publishing
Institutional Investor News
Managing a Data Overload
Growing data problems drive financial services firms to reevaluate information strategies
By David Lewis
WALL STREET HAS A PROBLEM - a big, fat, ugly data problem.
While bad information represents a
large part of the problem, another sig-
nificant issue is that the sheer volume
of data has surpassed tidal wave proportions. With data growing at
exponential rates, particularly at financial services institutions, the
tidal wave of data is more like a series of tsunamis.
The reasons are many. Two of them are the plummeting cost and
growing volume of storage. According to BearingPoint, storage costs
have fallen 99.75% per gigabyte since 1980, while online storage
volume is projected to grow 273% between this year and 2011.
Another reason is the continuing rise of online commerce, includ-
ing 30% annual increases in online auto insurance purchases and an
expected 27% annual rise in online banking.
Cost used to be a filter, according to Ed Hagan, managing director
and global leader of BearingPoint's Information Management
Solution Suite. Now, however, there is no filter. “The new filter
needs to be, what is valuable information? What is the most impor-
tant information to the enterprise? Most financial organizations real-
ly can't tell you that,” he said.
The answer is a concept called enterprise data management
(EDM). “The biggest challenge is that there is this roiling sea of
data and, if you jump in just anywhere with your tin cup and start
bailing, it's a long and not-so-productive process,” Hagan said.
“Whereas if you can somehow start to classify this sea of informa-
tion into areas of primary, secondary and tertiary value, you can
prioritize your initiatives and start to look at what applications are
most critical to your most valuable information. Then you can
take a more structured approach to solving the problem and man-
aging the problem on an ongoing basis.”
Roots of the Problem
Part of today's swelling data problem is simply the ballooning
volumes of information that afflict all U.S. corporations. Part
of it, however, can be attributed to the financial services com-
panies themselves. “The financial services industry is an inter-
esting space,” Hagan said. “On many elements, they are lead-
ing the charge with cutting-edge practices. But they also have
some of the biggest problems, so their pain around these issues
is pretty substantial.”
Scott Dillman, managing director at PricewaterhouseCoopers,
relayed a story that explains how financial firms have brought some of
this burden upon themselves. Not long ago, he and his team
audited and ‘cleaned’ the basic information of a major bank. Lo
and behold, about 75% of the bank's accounts contained errors.
There was no intent to mislead, rather the bank had an informa-
tion technology problem, Dillman noted. That problem caused it
to create an account for ‘John Doe’ at one branch, an account for
‘John R. Doe’ at another branch and a ‘J. Randolph Doe’ money
market account, before later adding a ‘J.R. Doe’ loan.
The potential for error is obvious. So is the point that this method
makes it nearly impossible to sort all this out from the ‘John Doe
and Mary K. Doe’ joint account.
The bank's problem clearly is that it has no technology capable of
creating a single identity for ‘John Doe’ with multiple attributes.
That means it has no way to provide value-added information
such as ‘John Doe, who has two joint accounts with Mary Doe,
who has a new line of credit and likes to use our supermarket-
based branches.’ It also means that, potential errors aside, the
multiple accounts the bank creates and maintains simply clog the
system, making it redundant and costly.
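To make the 'John Doe' problem concrete, the sketch below shows, in Python, one simplified way a bank might group name variants under a single candidate identity with multiple attributes. The accounts and the matching rule (normalizing initials against a last name) are hypothetical illustrations, not a description of any bank's actual systems, which rely on far richer matching.

```python
# Hypothetical sketch: consolidating name variants into one customer identity.
# The matching heuristic is illustrative only; real master data management
# systems use addresses, tax IDs, fuzzy scoring and other attributes.

from collections import defaultdict

accounts = [
    {"branch": "Main St", "holder": "John Doe",        "type": "checking"},
    {"branch": "Oak Ave", "holder": "John R. Doe",     "type": "savings"},
    {"branch": "Online",  "holder": "J. Randolph Doe", "type": "money market"},
    {"branch": "Main St", "holder": "J.R. Doe",        "type": "loan"},
]

def name_key(holder: str) -> str:
    """Reduce a holder name to (first initial, last name) for crude matching."""
    parts = holder.replace(".", "").split()
    return f"{parts[0][0].upper()}|{parts[-1].upper()}"

# Group accounts under a single candidate identity.
identities = defaultdict(list)
for acct in accounts:
    identities[name_key(acct["holder"])].append(acct)

for key, accts in identities.items():
    print(f"Candidate identity {key}: {len(accts)} accounts")
    for a in accts:
        print(f"  {a['holder']:<16} {a['type']:<13} ({a['branch']})")
```

All four variants land under one candidate identity; a production system would then confirm the match with harder evidence before merging records.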
But the sins of the financial institutions are vastly compounded
by questions of what to do with and how to make sense of the
burgeoning heap of numbers and text. According to
BearingPoint, e-mail at many financial services firms is increasing
40-60% per year, and at some financial companies email volumes
are rising at a rate of 100% annually. Meanwhile, trading volumes
on the New York Stock Exchange and NASDAQ have increased
19 times in the past decade.
Prerequisites to EDM
The answer to the data glut is to be found in a cluster of
acronyms - IM, MDM and EDM. Respectively, these are
information management, master data management and
enterprise data management.
To break those down a bit, information management is the umbrel-
la term for the notion that information must be managed before it
can be processed. This has led to a new breed of ‘C suite’ officer
known as the chief data officer or something similar. Yahoo! appears
to have been the pioneer in creating this post, followed in the finan-
cial space by Citigroup and JPMorgan Chase. While the preoccu-
pation of the typical chief information officer has been information
processing—the manipulation of data by particular applications—
the job of the CDO is data and only data.
The development of the role of chief data officer or its equivalent is
one measure of the growing maturity of data management. “One of
the challenges of that role, like any kind of C-level role, is that it is
pretty immature at the beginning,” Hagan said. “As new leadership
in this space starts to emerge, we will see a broader business perspec-
tive around dealing with information management.”
The goal of data management is actionable information. “If you
don't have clean data, you're not going to find that you have real
information,” said Dillman. “If you don't have real information,
you really have no way of knowing whether decisions are being
made on a solid basis.”
Master data manage-
ment, in effect, is the
backbone that under-
lies enterprise data
management. “The
way we look at MDM
is as a subset of the
whole information
management space.
Master data manage-
ment is looking at
which pieces of data
need to be standardized
across the organiza-
tion,” Hagan said.
“This is really the back-
bone of how you move
information across the
organization.”
Historically, every
application defined its
own master data. “If their ability to pass the information across
the organization and their referential data is not consistent, it is a
big challenge,” Hagan noted.
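As a rough illustration of what standardizing master data across the organization can mean in practice, the sketch below maps application-local security identifiers to one golden-copy record. The identifiers, field names and applications are invented assumptions, not any vendor's MDM product.

```python
# Simplified sketch of a master data lookup: each application historically keeps
# its own identifier for the same security; a master map ties them together so
# every system references one golden copy. All identifiers here are hypothetical.

MASTER_SECURITIES = {
    "MDM-0001": {"issuer": "Acme Corp", "currency": "USD", "asset_class": "equity"},
}

# Per-application local keys pointing back to the master identifier.
APPLICATION_ALIASES = {
    ("trading",    "ACME.N"):    "MDM-0001",
    ("settlement", "US00ACME1"): "MDM-0001",
    ("risk",       "1007342"):   "MDM-0001",
}

def resolve(app: str, local_id: str) -> dict:
    """Return the golden-copy record for an application-local identifier."""
    master_id = APPLICATION_ALIASES[(app, local_id)]
    return {"master_id": master_id, **MASTER_SECURITIES[master_id]}

print(resolve("risk", "1007342"))
# {'master_id': 'MDM-0001', 'issuer': 'Acme Corp', 'currency': 'USD', 'asset_class': 'equity'}
```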
The idea of common hierarchies brings up the problem of the
‘silos’ of data. “The financial industry is just like every other
industry out there in the sense that it developed its business ver-
tically, silo by silo,” noted Michael Atkin, managing director of
the EDM Council, an industry trade association. “So you have all
of these silos within a financial institution by function, by data
area, by geography, by a whole host of things.”
Then, as the world changed and no longer could just operate in a
vertical business framework, companies had to start looking at
their organization horizontally. “All of a sudden, they needed to
reconcile all of these data stores that existed all over their organi-
zation without coordination and without alignment,” Atkin said.
Understanding EDM
Operationally, the key concept to unwinding this Gordian knot is
the final acronym, EDM. That concept means just what it says:
managing data that is transparent, detailed, relational, accurate
and appropriately shared across an entire enterprise.
To elaborate, a 2006 Finextra Research study interviewed 17 chief
technology officers, chief information officers and other senior
data and business managers of major banks and buy-side and sell-
side firms in Europe and North America. The analysis found that
the meaning of enterprise data management differed by manage-
ment position, business type and company size.
Yet, there was broad agreement that enterprise data management
is a process required to enable disparate applications and parts of
a business to share information. It is driven by a need to promote
accuracy, transparency and efficiency in the business, to ease reg-
ulatory compliance and improve client service and performance.
Most interviewees said that, for them, EDM meant capturing,
managing and analyzing product, customer, counterparty and
operational data at a very granular level. They noted that they had
begun to, or wanted to, standardize and manage data centrally
and share it across the business.
“EDM is not a system, a technology or a process; it is an objec-
tive,” Atkin said. “If you understand it as an objective with incre-
mental strategies that deal with your individual challenges, then
the goal is to achieve enterprise data management.”
Enterprise data management thus includes issues such as data
quality, management, governance and architecture.
“Governance means this needs to become a broader enterprise
discipline, as opposed to something we look at as, ‘Here's this
problem we need to fix,’” Hagan said. “Particularly in organi-
zations like financial services organizations, the same level of
discipline that is put around managing financial assets is not
there when it comes to managing information assets. It's real-
ly a new dimension by which we need to manage our business-
es, as opposed to a problem that we need to fix.”
The Finextra study also underlined the criticality of EDM.
Asked what were the driving forces behind their enterprise's
investment in EDM, 59% of respondents said risk manage-
ment was very important and the primary driving force. That
was followed by compliance, 47%; business growth, 41%; and
operational efficiency/cost reduction, 35%.
Atkin noted that corporate investment in EDM beats the
alternative. “It is more expensive not to be able to manage
your business, not to be able to meet your time-to-market
objectives and not identify and profit from niches and market
opportunities,” he said. “It's more expensive to have trades
fail, to fail a regulatory audit or to compensate your clients
because you're not meeting the terms of your investment
agreements.” And so on.
The State of the Industry
As to how financial institutions are meeting the data chal-
lenge, most observers agreed that few are. Only a handful of
players, maybe five percent, are meeting it, they noted, citing Morgan Stanley,
Goldman Sachs and Barclays Capital as leaders in the sector.
“Financial institutions differ pretty substantially,” Hagan
said. “The one common element, at least in the larger, global
organizations and especially for those involved in the capital
markets, is the recognition that this is a challenge, and it is
getting more challenging every day. With the exponential
growth of information and the risk associated with that infor-
mation increasing every day, executives are trying to figure
out, ‘How can I manage the cost side of this equation? How
do I manage the risk? And ultimately, how do I get some value
from all this data that
is piling up across our
organization?’”
While financial institu-
tions might be far from
getting on top of
EDM, they're not
ignoring the concept
either. “EDM is com-
pletely accepted by
most of the Tier 1
financial institutions,
and it is absolutely
above and beyond the
‘What is it and why
should we care?’
phase,” Atkin said.
“That, in my opinion,
has occurred relatively
quickly. Four years ago,
we couldn't even spell
‘reference data.’ Now,
there are data management programs underway at virtually every
financial institution around the world, and data is understood as
the asset that it is.”
However, there is more variation regarding what actions these
financial institutions have taken. Indeed, just as levels of cor-
porate engagement with EDM differ from company to compa-
ny, EDM governance and strategies also vary widely.
To help, BearingPoint identified a four-part framework that
companies can use to address their EDM strategy:
1. What information is critical to understanding whether
the company is executing its business strategy?
2. Who needs what information to make strategic deci-
sions from an operational perspective?
3. What are the critical processes within the organization
from the standpoint of performance and quality?
4. What information is critical to regulatory or other exter-
nal requirements?
“Most financial institutions that we deal with – primarily Tier
1 and 2 – recognize EDM, understand it and know they have
a problem that needs to be managed,” Atkin said. “Most of
them have created data groups, appointed people to be respon-
sible for data activity and are trying to set up appropriate
internal governance mechanisms.”
That being said, it's still a foreign concept for most financial
institutions, Atkin noted. “At the moment, it's almost been
relegated to just another task to perform within the financial
institution. That is not necessarily good,” he said. “The ten-
dency is to have a short-term view and try to fix things tacti-
cally. That is not the makings of a good data strategy.”
Moving Data Downstream
Downstream data integration becomes less of an upstream swim
By Stephen Mauzy
ENTERPRISE DATA MANAGEMENT
(EDM) appears to have transcended the
rhetoric stage, garnering not only lip service but also increased
attention and money. Most people recognize the value of clean,
accurate data that is ‘fit for purpose.’ Unfortunately, getting
‘fit for purpose’ data to end-users can be a slow slog.
The first step in moving data from here to there is assuring
quality. Data integrity degrades over time. Consider a simple,
but common, scenario: a regulated investment firm transacts
with another firm it initially regards as low risk. Over time,
however, the counterparty's investments gradually tilt toward
high risk because of changes in geographic location, macro-
economic variables, poor management or obsolescence, thus
changing the firm's risk profile from low to high.
The obvious remedy is to update the data - if only it were that
simple. Data that is extracted and aggregated into down-
stream systems is usually controlled by the managers of the
upstream system, and they typically lack sufficient incentive
to improve the data's quality beyond their
specific application domain.
The solution is to centralize and automate
data cleansing to create a sustainable and
continuous approach by integrating the
process with data gathering. Such a system
improves data consistency, removes inaccu-
racies and, ultimately, improves risk man-
agement and overall business performance.
Of course, not all the data can be auto-
matically and centrally cleansed. Logical
error types in data structure can be cor-
rected through programmed processes,
but there will always be errors that cannot
be resolved by logic or by the
value of a particular field alone. “There is
always going to be conflicting data on a
price, a name, a date or a corporate action
message,” said Barry Raskin, president of
Telekurs Financial USA. “If you want
straight-through processing, you can't get it unless it's close
to being fully automated.”
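To illustrate the kind of conflict Raskin describes, here is a minimal Python sketch that flags records where two sources disagree on a price, so they are routed to manual review rather than processed straight through. The vendors, tickers and tolerance are invented assumptions, not a real feed.

```python
# Minimal sketch: automated cleansing can normalize and compare vendor records,
# but genuinely conflicting values must be flagged for a human, not auto-fixed.
# Vendor data and the tolerance below are hypothetical.

PRICE_TOLERANCE = 0.01  # absolute difference treated as a real conflict

vendor_a = {"ACME": 101.25, "GLOBEX": 54.10}
vendor_b = {"ACME": 101.25, "GLOBEX": 53.80}

def reconcile(a: dict, b: dict):
    clean, exceptions = {}, []
    for security in sorted(set(a) | set(b)):
        pa, pb = a.get(security), b.get(security)
        if pa is not None and pb is not None and abs(pa - pb) <= PRICE_TOLERANCE:
            clean[security] = pa                  # sources agree: pass downstream
        else:
            exceptions.append((security, pa, pb)) # conflict or gap: manual review
    return clean, exceptions

clean, exceptions = reconcile(vendor_a, vendor_b)
print("clean:", clean)            # {'ACME': 101.25}
print("exceptions:", exceptions)  # [('GLOBEX', 54.1, 53.8)]
```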
All Aboard
Once an initial data cleansing is completed, the real work
begins. Financial institutions still confront a superfluity of
stand-alone and legacy back-office systems. “The large invest-
ment banks could have 200 systems with several sources of
reference data,” said Tom Stock, senior vice president of prod-
uct management at GoldenSource. “The big issue on distribu-
tion is what architecture you can put in place to get it to all
the downstream systems.”
That architecture doesn't come cheap. Boston-based Aite
Group estimates downstream connectivity spending will
exceed $2.8 billion in the United States and Europe this year.
According to Adam Honore, senior analyst at Aite, Tier 1
institutions (bulge-bracket firms and global banks) will spend
an average of $9 million each to improve their data connec-
tivity, with some firms spending more than $20 million. Tier
2 firms (mid-sized asset managers and large hedge funds) will
spend roughly $4 million to solve their connectivity prob-
lems, while Tier 3 firms (small asset managers and hedge
funds) will expend between $250,000 and $500,000.
A variety of options exist for moving data
downstream. Some firms rely on internal
solutions, while others rely on vendor sup-
port. Others have adopted outsourced
solutions that encompass both business
process and technology. Most Tier 1 and
Tier 2 firms tend to adopt some combina-
tion of each. Whatever the strategy, Aite
has found that downstream data integra-
tion consumes about 10% of the initial
EDM budget.
While hardware and vendors can be vetted
through budget analysis, people can't.
When prompted for information
about the data, business users frequently
expect IT to respond with answers.
However, IT people are rarely the origina-
tors of the data. For that reason, firms
have a difficult time gathering accurate
requirements for new initiatives. “Getting
a full and complete inventory is a must,”
Raskin said. “But often things like func-
tionality, data elements and data depart-
ments get overlooked.”
Allocating big money to get everyone
aboard is a formidable challenge. EDM is
an ambivalent priority for many firms,
even though projects higher on the totem pole
benefit from downstream connectivity.
Honore cited trade processing, settlement
and generic integration efforts as initiatives
that derive value from effective connectivi-
ty strategies.
Risk is another hot-button issue
that begs to be mitigated by reducing dis-
crepancies, according to Raskin. “EDM is
sometimes viewed more as an obstacle to
getting things done,” he said. “But there
are consequences for not implementing
EDM correctly. In the most benign sense,
you might be buying data multiple times. In a more malig-
nant sense, you have one guy quoting a price in a trading
room and settlement people having no idea where the price
originated.”
Managing stakeholder requirements is a critical process that's
anchored in understanding workflow, data dependencies and
organizational tolerance for operational disruption. Many
firms use formal processes such as service level agreements to
specify requirements and establish EDM objectives.
A Common Language
Standardized jargon is another invaluable commodity. A com-
mon language requires standardized data definitions. During
the requirements gathering process, it can be difficult to get
people to agree on the purpose of the data. For many institu-
tions, the natural tendency is to allow customization or
unique interpretations instead of forcing people to exploit an
existing standard.
Several firms advocated creating a solid data dictionary and
requiring people to talk in the terms of that dictionary. From
that point forward, projects can define tags off the dictionary
and map everything to it. “You have to get everyone on the
same page,” Honore said. “You don't want to send mixed
messages by using interest rate or coupon when you are talk-
ing about the same thing.”
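A data dictionary of the kind Honore describes can be as simple as a canonical term list with synonym mappings that projects tag their fields against. The sketch below is a hypothetical illustration; the terms, synonyms and definitions are assumptions, not any firm's actual dictionary.

```python
# Hypothetical data-dictionary sketch: source systems use different field names
# ("interest rate" vs. "coupon"); everything is tagged back to one canonical term.

DATA_DICTIONARY = {
    "coupon_rate": {
        "definition": "Stated annual interest rate of the instrument",
        "synonyms": {"coupon", "interest rate", "cpn", "rate"},
    },
    "maturity_date": {
        "definition": "Date on which the principal is repaid",
        "synonyms": {"maturity", "redemption date"},
    },
}

def canonical_term(source_field: str) -> str:
    """Map a source-system field name to its canonical dictionary term."""
    field = source_field.strip().lower()
    for term, entry in DATA_DICTIONARY.items():
        if field == term or field in entry["synonyms"]:
            return term
    raise KeyError(f"'{source_field}' is not in the data dictionary; add it before use")

print(canonical_term("Interest Rate"))    # coupon_rate
print(canonical_term("redemption date"))  # maturity_date
```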
Metadata repositories are popular components of many of the
larger EDM integration projects. The side benefits to a deep
metadata repository are expansive, from enhancing reporting to
risk management to something as simple as an intranet search
engine. The people closest to the source are allowed to contribute
the definition and grow the information repository.
Once everyone is referencing the same
page, attention can turn to managing
consequences. Downstream consumers
need to be prepared for changes because
reporting and entitlements can be affect-
ed by changing data. The problem is one
part technical and one part standards. In
some situations, data changes need to be
reflected at one point in time, and EDM
implementations need to accommodate
different ‘go live’ times for a data change.
“Downstream applications go across trade
execution, books and records, clearing
and settlement, valuation, portfolio man-
agement and risk systems,” said Michael
Atkin, managing director at the EDM
Council. “You look at the process and the
steps involved, and you realize that peo-
ple have built up a way of operating
under one set of approaches. Now you are
trying to change that and you have to understand all those
inter-relationships.”
There is just one caveat: don't read too much into inter-rela-
tionships. It isn't uncommon for a firm to look back and
realize that many data fields sought in the project should
have been rejected. In this instance, performance implica-
tions of the request were the roadblock. Business users do not
always need every attribute on every instrument in every
instance.
Minimizing Silos
Every chief information officer and risk manager would like
all connections running off the most accurate, most con-
trolled data repository. Most firms have bypassed this push
by turning their legacy security master producers into data
consumers as well. The migration off legacy systems will take
years of effort in most instances, but it should be addressed
according to business priorities.
“Getting everyone connected can take up to three years,” said
Stock. “A lot of it depends on the size of the organization and
the sophistication of the enterprise in past EDM efforts. The
large investment banks have some sort of reference data in
consolidation projects, but it's still a long process. That's one
of the biggest challenges in enterprise data management –
speeding up that integration process.”
One of the challenges many firms insufficiently address is the
idea of a response mechanism from the downstream system.
A firm can create the perfect data extract with the cleanest
data possible for a particular interface. But how does it assure
the downstream application actually got the data, processed
it and acquired the results the application expected?
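One common-sense form of that response mechanism is a control report: the downstream system acknowledges how many records it loaded plus a checksum over the payload, which the publisher compares against what it sent. The sketch below is a generic illustration under those assumptions, with invented field names.

```python
# Generic sketch of a downstream acknowledgment check: compare what was sent
# with what the consuming system reports it loaded. Field names are illustrative.

import hashlib
import json

def control_totals(records: list[dict]) -> dict:
    """Record count plus a hash over the sorted record payloads."""
    payload = json.dumps(sorted(records, key=lambda r: r["id"]), sort_keys=True)
    return {"count": len(records), "digest": hashlib.sha256(payload.encode()).hexdigest()}

sent = [{"id": 1, "price": 101.25}, {"id": 2, "price": 53.80}]
loaded = [{"id": 1, "price": 101.25}]  # downstream dropped a record

if control_totals(sent) != control_totals(loaded):
    print("ALERT: downstream system did not load the extract as published")
```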
The answer for many firms is a centralized data model that
emphasizes understanding market and reference data, legacy
systems and how content is intended to be used throughout
the organization. But the system only works with tight super-
vision. “Without tight controls on centralized ‘data depots,’
separate applications and fixes can creep in at a business unit
level and an institution will begin down the path to silo-
based data,” said Honore.
More cross-product selling and more sophisticated clients
increase the need for consistent data across traditional busi-
ness silos. Customers often receive direct access to data via
the Web and online service capabilities, so the data needs to
be as accurate and timely as possible without the need for
manual intervention – a primary reason why firms have
learned the key to continued growth is increased straight-
through processing rates.
Good, clean quality reference data necessitates minimizing
data silos. Many firms implement source system controls to
ensure data satisfies quality standards at its point of origin.
When properly implemented, source quality controls can
effectively prevent the proliferation of invalid data. But
source system quality controls alone cannot enforce data
quality. They cannot, for example, ensure that data quality is
maintained throughout the data life cycle, especially when
multiple data sources with varying levels of cleanliness are
combined in downstream data integration processes.
To further ensure data remains high quality, many firms
adopt a flexible data quality strategy that incorporates data
quality components directly into the data integration archi-
tecture. Successful application of this strategy requires a data
integration platform that can implement a broad range of
generic and specific business rules and also adhere to a vari-
ety of data quality standards.
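The "broad range of generic and specific business rules" can be pictured as a small rule set applied to every record as it moves through the integration layer. The example below is a sketch with made-up rules and thresholds, not a vendor's rule engine.

```python
# Sketch of data-quality rules embedded in an integration pipeline: generic rules
# (completeness, format) plus a business-specific range check. All rules and
# thresholds here are illustrative assumptions.

import re

RULES = [
    ("issuer present",       lambda r: bool(r.get("issuer"))),
    ("ISIN format",          lambda r: bool(re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}\d", r.get("isin", "")))),
    ("coupon within bounds", lambda r: 0.0 <= r.get("coupon", -1.0) <= 20.0),
]

def validate(record: dict) -> list[str]:
    """Return the names of the rules the record fails."""
    return [name for name, check in RULES if not check(record)]

record = {"issuer": "Acme Corp", "isin": "US0000000001", "coupon": 25.0}
print(validate(record))  # ['coupon within bounds']
```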
Proper Governance
Effective data management requires top-down, enterprise-
wide guidelines that align the information architecture with
the business goals of the financial institution. Effective gov-
ernance is key, but it can present a formidable obstacle
because of the difficulties associated with providing a clear
business case on the benefits of data management.
“Data alone has no intrinsic value,” said Gary Barr, global
head of EDM at Reuters. “It is an enabler of other processes,
and the true benefits of effective data management are sys-
tematic and intertwined with other processes, which makes it
difficult to quantify all the downstream implications or
upstream improvements.”
Ultimate responsibility for data integration usually falls to
the CTO, CIO or COO but, in most cases, they work in
conjunction with business heads or dedicated data managers
from various trading and operations areas across the enter-
prise. “The bottom line is that you need C-level buy in,”
Raskin said. “You can't go anywhere without that because
things can get political real fast and then they can get
bogged down.”
Difficulty calculating return on investment (ROI) also ham-
pers the ‘buy in.’ Benefits aren't as readily transparent and as
easily measured for EDM as they are in other projects. Honore
suggested designing an environment that provides for met-
rics to be gathered, which, in turn, allow an institution to
measure ROI in specific data operations/business functions.
One can calculate the cost of maintaining accurate counter-
party data against the revenues from the client and/or the
potential losses. Delays in settlement and failed trades can be
measured against those caused by data errors.
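As a purely illustrative calculation of the kind Honore suggests, with every figure invented for the example, the annual cost of maintaining clean counterparty data can be weighed against the losses from fails attributable to data errors:

```python
# Illustrative ROI arithmetic with invented figures: compare the annual cost of
# maintaining accurate counterparty data with losses attributable to data errors.

annual_data_maintenance_cost = 750_000   # staff, tooling, vendor feeds (assumed)
failed_trades_per_year = 1_200           # assumed
share_caused_by_data_errors = 0.30       # portion of fails traced to bad data (assumed)
average_cost_per_failed_trade = 2_500    # funding, penalties, rework (assumed)

avoidable_loss = failed_trades_per_year * share_caused_by_data_errors * average_cost_per_failed_trade
print(f"Avoidable loss from data-driven fails: ${avoidable_loss:,.0f}")
print(f"Net annual benefit of the data program: ${avoidable_loss - annual_data_maintenance_cost:,.0f}")
```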
The goal in downstream data integration is to get cleansed,
reliable data to the right departments. “Data utopia is core
client data, core securities reference and core trade data all
being fed to a centralized data model that then gets distrib-
uted,” Honore said. “It's not realistic, but that's what you
work toward.”
CRUNCHING THE DATA
Aite Group interviewed a wide range of people who have
successfully completed initial EDM projects and are currently
engaged in subsequent downstream activities. Some of the
key findings can be found in a white paper titled Navigating
the Rapids of Downstream Data Connectivity. They include:
• Spending on downstream connectivity in the United States
and Europe will exceed US$2.5 billion this year.
• EDM projects mature an average of three years before
diving into downstream connectivity.
• On average, only 10% of an initial EDM budget is allocated
for downstream connectivity.
• The average connector takes between five and 22 days of
engineering effort.
• Internal IT produces 57% of all connectors.
• More than 60% of new connections go to back-office
systems.
• Metadata dictionaries are one of the emerging trends in
managing connectivity.
• Global trading and settlement, trade processing and
compliance solutions ranked near the top of last year’s
IT initiatives.
Source: Aite Group
SPONSORED ARTICLE
Tomorrow’s EDM Solutions Today
A look at expanding technology and data management with BNY Mellon Asset Servicing
By Edward McGann, Managing Director
WITH TODAY’S GLOBAL credit
environment symptomatic of an
overall global economic slowdown -
and the United States seemingly at
the nexus of that environment - the
need for information is ever more critical. This is particularly true
for institutions involved in the investment industry, whether they
are managing money, creating investment vehicles or providing
services to the buyers and sellers of investments.
That is where service providers like BNY Mellon Asset Servicing
come in. Such providers are constantly in the crosshairs of demand-
ing clients and industry participants to provide real-time, accurate
and, in some cases, enhanced data so they can understand their
positions and performance in their investment portfolios.
To meet that demand, BNY Mellon Asset Servicing employs not
only its powerful technology infrastructure but also its product
management and business units to develop strategies and devise
solutions to clients’ requests. These strategies and products pro-
vide clients the access to the data they need and the tools neces-
sary to analyze and make sense of the wealth of information.
BNY Mellon Asset Servicing is not only in the business of safekeep-
ing and accounting for its clients’ assets, it is in the business of pro-
viding information on those assets. Complicating this somewhat
‘simple’ effort is the huge amount of derivatives and other complex
instruments, where the information may not be complete at the
primary level. These instruments in the modern-day portfolio
derive their value from underlying instruments that may behave
differently, depending on extraneous conditions and events.
The challenge is to provide clients with relevant information in a
manner that is useful and supports decision-making systems and
management information platforms as varied as the investments
themselves. This requires service providers to combine data ele-
ments that exist on a variety of internal platforms with that of
other vendors’ platforms, as well as sometimes combining that
with proprietary information resident at the clients’ sites.
The Mechanics of EDM
Within BNY Mellon Asset Servicing, how the information is collect-
ed, combined, parsed and disseminated to end-users begins within
Global Product Management. By collecting inputs from clients,
industry participants, internal sources such as BNY Mellon Asset
Management and other leading indicators, Global Product
Management works closely with the Technology Group to mine the
enterprise’s platform of systems and data warehouses to identify the
information necessary to meet a particular demand or potential new
product offering. By employing modern-day programming tools, the
Data Management team can extract the information, often in a real-
time manner; package the information; and create a report, data file
or some other method of output that the end-user can utilize.
When the internal sources do not contain all the required fields
of information, outside resources are reviewed. This is often the
case when the product requires additional ‘enhancements’ to
allow for a productive, analytic tool. Being able to stress test
data and perform ‘what if’ analysis is often at the forefront of
the new products and services provided by firms like BNY
Mellon Asset Servicing. The more robust and deep the infor-
mation, the more valuable and accurate the results are.
BNY Mellon Asset Servicing, the largest custodian in the world and
one of the largest cash processing organizations in the world, has access
to a wealth of data and platforms from which to mine information.
The trick is putting it in a place that is readily accessible and secure. In
addition, the footprint in which we operate places us in every time
zone across the globe – the very same places where our clients operate
and conduct business.
Being able to regionalize data while simultaneously making it
available 24/7 on a global basis is another challenge facing the
enterprise data management field. Our clients’ need for informa-
tion may span that of a local office in a location like Singapore to a
regional office in London or Paris, and ultimately to a global head-
quarters located in New York. Investment decisions are often made
on that information, with the results captured and reported on.
Credit meltdowns need to be reacted to quickly, and the best deci-
sions are those made with optimal information. Our platforms are
positioned and configured in a manner that provides access to the
necessary core information and then supplements that with value-
added information from non-core sources, embeds analytics and
decision tools and delivers it all to clients via FTP, host-to-host
connections or Web-based applications like our Workbench or
Inform platforms. In some cases, that same information must be
translated to a second or third language so the user has it in a for-
mat that is useful, and readable, to them. Today, some of our
reports are available in as many as 12 languages.
Warehousing the information is critical to ready access, especially
when the base information resides in separate locations. Eagle
Investment Systems, a BNY Mellon Asset Servicing subsidiary, is a
leading global provider of financial technology solutions. Serving
many of the world's most prominent financial institutions, Eagle's
Web-based solutions integrate and streamline the investment
process. Eagle PACE™ is an advanced data-centric platform that is
fed information from various sources so firms can execute queries
on the data and provide analysis.
Such tools are powerful, as the structure of the underlying data
allows for a variety of ways to dissect it, including monitoring the
investment performance of the portfolio and anticipating potential
changes to the portfolio through our performance products. Eagle
Investment Systems’ products are an important cornerstone to
how BNY Mellon Asset Servicing solves its clients’ diverse enter-
prise data management needs.
Looking to the Future
The challenges ahead will only become more complicated and
demanding as the nature of the investment environment evolves
and new instruments are added. Consumers of the information -
clients and internal parties - will become more demanding as the
need to track performance and understand risk becomes even more
critical. Firms like BNY Mellon Asset Servicing will be required to
harvest the data and information that exists across its diverse, glob-
al footprint and provide it in real time to the end-user.
As the world becomes smaller in terms of global communication
and interactive markets, success will be measured in fractions of
seconds. The firms that meet those demands to provide accurate
and complete information, along with providing tools to under-
stand the information, will be the ones who survive and excel.
Data and relevant information are critical to an enterprise’s suc-
cess. BNY Mellon Asset Servicing and our partners in the
Technology and Data Management groups are meeting those
needs and positioning ourselves for continued success in the
future. After all, information is at the core of what we do.
About the Author:
Edward McGann is a managing director and head of product management
for financial institutions and international markets within the Global
Product Management unit at BNY Mellon Asset Servicing. In that capac-
ity, he and his team are respon-
sible for ensuring the firm’s cur-
rent suite of products and servic-
es meets the needs of those
important client bases and for
developing future product offer-
ings for the asset servicing mar-
ketplace. His responsibilities also
include strategic and financial
planning, overseeing capital
plans related to product and
technology development and
strategic initiatives that ensure
clients' satisfaction with The
Bank of New York Mellon.
About the Company:
Operating in 34 countries and serving more than 100 markets, the Bank
of New York Mellon is a global financial services company focused on help-
ing clients manage and service their financial assets. The company is a lead-
ing provider of financial services for institutions, corporations and high-net-
worth individuals, providing superior asset management and wealth man-
agement, asset servicing, issuer services, clearing services and treasury servic-
es through a worldwide client-focused team. It has more than $23 trillion
in assets under custody and administration, more than $1.1 trillion in assets
under management and services $11 trillion in outstanding debt.
Additional information is available at www.bnymellon.com.
A Critical Year for XBRL
The SEC's interactive data program takes key steps toward adoption, despite some corporate opposition
By David Lewis
THIS IS A PIVOTAL year in the SEC's effort
to make filings interactive through eXtensible
Business Reporting Language (XBRL), the lan-
guage of interactive financial reporting.
Securities and Exchange Commission chairman
Christopher Cox said so earlier this year, and he was right.
Cox was referring to the SEC's program, as well as the interactive
financial data picture worldwide. Indeed, Israel, China, Singapore
and Japan also are moving to XBRL-based financial reporting.
“The global movement to interactive data for financial reporting is
truly underway,” he said. “Without question, 2008 will be a water-
shed year for interactive data.”
Indeed, this year already has been, although not always in the way
Cox meant. In February, the Advisory Committee on
Improvements to Financial Reporting recommended slowing the
adoption of tagged financial disclosure. The panel's final recom-
mendations are expected this summer, although its advice has been
roundly ignored by SEC officials so far.
Big Benefits
When implemented, XBRL would enable users large and small to
drill deeply and instantaneously through the body of public filings
in that format, to access SEC documents as they are filed in real
time and to be able to feed the numbers into a spreadsheet or other
modeling applications of their choice. XBRL is designed to be a
standards-based way to express financial statements, including
sometimes critical but hard-to-pinpoint data such as footnotes to
SEC reports. “We look at XBRL as being the ideal data structure
for financial reporting,” said David Blaszkowsky, director of the
SEC's Office of Interactive Disclosure.
That ideal is attainable because each of the thousands of data and
text components that compose financial reporting can be described
by XBRL ‘tags.’ The tags in turn refer to files in a taxonomy that
defines them, making them machine readable. A myriad of plat-
forms and applications can then slice, dice, analyze and present the
data. Because any one tag can call upon any other tag, the numbers
and texts can be compared by data categories such as date, compa-
ny, industry and topic.
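To make the tagging idea concrete, here is a toy Python sketch that parses a minimal XBRL-like fragment and compares one tagged concept across two filers. The element names, companies and values are simplified placeholders, not the actual US GAAP taxonomy or a real instance document.

```python
# Toy illustration of XBRL-style tagging: each fact carries a concept tag, a
# context (company/period) and a value, so software can compare filings directly.
# The fragment below is a simplified placeholder, not a real US GAAP instance.

import xml.etree.ElementTree as ET

INSTANCE = """
<facts>
  <fact concept="Revenues"  company="AlphaCo" period="FY2007" unit="USD">1200000000</fact>
  <fact concept="Revenues"  company="BetaCo"  period="FY2007" unit="USD">950000000</fact>
  <fact concept="NetIncome" company="AlphaCo" period="FY2007" unit="USD">140000000</fact>
</facts>
"""

root = ET.fromstring(INSTANCE)
revenues = {
    f.get("company"): int(f.text)
    for f in root.findall("fact")
    if f.get("concept") == "Revenues" and f.get("period") == "FY2007"
}
print(revenues)  # {'AlphaCo': 1200000000, 'BetaCo': 950000000}
```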
In fact, the SEC already does that in a limited way, so far posting
307 filings from 74 companies through its XBRL Voluntary Filer
Program. And in February, the agency unveiled Financial Explorer,
an online XBRL tool that demonstrates the system’s potential for
interactive research and graphics.
According to proponents of the program, another reason the finan-
cial world needs XBRL is because the SEC's current database,
Edgar, is outmoded. “Edgar is essentially a document collection sys-
tem,” noted Christopher Whalen, co-founder and managing direc-
tor of Institutional Risk Analytics in Croton-on-Hudson, N.Y. “It
doesn't read the documents, and it doesn't validate them. It just
collects them, assigns them a unique ID number and off they go.”
“With XBRL, what the SEC needs to do is migrate this fairly
ancient system over to something that is far more data-centric, as
opposed to document-centric,” Whalen continued. “That probably
includes some level of validation. In other words, when the docu-
ment hits, they are going to have to look at it and ask, ‘Okay, did
you follow the rules for the tagging?’”
Building Momentum
The advisory panel's comments aside, XBRL appears to be steam-
ing ahead. Recent milestones on the road to adoption include:
• September 2007 - Cox and the XBRL U.S. project team
announce the creation of data tags for all U.S. generally
accepted accounting principles (GAAP).
• October 2007 - the commission creates the Office of
Interactive Disclosure.
• December 2007 - the XBRL U.S. team releases the first
Taxonomies for U.S. GAAP, the all-important dictionary of tags.
• February 2008 - the second draft of the GAAP taxonomies is
released.
• April 2008 - the comment period for XBRL draws to a close.
With the comment period over, a preliminary ruling is
expected soon, followed by a final ruling in autumn. If all
goes as scheduled, this ruling will require the top 500 compa-
nies by market capitalization to file their 2008 annual reports
through XBRL.
Some of the public companies and other filers that anticipate some-
day having to code their financial statements in XBRL are not so
pleased by the prospect. Reaction to the project can be roughly
divided this way: favorable among analysts, numbers-crunchers and
smaller investors; less favorable among medium-sized and small
public companies, mutual funds and others who see an undefined
expense looming for little or no gain. In the financial sector, the
conversion is eagerly anticipated by medium- and small-sized shops
and less eagerly by larger institutions, which already may have built
an XBRL-like database on their own dime.
Still, an SEC-sanctioned XBRL data vault over time would be sheer
heaven for most analysts. “The long-term potential is that it is def-
initely going to be a benefit for the analyst community,” said Glenn
Doggett, CFA and policy analyst for financial reporting at the
Charlottesville, Va.-based CFA Institute Centre for Financial
Market Integrity. “This is especially true for small and mid-tier
investment analysts who, when you go to visit them, still have a
stack of 10-Qs and 10-Ks on their desks. They're really going to be
the first beneficiaries of that XBRL framework.”
For analysts that work at the big investment banks, however, the
response may be a bit more muted. “Whether it is on the buy side
or the sell side, they already are subscribers to services that provide
much of the same information to them,” Doggett noted. “For these
players, the change to XBRL is not going to be as much a question
of how they operate as it is providing them information faster and
with less potential for error.”
Why reduced error? “Right now, third-party databases are tran-
scribing numbers, whereas with XBRL analysts will be getting
company-identified values with company-identified tags,”
Doggett explained. “As a result, you have a very one-to-one com-
munication between what the company says and what every
investment analyst gets.”
An Issue of Politics
No one doubts XBRL can work because it already does; the real issue
is politics. “The challenge for the SEC is not the technical challenge
related to XBRL,” Whalen noted. “It is more, how do you align this
technology with the business rules and legal responsibilities of the
SEC and do it in such a way that you don't piss off all of the filers?”
Whalen is among those who believe a 2008 rulemaking deadline
might be pushing it. “In theory, they want to have a rule ready for
the commission to consider in September that would set a
timetable for adoption,” he said. “The reality is that there is a lot of
work to be done between now and then, and I'm not sure that they
are going to have enough time to get everything aligned correctly
in order to hit that deadline.”
Still, the 2008 deadline is important to some of the program’s key
players, namely SEC Chairman Cox. “Let's face it, Chris Cox is
done at the end of this year,” Whalen said. “One way or the other,
you are going to see a new SEC Chairman, and I don't know
whether or not the future leadership of that agency is going to sup-
port something like XBRL as strongly as he has.”
Meanwhile, the SEC's Blaszkowsky argued that, at least for larger
corporations, the transition to XBRL will not be so difficult. “This
is not just a document or a bunch of linear or analog data that hap-
pens to be converted to a digital form, this is inherently digital data
that can be found and applied across other databases,” he contend-
ed. “As such, it is inherently digital, it is inherently tagged and it is
inherently available for the kind of constructive engagement that
enterprise data management systems are developed for.”
The main point, according to Blaszkowsky, is that universal adoption
of XBRL is inevitable. “This has its own compelling internal logic,” he
said. “The investment world and the corporate world have worries
and concerns and some of them are very legitimate ones, but they will
find that those concerns are unwarranted or exaggerated and that the
benefits are real. They will want to see XBRL implemented.”
Some of the interactive research and graphics available
through Financial Explorer.
SPONSORED ARTICLE
One Step Ahead in OTC Derivatives
In an industry facing calls for improvements, The Depository Trust & Clearing Corp already is making them
By Gregory Morris
MARCH WAS THE MONTH for
getting in step. On the 13th of that
month, Treasury Secretary Henry
Paulson, Jr. set the pace in his
remarks on recommendations from
the President’s Working Group (PWG) on Financial Markets.
He cited a number of the working group’s key findings and
specifically called for a joint industry response in several finan-
cial sectors, including over-the-counter (OTC) derivatives.
Just two weeks later, almost two dozen financial institutions
and trade associations sent an open letter to the President of
the New York Federal Reserve, Timothy Geithner. That let-
ter cited progress to date and underscored the financial com-
munity’s support for further improvements. The NY Fed
replied the same day with encouragement and suggested sev-
eral near-term goals.
With all the mutual support and affirmation carrying on in the
foreground, it was easy to miss a groundbreaking initiative
occurring in the background. Demonstrating that some seg-
ments of the market are already on the case, The Depository
Trust & Clearing Corp (DTCC), through its Trade
Information Warehouse, completed the first automated pro-
cessing of a credit event for a Canadian printing firm, comply-
ing with protocols issued by the International Swaps and
Derivatives Association (ISDA).
With this first automated credit event now concluded, DTCC
is focused on enhancing this functionality of the Warehouse in
preparation for future events. Some of these priorities include
allowing for automatic adherence by counterparties and adding
index tranches to the products the Warehouse supports for
credit events.
A Track Record of Progress
DTCC first launched Deriv/SERV, its automated matching
and confirmation platform for OTC derivatives, in late 2003 to
help the derivatives community address the operational chal-
lenges they faced in a market growing at breakneck speed. The
service has been instrumental in allowing market participants
to meet their commitment to global regulators to strengthen
their infrastructure by increasing the automated processing of
OTC derivatives transactions.
Today, more than 95% of credit derivatives transactions are
electronically matched and confirmed on the Deriv/SERV plat-
form. Transaction volume for all OTC derivatives products
jumped 123% to 5.9 million transactions last year, up from 2.6
million the previous year. More than 1,100 global dealers, asset
managers, hedge funds and other end-users in 31 countries
automate their OTC derivatives transactions through
Deriv/SERV, with more being added each week.
To further support derivatives trading, DTCC launched its
Trade Information Warehouse, the first automated global
repository for OTC derivatives, in November 2006. Part of
DTCC Deriv/SERV’s family of automated post-trade process-
ing services for the OTC derivatives community, the
Warehouse provides an automated environment where con-
tracts can be tracked and serviced over their lifecycle. Last year,
about three million contracts were recorded into the
Warehouse, with an average of an additional 10,000 new con-
tracts now being added daily.
Expanding on the Warehouse’s functional-
ity, late last year, DTCC launched a cen-
tral settlement service for OTC credit
derivatives transactions, in conjunction
with CLS Bank International of New
York. It is the OTC derivatives industry’s
only automated solution for calculating,
netting and settling payments between
counterparties to bilateral contracts.
The new service has been designed to enable
payments associated with transactions con-
firmed through Deriv/SERV and residing in
the Warehouse’s global contract repository to
be netted by value, date, currency and coun-
terparty. Payments eligible for settlement
include initial payments and one-time fees,
coupon payments and payments associated
with post-trade events. Central settlement
greatly reduces operating risks for users by
replacing manually processed bilateral pay-
ments with automated, netted payments.
The Warehouse generates bilaterally netted payment instruc-
tions and sends them to CLS for settlement. CLS automatical-
ly notifies its Settlement Members, who effect settlement
through CLS. Reports are generated and delivered to counter-
parties early in the morning on settlement day.
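The netting step can be pictured as grouping each counterparty pair's payment obligations by currency and value date and offsetting them into a single net amount, as in the simplified sketch below. The dealers, amounts and dates are invented for illustration and do not describe the Warehouse's or CLS's actual processing.

```python
# Simplified sketch of bilateral payment netting by counterparty pair, currency
# and value date. Parties, amounts and dates are invented for illustration.

from collections import defaultdict

# (payer, receiver, currency, value_date, amount)
payments = [
    ("DealerA", "DealerB", "USD", "2008-03-20", 5_000_000),
    ("DealerB", "DealerA", "USD", "2008-03-20", 3_200_000),
    ("DealerA", "DealerB", "EUR", "2008-03-20", 1_000_000),
]

net = defaultdict(float)
for payer, receiver, ccy, date, amount in payments:
    pair = tuple(sorted((payer, receiver)))  # one bucket per counterparty pair
    sign = 1 if payer == pair[0] else -1     # positive means pair[0] pays pair[1]
    net[(pair, ccy, date)] += sign * amount

for (pair, ccy, date), amount in net.items():
    payer, receiver = (pair if amount >= 0 else pair[::-1])
    print(f"{date} {ccy}: {payer} pays {receiver} {abs(amount):,.0f}")
# 2008-03-20 USD: DealerA pays DealerB 1,800,000
# 2008-03-20 EUR: DealerA pays DealerB 1,000,000
```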
In the second quarterly central settlement cycle for the new
service on March 20, 2008, the amount of trading obligations
requiring financial settlement was reduced by 93%, from a
gross of $18 billion in aggregate U.S. dollar terms to $1.2 bil-
lion net. Gross settlements by the 15 participating OTC deriv-
atives dealers were consolidated from 400,000 to 200 net set-
tlements. Payments were made in five currencies: the U.S. dol-
lar, euro, British pound, Japanese yen and Swiss franc. Over
time, the number of currencies in which payments can be made
will be expanded from the initial five.
Ahead of Regulators’ Recommendations
“Our initial offering for automated processing of OTC
derivatives products began to take form in 2003 and went
live in 2004, prior to the earliest regulatory calls to address
the deficiencies in the system,” according to Frank De
Maria, managing director and chief operating officer of
DTCC’s Deriv/SERV. “Since then, we have increased our
service offering and are working with both buy-side and sell-
side counterparties.”
Indeed, DTCC’s products and services seem
to complement Paulson’s March 13th
remarks. “We need a dedicated industry
cooperative,” he said at the time. “Market
volume and instrument complexity call for a
clear, functional, well-designed infrastruc-
ture that can meet the needs of the OTC
derivatives markets in the years ahead.”
Paulson further commented that such an
industry cooperative “must capture all sig-
nificant processing events over the entire
lifecycle of trades. It must have the capabili-
ty to accommodate all major asset classes
and product types. It must be operationally
reliable and scalable and use automation to
promote standardization that will create effi-
ciency and moderate excessive complexity.”
Paulson noted that the PWG specifies that
the infrastructure must have a flexible and
open architecture for inter-operability,
upgrades and improvements. “The facility also should
enhance counterparty risk management through netting and
collateral agreements by promoting portfolio reconciliation
and accurate valuation of trades,” he added, urging the indus-
try to “incorporate, without delay, cash settlement protocol
into standard documentation.”
Integrating the Infrastructure
As with the overall thrust of the industry initiative to make the
OTC derivatives market more transparent and efficient, De
Maria said Deriv/SERV’s expanded capabilities respond to
Paulson and the PWG’s recommendation to develop a longer-
term plan for an integrated operational infrastructure in the
OTC derivatives market.
The PWG calls for maximizing “the efficiencies obtainable
from automation and electronic platforms by promoting stan-
dardization and interoperability of infrastructure compo-
nents.” It also urges participants to enhance their ability to
“manage counterparty risk through netting and collateral
agreements by promoting portfolio reconciliation and accu-
rate valuation of trades.”
De Maria noted that these initiatives are already part of the
Warehouse’s daily process. “Maintaining the most up-to-date
information on trade details in one central portal addresses
the challenges participants face in keeping their collective
deal books in synch,” he said. Because the Warehouse enables each participant to see the positions it holds with its counterparties, transparency improves and portfolio reconciliation becomes far smoother.
“The first step is efficient and timely reconciliation among coun-
terparties,” said De Maria. “That enables them to terminate,
assign and amend positions as the front office sees fit and do so in
a controlled manner.”
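A reconciliation of this kind can be pictured as a straight comparison of each firm's internal deal book against the central Warehouse record. The sketch below is a hypothetical illustration only; the data shapes and the reconcile function are assumptions and do not describe the Warehouse's actual interface.

    def reconcile(internal_book, warehouse_record):
        """Hypothetical sketch: compare a firm's internal positions with the
        central record, keyed by trade reference, and report the breaks."""
        breaks = {"missing_in_warehouse": [], "missing_internally": [], "mismatched": []}
        for ref, details in internal_book.items():
            if ref not in warehouse_record:
                breaks["missing_in_warehouse"].append(ref)
            elif warehouse_record[ref] != details:
                breaks["mismatched"].append(ref)
        for ref in warehouse_record:
            if ref not in internal_book:
                breaks["missing_internally"].append(ref)
        return breaks

Once both sides agree the breaks are resolved, terminations, assignments and amendments can proceed against a record both parties trust.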
DTCC’s Deriv/SERV already provides an integrated infrastructure that encompasses the buy-side as well as the dealer community. “Well over 1,000 of our customers represent buy-side
firms,” De Maria said. “We are an industry owned organization
whose policy and priorities are set in conjunction with our
Operations Management Group, which includes representation
from both dealers and buy-side firms. This helps us build consensus
on our key initiatives that reflect the interests of a wide range of
industry members.”
To ensure broad participation in DTCC’s derivatives services, it was
critical that Deriv/SERV’s matching and confirmation service and
the Warehouse offer a full spectrum of interface capabilities. “It can
be used by the most technologically sophisticated firms, as well as by
those who do not have as robust an infrastructure,” De Maria said.
It is also important to note that buy-side firms pay no fees to use the service.
And what about risk mitigation? Risk awareness currently is a very
hot topic, De Maria noted. “The OTC derivatives market has all
three major forms of risk: market, credit and operational. In a philo-
sophical sense, the market risks and the credit risks are what bring
the business into being. But wasting money on operational risks
benefits no one. If you increase automation, you both increase effi-
ciency and reduce risk,” he said.
A New Focus on Novation
The major new emphasis for this year is on novation. Novation
Consent, a new service launched earlier this year, is intended to
automate the email process that takes place among the parties to assignment transactions. Provided through the Trade Information
Warehouse, Novation Consent automates the request, approval and
notification procedures among the three trading parties involved in
an OTC credit derivative contract assignment, as stipulated by
ISDA in its Novation Protocol.
Under the ISDA Novation Protocol, when a party to an OTC deriv-
ative transaction wishes to exit that contract by assigning its position
to a third party, that party - the transferor - must notify the remain-
ing party and the entering party - the transferee - and seek permis-
sion for the assignment from the remaining party.
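The three-party consent flow the protocol stipulates can be sketched as a small state machine. The Python below is a hypothetical illustration of the request, approval and notification sequence; the class, field and status names are assumptions and do not represent DTCC's Novation Consent implementation or ISDA's documentation.

    from dataclasses import dataclass, field

    @dataclass
    class NovationRequest:
        """Hypothetical sketch of the consent workflow for assigning a contract."""
        trade_ref: str
        transferor: str   # party exiting the contract
        transferee: str   # party stepping into the contract
        remaining: str    # party staying in, whose consent is required
        status: str = "REQUESTED"
        notified: list = field(default_factory=list)

        def request_consent(self):
            # The transferor notifies both the remaining party and the transferee.
            self.notified = [self.remaining, self.transferee]
            self.status = "PENDING_CONSENT"

        def record_response(self, party, approved):
            # Only the remaining party's decision determines the outcome.
            if party != self.remaining:
                raise ValueError("Consent must come from the remaining party")
            self.status = "CONSENTED" if approved else "DECLINED"
            return self.status

Automating an exchange of this shape is what replaces the email chains the protocol would otherwise generate between the three parties.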
“Deriv/SERV worked with the OTC derivatives industry to
build an automated tool for the processing of novations fully
compliant with ISDA’s novation protocol,” De Maria said.
“We have designed Novation Consent to deliver the features
trading parties have been seeking in terms of speed, efficien-
cy and inter-operability across platforms.”
Novation Consent streamlines assignment processing by allowing
firms to consolidate consent messages. Furthermore, it leverages the
Warehouse’s power as a global repository of confirmed OTC credit
derivative transactions by retrieving trade data from the Warehouse
and enabling users to submit assignments to Deriv/SERV’s
Matching and Confirmation service.
With the new novation service, as with all its products, De Maria noted, DTCC is careful not to let the drive toward standardization and efficiency impinge on the flexibility that makes the OTC market so vibrant. “Dealers and end-users are working through ISDA standard
master agreements,” he said. “We will continue to see standardiza-
tion in legal documents, allowing all participants to speak the same
language, but there is still a great deal of flexibility. Clearly, our
industry has been very proactive and has had great foresight.”
About the Company:
Depository Trust & Clearing Corp (DTCC), through its subsidiaries,
provides clearing, settlement and information services for equities,
corporate and municipal bonds, government and mortgage-backed
securities, money market instruments and OTC derivatives. The firm
also is a leading processor of mutual fund and insurance transactions,
linking funds and carriers with their distribution networks.
DTCC’s depository provides custody and asset services for about 3.5
million securities issues from the U.S. and 110 other countries and
territories worth more than $40 trillion. In 2007, DTCC cleared
and settled more than $1.86 quadrillion in securities transactions.
DTCC’s OTC derivatives services are provided by its wholly-owned sub-
sidiary, DTCC Deriv/SERV. As managing director and chief operating
officer of that subsidiary, Frank De Maria is responsible for the day-to-
day operations of DTCC’s automated services for the OTC derivatives
market. He oversees the company’s matching and confirmation system for
credit derivatives and leads a cross-organizational team in charge of sup-
porting and developing the Trade Information Warehouse.
For more information, please visit www.dtcc.com.
[Chart: Growth in OTC Credit Derivatives Volume, notional amounts in trillions of U.S. dollars - 2003: 3.8; 2004: 8.4; 2005: 17.1; 2006: 34.4. Source: ISDA Market Survey]
www.dtcc.com
DTCC Deriv/SERV’s family of services for OTC derivatives is:
• Reducing risk
• Cutting costs
• Enhancing efficiency
• Building the largest community of users worldwide
DTCC Deriv/SERV’s electronic trade matching and confirmation
service and Trade Information Warehouse make paper-based,
error-prone, manual processing obsolete. Join global dealers
and the buy-side community in automating and streamlining
your OTC derivatives post-trade deal flow.
Deriv/SERV services OTC credit, equity and interest rate
derivatives on a global basis, at no charge to buy-side firms
and at cost to dealers.
To learn more, call London +44 (0)20 7444 0411 or New York +1 212 855 5424, or visit www.dtcc.com
Say Buenos Dias to Automation,
Sayonara to Risk
The Logical Solutions Provider
Clearance and Settlement - Equities and Fixed Income • Asset Servicing • Mutual Funds • Managed Accounts • Alternative Investments • Insurance • Global Corporate Actions • OTC Derivatives
Having comprehensive securities
reference data is great.
Having a way to make sense
of it is even better.
Add more value to your reference data with Standard & Poor’s Cross Reference Services™.
Now there is a global solution that can help tie all your securities reference data
together—Standard & Poor’s Cross Reference Services. Customizable and available
through multiple delivery channels, the service links identifiers, entities, issuers and
obligors across global and domestic markets. It’s the insight you need to help manage
your enterprise-wide exposure, enhance compliance, highlight potential conflicts of
interest and identify investment opportunities.
Analytic services and products provided by Standard & Poor’s are the result of separate activities designed to preserve the independence and objectivity of each analytic process.
Standard & Poor’s has established policies and procedures to maintain the confidentiality of non-public information received during each analytic process.
© 2008 Standard & Poor’s, a division of The McGraw-Hill Companies, Inc. All rights reserved.
STANDARD & POOR’S is a registered trademark of The McGraw-Hill Companies, Inc.
Learn more. Visit www.sp.crossrefservices.com, e-mail sp_marketing@standardandpoors.com or
call 212.438.4500 (North America) +44.(0).20.7176.7445 (Europe). www.standardandpoors.com
2008EDM 4/15/08 4:17 PM Page 20

More Related Content

PPTX
Tech M&A Monthly: China - What's Really Happening?
PDF
Federal it-cost-commission-report accelerating-the mission-july 21.2016
PDF
Bitcoin:What is the Future?
PDF
Strategies for the Age of Digital Disruption #DTR7
DOCX
Windy City CIOs report 6 8 16
PDF
The Impact of the Internet on SME Businesses
PDF
A New French Revolution? Building a National Economy for the #Digital Age
PPTX
The creation of michael dadoun, upclick fuelling ecommerce growth
Tech M&A Monthly: China - What's Really Happening?
Federal it-cost-commission-report accelerating-the mission-july 21.2016
Bitcoin:What is the Future?
Strategies for the Age of Digital Disruption #DTR7
Windy City CIOs report 6 8 16
The Impact of the Internet on SME Businesses
A New French Revolution? Building a National Economy for the #Digital Age
The creation of michael dadoun, upclick fuelling ecommerce growth

What's hot (19)

PDF
The Mega Dealerships
PDF
The Substitution of Labor
PPTX
Blockchain the inception of a new database of everything by dinis guarda bloc...
PDF
CIO Insights from the Global C-suite Study
PDF
E marketer small_businesses_as_tough_b2b_customers-shaky_in_their_own_marketi...
PDF
Gov2020: A Peek into the Future of Government
PPT
All bankers are criminals
PDF
The Next Recession is Coming... This is Your Survival Guide
PDF
Achieving digital maturity: Adapting your company to a changing world
PDF
Infographic- Why Do Hybrid IT Solutions Create Jekyll & Hyde
PPTX
The technology trends of 2020
PPTX
Tech Update Summary from Blue Mountain Data Systems June 2015
PDF
Investing in fintech: Trends in financial technology for investors and entrep...
PDF
Regulating corporate vc
PDF
ICO 2.0 Summit - Keynote Presetnation
PDF
The CFO in Insurance
PDF
deloitte-au-consulting-digital-disruption-whitepaper-0912
PDF
Aug-Sep cover story
The Mega Dealerships
The Substitution of Labor
Blockchain the inception of a new database of everything by dinis guarda bloc...
CIO Insights from the Global C-suite Study
E marketer small_businesses_as_tough_b2b_customers-shaky_in_their_own_marketi...
Gov2020: A Peek into the Future of Government
All bankers are criminals
The Next Recession is Coming... This is Your Survival Guide
Achieving digital maturity: Adapting your company to a changing world
Infographic- Why Do Hybrid IT Solutions Create Jekyll & Hyde
The technology trends of 2020
Tech Update Summary from Blue Mountain Data Systems June 2015
Investing in fintech: Trends in financial technology for investors and entrep...
Regulating corporate vc
ICO 2.0 Summit - Keynote Presetnation
The CFO in Insurance
deloitte-au-consulting-digital-disruption-whitepaper-0912
Aug-Sep cover story
Ad

Similar to Enterprise Data Mgmt Report (20)

PDF
Presentatie van Mateen Greenway
PDF
Stewarding data why financial services need a chief data officer
PDF
Stewarding Data : Why Financial Services Firms Need a Chief Data Officier
PDF
Savvis - Rising to the Challenge (2009)
PDF
A Conceptual Framework for Digital Business Transformation
PDF
GigaOM Putting Big Data to Work by Brett Sheppard
PDF
In Data we Trust - RICS BSJ May June 2018
PDF
Running at the Speed of Digital: Hyper-Digital Information Management
PDF
Reference data management in financial services industry
PDF
Digital Transformation Review Nr. 5
PDF
Capgemini Consulting Digital Transformation Review No. 5
PDF
A Vision for Quantitative Investing in the Data Economy by Michael Beal at Qu...
PDF
Cfo outlook 2016
PDF
The Power of a Complete 360° View of the Customer - Digital Transformation fo...
PDF
Disruptions Driving FinTech Investing
PDF
The State of Big Data Adoption: A Glance at Top Industries Adopting Big Data ...
PDF
Business First, Technology Second for Italy's CIOs
DOCX
S T R A T E G I E S FOR1 RANSITIONING O L D ECONOMY FI.docx
PDF
Is effective Data Governance a choice or necessity in Financial Services?
Presentatie van Mateen Greenway
Stewarding data why financial services need a chief data officer
Stewarding Data : Why Financial Services Firms Need a Chief Data Officier
Savvis - Rising to the Challenge (2009)
A Conceptual Framework for Digital Business Transformation
GigaOM Putting Big Data to Work by Brett Sheppard
In Data we Trust - RICS BSJ May June 2018
Running at the Speed of Digital: Hyper-Digital Information Management
Reference data management in financial services industry
Digital Transformation Review Nr. 5
Capgemini Consulting Digital Transformation Review No. 5
A Vision for Quantitative Investing in the Data Economy by Michael Beal at Qu...
Cfo outlook 2016
The Power of a Complete 360° View of the Customer - Digital Transformation fo...
Disruptions Driving FinTech Investing
The State of Big Data Adoption: A Glance at Top Industries Adopting Big Data ...
Business First, Technology Second for Italy's CIOs
S T R A T E G I E S FOR1 RANSITIONING O L D ECONOMY FI.docx
Is effective Data Governance a choice or necessity in Financial Services?
Ad

More from Erik Kolb (9)

PDF
Mutual Fund Stars 08
PDF
Plan Sponsor Roundup 08
PDF
New Horizons in SF
PDF
HF Survival Guide 08
PDF
2014 PERE 50
PDF
2014 Investor 30 ranking
PDF
2012 Aussie Supplement
PDF
US MBA Feature
PDF
Dart Cayman Feature
Mutual Fund Stars 08
Plan Sponsor Roundup 08
New Horizons in SF
HF Survival Guide 08
2014 PERE 50
2014 Investor 30 ranking
2012 Aussie Supplement
US MBA Feature
Dart Cayman Feature

Enterprise Data Mgmt Report

  • 1. Enterprise Data Management APRIL 2008 • emii.com Moving Data Downstream Downstream data integration becomes less of an upstream swim. Managing a Data Overload Growing data problems drive financial services firms to reevaluate their information strategies. FromThe Publishers of: Report 2008EDM 4/15/08 4:16 PM Page 1
  • 3. Enterprise Data Management Report APRIL 2008 • emii.com APRIL 2008 ENTERPRISE DATA MANAGEMENT REPORT 3 5 Managing a Data Overload By David Lewis An ever-increasing volume of data, a good portion of which contain errors, is driving financial services firms to reevaluate their information strategies. The biggest challenge, however, may be getting these firms to buy into the fact that this is an ongoing process that needs to be managed properly. 8 Moving Data Downstream By Stephen Mauzy Downstream data integration - the goal of which is to get cleansed, reliable data to the right departments - can be harder than most financial firms think, especially for an industry built on stand-alone and legacy back-office sys- tems. But the goal is definitely achievable through invest- ments of time and money and proper governance. 12 Tomorrow’s EDM Solutions Today By Edward McGann Investors demand real-time, accurate and, in some cases, enhanced data so they can understand their positions and performance in their investment portfo- lios. To meet that demand, BNY Mellon Asset Servicing employs not only its powerful technology infrastructure but also its product management and business units to develop strategies and devise solu- tions to clients’ requests. 14 A Critical Year for XBRL By David Lewis The SEC's effort to make financial reporting interac- tive through its eXtensible Business Reporting Language program is on the fast track to adoption, despite opposition from some corporations and investors. 16 One Step Ahead in OTC Derivatives By Gregory Morris In an industry facing calls for improvements, The Depository Trust & Clearing Corp already is making them. One Step Ahead in OTC Derivatives Table of Contents 16 5 MANAGING A DATA OVERLOAD 2008EDM 4/15/08 4:16 PM Page 3
  • 4. www.emii.com A Publication of Institutional Investor, Inc. © Copyright 2008. Institutional Investor, Inc. All rights reserved. New York Publishing offices: 225 Park Avenue South, New York, NY 10003 • 212-224-3800 • www.iinews.com Copyright notice. No part of this publication may be copied, photocopied or duplicated in any form or by any means without Institutional Investor’s prior written consent. Copying of this publication is in vio- lation of the Federal Copyright Law (17 USC 101 et seq.). Violators may be subject to criminal penalties as well as liability for substantial monetary damages, including statutory damages up to $100,000 per infringement, costs and attorney’s fees. The information contained herein is accurate to the best of the publisher’s knowledge; however, the pub- lisher can accept no responsibility for the accuracy or completeness of such information or for loss or damage caused by any use thereof. VINCENT YESENOSKY Senior Operations Manager (212) 224-3057 DAVID SILVA Senior Fulfillment Manager (212) 224-3573 REPRINTS DEWEY PALMIERI Reprints & Premission Manager (212) 224-3675 dpalmieri@iinvestor.net CORPORATE GARY MUELLER Chairman & CEO CHRISTOPHER BROWN President STEVEN KURTZ Director of Finance & Operations ROBERT TONCHUK Director/Central Operations & Fulfillment Customer Service: PO Box 5016, Brentwood, TN 37024-5016. Tel: 1-800-715-9195. Fax: 1-615-377-0525 UK: 44 20 7779 8704 Hong Kong: 852 2842 6910 E-mail: customerservice@iinews.com Editorial Offices: 225 Park Avenue South, New York, NY 10003. Tel: 1-212-224-3279 Email: eblackwell@iinews.com. EDITORIAL ERIK KOLB Editor of Business Publishing DAVID LEWIS Contributing Reporter STEPHEN MAUZY Contributing Reporter GREGORY MORRIS Contributing Reporter PRODUCTION AYDAN SAVASER Art Director MARIA JODICE Advertising Production Manager (212) 224-3267 ADVERTISING/BUSINESS PUBLISHING JONATHAN WRIGHT Publisher (212) 224-3566 jwright@iinews.com PAT BERTUCCI Associate Publisher (212) 224-3890 LANCE KISLING Associate Publisher (212) 224-3026 LESLIE NG Advertising Coordinator PUBLISHING BRISTOL VOSS Publisher (212) 224-3628 MIKE FERGUS Marketing Director (212) 224-3266 Editor’s Note Welcome to the 2008 Enterprise Data Management Report, an update on how financial services firms are addressing the issue of accurate, transparent data and its consistent integration into various applications across their businesses. Enterprise data management is a relatively new business objective that few firms fully understand. That is why the Enterprise Data Management Report begins with an overview of the concept and the issues firms face as they move through the various stages of implementation. For most, that means starting at the beginning and dealing with an ever-increasing volume of data, a good portion of which contain errors. The biggest challenge, however, may be get- ting these firms to buy into the fact that this is an ongoing process that needs to be managed properly (see story, page 5). Next, the Report addresses the issue of downstream data integration, the goal of which is to get cleansed, reliable data to the right departments.That can be harder than most financial firms think, especially for an industry built on stand- alone and legacy back-office systems. But the goal is definitely achievable through investments of time and money and proper governance (see story, page 8). 
Beyond that, the Report includes an item on the SEC's effort to make financial reporting interactive through its eXtensible Business Reporting Language program (see story, page 14) and sponsored articles from BNY Mellon Asset Servicing and The Depository Trust & Clearing Corp. Enterprise Data Management Report is the latest in a series of special supplements produced by Institutional Investor News exclusively for our newsletter subscribers. It is part of our commitment to bringing our read- ers the freshest news and in-depth analysis on important sectors and timely topics within the financial markets. Enjoy, Erik Kolb Editor of Business Publishing Institutional Investor News Enterprise Data Management APRIL 2008 • emii.com Moving Data Downstream Downstream data integration becomes less of an upstream swim. Managing a Data Overload Growing data problems drive financial services firms to reevaluate their information strategies. FromThe Publishers of: Report 4 ENTERPRISE DATA MANAGEMENT REPORT APRIL 2008 2008EDM 4/15/08 4:16 PM Page 4
  • 5. APRIL 2008 ENTERPRISE DATA MANAGEMENT REPORT 5 WALL STREET HAS A PROB- LEM - a big, fat, ugly data problem. While bad information represents a large part of the problem, another sig- nificant issue is that the sheer volume of data has surpassed tidal wave proportions. With data growing at exponential rates, particularly at financial services institutions, the tidal wave of data is more like a series of tsunamis. The reasons are many. Two of them are the plummeting cost and growing volume of storage. According to BearingPoint, storage costs have fallen 99.75% per gigabyte since 1980, while online storage volume is projected to grow 273% between this year and 2011. Another reason is the continuing rise of online commerce, includ- ing 30% annual increases in online auto insurance purchases and an expected 27% annual rise in online banking. Cost used to be a filter, according to Ed Hagan, managing director and global leader of BearingPoint's Information Management Solution Suite. Now, however, there is no filter. “The new filter needs to be, what is valuable information? What is the most impor- tant information to the enterprise? Most financial organizations real- ly can't tell you that,” he said. The answer is a concept called enterprise data management (EDM). “The biggest challenge is that there is this roiling sea of data and, if you jump in just anywhere with your tin cup and start bailing, it's a long and not-so-productive process,” Hagan said. “Whereas if you can somehow start to classify this sea of informa- tion into areas of primary, secondary and tertiary value, you can prioritize your initiatives and start to look at what applications are most critical to your most valuable information. Then you can take a more structured approach to solving the problem and man- aging the problem on an ongoing basis.” Roots of the Problem Part of today's swelling data problem is simply the ballooning volumes of information that afflict all U.S. corporations. Part of it, however, can be attributed to the financial services com- panies themselves. “The financial services industry is an inter- esting space,” Hagan said. “On many elements, they are lead- ing the charge with cutting-edge practices. But they also have some of the biggest problems, so their pain around these issues is pretty substantial.” Scott Dillman, managing director at PricewaterhouseCoopers, relayed a story that explains how financial have brought some of this burden upon themselves. Not long ago, he and his team audited and ‘cleaned’ the basic information of a major bank. Lo and behold, about 75% of the bank's accounts contained errors. Managing a Data Overload Growing data problems drive financial services firms to reevaluate information strategies By David Lewis 2008EDM 4/15/08 4:16 PM Page 5
  • 6. 6 ENTERPRISE DATA MANAGEMENT REPORT APRIL 2008 There was no intent to mislead, rather the bank had an informa- tion technology problem, Dillman noted. That problem caused it to create an account for ‘John Doe’ at one branch, an account for ‘John R. Doe’ at another branch and a ‘J. Randolph Doe’ money market account, before later adding a ‘J.R. Doe’ loan. The potential for error is obvious. So is the point that this method makes it nearly impossible to sort all this out from the ‘John Doe and Mary K. Doe’ joint account. The bank's problem clearly is that it has no technology capable of creating a single identity for ‘John Doe’ with multiple attributes. That means it has no way to provide value-added information such as ‘John Doe, who has two joint accounts with Mary Doe, who has a new line of credit and likes to use our supermarket- based branches.’ It also means that, potential errors aside, the multiple accounts the bank creates and maintains simply clog the system, making it redundant and costly. But the sins of the financial institutions are vastly compounded by questions of what to do with and how to make sense of the burgeoning heap of numbers and text. According to BearingPoint, e-mail at many financial services firms is increasing 40-60% per year, and at some financial companies email volumes are rising at a rate of 100% annually. Meanwhile, trading volumes on the New York Stock Exchange and NASDAQ have increased 19 times in the past decade. Prerequisites to EDM The answer to the data glut is to be found in a cluster of acronyms - IM, MDM and EDM. Respectively, these are information management, master data management and enterprise data management. To break those down a bit, information management is the umbrel- la term for the notion that information must be managed before it can be processed. This has led to a new breed of ‘C suite’ officer known as the chief data officer or something similar. Yahoo! appears to have been the pioneer in creating this post, followed in the finan- cial space by CitiGroup and JPMorgan Chase. While the preoccu- pation of the typical chief information officer has been information processing—the manipulation of data by particular applications— the job of the CDO is data and only data. The development of the role of chief data officer or its equivalent is one measure of the growing maturity of data management. “One of the challenges of that role, like any kind of C-level role, is that it is pretty immature at the beginning,” Hagan said. “As new leadership in this space starts to emerge, we will see a broader business perspec- tive around dealing with information management.” The goal of data management is actionable information. “If you don't have clean data, you're not going to find that you have real information,” said Dillman. “If you don't have real information, you really have no way of knowing whether decisions are being made on a solid basis.” Master data manage- ment, in effect, is the backbone that under- lies enterprise data management. “The way we look at MDM is as a subset of the whole information management space. Master data manage- ment is looking at which pieces of data need to be standardized across the organiza- tion,” Hagan said. “This is really the back- bone of how you move information across the organization.” Historically, every application defined its own master data. “If their ability to pass the information across the organization and their referential data is not consistent, it is a big challenge,” Hagan noted. 
The idea of common hierarchies brings up the problem of the ‘silos’ of data. “The financial industry is just like every other industry out there in the sense that it developed its business ver- tically, silo by silo,” noted Michael Atkin, managing director of the EDM Council, an industry trade association. “So you have all of these silos within a financial institution by function, by data area, by geography, by a whole host of things.” Then, as the world changed and no longer could just operate in a vertical business framework, companies had to start looking at their organization horizontally. “All of a sudden, they needed to reconcile all of these data stores that existed all over their organi- zation without coordination and without alignment,” Atkin said. Understanding EDM Operationally, the key concept to unwinding this Gordian knot is the final acronym, EDM. That concept means just what it says: managing data that is transparent, detailed, relational, accurate and appropriately shared across an entire enterprise. To elaborate, a 2006 Finextra Research study interviewed 17 chief technology officers, chief information officers and other senior data and business managers of major banks and buy-side and sell- side firms in Europe and North America. The analysis found that the meaning of enterprise data management differed by manage- ment position, business type and company size. Yet, there was broad agreement that enterprise data management is a process required to enable disparate applications and parts of a business to share information. It is driven by a need to promote Ed Hagan, managing director at BearingPoint 2008EDM 4/15/08 4:16 PM Page 6
  • 7. APRIL 2008 ENTERPRISE DATA MANAGEMENT REPORT 7 accuracy, transparency and efficiency in the business, to ease reg- ulatory compliance and improve client service and performance. Most interviewees said that, for them, EDM meant capturing, managing and analyzing product, customer, counterparty and operational data at a very granular level. They noted that they had begun to, or wanted to, standardize and manage data centrally and share it across the business. “EDM is not a system, a technology or a process; it is an objec- tive,” Atkin said. “If you understand it as an objective with incre- mental strategies that deal with your individual challenges, then the goal is to achieve enterprise data management.” Enterprise data management thus includes issues such as data quality, management, governance and architecture. “Governance means this needs to become a broader enterprise discipline, as opposed to something we look at as, ‘Here's this problem we need to fix,’” Hagan said. “Particularly in organi- zations like financial services organizations, the same level of discipline that is put around managing financial assets is not there when it comes to managing information assets. It's real- ly a new dimension by which we need to manage our business- es, as opposed to a problem that we need to fix.” The Finextra study also underlined the criticality of EDM. Asked what were the driving forces behind their enterprise's investment in EDM, 59% of respondents said risk manage- ment was very important and the primary driving force. That was followed by compliance, 47%; business growth, 41%; and operational efficiency/cost reduction, 35%. Atkin noted that corporate investment in EDM beats the alternative. “It is more expensive not to be able to manage your business, not to be able to meet your time-to-market objectives and not identify and profit from niches and market opportunities,” he said. “It's more expensive to have trades fail, to fail a regulatory audit or to compensate your clients because you're not meeting the terms of your investment agreements.” And so on. The State of the Industry As to how financial institutions are meeting the data chal- lenge, most observers agreed that few are. A handful of players are, maybe five percent, they noted, citing Morgan Stanley, Goldman Sachs and Barclays Capital as leaders in the sector. “Financial institutions differ pretty substantially,” Hagan said. “The one common element, at least in the larger, global organizations and especially for those involved in the capital markets, is the recognition that this is a challenge, and it is getting more challenging every day. With the exponential growth of information and the risk associated with that infor- mation increasing every day, executives are trying to figure out, ‘How can I manage the cost side of this equation? How do I manage the risk? And ultimately, how do I get some value from all this data that is piling up across our organization?’” While financial institu- tions might be far from getting on top of EDM, they're not ignoring the concept either. “EDM is com- pletely accepted by most of the Tier 1 financial institutions, and it is absolutely above and beyond the ‘What is it and why should we care?’ phase,” Atkin said. “That, in my opinion, has occurred relatively quickly. 
Four years ago, we couldn't even spell ‘reference data.’ Now, there are data management programs underway at virtually every financial institution around the world, and data is understood as the asset that it is.” However, there is more variation regarding what actions these financial institutions have taken. Indeed, just as levels of cor- porate engagement with EDM differ from company to compa- ny, EDM governance and strategies also vary widely. To help, BearingPoint identified a four-part framework to help companies address their EDM strategy: 1. What information is critical to understanding whether the company is executing its business strategy? 2. Who needs what information to make strategic deci- sions from an operational perspective? 3. What are the critical processes within the organization from the standpoint of performance and quality? 4. What information is critical to regulatory or other exter- nal requirements? “Most financial institutions that we deal with – primarily Tier 1 and 2 – recognize EDM, understand it and know they have a problem that needs to be managed,” Atkin said. “Most of them have created data groups, appointed people to be respon- sible for data activity and are trying to set up appropriate internal governance mechanisms.” That being said, it's still a foreign concept for most financial institutions, Atkin noted. “At the moment, it's almost been relegated to just another task to perform within the financial institution. That is not necessarily good,” he said. “The ten- dency is to have a short-term view and try to fix things tacti- cally. That is not the makings for a good data strategy.”i Scott Dillman, managing director at PricewaterhouseCoopers 2008EDM 4/15/08 4:16 PM Page 7
  • 8. 8 ENTERPRISE DATA MANAGEMENT REPORT APRIL 2008 ENTERPRISE DATA MANAGEMENT (EDM) appears to have transcended the rhetoric stage, garnering not only lip serv- ice but increased attention and money. Most people recognize the value of clean, accurate data that is ‘fit for purpose.’ Unfortunately, the process of getting ‘fit for purpose’ data to end-users can be a slow slog of a process. The first step in moving data from here to there is assuring quality. Data integrity degrades over time. Consider a simple, but common, scenario: a regulated investment firm transacts with another firm it initially regards as low risk. Over time, however, the counterparty's investments gradually tilt toward high risk because of changes in geographic location, macro- economic variables, poor management or obsolescence, thus changing the firm's risk profile from low to high. The obvious remedy is to update the data - if only it were that simple. Data that is extracted and aggregated into down- stream systems is usually controlled by the managers of the upstream system, and they typically lack sufficient incentive to improve the data's quality beyond their specific application domain. The solution is to centralize and automate data cleansing to create a sustainable and continuous approach by integrating the process with data gathering. Such a system improves data consistency, removes inaccu- racies and, ultimately, improves risk man- agement and overall business performance. Of course, not all the data can be auto- matically and centrally cleansed. Logical error types in data structure can be cor- rected through programmed processes, but there will always be errors that are immune to logical conclusions or the value of a particular field. “There is always going to be conflicting data on a price, a name, a date or a corporate action message,” said Barry Raskin, president of Telekurs Financial USA. “If you want straight-through processing, you can't get it unless it's close to being fully automated.” All Aboard Once an initial data cleansing is completed, the real work begins. Financial institutions still confront a superfluity of stand-alone and legacy back-office systems. “The large invest- ment banks could have 200 systems with several sources of reference data,” said Tom Stock, senior vice president of prod- uct management at GoldenSource. “The big issue on distribu- tion is what architecture you can put in place to get it to all the downstream systems.” That architecture doesn't come cheap. Boston-based Aite Group estimates downstream connectivity spending will exceed $2.8 billion in the United States and Europe this year. According to Adam Honore, senior analyst at Aite, Tier 1 institutions (bulge-bracket firms and global banks) will spend an average of $9 million each to improve their data connec- tivity, with some firms spending more than $20 million. Tier 2 firms (mid-sized asset managers and large hedge funds) will spend roughly $4 million to solve their connectivity prob- lems, while Tier 3 firms (small asset managers and hedge funds) will expend between $250,000 and $500,000. A variety of options exist for moving data downstream. Some firms rely on internal solutions, while others rely on vendor sup- port. Others have adopted outsourced solutions that encompass both business process and technology. Most Tier 1 and Tier 2 firms tend to adopt some combina- tion of each. Whatever the strategy, Aite has found that downstream data integra- tion consumes about 10% of the initial EDM budget. 
While hardware and vendors can be vetted through budget analysis, people can't. Having being prompted for information about the data, business users frequently expect IT to respond with answers. However, IT people are rarely the origina- tors of the data. For that reason, firms have a difficult time gathering accurate requirements for new initiatives. “Getting a full and complete inventory is a must,” Moving Data Downstream Downstream data integration becomes less of an upstream swim By Stephen Mauzy Tom Stock, senior v.p. of product management at GoldenSource 2008EDM 4/15/08 4:16 PM Page 8
  • 9. APRIL 2008 ENTERPRISE DATA MANAGEMENT REPORT 9 Raskin said. “But often things like func- tionality, data elements and data depart- ments get overlooked.” Allocating big money to get everyone aboard is a formidable challenge. EDM is an ambivalent priority for many firms, even though projects higher on the totem benefit from downstream connectivity. Honore cited trade processing, settlement and generic integration efforts as initiatives that derive value from effective connectivi- ty strategies. Risk mitigation is another hot-button issue that begs to be mitigated by reducing dis- crepancies, according to Raskin. “EDM is sometimes viewed more as an obstacle to getting things done,” he said. “But there are consequences for not implementing EDM correctly. In the most benign sense, you might be buying data multiple times. In a more malig- nant sense, you have one guy quoting a price in a trading room and settlement people having no idea where the price originated.” Managing stakeholder requirements is a critical process that's anchored in understanding workflow, data dependencies and organizational tolerance for operational disruption. Many firms use formal processes such as service level agreements to specify requirements and establish EDM objectives. A Common Language Standardized jargon is another invaluable commodity. A com- mon language requires standardized data definitions. During the requirements gathering process, it can be difficult to get people to agree on the purpose of the data. For many institu- tions, the natural tendency is to allow customization or unique interpretations instead of forcing people to exploit an existing standard. Several firms advocated creating a solid data dictionary and requiring people to talk in the terms of that dictionary. From that point forward, projects can define tags off the dictionary and map everything to it. “You have to get everyone on the same page,” Honore said. “You don't want to send mixed messages by using interest rate or coupon when you are talk- ing about the same thing.” Metadata repositories are popular components of many of the larger EDM integration projects. The side benefits to a deep metadata repository are expansive, from enhancing reporting to risk management to something as simple as an intranet search engine. The people closest to the source are allowed to contribute the definition and grow the information repository. Once everyone is referencing the same page, attention can turn to managing consequences. Downstream consumers need to be prepared for changes because reporting and entitlements can be affect- ed by changing data. The problem is one part technical and one part standards. In some situations, data changes need to be reflected at one point in time, and EDM implementations need to accommodate different ‘go live’ times for a data change. “Downstream applications go across trade executives, books and records, clearing and settlement, valuation, portfolio man- agement and risk systems,” said Michael Atkin, managing director at the EDM Council. “You look at the process and the steps involved, and you realize that peo- ple have built up a way of operating under one set of approaches. Now you are trying to change that and you have to understand all those inter-relationships.” There is just one caveat: don't read too much into inter-rela- tionships. It isn't uncommon for a firm to look back and realize that many data fields sought in the project should have been rejected. 
In this instance, performance implica- tions of the request were the roadblock. Business users do not always need every attribute on every instrument in every instance. Minimizing Silos Every chief information officer and risk manager would like all connections running off the most accurate, most con- trolled data repository. Most firms have bypassed this push by turning their legacy security master producers into data consumers as well. The migration off legacy systems will take years of effort in most instances, but it should be addressed according to business priorities. “Getting everyone connected can take up to three years,” said Stock. “A lot of it depends on the size of the organization and the sophistication of the enterprise in past EDM efforts. The large investment banks have some sort of reference data in consolidation projects, but it's still a long process. That's one of the biggest challenges in enterprise data management – speeding up that integration process.” One of the challenges many firms insufficiently address is the idea of a response mechanism from the downstream system. A firm can create the perfect data extract with the cleanest data possible for a particular interface. But how does it assure the downstream application actually got the data, processed it and acquired the results the application expected? Adam Honore, senior analyst at Aite Group 2008EDM 4/15/08 4:16 PM Page 9
  • 10. 2008EDM 4/15/08 4:16 PM Page 10
  • 11. APRIL 2008 ENTERPRISE DATA MANAGEMENT REPORT 11 The answer for many firms is a centralized data model that emphasizes understanding market and reference data, legacy systems and how content is intended to be used throughout the organization. But the system only works with tight super- vision. “Without tight controls on centralized ‘data depots,’ separate applications and fixes can creep in at a business unit level and an institution will begin down the path to silo- based data,” said Honore. More cross-product selling and more sophisticated clients increase the need for consistent data across traditional busi- ness silos. Customers often receive direct access to data via the Web and online service capabilities, so the data needs to be as accurate and timely as possible without the need for manual intervention – a primacy reason why firms have learned the key to continued growth is increased straight- through processing rates. Good, clean quality reference data necessitates minimizing data silos. Many firms implement source system controls to ensure data satisfies quality standards at its point of origin. When properly implemented, source quality controls can effectively prevent the proliferation of invalid data. But source system quality controls alone cannot enforce data quality. They cannot, for example, ensure that data quality is maintained throughout the data life cycle, especially when multiple data sources with varying levels of cleanliness are combined in downstream data integration processes. To further ensure data remains high quality, many firms adopt a flexible data quality strategy that incorporates data quality components directly into the data integration archi- tecture. Successful application of this strategy requires a data integration platform that can implement a broad range of generic and specific business rules and also adhere to a vari- ety of data quality standards. Proper Governance Effective data management requires top-down, enterprise- wide guidelines that align the information architecture with the business goals of the financial institution. Effective gov- ernance is key, but it can present a formidable obstacle because of the difficulties associated with providing a clear business case on the benefits of data management. “Data alone has no intrinsic value,” said Gary Barr, global head of EDM at Reuters. “It is an enabler of other processes, and the true benefits of effective data management are sys- tematic and intertwined with other processes, which makes it difficult to quantify all the downstream implications or upstream improvements.” Ultimate responsibility of data integration usually falls to the CTO, CIO or COO but, in most cases, they work in conjunction with business heads or dedicated data managers from various trading and operations areas across the enter- prise. “The bottom line is that you need C-level buy in,” Raskin said. “You can't go anywhere without that because things can get political real fast and then they can get bogged down.” Difficulty calculating return on investment (ROI) also ham- pers the ‘buy in.’ Benefits aren't as readily transparent and as easily measure for EDM as they are in other projects. Honore suggested designing an environment that provides for met- rics to be gathered, which, in turn, allow an institution to measure ROI in specific data operations/business functions. One can calculate the cost of maintaining accurate counter- party data against the revenues from the client and/or the potential losses. 
Delays in settlement and failed trades can be measured against those caused by data errors. The goal in downstream data integration is to get cleansed, reliable data to the right departments. “Data utopia is core client data, core securities reference and core trade data all being fed to a centralized data model that then gets distrib- uted,” Honore said. “It's not realistic, but that's what you work toward.” i CRUNCHING THE DATA Aite Group interviewed a wide range of people who have successfully completed initial EDM projects and are currently engaged in subsequent downstream activities. Some of the key findings can be found in a white paper titled Navigating the Rapids of Downstream Data Connectivity. They include: • Spending on downstream connectivity in the United States and Europe will exceed US$2.5 billion this year. • EDM projects mature an average of three years before diving into downstream connectivity. • On average, only 10% of an initial EDM budget is allocated for downstream connectivity. • The average connector takes between five and 22 days of engineering effort. • Internal IT produces 57% of all connectors. • More than 60% of new connections go to back-office systems. • Metadata dictionaries are one of the emerging trends in managing connectivity. • Global trading and settlement, trade processing and compliance solutions ranked near the top of last year’s IT initiatives. Source: Aite Group 2008EDM 4/15/08 4:16 PM Page 11
  • 12. Tomorrow’s EDM Solutions Today A look at expanding technology and data management with BNY Mellon Asset Servicing By Edward McGann, Managing Director “i 12 ENTERPRISE DATA MANAGEMENT REPORT APRIL 2008 WITH TODAY’S GLOBAL cred- it environment symptomatic of an overall global economic slowdown - and the United States seemingly at the nexus of that environment - the need for information is ever more critical. This is particularly true for institutions involved in the investment industry, whether they are managing money, creating investment vehicles or providing services to the buyers and sellers of investments. That is where service providers like BNY Mellon Asset Servicing come in. Such providers are constantly in the crosshairs of demand- ing clients and industry participants to provide real-time, accurate and, in some cases, enhanced data so they can understand their positions and performance in their investment portfolios. To meet that demand, BNY Mellon Asset Servicing employs not only its powerful technology infrastructure but also its product management and business units to develop strategies and devise solutions to clients’ requests. These strategies and products pro- vide clients the access to the data they need and the tools neces- sary to analyze and make sense of the wealth of information. BNY Mellon Asset Servicing is not only in the business of safekeep- ing and accounting for its clients’ assets, it is in the business of pro- viding information on those assets. Complicating this somewhat ‘simple’ effort is the huge amount of derivatives and other complex instruments, where the information may not be complete at the primary level. These instruments in the modern-day portfolio derive their value from underlying instruments that may behave differently, depending on extraneous conditions and events. The challenge is to provide clients with relevant information in a manner that is useful and supports decision-making systems and management information platforms as varied as the investments themselves. This requires service providers to combine data ele- ments that exist on a variety of internal platforms with that of other vendors’ platforms, as well as sometimes combining that with proprietary information resident at the clients’ sites. The Mechanics of EDM Within BNY Mellon Asset Servicing, how the information is collect- ed, combined, parsed and disseminated to end-users begins within Global Product Management. By collecting inputs from clients, industry participants, internal sources such as BNY Mellon Asset Management and other leading indicators, Global Product Management works closely with the Technology Group to mine the enterprise’s platform of systems and data warehouses to identify the information necessary to meet a particular demand or potential new product offering. By employing modern-day programming tools, the Data Management team can extract the information, often in a real- time manner; package the information; and create a report, data file or some other method of output that the end-user can utilize. When the internal sources do not contain all the required fields of information, outside resources are reviewed. This is often the case when the product requires additional ‘enhancements’ to allow for a productive, analytic tool. Being able to stress test data and perform ‘what if’ analysis is often at the forefront of the new products and services provided by firms like BNY Mellon Asset Servicing. 
The more robust and deep the infor- mation, the more valuable and accurate the results are. SPONSORED ARTICLE 2008EDM 4/15/08 4:17 PM Page 12
  • 13. BNY Mellon Asset Servicing, the largest custodian in the world and one of the largest cash processing organizations in the world, has access to a wealth of data and platforms from which to mine information. The trick is putting it in a place that is readily accessible and secure. In addition, the footprint in which we operate places us in every time zone across the globe – the very same places where our clients operate and conduct business. Being able to regionalize data while simultaneously making it available 24/7 on a global basis is another challenge facing the enterprise data management field. Our clients’ need for informa- tion may span that of a local office in a location like Singapore to a regional office in London or Paris, and ultimately to a global head- quarters located in New York. Investment decisions are often made on that information, with the results captured and reported on. Credit meltdowns need to be reacted to quickly, and the best deci- sions are those made with optimal information. Our platforms are positioned and configured in a manner that provides access to the necessary core information and then supplements that with value- added information from non-core sources, embeds analytics and decision tools and delivers it all to clients via FTP, host-to-host connections or Web-based applications like our Workbench or Inform platforms. In some cases, that same information must be translated to a second or third language so the user has it in a for- mat that is useful, and readable, to them. Today, some of our reports are available in as many as 12 languages. Warehousing the information is critical to ready access, especially when the base information resides in separate locations. Eagle Investment Systems, a BNY Mellon Asset Servicing subsidiary, is a leading global provider of financial technology solutions. Serving many of the world's most prominent financial institutions, Eagle's Web-based solutions integrate and streamline the investment process. Eagle PACETM is an advanced data-centric platform that is fed information from various sources so firms can execute queries on the data and provide analysis. Such tools are powerful, as the structure of the underlying data allows for a variety of ways to dissect it, including monitoring the investment performance of the portfolio and anticipating potential changes to the portfolio through our performance products. Eagle Investment Systems’ products are an important cornerstone to how BNY Mellon Asset Servicing solves its clients’ diverse enter- prise data management needs. Looking to the Future The challenges ahead will only become more complicated and demanding as the nature of the investment environment evolves and new instruments are added. Consumers of the information - clients and internal parties - will become more demanding as the need to track performance and understand risk become even more critical. Firms like BNY Mellon Asset Servicing will be required to harvest the data and information that exists across its diverse, glob- al footprint and provide it in real time to the end-user. As the world becomes smaller in terms of global communication and interactive markets, success will be measured in fractions of seconds. The firms that meet those demands to provide accurate and complete information, along with providing tools to under- stand the information, will be the ones who survive and excel. Data and relevant information are critical to an enterprise’s suc- cess. 
BNY Mellon Asset Servicing and our partners in the Technology and Data Management groups are meeting those needs and positioning ourselves for continued success in the future. After all, information is at the core of what we do.

About the Author: Edward McGann is a managing director and head of product management for financial institutions and international markets within the Global Product Management unit at BNY Mellon Asset Servicing. In that capacity, he and his team are responsible for ensuring the firm's current suite of products and services meets the needs of those important client bases and for developing future product offerings for the asset servicing marketplace. His responsibilities also include strategic and financial planning, overseeing capital plans related to product and technology development and strategic initiatives that ensure clients' satisfaction with The Bank of New York Mellon.

About the Company: Operating in 34 countries and serving more than 100 markets, The Bank of New York Mellon is a global financial services company focused on helping clients manage and service their financial assets. The company is a leading provider of financial services for institutions, corporations and high-net-worth individuals, providing superior asset management and wealth management, asset servicing, issuer services, clearing services and treasury services through a worldwide client-focused team. It has more than $23 trillion in assets under custody and administration, more than $1.1 trillion in assets under management and services $11 trillion in outstanding debt. Additional information is available at www.bnymellon.com.
A Critical Year for XBRL
The SEC's interactive data program takes key steps toward adoption, despite some corporate opposition
By David Lewis

THIS IS A PIVOTAL year in the SEC's effort to make filings interactive through eXtensible Business Reporting Language (XBRL), the language of interactive financial reporting. Securities and Exchange Commission chairman Christopher Cox said so earlier this year, and he was right. Cox was referring to the SEC's program, as well as the interactive financial data picture worldwide. Indeed, Israel, China, Singapore and Japan also are moving to XBRL-based financial reporting. "The global movement to interactive data for financial reporting is truly underway," he said. "Without question, 2008 will be a watershed year for interactive data."

Indeed, this year already has been, although not always in the way Cox meant. In February, the Advisory Committee on Improvements to Financial Reporting recommended slowing the adoption of tagged financial disclosure. The panel's final recommendations are expected this summer, although its advice has been roundly ignored by SEC officials so far.

Big Benefits

When implemented, XBRL would enable users large and small to drill deeply and instantaneously through the body of public filings in that format, to access SEC documents as they are filed in real time and to feed the numbers into a spreadsheet or other modeling applications of their choice. XBRL is designed to be a standards-based way to express financial statements, including sometimes critical but hard-to-pinpoint data such as footnotes to SEC reports. "We look at XBRL as being the ideal data structure for financial reporting," said David Blaszkowsky, director of the SEC's Office of Interactive Disclosure.

That ideal is attainable because each of the thousands of data and text components that compose financial reporting can be described by XBRL 'tags.' The tags in turn refer to files in a taxonomy that defines them, making them machine readable. A myriad of platforms and applications can then slice, dice, analyze and present the data. Because any one tag can call upon any other tag, the numbers and texts can be compared by data categories such as date, company, industry and topic. In fact, the SEC already does that in a limited way, so far posting 307 filings from 74 companies through its XBRL Voluntary Filer Program. And in February, the agency unveiled Financial Explorer, an online XBRL tool that demonstrates the system's potential for interactive research and graphics.

According to proponents of the program, another reason the financial world needs XBRL is that the SEC's current database, Edgar, is outmoded. "Edgar is essentially a document collection system," noted Christopher Whalen, co-founder and managing director of Institutional Risk Analytics in Croton-on-Hudson, N.Y. "It doesn't read the documents, and it doesn't validate them. It just collects them, assigns them a unique ID number and off they go."

"With XBRL, what the SEC needs to do is migrate this fairly ancient system over to something that is far more data-centric, as opposed to document-centric," Whalen continued. "That probably includes some level of validation. In other words, when the document hits, they are going to have to look at it and ask, 'Okay, did you follow the rules for the tagging?'"

Building Momentum

The advisory panel's comments aside, XBRL appears to be steaming ahead. Recent milestones on the road to adoption include:

• September 2007 - Cox and the XBRL U.S. project team announce the creation of data tags for all U.S. generally accepted accounting principles (GAAP).
• October 2007 - the commission creates the Office of Interactive Disclosure.
• December 2007 - the XBRL U.S. team releases the first taxonomies for U.S. GAAP, the all-important dictionary of tags.
• February 2008 - the second draft of the GAAP taxonomies is released.
• April 2008 - the comment period for XBRL draws to a close.

With the comment period over, a preliminary ruling is expected soon, followed by a final ruling in autumn. If all goes as scheduled, this ruling will require the top 500 companies by market capitalization to file their 2008 annual reports through XBRL.
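To make the tagging concept concrete, the short Python sketch below builds a single XBRL-style revenue fact with the standard xml.etree.ElementTree module and then reads the value back by tag, the way a downstream application might. It is only an illustration of the idea described above, not the SEC's or XBRL US's own tooling; the filer identifier, the taxonomy namespace URI and the amount are all invented for the example.

```python
import xml.etree.ElementTree as ET

# Namespaces: xbrli is the XBRL instance namespace; the us-gaap prefix
# stands in for the taxonomy that gives the tag its meaning.
XBRLI = "http://www.xbrl.org/2003/instance"
USGAAP = "http://fasb.org/us-gaap/2008-03-31"  # illustrative taxonomy URI

ET.register_namespace("xbrli", XBRLI)
ET.register_namespace("us-gaap", USGAAP)

root = ET.Element(f"{{{XBRLI}}}xbrl")

# A context says whose number this is and for what period.
context = ET.SubElement(root, f"{{{XBRLI}}}context", id="FY2008")
entity = ET.SubElement(context, f"{{{XBRLI}}}entity")
ET.SubElement(entity, f"{{{XBRLI}}}identifier",
              scheme="http://www.sec.gov/CIK").text = "0000000000"  # hypothetical filer
period = ET.SubElement(context, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}startDate").text = "2008-01-01"
ET.SubElement(period, f"{{{XBRLI}}}endDate").text = "2008-12-31"

# A unit says what the number is measured in.
unit = ET.SubElement(root, f"{{{XBRLI}}}unit", id="USD")
ET.SubElement(unit, f"{{{XBRLI}}}measure").text = "iso4217:USD"

# The fact itself: a revenue figure tagged with a taxonomy-defined element.
fact = ET.SubElement(root, f"{{{USGAAP}}}Revenues",
                     contextRef="FY2008", unitRef="USD", decimals="-6")
fact.text = "1500000000"  # invented amount

# A consuming application can now locate the value by tag, not by position
# on a printed page.
doc = ET.fromstring(ET.tostring(root))
revenues = doc.find(f"{{{USGAAP}}}Revenues")
print(revenues.text, revenues.get("contextRef"), revenues.get("unitRef"))
```

In a real filing the contextRef and unitRef attributes point at much richer context and unit definitions, and the element name comes from the official U.S. GAAP taxonomy rather than the illustrative namespace shown here; the point is simply that every figure carries its own machine-readable meaning.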
Some of the public companies and other filers that anticipate someday having to code their financial statements in XBRL are not so pleased by the prospect. Reaction to the project can be roughly divided this way: favorable among analysts, numbers-crunchers and smaller investors; less favorable among medium-sized and small public companies, mutual funds and others who see an undefined expense looming for little or no gain. In the financial sector, the conversion is eagerly anticipated by medium- and small-sized shops and less eagerly by larger institutions, which already may have built an XBRL-like database on their own dime.

Still, an SEC-sanctioned XBRL data vault over time would be sheer heaven for most analysts. "The long-term potential is that it is definitely going to be a benefit for the analyst community," said Glenn Doggett, CFA and policy analyst for financial reporting at the Charlottesville, Va.-based CFA Institute Centre for Financial Market Integrity. "This is especially true for small and mid-tier investment analysts who, when you go to visit them, still have a stack of 10-Qs and 10-Ks on their desks. They're really going to be the first beneficiaries of that XBRL framework."

For analysts who work at the big investment banks, however, the response may be a bit more muted. "Whether it is on the buy side or the sell side, they already are subscribers to services that provide much of the same information to them," Doggett noted. "For these players, the change to XBRL is not going to be as much a question of how they operate as it is providing them information faster and with less potential for error."

Why reduced error? "Right now, third-party databases are transcribing numbers, whereas with XBRL analysts will be getting company-identified values with company-identified tags," Doggett explained. "As a result, you have a very one-to-one communication between what the company says and what every investment analyst gets."

An Issue of Politics

No one doubts XBRL can work because it already does; the real issue is politics. "The challenge for the SEC is not the technical challenge related to XBRL," Whalen noted. "It is more, how do you align this technology with the business rules and legal responsibilities of the SEC and do it in such a way that you don't piss off all of the filers?"

Whalen is among those who believe a 2008 rulemaking deadline might be pushing it. "In theory, they want to have a rule ready for the commission to consider in September that would set a timetable for adoption," he said. "The reality is that there is a lot of work to be done between now and then, and I'm not sure that they are going to have enough time to get everything aligned correctly in order to hit that deadline."

Still, the 2008 deadline is important to some of the program's key players, namely SEC Chairman Cox. "Let's face it, Chris Cox is done at the end of this year," Whalen said. "One way or the other, you are going to see a new SEC chairman, and I don't know whether or not the future leadership of that agency is going to support something like XBRL as strongly as he has."

Meanwhile, the SEC's Blaszkowsky argued that, at least for larger corporations, the transition to XBRL will not be so difficult. "This is not just a document or a bunch of linear or analog data that happens to be converted to a digital form, this is inherently digital data that can be found and applied across other databases," he contended.
"As such, it is inherently digital, it is inherently tagged and it is inherently available for the kind of constructive engagement that enterprise data management systems are developed for."

The main point, according to Blaszkowsky, is that universal adoption of XBRL is inevitable. "This has its own compelling internal logic," he said. "The investment world and the corporate world have worries and concerns and some of them are very legitimate ones, but they will find that those concerns are unwarranted or exaggerated and that the benefits are real. They will want to see XBRL implemented."

(Figure: Some of the interactive research and graphics available through Financial Explorer.)
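As a closing illustration of the analyst-side benefit Doggett describes, the hedged Python sketch below shows how a consumer might pull the same company-identified tag out of several XBRL filings and line the values up without re-keying anything. The two "filings," the company names and the figures are invented for the example; real instance documents carry many contexts, units and facts.

```python
import xml.etree.ElementTree as ET

USGAAP = "http://fasb.org/us-gaap/2008-03-31"  # illustrative taxonomy URI

# Two hypothetical filings, each tagging Revenues the same way, so the
# consumer never transcribes numbers from a printed 10-K.
filings = {
    "Company A": f'<f xmlns:us-gaap="{USGAAP}">'
                 '<us-gaap:Revenues contextRef="FY2008" unitRef="USD">120000000</us-gaap:Revenues></f>',
    "Company B": f'<f xmlns:us-gaap="{USGAAP}">'
                 '<us-gaap:Revenues contextRef="FY2008" unitRef="USD">95000000</us-gaap:Revenues></f>',
}

def revenue(xml_text: str) -> int:
    """Return the company-identified Revenues value from a filing fragment."""
    fact = ET.fromstring(xml_text).find(f"{{{USGAAP}}}Revenues")
    return int(fact.text)

for company, doc in filings.items():
    print(f"{company}: {revenue(doc):,} USD (FY2008)")
```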
One Step Ahead in OTC Derivatives
In an industry facing calls for improvements, The Depository Trust & Clearing Corp already is making them
By Gregory Morris

MARCH WAS THE MONTH for getting in step. On the 13th of that month, Treasury Secretary Henry Paulson, Jr. set the pace in his remarks on recommendations from the President's Working Group (PWG) on Financial Markets. He cited a number of the working group's key findings and specifically called for a joint industry response in several financial sectors, including over-the-counter (OTC) derivatives.

Just two weeks later, almost two dozen financial institutions and trade associations sent an open letter to the President of the New York Federal Reserve, Timothy Geithner. That letter cited progress to date and underscored the financial community's support for further improvements. The NY Fed replied the same day with encouragement and suggested several near-term goals.

With all the mutual support and affirmation carrying on in the foreground, it was easy to miss a groundbreaking initiative occurring in the background. Demonstrating that some segments of the market are already on the case, The Depository Trust & Clearing Corp (DTCC), through its Trade Information Warehouse, completed the first automated processing of a credit event for a Canadian printing firm, complying with protocols issued by the International Swaps and Derivatives Association (ISDA). With this first automated credit event now concluded, DTCC is focused on enhancing this functionality of the Warehouse in preparation for future events. Some of these priorities include allowing for automatic adherence by counterparties and adding index tranches to the products the Warehouse supports for credit events.

A Track Record of Progress

DTCC first launched Deriv/SERV, its automated matching and confirmation platform for OTC derivatives, in late 2003 to help the derivatives community address the operational challenges it faced in a market growing at breakneck speed. The service has been instrumental in allowing market participants to meet their commitment to global regulators to strengthen their infrastructure by increasing the automated processing of OTC derivatives transactions.

Today, more than 95% of credit derivatives transactions are electronically matched and confirmed on the Deriv/SERV platform; a simplified sketch of this kind of matching appears below. Transaction volume for all OTC derivatives products jumped 123% to 5.9 million transactions last year, up from 2.6 million the previous year. More than 1,100 global dealers, asset managers, hedge funds and other end-users in 31 countries automate their OTC derivatives transactions through Deriv/SERV, with more being added each week.

To further support derivatives trading, DTCC launched its Trade Information Warehouse, the first automated global repository for OTC derivatives, in November 2006. Part of DTCC Deriv/SERV's family of automated post-trade processing services for the OTC derivatives community, the Warehouse provides an automated environment where contracts can be tracked and serviced over their lifecycle. Last year, about three million contracts were recorded in the Warehouse, with an average of an additional 10,000 new contracts now being added daily.
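Deriv/SERV's actual matching engine is proprietary; the Python sketch below only illustrates the general idea of electronic matching and confirmation referred to above: each counterparty submits its own record of the trade, the two records are paired on a common reference and compared field by field, and any mismatches are routed back for repair. The field names, parties and trade details are invented for the example.

```python
from dataclasses import dataclass, asdict

# Illustrative economic fields both counterparties would report for a
# credit derivative trade; real confirmation records carry many more.
@dataclass(frozen=True)
class TradeRecord:
    trade_ref: str          # common reference both sides quote
    buyer: str
    seller: str
    reference_entity: str
    notional: float
    currency: str
    effective_date: str

def match(submission_a: TradeRecord, submission_b: TradeRecord) -> list[str]:
    """Return the fields on which the two submissions disagree.

    An empty list means the trade is matched and could be confirmed
    automatically; any entries would be sent back for repair.
    """
    a, b = asdict(submission_a), asdict(submission_b)
    return [field for field in a if a[field] != b[field]]

# One side's record and the counterparty's record of the same trade.
dealer_side = TradeRecord("CDS-0001", "Fund A", "Dealer B",
                          "Example Corp", 10_000_000, "USD", "2008-03-20")
buy_side = TradeRecord("CDS-0001", "Fund A", "Dealer B",
                       "Example Corp", 10_000_000, "USD", "2008-03-21")

breaks = match(dealer_side, buy_side)
print("matched" if not breaks else f"breaks on: {breaks}")  # breaks on: ['effective_date']
```

In practice the field list is far longer and tolerances, enrichment and workflow routing all sit around this comparison; the point is only that confirmation becomes a data comparison rather than a paper chase.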
Expanding on the Warehouse's functionality, late last year DTCC launched a central settlement service for OTC credit derivatives transactions, in conjunction with CLS Bank International of New York. It is the OTC derivatives industry's only automated solution for calculating, netting and settling payments between counterparties to bilateral contracts. The new service has been designed to enable payments associated with transactions confirmed through Deriv/SERV and residing in the Warehouse's global contract repository to be netted by value date, currency and counterparty. Payments eligible for settlement include initial payments and one-time fees, coupon payments and payments associated with post-trade events. Central settlement greatly reduces operating risks for users by replacing manually processed bilateral payments with automated, netted payments; a simplified sketch of this kind of bilateral netting appears below.

The Warehouse generates bilaterally netted payment instructions and sends them to CLS for settlement. CLS automatically notifies its Settlement Members, who effect settlement through CLS. Reports are generated and delivered to counterparties early in the morning on settlement day.

In the second quarterly central settlement cycle for the new service, on March 20, 2008, the amount of trading obligations requiring financial settlement was reduced by 93%, from a gross of $18 billion in aggregate U.S. dollar terms to $1.2 billion net. Gross settlements by the 15 participating OTC derivatives dealers were consolidated from 400,000 to 200 net settlements. Payments were made in five currencies: the U.S. dollar, euro, British pound, Japanese yen and Swiss franc. Over time, the number of currencies in which payments can be made will be expanded from the initial five.

Ahead of Regulators' Recommendations

"Our initial offering for automated processing of OTC derivatives products began to take form in 2003 and went live in 2004, prior to the earliest regulatory calls to address the deficiencies in the system," according to Frank De Maria, managing director and chief operating officer of DTCC's Deriv/SERV. "Since then, we have increased our service offering and are working with both buy-side and sell-side counterparties."

Indeed, DTCC's products and services seem to complement Paulson's March 13th remarks. "We need a dedicated industry cooperative," he said at the time. "Market volume and instrument complexity call for a clear, functional, well-designed infrastructure that can meet the needs of the OTC derivatives markets in the years ahead."

Paulson further commented that such an industry cooperative "must capture all significant processing events over the entire lifecycle of trades. It must have the capability to accommodate all major asset classes and product types. It must be operationally reliable and scalable and use automation to promote standardization that will create efficiency and moderate excessive complexity." Paulson noted that the PWG specifies that the infrastructure must have a flexible and open architecture for inter-operability, upgrades and improvements.
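The bilateral netting behind those settlement figures can be sketched in a few lines. This is not DTCC's or CLS Bank's implementation; it is a minimal Python illustration, with invented payments, of collapsing gross payment obligations into one net instruction per counterparty pair, currency and value date, which are the dimensions described above.

```python
from collections import defaultdict

def net_bilaterally(payments):
    """Collapse gross payments into one net obligation per counterparty
    pair, currency and value date."""
    net = defaultdict(float)
    for payer, receiver, ccy, value_date, amount in payments:
        pair = tuple(sorted((payer, receiver)))   # canonical ordering of the pair
        sign = 1 if payer == pair[0] else -1      # + means pair[0] owes pair[1]
        net[(pair, ccy, value_date)] += sign * amount

    instructions = []
    for (pair, ccy, value_date), amount in net.items():
        if amount == 0:
            continue
        payer, receiver = pair if amount > 0 else pair[::-1]
        instructions.append((payer, receiver, ccy, value_date, abs(amount)))
    return instructions

# Invented gross payments: (payer, receiver, currency, value date, amount).
payments = [
    ("Dealer A", "Dealer B", "USD", "2008-03-20", 5_000_000),
    ("Dealer B", "Dealer A", "USD", "2008-03-20", 4_200_000),
    ("Dealer A", "Dealer B", "EUR", "2008-03-20", 1_000_000),
    ("Dealer B", "Dealer A", "USD", "2008-03-20", 1_500_000),
]

# The three USD legs collapse into a single 700,000 payment from
# Dealer B to Dealer A; the EUR leg is left as a single net instruction.
for instruction in net_bilaterally(payments):
    print(instruction)
```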
"The facility also should enhance counterparty risk management through netting and collateral agreements by promoting portfolio reconciliation and accurate valuation of trades," he added, urging the industry to "incorporate, without delay, cash settlement protocol into standard documentation."

Integrating the Infrastructure

As with the overall thrust of the industry initiative to make the OTC derivatives market more transparent and efficient, De Maria said Deriv/SERV's expanded capabilities respond to Paulson and the PWG's recommendation to develop a longer-term plan for an integrated operational infrastructure in the OTC derivatives market.

The PWG calls for maximizing "the efficiencies obtainable from automation and electronic platforms by promoting standardization and interoperability of infrastructure components." It also urges participants to enhance their ability to "manage counterparty risk through netting and collateral agreements by promoting portfolio reconciliation and accurate valuation of trades."

De Maria noted that these initiatives are already part of the Warehouse's daily process. "Maintaining the most up-to-date information on trade details in one central portal addresses the challenges participants face in keeping their collective
deal books in synch," he said. Because the Warehouse enables each participant to see the positions it holds with its counterparties, there is more transparency and portfolio reconciliation is much more seamless.

"The first step is efficient and timely reconciliation among counterparties," said De Maria. "That enables them to terminate, assign and amend positions as the front office sees fit and do so in a controlled manner."

In terms of providing an integrated infrastructure that encompasses the buy side as well as the dealer community, DTCC's Deriv/SERV offers it. "Well over 1,000 of our customers represent buy-side firms," De Maria said. "We are an industry-owned organization whose policy and priorities are set in conjunction with our Operations Management Group, which includes representation from both dealers and buy-side firms. This helps us build consensus on key initiatives that reflect the interests of a wide range of industry members."

To ensure broad participation in DTCC's derivatives services, it was critical that Deriv/SERV's matching and confirmation service and the Warehouse had a full spectrum of interface capabilities. "It can be used by the most technologically sophisticated firms, as well as by those who do not have as robust an infrastructure," De Maria said. It is also important to note that buy-side firms are charged no fees to use the service.

And what about risk mitigation? Risk awareness currently is a very hot topic, De Maria noted. "The OTC derivatives market has all three major forms of risk: market, credit and operational. In a philosophical sense, the market risks and the credit risks are what bring the business into being. But wasting money on operational risks benefits no one. If you increase automation, you both increase efficiency and reduce risk," he said.

A New Focus on Novation

The major new emphasis for this year is on novation. Novation Consent, a new service launched earlier this year, is intended to automate the email process that takes place between counterparties to assignment transactions. Provided through the Trade Information Warehouse, Novation Consent automates the request, approval and notification procedures among the three trading parties involved in an OTC credit derivative contract assignment, as stipulated by ISDA in its Novation Protocol.

Under the ISDA Novation Protocol, when a party to an OTC derivative transaction wishes to exit that contract by assigning its position to a third party, that party - the transferor - must notify the remaining party and the entering party - the transferee - and seek permission for the assignment from the remaining party; this three-party sequence is sketched below.

"Deriv/SERV worked with the OTC derivatives industry to build an automated tool for the processing of novations fully compliant with ISDA's novation protocol," De Maria said. "We have designed Novation Consent to deliver the features trading parties have been seeking in terms of speed, efficiency and inter-operability across platforms." Novation Consent streamlines assignment processing by allowing firms to consolidate consent messages. Furthermore, it leverages the Warehouse's power as a global repository of confirmed OTC credit derivative transactions by retrieving trade data from the Warehouse and enabling users to submit assignments to Deriv/SERV's Matching and Confirmation service.
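The three-party consent sequence required by the ISDA Novation Protocol, as summarized above, can be sketched as a small workflow object. The Python sketch below is not Deriv/SERV's Novation Consent service; the party names, statuses and notification messages are invented, and a production service would add matching, deadlines and audit trails around this core.

```python
from dataclasses import dataclass, field

@dataclass
class NovationRequest:
    """One assignment of an OTC derivative position to a third party."""
    contract_ref: str
    transferor: str       # party stepping out of the contract
    transferee: str       # party stepping in
    remaining_party: str  # counterparty whose consent is required
    status: str = "REQUESTED"
    notifications: list = field(default_factory=list)

    def request_consent(self):
        # The transferor must notify both other parties and seek the
        # remaining party's permission, per the protocol described above.
        self.notifications.append((self.remaining_party, "consent requested"))
        self.notifications.append((self.transferee, "novation proposed"))

    def record_consent(self, granted: bool):
        # Whatever the remaining party decides, all three parties are told.
        self.status = "CONSENTED" if granted else "DECLINED"
        for party in (self.transferor, self.transferee, self.remaining_party):
            self.notifications.append((party, f"novation {self.status.lower()}"))

req = NovationRequest("CDS-0001", transferor="Fund A",
                      transferee="Fund C", remaining_party="Dealer B")
req.request_consent()
req.record_consent(granted=True)

print(req.status)            # CONSENTED
for note in req.notifications:
    print(note)
```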
With the new novation service, as with all of its products, De Maria noted, DTCC is careful not to let the drive toward standardization and efficiency impinge on the flexibility that makes the OTC market so vibrant. "Dealers and end-users are working through ISDA standard master agreements," he said. "We will continue to see standardization in legal documents, allowing all participants to speak the same language, but there is still a great deal of flexibility. Clearly, our industry has been very proactive and has had great foresight."

About the Company: Depository Trust & Clearing Corp (DTCC), through its subsidiaries, provides clearing, settlement and information services for equities, corporate and municipal bonds, government and mortgage-backed securities, money market instruments and OTC derivatives. The firm also is a leading processor of mutual fund and insurance transactions, linking funds and carriers with their distribution networks. DTCC's depository provides custody and asset services for about 3.5 million securities issues from the U.S. and 110 other countries and territories worth more than $40 trillion. In 2007, DTCC cleared and settled more than $1.86 quadrillion in securities transactions.

DTCC's OTC derivatives services are provided by its wholly owned subsidiary, DTCC Deriv/SERV. As managing director and chief operating officer of that subsidiary, Frank De Maria is responsible for the day-to-day operations of DTCC's automated services for the OTC derivatives market. He oversees the company's matching and confirmation system for credit derivatives and leads a cross-organizational team in charge of supporting and developing the Trade Information Warehouse. For more information, please visit www.dtcc.com.

(Figure: Growth in OTC credit derivatives volume, notional amounts in trillions of U.S. dollars: 2003: 3.8; 2004: 8.4; 2005: 17.1; 2006: 34.4. Source: ISDA Market Survey.)