Data Management: The Heart of Financial Reform

While the Dodd-Frank Wall Street Reform and Consumer Protection Act, signed into law on July 21, 2010, by President Barack Obama, is a multifaceted regulation that reaches virtually every corner of the financial services industry in the United States, its implications for data management amount to a mandate: improve infrastructure or risk being unable to meet the evolving requirements set forth by regulators.

The Dodd-Frank legislation establishes the Office of Financial Research (OFR), a new office within the U.S. Department of the Treasury that is tasked with gathering information on potential risks and threats within the nation’s financial industry and reporting it to lawmakers. To accomplish this, the OFR’s director can use his or her subpoena power to gather data from any financial institution.

Simply put, says Michael Atkin, director of the Enterprise Data Management (EDM) Council, a nonprofit trade association focused on managing and leveraging data, the regulation gives banks’ corporate leadership a new opportunity to examine the growing problem of managing skyrocketing amounts of data, and finally to budget appropriately to meet the challenge. “It kicked the practice of data management into high gear,” Atkin says. “We’re now set up for addressing the data dilemma that we have because we finally have a reason that is not subject to the whim of a business case. It is a regulatory requirement.”

The OFR director, who has not yet been appointed, will make his or her first report to Congress in 2012, adds Atkin. But that initial report, he notes, likely will be more an overview of the state of the industry than a detailed analysis of its data, giving financial institutions a window of several years to prepare for potential requirements. “The implications from an infrastructure perspective are about getting the core building blocks of risk management in place,” Atkin relates.

Changing Priorities

Financial institutions now will be required to operate with greater transparency, points out Frank Fanzilli, chairman of the RainStor Advisory Council, which provides guidance to the San Francisco-based data retention provider, and former managing director and global CIO of Credit Suisse First Boston. “The Dodd-Frank reform is just starting to take effect, and we will likely see many more IT changes evolve over the next few years to accommodate these external demands,” he says.

“CIOs have been heavily focused on deploying solutions that focus on low-latency applications and web-enabled self-service capabilities, which have served the business well,” Fanzilli observes. “Attention now needs to turn to the improvement of infrastructure and data management systems to provide the transparency that these reforms have mandated. If the IT division cannot efficiently store and retain their most critical asset — data — it will become extremely difficult to stay compliant, let alone stay in business.”

Jane Griffin, an Atlanta-based principal with Deloitte Consulting, affirms that data management must become a top priority. “Historically, before regulation, we certainly saw a recognition of a need — a need for cost reduction, a need for better customer management, risk management, financial management,” she says. “Definitely the regulatory oversight and the level of confidence of management and different regulators has brought this to a board-level visibility like it’s never had before.”

As the EDM Council’s Atkin explains, getting data management right starts with establishing quality, consistent data. According to Atkin, there are five types of “building block” data on which regulatory reporting is based: reference data, entity reference data, pricing, positions and transactions, and economic statistics. “The industry is moving toward getting that data structure right,” asserts Atkin, who says the key is developing standards. “We have a fragmented chain of supply without standards,” he explains.
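
As a rough sketch of that taxonomy, the Python snippet below tags records with one of Atkin’s five building-block categories. The class names and fields are invented for illustration and are not drawn from any EDM Council specification.

```python
from dataclasses import dataclass
from enum import Enum


class BuildingBlock(Enum):
    """The five 'building block' data types Atkin describes."""
    REFERENCE = "reference data"
    ENTITY_REFERENCE = "entity reference data"
    PRICING = "pricing"
    POSITIONS_TRANSACTIONS = "positions and transactions"
    ECONOMIC_STATISTICS = "economic statistics"


@dataclass
class DataRecord:
    """A minimal record tagged with the building block it belongs to."""
    block: BuildingBlock
    source_system: str
    payload: dict


# Example: a price observation tagged as pricing data.
tick = DataRecord(
    block=BuildingBlock.PRICING,
    source_system="trading-desk-feed",  # hypothetical source name
    payload={"symbol": "XYZ", "price": 101.25},
)
print(tick.block.value)  # -> pricing
```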

But defining terms and setting standards within which data is classified will be an early challenge for the OFR and financial institutions, says Fred Cohen, VP of the asset management practice at Patni Systems, a Cambridge, Mass.-based IT and business process services provider. “You have to create standards, people have to conform to those standards, and then they need to be able to collect data and build reports off of it,” Cohen comments.
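
Cohen’s three steps (create a standard, conform to it, then report off the collected data) might look something like the following in miniature; the required fields and sample records here are purely hypothetical.

```python
from collections import Counter

# Step 1, hypothetical standard: every record must carry these fields.
REQUIRED_FIELDS = {"entity_id", "instrument_id", "notional", "currency"}


def conforms(record: dict) -> bool:
    """Step 2: check a record against the agreed standard."""
    return REQUIRED_FIELDS <= record.keys()


def build_report(records: list[dict]) -> Counter:
    """Step 3: aggregate only conforming records, e.g. notional by currency."""
    report = Counter()
    for r in records:
        if conforms(r):
            report[r["currency"]] += r["notional"]
    return report


records = [
    {"entity_id": "E1", "instrument_id": "I9", "notional": 5_000_000, "currency": "USD"},
    {"entity_id": "E2", "instrument_id": "I9", "notional": 2_000_000},  # missing currency: nonconforming
]
print(build_report(records))  # Counter({'USD': 5000000})
```

Nonconforming records are simply excluded here; in practice they would be flagged for remediation rather than dropped.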

Doing so is particularly difficult when there are hundreds of vendors that independently acquire, rename and integrate various systems into existing infrastructure, adds the EDM Council’s Atkin. Further creating dissonance in data standardization, he continues, financial institutions often silo data, and individual units may use their own proprietary terms to define it.
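One common remedy for the proprietary-term problem Atkin describes is a translation layer that maps each silo’s field names onto a shared vocabulary, as in this sketch; the silo names and mappings are invented for illustration.

```python
# Hypothetical per-silo mappings from proprietary field names to shared terms.
SILO_MAPPINGS = {
    "equities_desk": {"cpty": "counterparty", "qty": "quantity"},
    "fixed_income": {"counter_party": "counterparty", "units": "quantity"},
}


def normalize(silo: str, record: dict) -> dict:
    """Rename a silo's proprietary fields to the enterprise-standard terms."""
    mapping = SILO_MAPPINGS.get(silo, {})
    return {mapping.get(key, key): value for key, value in record.items()}


print(normalize("equities_desk", {"cpty": "ACME", "qty": 100}))
print(normalize("fixed_income", {"counter_party": "ACME", "units": 250}))
# Both print records keyed by 'counterparty' and 'quantity'.
```
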

New York-based Citibank ($1.9 trillion in assets) is among the financial institutions that have gotten a head start on consolidating and standardizing their data environments. According to Anthony DiSanto, Citi managing director, North American region head, the bank has invested heavily in data consolidation and standardization for the better part of the past decade. “We have worked on a consolidation of data centers over the past several years that is fairly dramatic,” he says. “It’s all engineered around the way we do our work.”

And while it has been a costly project, DiSanto acknowledges, the savings have come through centralization of the bank’s physical footprint, a standardized desktop environment that has all Citi employees working on effectively the same software image, and the adoption of virtualization. “I like to use the tag phrase that ‘It’s better, faster and cheaper’ — in that order,” DiSanto says.

Incentivizing Change

In the case of Citi’s consolidation, virtualization and standardization of its data centers, PCs and phone systems, the initiative was brought on by the bank’s executives, who saw a business need to modernize the way the bank managed its data, according to DiSanto. But Deloitte’s Griffin suggests that not every bank is as forward thinking. “Without the [regulatory] impetus to have the data management architecture and hubs that are needed to manage data across the enterprise — without the external need to consolidate information across product lines, across finance risk and compliance — they would be hard pressed to allocate their budgets,” she says.

But with the pending establishment of the OFR and the specter of having to provide the government with any type of financial data it requests, getting data in order at the enterprise level becomes a higher priority. “The regulation helps put fuel to the fire of something that’s already in motion but probably didn’t have the motivation to change without this catalyst,” Griffin contends.

Banks are concerned about several areas from an implementation standpoint, according to the EDM Council’s Atkin, including the current state and future condition of governance, the operating model for data management, the state of internal standards and the obstacles to adopting those standards, the state of systems/data integration, and data quality gaps. But, he adds, these are things financial institutions have been working to resolve for some time.

“The banks collectively have been doing a great job,” Atkin asserts. “So this is just pushing it up the priority list and pushing it up the recognition list at the top of the house. And all of that is good for the practice of data management.”

And whether financial institutions recognize the need for improved data management or not, the consequences of failing to align with regulation add motivation, Patni Systems’ Cohen adds. “Those people who don’t have their data management world in order,” he says, “it’s going to be an incredibly painful world to be in.”
