EDM Council’s Commentary at the FDTA Forum 2024: Defining Success

Thu, Jun 27, 2024

On Thursday, June 27, 2024, the Data Foundation, in partnership with Donnelley Financial Solutions (DFIN), convened a half-day public forum, FDTA Forum 2024: Defining Success, featuring a broad base of substantive comments and discussions leading into a public comment period.

John Bottega, President of the EDM Council, provided commentary about the U.S. Financial Data Transparency Act (FDTA) on behalf of the EDM Council and the data management community as part of the FDTA Forum’s public commentary session. The full recording is available online, with John Bottega’s portion running from 1:29:48 to 1:35:45. The text of the commentary below was generated from the FDTA Forum Transcript.

EDM Council Commentary by John Bottega

My name is John Bottega. I’m the president of the EDM Council, a non-profit trade association that focuses on best practices in data management. We’ve been collaborating and working with the Data Foundation for many years and I thank you for that collaboration.

The FDTA opens with the following description of its objectives: “A bill to amend securities and banking laws to make the information reported to financial regulatory agencies electronically searchable, to further enable the development of regulatory technologies and artificial intelligence applications, to put the United States on a path towards building a comprehensive Standard Business Reporting program to ultimately harmonize and reduce the private sector’s regulatory compliance burden, while enhancing transparency and accountability.”

For my time this afternoon, I’d like to focus on exactly what this statement means, how we should view its objectives, and how we should measure the success of the FDTA. When we speak of the FDTA, we immediately go to the requirements for standards. And yes, standards are a critical and fundamental component of the law, but I would argue that this is only a fraction of what the law is designed to achieve. Standards, as a foundational component, harmonize reported data, which is critical to reducing the private sector’s regulatory compliance burden while enhancing transparency and accountability. But I will contend that this can’t be achieved without an efficient data management program to support it. You need both: adoption of data standards, implemented through the establishment of a comprehensive data management program. Standards in regulatory reporting are fundamental to improving, and we’ve heard it all today, quality and consistency, minimizing misinterpretations, reducing errors, and bringing efficiency. They also enable the harmonization of data coming from many different sources, which dramatically improves the ability of those receiving the data to perform more accurate, timely, and trusted analytics, and to leverage with confidence the latest technologies in machine learning and artificial intelligence.
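
To make the harmonization point concrete, here is a minimal Python sketch of two hypothetical sources reporting the same instrument under different field names and being mapped onto a common standard schema. All field names, identifiers, and values are invented for illustration; this is not drawn from the FDTA rulemaking itself.

```python
# Hypothetical sketch: two sources report the same instrument under
# different field names; a shared standard makes them comparable.
def to_standard(record: dict, mapping: dict) -> dict:
    """Rename source-specific fields to an agreed standard schema."""
    return {std: record[src] for std, src in mapping.items()}

source_a = {"cusip": "12345ABC9", "notional_usd": 1_000_000}
source_b = {"instrument_id": "12345ABC9", "amount": 1_000_000}

standard_a = to_standard(source_a, {"instrument": "cusip", "notional": "notional_usd"})
standard_b = to_standard(source_b, {"instrument": "instrument_id", "notional": "amount"})

assert standard_a == standard_b  # harmonized records can be compared and aggregated directly
```

Once records share a schema and identifiers, aggregation across reporters becomes a straightforward join rather than a reconciliation project.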

But what happens when we don’t have consistent data? We simply have to look back at the financial crisis. Now let me be clear. The financial crisis was not caused by data. There were many other factors involved and we’ll be talking about that for the next 10 years. But from a data perspective, there are two fundamental issues that hampered the ability of the decision makers to respond effectively to the crisis.

First was the lack of financial instrument standards. As the industry continued to develop and introduce more complex financial instruments, derivatives, CMOs, CMOs squared, our ability to understand the specific structure of these instruments and their relationship to the underlying collateral was lacking. As the Chief Data Officer (CDO) at one of the major financial institutions, I was asked at the time to help identify the bank’s exposure to subprime holdings. But when I asked the desk to describe what a subprime instrument was, they were unable to do so. How, then, can you manage what you can’t describe? The lack of transparency into the underlying collateral meant the industry was unable to see the deterioration of that collateral as bankruptcies and foreclosures dominated the market. The data we needed to determine the health and wellness of these instruments was inconsistent, disparate, and incomplete.

The second, related challenge, which has been mentioned many times, was the lack of a consistent, standard Legal Entity Identifier. As Lehman collapsed, The Street rushed into the offices, and everybody wanted to know what their exposure was to Lehman. But Lehman wasn’t Lehman; Lehman was hundreds of sub-Lehmans, subentities with varying legal obligations. Without identification standards, it was nearly impossible for the regulatory community to aggregate the Lehman positions and exposures coming from all the different financial institutions.
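
To illustrate the aggregation problem described here, below is a hypothetical Python sketch: with a consistent Legal Entity Identifier and a mapping from each subsidiary to its ultimate parent (the kind of relationship data the LEI system publishes today), rolling up exposures reported by different institutions is a simple sum. All LEIs, entities, and amounts are invented for the example.

```python
# Hypothetical illustration of LEI-based exposure aggregation.
from collections import defaultdict

# Map each subsidiary's LEI to its ultimate parent's LEI
# (invented identifiers; real LEI relationship data serves this role).
ultimate_parent = {
    "LEI-SUB-EU": "LEI-PARENT",    # hypothetical European broker-dealer
    "LEI-SUB-US": "LEI-PARENT",    # hypothetical U.S. derivatives entity
    "LEI-SUB-ASIA": "LEI-PARENT",  # hypothetical Asian subsidiary
}

# Exposures reported by different institutions, keyed by counterparty LEI.
reported_exposures = [
    ("LEI-SUB-EU", 120.0),
    ("LEI-SUB-US", 85.5),
    ("LEI-SUB-ASIA", 40.0),
]

# With a consistent identifier, rolling positions up to the parent is trivial.
totals = defaultdict(float)
for lei, amount in reported_exposures:
    totals[ultimate_parent.get(lei, lei)] += amount

print(dict(totals))  # {'LEI-PARENT': 245.5}
```

Without the shared identifier, each of those three positions would have arrived under a different house-specific counterparty name, and the roll-up would have required manual reconciliation.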

There’s no argument that data standards are important. But how do we ensure that data standards are adopted and implemented? Only by enabling the best practices of data management. Data management has significantly improved since the financial crisis. Data management is not one thing; it’s a collaboration and coordination of a number of capabilities. Data strategy, business information architecture, data quality, data governance, technology: all have to work together to manage the information asset. Interestingly, evidence of the importance of data management already exists in the public sector. We just had a 2024 report come out from the SEC, entitled “Semiannual Report to Congress Regarding Public and Internal Use of Machine-Readable Data for Corporate Disclosure.” Nice long title. The report concludes by saying that the FDTA directives coincide with internal Commission and staff efforts to improve the management and use of data across the agency.

In 2022, the Office of Inspector General of the Federal Reserve requested the use of the EDM Council’s framework, DCAM (the Data Management Capability Assessment Model), to assess the data management readiness of the Federal Reserve System. In their concluding report, dated January 18, 2023, they endorsed the use of data management best practices as a way to achieve the mission of the Federal Reserve: to foster the stability, integrity, and efficiency of the nation’s monetary, financial, and payment systems to promote economic performance. The report emphasized the need for establishing data management, understanding the data inventory, establishing best-practice governance policies, and promoting data management training. Without data management principles, implementation of and adherence to standards is just an uphill climb.

So, in conclusion, I hope I’ve been able to bring forward what I’m going to call the “1-2 punch.” Here in this law lies the opportunity to adopt critical data standards, needed to improve quality, transparency, and trust, and, to make it happen, the establishment of best-practice-driven data management.