Data Management Benchmarking

The Council is conducting a data management benchmarking study of the global financial information industry on behalf of our membership. This year’s benchmarking is derived directly from the Data Management Capability Assessment Model (DCAM) and has been designed to ensure that all participants have a consistent view of where we collectively stand against the requirements for sustainable data management.
The 2015 benchmarking initiative is being conducted in partnership with Sapient Global Markets, which has worked jointly with the Council to synthesize the full scope of DCAM into this study.
This year’s benchmark will be conducted via Pellustro, a web-based assessment platform designed to support DCAM. The Pellustro tool enables aggregation of responses for analysis and provides a mechanism for firms to evaluate their own DCAM assessments against the industry-wide benchmark.
The 2015 benchmarking initiative will be the first study of data management capability conducted across the financial industry. Rather than measuring only how firms are setting up their data management programs, this study measures where firms are in their implementation. The approach features twenty carefully structured questions covering data management strategy, funding mechanisms, end-to-end lineage, operating models, governance approaches, data architecture and data quality practices, providing a baseline for consistent measurement of data management program effectiveness. The subjects are based on the EDM Council’s rules of effective data management.

EDM Council Rules of Data Management

  1. Confidence Rule: We have an endorsed data management strategy that is meaningful to business users
  2. Owner Rule: We have a senior executive (with authority) in charge of the data management program
  3. Alignment Rule: Stakeholders understand (and buy into) the need for the data management program
  4. Communication Rule: We do a good job communicating the value proposition and operational implications of the data management program
  5. Business Case Rule: The business case and funding model for the data management program is established and operational
  6. Metrics Rule: We do a good job of measuring the costs and benefits of the data management program
  7. Policy Rule: The data management program has the authority to enforce adherence to policy
  8. Resources Rule: The data management program has enough resources to be sustainable
  9. Accountability Rule: The data management governance and accountability structures are operational
  10. Responsibility Rule: The roles and responsibilities of data “owners” and “stewards” are assigned
  11. Enforcement Rule: Data policies and standards are implemented and enforced
  12. Ontology Rule: The business meaning of data is defined, governed and harmonized across repositories
  13. Critical Data Rule: Critical data elements are identified, verified and managed
  14. Data Domains Rule: Logical categories of data have been defined and catalogued
  15. Capability Rule: The data management program is aligned with technical and operational capabilities
  16. Profiling Rule: Data in existing repositories has been profiled, analyzed and graded
  17. DQ Control Rule: Data quality control procedures, business rules and measurement criteria are operational
  18. Root Cause Rule: The root causes of data quality problems are identified and corrected
  19. Lineage Rule: End-to-end data lineage has been identified across the full data lifecycle
  20. Ecosystem Rule: Data management collaborates with existing enterprise control functions