Details
When market and reference data are used as inputs to valuations and risk calculations, accurate and validated pricing and reference data are paramount. This webinar covers several use cases focusing on innovative processes for managing that data, including validations using statistical techniques; transformation of market data via derivations, quantitative techniques, and proxy methodologies; the linking of market and reference data in the underlying data model; and the validation of instruments that belong to curves and surfaces. Real-world, in-production use cases are referenced for illustration.
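As an illustration of the statistical validation techniques mentioned above, a minimal sketch (not from the webinar; all names and thresholds are hypothetical) might flag an incoming price whose return deviates too far from recent history:

```python
import statistics

def validate_price(history, new_price, z_threshold=3.0):
    """Statistical plausibility check for an incoming market quote.

    history: list of recent prices (oldest first); new_price: candidate quote.
    Computes the z-score of the new return against historical returns and
    returns True if the quote passes, False if it should be flagged for review.
    The 3-sigma threshold is an illustrative default, not a recommendation.
    """
    # Simple percentage returns between consecutive historical prices.
    returns = [(b - a) / a for a, b in zip(history, history[1:])]
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    new_return = (new_price - history[-1]) / history[-1]
    if sigma == 0:
        # Degenerate history: accept only a matching return.
        return new_return == mu
    return abs(new_return - mu) / sigma <= z_threshold
```

A production validation would typically layer several such checks (stale-price detection, cross-source comparison, bid/ask sanity) with full audit trails, but the z-score test conveys the basic idea.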
Speakers
Post-event summary
The webinar titled “Enabling Innovation: Solutions for Managing and Transforming Market and Reference Data,” hosted by EDM Council and GoldenSource, discussed the transformation of market data management and the implications for data validation and standardization in financial services. The conversation was led by experts in the industry:
- Charlie Browne, VP, Head of Market Data Solutions, GoldenSource
- Dr. Wolfgang Kugler, Team Leader Market Data, RSU GmbH &amp; Co. KG
- Moderator: Mike Meriton, Co-founder, EDM Council
The discussion revolved around the necessity of solid foundations in reference data and centralized data validations to meet regulatory requirements and improve operational efficiency. Charlie emphasized the critical role of standardizing data to ensure accurate valuations and risk assessments, stating, “You need a centralized reference data foundation because that reference data has an important impact on the valuations and the risk. If you get your spot lag or your day count basis wrong, then your valuation will be wrong.”
Key points included the building-block approach to data management, which entails establishing strong reference data foundations, centralizing validation processes to meet audit and regulatory standards, and applying advanced techniques borrowed from quantitative finance to handle illiquid market data and enhance data quality. The speakers also addressed the integration of predictive analytics and AI into data systems, stressing the importance of transparency and auditability in these technologies to satisfy regulatory scrutiny.
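One common quantitative technique for illiquid instruments, of the kind alluded to above, is proxying: deriving a stand-in price from a correlated liquid instrument. A minimal sketch (illustrative only; names, beta handling, and inputs are assumptions, not the speakers' method):

```python
def proxy_fill(last_price, proxy_prev, proxy_curr, beta=1.0):
    """Derive a stand-in price for an illiquid instrument from a liquid proxy.

    last_price: last observed price of the illiquid instrument.
    proxy_prev, proxy_curr: the liquid proxy's previous and current prices.
    beta: sensitivity of the illiquid instrument to the proxy (assumed known,
    e.g. from a historical regression).

    Scales the proxy's observed return by beta and applies it to the
    illiquid instrument's last known price.
    """
    proxy_return = (proxy_curr - proxy_prev) / proxy_prev
    return last_price * (1.0 + beta * proxy_return)
```

In practice a proxied value would be tagged as derived rather than observed, so that downstream valuation and audit processes can distinguish it from genuine market data.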
Wolfgang highlighted both the challenges and the potential of advanced data analytics for improving decision-making within financial institutions while maintaining compliance with regulatory frameworks. The webinar concluded with a discussion of the practical aspects of implementing these systems, particularly the shift toward cloud solutions and the importance of starting with foundational data practices to build a robust business case for further investment in data management technologies. The insights provided a comprehensive overview of the evolving landscape of enterprise data management in financial services, underscoring the pivotal role of data accuracy and standardization in driving business value and regulatory compliance.