Régis Deshayes, Head of Data Quality, and Stefan Hilbert, Data Scientist at Zeiss Group, speak with Derek Strauss, Chairman of Gavroshe and CDO Magazine Editorial Board Member, about implementing data quality programs and in-house toolkits.
Deshayes begins the conversation by introducing Zeiss as a 177-year-old German manufacturing company that specializes in industrial metrology, medical device manufacturing, vision care, and semiconductor manufacturing technology.
He states that the massive amounts of data the company amasses offer opportunities for analysis and exploration, and that a proper data quality program is essential to make the most of them.
Further, Deshayes shares that the organization carried out an analysis to identify vendor tools already in use, data quality enthusiasts, early adopters, and pain points. A multi-layer approach was then adopted based on three pillars: tools, people, and processes.
Shedding light on the tools aspect, he mentions that the organization developed its own data quality toolkit. Starting from an existing MVP, the company offered its services to digital initiatives across the organization.
Moving on to the people aspect, he discusses building arguments on the value and cost of data quality and creating educational material on the company’s e-learning platform. The two courses he mentions are:
Proactive guide to data quality
Data quality dashboard walkthrough for data owners and data stewards
Additionally, the company started a data quality community of practice and now organizes an online event called Data Quality Day to raise data quality awareness and interest.
About processes, Deshayes discusses the initiative charter under the enterprise data governance program, which is meant to anchor data quality reporting and monitoring across the organization. He also notes that data stewards were nominated for a major enterprise-wide project.
When asked for his opinion on in-house toolkits, Deshayes says that for an organization that needs to connect more than 1,000 active source systems in the cloud and on premises, it is critical to be able to measure data quality consistently.
Explaining further, he says that an analysis found that two vendor tools did not define the same data quality dimensions identically. The organization therefore needed clear, common definitions of dimensions and aggregation methods for reporting.
Furthermore, there was no industry-standard set of data quality dimensions, says Deshayes. Fortunately, he continues, the organization uses as a reference a study from DAMA NL that compiles dimension definitions from authoritative sources.
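The interview does not detail how such dimensions are measured at Zeiss; purely as an illustration, two commonly cited DAMA dimensions, completeness and uniqueness, could be computed over plain records like this (the field names and sample data are hypothetical):

```python
# Illustrative only: two commonly cited DAMA data quality dimensions,
# completeness (share of non-missing values) and uniqueness (share of
# distinct values), computed on a plain list of record dicts.

def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records) if records else 0.0

def uniqueness(records, field):
    """Fraction of non-missing values of `field` that are distinct."""
    values = [r[field] for r in records if r.get(field) not in (None, "")]
    return len(set(values)) / len(values) if values else 0.0

customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "a@example.com"},
    {"id": 4, "email": "d@example.com"},
]

print(completeness(customers, "email"))  # 0.75 (3 of 4 filled)
print(uniqueness(customers, "email"))    # 2 distinct of 3 values
```

Agreeing on such formulas per dimension is exactly what a common reference like the DAMA NL study makes possible across tools.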
In continuation, Deshayes states that common dimension definitions, combined with the in-house toolkit, enable enterprise-wide definitions of aggregation methods. He then mentions adopting the concept of a data quality score, which was integrated into the data quality dashboard.
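The article does not describe Zeiss’ actual scoring formula; a minimal sketch of one plausible aggregation method, a weighted average of per-dimension scores, might look like this (the dimension names and equal default weights are assumptions):

```python
# Hypothetical sketch: aggregate per-dimension scores (0..1) into a
# single data quality score (0..100) via a weighted average.
# Dimension names and weights are illustrative assumptions.

def quality_score(dimension_scores, weights=None):
    """Weighted average of dimension scores, scaled to 0..100."""
    if weights is None:
        weights = {d: 1.0 for d in dimension_scores}  # equal weighting
    total_weight = sum(weights[d] for d in dimension_scores)
    weighted = sum(s * weights[d] for d, s in dimension_scores.items())
    return round(100 * weighted / total_weight, 1)

scores = {"completeness": 0.95, "uniqueness": 0.88, "validity": 0.99}
print(quality_score(scores))  # 94.0
```

A single number like this is what makes dashboard-level comparison across many source systems tractable.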
Also, the in-house toolkit keeps the organization out of a vendor lock-in situation, and developing expertise in-house reflects Zeiss’ craftsmanship, he adds.
Highlighting the data quality toolkit, Hilbert discusses its technical advantages. He maintains that building it in-house allows the creation of new rules specific to particular business requirements.
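The toolkit’s API is not shown in the interview; as a hypothetical sketch, one common way to support business-specific rules is a small rule registry that teams can extend (all names here, including the serial-number check, are invented for illustration):

```python
# Sketch of a rule registry for business-specific data quality checks.
# The decorator, rule names, and sample rule are illustrative
# assumptions, not Zeiss' actual toolkit API.

RULES = {}

def rule(name):
    """Decorator that registers a named data quality rule."""
    def register(fn):
        RULES[name] = fn
        return fn
    return register

@rule("serial_number_format")
def serial_number_format(record):
    # Hypothetical business requirement: serials look like "ZX-12345".
    sn = record.get("serial", "")
    return sn.startswith("ZX-") and sn[3:].isdigit()

def evaluate(records):
    """Run every registered rule; return the pass rate per rule."""
    return {
        name: sum(1 for r in records if check(r)) / len(records)
        for name, check in RULES.items()
    }

batch = [{"serial": "ZX-10001"}, {"serial": "10002"}, {"serial": "ZX-10003"}]
print(evaluate(batch))  # pass rate for serial_number_format: 2 of 3
```

The advantage Hilbert points to is that such rules can encode requirements no off-the-shelf dimension covers.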
In conclusion, Hilbert shares that the company wants the toolkit to work on both small and large data sets and to leverage it for monitoring, filtering, and improving data. The toolkit would also produce result outputs for the various channels used for data catalogs, ad hoc dashboards, and other data consumables.
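The interview does not show the result format; one illustrative way to feed several output channels is a channel-agnostic result record that a catalog or a dashboard could each consume (the field names and dataset identifier are assumptions for the sketch):

```python
# Illustrative: serialize one quality measurement into a common record
# that downstream channels (catalog, dashboard) could consume.
# Field names and the dataset name are invented for this sketch.

import json
from datetime import datetime, timezone

def to_result_record(dataset, dimension, score):
    """Build a channel-agnostic result record for one measurement."""
    return {
        "dataset": dataset,
        "dimension": dimension,
        "score": score,
        "measured_at": datetime.now(timezone.utc).isoformat(timespec="seconds"),
    }

record = to_result_record("crm.customers", "completeness", 0.95)
print(json.dumps(record))  # one JSON shape, many consumers
```

Keeping the record shape independent of any one consumer is what lets the same toolkit serve catalogs, dashboards, and other data consumables.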
CDO Magazine appreciates Régis Deshayes and Stefan Hilbert for sharing their insights and data success stories with our global community.