(US & Canada) | Data Quality Can’t Just Be a Backend Discipline Anymore — BCBS Massachusetts CDAO

Himanshu Arora, Chief Data and Analytics Officer at Blue Cross Blue Shield of Massachusetts, speaks with Sue Pittacora, Chief Strategy Officer at Wavicle Data Solutions, in a video interview about data strategy for AI and GenAI use cases, how culture impacts AI adoption, and investing in data quality beyond databases.

Blue Cross Blue Shield of Massachusetts (BCBSMA) is a state-licensed nonprofit private health insurance company under the Blue Cross Blue Shield Association with headquarters in Boston.

Initiating the conversation, Arora discusses how the approach to data strategy differs between traditional advanced analytics use cases and AI use cases. For data science and advanced analytics, the focus remains on getting clean, cohesive data that can be collated.

For instance, large versus small observation sets, synthetic data, and data enrichment have been focal areas. Arora says that the output from advanced analytics models can be examined through the lens of bias.

With AI and GenAI solutions, by contrast, it is critical to de-bias the data inputs to the model rather than the model outputs. He affirms that these models will run independently from the traditional advanced analytics solutions of today.

Delving further, Arora notes that de-biasing data currently involves manual work and that the organization is looking for automated ways to do it. He adds that building an AI-driven system for external, customer-facing use is challenging.

The challenge is that the AI system needs to capture not just the context of the question being asked but also a fuller understanding of the individual asking it. This means having much more data about an individual than healthcare data alone, including, for instance, preferences around engagement and availability, among other things.

Consequently, the data strategy for AI and generative AI use cases boils down to bringing in all aspects of member data and de-biasing the inputs to the model itself.
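As context for the idea of de-biasing inputs rather than outputs, the sketch below shows one standard pre-processing technique (reweighing), purely as an illustration and not as a description of BCBSMA's approach; the column names "group" and "outcome" are hypothetical stand-ins for a sensitive attribute and a target label.

```python
# Minimal, illustrative sketch of input de-biasing via reweighing:
# rows are weighted so the sensitive attribute and the label become
# statistically independent in the weighted training data.
import pandas as pd

def reweighing_weights(df: pd.DataFrame, sensitive: str, label: str) -> pd.Series:
    """Return per-row weights that balance (sensitive, label) combinations."""
    n = len(df)
    weights = pd.Series(1.0, index=df.index)
    for _, s_grp in df.groupby(sensitive):
        for y_val, sy_grp in s_grp.groupby(label):
            # expected share if the attribute and label were independent
            expected = (len(s_grp) / n) * ((df[label] == y_val).sum() / n)
            observed = len(sy_grp) / n
            weights.loc[sy_grp.index] = expected / observed
    return weights

# Toy, fully synthetic example (no real member data)
data = pd.DataFrame({
    "group":   ["A", "A", "A", "B", "B", "B", "B", "B"],
    "outcome": [1,   0,   0,   1,   1,   1,   0,   1],
})
data["weight"] = reweighing_weights(data, "group", "outcome")
print(data)  # under-represented (group, outcome) combinations get higher weights
```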

Shedding light on the role of culture in adoption, Arora brings up the mass adoption of ChatGPT as an example of general awareness around GenAI. According to him, this makes internal cultural adoption easier.

From a cultural perspective, the sense of being at the forefront of turning knowledge work into automation also sparks inquisitiveness among individuals, says Arora. From understanding its impact on an individual’s knowledge path to learning things differently, the aim is to automate the repeatable, time-consuming tasks of frontline executives.

Continuing, Arora cites benefit inquiries as an example, where the member gets all the information they need and associates can hand-hold members who face challenges. This opens a different avenue for learning new skill sets, which, he says, is another cultural impact.

Furthermore, Arora discusses the need to assess how adoption aligns with the organizational mission, which is to personalize member experiences internally and externally; that alignment helps propel the culture in the same direction.

Once the organization has enabled the “how” for internal teams, the capabilities can be exposed externally. The organization then has proof points showing how things turned out better, not just with AI applications but also with the human experience.

Moving forward, Arora highlights data security as a growing concern. He says that with increasing reliance on data from different entities, that data can be susceptible to breaches and attacks. The organization is therefore working to strike a balance between the massive volume of data and the data that is truly valuable, and to apply AI capabilities to threat detection.

According to Arora, AI capabilities should be applied to preemptively keep questionable data from leaving the organization and to accelerate recovery after a data breach. The organization is also prescriptive about how, where, when, and in what context data can be used, internally or externally, since data-driven AI models can be misused.

Delving deeper, he says that outgoing data has to meet stipulations, must be governed, and needs audit capabilities. Arora opines that security is no longer confined to the castle, which creates both new opportunities and new challenges.

Thereafter, Arora states that different entities are collaborating in the interest of delivering the highest health outcomes for the lowest unit cost. To do it equitably, different data categories are being combined, from clinical data with claims data to pharmaceutical data with lab data.

Since each of these has its own data quality issues, it is essential to invest in data quality efforts. He affirms that the focus of data quality is shifting toward the intersection of all these data sets.

In conclusion, Arora states that data quality concerns are moving out of databases and into the real-world processes that people experience. Data quality can therefore no longer be just a backend discipline; it must be applied at the process level.

CDO Magazine appreciates Himanshu Arora for sharing his insights with our global community.
