Data Management
Written by: CDO Magazine Bureau
Updated 12:00 PM UTC, Tue August 12, 2025
Wendy S Batchelder, SVP and Chief Data Officer at Centene Corporation, speaks with Sezin Palmer, Partner at EY, about her professional trajectory, Centene’s approach to data, core data challenges of fragmentation, timeliness, quality, governance, and AI ethics, and the need for an agile data strategy.
Batchelder began her career at EY in IT audit, focusing on tracking and validating information within systems. Around 12 years ago, she transitioned into data management, taking on responsibility for the proper flow and reliability of data across systems.
Over the years, she gained experience across three major industries: banking, technology, and healthcare. At Centene, Batchelder oversees data governance, strategy, enablement tools and platforms, data integration, analytics enablement, and the business intelligence center of excellence.
Reflecting on broader industry trends and the work underway at Centene, she emphasizes the unprecedented attention data is receiving — particularly driven by the surge in AI innovation.
According to Batchelder, the rising interest in generative AI (GenAI) and agentic AI has brought a renewed focus on data. She describes the current environment as one where data professionals are reacting in two different but equally passionate ways: some are energized about having their moment, while others feel vindicated.
Regardless, Batchelder believes it signals a “profound moment” for the data profession. For the first time, many teams are engaging in meaningful, strategic conversations with executive leadership and board members.
At Centene, Batchelder says the team has been able to reassess the full scale of its data landscape. This includes evaluating how vast data aggregation can lead to meaningful connections and services.
“We’ve been able to step back and look at the gravity of the aggregate, meaning how much there is to be able to take advantage of.”
She describes a shift toward discovering new ways to connect data that better support members and providers.
One area of focus is lab data. Speaking of the sheer volume of lab data, Batchelder says, “We have almost 800 different lab sources that we’re bringing data together from, and a lot of it comes in different formats.” She points out the complex work involved in normalizing lab data, attributing it accurately to members, and deploying it quickly to support public health. For instance, she says, “We had a tough flu year, and being able to see how that data is coming into the organization is something that we can continue to improve on.”
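The normalization work Batchelder describes — pulling records from hundreds of lab sources in different formats into one shape that can be attributed to members — can be illustrated with a minimal sketch. The schema, field names, and the two source formats below are invented for illustration and are not Centene's actual systems; a real pipeline would register a parser for each of the roughly 800 sources she cites.

```python
from dataclasses import dataclass


# Hypothetical unified record; field names are illustrative only.
@dataclass
class LabResult:
    member_id: str
    test_code: str  # e.g., a LOINC code
    value: float
    unit: str


def normalize_lab_record(record: dict, source_format: str) -> LabResult:
    """Map one raw lab record from a known source format to the unified schema.

    Only two invented formats are handled here; each additional lab source
    would contribute its own mapping.
    """
    if source_format == "pipe_delimited":
        member, code, value, unit = record["raw"].split("|")
        return LabResult(member, code, float(value), unit)
    if source_format == "json_v2":
        return LabResult(
            record["patient"]["id"],
            record["loinc"],
            float(record["result"]["value"]),
            record["result"]["unit"],
        )
    raise ValueError(f"unknown source format: {source_format}")
```

Once every source is mapped into the common shape, downstream tasks such as member attribution or flu-season surveillance can work against a single schema rather than hundreds of feed-specific ones.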
Despite years of investment and innovation in data, Batchelder believes the industry is still just scratching the surface. “It’s interesting in my mind that with all the work we’ve done in the data industry to date, there’s still so much opportunity to do it better.”
She attributes this to data volume, the complexity of modern organizations, and the added layers of mergers, acquisitions, and legacy systems that make true data mastery a moving target.
Moving forward, Batchelder describes the core challenge as one of data volume and complexity. The healthcare system generates massive amounts of information every day — from claims and enrollment records to lab results and clinical data pulled from sources like MyChart. But as she explains, the difficulty lies not in acquiring the data but in connecting it meaningfully.
Despite progress in standardizing healthcare data formats, Batchelder notes that the ecosystem remains fragmented and inconsistent: “Healthcare has made progress in working to standardize data formats, but it’s not comprehensive. It’s not robust.”
This fragmentation spans clinical databases, lab systems, and, increasingly, data from IoT sources such as wearables. With thousands of disparate data sources, the challenge of integration and normalization is ongoing.
Still, Batchelder sees this as both a challenge and a powerful opportunity. “Getting our arms around our current ecosystem is a challenge and a great opportunity for us to be able to do that effectively,” she adds.
In addition to fragmentation, Batchelder highlights the issue of data quality — including incomplete information, inconsistent tagging, and a lack of standardization. Even when data is available, its utility often hinges on timing.
“Getting lab data 30 or 60 days late doesn’t help us to be able to put care management potentially into place,” she states. Explaining further, she says timely data can make a real difference in patient outcomes, while delayed data may be irrelevant for proactive care.
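The timeliness point can be made concrete with a small sketch: a gate that keeps only lab results fresh enough to trigger care management. The 30-day threshold echoes Batchelder's example and is an assumption for illustration, not an actual Centene policy.

```python
from datetime import date, timedelta

# Illustrative freshness window; echoes the "30 or 60 days late" example,
# not a real clinical or organizational rule.
MAX_AGE = timedelta(days=30)


def actionable_results(results: list[tuple[str, date]], today: date) -> list[str]:
    """Return member IDs whose lab results are recent enough to act on.

    Results older than MAX_AGE are dropped: as Batchelder notes, delayed
    data may be irrelevant for proactive care.
    """
    return [
        member_id
        for member_id, result_date in results
        if today - result_date <= MAX_AGE
    ]
```

A feed that routinely delivers results outside this window would pass the quality checks on completeness and format yet still fail the organization on its primary use.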
Another area of focus for Batchelder is the ethical use of AI in healthcare. She believes data governance must be tightly aligned with core values like trust, accountability, and service.
As AI tools become more integrated into patient care, she stresses the importance of ethical safeguards to prevent unintended consequences, especially when it comes to equity in healthcare outcomes.
“Organizations need to have strong governance and strong frameworks around how we use AI with our data to not impact patient care or equity.”
Wrapping up, Batchelder points to the need for agility in how healthcare organizations approach data strategy. In a rapidly evolving AI landscape, traditional multi-year roadmaps are no longer sufficient.
She concludes that data teams must become more nimble, refreshing strategies in real time to stay aligned with business needs and technological changes. “Even a year seems like too long in some cases.”
CDO Magazine appreciates Wendy S Batchelder for sharing her insights with the CDO Magazine Global community.