Opinion & Analysis

Data and AI Governance at WFP: A Chief Data Officer’s View from the Front Lines of Global Hunger


Written by: Pritam Bordoloi

Updated 3:16 PM UTC, January 8, 2026


As humanitarian crises become increasingly complex and resources become more constrained, data has become a critical lever for delivering aid more efficiently, effectively, and equitably. At the UN World Food Programme (WFP), that responsibility sits with Magan Naidoo, the organization’s first Chief Data Officer, a role created to reshape how data is governed, trusted, and applied across one of the world’s largest humanitarian agencies.

Naidoo brings to the role global expertise spanning data, analytics, and digital transformation across complex, large-scale organizations, blending private sector rigor with a mission-driven focus on impact. His career includes roles at Amazon Web Services (AWS), EY, Accenture, and Unilever.

In this conversation with CDO Magazine, Naidoo outlines how WFP has built its data function from the ground up, starting with a global data strategy that shifts the organization from decades of data collection toward purposeful data use. He also discusses the realities of working across more than 80 countries, where legacy systems, fragmented data, and urgent operational demands collide with the need for enterprise-grade governance, ethics, and infrastructure.

The interview further explores how under Naidoo, WFP is leveraging data, AI, and generative AI (GenAI) to anticipate crises, stretch scarce resources, strengthen resilience, and deliver humanitarian aid faster, smarter, and more equitably.

Edited Excerpts

Q: What does the role of a Chief Data Officer at the UN World Food Programme entail?

The role was newly created, requiring the data function to be built from the ground up with an initial focus on strengthening core data foundations. That effort began with the launch of WFP’s global data strategy in February 2024, designed to take a holistic view of how the organization collects, manages, and uses data.

With decades of operational history and vast data assets, the challenge was shifting from data accumulation to data utilization. A three-year roadmap was established; now in its second year, it covers data governance, ethics, and modern data infrastructure. Central to the strategy was breaking down long-standing data silos and creating enterprise-wide platforms that ensure new data remains connected and usable across the organization.

Equally important was change management. Driving adoption required extensive advocacy and education across more than 80 countries, where data practices and maturity levels vary widely. Executive sponsorship proved critical, with the executive director’s office formally launching both the data strategy and a subsequent AI strategy, signaling top-level commitment.

The approach was deliberately people-centric, placing strong emphasis on data literacy. As a result, data literacy was elevated to a mandatory training requirement across WFP. Those early investments in platforms, governance, and skills have since positioned the organization to scale AI effectively, with improved data quality and readiness now enabling broader, more impactful AI adoption.

Q: How does the data strategy at the UN World Food Programme differ from that of a corporate enterprise, given the UN’s humanitarian mission?

There definitely are similarities. Like any large organization, we need foundational, enterprise-grade data infrastructure. We’re a cloud-first organization because of the nature of our global operations. We have data governance, data ethics, data privacy, data quality — all the standard pillars you would see in the private sector.

The difference is in how we use the data.

In the private sector, data is often used to drive commercial outcomes, and in some cases, it may even be commoditized. For example, financial services companies might sell or leverage data for fraud analytics.

For us, the data we hold belongs to vulnerable populations, people we support through food assistance, financial aid, or other humanitarian services. These are individuals and communities you cannot take advantage of. So we are extremely careful about how we collect, store, and protect that data, not just from a technical perspective but also through policy, process, and continuous education. Staff across WFP need to understand the dos and don’ts when handling beneficiary data.

Another important distinction is that this is WFP’s data strategy — not the technology department’s strategy. The entire organization contributed to it. That sense of ownership helps people see themselves in the strategy, so it doesn’t feel imposed from above. Instead, they become natural champions of the work.

Because of that, we made the strategy very people-centric. Our staff need to be data literate, understand the technology, and be aware of the risks and responsibilities that come with handling sensitive information.

And then, there’s the purpose: We don’t use data for commercial gain. We use it for humanitarian impact. For instance, we use years of data to move from reactive emergency response to proactive preparedness. If an area is prone to floods or droughts, predictive models help us anticipate disasters, so we can position food supplies earlier, mobilize resources, and ensure emergency teams are ready before a crisis hits.

We also use data and AI to build resilience in local communities. If we can help smallholder farmers become self-sufficient by giving them AI-powered recommendations on optimal watering times, warning them about crop disease, or helping them conserve limited resources, they may not need food assistance in the future. That’s a direct contribution to ending hunger.

Another example is emergency response. We have a drone center of excellence in one region where drones collect real-time data to guide search-and-rescue operations. Instead of sending helicopters to search blindly, at high cost in both money and time, the drones help pinpoint where people actually need help.

So the big difference is the mission and intent. Data isn’t used to maximize profit; it’s used to get aid to people faster, protect communities, build resilience, and ultimately save lives.

Q: You mentioned predictive analytics and the example of supporting farmers. Could you share some real-life stories where data has helped WFP achieve one of its core goals?

There are many. I’ll share one we’ve done recently where we brought data and AI together. As you know, funding is becoming scarce across the UN system, and WFP is heavily impacted. So we challenged ourselves: “How do we use the information we already have to stretch limited funding further?”

One of the projects we showcased at our recent Executive Board meeting is something we call enterprise de-duplication. It uses photo data and an open-source AI algorithm to match photos in our database. Over the years, we’ve collected millions of records, often in emergencies where ideal processes couldn’t be followed. As a result, a significant portion of our data is duplicated. That means sometimes we unintentionally extend assistance to the same person more than once, leaving fewer resources for others who also need help.

This solution allows us to process massive amounts of photo data, something that would take humans an extremely long time and still result in errors. It would also require hiring large teams in each country. Instead, with a minimal investment, we trained the AI model using proper data sets and worked very closely with our Global Privacy Office to ensure the right privacy impact controls were in place, especially since it involves facial data. The model was also assessed by an external party to check for robustness, security, and to ensure there was no bias.

We piloted it in Mali, and the results were remarkable. In just a few months, they were able to repurpose more than $400,000, funds that would have otherwise gone to duplicated assistance. They found that over 20% of the people on their books would have received aid more than once.
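The interview does not disclose the specific model or matching logic WFP uses, but the general technique behind photo-based de-duplication can be sketched as follows: each photo is converted into a numeric embedding by a face-embedding model, and record pairs whose embeddings are nearly identical are flagged as likely duplicates. The embedding function, threshold, and record IDs below are illustrative stand-ins, not details from WFP's system.

```python
import numpy as np

def find_duplicates(embeddings: dict[str, np.ndarray], threshold: float = 0.9):
    """Flag record pairs whose face embeddings are near-identical.

    `embeddings` maps record IDs to unit-normalized feature vectors.
    In practice these would come from an open-source face-embedding
    model; here they are hand-made stand-ins.
    """
    ids = list(embeddings)
    duplicates = []
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            a, b = embeddings[ids[i]], embeddings[ids[j]]
            # Dot product of unit vectors = cosine similarity.
            similarity = float(np.dot(a, b))
            if similarity >= threshold:
                duplicates.append((ids[i], ids[j], similarity))
    return duplicates

# Toy example: rec1 and rec2 are the same person enrolled twice.
records = {
    "rec1": np.array([0.6, 0.8]),
    "rec2": np.array([0.601, 0.799]),  # near-identical embedding
    "rec3": np.array([0.8, -0.6]),     # orthogonal: a different person
}
records = {k: v / np.linalg.norm(v) for k, v in records.items()}
print(find_duplicates(records))  # flags only the (rec1, rec2) pair
```

A production system would add the privacy controls the interview describes (impact assessments, external bias audits) and use approximate nearest-neighbor search rather than the quadratic pairwise loop shown here, which is only practical for small batches.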

Q: What were some of the core challenges you faced in setting up this data infrastructure, especially given WFP’s decades of legacy data and fragmentation?

WFP was established in the early 1960s, so we have decades’ worth of systems, processes, and data. When I joined a little over two years ago, data was highly fragmented. We had multiple systems across more than 80 country offices, many of which operated autonomously because they needed to move fast in emergency contexts. That led to a proliferation of country-level solutions, separate databases, and very limited integration.

We’ve made significant progress in these two years, but it will still take a few more to fully mature. Some of the key challenges early on were:

  • Expectation management: When someone steps in as a CDO, there’s often an expectation that the problems will be fixed immediately. But the scale and complexity of WFP’s data landscape required us to first align on the problem statement, the target outcomes, and the multi-year journey to get there. The data strategy we launched was hugely instrumental — it gave everyone a shared, concrete roadmap.
  • Education and literacy: We noticed early that people were using terms like “data governance” and “data quality” interchangeably. That told us we needed to build a common language around data. So a lot of work went into data literacy, especially at senior levels, to establish clarity and ownership.
  • Funding constraints: WFP relies on voluntary donor funding, overseen by our Executive Board, which is made up of UN member states. Unlike commercial enterprises, we can’t always put forward a five-year strategy with a big investment request and a clear financial return. Our work is humanitarian, so we had to demonstrate incremental value, showing year by year what each investment would deliver. Given the progress over the last two years, it’s easier to show the value, but early on, it was a real challenge to secure funding without immediate outcomes.
  • Talent and resourcing: Getting the right caliber of data talent, and enough of them, was also difficult. We’re in a much better place now, but initially it was one of the biggest bottlenecks.
  • Focus and prioritization: WFP is a very large organization with many competing priorities. We needed agreement on what had to come first, and the roadmap helped us converge on a sequence: governance, privacy, literacy, infrastructure, and then advanced analytics and AI.
  • Data privacy concerns: Because we work with vulnerable populations, there was understandable anxiety around how beneficiary data would be stored, used, and protected. The data governance and privacy frameworks we built were crucial in addressing those concerns.

Q: How fragmented was the data, and what have you done so far to solve the problem of data silos and fragmentation?

The fragmentation was significant. It stemmed largely from distributed, autonomous systems at the country level. Each country office solved its own problems quickly, which is necessary in humanitarian work, but that created a patchwork of disconnected systems and databases.

We tackled it in several ways:

  • Enterprise architecture and target-state blueprint: We took a clinical, organization-wide look at all systems, mapped what we had, and defined which systems and databases would be decommissioned. We’re now converging toward a more integrated architecture with far fewer platforms.
  • Reducing systems to cut costs and improve integration: Streamlining the systems landscape will reduce costs, simplify integration, and ultimately eliminate many of the silos.
  • Building an enterprise integration layer: We’ve built a new enterprise platform that acts as an integration layer across all other data platforms, whether proprietary or developed in-house. Instead of moving all the legacy data immediately, we’re stitching everything together through this central layer.
  • Avoiding a massive data migration project: Instead of embarking on a risky, expensive, multi-year data migration initiative, we decided to leave the data where it currently resides and integrate it virtually. Over time, as systems are decommissioned and replaced, the data will naturally converge into a unified environment.

This approach allows us to start scaling analytics and AI today, rather than waiting years for a full migration.

Q: Have you found GenAI useful beyond use cases such as document summarization or email drafting?

Absolutely, we’re doing far more. Those are baseline use cases for us. The more meaningful impact comes from the operational AI solutions that directly support our humanitarian work. A few examples:

  • Search and rescue (Project DEEP): We use drone imagery combined with AI models to identify human presence and structural damage after disasters. This speeds up search-and-rescue operations when every minute matters.
  • Satellite-driven analysis (Project SKY): Similar to DEEP but powered by satellite imagery. AI analyzes large areas in near real time, helping our teams understand conditions on the ground and prioritize emergency response routes.
  • Supply chain optimization: In our supply chain division, we use AI models to determine the most efficient delivery routes so food reaches people faster and at a lower cost — crucial when funding is tight and needs are rising.
  • HungerMap LIVE: This is an AI-powered platform that integrates satellite imagery, field data, and statistical models to track food insecurity as it evolves. It helps us anticipate where the next critical need will be and allocate resources proactively.
  • AI HR agents: We’ve deployed interactive HR agents that help staff navigate complex policies and processes, reducing administrative time and improving the employee experience.
  • Intelligent enterprise search: On our internal platforms, we use tools like Google Gemini as a smart search agent, so staff can instantly find policies, guidance, or internal documents with contextual understanding, not just keyword matching.

Q: Are you looking at any agentic AI use cases?

We’re using agentic AI quite extensively, particularly in data engineering. A good example is our enterprise deduplication project. After we built the back-end AI that identifies duplicate records, the next step was creating a user-friendly interface so countries could use it as a self-service tool without relying on technical teams. We used AI agents to help build that interface.

Once we had the wireframes and design specifications, the AI agents could generate large portions of the UI code at a fraction of the cost and time it would normally take. We still needed a human UI expert for oversight, but the agents significantly accelerated development and let us deliver a turnkey solution much faster.

Looking ahead, we see agentic AI as a way to orchestrate multiple AI agents to handle repetitive, mechanical, end-to-end workflows, things that traditionally require many hours of manual effort. This will free up human staff to focus on higher-value tasks.

Right now, many of our experiments are in the productivity space, but we’re expanding toward operational use cases where coordinated agents can deliver real, mission-level impact.

Q: What advice would you give data leaders transitioning from the private sector to a mission-driven, humanitarian organization?

I truly believe there’s no bigger platform than the World Food Programme if you want to make a real impact. When people ask me about moving from the private sector to a humanitarian organization, my first advice is: start with yourself. What do you really want to do with your skills? Are you motivated by commercial outcomes, or do you want to leave a lasting, positive impact on the world?

I often have master’s and PhD students come up to me at events like the AI for Good Summit in Geneva, saying they’re torn between a career in the private sector and working in the humanitarian space. And my message to them is: if impact is your priority, this is the right place to be. Here, the world is your stage. Your work translates directly into helping people who rely on us every day. 

If more leaders, engineers, and data practitioners joined WFP or similar organizations, we could build critical mass much faster across the humanitarian ecosystem. That kind of momentum would accelerate how quickly we can solve major global challenges. And ultimately, it would make the world a better, more stable place because if you look closely, many conflicts and crises trace back to food insecurity as a root cause.
