Opinion & Analysis
Written by: Pritam Bordoloi
Updated 3:16 PM UTC, January 8, 2026

As humanitarian crises grow more complex and resources more constrained, data has become a critical lever for delivering aid more efficiently, effectively, and equitably. At the UN World Food Programme (WFP), that responsibility sits with Magan Naidoo, the organization’s first Chief Data Officer, a role created to reshape how data is governed, trusted, and applied across one of the world’s largest humanitarian agencies.
Naidoo brings to the role global expertise spanning data, analytics, and digital transformation across complex, large-scale organizations, blending private sector rigor with a mission-driven focus on impact. His career includes roles at organizations as diverse as Amazon Web Services (AWS), EY, Accenture, and Unilever.
In this conversation with CDO Magazine, Naidoo outlines how WFP has built its data function from the ground up, starting with a global data strategy that shifts the organization from decades of data collection toward purposeful data use. He also discusses the realities of working across more than 80 countries, where legacy systems, fragmented data, and urgent operational demands collide with the need for enterprise-grade governance, ethics, and infrastructure.
The interview further explores how under Naidoo, WFP is leveraging data, AI, and generative AI (GenAI) to anticipate crises, stretch scarce resources, strengthen resilience, and deliver humanitarian aid faster, smarter, and more equitably.
Edited Excerpts
Q: What does the role of a Chief Data Officer at the UN World Food Programme entail?
The role was newly created, requiring the data function to be built from the ground up with an initial focus on strengthening core data foundations. That effort began with the launch of WFP’s global data strategy in February 2024, designed to take a holistic view of how the organization collects, manages, and uses data.
With decades of operational history and vast data assets, the challenge was shifting from data accumulation to data utilization. A three-year roadmap, now in its second year, was established covering data governance, ethics, and modern data infrastructure. Central to the strategy was breaking down long-standing data silos and creating enterprise-wide platforms that ensure new data remains connected and usable across the organization.
Equally important was change management. Driving adoption required extensive advocacy and education across more than 80 countries, where data practices and maturity levels vary widely. Executive sponsorship proved critical, with the executive director’s office formally launching both the data strategy and a subsequent AI strategy, signaling top-level commitment.
The approach was deliberately people-centric, placing strong emphasis on data literacy. As a result, data literacy was elevated to a mandatory training requirement across WFP. Those early investments in platforms, governance, and skills have since positioned the organization to scale AI effectively, with improved data quality and readiness now enabling broader, more impactful AI adoption.
Q: How does the data strategy at the UN World Food Programme differ from that of a corporate enterprise, given the UN’s humanitarian mission?
There definitely are similarities. Like any large organization, we need foundational, enterprise-grade data infrastructure. We’re a cloud-first organization because of the nature of our global operations. We have data governance, data ethics, data privacy, data quality — all the standard pillars you would see in the private sector.
The difference is in how we use the data.
In the private sector, data is often used to drive commercial outcomes, and in some cases, it may even be commoditized. For example, financial services companies might sell or leverage data for fraud analytics.
For us, the data we hold belongs to vulnerable populations, people we support through food assistance, financial aid, or other humanitarian services. These are individuals and communities you cannot take advantage of. So we are extremely careful about how we collect, store, and protect that data, not just from a technical perspective but also through policy, process, and continuous education. Staff across WFP need to understand the dos and don’ts when handling beneficiary data.
Another important distinction is that this is WFP’s data strategy — not the technology department’s strategy. The entire organization contributed to it. That sense of ownership helps people see themselves in the strategy, so it doesn’t feel imposed from above. Instead, they become natural champions of the work.
Because of that, we made the strategy very people-centric. Our staff need to be data literate, understand the technology, and be aware of the risks and responsibilities that come with handling sensitive information.
And then, there’s the purpose: We don’t use data for commercial gain. We use it for humanitarian impact. For instance, we use years of data to move from reactive emergency response to proactive preparedness. If an area is prone to floods or droughts, predictive models help us anticipate disasters, so we can position food supplies earlier, mobilize resources, and ensure emergency teams are ready before a crisis hits.
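To make the idea concrete, here is a minimal sketch of what such a predictive-preparedness model could look like. It is purely illustrative, not WFP’s actual system; the file names, feature columns, and labels are all assumptions.

```python
# Illustrative sketch only -- not WFP's production model.
# Assumes a historical, district-level table of indicators (rainfall,
# river level, soil moisture) with a binary "flood within 30 days" label.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

FEATURES = ["rainfall_mm_30d", "river_level_m", "soil_moisture_pct"]  # hypothetical columns

history = pd.read_csv("district_indicators.csv")  # hypothetical file
X_train, X_test, y_train, y_test = train_test_split(
    history[FEATURES], history["flood_next_30d"], test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

# Score current conditions; districts above a chosen risk threshold would
# trigger early pre-positioning of food stocks and emergency-team readiness.
current = pd.read_csv("district_indicators_latest.csv")  # hypothetical file
current["flood_risk"] = model.predict_proba(current[FEATURES])[:, 1]
print(current.sort_values("flood_risk", ascending=False).head(10))
```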
We also use data and AI to build resilience in local communities. If we can help smallholder farmers become self-sufficient by giving them AI-powered recommendations on optimal watering times, warning them about crop disease, or helping them conserve limited resources, they may not need food assistance in the future. That’s a direct contribution to ending hunger.
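Purely as an illustration of the advisory pattern Naidoo describes, a watering recommendation can be as simple as a rule over soil moisture and the rain forecast; the thresholds below are invented for the example.

```python
# Toy advisory rule, for illustration only -- not WFP's system;
# the thresholds are invented for the example.
def watering_advice(soil_moisture_pct: float, rain_forecast_mm_48h: float) -> str:
    """Return a simple irrigation recommendation for a smallholder plot."""
    if rain_forecast_mm_48h >= 10:
        return "Hold off: meaningful rain is expected within 48 hours."
    if soil_moisture_pct < 25:
        return "Irrigate in the early morning to limit evaporation losses."
    return "Soil moisture is adequate; re-check tomorrow."

print(watering_advice(soil_moisture_pct=18, rain_forecast_mm_48h=2))
```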
Another example is emergency response. We have a drone center of excellence in one region where drones collect real-time data to guide search-and-rescue operations. Instead of helicopters searching blindly, at great cost in both money and time, the drones help pinpoint where people actually need help.
So the big difference is the mission and intent. Data isn’t used to maximize profit; it’s used to get aid to people faster, protect communities, build resilience, and ultimately save lives.
Q: You mentioned predictive analytics and the example of supporting farmers. Could you share some real-life stories where data has helped WFP achieve one of its core goals?
There are many. I’ll share one we’ve done recently where we brought data and AI together. As you know, funding is becoming scarce across the UN system, and WFP is heavily impacted. So we challenged ourselves: “How do we use the information we already have to stretch limited funding further?”
One of the projects we showcased at our recent Executive Board meeting is something we call enterprise de-duplication. It uses photo data and an open-source AI algorithm to match photos in our database. Over the years, we’ve collected millions of records, often in emergencies where ideal processes couldn’t be followed. As a result, a significant portion of our data is duplicated. That means sometimes we unintentionally extend assistance to the same person more than once, leaving fewer resources for others who also need help.
This solution allows us to process massive amounts of photo data, something that would take humans an extremely long time and still result in errors. It would also require hiring large teams in each country. Instead, with a minimal investment, we trained the AI model using proper data sets and worked very closely with our Global Privacy Office to ensure the right privacy impact controls were in place, especially since it involves facial data. The model was also assessed by an external party for robustness, security, and freedom from bias.
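The interview does not name the algorithm, but one common open-source approach works roughly like the sketch below: compute a face embedding per photo and flag pairs whose embedding distance falls under a threshold as candidate duplicates for human review. The library choice, file names, and threshold are illustrative assumptions.

```python
# Illustrative sketch only -- the interview does not name the algorithm used.
# One common open-source approach: compare face embeddings and send pairs
# below a distance threshold to human review as candidate duplicates.
import itertools
import face_recognition  # open-source, dlib-based face embeddings

def encode(path: str):
    """Return the 128-d embedding of the first face found in an image, or None."""
    faces = face_recognition.face_encodings(face_recognition.load_image_file(path))
    return faces[0] if faces else None

photos = ["rec_001.jpg", "rec_002.jpg", "rec_003.jpg"]  # hypothetical records
embeddings = {p: e for p in photos if (e := encode(p)) is not None}

THRESHOLD = 0.5  # stricter than the library's 0.6 default, to limit false matches
for (p1, e1), (p2, e2) in itertools.combinations(embeddings.items(), 2):
    distance = face_recognition.face_distance([e1], e2)[0]
    if distance < THRESHOLD:
        # Candidate duplicates go to human review, never automatic removal.
        print(f"possible duplicate: {p1} <-> {p2} (distance={distance:.2f})")
```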
We piloted it in Mali, and the results were remarkable. In just a few months, they were able to repurpose more than $400,000, funds that would have otherwise gone to duplicated assistance. They found that over 20% of the people on their books would have received aid more than once.
Q: What were some of the core challenges you faced in setting up this data infrastructure, especially given WFP’s decades of legacy data and fragmentation?
WFP was established in the early 1960s, so we have decades’ worth of systems, processes, and data. When I joined a little over two years ago, data was highly fragmented. We had multiple systems across more than 80 country offices, many of which operated autonomously because they needed to move fast in emergency contexts. That led to a proliferation of country-level solutions, separate databases, and very limited integration.
We’ve made significant progress in these two years, but it will still take a few more to fully mature. Some of the key challenges early on were the fragmentation itself, the autonomy of country-level systems, and the very limited integration between them.
Q: How fragmented was the data, and what have you done so far to solve the problem of data silos and fragmentation?
The fragmentation was significant. It stemmed largely from distributed, autonomous systems at the country level. Each country office solved its own problems quickly, which is necessary in humanitarian work, but that created a patchwork of disconnected systems and databases.
We tackled it in several ways, chief among them the enterprise-wide platforms described earlier, which keep new data connected and usable while legacy country-level systems are integrated progressively rather than migrated all at once.
This approach allows us to start scaling analytics and AI today, rather than waiting years for a full migration.
Q: Have you found GenAI useful beyond use cases such as document summarization or email drafting?
Absolutely, we’re doing far more. Those are baseline use cases for us. The more meaningful impact comes from the operational AI solutions that directly support our humanitarian work. A few examples:
Q: Are you looking at any agentic AI use cases?
We’re using agentic AI quite extensively, particularly in data engineering. A good example is our enterprise de-duplication project. After we built the back-end AI that identifies duplicate records, the next step was creating a user-friendly interface so countries could use it as a self-service tool without relying on technical teams. We used AI agents to help build that interface.
Once we had the wireframes and design specifications, the AI agents could generate large portions of the UI code at a fraction of the cost and time it would normally take. We still needed a human UI expert for oversight, but the agents significantly accelerated development and let us deliver a turnkey solution much faster.
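The interview does not say which agent framework or model was used, but the basic pattern can be sketched as follows, assuming an OpenAI-compatible chat API; the model name, prompts, and design spec are illustrative.

```python
# Sketch of the pattern only -- the interview does not say which framework
# or model WFP used. Assumes an OpenAI-compatible chat API; the model name,
# prompts, and design spec are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

design_spec = """
Component: DuplicateReviewTable
- Columns: record ID, match confidence, country office, review status
- Actions: approve or reject a proposed duplicate pair
"""  # hypothetical wireframe-derived specification

response = client.chat.completions.create(
    model="gpt-4o",  # assumption; any capable code-generation model works
    messages=[
        {"role": "system", "content": "You generate React components from design specs."},
        {"role": "user", "content": f"Generate the component for:\n{design_spec}"},
    ],
)

draft = response.choices[0].message.content
print(draft)  # the draft still goes to a human UI expert before it ships
```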
Looking ahead, we see agentic AI as a way to orchestrate multiple AI agents to handle repetitive, mechanical, end-to-end workflows, the kind of work that traditionally requires many hours of manual effort. This will free up human staff to focus on higher-value tasks.
Right now, many of our experiments are in the productivity space, but we’re expanding toward operational use cases where coordinated agents can deliver real, mission-level impact.
Q: What advice would you give data leaders transitioning from the private sector to a mission-driven, humanitarian organization?
I truly believe there’s no bigger platform than the World Food Programme if you want to make a real impact. When people ask me about moving from the private sector to a humanitarian organization, my first advice is: start with yourself. What do you really want to do with your skills? Are you motivated by commercial outcomes, or do you want to leave a lasting, positive impact on the world?
I often have master’s and PhD students come up to me at events like the AI for Good Summit in Geneva, saying they’re torn between a career in the private sector and working in the humanitarian space. And my message to them is: if impact is your priority, this is the right place to be. Here, the world is your stage. Your work translates directly into helping people who rely on us every day.
If more leaders, engineers, and data practitioners joined WFP or similar organizations, we could build critical mass much faster across the humanitarian ecosystem. That kind of momentum would accelerate how quickly we can solve major global challenges. And ultimately, it would make the world a better, more stable place because if you look closely, many conflicts and crises trace back to food insecurity as a root cause.