AI News Bureau

Hallucination Isn’t a Bug You Can Fix — ALS Limited Chief Digital and Information Officer

Written by: CDO Magazine Bureau

Updated 12:00 PM UTC, Tue October 14, 2025

From analyzing soil samples and tap water to certifying food safety and verifying gold content in mines, ALS Global has spent over 150 years becoming one of the most trusted names in scientific testing. With a network of 450+ laboratories in 70 countries, ALS continues to evolve its digital capabilities to match the growing complexity of its services.

The first installment of this three-part series examined how ALS Global structures its data and AI strategy, offering a glimpse into the future of “Smart Labs” within a decentralized, high-stakes global testing environment. Building on that foundation, the second part focused on how ALS organizes its decentralized operations, leverages early cloud adoption and generative AI (GenAI), emphasizes the role of analytic translators, and navigates the delicate balance between agility and scale.

In this final installment, Thibault Bonneton, Chief Digital and Information Officer at ALS Limited, returns to the conversation with Julian Schirmer, Co-Founder of OAO, to reflect on key learnings, challenges, and the future of GenAI in a data-rich, trust-critical enterprise.

Early investments that pay off

Bonneton points first to the decision to invest early in traditional AI and machine learning as one of ALS’s smartest moves. “We didn’t start AI with ChatGPT. We’ve been working on data collection consistently over the last 30 years,” he says, pointing to a long-standing commitment to digitization.

A pivotal part of that journey was ALS’s early migration to the cloud a decade ago, giving the company the ability to manage, analyze, and scale data infrastructure before AI became mainstream. “That gives us a maturity and an experience in data management that is impressive,” he adds.

Going beyond in-house talent

For ALS, growth in AI capabilities did not just come from internal development. Bonneton highlights the strategic acquisition of a startup specializing in AI for mining use cases as a turning point. “They bring us the knowledge, the expertise, and also the motivation,” he says, stressing the value of fresh perspectives and cultural diversity.

“Don’t try to internalize everything,” Bonneton advises. “Buy companies, partner with external companies. It’s a risk, but the way you’re managing this is a strong driver of value.”

That flexibility in incorporating outside talent and connecting it to internal workflows and customer ecosystems is one of ALS’s enduring strengths. “Incorporating people and respecting diversity is very much at the center of who we are and what we’ve been doing for 150 years.”

The role of in-house software in data ownership

The ability to manage and preserve data at scale, Bonneton explains, stems from ALS’s early decision to build its own software systems. “We started to standardize ways of working in many labs and build our laboratory information management systems,” he says.

This move allowed ALS to become not just a data generator but also a data custodian. “We are the producer of the data and also are building the digital backbone that enables the data to be used by our customers,” he explains.

The company maintains a default position of broad data retention, considering both scientific and client value. “By default, we have a very extensive view of what you capture,” he notes, referring to both raw results and metadata. For Bonneton, this extensive repository is an “innovation playground,” giving him the tools to move fast on AI-driven transformation.

Trust vs. hallucination: The GenAI dilemma

Despite its AI readiness, ALS is treading cautiously with GenAI in customer-facing applications. Bonneton is candid about the challenge: “Gen AI has this tendency to hallucinate. Can we afford to give 1% wrong answers?”

As a scientific company and a trust provider, ALS cannot risk inaccuracies, even though the hallucination rate is relatively low now. Referencing recent research, Bonneton notes, “The official numbers of hallucination are 1.3% and 1.5%, which is good or not enough — it depends on how you see it.”

He draws parallels to his time at Orange, a leading European telecom provider. “We received 25 million calls a year. So 1% — what does that mean exactly? 250,000 calls.” The implications of such error rates, even at a seemingly low percentage, are significant.
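Bonneton’s point is simply that a small error rate compounds into a large absolute number at scale. As a minimal sketch, the arithmetic behind the figures he quotes can be written as (the volume and rates below are the ones cited above, not additional data):

```python
def absolute_errors(volume: int, error_rate: float) -> int:
    """Number of wrong answers implied by an error rate at a given volume."""
    return round(volume * error_rate)

# Figures quoted above: 25 million calls a year at a 1% error rate.
print(absolute_errors(25_000_000, 0.01))   # 250,000 wrong answers

# The hallucination rates cited (1.3% and 1.5%) at the same volume.
print(absolute_errors(25_000_000, 0.013))  # 325,000
print(absolute_errors(25_000_000, 0.015))  # 375,000
```

Framed this way, whether “1.3% is good enough” depends entirely on the volume of decisions riding on each answer.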

For ALS, the central tension lies between reliability and accountability. Bonneton observes that humans hallucinate as well, a remark that captures the moral complexity of deploying AI in high-stakes environments.

A challenge worth solving

While the issue of GenAI hallucination remains unsolved, Bonneton is optimistic. “We have a few good ideas to make it happen, but it’s a challenge.” He stresses that hallucination is not a fixable bug but an inherent trait of how these models are trained.

Still, Bonneton believes the company can develop safe, smart, and scalable solutions that uphold ALS’s reputation. “When you send a sample to ALS, you want the truth, probably a result that is more than 99% reliable.”

In closing, he acknowledges that while the problem is manageable, it remains a significant challenge for the organization, especially in a trust-driven environment where even a 1% failure rate is too high.

CDO Magazine appreciates Thibault Bonneton for sharing his insights with our global community.
