(US & Canada) VIDEO | AI and Data Should Be Tightly Joined at the Hip — Expedient SVP for AI

Brad Reynolds, Senior Vice President – Artificial Intelligence at Expedient, speaks with Robert Lutton, VP of Sandhill Consultants, and Editorial Board Vice Chair, CDO Magazine, in a video interview about building trust and transparency with AI models, hosting private models, involving AI in data architecture, incorporating small language models, and coupling AI with data for organizational success.

Expedient is a full-stack cloud service provider, helping companies transform their IT operations through multi-cloud solutions and managed infrastructure services.

Speaking about building trust and transparency with AI models, Reynolds considers it a complicated issue. He adds that it is challenging for OpenAI or Google to convey exactly what goes on under the hood of their AI models.

Adding on, Reynolds states that while horror stories circulate about what goes into and comes out of these systems, there is a set of open-source models from Meta and Mistral AI that are good.

To aid trust and transparency, he suggests taking one of the open-source models and fine-tuning it by training it on company data. However, although the company then has control over the software, it is still messy, as what went into training the initial version is not completely known to the organization.

According to Reynolds, the way to mitigate this is by taking an existing open model, training it on company data, and hosting it privately. He notes that with AI models sitting next to organizational data, the company can control access.

That way, only company personnel can see the information from a policy perspective, minimizing the risk factor. For a company, self-hosting a model that it understands, tuned on its own private data with insider-only access, is critical.

Further, Reynolds states that OpenAI changing its software frequently makes things challenging. With a self-hosted model, however, the company knows what the model was yesterday, what it is today, and what changes when a new version is released. That control is crucial for building it into enterprise workflows while minimizing risk.


When asked about the demand to put AI into data center architecture, Reynolds says that it is bifurcated. While there is much talk about generative AI models, he stresses models that are not generative but can pull structured data out of unstructured text. This is termed Named Entity Recognition (NER), a part of NLP, and it creates value.
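The idea of pulling structured data out of unstructured text can be sketched as follows. This is a toy illustration only: the regex patterns are stand-ins for a trained NER pipeline (such as spaCy or a fine-tuned transformer), which is what a production system would actually use.

```python
import re

# Toy stand-in for an NER step: pull a few structured fields out of
# free text with regexes. A real pipeline would use a trained model.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+"),
    "date": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
    "money": re.compile(r"\$\d+(?:,\d{3})*(?:\.\d{2})?"),
}

def extract_entities(text: str) -> dict[str, list[str]]:
    """Return every match for each entity type found in the text."""
    return {label: pat.findall(text) for label, pat in PATTERNS.items()}

doc = "Invoice dated 2024-05-01: $1,200.00 due. Contact billing@example.com."
print(extract_entities(doc))
```

The output is structured data (labeled fields) recovered from an unstructured sentence, which is the value Reynolds points to.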

From the enterprise perspective, Reynolds states that AI is not just generative AI with large language models but also small language models that have deep knowledge of a niche. For instance, a model could understand Python programming well, he adds.

In terms of output quality, such a small model can be better than GPT-4, as if it had a PhD in a narrow discipline. However, the organization will then need to accumulate a set of different models for different functions. Taking the programming analogy, one needs a model each for Go, Python, and PHP, with a general coding model on top.
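The per-language specialist setup described above can be sketched as a simple router. The model names and the routing table here are hypothetical, standing in for whatever self-hosted models an organization actually deploys.

```python
# Hypothetical routing table: one specialist small model per language,
# with a general coding model as the fallback. All names are made up.
SPECIALISTS = {
    "go": "smol-go-coder",
    "python": "smol-py-coder",
    "php": "smol-php-coder",
}
GENERALIST = "general-coder"

def pick_model(language: str) -> str:
    """Route to a niche model when one exists, else the generalist."""
    return SPECIALISTS.get(language.lower(), GENERALIST)

print(pick_model("Python"))  # routed to the Python specialist
print(pick_model("Rust"))    # no specialist, falls back to the generalist
```

The design point is that the "general coding model on top" is just the default branch of the router, so adding a new specialist is a one-line change.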

Moving forward, Reynolds states that organizations will need to host these models at scale. They must figure out a way to combine the generative aspects with classical AI so that separate infrastructure is not needed. He focuses on understanding the breadth of the models' capabilities and hosting the infrastructure at scale.

As takeaways for data leaders, Reynolds highlights combining the knowledge base with generative AI and getting data ready for AI access and use. For that, the data in the knowledge base must be turned into an AI-consumable format.

Before that, it must be ensured that the knowledge base information is accessible and understandable enough for humans to look up, which is necessary for the data cleaning that needs to happen. By doing this, one can pick only the relevant documents to include, and then put those into a vector store for AI to access.

More importantly, AI must be coupled to the data, says Reynolds. He maintains that no matter where the data lives, it is critical to figure out how AI will access it.

In conclusion, Reynolds stresses the need to figure out a scalable architecture for coupling AI to data for the long term and implement it. He opines that the tighter AI and data are joined at the hip, the more successful an organization will be.

CDO Magazine appreciates Brad Reynolds for sharing his data insights with our global community.
