AI News Bureau
Written by: CDO Magazine Bureau
Updated 3:33 PM UTC, Tue November 18, 2025
Compliance automation platform Drata streamlines security workflows and audit readiness at scale. In this second installment of a three-part series, Lior Solomon, VP of Data at Drata, continues his discussion with Ido Arieli Noga, CEO and Co-founder of Yuki Data, exploring how Solomon's team strategically balances cost optimization, stakeholder value, and organizational scale — all while sustaining a startup mindset within a fast-growing enterprise.
Part 1 of the interview examined how Drata leverages Snowflake, Amazon Bedrock, and Cortex to operationalize AI for go-to-market teams, optimize data spending, and preserve agility without compromising trust.
Solomon agrees with the need for real-time AI and compute optimization, including prompt tuning and token cost controls, but cautions against introducing complexity too early.
“Before I introduce more complexity to the pipeline, I want to first of all be very attentive to the business impact it’s going to bring,” he explains.
Rather than launching automated optimization tools from day one, Solomon emphasizes the importance of aligning with stakeholders, validating ROI potential, and only then bringing in advanced monitoring solutions. “Then bring in the heavy guns,” he adds, referring to continuous optimization tools. These should support, not precede, a clear use case and measurable return.
Solomon points to a recent MIT report that reinforces this need for focus and discipline, noting that roughly 90 to 95% of AI pilots fail to deliver meaningful impact on an organization's P&L.
The gap between a simple ChatGPT demo and deploying scalable, secure, production-grade AI is vast. That is why Solomon argues that data leaders must move away from legacy mindsets of overbuilding and long timelines. “We have to be focused and attentive and prioritize well when we do that.”
When asked about cost controls in dbt workloads — especially avoiding overprovisioning — Solomon admits it’s currently a labor-intensive process. His team audits all dbt jobs quarterly, ranks them by compute usage, and compares them to actual stakeholder consumption.
“If I’m wasting so many calories on this dbt job and it doesn’t get the same level of usage, then we have a tough discussion,” he says.
Sometimes the decision is to deprecate the model. Other times, they split it across different warehouse configurations. But the process remains largely manual — something Solomon hopes to improve with automation in the near future.
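The audit loop Solomon describes — rank dbt jobs by compute spend, compare against actual consumption, flag the outliers — lends itself to a simple automation. The sketch below is a hypothetical illustration of that pattern, not Drata's tooling; the job names, credit figures, and usage threshold are all invented for the example.

```python
# Hypothetical sketch of automating a quarterly dbt job audit:
# rank jobs by compute cost, compare against downstream usage,
# and flag low-value models for a "tough discussion".
# All job names, numbers, and the threshold are illustrative.

from dataclasses import dataclass

@dataclass
class JobStats:
    name: str
    credits_used: float  # warehouse credits consumed last quarter
    query_count: int     # downstream queries hitting the model's output


def flag_candidates(jobs, min_queries_per_credit=5.0):
    """Return jobs whose usage doesn't justify their compute spend,
    most expensive first."""
    flagged = [
        j for j in jobs
        if j.credits_used > 0
        and j.query_count / j.credits_used < min_queries_per_credit
    ]
    return sorted(flagged, key=lambda j: j.credits_used, reverse=True)


jobs = [
    JobStats("fct_arr_daily", credits_used=120.0, query_count=3000),
    JobStats("stg_legacy_events", credits_used=90.0, query_count=40),
    JobStats("dim_accounts", credits_used=15.0, query_count=500),
]

for job in flag_candidates(jobs):
    print(f"Review {job.name}: {job.credits_used} credits, "
          f"{job.query_count} downstream queries")
```

In practice the inputs would come from warehouse metering and query-history data rather than hard-coded values, and a flagged job would feed the deprecate-or-split conversation rather than trigger any automatic action.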
Next, Solomon offers a telling example of how the perception of data teams is shifting with AI’s rise. At Drata, he oversees two distinct teams: the Data Platform team and the AI Organization. The data team serves internal functions like GTM, while the AI team focuses on customer-facing AI capabilities.
“Before we had that AI org, we actually did a POC showing how we can answer security questionnaires using a large language model,” he recalls. That project started small, grew into a product, and eventually warranted spinning out into its own organization.
This dual structure allows Drata to balance internal experimentation with external production. When customer-facing AI features reach maturity, the AI org takes them through full development lifecycles.
Solomon shares a current initiative where the data team is prototyping AI-generated Quarterly Business Review (QBR) reports for Customer Success Managers (CSMs). Instead of manually pulling metrics from dashboards, the goal is for an LLM to generate summaries using a well-defined metric layer.
“If it matures to the point where actually that’s the pipeline, I want to democratize it and put it on the product,” says Solomon.
He mentions enabling customers themselves to interact with an AI agent and ask questions like, “What’s my SOC 2 readiness for the past two months?”
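The key design idea in that scenario is grounding: the LLM summarizes values pulled from a governed metric layer rather than querying raw tables itself. The snippet below is a minimal illustration of that pattern under invented metric names and values; it is not Drata's implementation.

```python
# Minimal illustration (not Drata's implementation) of grounding an
# LLM answer in a governed metric layer: the model only ever sees
# pre-computed metric values, never raw tables.
# Metric names and numbers are invented for the example.

METRIC_LAYER = {
    "soc2_readiness_pct": {"2025-09": 86.0, "2025-10": 91.0},
    "open_findings": {"2025-09": 14, "2025-10": 9},
}


def build_prompt(question, metric, months):
    """Assemble the context an LLM would summarize: the user's question
    plus the exact metric values for the requested window."""
    values = {m: METRIC_LAYER[metric][m] for m in months}
    return (
        f"Question: {question}\n"
        f"Metric '{metric}' by month: {values}\n"
        "Answer using only the values above."
    )


prompt = build_prompt(
    "What's my SOC 2 readiness for the past two months?",
    metric="soc2_readiness_pct",
    months=["2025-09", "2025-10"],
)
print(prompt)
```

A production version would resolve the question to a metric via the LLM or a semantic layer API, but the constraint stays the same: the model answers from the metric layer's numbers, which is what makes the output trustworthy enough to put in front of CSMs or customers.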
Solomon closes by describing the balancing act his data platform team performs: managing day-to-day pipeline quality alongside exploratory AI initiatives. It's demanding work, made sustainable by clear boundaries, phased experimentation, and an adaptive mindset.
“You’re going to be entertaining a lot of hypotheses that may not see the light of day. But entertaining new ideas can also help find gold.”
CDO Magazine appreciates Lior Solomon for sharing his insights with our global community.