Branded Content
Inside EDB’s Customer Zero transformation and what it shows enterprise teams looking to simplify architecture, accelerate decisions, and stay in control of their data and AI
Written by: Dan Merzlyak | SVP, Global Head of Data, Analytics, and AI, EnterpriseDB
Updated 3:39 PM UTC, Wed June 18, 2025
Enterprises are pouring billions into AI pilots, data products, and workflow automation, with Gartner projecting global generative AI (GenAI) spending to reach $644 billion in 2025, a 76% jump in just one year. But despite this momentum, one problem keeps surfacing: The infrastructure underneath can’t keep up. Signals are delayed, disconnected, or incomplete.
And as teams layer new tools onto fragmented environments in an attempt to move faster, they often end up reinforcing the very complexity they were trying to solve.
When organizations lack control over how data moves, where it lives, and how it’s used, the consequences are immediate: Visibility drops, velocity slows, and predictability disappears. These aren’t surface-level inefficiencies. They’re structural fault lines rooted in infrastructure that isn’t sovereign by design.
We saw the same obstacles inside our organization. In response, we built a unified platform that delivers cloud-like agility in a hybrid environment with full observability across the data estate — for full control of data anytime, anywhere.
Here’s how that sovereign, Postgres®-based infrastructure operates day to day and why it now serves as a blueprint for enterprise teams looking to simplify their architecture, accelerate decisions, and stay in control.
Disparate systems don’t just slow things down; they distort reality. At EnterpriseDB (EDB), the foundational gaps showed up in three critical areas that will be familiar to any enterprise team managing scale, churn, or complexity.
Visibility was the first challenge. Data critical to understanding customer health — support history, product usage, pipeline movement — was scattered across platforms. A customer success manager (CSM) preparing for a quarterly business review (QBR) might check six different tools and still miss something important. Finance couldn’t see renewal risk until the quarter had already closed.
Velocity came next. Even when data was available, it moved too slowly to act on. Joining product usage logs with customer relationship management (CRM) records could take half an hour, and by the time the query returned, the moment to intervene had passed.
Predictability suffered the most. Every account got the same attention because there was no reliable way to identify churn early. Escalations became the first signal, and analysts looked backward instead of forward.
In short, our teams were spending more time reconciling data than executing on it. And the cost showed up in slower cycles, missed opportunities, and lagging decisions. These weren’t tooling problems. They were infrastructure problems — deep-rooted limitations in environments that weren’t built for sovereignty or speed.
Fortunately, solving those gaps didn’t mean starting from scratch. It meant assembling an architecture designed to collapse fragmentation, surface insights faster, and embed AI-led actions into the flow of data. At the core is a distributed, high-availability Postgres engine that handles transactional and analytical workloads side by side — no replication, no data movement, no lag.
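To make that concrete, here is the kind of query this setup makes routine: an analytical rollup running directly against live transactional tables, in the same engine that serves the writes. This is a minimal sketch; the table and column names (accounts, usage_events, and so on) are illustrative assumptions, not EDB's actual schema.

-- Hypothetical schema: transactional tables queried analytically in place.
-- No ETL hop; the aggregate runs on the same Postgres engine that
-- handles the transactional writes.
SELECT a.account_id,
       a.account_name,
       COUNT(e.event_id)         AS events_last_7d,
       COUNT(DISTINCT e.user_id) AS active_users_last_7d
FROM   accounts a
JOIN   usage_events e ON e.account_id = a.account_id
WHERE  e.occurred_at >= now() - interval '7 days'
GROUP  BY a.account_id, a.account_name
ORDER  BY events_last_7d DESC;

Because the data never leaves the engine, a query like this reflects the current state of the business rather than the last batch load.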
Around that core, we brought in modular components from the modern data stack to handle ingestion, modeling, orchestration, and visualization.
Each component connects via standard Postgres drivers, with no proprietary adapters and no vendor-enforced patterns. The result is a unified architecture built for speed, observability, and control. Everything stays inside the Postgres envelope: governed, secure, and fully auditable. Teams can build, iterate, and adapt without breaking downstream systems or waiting on centralized handoffs.
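As one small illustration of what "governed inside the Postgres envelope" can mean in practice, standard row-level security lets a single table serve many teams with native, auditable access control. The role, policy, table, and column names below are assumptions for the sketch, not our production configuration.

-- Hypothetical example: restrict GTM analysts to the accounts they own,
-- using plain Postgres row-level security rather than a proprietary layer.
CREATE ROLE gtm_analyst;

ALTER TABLE account_health ENABLE ROW LEVEL SECURITY;

CREATE POLICY gtm_own_accounts ON account_health
    FOR SELECT
    TO gtm_analyst
    USING (owner_email = current_user);

GRANT SELECT ON account_health TO gtm_analyst;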
This isn’t a proof of concept. It’s the sovereign architecture our business runs on. It’s also the kind of foundation that makes agentic AI viable in practice. Without real-time data movement, clear governance, and modular control, autonomous AI systems stall in development or introduce unacceptable risk. Sovereign architecture makes it possible to operationalize agentic behavior without compromising oversight.
A system like this fundamentally changes how teams operate — we’ve experienced it firsthand. The shift has delivered measurable impact across customer success, sales, and executive operations:
Telemetry from product downloads now streams directly into our analytics layer. If a customer downloads the pgvector extension at 10:00 a.m., marketing can launch a follow-up campaign by 10:05. Before, this same process required multiple teams: data engineering to ingest and clean the logs, ops to validate matching logic, and marketing ops to manually trigger outreach. That entire loop took two to three days, and by then the buyer’s initial curiosity had often faded, or they had already gone down a different path.
Now, the signal becomes actionable in near real time. What changed? A unified stack, governed schema, and full automation across ingestion, modeling, and orchestration. That’s what makes “real-time product intent” actually real.
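A sketch of the kind of signal query involved: surface accounts whose users pulled pgvector in the last few minutes, already joined to CRM records so outreach can fire immediately. The download_events and crm_accounts tables here are hypothetical stand-ins for the governed schema described above.

-- Hypothetical tables: download telemetry landing next to CRM data in the
-- same Postgres estate, so the join is immediate rather than a batch job.
SELECT c.account_id,
       c.account_name,
       c.owner_email,
       d.artifact,
       d.downloaded_at
FROM   download_events d
JOIN   crm_accounts c ON c.account_id = d.account_id
WHERE  d.artifact = 'pgvector'
AND    d.downloaded_at >= now() - interval '5 minutes';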
We’ve embedded AI-guided workflows directly into dashboards used by our go-to-market (GTM) teams for account management. Our churn propensity model evaluates dozens of predictive features nightly — everything from annual recurring revenue (ARR) momentum, support responsiveness, discount depth, and training completion to marketing engagement and support ticket sentiment.
Our teams start their day with a ranked, red-amber-green (RAG) risk backlog. That backlog isn’t just color-coded; it includes explainability for each score. For example, a “red” risk might show top drivers such as “3+ P2 support tickets in 30 days,” “key buyer left the company,” and “no training engagement in the last 90 days.”
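For illustration, a ranked backlog of this shape could be produced by a query like the one below over a nightly scores table. The table name, RAG thresholds, and driver column are assumptions made for the sketch, not the actual model's output format.

-- Hypothetical nightly output table: one row per account with a churn
-- probability and the model's top explanatory drivers.
SELECT account_id,
       churn_score,
       CASE
           WHEN churn_score >= 0.70 THEN 'red'
           WHEN churn_score >= 0.40 THEN 'amber'
           ELSE 'green'
       END         AS rag_band,
       top_drivers -- e.g. '3+ P2 tickets in 30 days; key buyer left'
FROM   churn_scores
WHERE  scored_on = current_date
ORDER  BY churn_score DESC;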
Our CEO can now open a single Tableau dashboard and access pipeline, revenue, SLA performance, product usage, and compliance metrics for any account in real time. What was once a quarterly red account review is now a daily health check, enabling instant decision-making from the top down. We’ve reduced the time needed to detect revenue risk at the executive level from ~30 days to less than 24 hours.
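The dashboard itself is Tableau, but the point is what sits beneath it: a single governed view in Postgres rather than a federated or replicated dataset. A simplified sketch, with all table and column names hypothetical:

-- Hypothetical unified view joining the domains the dashboard surfaces.
-- Because everything lives in one Postgres estate, this is an ordinary
-- view over live tables, not a copy that can drift out of date.
CREATE VIEW account_health_exec AS
SELECT a.account_id,
       a.account_name,
       p.open_pipeline_usd,
       r.arr_usd,
       s.sla_breaches_90d,
       u.active_users_30d,
       g.compliance_status
FROM   crm_accounts        a
LEFT JOIN pipeline_summary p ON p.account_id = a.account_id
LEFT JOIN revenue_summary  r ON r.account_id = a.account_id
LEFT JOIN sla_summary      s ON s.account_id = a.account_id
LEFT JOIN usage_summary    u ON u.account_id = a.account_id
LEFT JOIN governance_flags g ON g.account_id = a.account_id;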
In short, we’ve dramatically improved both the speed and the precision with which GTM teams respond to signals.
The next phase of our work is focused on deepening automation, embedding models more tightly into operational loops, and scaling governance in response to new types of AI signals. But the foundation is already proving its value: unified architecture, composable components, and hybrid control from end to end.
This is what sovereignty looks like in practice. It means owning your environment — how data flows, where decisions happen, and how quickly teams can respond. It means building on open standards to compose what you need, when you need it, without waiting on rigid pipelines or black-box tools. And it means moving from reports to workflows, from lag to action, without trading off security or trust.
If the last decade was about building data platforms, this one is about making them sovereign: controllable, composable, and ready for real-time use.
About the Author:
Dan Merzlyak is Senior Vice President, Global Head of Data, Analytics, and AI at EnterpriseDB (EDB), the leading sovereign Postgres Data & AI platform. At EDB, he drives company-wide initiatives across data strategy, analytics modernization, and AI integration, while also overseeing internal IT applications and infrastructure. His team focuses on delivering governed, actionable insights through initiatives like Customer 360 and the implementation of a Postgres-native lakehouse to support real-time, AI-driven decision-making.
Prior to EDB, Merzlyak held senior roles at BlackRock, London Stock Exchange Group, and Cerberus Capital Management, and has advised Fortune 500 and private equity-backed companies on enterprise data strategy adoption, transformation programs, and business intelligence. He was named a CDO 40 Under 40 and AI Innovator of the Year. Merzlyak holds an MBA and MS in Business Analytics from Indiana University’s Kelley School of Business, and a BA in Economics and Statistical Sciences from Cornell University.