Gartner Summit Recap Part 4: How Data Leaders Can Scale Responsibly

by Sue Raiber | June 25, 2025


After covering mostly technology-driven topics in our previous recaps, we’re closing out our series from the Gartner Data & Analytics Summit with a focus on the operating models and leadership pressures shaping today’s data landscape. While tools and architectures continue to evolve, one challenge remains constant: How can data leaders scale responsibly—without losing control of key data management responsibilities? In this final post, we look at what Gartner analysts suggest successful organizations are doing to rethink ownership, governance, and execution amid growing complexity and demand.

Three sessions offered critical perspectives:

Thomas Oestreich’s “What Keeps Data Management Leaders Up at Night in 2025”: A look at the five forces redefining the data leader’s role, and why federation is emerging as the only sustainable way forward.

Roxane Edjlali’s “What Are the Characteristics of Successful Data Management?”: A data-backed view of what sets mature data organizations apart from those still struggling to operationalize.

Thornton Craig’s “Foundational Concepts and Future Innovations of MDM”: A fresh take on why master data management (MDM) remains foundational in an era defined by complexity and AI-driven demands.

Federation is the new foundation for scale

Thomas Oestreich outlined five forces that are keeping data leaders up at night—from talent shortages and rising delivery expectations to mounting complexity in governance and architecture. His key message: The old, centralized model of data management operations isn’t built to handle this level of pressure.

Rather than scaling through more headcount, Oestreich argued, federation is the only sustainable path forward. He introduced the concept of fusion teams: domain-aligned, cross-functional units that own data delivery locally, supported by shared platforms and governance. These teams give business units the agility they need while ensuring consistency at the enterprise level.

Why it stuck with me: Federation isn’t a compromise—it’s how today’s data leaders scale without losing control. One technology that can support this shift is data virtualization, which enables decentralized access without decentralizing infrastructure. While the concept has been around for decades, many champions still face the challenge of educating stakeholders and justifying the shift from traditional data integration to a virtualized approach. As operating models evolve, aligning the right technologies to support decentralized delivery will be critical.
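If you haven’t seen the pattern in action, a toy sketch can help. The Python example below is my own illustration, not CData Virtuality’s API: it uses SQLite’s ATTACH to run a single query across two separate databases, which is the essence of virtualization, one logical schema with the data left at the source. All table names and values are invented.

```python
import sqlite3

# Simulate two independent sources: a CRM database and a billing database.
# (In a real virtualization layer these would be remote systems; two local
# SQLite files stand in for them purely to illustrate the pattern.)
crm = sqlite3.connect("crm.db")
crm.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, name TEXT)")
crm.execute("DELETE FROM customers")
crm.execute("INSERT INTO customers VALUES (1, 'Acme Corp'), (2, 'Globex')")
crm.commit()
crm.close()

billing = sqlite3.connect("billing.db")
billing.execute("CREATE TABLE IF NOT EXISTS invoices (customer_id INTEGER, amount REAL)")
billing.execute("DELETE FROM invoices")
billing.execute("INSERT INTO invoices VALUES (1, 1200.0), (1, 300.0), (2, 990.0)")
billing.commit()
billing.close()

# The "virtual" layer: one connection attaches both sources and exposes a
# joined view, so consumers query a single logical schema while the data
# stays where it lives.
hub = sqlite3.connect(":memory:")
hub.execute("ATTACH DATABASE 'crm.db' AS crm")
hub.execute("ATTACH DATABASE 'billing.db' AS billing")
hub.execute("""
    CREATE TEMP VIEW customer_revenue AS
    SELECT c.name, SUM(i.amount) AS total
    FROM crm.customers AS c
    JOIN billing.invoices AS i ON i.customer_id = c.id
    GROUP BY c.name
""")
for name, total in hub.execute("SELECT * FROM customer_revenue ORDER BY total DESC"):
    print(f"{name}: {total:.2f}")
```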

Execution is the gap between strategy and success

Roxane Edjlali highlighted a tension that’s increasingly keeping data leaders up at night: Even when the strategy is clear, execution often isn’t. In her session, she shared that while 83% of organizations say their data products align with business goals, only a small fraction feel confident in their ability to deliver them at scale.

The roadblocks are familiar but persistent—lack of reusable capabilities, unclear ownership, and governance that can’t keep pace with demand. Edjlali made it clear that a data product isn’t just a dataset with a new name. Without documentation, discoverability, and service-level expectations, it won’t drive real adoption, even if it checks the right boxes on paper.

She also pointed to the growing trend of combining data mesh and data fabric principles. While mesh defines how teams work and own data, fabric brings the automation and metadata-driven consistency needed to scale.
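To make that distinction concrete, here is a small hypothetical Python sketch: the dataclass captures the mesh side (a domain team owning a documented product with a service-level expectation), while the check function stands in for the fabric side (metadata-driven automation that flags products that are just datasets with new names). Every field name and rule here is illustrative, not something Edjlali prescribed.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """A data product is more than a dataset: it carries ownership,
    documentation, discoverability, and service-level expectations."""
    name: str
    owner_team: str                                # mesh: a domain team owns delivery
    description: str = ""                          # documentation for consumers
    tags: list[str] = field(default_factory=list)  # discoverability in a catalog
    freshness_sla_hours: int | None = None         # service-level expectation

def governance_gaps(product: DataProduct) -> list[str]:
    """Fabric-style automation: a metadata-driven check that flags products
    that are 'just a dataset with a new name' before they are published."""
    gaps = []
    if not product.description:
        gaps.append("missing documentation")
    if not product.tags:
        gaps.append("not discoverable (no catalog tags)")
    if product.freshness_sla_hours is None:
        gaps.append("no service-level expectation defined")
    return gaps

# A dataset renamed to "product" without the supporting metadata fails the check:
raw = DataProduct(name="orders", owner_team="sales-analytics")
print(governance_gaps(raw))
# ['missing documentation', 'not discoverable (no catalog tags)',
#  'no service-level expectation defined']
```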

Why it stuck with me: Data leaders are being asked to show business value—fast. But without the foundations to deliver at scale, even the best strategies fall short. Most teams still lack the architectural backbone needed to make data products trustworthy, reusable, and discoverable. Building that foundation isn’t a nice-to-have—it’s the first step to closing the execution gap.

MDM is becoming essential to meet modern demands

Thornton Craig offered a timely reminder that MDM isn’t a legacy discipline—it’s a strategic response to the demands placed on modern data leaders. As complexity grows, data becomes more distributed, and AI initiatives accelerate, leaders are under increasing pressure to ensure that data is not just available, but trusted, governed, and fit for purpose.

Craig introduced a modern MDM operating model centered on business outcomes, clearly defined scope, and embedded stewardship. He also highlighted how AI and automation are being used to accelerate matching, enrichment, and quality checks, bringing speed to what has traditionally been a slow-moving discipline. Most importantly, he showed how MDM is evolving to support decentralized teams without sacrificing oversight.
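As a rough illustration of the matching piece only (not the tooling Craig demonstrated), the Python sketch below scores candidate master-record matches with the standard library’s difflib. The records and thresholds are invented for the example; production matchers weigh many more signals such as addresses, identifiers, and history.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1], casefolded and trimmed."""
    return SequenceMatcher(None, a.casefold().strip(), b.casefold().strip()).ratio()

def match_candidates(record: dict, masters: list[dict], threshold: float = 0.85):
    """Return master records whose names score above the threshold,
    i.e. likely the same real-world entity as the incoming record."""
    return [
        (m, score)
        for m in masters
        if (score := similarity(record["name"], m["name"])) >= threshold
    ]

masters = [
    {"id": "M-001", "name": "Acme Corporation"},
    {"id": "M-002", "name": "Globex Inc."},
]
incoming = {"name": "Acme Corp"}

for master, score in match_candidates(incoming, masters, threshold=0.7):
    print(f"{incoming['name']} -> {master['id']} ({score:.2f})")
```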

Why it stuck with me: Data leaders are under pressure to deliver trusted data—fast. MDM defines and governs what trusted data looks like. Combined with the right data integration approach, that trusted data becomes accessible across systems without the need to rebuild pipelines. Together, the two enable scalable, decentralized access without compromising consistency.

Final thoughts

Across these sessions, one message stood out: Data leaders are being asked to deliver more and faster, with less central control. Meeting that demand requires more than just strategy. It calls for operating models that distribute ownership without losing governance, architectures that support delivery at scale, and foundational capabilities that make trusted data accessible across the enterprise.

Whether it’s enabling decentralized teams through federation, closing the execution gap with scalable delivery models, or grounding trust in modern MDM, the path forward isn’t about doing more. It’s about doing it differently.

Explore CData Virtuality today

Take a free, interactive product tour to experience how CData Virtuality supports decentralized access, automation, data product delivery, and AI readiness—all in one unified platform.

Tour the product