Experts Weigh in on Data Modernization

As enterprises race to adopt artificial intelligence products, data modernization has become a clear organizational goal, yet one that still carries plenty of confusion. A recent TDWI webinar, “Expert Panel: Best Practices for Modernizing Your Data Environment,” brought together industry leaders to discuss the drivers, strategies, and pitfalls of this transition as companies rethink the very foundations of their data and tech stack management to accommodate and successfully deploy AI.

See also: The Cloud’s Next Chapter: Evolving from Migration to Modernization

Why data modernization matters now

TDWI’s Fern Halper opened with a reminder that modernization should not be done for its own sake: organizations need to know why they are modernizing and be able to demonstrate the value. For many enterprises, the driving reason is the need to support AI. They are tired of building and training models in isolation and then failing to realize the promised ROI; they want to build real applications powered by AI. To make that possible, their environments must be flexible, capable of handling multiple data types, and scalable enough to support innovation at speed.

AI isn’t the only motivation, and we’d be remiss not to mention others. Some enterprises pursue modernization to reduce the costs tied to manual processes, often by automating routine workflows. Others are looking to provide better customer experiences or to adopt new pricing models that require more responsive systems. Cloud migration remains a powerful incentive, as legacy systems prove too rigid or costly to maintain. Whatever the reason, modernization is now closely tied to competitive advantage.

The tools and philosophies of modern data environments

One of the biggest factors in approaching modernization is that the technology landscape has grown more diverse. One popular approach is to experiment with data fabrics that unify disparate sources. Some rely on virtualization, while others lean on semantic layers that provide a more business-friendly interface to the data. A parallel trend is the adoption of data lakehouses, which blend the scalability of data lakes with the structured reliability of warehouses.
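
To make the semantic-layer idea concrete, here is a minimal sketch in Python of how such a layer might map business-friendly terms onto physical tables and columns. The table, column, and metric names are illustrative assumptions, not details from the webinar.

    # Sketch of a semantic layer: business terms map to physical schema
    # objects, and queries are written against the terms. All table and
    # column names here are hypothetical examples.
    SEMANTIC_MODEL = {
        "revenue": ("sales.fact_orders", "SUM(order_total)"),
        "active_customers": ("crm.dim_customers", "COUNT(DISTINCT customer_id)"),
    }

    def build_query(metric: str, group_by: str | None = None) -> str:
        """Translate a business metric into SQL against the physical schema."""
        table, expression = SEMANTIC_MODEL[metric]
        if group_by:
            return (f"SELECT {group_by}, {expression} AS {metric} "
                    f"FROM {table} GROUP BY {group_by}")
        return f"SELECT {expression} AS {metric} FROM {table}"

    print(build_query("revenue", group_by="region"))
    # SELECT region, SUM(order_total) AS revenue FROM sales.fact_orders GROUP BY region

Analysts can ask for “revenue by region” without knowing which fact table holds it, and that indirection is what lets the physical layer be modernized underneath a stable business interface.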

Halper noted that modernization strategies often reflect an organization’s architectural philosophy more than any one prescribed path. What is consistent, however, is the push toward active metadata: the information that feeds observability systems and enables proactive management rather than static cataloging. Another emerging element is the use of vector embeddings, which play a critical role in generative AI. Taken together, these trends show that modernization is as much about adaptability as it is about infrastructure.
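
As a rough illustration of the embedding piece, the sketch below encodes a few documents and retrieves the closest match for a question using the open-source sentence-transformers library; the model name and sample text are assumptions for demonstration, not anything discussed in the webinar.

    # Sketch: embed documents once, then retrieve the most relevant one
    # for a query by cosine similarity. Model choice is an example only.
    import numpy as np
    from sentence_transformers import SentenceTransformer  # pip install sentence-transformers

    docs = [
        "Our data lakehouse stores raw clickstream events.",
        "The finance warehouse holds audited quarterly revenue.",
    ]

    model = SentenceTransformer("all-MiniLM-L6-v2")
    doc_vecs = model.encode(docs, normalize_embeddings=True)  # shape: (n_docs, dim)

    query_vec = model.encode(["Where do revenue figures live?"],
                             normalize_embeddings=True)[0]
    scores = doc_vecs @ query_vec  # cosine similarity, since vectors are unit length
    print(docs[int(np.argmax(scores))])

In production this lookup typically runs against a vector database rather than an in-memory array, which is exactly the kind of new component a modernized data environment has to accommodate.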

When exactly do organizations need to modernize?

The panelists agreed that many enterprises sense the need to modernize their data environments before they can articulate it. Jordan Stein of Sigma described the experience of seeing data bottlenecks build up: reporting slows down, and the connection between the business and the data engineering team weakens. Catalina Herrera of Dataiku pointed to another red flag: when employees begin adopting generative AI tools without approval. This “shadow AI” indicates that existing systems are not enabling innovation in a safe or scalable way. Both examples highlight how the cracks in legacy systems often appear first at the point where business needs and technical capacity collide.

Balancing legacy with the future

Sriram Rajaram of Reltio noted that modernization does not have to be an all-or-nothing proposition. Hybrid approaches allow enterprises to preserve what works in their legacy systems while layering on new capabilities. One common path is an API-first strategy that connects legacy systems to the cloud in real time, creating a bridge between entrenched infrastructure and emerging applications. This avoids the disruption of rip-and-replace projects while still opening the door to AI-driven innovation. By treating legacy and modern systems as complementary rather than mutually exclusive, organizations can move forward without losing ground.
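
As a minimal sketch of that API-first pattern, the example below puts a thin FastAPI facade in front of a legacy lookup; the endpoint shape and the fetch_from_legacy() stub are hypothetical, standing in for whatever mainframe or ERP call an enterprise actually has.

    # Sketch: a thin REST facade that exposes a legacy system to cloud
    # applications without replacing it.
    from fastapi import FastAPI, HTTPException

    app = FastAPI()

    def fetch_from_legacy(customer_id: str) -> dict | None:
        # Placeholder for the real legacy integration (e.g., a DB2 query
        # or a file-based adapter); returns None when nothing is found.
        legacy_records = {"C-1001": {"id": "C-1001", "name": "Acme Corp"}}
        return legacy_records.get(customer_id)

    @app.get("/customers/{customer_id}")
    def get_customer(customer_id: str):
        record = fetch_from_legacy(customer_id)
        if record is None:
            raise HTTPException(status_code=404, detail="customer not found")
        return record

    # Run with: uvicorn bridge:app --reload  (assuming this file is bridge.py)

New cloud applications, including AI services, consume the stable API while the system behind it stays untouched until the organization is ready to replace it.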

See also: Mastering the Data Lifecycle with Data First Modernization

Governance and trust as cultural shifts

Governance was a recurring theme throughout the discussion, but the panel made clear that organizations cannot treat it as a checklist. Policies around access control and data lineage are necessary, but they only work when they are embedded in the culture of how teams collaborate. True governance is collaborative, not top-down.

The panel also noted that governance today extends into preparing data for AI. Organizations are focusing more on semantics, making sure that data is AI-ready and trustworthy. This has led to the rise of specialized roles that didn’t exist just a few years ago, such as semantic modelers and data engineers tasked with curating semantic frameworks. Trust, in this sense, is both a cultural and a technical achievement.

AI occupies a dual role as catalyst and facilitator

We mentioned before that the primary driver of modernization is the ability to use AI products, but AI is also making modernization possible. In some cases, companies are able to infuse AI, particularly agentic AI, into the middle layers between legacy systems and new tech, bridging the gap between static and dynamic systems.

A real-world example is the use of AI agents in data stewardship. Previously, validating changes to data required hours of human review. Today, AI agents can shoulder the bulk of the task, presenting recommendations to human stewards who can approve or reject them. What once took hours can now be done in minutes. This shift illustrates how modernization and AI adoption reinforce one another, creating both pressure and opportunity for enterprises.
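
A simplified sketch of that human-in-the-loop pattern follows; the records, confidence scores, and approval threshold are illustrative assumptions, with the agent’s actual scoring logic stubbed out.

    # Sketch: an agent proposes data changes; a steward's policy approves
    # or rejects each one. Records and scores are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Recommendation:
        record_id: str
        field: str
        current: str
        proposed: str
        confidence: float  # produced by the agent; stubbed here

    def review(recs, approve):
        """Apply the steward's decision rule to each agent recommendation."""
        approved, rejected = [], []
        for rec in recs:
            (approved if approve(rec) else rejected).append(rec)
        return approved, rejected

    recs = [
        Recommendation("C-1001", "country", "USA", "United States", 0.97),
        Recommendation("C-2044", "name", "Acme", "ACME Holdings", 0.61),
    ]

    # A steward might auto-accept high-confidence fixes and inspect the rest.
    approved, rejected = review(recs, approve=lambda r: r.confidence >= 0.9)
    print(f"approved: {len(approved)}, queued for manual review: {len(rejected)}")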

Overlooked practices that make or break data modernization

The panel closed with a set of often-overlooked but critical lessons in modernization. These are the factors that can either keep a modernization effort on course or create unforeseen barriers that slow progress:

  • Framework agnosticism: Keep systems flexible to adapt to future technologies.
  • Operationalized data: Treat data as a core business asset, not an afterthought. Poorly managed data in the AI era carries consequences beyond fines — it threatens business outcomes.
  • Accountability and transparency: Show progress weekly. Even partial visibility builds trust and demonstrates momentum.

See also: Data 4.0: Key Lessons for the AI/ML Era of Real-Time Data Analytics

Modernizing for AI using a clear strategy and consistent monitoring

The webinar confirmed what many organizations already sense: data modernization is inseparable from AI adoption. Enterprises that delay will face widening gaps in governance, agility, and trust, but those that act with clear business goals, hybrid strategies, and cultural buy-in will shape how AI and data are operationalized across industries.

View the complete webinar for more guidance on modernizing your data environment to prepare for AI deployments and technologies on the horizon.
