5 actions to build an AI-ready data culture

Feature
Sep 10, 2025 | 9 mins
CIO | Data Governance | Generative AI

You can’t scale AI without scaling trust in your data — and that starts with culture.

AI-ready data culture
Credit: Rob Schultz / Shutterstock

AI may be getting all the headlines, but beneath every successful deployment is something less glamorous and far more important: a robust data culture. For enterprises seeking to unlock the value of gen AI, it’s not enough to have data or even a model. What matters is how data is created, managed, shared, and trusted.

Here, four seasoned leaders share their experience in making their enterprise data consistently AI-ready. Each of them treats data as a strategic asset. And while their organizations differ widely in mission and scale, their lessons converge on five key actions that IT leaders can take to build a strong and capable data culture.

Treat data as a product, not a byproduct

The first cultural shift organizations must make is viewing data not as exhaust from operations but as a product in its own right, designed with purpose, usability, and stewardship in mind. Treating data as a product means applying product-management discipline: define ownership, standardize formats, ensure version control, and anticipate downstream use cases across the enterprise.

Mike Kreider, CIO of DHL Supply Chain North America, says his organization has institutionalized this mindset. “A data product is a standardized dataset from one or more systems, formatted for easy reuse,” he says. Shipment data products, for example, support operations, logistics, and business development. They also power gen AI tools such as DHL’s proposal generator. “If the data product doesn’t exist or isn’t clean, the tool won’t work,” he adds.

Mike Kreider, CIO, DHL Supply Chain North America
Credit: DHL

Kreider emphasizes that defining a data product isn’t just a technical task; it’s an exercise in business alignment. Each product has an identified business owner and a lifecycle plan, including how it’ll be updated and retired. “We don’t want orphaned data products no one feels responsible for,” he says. That sense of ownership is what keeps the product current and reliable for AI applications.
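In code, that mindset can be captured as a contract. The sketch below is a minimal, hypothetical data product descriptor in Python; none of the field names or values come from DHL, but they mirror the ownership, versioning, and lifecycle details Kreider describes.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DataProduct:
    """Minimal descriptor for a data product: ownership, format, lifecycle."""
    name: str                  # the standardized dataset, e.g. shipment data
    business_owner: str        # named owner, so the product is never orphaned
    schema_version: str        # standardized, versioned format for easy reuse
    source_systems: list[str]  # the one or more systems it is built from
    refresh_cadence: str       # lifecycle plan: how and when it is updated
    retire_after: date | None = None                    # planned end of life
    consumers: list[str] = field(default_factory=list)  # downstream use cases

# Hypothetical example loosely modeled on the shipment data products above
shipments = DataProduct(
    name="shipment_events",
    business_owner="logistics-operations",
    schema_version="2.3",
    source_systems=["TMS", "WMS"],
    refresh_cadence="hourly",
    consumers=["operations", "logistics", "proposal-generator"],
)
```

The syntax matters less than the contract: every product names its owner, its format version, and its planned retirement, so nothing is left orphaned.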

IBM also builds AI-readiness around data products. Dinesh Nirmal, SVP of IBM Software, points to the need for self-service. “If teams can’t easily find and trust the right dataset, they can’t innovate at speed,” he says, adding that IBM’s catalogued, governed data products make trusted datasets available to AI engineers enterprise-wide, enabling them to focus on building solutions instead of searching for inputs.

Make observability and traceability core to trust

A mature data culture demands not only high-quality data but also complete visibility into its origins, transformations, and uses. Observability and traceability are the backbone of trust, providing the context to explain or correct outputs and an audit trail for compliance.

Dun & Bradstreet monitors over 85 billion data quality observability points with two homegrown tools, DataShield and DataWatch. The former enforces standards at the point of entry; the latter monitors all data over time, enabling regional teams to identify issues, implement improvement plans, and measure whether fixes are effective, so quality is continuously maintained and enhanced.
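That division of labor is easy to picture in code. The toy sketch below is purely illustrative, not D&B’s actual tooling, and every field name and rule in it is invented: one function enforces standards as records arrive, while another computes a quality metric that can be tracked over time to show whether fixes actually moved the number.

```python
# Illustrative only: a toy version of the split between point-of-entry
# enforcement and over-time monitoring. Not D&B's DataShield or DataWatch;
# all field names and rules here are invented.

REQUIRED_FIELDS = ["business_id", "country", "name"]
KNOWN_COUNTRIES = {"US", "GB", "DE", "FR"}  # stand-in for a full ISO list

def validate_at_entry(record: dict) -> list[str]:
    """Enforce standards when a record first arrives (the DataShield role)."""
    issues = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    if record.get("country") and record["country"] not in KNOWN_COUNTRIES:
        issues.append("unrecognized country code")
    return issues

def completeness(records: list[dict]) -> float:
    """Monitor quality over time (the DataWatch role): one trackable metric."""
    ok = sum(all(r.get(f) for f in REQUIRED_FIELDS) for r in records)
    return ok / len(records) if records else 0.0

batch = [
    {"business_id": "123", "country": "US", "name": "Acme"},
    {"business_id": "", "country": "ZZ", "name": "Globex"},
]
print(validate_at_entry(batch[1]))  # ['missing business_id', 'unrecognized country code']
print(f"completeness: {completeness(batch):.0%}")  # completeness: 50%
```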

Another homegrown tool, ChatDQ, lets staff query metadata in natural language, with every answer citing its source. “If I can’t trace it, I can’t trust it,” says Andy Crisp, the company’s SVP of global data strategy. He notes that traceability is what keeps the organization competitive. With over 600 million business records drawn from more than 30,000 sources, “it’s the only way to ensure that when a customer asks for insight, we can stand behind it,” he says.

Andy Crisp, SVP of global data strategy, Dun & Bradstreet
Credit: Dun & Bradstreet

This approach is strengthened by a closed feedback loop. Regional data owners take observability findings to local teams to drive improvements, while a client insights group gathers customer reactions to confirm quality changes are meaningful.

Similarly, DHL Supply Chain embeds observability in every gen AI project. “We track where the data came from, how it changed, and who touched it,” says Kreider. Dashboards display not only data quality scores but also trends over time, turning quality into something measurable and motivational.
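Tracking “where the data came from, how it changed, and who touched it” amounts to keeping a lineage record alongside the data itself. A minimal, hypothetical sketch, not DHL’s actual system, might look like this:

```python
# Hypothetical lineage record, not DHL's actual system. Each transformation
# appends a step, so "where it came from, how it changed, and who touched
# it" stays answerable for every dataset.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageStep:
    actor: str      # the person or pipeline that touched the data
    operation: str  # how the data changed
    at: datetime

@dataclass
class Dataset:
    source: str  # where the data came from
    lineage: list[LineageStep] = field(default_factory=list)

    def transform(self, actor: str, operation: str) -> None:
        self.lineage.append(
            LineageStep(actor, operation, datetime.now(timezone.utc))
        )

shipments = Dataset(source="TMS export, 2025-09-01")
shipments.transform("etl-pipeline", "normalized addresses")
shipments.transform("j.doe", "removed duplicate shipment rows")
for step in shipments.lineage:
    print(step.at.date(), step.actor, "->", step.operation)
```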

Bake governance into the foundation

Data governance is a compliance issue, but it’s also a cultural value that signals discipline, foresight, and a commitment to doing things right. In an AI context, governance means establishing policies for access, retention, classification, and quality that are enforced consistently and automatically.

Magan Naidoo, CDO at the United Nations World Food Programme (WFP), made governance a board-level issue. “Our data strategy and AI strategy were endorsed by the executive director, and that alignment at the top changed everything,” he says.

This top-down support gave WFP the authority to balance regional autonomy with global standards, an essential step in coordinating operations across more than 80 countries. Naidoo notes that many offices believed they were already meeting high standards, but bringing in external experts to benchmark practices against global norms helped reveal critical gaps and created urgency for change.

Magan Naidoo, CDO, United Nations World Food Programme
Credit: United Nations

He also points out that in humanitarian contexts, country directors work under immense operational pressure and often have short tenures, making long-term initiatives harder to prioritize. By framing governance as a shared, organization-wide roadmap, WFP was able to align short-term mission needs with the sustained effort required for data transformation. This, combined with consistent communication and leadership involvement, helped shift perceptions from “governance as bureaucracy” to “governance as an enabler.”

IBM operationalizes governance through the company’s metadata platforms and retention policies. “You can’t bolt on compliance after the fact,” says Nirmal. “If data isn’t governed from ingestion through access and deletion, it’s not AI-ready.” By automating classification and retention rules, IBM ensures compliance is part of daily operations, not an afterthought.
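What automated classification and retention can look like in practice is easiest to see in a toy policy engine. The sketch below is not IBM’s platform, and its classifications and retention periods are invented for illustration; the point is that every record receives a classification, an access tier, and a deletion date at the moment it is ingested.

```python
# Toy policy engine, not IBM's platform; classifications, retention periods,
# and field names are invented. The point: policy is applied at ingestion,
# not bolted on after the fact.

from datetime import date, timedelta

# classification -> (retention period, access tier)
POLICIES = {
    "pii":       (timedelta(days=365 * 2), "restricted"),
    "financial": (timedelta(days=365 * 7), "internal"),
    "public":    (None,                    "open"),  # no forced deletion
}

PII_FIELDS = {"email", "phone", "ssn"}  # crude stand-in for a real classifier

def classify(record: dict) -> str:
    if PII_FIELDS & record.keys():
        return "pii"
    if "invoice_id" in record:
        return "financial"
    return "public"

def apply_policy(record: dict, ingested: date) -> dict:
    label = classify(record)
    retention, tier = POLICIES[label]
    return {
        "classification": label,
        "access_tier": tier,
        "delete_after": ingested + retention if retention else None,
    }

print(apply_policy({"email": "a@b.com", "order": 42}, date(2025, 9, 10)))
# -> classification 'pii', tier 'restricted', delete_after 2027-09-10
```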

Make data literacy everyone’s job

In a high-functioning data culture, everyone — regardless of role — has a baseline fluency in data concepts, quality expectations, and analytical thinking. Data literacy democratizes insight and enables people to use AI responsibly.

At WFP, Naidoo spearheaded a mandatory data literacy program, customized in six languages and embedded in onboarding. With strong support from HR and the executive director, completion rates have reached close to 100%, and follow-up webinars and engagement sessions have consistently earned net promoter scores above industry benchmarks. Optional AI literacy programs, using high-quality open-access content, have matched the uptake of required courses — a sign that curiosity and engagement are high.

Naidoo emphasizes that data literacy isn’t just about technical skills, but creating a shared language across a global, multilingual organization. To achieve this, WFP designed content that reflected the operational realities of its country offices, incorporating examples from field operations, logistics, and beneficiary management. Webinars often feature case studies where staff apply new skills to real humanitarian challenges, reinforcing the direct impact of good data practices. “When people see that better data quality means faster food deliveries or more accurate targeting of aid, they become champions for the cause,” Naidoo says.

At Dun & Bradstreet, Crisp underscores that pipelines alone are useless if people can’t interpret outputs. “You can build the best data pipelines in the world, but if your people don’t understand how to use them, it’s just plumbing,” he says. His measure of maturity is whether staff can list all eight dimensions of data quality without hesitation.

Integrate structured and unstructured data as standard practice

AI can’t deliver full value unless it’s fed a complete picture: structured data from systems of record combined with unstructured data like documents, emails, and images. A mature data culture develops pipelines and tooling that unify these two worlds and ensure governance and performance across both.

IBM estimates 90% of new enterprise data is unstructured. “You need structured payment histories and unstructured emails to answer a billing question properly,” says Nirmal. His team uses SQL-RAG to merge structured and unstructured sources, raising customer service accuracy to as high as 98%.

Dinesh Nirmal, SVP, IBM Software
Credit: IBM

SQL-RAG is an approach that combines traditional SQL database querying with RAG to give AI models richer, more accurate context. It uses SQL to pull relevant structured data from relational databases, while also retrieving unstructured information from other sources. The resulting combination has the precision of structured records and the nuance of unstructured content, leading to more complete and reliable outputs.
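In skeletal form, the pattern looks like the sketch below. This is a minimal illustration, not IBM’s implementation: the database rows and emails are invented, and a simple keyword match stands in for real retrieval, but it shows how structured facts and unstructured context end up in the same model prompt.

```python
# Minimal SQL-RAG sketch, not IBM's implementation. Structured facts come
# from SQL; unstructured context comes from retrieval (here, a toy keyword
# match standing in for a real vector search). Both feed one prompt.

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE payments (customer TEXT, invoice TEXT, status TEXT)")
db.execute("INSERT INTO payments VALUES ('Acme', 'INV-114', 'overdue')")

EMAILS = [  # the unstructured side: support emails, tickets, notes
    "Acme wrote: we disputed INV-114 because the shipment arrived damaged.",
    "Globex asked about upgrading their service tier.",
]

def answer_billing_question(customer: str, question: str) -> str:
    # Structured retrieval: precise records from the system of record.
    rows = db.execute(
        "SELECT invoice, status FROM payments WHERE customer = ?", (customer,)
    ).fetchall()
    facts = "; ".join(f"{inv}: {status}" for inv, status in rows)

    # Unstructured retrieval: nuance the records alone can't provide.
    context = " ".join(e for e in EMAILS if customer in e)

    # In a real system this combined prompt would be sent to the model.
    return f"Question: {question}\nRecords: {facts}\nContext: {context}"

print(answer_billing_question("Acme", "Why is invoice INV-114 unpaid?"))
```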

Dun & Bradstreet offers another perspective on integrating structured and unstructured data. Crisp notes that customer insights often come from blending firmographic data with unstructured customer feedback, such as support tickets or survey comments. By applying automated entity recognition and linking these unstructured insights to structured records in their global database, D&B can pinpoint quality issues, reveal emerging trends, and feed improvements back into their data products. This not only improves AI accuracy but ensures the data reflects real-world customer experiences.
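A stripped-down version of that linking step might look like the following, with a naive name match standing in for real entity recognition and invented records in place of D&B’s database:

```python
# Naive sketch of linking unstructured feedback to structured records.
# A simple name match stands in for real entity recognition; the records
# and tickets are invented, not D&B data.

COMPANIES = {  # structured firmographic records, keyed by normalized name
    "acme":   {"record_id": "A-1", "industry": "manufacturing"},
    "globex": {"record_id": "G-7", "industry": "energy"},
}

TICKETS = [  # unstructured customer feedback
    "Acme reported that their address record is two moves out of date.",
    "Survey comment: Globex says the revenue figures look inflated.",
]

def link_feedback(text: str) -> list[tuple[str, dict]]:
    """Attach a piece of feedback to every structured record it mentions."""
    lowered = text.lower()
    return [(name, rec) for name, rec in COMPANIES.items() if name in lowered]

for ticket in TICKETS:
    for name, rec in link_feedback(ticket):
        print(f"{rec['record_id']} ({name}): quality signal -> {ticket}")
```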

Similarly, DHL Supply Chain integrates structured logistics data with unstructured information from shipment images, driver notes, and sensor readings. Kreider explains that correlating these diverse inputs enables more precise operational forecasting and risk detection, helping the company preempt delays and optimize routes. This unified view transforms isolated datasets into a rich, AI-ready information ecosystem.

As models evolve and technologies change, one thing remains constant: data culture defines success. A model can be retrained. An architecture can be rebuilt. But without the right mindset, AI efforts stall or backfire. “The models will come and go, but your data strategy must endure,” says Nirmal. “And that strategy starts not in a lab or a dashboard but in culture.”

Freelance writer, author

Pat Brans is an affiliated professor at Grenoble École de Management and author of the book "Master the Moment: Fifty CEOs Teach You the Secrets of Time Management."

Brans is a recognized expert on technology and productivity, and has held senior positions with Computer Sciences Corporation, HP and Sybase. Most of his corporate experience focused on applying technology to enhance workforce effectiveness. Now he brings those same ideas to a larger audience by writing and teaching. His work has appeared on TechTarget, EE Times, CMSwire, and Forbes, among other publications.

Brans has a Master’s Degree in Computer Science from Johns Hopkins University and a Bachelor’s Degree in Computer Science from Loyola University, New Orleans.
